Problem uploading non-standard files to Google Drive using Python


My organization is moving our on-site network drives to Google Drive, and I would like to automate the process. I'm not a developer by trade, but I do have programming experience. I had never used Python or the Google API before, but I enjoy a challenge. However, I've gotten somewhat stuck in the execution. I can loop through all of the files and directories, and I think I've even found a way to map the whole file system correctly. Oddly, I assumed this would be a common task, yet I haven't found any existing code that does it. If you know of a way to copy an entire directory to Google Drive with all subdirectories preserved, please let me know; I rolled my own, and it's a bit of a kludge.

When I run it, it works for some file types, but if it hits a file that isn't txt, docx, or xlsx, it crashes with an UnknownFileType error. Obviously the people whose files need migrating will have files of every type, so this simply won't do, and I'm not sure how to fix it. I believe I could make an individual file work by setting the mimeType metadata, but I can't set the mimeType by hand when the script runs over many files. Is there another way to upload files that handles any type without knowing the mimeType in advance?

Since this is my first time using Python or the Google API, I mostly copied the code they provide on their site (and code found elsewhere) and edited it to loop through all the files I need. The upload doesn't even work on a single file if it has an unusual extension. Hopefully you can all spot the problem. Here is the relevant block of code:


for filename in filenames:
    print("Uploading file " + filename + " to " + folnames[i])
    file_metadata = {'name': filename,
                     'parents': [folids[i]]}
    file = service.files().create(body=file_metadata,
                                  media_body=dirpath + "\\" + filename,
                                  fields='id').execute()
    print("Upload Complete")

Any help is appreciated. Thanks!

Edit: I'm posting the full code of the mini-program I made to test uploading a single file. File names have been changed to protect privacy.

from __future__ import print_function
import pickle
import os.path
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive']

def main():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)

    service = build('drive', 'v3', credentials=creds)


    file_metadata = {'name': 'FILENAME'}
    file = service.files().create(body=file_metadata,
                                        media_body='FILEPATH',
                                        fields='id').execute()
    print ("File ID: %s" % file.get('id'))

if __name__ == '__main__':
    main()
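For what it's worth, the crash is consistent with the client library trying to guess the MIME type from the file extension. A quick way to check whether Python can guess a type for a given name is the standard `mimetypes` module (this is a diagnostic sketch, not part of the original program; the file names are made up):

```python
import mimetypes

# guess_type returns a (type, encoding) tuple; the type is None when the
# extension is unknown, which is exactly the case that trips up the upload.
print(mimetypes.guess_type("notes.txt")[0])    # text/plain
print(mimetypes.guess_type("data.xyz123")[0])  # None -> would need a fallback
```

If the guessed type is `None` for the extensions that fail, that points to supplying a MIME type explicitly rather than letting the library infer it.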

python google-drive-api
1 Answer

Got it working by using MimeTypes to guess the mimetype for the media body, together with MediaFileUpload. Thanks everyone for the help and suggestions.

---
from __future__ import print_function
import pickle
import mimetypes
import os.path
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive']

def main():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)

    service = build('drive', 'v3', credentials=creds)



    # guess_type returns (type, encoding); take index 0, the MIME type
    mime = mimetypes.guess_type("FILE")[0]
    file_metadata = {'name': 'NAME',
                     'mimeType': mime}
    media = MediaFileUpload('FILE', mimetype=mime)
    file = service.files().create(body=file_metadata,
                                  media_body=media,
                                  fields='id').execute()
    print("File ID: %s" % file.get('id'))

if __name__ == '__main__':
    main()
---
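On the other part of the question, copying a whole directory tree with subdirectories preserved: one way to structure it, shown here as a sketch rather than a tested program, is to separate a pure planning pass over the local tree from the Drive calls. `plan_uploads` below is a made-up helper name; the commented-out upload loop assumes `service`, `MediaFileUpload`, and a hypothetical root folder ID like the ones in the code above.

```python
import os
import mimetypes

def plan_uploads(root):
    """Walk a local tree and yield (relative_dir, filename, guessed_mime).

    Pure planning step with no Drive calls, so it can be tried out locally.
    Unknown extensions fall back to the generic binary type instead of None.
    """
    for dirpath, _dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        for name in filenames:
            mime = mimetypes.guess_type(name)[0] or "application/octet-stream"
            yield rel, name, mime

# With the plan in hand, an upload loop might look like this (untested sketch;
# DRIVE_ROOT_ID is a placeholder for the destination folder's ID):
#
#   folder_ids = {".": DRIVE_ROOT_ID}   # map local rel-dir -> Drive folder ID
#   for rel, name, mime in plan_uploads(root):
#       parent = folder_ids[rel]        # create a Drive folder per new rel dir
#       media = MediaFileUpload(os.path.join(root, rel, name), mimetype=mime)
#       service.files().create(body={'name': name, 'parents': [parent]},
#                              media_body=media, fields='id').execute()
```

Keeping the mapping from local relative paths to Drive folder IDs is what preserves the subdirectory structure, since Drive parents are referenced by ID, not by path.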