How do I upload a file to Google Cloud Storage with Python 3?

I'm new to this, please bear with me.

How do I upload a file to Google Cloud Storage from Python 3? Failing that, Python 2, if it isn't feasible in Python 3.

I've looked and looked but haven't found a solution that actually works. I tried boto, but when I try to generate the required .boto file via gsutil config -e, it keeps telling me I need to configure authentication via gcloud auth login. I've done the latter several times, though, and it doesn't help.

Originally posted by aknuds1; translated under the CC BY-SA 4.0 license.

2 Answers

Use the standard gcloud library, which supports both Python 2 and Python 3.

Example of uploading a file to Cloud Storage:

from gcloud import storage
from oauth2client.service_account import ServiceAccountCredentials
import os

# Build service-account credentials from environment variables.
credentials_dict = {
    'type': 'service_account',
    'client_id': os.environ['BACKUP_CLIENT_ID'],
    'client_email': os.environ['BACKUP_CLIENT_EMAIL'],
    'private_key_id': os.environ['BACKUP_PRIVATE_KEY_ID'],
    'private_key': os.environ['BACKUP_PRIVATE_KEY'],
}
credentials = ServiceAccountCredentials.from_json_keyfile_dict(
    credentials_dict
)

# Upload the local file 'myfile' to the object 'myfile' in bucket 'mybucket'.
client = storage.Client(credentials=credentials, project='myproject')
bucket = client.get_bucket('mybucket')
blob = bucket.blob('myfile')
blob.upload_from_filename('myfile')
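
Note that the gcloud and oauth2client packages used above have since been superseded by google-cloud-storage and google-auth. A minimal sketch of the same flow on the newer packages, assuming the same BACKUP_* environment variables and the placeholder project/bucket/file names from above (the token_uri value is the standard one found in a service-account key file):

# pip install google-cloud-storage google-auth
import os

from google.cloud import storage
from google.oauth2 import service_account

# Build credentials from environment variables, as in the example above.
credentials_dict = {
    'type': 'service_account',
    'client_id': os.environ['BACKUP_CLIENT_ID'],
    'client_email': os.environ['BACKUP_CLIENT_EMAIL'],
    'private_key_id': os.environ['BACKUP_PRIVATE_KEY_ID'],
    'private_key': os.environ['BACKUP_PRIVATE_KEY'],
    'token_uri': 'https://oauth2.googleapis.com/token',
}
credentials = service_account.Credentials.from_service_account_info(credentials_dict)

# Upload the local file 'myfile' to the object 'myfile' in bucket 'mybucket'.
client = storage.Client(credentials=credentials, project='myproject')
bucket = client.bucket('mybucket')
blob = bucket.blob('myfile')
blob.upload_from_filename('myfile')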

Originally posted by aknuds1; translated under the CC BY-SA 4.0 license.

A simple function to upload a file to a gcloud bucket.

from google.cloud import storage
# pip install --upgrade google-cloud-storage

def upload_to_bucket(blob_name, path_to_file, bucket_name):
    """Upload data to a bucket."""

    # Explicitly use service account credentials by specifying the private key
    # file.
    storage_client = storage.Client.from_service_account_json('creds.json')

    # List buckets for debugging, if needed:
    # buckets = list(storage_client.list_buckets())
    # print(buckets)

    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.upload_from_filename(path_to_file)

    # Returns a public URL for the uploaded object.
    return blob.public_url

You can generate the credentials file using this link: https://cloud.google.com/storage/docs/reference/libraries?authuser=1#client-libraries-install-python
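
For reference, a small usage sketch of the function above; the bucket name, blob name, and local path below are placeholders:

url = upload_to_bucket(
    blob_name='uploads/report.csv',   # object name inside the bucket
    path_to_file='/tmp/report.csv',   # local file to upload
    bucket_name='my-bucket',
)
print(url)  # public URL (reachable only if the object is publicly readable)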

An async example:

import asyncio
import aiohttp
# pip install aiofile
from aiofile import AIOFile
# pip install gcloud-aio-storage
from gcloud.aio.storage import Storage

BUCKET_NAME = '<bucket_name>'
FILE_NAME = 'requirements.txt'

async def async_upload_to_bucket(blob_name, file_obj, folder='uploads'):
    """Upload a file to the bucket."""
    async with aiohttp.ClientSession() as session:
        storage = Storage(service_file='./creds.json', session=session)
        status = await storage.upload(BUCKET_NAME, f'{folder}/{blob_name}', file_obj)
        # status holds the metadata of the uploaded object
        # print(status)
        return status['selfLink']


async def main():
    async with AIOFile(FILE_NAME, mode='r') as afp:
        f = await afp.read()
        url = await async_upload_to_bucket(FILE_NAME, f)
        print(url)

# Python 3.6
loop = asyncio.get_event_loop()
loop.run_until_complete(main())

# Python 3.7+
# asyncio.run(main())
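
The main benefit of the async client is concurrency. As a sketch building on the example above (same creds.json, bucket constant, and folder convention; the file list in the usage comment is hypothetical), several files can be uploaded in parallel over one shared session:

import asyncio
import aiohttp
from aiofile import AIOFile
from gcloud.aio.storage import Storage

BUCKET_NAME = '<bucket_name>'

async def upload_many(paths, folder='uploads'):
    """Upload several local files concurrently, sharing one HTTP session."""
    async with aiohttp.ClientSession() as session:
        storage = Storage(service_file='./creds.json', session=session)

        async def upload_one(path):
            # Read the file as bytes so binary content also works.
            async with AIOFile(path, mode='rb') as afp:
                data = await afp.read()
            status = await storage.upload(BUCKET_NAME, f'{folder}/{path}', data)
            return status['selfLink']

        # Run all uploads concurrently and collect their links.
        return await asyncio.gather(*(upload_one(p) for p in paths))

# Python 3.7+
# asyncio.run(upload_many(['a.txt', 'b.txt']))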

Originally posted by adam shamsudeen; translated under the CC BY-SA 4.0 license.
