python requests: POST a large file and show upload progress

I want to upload some files over HTTP within a LAN; the files are generally around 1 GB each.

r = requests.post('****',
                  data={'path': '2016/07/08/5ASD5SDFASDFASDF/cad.zip'},
                  files={'file': open(filename, 'rb')})
                     

This works fine for small files, but when uploading large files Python raises a MemoryError.
How can I solve this?

3 Answers
import time

import requests
from requests_toolbelt import MultipartEncoder, MultipartEncoderMonitor

def my_callback(monitor):
    # Your callback function; bytes_read is the number of bytes sent so far
    print(monitor.bytes_read)

# f is the local path of the file to upload
m = MultipartEncoder(fields={'file': ('filename', open(f, 'rb'))},
                     boundary='---------------------------7de1ae242c06ca')

# Wrap the encoder so my_callback is invoked as the body is streamed out
m = MultipartEncoderMonitor(m, my_callback)

req_headers = {'Content-Type': m.content_type,
               'path': '2016/07/09/5ASD5SDFASDFASDF/{}.zip'.format(time.time())}

r = requests.post(url, data=m, headers=req_headers)

This uses the requests_toolbelt extension library, which solves both problems (memory usage and upload progress) nicely.
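If you want an actual percentage-style progress readout rather than a raw byte count, the monitor also exposes the total encoded body size, so the callback can compute a percentage. A minimal sketch along the same lines; the file path, field name, and URL below are placeholders:

import sys

import requests
from requests_toolbelt import MultipartEncoder, MultipartEncoderMonitor

def progress_bar(monitor):
    # monitor.len is the total encoded body size, monitor.bytes_read is what has been sent so far
    percent = monitor.bytes_read * 100 // monitor.len
    sys.stdout.write('\rUploading: {}%'.format(percent))
    sys.stdout.flush()

encoder = MultipartEncoder(fields={'file': ('cad.zip', open('cad.zip', 'rb'), 'application/zip')})
monitor = MultipartEncoderMonitor(encoder, progress_bar)

r = requests.post('http://192.168.1.10/upload',  # placeholder LAN URL
                  data=monitor,
                  headers={'Content-Type': monitor.content_type})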

I suggest the OP use requests' streaming uploads.
Here is the relevant documentation:

Streaming Uploads

Requests supports streaming uploads, which allow you to send large streams or files without reading them into memory. To stream and upload, simply provide a file-like object for your body:

with open('massive-body', 'rb') as f:
    requests.post('http://some.url/streamed', data=f)

http://www.python-requests.org/en/master/user/advanced/#streaming-uploads
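Note that a plain streaming upload like this sends the raw file bytes as the request body (not a multipart form), so the receiving server has to accept that. If you also want progress with this approach, one option is to wrap the file object so every chunk read during the upload is counted. A rough sketch, reusing the placeholder URL and filename from the docs example above:

import os

import requests

class ProgressFile(object):
    """File-like wrapper that reports how many bytes have been read (i.e. sent) so far."""

    def __init__(self, path):
        self._file = open(path, 'rb')
        self._total = os.path.getsize(path)
        self._sent = 0

    def __len__(self):
        # Lets requests set a Content-Length header for the streamed body
        return self._total

    def read(self, size=-1):
        chunk = self._file.read(size)
        self._sent += len(chunk)
        # Called for every block the connection sends; throttle the output in real code
        print('{}/{} bytes uploaded'.format(self._sent, self._total))
        return chunk

requests.post('http://some.url/streamed', data=ProgressFile('massive-body'))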
