
Tornado 4.0 added the tornado.web.stream_request_body decorator, used for streaming request bodies.

Streaming uploads let you handle large requests without buffering everything into memory, but there are generally still some limits on what you're willing to handle. The max_buffer_size and max_body_size parameters are now separate, but both default to 100 MB. With streaming uploads you can increase max_body_size as much as you want without increasing your memory requirements, but make sure you have enough disk space (or S3 budget, etc.) for the uploads you'll get. You can even set max_body_size on a per-request basis by calling self.request.connection.set_max_body_size() from prepare(), as the example below does; a server-level variant is sketched after the example.


import tornado.web
import tornado.ioloop

MB = 1024 * 1024
GB = 1024 * MB
TB = 1024 * GB

MAX_STREAMED_SIZE = 1 * GB

@tornado.web.stream_request_body
class MainHandler(tornado.web.RequestHandler):
    def prepare(self):
        # Open a placeholder output file; streamed chunks are written straight to disk.
        self.f = open("xxxxxxxx", "wb")

        # Without raising max_body_size, files larger than 100 MB cannot be uploaded.
        self.request.connection.set_max_body_size(MAX_STREAMED_SIZE)

    def post(self):
        # Called once the entire body has been received.
        print("upload completed")
        self.f.close()

    def data_received(self, data):
        # Called repeatedly with each chunk of the request body as it arrives.
        self.f.write(data)


if __name__ == "__main__":
    application = tornado.web.Application([
        (r"/", MainHandler),
    ])
    application.listen(7777)
    tornado.ioloop.IOLoop.instance().start()
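
The two limits can also be set once for the whole server instead of per request. A minimal sketch, reusing application, MB and GB from the example above and assuming Tornado's HTTPServer keyword arguments max_buffer_size and max_body_size (available since 4.0); it would replace the application.listen(7777) call:

import tornado.httpserver

server = tornado.httpserver.HTTPServer(
    application,
    max_buffer_size=100 * MB,  # how much of a request may be buffered in memory
    max_body_size=1 * GB,      # upper limit on the size of any request body
)
server.listen(7777)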

tornado.web.stream_request_body source
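
The decorator itself is tiny. A rough paraphrase of what it does in Tornado 4.x (check tornado/web.py in your installed version for the exact code): it validates the class and sets a flag that the HTTP layer checks when deciding whether to buffer the body or hand chunks to data_received().

def stream_request_body(cls):
    # Mark a RequestHandler subclass so the request body is delivered
    # incrementally to data_received() instead of being buffered in full.
    if not issubclass(cls, tornado.web.RequestHandler):
        raise TypeError("expected subclass of RequestHandler, got %r" % cls)
    cls._stream_request_body = True
    return cls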

Test:

curl -v -XPOST --data-binary @presto-server-0.144.2.tar.gz http://127.0.0.1:7777/
