AWS recently announced an increase in the maximum default payload size for Lambda response streaming, from 20 MB to 200 MB. This allows developers to stream large datasets, image-heavy PDF files, or music files directly from Lambda. For generative AI and data-intensive workloads it's a game-changer, enabling responses on the order of ~200,000 pages of text or dozens of high-resolution images. However, Amazon API Gateway is not yet supported; response streaming is currently available only through Lambda function URLs.

Ivo Pinto pointed out what this increase unlocks: for text, ~5M to 50M characters (~20K to 200K typical LLM tokens); for PDFs, ~200 to 2,000 pages with images; for images, ~20 to 200 high-resolution processed results; and for audio, ~3 to 30 minutes of processed or enhanced audio files. It also eliminates the complex chunking logic previously needed for outputs exceeding 20 MB.

Lambda response streaming supports the managed Node.js runtimes as well as custom runtimes, and the 200 MB payload limit is the default in all AWS Regions where response streaming is supported.
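To illustrate what a streaming handler looks like, the sketch below uses `awslambda.streamifyResponse`, the wrapper the managed Node.js runtime provides for response streaming. The stub at the top stands in for the `awslambda` global the Lambda runtime normally injects (included only so the snippet runs outside Lambda), and the chunk contents are placeholders, not real workload data.

```javascript
// Stand-in for the `awslambda` global that the managed Node.js runtime
// injects; defined here only so this sketch runs outside of Lambda.
const awslambda = globalThis.awslambda ?? {
  streamifyResponse: (handler) => handler,
};

// A streaming handler, invoked through a Lambda function URL configured
// with invoke mode RESPONSE_STREAM. Each chunk is flushed to the client
// as it is written, so the response never has to be buffered in full;
// only the cumulative streamed size is capped (now at 200 MB by default).
const handler = awslambda.streamifyResponse(
  async (event, responseStream, _context) => {
    // Placeholder chunks; a real handler would stream file bytes or
    // incremental model output here instead.
    for (const chunk of ["chunk-1\n", "chunk-2\n", "chunk-3\n"]) {
      responseStream.write(chunk);
    }
    responseStream.end();
  }
);

module.exports = { handler };
```

Compared with a buffered response, where the whole payload must be assembled in memory before it is returned, streaming lets the first bytes reach the client immediately while the rest of the (up to 200 MB) payload is still being produced.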