POST large stdin as multipart/form-data with cURL


Telling cURL to read a large (multi-gigabyte) file directly and POST it as multipart/form-data works:

$ # This works
$ curl localhost -F 'f=@large_file.txt'

However, cURL fails when it tries to read the same amount of data from standard input:

$ cat large_file.txt | curl localhost -F 'f=@-'
curl: option -F: is badly used here
curl: try 'curl --help' for more information

(What I actually want to do is tar a directory and stream it directly in the HTTP request:

tar -cf - large_dir/ | curl localhost -F 'f=@-'

)

I think this is because cURL buffers all of standard input in memory before sending any data in the request:

-F, --form <name=content>
    ...

    Tell curl to read content from stdin instead of a file by using
    - as filename. This goes for both @ and < constructs. When stdin
    is used, the contents is buffered in memory first by curl to
    determine its size and allow a possible resend. Defining a
    part's data from a named non-regular file (such as a named pipe
    or similar) is unfortunately not subject to buffering and will
    be effectively read at transmission time; since the full size is
    unknown before the transfer starts, such data is sent as chunks
    by HTTP and rejected by IMAP.
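Going by the last sentence of that excerpt, a named pipe (a "named non-regular file") should not be buffered and should be read at transmission time, so one possible workaround might be to feed the tar stream through a FIFO instead of stdin. A rough, untested sketch (paths and the filename= value are placeholders):

$ # Per the manual excerpt above, a named pipe is read at transmission time,
$ # so the part should be streamed with chunked transfer encoding rather than
$ # buffered in memory first.
$ mkfifo /tmp/upload.fifo
$ tar -cf - large_dir/ > /tmp/upload.fifo &
$ curl localhost -F 'f=@/tmp/upload.fifo;filename=large_dir.tar'
$ rm /tmp/upload.fifo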

Is there a way to have cURL build the multipart/form-data request body as it reads from standard input, streaming the data to the server without buffering it in memory or saving it anywhere?

I don't need the Content-Length header to be set.

Tags: http, curl, stream, streaming, multipartform-data