Recently I needed to upload large files to a cloud storage back-end and was flabbergasted at the amount of memory curl required to do it. At first I tried using curl via PHP: when uploading a 2G file, PHP would load the whole file into memory and then curl would do the same, consuming a total of 4G of RAM for a very long time. That might be OK on someone's desktop, but not on a server that's expected to handle requests all the time. Next I tried shell_exec'ing curl, but it still consumed the entire file's worth of memory for its whole execution time; I was totally unable to get curl to stream from a file pointer instead of reading the entire file into memory. The solution, at least for me, was to shell_exec a Python script that uses Requests' streaming upload feature. PHP fail.
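
For reference, a minimal sketch of what that Python script can look like. Requests streams the request body when you hand it an open file object (or a generator) as `data`, so memory use stays flat regardless of file size. The URL and path here are hypothetical placeholders, and the chunk-reader helper is my own addition to show chunked reads explicitly:

```python
import os

def chunked(fh, chunk_size=64 * 1024):
    """Yield a file-like object's contents in fixed-size chunks,
    so only one chunk is ever held in memory at a time."""
    while True:
        chunk = fh.read(chunk_size)
        if not chunk:
            return
        yield chunk

def stream_upload(path, url):
    """Upload `path` to `url` without loading the file into memory."""
    import requests  # third-party: pip install requests
    size = os.path.getsize(path)
    with open(path, "rb") as fh:
        # Passing a file object (or generator) as `data` makes
        # Requests stream the body from the file pointer instead
        # of slurping the whole thing into RAM, which is exactly
        # what curl refused to do here.
        return requests.put(url, data=chunked(fh),
                            headers={"Content-Length": str(size)})
```

You would then call it from PHP with something like `shell_exec('python upload.py /path/to/big.file')`, keeping the PHP worker's memory footprint unaffected by the file size.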