Large HTTP uploads and curl's memory usage

Recently I’ve had the need to upload large files to a cloud storage back-end and was flabbergasted at the amount of memory curl required to do it. At first I tried using curl via PHP, and when uploading a 2 GB file, PHP would load the file into memory and then curl would do the same, consuming a total of 4 GB of RAM for a very long time. That might be OK on someone’s desktop, but not on a server that’s expected to handle requests all the time. Next I tried shell_exec’ing curl, but it still consumed the entire file’s worth of memory for its whole execution time. I was totally unable to get curl to stream from a file pointer instead of reading the entire file into memory. The solution, at least for me, was to shell_exec a Python script that uses the Requests library’s streaming upload feature. PHP Fail.
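
For reference, here’s a minimal sketch of the kind of thing that Python script does, assuming a plain HTTP PUT to a made-up URL and file name (both hypothetical). When you hand Requests an open file object as the request body, it streams the file from disk rather than reading the whole thing into memory:

import requests

# Hypothetical endpoint and file path; swap in whatever your storage back-end expects.
url = 'https://storage.example.com/bucket/huge-file.bin'

with open('/tmp/huge-file.bin', 'rb') as fh:
    # Passing the open file object as the body makes Requests stream it
    # from disk instead of slurping the whole file into RAM first.
    response = requests.put(url, data=fh)

print(response.status_code)

Memory use stays roughly flat no matter how big the file is, which is exactly what I couldn’t get out of curl via PHP.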

Searching PHP for Syntax Errors

We use CodeIgniter here at work as our development framework. Some of the projects we work on have gotten fairly large, and some have multiple sites / portals running off the same framework installation. Because of this, and the individual configurations that have to be set for each site, it can be a bit of an issue when first setting up a project in a new environment, be it production or development. A single missing semicolon in a config file will cause a 500 error with nothing logged to either CodeIgniter’s or Apache’s logs. Going through each file line by line obviously isn’t an option, so I wrote the following bash script. It loops through all directories under a given path, runs “php -l” against each file looking for syntax errors, and gives you a little report of the files with syntax errors at the end. I hope other PHP developers will find it useful.

#!/bin/bash

#
# Recursively check all .php files in a path for syntax shenanigans
#

if [ -z "$1" ]
then
echo "Usage: $0 "
exit 1
fi

WORKING_DIR=$(find "$1" -name '*.php')
WORKING_DIR_COUNT=$(find "$1" -name '*.php' | wc -l)
declare -a HOSED_FILES

if [ "$WORKING_DIR_COUNT" -eq 0 ]; then
    echo "No PHP files found in $1"
    exit 1
fi

for f in $WORKING_DIR; do
    echo "Checking -> $f"
    CHECK=$(php -l "$f")
    if [[ 'No syntax errors detected in' != ${CHECK:0:28} ]]; then
        echo "Syntax errors detected in $f"
        HOSED_FILES+=("$f")
    else
        echo "No syntax errors detected."
    fi
done

echo -e '\n\nFILES WITH SYNTAX ERRORS:\n'
for f in "${HOSED_FILES[@]}"; do
    echo "$f"
done
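
To use it, save the script as something like php-lint.sh (the name is up to you), make it executable with chmod +x, and point it at the root of your project, e.g. ./php-lint.sh /var/www/my_project. It will walk every directory under that path, lint each .php file it finds, and list the broken ones at the end.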