Using wget and xargs

The joy of the Linux/Unix command line is how versatile the commands are. I recently had a text file of 50,000 URLs I needed to download. I was thinking about writing a crawler in Python to do it, but ended up just doing the following:

cat urllist | xargs -n 1 -P 16 wget

A 16-process web crawler in a single command: -n 1 hands each wget a single URL, and -P 16 keeps sixteen downloads running in parallel. Joy.
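If the URL list might contain blank lines or characters the shell would otherwise mangle, a slightly more defensive variant (a sketch assuming GNU xargs and the same urllist file) is:

grep -v '^$' urllist | xargs -d '\n' -n 1 -P 16 wget -q --tries=3

The -d '\n' tells xargs to split strictly on newlines instead of doing its own quote and whitespace processing, and --tries=3 gives up on a dead URL after three attempts rather than wget's default of twenty.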

MySQL Export to CSV

Ever needed to export data from MySQL into a CSV file? It's actually fairly simple:

SELECT * INTO OUTFILE '/tmp/name.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM [tablename]
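For a concrete run against a hypothetical users table (the name is just an illustration), keep in mind that the file is written on the database server's filesystem, must not already exist, and on MySQL 5.7 and later the secure_file_priv variable may restrict where OUTFILE is allowed to write. You can check that first:

SHOW VARIABLES LIKE 'secure_file_priv';

SELECT * INTO OUTFILE '/tmp/users.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM users;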

Certainly easier than writing a quick Python/Perl/PHP script to do the job.