Thursday, March 27, 2008

Interesting commands in Linux. Check them out.

wget is useful when you download files on Linux machines, since it gives you a lot of options.
wget (multi-purpose download tool)

# Store a locally browsable version of a page in the current dir
(cd cli && wget -nd -pHEKk http://www.pixelbeat.org/cmdline.html)

# Continue downloading a partially downloaded file
wget -c http://www.example.com/large.file

# Download a set of files to the current directory
wget -r -nd -np -l1 -A '*.jpg' http://www.example.com/dir/

# FTP supports globbing directly
wget ftp://remote/file[1-9].iso

# Process output directly
wget -q -O- http://www.pixelbeat.org/timeline.html | grep 'a href' | head

# Download a URL at 1 AM to the current dir
echo 'wget url' | at 01:00

# Do a low-priority download (limit to 20 KB/s in this case)
wget --limit-rate=20k url

# Check links in a file
wget -nv --spider --force-html -i bookmarks.html

# Efficiently update a local copy of a site (handy from cron)
wget --mirror http://www.example.com/
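The mirroring and rate-limiting options combine well for unattended jobs. Here is a minimal sketch of a cron-friendly wrapper; the function name, site, and rate are illustrative assumptions, and it only prints the command rather than running it:

```shell
# Sketch: build a polite mirror command (hypothetical helper, not from the post).
mirror_cmd() {
    site="$1"
    rate="${2:-20k}"   # default rate is an assumption
    # Combine --mirror with --limit-rate so the cron job stays low priority.
    echo "wget --mirror --limit-rate=$rate $site"
}

# A nightly crontab entry could then run the printed command:
mirror_cmd http://www.example.com/ 20k
```

Dropping the `echo` turns the dry run into a real download.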
