AutoIt file downloader script.

This is an AutoIt script I made that downloads files off the Internet. It works so well that I really ought to be using it more often.

; FileGet: grab a download URL from the clipboard (or prompt for one),
; then save the file to a folder of your choice.
$MyDocsFolder = "::{450D8FBA-AD25-11D0-98A8-0800361B1103}" ; CLSID for the My Documents folder
$f = ClipGet()
If $f = "" Then
    $f = InputBox("FileGet", "Enter download URL: ")
EndIf
$array = StringSplit($f, "/") ; split the URL on forward slashes
$saved = $array[$array[0]] ; the last element is the file name
$dir = FileSaveDialog("FileGet", $MyDocsFolder, "All (*.*)", 2, $saved)
InetGet($f, $dir, 1, 1) ; force a reload and download in the background
While @InetGetActive
    TrayTip("Downloading", "Bytes = " & @InetGetBytesRead, 10, 16)
    Sleep(250)
Wend
MsgBox(4096, "FileGet Summary", "URL entered: " & $f & @CRLF & "File saved: " & $saved & @CRLF & @InetGetBytesRead & " bytes transferred")

When you run the script, it checks the clipboard for the URL of the file to download; if the clipboard is empty, it prompts you to enter the URL instead.
The tricky part in making this script was figuring out how to isolate the file name from the rest of the URL. For example, if a user puts in the following address:

http://www.autoitscript.com/cgi-bin/getfile.pl?autoit3/autoit-v3-setup.exe

I needed to have the script isolate “autoit-v3-setup.exe” from the rest of the URL so its name can be used when the script saves the downloaded file. This was accomplished using AutoIt’s StringSplit function, which splits a string into smaller substrings according to a specified delimiter, in this case the forward slash character. StringSplit returns an array that stores each substring. The first element of the array holds the number of substrings stored, and the last element contains the isolated file name.
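Here’s a quick illustration of what StringSplit returns for that example URL (the variable names are mine, just for the demo):

$url = "http://www.autoitscript.com/cgi-bin/getfile.pl?autoit3/autoit-v3-setup.exe"
$parts = StringSplit($url, "/") ; split on the forward slash
; $parts[0] is the substring count (6 for this URL)
; $parts[$parts[0]] is the last substring: "autoit-v3-setup.exe"
MsgBox(4096, "Demo", "File name: " & $parts[$parts[0]])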
The script then opens the File Save dialog to prompt you for where to save the file. The default location is the “My Documents” folder, although you can pick a different folder. It then begins the download, keeping you posted on the progress from its icon in the system tray. When it’s done, it gives you a summary of what’s been transferred.
There’s no way VBScript can do this.

Wget this.

Wget is a free file retriever that can download files via HTTP, HTTPS, and FTP. If you’ve ever downloaded a huge file over dial-up, chances are you’ve felt some anxiety wondering whether you’d get disconnected before the file finished downloading. Wget can put your mind more at ease. Just get the file’s download link and paste it into a batch file that invokes Wget, like this:

wget -c http://www.website.com/reallybigfile.exe

Now you can relax and download in peace. Should you get disconnected, just re-run the batch file. The “-c” argument tells Wget to continue the download from where it left off when you got disconnected.
You can also use Wget to do batch downloads. I visit the web sites I plan to download from, copy the links to the files I want, and put them together in a batch file so Wget can grab the files one right after the other. The batch file would look something like this:

wget -c http://www.website.com/file1.exe
wget -c http://www.website2.com/file2.exe
wget -c http://www.website3.com/file3.exe

Just run the batch file and Wget will download the files in sequence. Because it operates from the command line, it skips the fancy graphics, and it tends to download files faster than a web browser.
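Incidentally, Wget can also read a list of URLs from a plain text file with its “-i” switch, which saves you from repeating the wget command on every line. A quick sketch (“filelist.txt” is just a placeholder name for a text file containing one URL per line):

wget -c -i filelist.txt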
But for those of you who simply must have a downloader with a graphical interface, you’ll want to check out WackGet, a graphical front end for Wget.