Remove External Image Links From HTML Pages

(Version 4)

What is It?

It is a trivial program which removes external image links (i.e. those with an explicit URL giving a server name or address rather than just a relative path to the image file) from all HTML page files.

It is often useful to get rid of external image links (and anything else specified by a 'SRC' attribute) from downloaded web pages, which would otherwise cause an annoying download attempt to be made (possibly with problems such as automatic modem dialling or privacy issues) when the pages are later viewed in a browser. Such images are typically just adverts & counters anyway.

It is very crude in operation: it just finds every 'SRC' attribute whose address value begins with 'HTTP://' or 'FTP://' and replaces each address with an empty string.
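The substitution described above can be sketched as follows. This is not the original Perl source; it is a hypothetical Python rendering of the same crude approach, assuming a case-insensitive match and that attribute values contain no embedded quotes:

```python
import re

# Match a SRC attribute whose value starts with http:// or ftp://,
# capturing the 'src=' prefix (and opening quote, if any) so the
# address can be replaced with an empty string.
SRC_PATTERN = re.compile(r'(src\s*=\s*["\']?)(?:http|ftp)://[^"\'\s>]*',
                         re.IGNORECASE)

def strip_external_src(html: str) -> str:
    """Blank out external SRC addresses, leaving relative paths alone."""
    return SRC_PATTERN.sub(r'\1', html)
```

For example, `<img src="http://ads.example/x.gif">` becomes `<img src="">`, while `<img src="pics/a.gif">` is left untouched.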

System Requirements

A Perl interpreter.

How to Use It

Just run it with the current working directory being the one in which you want the HTML files processed. It will process all *.html & *.htm files in that directory and all subdirectories thereof recursively.
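The recursive behaviour can be sketched like this. Again, this is a hypothetical Python sketch rather than the actual Perl script, assuming in-place rewriting and the same 'SRC' substitution described earlier:

```python
import os
import re

SRC_PATTERN = re.compile(r'(src\s*=\s*["\']?)(?:http|ftp)://[^"\'\s>]*',
                         re.IGNORECASE)

def process_tree(root: str = '.') -> None:
    """Rewrite every *.html and *.htm file under root, in place."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(('.html', '.htm')):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding='utf-8', errors='replace') as f:
                text = f.read()
            stripped = SRC_PATTERN.sub(r'\1', text)
            # Only write the file back if something actually changed.
            if stripped != text:
                with open(path, 'w', encoding='utf-8') as f:
                    f.write(stripped)
```

Running `process_tree()` from the directory you want processed mirrors the "run it with the current working directory set appropriately" usage above, including the lack of any safety checks the warning below describes.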

If you use a GUI and run Perl scripts by double-clicking on them (as with ActivePerl on Microsoft Win32), and doing so on your system runs the script with the current working directory set to the directory the script itself is in, then a quick GUI way of using this program is to copy it into the directory you want processed and double-click on it.

Warning: Careless use of this program can result in severe data loss! Because it automatically processes every HTML file it finds in & below the directory you tell it to process, it may process files you don't want processed if you run it with the wrong directory as the current working directory. For example, if you run it in your root directory, it could (depending on your file system setup) strip the external image links from every HTML file on your computer! In particular, watch out for: HTML files you don't want processed in subdirectories of the directory of HTML files you do want processed; links in that directory to other parts of the file system which can be followed as if they were subdirectories; and accidentally running it by double-clicking on it whilst moving it around etc. I made it process files so promiscuously in order to quickly process large numbers of files archived using 'wget' (a popular recursive website download program), at the expense of safety against careless use.

Known Deficiencies


Download (3 Kb).

Other Perl Scripts, Disclaimers Etc.

See my computer programs index page for more simple useful computer programs.