Capture A Website As Image With PHP

7th April 2008 - 3 minutes read time

One thing that comes in useful when linking to other websites is a little thumbnail of the site as part of the link. Although it is possible to generate these yourself, it usually involves having command line access to the server so that you can install various programs and extensions.

The simplest mechanism to accomplish this is to use a third party site to create the thumbnail for you. Here are a few examples of free thumbnail services.

www.thumbshots.org

Thumbshots provides access to small (thumbnail sized) images of many different websites. However, the images are very small, there is no way to change this, and they can often be weeks out of date if the site has had a redesign. You can use Thumbshots from any webpage by creating an image whose source reads from the Thumbshots website.

<img src="http://open.thumbshots.org/image.pxf?url=http%3A%2F%2Fwww.hashbangcode.com%2F" alt="Site image" />
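The URL in the query string must be URL-encoded, which PHP's urlencode() function handles. Here is a minimal sketch that builds the image tag; the thumbshot_tag() helper name is just for illustration, not part of any API:

```php
<?php
// Build a Thumbshots image tag for a given site URL.
// The helper name thumbshot_tag() is illustrative only.
function thumbshot_tag($site_url) {
    $encoded = urlencode($site_url);
    return '<img src="http://open.thumbshots.org/image.pxf?url=' . $encoded . '" alt="Site image" />';
}

echo thumbshot_tag('http://www.hashbangcode.com/');
```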

Websnapr

Websnapr is a slightly more customisable system as you can select from four different image sizes. These are t for thumbnail, s for small, m for medium and l for large.

Here is some example code for a small image of #! code.

<img src="http://images.websnapr.com/?url=http%3A%2F%2Fwww.hashbangcode.com%2F&size=s" alt="Site image" />

Art Viper

Art Viper's website thumbnail service is a more complicated mechanism for getting images of websites. Here is a list of the parameters you can pass to the script.

  1. q: The quality of the image, about 90 seems like a good bet.
  2. h: The height of the image.
  3. w: The width of the image.
  4. sdx: The width of the rendering browser in pixels.
  5. sdy: The height of the rendering browser in pixels.
  6. url: The URL that the image is to be created from.

So putting these things together you can create the following source.

<img src="http://www.artviper.net/screenshots/screener.php?url=http%3A%2F%2Fwww.hashbangcode.com%2F&q=90&h=280&w=340&sdx=1024&sdy=768" alt="Site image" />
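With six parameters the query string becomes fiddly to build by hand. A small sketch using PHP's http_build_query(), which URL-encodes each value; the parameter values are simply those from the example above:

```php
<?php
// Assemble the Art Viper screenshot URL from its parameters.
// http_build_query() URL-encodes each value for us.
$params = array(
    'url' => 'http://www.hashbangcode.com/', // site to capture
    'q'   => 90,   // image quality
    'h'   => 280,  // output image height
    'w'   => 340,  // output image width
    'sdx' => 1024, // rendering browser width in pixels
    'sdy' => 768,  // rendering browser height in pixels
);
$src = 'http://www.artviper.net/screenshots/screener.php?' . http_build_query($params);
echo '<img src="' . $src . '" alt="Site image" />';
```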

Saving The Images

Displaying images hosted on other people's websites is possible, but it can slow parts of your site down as the browser goes off to fetch the image from a different server. The ideal situation is to save the image to your own server and link to it from there.

Taking the Websnapr example from above, it is possible to fetch the image data and store it on your own site. This can be accomplished with cURL using the following code.

$url = urlencode('http://www.hashbangcode.com/');
$image_url = 'http://images.websnapr.com/?url=' . $url . '&size=s';

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $image_url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);

// Return the binary image data rather than printing it directly.
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);

$image = curl_exec($ch);
curl_close($ch);

// Save a copy of the image to the server.
$imageFile = fopen('websiteimage.jpg', 'w');
fwrite($imageFile, $image);
fclose($imageFile);

// Output the image to the browser.
header("Content-type: image/jpeg");
print $image;

This saves the image and then prints it out.
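To avoid hitting the thumbnail service on every request, it is worth checking the age of the saved file before fetching again. Here is a minimal caching sketch along those lines; the file name, the one day maximum age and the thumbnail_is_stale() helper are illustrative choices, not part of any service's API:

```php
<?php
// Decide whether the cached thumbnail needs refreshing.
// The helper name is illustrative only.
function thumbnail_is_stale($file, $max_age) {
    return !file_exists($file) || (time() - filemtime($file)) > $max_age;
}

$cache_file = 'websiteimage.jpg';
$max_age    = 86400; // one day in seconds

if (thumbnail_is_stale($cache_file, $max_age)) {
    $url = urlencode('http://www.hashbangcode.com/');
    $ch = curl_init('http://images.websnapr.com/?url=' . $url . '&size=s');
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $image = curl_exec($ch);
    curl_close($ch);
    // Only overwrite the cached copy if the fetch succeeded.
    if ($image !== false) {
        file_put_contents($cache_file, $image);
    }
}

if (file_exists($cache_file)) {
    header('Content-type: image/jpeg');
    readfile($cache_file);
}
```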

Comments

Thank you very much.. :)

onokung (Mon, 08/04/2008 - 07:48)
