Not able to save an image from a remote URL

Hello,

I need to read an image from a remote URL and then save it to my hosting webspace.

This is the image that I’d like to save (it’s a webcam test): http://remoteimage.ddns.info:17000/snapshot.cgi

As you can see, if you put the image URL in a browser, the image displays properly. But when I try to save the image with a PHP script, nothing happens. I’ve tried three different PHP scripts, and none of them works.


Can anyone help me save this remote image with a PHP script?

Here are the three scripts that I’ve tested so far with no result.

Script 1 - using file_get_contents:

[code]$remoteUrl = "http://remoteimage.ddns.info:17000/snapshot.cgi";
$image = file_get_contents($remoteUrl);
$fileName = "captured-image.jpg";
file_put_contents($fileName, $image);[/code]

With the above script I get this warning: file_get_contents(url): failed to open stream: Connection refused on line 2
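In case it helps with diagnosing this, I could run a quick connectivity test like the sketch below (untested) on the hosting server, since “Connection refused” usually means the connection itself is being rejected rather than anything being wrong in the PHP:

[code]// Sketch: test whether the web host can open a socket to the camera at all.
$host = "remoteimage.ddns.info";
$port = 17000;
$errno = 0;
$errstr = "";

$fp = @fsockopen($host, $port, $errno, $errstr, 10);
if ($fp === false) {
    echo "Cannot connect: [$errno] $errstr";
} else {
    echo "Connection to $host:$port works";
    fclose($fp);
}[/code]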

Script 2 – using GD functions:

[code]$remoteUrl = "http://remoteimage.ddns.info:17000/snapshot.cgi";
$image = imagecreatefromjpeg($remoteUrl);
$fileName = "captured-image.jpg";
$quality = 90;
imagejpeg($image, $fileName, $quality);[/code]

With the above script I get the same warning again: file_get_contents(url): failed to open stream: Connection refused on line 2

Script 3 – using curl:

[code]$remoteUrl = "http://remoteimage.ddns.info:17000/snapshot.cgi";

$ch = curl_init();
$timeout = 20;
curl_setopt($ch, CURLOPT_URL, $remoteUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$image = curl_exec($ch);
curl_close($ch);

$fileName = "captured-image.jpg";
file_put_contents($fileName, $image);[/code]

In this case I get no warning, but in the end I get an empty image file.
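If it is useful, I could also add error reporting to the curl version along these lines (just a sketch, not run yet) to see why the result comes back empty:

[code]$remoteUrl = "http://remoteimage.ddns.info:17000/snapshot.cgi";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $remoteUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 20);
$image = curl_exec($ch);

// Report why the transfer failed instead of silently writing an empty file.
if ($image === false) {
    echo "curl error: " . curl_error($ch);
} else {
    echo "HTTP status: " . curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo ", bytes received: " . strlen($image);
}
curl_close($ch);[/code]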

phpreader, simple simple…

Your script #1 would work just fine except your logic is a bit off center…

So, if you are loading data from a URL like the Google link you showed us, all is well.
That is because the URL points to just one item, a single image. Therefore, you can use
file_get_contents, because the only thing at that address is the image itself.

But, if you are loading a webpage that is controlled by a CGI script, it is just like an HTML or PHP page.
You get ALL of that page, not just the data INSIDE the page. First, the URL you posted for the CGI page does
NOT let me see the image. Either the page is currently not working, or I do not have the correct login info
to access it. Anyway, once you capture that page using your script #1, you need to print it out to
see what is really on the page. My guess is that it contains formatting codes, HTML, Javascript and other
items which are NOT an image.

To save the image, you would have to parse the page and strip out the garbage before and after the real
image you want to save. Depending on the coding found on that page, it may actually be an <img> tag
that points to the server’s image folder for the real image.
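If it does turn out to be an HTML page wrapping an <img> tag, something roughly like this might pull the image address out and fetch that instead (untested, and it assumes the page is plain HTML and that the camera URL from your scripts is the base address):

[code]// Sketch: assumes the CGI page is HTML that wraps an <img> tag.
$remoteUrl = "http://remoteimage.ddns.info:17000/snapshot.cgi";
$page = file_get_contents($remoteUrl);

$doc = new DOMDocument();
@$doc->loadHTML($page);                // suppress warnings from sloppy HTML
$imgs = $doc->getElementsByTagName('img');

if ($imgs->length > 0) {
    $src = $imgs->item(0)->getAttribute('src');
    // Resolve a relative src against the camera's base address.
    if (strpos($src, 'http') !== 0) {
        $src = "http://remoteimage.ddns.info:17000/" . ltrim($src, '/');
    }
    file_put_contents("captured-image.jpg", file_get_contents($src));
} else {
    echo "No <img> tag found in the page";
}[/code]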

So, how to debug this issue… Run script #1 and look at the data inside the $image variable before you
try to save it. Something like print_r($image); might work, or even just echo $image; … or use
die($image); just before the line where you assign $fileName. You need to see what is actually inside the page
you are getting the contents of. Then you can figure out how to scrape out just the picture on that page and leave all
the rest of the code behind.
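Something along these lines (again just a sketch) would show whether $image really holds a JPEG or a page of HTML:

[code]$remoteUrl = "http://remoteimage.ddns.info:17000/snapshot.cgi";
$image = file_get_contents($remoteUrl);

// How much data came back, and what do the first bytes look like?
echo "Length: " . strlen($image) . "\n";
echo "First bytes: " . bin2hex(substr($image, 0, 4)) . "\n";

// A real JPEG starts with FF D8; anything else is probably HTML or an error page.
if (substr($image, 0, 2) === "\xFF\xD8") {
    echo "Looks like a JPEG";
} else {
    // Dump it to a file so the whole thing can be opened and read.
    file_put_contents("debug-output.html", $image);
    echo "Not a JPEG - see debug-output.html";
}[/code]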

One final comment: can you actually see that page yourself? It would not load at all for me!
