Reputation: 27276
I have a simple HTML page with some JavaScript that continuously feeds an image element with frames from a camera. My concern is that when I look at the Resources panel in Google Chrome, it keeps filling up with the same image filename over and over; every refreshed image seems to create a new copy.
Should I be worried about this? Is there a possibility that something might go wrong, such as the cache filling up, or would browsers handle this gracefully?
Just now, as I was typing this question, Chrome's dev panel disappeared on this page of mine, as if it did in fact get overloaded with images, so I'm assuming this is not good. However, the stream of images continues to work as designed.
Currently it's on 2 frames per second, and here's my code:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>Camera View</title>
<script type="text/javascript" language="javascript">
var camurl = "";
var fps = 1;
var loop = null;
onload = function() {
camurl = getParameterByName('camurl');
fps = getParameterByName('fps');
LoopProc();
}
onunload = function() {
if (loop != null)
clearTimeout(loop);
}
function getParameterByName(name) {
name = name.replace(/[\[]/, "\\\[").replace(/[\]]/, "\\\]");
var regex = new RegExp("[\\?&]" + name + "=([^&#]*)"),
results = regex.exec(location.search);
return results == null ? "" : decodeURIComponent(results[1].replace(/\+/g, " "));
}
function DelayTime() {
if (fps < 1) fps = 1;
return (1000 / fps);
}
function NewUrl() {
return camurl + "#time=" + new Date().getTime();
}
function LoopProc() {
document.getElementById("CamImage").src = NewUrl();
loop = setTimeout('LoopProc();', DelayTime());
}
</script>
</head>
<body>
<img id="CamImage" src="" alt="" />
</body>
</html>
And a sample call to this page:
http://localhost:8081/?fps=2&camurl=http://192.168.1.150/image.jpg
Note that I am using a trick for the cache: in the function NewUrl() I'm appending #time=[datetime] so that the cache will treat each refresh as a new image.
You can actually test this with any image on the net by passing a different image URL than the camurl in the query string; it doesn't necessarily have to be a camera to test this scenario.
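As an aside, the same cache-busting idea is often expressed with a query-string parameter rather than a fragment, since the fragment is not normally sent to the server. This is only a sketch of that variant, not the code the page above uses:
function NewUrlQuery() {
    // Hypothetical alternative to NewUrl(): put the timestamp in the query string
    // so it is guaranteed to be part of the URL the browser actually requests.
    var separator = camurl.indexOf("?") === -1 ? "?" : "&";
    return camurl + separator + "time=" + new Date().getTime();
}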
UPDATE
After running for 20 hours now, I have no issues at all in the web browser (Google Chrome) at a frame rate of 3 fps. That's roughly 216,000 frames, each around 165 KB, which works out to over 30 GB downloaded, with no issues in the browser at all. The dev tools, however, crashed less than 30 minutes after being opened.
Upvotes: 3
Views: 168
Reputation: 26766
If the URL is the same for each image, it should work with no issues whatsoever, as the new image will replace the old one in the cache. The problem you're likely to hit is that you don't ever want the image retrieved from the cache, hence the fudged URLs.
If you do continue to fudge the URLs, the browser shouldn't crash, but you will force (useful) resources out of the cache once it reaches capacity.
The ideal solution is to set the appropriate Cache-Control headers on the image response to tell the browser not to store the image.
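For example (assuming you can change whatever serves the image), response headers along these lines would keep each frame out of the cache entirely:
Cache-Control: no-store, no-cache, must-revalidate, max-age=0
Pragma: no-cache
Expires: 0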
If the issue is that your source for the images doesn't include cache-control headers and you can't modify it, consider proxying the image through nginx with the HttpHeadersMore module, which would allow you to decorate the response any way you like.
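A rough sketch of such a proxy, assuming nginx is built with the headers-more module and using the camera address from the question, might look like:
location /camera.jpg {
    # Forward the request to the camera and strip/replace its caching headers.
    proxy_pass http://192.168.1.150/image.jpg;
    more_set_headers "Cache-Control: no-store, no-cache, must-revalidate";
    more_set_headers "Expires: 0";
}
The page would then point camurl at /camera.jpg on the proxy instead of at the camera directly.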
It's worth noting that Firebug/Chrome dev tools will die (eventually) with enough requests.
Upvotes: 2