Reputation: 5136
There are no expiration or storage details listed for image serving URLs on Google's Cloud platform: https://cloud.google.com/appengine/docs/go/images/reference#ServingURL. The documentation is quite unclear. Is the resized image stored temporarily on a CDN or something, or is it stored in the project's Blobstore indefinitely, so that we're paying for many multiples of storage? Or does the URL expire after a set amount of time, with that size of the image discarded?
The reason I'm asking is that I've heard calls to this function add latency, so if that's the case I'd like to cache the response. To do that, though, I need to know whether and when the result would expire.
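For reference, the call I'm talking about is roughly the following (a sketch assuming the google.golang.org/appengine packages; the blob key is just a placeholder):

```go
package app

import (
	"net/http"

	"google.golang.org/appengine"
	"google.golang.org/appengine/image"
)

func init() {
	http.HandleFunc("/image-url", imageURLHandler)
}

func imageURLHandler(w http.ResponseWriter, r *http.Request) {
	ctx := appengine.NewContext(r)

	// Placeholder: assume this blob key refers to an image already in the Blobstore.
	blobKey := appengine.BlobKey("example-blob-key")

	// Ask the image service for a URL that serves a resized (400px) copy.
	u, err := image.ServingURL(ctx, blobKey, &image.ServingURLOptions{Secure: true, Size: 400})
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	w.Write([]byte(u.String()))
}
```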
Any help or clarification would be appreciated.
Upvotes: 0
Views: 2073
Reputation: 6039
Pricing and Caching
That is described a little better here:
You simply store a single copy of your original image in Blobstore, and then request a high-performance per-image URL.
As Paul said in his comment, you only pay for the storage space of 1 copy of the original image in the Blobstore, plus normal bandwidth charges when it is served. When you create URLs that serve the image at different sizes, it is up to Google whether they cache a copy of the image at that size or not; either way you will still only pay for the storage of the original image at the original size.
I have seen reports that serving URLs can work for days after deleting the original image, so obviously Google does some caching at least sometimes, but those details are not specified and could change case-by-case.
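To make the pricing point concrete, here is a rough sketch of requesting several display sizes of a single stored original (assuming a reasonably recent google.golang.org/appengine; the context and blob key would come from elsewhere in your app):

```go
package app

import (
	"context"
	"log"

	"google.golang.org/appengine"
	"google.golang.org/appengine/image"
)

// logServingURLs requests serving URLs for several display sizes of the same
// original blob. Only the one original is stored in Blobstore; the resized
// variants are produced by the image service on demand.
func logServingURLs(ctx context.Context, blobKey appengine.BlobKey) {
	for _, size := range []int{200, 800, 1600} {
		u, err := image.ServingURL(ctx, blobKey, &image.ServingURLOptions{Secure: true, Size: size})
		if err != nil {
			log.Printf("ServingURL(size=%d): %v", size, err)
			continue
		}
		log.Printf("size %d: %s", size, u)
	}
}
```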
Expiration
The URL never expires unless you explicitly delete it or the original image.
Whether you store your images in Cloud Storage or Blobstore, the right way to stop an image from being publicly accessible through its serving URL is to call the image.DeleteServingURL function.
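A minimal sketch of that call, again assuming the google.golang.org/appengine packages:

```go
package app

import (
	"context"
	"log"

	"google.golang.org/appengine"
	"google.golang.org/appengine/image"
)

// stopServing revokes the public serving URL for blobKey. The original blob
// itself is not deleted; only the URL stops working.
func stopServing(ctx context.Context, blobKey appengine.BlobKey) {
	if err := image.DeleteServingURL(ctx, blobKey); err != nil {
		log.Printf("DeleteServingURL: %v", err)
	}
}
```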
Performance
I cannot say how much latency serving a resized copy of an image adds; I assume the answer is "not enough to care about", but I don't know. If you experiment and find the added latency unacceptable, you could store multiple pre-sized versions of the image in Blobstore yourself and serve each at its natural size, though I can't say whether that would actually improve performance, and you would of course pay for storing each copy. I suggest not worrying about this unless it actually becomes a problem. Also, per the documentation:
Images are served with low latency from a highly optimized, cookieless infrastructure.
So I doubt you could gain much benefit from trying to optimize it more yourself.
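If the latency of generating the URL itself ever does become a problem, a simpler option than storing extra copies is to cache the generated URL, for example in memcache. A rough sketch, with a made-up cache key scheme:

```go
package app

import (
	"context"

	"google.golang.org/appengine"
	"google.golang.org/appengine/image"
	"google.golang.org/appengine/memcache"
)

// cachedServingURL returns the serving URL for blobKey, checking memcache
// first so image.ServingURL is only called on a cache miss. If you ever call
// image.DeleteServingURL, remember to drop the cached entry as well.
func cachedServingURL(ctx context.Context, blobKey appengine.BlobKey) (string, error) {
	cacheKey := "serving-url:" + string(blobKey) // placeholder key scheme

	if item, err := memcache.Get(ctx, cacheKey); err == nil {
		return string(item.Value), nil
	}

	u, err := image.ServingURL(ctx, blobKey, &image.ServingURLOptions{Secure: true})
	if err != nil {
		return "", err
	}

	// Best-effort cache write; a failure here just means the next call
	// regenerates the URL.
	_ = memcache.Set(ctx, &memcache.Item{Key: cacheKey, Value: []byte(u.String())})

	return u.String(), nil
}
```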
Upvotes: 4