Reputation: 7604
I've heard all the cases in favour of using a CDN like Google APIs to host JavaScript libraries like jQuery and Prototype for my web application. It's faster, saves bandwidth, permits parallel loading of scripts, and so on. But I recently came across the following comment in Douglas Crockford's json2.js script:
USE YOUR OWN COPY. IT IS EXTREMELY UNWISE TO LOAD CODE FROM SERVERS YOU DO NOT CONTROL.
I'm curious what his argument might be behind this assertion, and whether it's specifically targeted at users of public CDNs like Google's, or something else?
Upvotes: 7
Views: 2844
Reputation: 771
Modern answer: yes, availability
Other people's servers (whether a public CDN or some random nondescript site) might go down, breaking your app's availability.
The CDN might also be compromised, causing your app to execute harmful code, but this issue can be mitigated with Subresource Integrity (SRI).
If you host it on your own server that you control, it would become unavailable at the same time your entire app becomes unavailable, rather than at some arbitrary time under someone else's control.
Using a public CDN has tradeoffs and might be worth it in some cases (for example, to save bandwidth).
<!-- best -->
<script src="your_own_server/framework.js"></script>
<!-- second-best (using public CDN) -->
<script src="https://public-cdn.example/framework.js"
        integrity="sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
        crossorigin="anonymous"></script>
<!-- do not use -->
<script src="https://random-server-without-cors.example/framework.js"></script>
Upvotes: 0
Reputation: 4080
In addition to all the other answers:
You also have to worry about serving your pages over SSL (i.e. https) while loading your JS over plain http from a different source. Browsers can complain (sometimes in an alarming way) about mixing secured and unsecured items.
In addition, people browsing with the NoScript extension (or similar) need to allow JS to run from each of your sources. That's not a big deal if you are using a major CDN (chances are they've already allowed it at some point), but you then have to worry that they are allowing only some of your JS.
Upvotes: 0
Reputation: 41832
While some of these other answers are certainly valid, we have a slightly different/additional reason.
We have a process that, on first request, evaluates which static content is required for any given page. In the background, this static content (JS, CSS) is merged and minified into a single file per type (one for JS, one for CSS), and then all future requests are served with a single file instead of multiple.
While we could, in theory, exclude files that are available on a CDN and load those from the CDN, it's actually easier (because we'd otherwise have to add code to handle exclusions) and in some cases faster than using a CDN.
Upvotes: 0
Reputation: 228162
Assuming he's talking about professionally hosted CDNs like Google, then the best bet is to do this:
<!-- Grab Google CDN's jQuery, with a protocol relative URL; fall back to local if necessary -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.js"></script>
<script>window.jQuery || document.write("<script src='js/libs/jquery-1.5.1.min.js'>\x3C/script>")</script>
(taken from http://html5boilerplate.com/)
That way, you get all the benefits, without the risk of your website breaking if Google's CDN goes down.
But, he said:
USE YOUR OWN COPY. IT IS EXTREMELY UNWISE TO LOAD CODE FROM SERVERS YOU DO NOT CONTROL.
I don't actually think he's talking about CDNs. I think he's just saying "don't hotlink scripts from random websites".
You wouldn't want to do this because the website might change where the script is located, or even change the script itself. A major CDN is extremely unlikely to do that.
Upvotes: 10
Reputation: 14927
Basically, it's a matter of trust. You need to trust the host to not change anything in the hosted file and you need to trust in the availability of the file. Can you be absolutely sure that the URL will not change? Are you comfortable with the fact that any downtime of their servers results in downtime of your application?
Upvotes: 2
Reputation: 41236
jQuery is open source. If you've made a modification to its internals, then obviously you can't load it from someone else's server. More generally, loading scripts from servers you don't control is a security risk: the host could change the script without ever telling you, and now you're running it on your pages.
It's a matter of trust; do you trust that whatever CDN will be secure to not host a malicious script in the location of the script you want?
Upvotes: 0
Reputation: 25014
If a public server's JS is compromised (in availability, security, or by a bug), then the visitors to your site will be affected and will likely blame you. On the other hand, what are the chances of Google's CDN being compromised, compared to the chances of some smaller company's server? You also lose out on the caching advantages a CDN gives you when you host locally.
Upvotes: 0
Reputation: 19214
The reason is that if the server you depend on goes down while yours doesn't, the experience of your site suffers. There are ways to put a fallback in place, so that if jQuery or some other script fails to load from the CDN, you can use a copy you host as a backup.
The other time you shouldn't use a CDN is in an intranet application scenario, where bandwidth is typically not an issue.
A way to create a fallback from Jon Galloway: http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cdn-hosted-jquery-with-a-local-fall-back-copy.aspx
<script type="text/javascript" src="http://ajax.microsoft.com/ajax/jquery/jquery-1.3.2.min.js"></script>
<script type="text/javascript">
  if (typeof jQuery == 'undefined')
  {
    document.write(unescape("%3Cscript src='/Scripts/jquery-1.3.2.min.js' type='text/javascript'%3E%3C/script%3E"));
  }
</script>
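The same fallback can also be written without document.write, by listening for the script element's error event (URLs are illustrative; the typeof-document guard just makes the sketch harmless outside a browser):

```javascript
// Load a script from the CDN; if the request fails, fall back to a
// locally hosted copy. URLs are illustrative.
function loadScript(src, onFailure) {
  var s = document.createElement('script');
  s.src = src;
  s.onerror = onFailure;
  document.head.appendChild(s);
}

if (typeof document !== 'undefined') {
  loadScript('http://ajax.microsoft.com/ajax/jquery/jquery-1.3.2.min.js', function () {
    loadScript('/Scripts/jquery-1.3.2.min.js', function () {
      // Both sources failed; the page will have to do without jQuery.
    });
  });
}
```

Note that onerror only fires when the request itself fails; the typeof jQuery check in the snippet above additionally catches a response that loads but doesn't define jQuery.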
Upvotes: 0