Reputation: 5371
I'm developing a site with a client-side JavaScript framework (dojo/dijit) at the moment. As with all JavaScript frameworks, you start using Ajax to do quick calls and updates. My question is: is there a general rule of thumb for when to use Ajax and when to use a link? I only ask because I seem to be using Ajax more often than not, and I'm worried that any errors in the initial page might propagate to other elements, or that something might go wrong with content getting constantly replaced.
I suppose what I'm asking is: are there any drawbacks to heavy Ajax usage in a web page?
EDIT
SEO - not an issue. I'm just thinking of client-server issues for now. Links would beat Ajax hands down if you wanted good SEO.
Upvotes: 10
Views: 6658
Reputation: 25473
In my opinion, a lot of Ajax is bad for your website's SEO health... :-( Ajax is best used just for displaying errors and messages and for handling little tasks. I don't think submitting a form using Ajax is a good thing either.
I'm mentioning some of the major disadvantages below; if I've missed any, please comment.
Firstly, owing to their dynamic nature, Ajax interfaces are often harder to develop when compared to static pages.
Secondly, pages dynamically created using successive Ajax requests do not automatically register themselves with the browser's history engine, so clicking the browser's "back" button may not return the user to an earlier state of the Ajax-enabled page, but may instead return them to the last full page visited before it. Workarounds include using invisible iframes to trigger changes in the browser's history, or changing the anchor portion of the URL (the part following the #) when Ajax runs and monitoring it for changes.
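A minimal sketch of that fragment workaround, assuming a hypothetical loadPanel() function that fetches and renders a view (the names here are illustrative, not from the question):

```javascript
// Stub for whatever actually fetches and renders a view, e.g. via
// dojo.xhrGet({ url: "/panels/" + name, load: render }) or a raw XMLHttpRequest.
function loadPanel(name) {
  console.log("loading panel:", name);
}

// Navigating writes the state into the fragment, which creates a history entry.
function showPanel(name) {
  window.location.hash = name;
}

// When the fragment changes, including via the Back button, re-render.
// (Browsers without onhashchange polled location.hash with setInterval,
// which is the "monitoring it for changes" mentioned above.)
window.onhashchange = function () {
  loadPanel(window.location.hash.replace(/^#/, ""));
};
```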
Thirdly, dynamic web page updates also make it difficult for a user to bookmark a particular state of the application. Solutions to this problem exist, many of which use the URL fragment identifier (the portion of the URL after the '#') to keep track of, and allow users to return to, the application in a given state.
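For instance, a rough sketch of fragment-based bookmarking, assuming state is encoded in the hash as key=value pairs (e.g. a bookmarked URL ending in #view=inbox); parseState() and applyState() are illustrative names, not an existing API:

```javascript
// Turn "#view=inbox&page=2" into { view: "inbox", page: "2" }.
function parseState(hash) {
  var state = {};
  var pairs = hash.replace(/^#/, "").split("&");
  for (var i = 0; i < pairs.length; i++) {
    var parts = pairs[i].split("=");
    if (parts[0]) state[parts[0]] = decodeURIComponent(parts[1] || "");
  }
  return state;
}

// Stub: fetch and render whatever view the bookmarked fragment names.
function applyState(state) {
  console.log("restoring view:", state.view);
}

// A bookmarked URL carries its fragment, so the saved state can be rebuilt on load.
window.onload = function () {
  var state = parseState(window.location.hash);
  if (state.view) applyState(state);
};
```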
Fourthly, because most web crawlers do not execute JavaScript code, publicly indexable web applications should provide an alternative means of accessing the content that would normally be retrieved with Ajax, so that search engines can index it.
Fifthly, any user whose browser does not support JavaScript or XMLHttpRequest, or simply has that functionality disabled, will not be able to properly use pages which depend on Ajax. Similarly, devices such as mobile phones, PDAs, and screen readers may not support the required technologies, and screen readers that are able to use Ajax may still not be able to properly read the dynamically generated content. The only way to let such users carry out the functionality is to fall back to non-JavaScript methods. This can be achieved by making sure links and forms can be resolved properly and do not rely solely on Ajax; in JavaScript, the form submission can then be halted with "return false".
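A small sketch of that "return false" fallback. It assumes a plain HTML form with id="commentForm" that posts normally to its action URL when JavaScript is off; the id, URL, and field name are illustrative:

```javascript
// With JavaScript enabled, intercept the submit and send it via Ajax instead.
var form = document.getElementById("commentForm");
if (form) {
  form.onsubmit = function () {
    var xhr = new XMLHttpRequest();
    xhr.open("POST", form.action);
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send("text=" + encodeURIComponent(form.elements["text"].value));
    return false; // halt the normal submission only when the Ajax path is taken
  };
}
// Without JavaScript, the handler never runs and the form posts normally.
```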
Sixthly, the same origin policy prevents some Ajax techniques from being used across domains, although the W3C has a draft of the XMLHttpRequest object that would enable this functionality.
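To illustrate the restriction: a page served from one origin asking a different origin for data (the URL below is made up). Unless the remote server opts in, via the mechanism that W3C draft evolved into (CORS and its Access-Control-Allow-Origin header), the browser refuses the response:

```javascript
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://api.other-site.example/data.json"); // a different origin
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4) {
    // a blocked cross-domain call typically surfaces here as status 0
    console.log("status:", xhr.status);
  }
};
xhr.send();
```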
Seventhly, like other web technologies, Ajax has its own set of vulnerabilities that developers must address. Developers familiar with other web technologies may have to learn new testing and coding methods to write secure Ajax applications.
Lastly, Ajax-powered interfaces may dramatically increase the number of user-generated requests to web servers and their back-ends (databases, or other). This can lead to longer response times and/or additional hardware needs.
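Not from the list above, but one common way to keep that request count down is to coalesce rapid-fire events (such as keystrokes in a live-search box) so that only one Ajax call goes out per pause; debounce() and the /search URL here are illustrative:

```javascript
// Wrap a function so it only fires after the events stop for delayMs.
function debounce(fn, delayMs) {
  var timer = null;
  return function () {
    var args = arguments;
    clearTimeout(timer);
    timer = setTimeout(function () { fn.apply(null, args); }, delayMs);
  };
}

// One request for the latest term instead of one per keystroke.
var search = debounce(function (term) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/search?q=" + encodeURIComponent(term));
  xhr.send();
}, 300);

// usage: input.onkeyup = function () { search(this.value); };
```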
Finally, I will not say that Ajax is bad, but as Daniel commented, "too much" is always a bad thing. Facebook is one website that uses a huge amount of Ajax, but in a proper way.
Many solutions to the above problems have been implemented. For example, using invisible iframes allows the retrieval of history data, and URL fragment identifiers let users bookmark and return to a particular state of an application, while also restoring back-button behaviour.
Hope this helps.
Upvotes: 4
Reputation: 12543
In my mind, there are three issues with using tons of AJAX calls.
The first is from a user perspective. If I am doing a lot of navigation, as a user, I want to be able to use my back/forward buttons in my browser and have them work correctly. If they do, then there isn't an issue. If they don't, you've broken fundamental navigation in my browser.
Second is bookmarking/indexing. As a user, I may want to bookmark something so I can come back to it or share it. As an indexer for a search engine, you as a developer want to let the search engine "see" all the pages of information you have so that people can find your site. Both of these require some sort of unique url.
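One sketch of how both points can be handled, assuming a browser with the HTML5 History API; loadView() and the #content element are illustrative names, not the poster's code:

```javascript
// Fetch a view over Ajax and drop it into the page.
function loadView(path) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", path);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById("content").innerHTML = xhr.responseText;
    }
  };
  xhr.send();
}

// Give each Ajax navigation a real, unique URL that can be bookmarked or shared.
function navigate(path) {
  loadView(path);
  history.pushState({ path: path }, "", path);
}

// Back/Forward fire popstate, so the earlier view can be re-rendered.
window.onpopstate = function (event) {
  if (event.state) loadView(event.state.path);
};
```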
Third is debugging, from a development point of view. The more random stuff you are throwing onto a page and/or replacing dynamically, the harder it gets to track down what's wrong. The more you have, the more there is that needs to integrate well, and the more that could interact badly.
Upvotes: 15
Reputation: 12240
One problem that pops to my mind when using too much AJAX is SEO.
If you're creating a web application, then using AJAX is a good thing. But if you want search engines to find every word on your page, using AJAX will make that difficult.
Upvotes: 2
Reputation: 115809
Personally, I'm fine with as much Ajax as I encounter, as long as it works properly (and fast) and doesn't break my browser's normal behavior. More specifically, I want to be able to email a link to a specific page, bookmark it -- the usual stuff.
Upvotes: 2
Reputation: 3721
Would your page be using more Ajax than Google Docs or Gmail or Facebook? Then it would be too much.
Upvotes: 1
Reputation: 343
No.
I think both full pages and Ajax have their advantages, but once the JavaScript file is loaded and cached, along with the style sheet, all that is left of the overhead is the database calls. Using links would cause more full pages to be loaded, which I think is a heavier cost than making a database call.
Upvotes: 4