Reputation: 1894
I am new to Angular and to making JavaScript crawlable. I've been searching for information, but I don't really understand it so far.
I am working on an AngularJS app that uses client-side JSON. There is paged navigation, but each link calls a function getPage(n) that slices a chunk of the JSON, which Angular then renders.
Is it OK to put href="#!page=n" on each link? If I add that #! hash to the URL and press Enter, and a function renders the right items, is that enough to make the page crawlable?
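For reference, here is roughly how the paging is wired up (a simplified sketch; the items array and page size are placeholders, and I've used #!/page/n style paths instead of #!page=n so $location can parse them):

```js
angular.module('app', [])
  .config(function ($locationProvider) {
    // make Angular generate/parse #!... URLs (the "hashbang" style)
    $locationProvider.hashPrefix('!');
  })
  .controller('PagerCtrl', function ($scope, $location) {
    var pageSize = 10;   // placeholder
    $scope.items = [];   // the client-side JSON, loaded elsewhere

    $scope.getPage = function (n) {
      // slice one page's worth of the JSON; Angular re-renders the list
      $scope.visible = $scope.items.slice((n - 1) * pageSize, n * pageSize);
    };

    // re-run getPage whenever the #! path changes, so pasting a
    // #!/page/3 URL into the address bar renders page 3 directly
    $scope.$watch(function () { return $location.path(); }, function (path) {
      var n = parseInt((path.match(/^\/page\/(\d+)/) || [])[1], 10) || 1;
      $scope.getPage(n);
    });
  });
```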
I've read something about HTML snapshots, but doesn't that require Java? My web host is not very flexible: it supports neither Tomcat nor Node.js.
Upvotes: 0
Views: 845
Reputation: 181
Check this older Stack Overflow question: Making angular crawlable - Beginning of Project
A friend of mine uses https://prerender.io/
Both of these solutions essentially cache rendered versions of your views so that crawlers can index your site.
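Under the hood these services rely on Google's (now-deprecated) AJAX crawling scheme: when a crawler sees a #! URL, it requests an _escaped_fragment_ variant instead, and the prerendering layer answers that request with static HTML. A rough sketch of the URL mapping (example.com is a placeholder):

```js
// Sketch of how the crawler rewrites a #! URL before requesting it;
// the snapshot service serves pre-rendered HTML for the rewritten URL.
function escapedFragmentUrl(url) {
  var parts = url.split('#!');
  if (parts.length < 2) return url; // no hashbang: requested as-is
  return parts[0] + '?_escaped_fragment_=' + encodeURIComponent(parts[1]);
}

console.log(escapedFragmentUrl('http://example.com/#!page=2'));
// -> http://example.com/?_escaped_fragment_=page%3D2
```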
Upvotes: 1
Reputation: 747
I think it's much better practice these days to use the HTML5 History API (history.pushState), and thus provide a unique URL for every page.
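A minimal sketch of what that looks like in AngularJS (assumes ngRoute is loaded, index.html has a <base href="/"> tag, and the server returns index.html for every app route; page.html and PageCtrl are placeholders):

```js
angular.module('app', ['ngRoute'])
  .config(function ($locationProvider, $routeProvider) {
    // use history.pushState instead of #! hashes; needs <base href="/">
    $locationProvider.html5Mode(true);

    $routeProvider.when('/page/:n', {
      templateUrl: 'page.html',
      controller: 'PageCtrl'
    });
  })
  .controller('PageCtrl', function ($scope, $routeParams) {
    // every page now has its own real URL, e.g. /page/2
    $scope.page = parseInt($routeParams.n, 10);
  });
```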
Upvotes: 1