Boong

Reputation: 149

Can we measure the complexity of a web site?

I am familiar with using cyclomatic complexity to measure software. However, for web sites, is there some kind of metric to measure the complexity of a site?

Upvotes: 8

Views: 2491

Answers (4)

Ranjit Vadakkan

Reputation: 11

This is a great paper on the topic.

Upvotes: 0

luis.espinal

Reputation: 10529

Though the question was asked 6 months ago...

Unless your website is a 100% static site with no JavaScript at all, it is powered by a back-end written in some programming language. So, indirectly, the complexity measures that afflict the back-end programming will also affect the complexity of maintaining the site.

Typically, I've observed a correlation between the maintainability and quality (or lack thereof) of the web pages themselves and the quality (or lack thereof) exhibited, through software metrics, in the back-end programming. Don't quote me on that, and take it with a grain of salt; it is purely an observation I've made in the gigs I've worked on.

If your site - dynamic content or not - also has JavaScript in it, then that is also source code that demonstrates measurable attributes in terms of software complexity metrics. And since JavaScript is typically used for rendering HTML content, it stands as a possibility (but not a certainty) that atrocious, hard-to-maintain JavaScript will render similarly atrocious, hard-to-maintain HTML (or be embedded in atrocious, hard-to-maintain markup).

For a completely static site, you could still devise some type of metrics, though I'm not aware of any that are publicized.

Regardless, a good web site should have uniform linking.

It should provide uniform navigation.

Also, HTML pages shouldn't be replicated or duplicated, and there should be few to no dead links.

Links within the site should be relative (either to their current location or to the logical root '/') and not absolute. That is, don't hard-code the domain name.

URI naming patterns should be uniform (preferably lower case). Only the scheme and host of a URL are case-insensitive; paths may well be case-sensitive, so mixing cases in links invites confusion and broken references. Additionally, links might map to files in actual filesystems that are case-sensitive, which leads to the next point.

URIs that represent phrases should be uniform (use either - or _ to separate words, but not both, and certainly no spaces). Avoid camel case (see the previous point on lower-casing).

I'm not aware of any published or advocated software-like metrics for web sites, but I would imagine that if there are, they might try to measure some of the attributes I mentioned above.
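For illustration only, here is a minimal Python sketch of how a few of these attributes might be checked mechanically. The domain, the link list, and the exact checks are assumptions of mine, not an established metric:

```python
# Minimal sketch of checks for the uniformity rules above.
# `SITE_DOMAIN` and the `links` list are made-up inputs; in practice the
# hrefs would come from crawling the site.
import re
from urllib.parse import urlparse

SITE_DOMAIN = "example.com"                     # hypothetical own domain
links = [
    "https://example.com/About-Us",             # hard-coded domain + mixed case
    "/products/widget_list",                    # underscore separator
    "/products/widget-list",                    # hyphen separator (inconsistent)
    "/contact us.html",                         # space in path
]

issues = []
separators = set()
for href in links:
    parsed = urlparse(href)
    if parsed.netloc.lower() == SITE_DOMAIN:
        issues.append((href, "hard-coded own domain; make the link relative"))
    if parsed.path != parsed.path.lower():
        issues.append((href, "mixed-case path"))
    if " " in parsed.path:
        issues.append((href, "space in path"))
    separators.update(re.findall(r"[-_]", parsed.path))

if len(separators) > 1:
    issues.append(("<site-wide>", "both '-' and '_' used as word separators"))

for href, problem in issues:
    print(f"{href}: {problem}")
```

Counting the flagged issues per page (or per site) would give a rough, home-grown uniformity score along the lines described above.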

Upvotes: 2

Ira Baxter

Reputation: 95334

If you count the HTML tags in the displayed HTML pages as "operators", you can compute a Halstead number for each web page.
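As a rough illustration, this is one way such a count could be turned into Halstead numbers in Python. Treating tags as the operators comes from the point above; counting attribute names and text words as the "operands" is my own assumption, just to make the formula computable:

```python
# Rough Halstead measures for a single HTML page.
# Assumption (not from the answer): tags are the "operators" and
# attribute names plus text words are the "operands".
import math
from html.parser import HTMLParser

class TagCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.operators = []   # tag occurrences
        self.operands = []    # attribute names and text tokens

    def handle_starttag(self, tag, attrs):
        self.operators.append(tag)
        self.operands.extend(name for name, _ in attrs)

    def handle_data(self, data):
        self.operands.extend(data.split())

def halstead(html_text):
    counter = TagCounter()
    counter.feed(html_text)
    N1, N2 = len(counter.operators), len(counter.operands)
    n1, n2 = len(set(counter.operators)), len(set(counter.operands))
    vocabulary, length = n1 + n2, N1 + N2
    volume = length * math.log2(vocabulary) if vocabulary else 0.0
    return {"length": length, "vocabulary": vocabulary, "volume": volume}

print(halstead("<html><body><p class='x'>Hello metrics</p></body></html>"))
```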

If you inspect the source code that produces the web pages, you can compute complexity measures (Halstead, McCabe, SLOC, ...) of those. To do that, you need tools that can compute such metrics from the web page sources.
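For instance, a crude McCabe (cyclomatic) approximation for a JavaScript or PHP file can be had by counting decision points. The keyword list below is a simplification of mine, not a substitute for a real parser-based tool:

```python
# Crude McCabe (cyclomatic) approximation: 1 + the number of decision
# points found in the source text. A real tool would parse the code; this
# keyword count is only a rough illustration.
import re

DECISION_TOKENS = r"\b(?:if|for|while|case|catch|elseif)\b|&&|\|\||\?"

def approx_cyclomatic(source: str) -> int:
    return 1 + len(re.findall(DECISION_TOKENS, source))

js = """
function render(items) {
  if (!items || items.length === 0) { return ""; }
  let out = "";
  for (const it of items) { out += it.done ? "x" : "-"; }
  return out;
}
"""
print(approx_cyclomatic(js))  # 1 + if + || + for + ?: = 5
```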

Our SD Source Code Search Engine (SCSE) is normally used to search across large code bases (e.g., the web site code) even if the code base is a set of mixed languages (HTML, PHP, ASP.net, ...). As a side effect, the SCSE just so happens to compute Halstead, McCabe, SLOC, comment counts, and a variety of other basic measurements, for each file it can search (has indexed).
Those metrics are exported as an XML file; see the web link above for an example. This would give you a rough but immediate ability to compute web site complexity metrics.

Upvotes: 3

kc2001

Reputation: 5247

I suppose you could consider "hub scores" to be a complexity metric, since they reflect how many external sites are referenced. "Authoritative sources in a hyperlinked environment" by Jon Kleinberg discusses them.
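For reference, the hub/authority iteration from that paper is small enough to sketch in Python; the toy link graph below is invented purely to show the computation:

```python
# Minimal HITS sketch (Kleinberg): a page's hub score sums the authority
# scores of pages it links to; a page's authority score sums the hub
# scores of pages linking to it. The tiny link graph is made up.
import math

graph = {                      # page -> pages it links to
    "home":      ["docs", "blog", "external1"],
    "docs":      ["external1", "external2"],
    "blog":      ["docs"],
    "external1": [],
    "external2": [],
}

hubs = {p: 1.0 for p in graph}
auths = {p: 1.0 for p in graph}

for _ in range(20):                                   # power iteration
    auths = {p: sum(hubs[q] for q in graph if p in graph[q]) for p in graph}
    norm = math.sqrt(sum(a * a for a in auths.values())) or 1.0
    auths = {p: a / norm for p, a in auths.items()}

    hubs = {p: sum(auths[q] for q in graph[p]) for p in graph}
    norm = math.sqrt(sum(h * h for h in hubs.values())) or 1.0
    hubs = {p: h / norm for p, h in hubs.items()}

for page, score in sorted(hubs.items(), key=lambda kv: -kv[1]):
    print(f"{page}: hub score {score:.3f}")
```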

Upvotes: 0
