Large publishing or informational websites pose problems for web analytics because there is no clear “conversion”. These sites might have hundreds of thousands of pages, employing vertical search to sort through their content, and the business goal of the site might be simply to attract volume or drive people to look at banner ads.
The site might be nothing more than a gigantic router, like Yahoo or AOL, incorporating news stories supplied through RSS feeds. Or the site might be purely informational, communicating information about products or services without much lead-generation or ecommerce (I’m thinking of big pharmaceutical company websites with thousands of prescription drug information pages). When it comes to web analytics, business groups in charge of these sites usually tell me that success on the site is measured in terms of the vague and nebulous “site engagement”.
My answer is usually, “What is ‘site engagement’?” Out-of-the-box answers are usually deceptive or misleading. Take pages per visit (also called “path length” or “visit depth”, depending on which tool you’re using). While this can be a powerful KPI, it’s silly to report how many pages are viewed per visit without some analysis of what those pages are. I can come to a site, perform 15 searches, and then leave in utter frustration because I haven’t found what I want, never to return. I wouldn’t call myself “engaged,” but that’s 15 pages per visit.
Return frequency is another favorite metric I’m given for visitor “engagement”. Tools like Visual Sciences / HBX are big on the notion of the “return visitor”. This is fine when your site sees a reasonable return frequency (say, once a week). But what if my average return frequency is once a day, as on a big portal site such as Yahoo or MSN, or an intranet site that is the default home page in a large corporation? Everyone becomes a return visitor and the metric is useless. Conversely, suppose my average return frequency is once every three months. Not only does cookie deletion and rejection become problematic, but how “engaged” could you really say these visitors are?
I think the only way to measure site engagement is to make such KPIs relative to the site as a whole. Three pages per visit might be great for some sites but miserable for others, so the first question to ask is, “What is my average pages per visit?” To make this actionable (and isn’t that the point?), the individual pieces of the website should be parsed out and studied separately, comparing them to one another. On one large publishing website we’ve worked with, we identified dozens of tools, site sections, interactive elements, topics, and functional categories, and created page-view-, visit-, and visitor-based segments for each. By comparing visit-level KPIs within a segment to visitor-level KPIs for the same segment, and page-level KPIs to visit-level KPIs, you can compute an effect size on site engagement for each tool or site section. This lets you compare different tools directly to one another, and it also eliminates any circular logic in measuring engagement (what I’ve called “epiphenomenal effects”).
For example, suppose Tool X is viewed 2 times on average per visit (a page-level metric). The visit-level segment tells you that visits containing Tool X averaged 10 pages per visit, implying an effect size of 2 ÷ 10 = 0.2. The visit-level segment also tells you there are 3 visits per visitor to Tool X, while the visitor-level segment tells you that visitors who at some point saw Tool X made 30 visits to the site; the effect size on this KPI is thus 3 ÷ 30 = 0.1. Each tool can be measured this way, and these effect sizes can be used to identify which tools or site sections actually drive visitor engagement.
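To make the arithmetic concrete, here is a minimal sketch of the effect-size calculation in Python. All the tool names, metric names, and numbers are hypothetical and purely illustrative; in practice these figures would come out of your analytics tool’s segment reports, not from hand-entered constants.

```python
# Hypothetical segment metrics per tool (illustrative numbers only).
# Each tool carries the four KPIs the method needs: one page-level,
# two visit-level, and one visitor-level figure.
segments = {
    "Tool X": {
        "tool_views_per_visit": 2,      # page-level: avg views of the tool per visit
        "pages_per_visit": 10,          # visit-level: avg pages in visits containing the tool
        "tool_visits_per_visitor": 3,   # visit-level: avg visits to the tool per visitor
        "site_visits_per_visitor": 30,  # visitor-level: avg site visits by visitors who saw the tool
    },
    "Tool Y": {
        "tool_views_per_visit": 1,
        "pages_per_visit": 8,
        "tool_visits_per_visitor": 2,
        "site_visits_per_visitor": 10,
    },
}

def effect_sizes(metrics):
    """Divide the narrower-scope KPI by the broader-scope KPI at each level."""
    return {
        # views of the tool per visit, relative to all pages in those visits
        "per_visit": metrics["tool_views_per_visit"] / metrics["pages_per_visit"],
        # visits to the tool per visitor, relative to all site visits by those visitors
        "per_visitor": metrics["tool_visits_per_visitor"] / metrics["site_visits_per_visitor"],
    }

for tool, metrics in segments.items():
    es = effect_sizes(metrics)
    print(f"{tool}: visit effect {es['per_visit']:.2f}, "
          f"visitor effect {es['per_visitor']:.2f}")
```

With the Tool X numbers from the example above, this yields a visit-level effect of 0.2 and a visitor-level effect of 0.1, and the same two ratios can be computed for every tool or site section to rank them against one another.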
I’ll be discussing the issue of site engagement at the upcoming XChange Web Analytics Conference in Napa in September, so I’d love to hear other ideas about how web analysts tackle the problem of defining site engagement and making it actionable.