I'm still wading through Google Analytics to get an idea of how the website is used, so I can make some 'evidence-based' proposals for the site home page and persistent navigation.
One thing that I'm pondering after my last blog post is referrals from search engines. If a page is reached much more often from a search engine results page (SERP) than from other pages on your site, does that indicate your site is failing the user, or just that the user prefers to use a search engine?
If a page is a common exit point does that mean it satisfied the user need, or did it just frustrate them enough to give up? Does an elongated 'time on page' mean the content engrossed the reader or that they glazed over into catatonia?
If your total page hits go down after a redesign, does that mean you lost popularity, or that your site is providing the required content with a lot fewer clicks?
I think I could argue two opposite sides to almost anything Analytics appears to hint at - each supposition on a facet of Analytics would probably make an ideal topic for a formal debate in the vein of 'can money buy happiness' or 'is honesty the best policy'.
The answer is that site Analytics on their own are ambiguous guides to user behaviour - you really need to observe, consult and 'know' your users.
Analytics' value is in aggregating data so you can visualise behaviours that prompt you to formulate questions like WHY IS IT DOING THAT? IS THAT A GOOD THING?
If only users could be consulted in large numbers at any time that was convenient to me.
PS GA has some tables of Google queries mapped against how often a page from your site appeared in a SERP and how often one of your pages was clicked on, referred to as CTR (click-through rate). It makes for interesting perusing; maybe one approach would be to interpret the user's goal from the search, and then see how closely the target page matched or referred to that goal. An iterative approach like that would probably improve user experience over time, but it would be difficult to evaluate the impact.
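To make that concrete, here's the sort of sifting I have in mind, assuming a CSV export of that query table with columns like query, impressions and clicks (the file layout and column names are my guess, not GA's exact export format):

```python
import csv

# Flag queries where our pages appear often in SERPs but rarely get
# clicked -- candidates for asking "why isn't the target page matching
# the user's goal?". Column names are assumed, not GA's actual export.
IMPRESSION_FLOOR = 100   # ignore rarely-seen queries
CTR_CEILING = 0.05       # 5% click-through or less looks like a mismatch

with open("search_queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        impressions = int(row["impressions"])
        clicks = int(row["clicks"])
        if impressions >= IMPRESSION_FLOOR:
            ctr = clicks / impressions
            if ctr <= CTR_CEILING:
                print(f"{row['query']!r}: {impressions} impressions, CTR {ctr:.1%}")
```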
Analysis of that information is making me think about using Summon's Best Bets.
2 comments:
Alan, your thoughts on how Google Analytics data could mean anything are something we have been struggling with.
We are a bit GA crazy here, embedding it in almost every library system we can: library portal, Encore, catalogue, Summon, link resolver screens, LibGuides, LibFAQ, EZproxy login and a few more I am forgetting. Even our bookmarklets are enhanced that way.
It's gotten to the point that I see new library projects starting and people routinely just embed it.
The idea here is that the data would be comparable, e.g. the unique visitors from Encore could be compared to Summon, but in practice it's not so simple. For example, Encore has more page views than the same period last year, but then Encore includes item detail pages while we didn't turn tracking on for the Summon catalogue pages, so you have to filter that out before comparing.
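To get the numbers even roughly comparable you end up doing something like this over a page-level export (the file layout is assumed, and the item-detail URL pattern is just illustrative):

```python
import csv
import re

# Strip Encore item-detail page views out before comparing totals with
# Summon, since the equivalent Summon pages aren't tracked on our side.
# The URL pattern is illustrative -- substitute your own.
ITEM_DETAIL = re.compile(r"/iii/encore/record/")

total = comparable = 0
with open("encore_pageviews.csv", newline="") as f:
    for row in csv.DictReader(f):       # assumed columns: page, pageviews
        views = int(row["pageviews"])
        total += views
        if not ITEM_DETAIL.search(row["page"]):
            comparable += views

print(f"All Encore page views: {total}; comparable subset: {comparable}")
```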
I wonder what thoughts you have on using GA to decide on Best Bets. We've been doing so since Best Bets came about, but in a very inefficient way: looking at searches with a high number of refinements/next-page clicks, running those searches to see why that happens, and creating Best Bets.
Often these are known-item searches; once it even detected a cataloguing error in a title.
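In sketch form our sifting amounts to something like this (the event export here is idealised for the sketch; ours is nowhere near this tidy, and the event names are invented):

```python
import csv
from collections import Counter

# Count refinements and next-page clicks per query; queries that force
# a lot of rework become Best Bet candidates to rerun and inspect.
RERUN_EVENTS = {"refinement", "next_page"}
THRESHOLD = 10

reruns = Counter()
with open("summon_events.csv", newline="") as f:
    for row in csv.DictReader(f):       # assumed columns: query, event
        if row["event"] in RERUN_EVENTS:
            reruns[row["query"]] += 1

for query, count in reruns.most_common():
    if count < THRESHOLD:
        break
    print(f"{query!r}: {count} refinements/next-page clicks -- rerun it and see why")
```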
http://journal.code4lib.org/articles/8693 is an interesting article where Serials Solutions tracks query abandonment as a proxy for the quality of search.
We don't have access to that data, I don't think GA can track that?
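If we did have raw search logs, I imagine you could approximate abandonment yourself - a search with no subsequent result click in the same session - with something like this (the log format is completely invented):

```python
import csv
from collections import defaultdict

# Approximate query abandonment: a search with no result click recorded
# for the same query counts as abandoned. Log format is invented.
searches = defaultdict(int)
clicks = defaultdict(int)

with open("search_log.csv", newline="") as f:
    for row in csv.DictReader(f):   # assumed columns: session, action, query
        if row["action"] == "search":
            searches[row["query"]] += 1
        elif row["action"] == "result_click":
            clicks[row["query"]] += 1

for query, n in sorted(searches.items(), key=lambda kv: -kv[1]):
    abandoned = n - clicks[query]
    if n >= 20 and abandoned / n > 0.5:
        print(f"{query!r}: {abandoned}/{n} searches with no result click")
```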
Aaagh, Aaron, I've only just seen your comment - sorry!
My method for determining what should be a Best Bet is to examine the Summon search queries, identify which queries were probably not aimed at retrieving 'article' type information, then replicate the search and see how well it answered my perception of what the user was looking for.
So a journal title will almost always return a link to the journal. Databases are more 50/50, so I've been adding tags to the Recommender for some Australian dbs with 'colloquial' alternative titles.
Other searches are clearly for services, e.g. a search for 'exams' is most likely looking for our collection of digitised past exam papers, so I've added that to Best Bets.
For my first run I limited myself to queries that had been issued more than 40 times in the first 6 months of 2013. I think I'll do that job every 6 months and slowly work my way down to less frequently issued searches.
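The frequency cut itself is mechanical; something along these lines over a query-log export would do it (the file name, column names and date format are just my sketch):

```python
import csv
from collections import Counter
from datetime import date

# Pull out queries issued more than 40 times in the first half of 2013,
# the working set for a Best Bets review. File layout is assumed.
START, END = date(2013, 1, 1), date(2013, 6, 30)
MIN_COUNT = 40

counts = Counter()
with open("summon_queries.csv", newline="") as f:
    for row in csv.DictReader(f):       # assumed columns: date, query
        when = date.fromisoformat(row["date"])
        if START <= when <= END:
            counts[row["query"].strip().lower()] += 1

for query, n in counts.most_common():
    if n <= MIN_COUNT:
        break
    print(f"{n:5d}  {query}")
```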
I think because we present the Summon search box front and centre on our home page we have to cater to all possible searches. Fortunately Summon already takes care of most (LibGuides, catalogue, bib databases, IR, ejournals), so filling the gaps doesn't feel too onerous. My one stumbling block is users searching for subject codes (I assume for course readings); I haven't yet figured out the best way of handling those, as our course readings software presents some challenges to a unified index.
I've only just started adding Best Bets and I'm using Google Campaigns to track the use it generates.
Our staff have also included campaign strings in some university-wide orientation material, QR codes and some other places. So far what it shows is that these referrers aren't doing much referring.
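For what it's worth, the tagging is nothing more than the standard utm_* campaign parameters appended to a link's URL, roughly like this (the parameter values and URL are my own illustration, not anything official):

```python
from urllib.parse import urlencode

def campaign_url(target: str, source: str, medium: str, campaign: str) -> str:
    """Append standard Google Analytics campaign (utm_*) parameters."""
    params = urlencode({
        "utm_source": source,      # where the link lives, e.g. "summon"
        "utm_medium": medium,      # link type, e.g. "best-bet"
        "utm_campaign": campaign,  # which initiative to report under
    })
    sep = "&" if "?" in target else "?"
    return f"{target}{sep}{params}"

# Example: tag the past-exam-papers Best Bet so its clicks show up
# under a "best-bets" campaign in GA's reports.
print(campaign_url("http://library.example.edu/exams",
                   "summon", "best-bet", "best-bets-2013"))
```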
Will try and get to that article and possibly answer your question after digesting it.
Cheers, Alan.