Wednesday, December 18, 2013

Connecting the clicks: using usage data, APIs, jQuery (and some common sense) to improve our users’ online library experience.

 [Today’s guest blogger is Ben Durrant of the University of St. Thomas. The following is a summary of his lightning round presentation given at CLIC’s Hush the Shushing Conference on October 24. During the lightning round session at the Conference, each CLIC member library presented an overview of a recent project focusing on the library’s users. Ben presented, "Counting the Clicks," which looked at using usage data, APIs, jQuery (and some common sense) to improve our users’ online library experience.]





The online library user experience is complicated – you have clusters of different systems, like your library website, a catalog, your research guides, your databases, Learning Management Systems, ILL, your proxy server, A-Z lists, etc. (and even with some form of next-gen/discovery you are only combining 2 or 3 of these systems). Plus, users are coming from and going between these systems in all sorts of ways – the ways we think (and hope) they use them, but also from Google searches, other campus sites, and social media posts. On top of that, these systems often have different interfaces and terminology.

It is all very confusing!

 So, how do we find out what users are doing? Where are the pain points? What are some things that we can do to help? Four quick options are:

·       Analytics is an obvious and easy place to start, but look for some of the outliers – unexpectedly popular pages and landing pages. With a bit of extra work/coding you can also do event tracking to get beyond what pages users are visiting to where they are clicking, and other scenarios (a sketch follows this list).
·       Another great thing to do is keep some sort of transaction log from your reference/circ/tech desks, IM, and other service points. We use SharePoint for this but there are other options. It’s great for seeing recurring or timely problems. We watch it daily to identify and troubleshoot access problems, and over the longer term to pick up usage or problem trends that we can address with systems changes.

·       Some form of in-person user testing is very helpful – give users tasks to complete and watch to see whether they are able to finish each task, how long it takes, and what steps they take in the process.


·       Finally, ask your staff! Reference librarians and student workers will often know the common problem spots.
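For the event tracking mentioned in the first bullet, here’s a minimal sketch using jQuery and the classic Google Analytics ga.js queue (current as of 2013). The .database-link selector is an assumption for illustration; use whatever marks the links you want to track.

    // Track clicks on database links as Google Analytics events
    // (classic ga.js _gaq queue; newer analytics.js uses ga('send', ...)).
    $(function () {
      $('a.database-link').on('click', function () {
        // Category / action / label show up in GA's Events reports.
        _gaq.push(['_trackEvent', 'Databases', 'click', $(this).text()]);
      });
    });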

After doing all four of these items, here are the major problems we saw:
·       Using the wrong tool or site for what they are looking for, and confusion about the different systems and what each is for. For example: citation and subject searches in A-Z Journals or in LibGuides. “Search box syndrome” – if there’s a search box, people will use it!
·       Search problems – wrong search type, word order, spelling errors, etc. Different systems all handle these issues differently and give different results.

So, now what to do with the information you’ve gathered?
I break it down into two categories – static (manual) changes and dynamic (API, jQuery) changes – and below I’ll give some examples of both.

·       Manual changes
o   Example: based on our analytics and question logs, we create home page ads, blog posts, and social media posts on our website to respond to the previous day’s issues and events.
·       Dynamic changes
o   Example: a popular databases list automatically populated from the previous day’s analytics using APIs (a sketch follows).
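Here’s a rough sketch of how a list like that can be rendered client-side. It assumes a small server-side endpoint (/api/popular-databases, hypothetical) that queries the analytics API for yesterday’s top database pages and returns JSON like [{"title": "JSTOR", "url": "..."}, ...]:

    // Build a "Popular databases" list from yesterday's analytics,
    // fetched via an assumed local proxy endpoint.
    $.getJSON('/api/popular-databases', function (rows) {
      var $list = $('<ul class="popular-databases"></ul>');
      $.each(rows.slice(0, 5), function (i, row) {
        $list.append($('<li></li>').append(
          $('<a></a>').attr('href', row.url).text(row.title)));
      });
      $('#popular-databases').empty().append($list);
    });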

Here are some examples of things I’ve done in our various systems:

A-Z Journal List:
·       By making some manual style changes we made the list visually much simpler to use, establishing more hierarchy and sorting print/online holdings.
·       But we also used APIs to get more dynamic details about a journal holding: description, peer-review status, RSS feeds.
·       Used the Summon API to dynamically test whether a title is indexed in Summon and allow a scoped search if it is. This has been very popular – used nearly 3k times this year! Bounce rate is down almost 10%.
After looking at our A-Z journal stats we realized a large percentage of our page hits (nearly 30%) were getting zero results! There were a variety of reasons why this happened, but here’s what we did:
·       Added a “No results” message – DUH! To do this I used jQuery to search the page content for the words “Sorry, your search for returned no results.” If the message was there, I added in a special help message (a sketch follows this list). View an example.
·       Added a step-by-step help guide – did you mean to search using “title contains” rather than “begins with”, etc.? This uses a tool named JoyRide and jQuery.
·       Created an auto-complete search input using the Summon and Ulrich’s APIs. The Ulrich’s API had the advantage of being faster (since it covers only journals, not also articles, books, etc.), but it had the disadvantage of not being able to limit to our holdings. So we went with a Summon API search, limited to journals and to our holdings (a sketch follows this list).
·       For the case where someone was using the wrong tool (searching for a subject, or a citation) we added helper links to Summon from the “No results” page in the Journal portal. In many cases, if the person had been searching in Summon they would have found what they were looking for!
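Two sketches of the ideas above, not our production code. First, the “No results” check: a minimal version that scans an assumed results container (#searchResults is a placeholder id) for the vendor’s zero-results message and injects a help box:

    // Detect the vendor's zero-results message and add our own help.
    $(function () {
      var $results = $('#searchResults');  // assumed container id
      if ($results.text().indexOf('returned no results') !== -1) {
        $results.prepend(
          '<div class="no-results-help">No results? Check your spelling, ' +
          'try "Title contains" instead of "Title begins with", or run ' +
          'your search in Summon.</div>');
      }
    });

Second, the auto-complete input, assuming jQuery UI’s autocomplete widget and a local proxy (/api/title-suggest, hypothetical) in front of the Summon API that returns a JSON array of journal titles limited to our holdings:

    // Suggest journal titles as the user types.
    $('#journal-title').autocomplete({
      minLength: 3,
      source: function (request, response) {
        $.getJSON('/api/title-suggest', { q: request.term }, response);
      }
    });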

Link Resolver
I took a similar approach with our Link Resolver (example).
·       Manual changes to improve the design and readability
·       Added dynamic visual helpers – “try this first, then this…” – again using JoyRide and jQuery (a rough sketch follows).
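Here’s roughly what a JoyRide tour looks like, from memory of the original Zurb Joyride jQuery plugin (check its docs for the current markup). Each li’s data-id names the element on the page that the tip points at; the ids here are assumptions:

    <ol id="joyRideTipContent" style="display: none;">
      <li data-id="full-text-links"><p>Try these full-text links first.</p></li>
      <li data-id="ill-request"><p>No full text? Request the item through ILL here.</p></li>
    </ol>
    <script>
      // Start the tour once the page loads.
      $(window).joyride({ tipLocation: 'bottom' });
    </script>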

Interlibrary Loan
From talking to staff we found out that many ILL requests were for items that we actually own!
So we added a scoped Summon search box to the Article request process, a catalog search to the Book request process, and a Summon search box scoped to only dissertations to the Dissertation request. Each of these scoped search boxes is also automatically populated from the OpenURL request if one is present (a sketch follows).
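A minimal sketch of the OpenURL prefill. Parameter names follow OpenURL 1.0 (rft.atitle is the article title, rft.jtitle the journal title); the #summon-search field id is an assumption for illustration:

    // Read a parameter out of the incoming OpenURL query string.
    function getParam(name) {
      var pattern = new RegExp('[?&]' + name.replace(/\./g, '\\.') + '=([^&]*)');
      var match = pattern.exec(window.location.search);
      return match ? decodeURIComponent(match[1].replace(/\+/g, ' ')) : '';
    }
    // Prefill the scoped search box if the OpenURL carried a title.
    $(function () {
      var title = getParam('rft.atitle') || getParam('rft.jtitle');
      if (title) {
        $('#summon-search').val(title);
      }
    });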
Since doing this we have found that the number of requests for items we own has dropped by nearly half in each of the last 3 years!

LibGuides:
Through analytics and testing we found that many users weren’t going past the first page of a guide. We also heard from our staff and the LibGuides email list that students often don’t seem to see the navigational tabs. So we changed our page design – we simplified the header by moving things to the footer, and made design changes to make the tabs/pages more obvious.
We created boxes for popular databases and guides, ILLiad, Summon, catalog searches, etc. that our staff could re-use in their own guides.
Using APIs from LibGuides we created a “Related guides” dynamic link that gets added to each guide to cross-link to other guides in the same subject area (a rough sketch follows).
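A rough sketch of the “Related guides” box. The endpoint here is hypothetical (the real LibGuides API differs by version); assume it returns a JSON array of {name, url} objects for guides sharing the current guide’s subject:

    var subject = 'History';  // in practice, read from the current guide's metadata
    $.getJSON('/api/related-guides', { subject: subject }, function (guides) {
      var $box = $('<div class="related-guides"><h3>Related guides</h3></div>');
      var $list = $('<ul></ul>').appendTo($box);
      $.each(guides, function (i, g) {
        $list.append($('<li></li>').append(
          $('<a></a>').attr('href', g.url).text(g.name)));
      });
      $box.appendTo('#guide-sidebar');  // assumed container id
    });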
Library catalog:
We created a dynamic mapping tool so that once a user finds an item in the catalog they can find it on the shelf. This feature is very popular, even with staff and student workers. Traffic comes from catalog and journal pages, not from our site – the tool needed to be in the flow of the user. From this example page, click the location link (St Thomas – O’Shaughnessy Frey Library Stacks).

Campus Search:
Your campus search engine is another important tool.
·       We added keymatches/ads for our databases – these ads were clicked over 1200 times in the last year.
·       Enabled auto-complete, based on actual search usage data.
·       Even if you don’t have the level of access to make these sorts of changes directly, you should be able to submit requests for keymatches/ads to your administrator.
·       Also, you can make sure your content is indexed/findable by submitting URLs for your sites to the administrator. You’ll want to do this for all domains/sub-domains where your content is located to make it searchable – this would include your website, research guides, repositories, digital collections, etc.

LMS/Blackboard:
We created a custom landing page on our website for users coming from Blackboard.
·       Removed the normal page header, since the page loads inside a frameset, which made the header look strange.
·       Set all links to open in a new window so the user doesn’t lose their place in Blackboard when they click a link (a sketch follows this list).
·       Tracked usage
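A minimal sketch of the first two tweaks (#site-header is an assumed id for the normal page header):

    $(function () {
      // Hide our header when the page is loaded inside Blackboard's frameset.
      if (window.self !== window.top) {
        $('#site-header').hide();
      }
      // Open every link in a new window so users keep their place in Blackboard.
      $('a[href]').attr('target', '_blank');
    });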
We’ve also created custom faculty and student content that displays after logging into Blackboard.
Finally, we built a custom Blackboard Building Block that allows an instructor to easily choose and embed links to research guides inside of a course.

Changes to our website (during a redesign):
·       Simplified – less content and links
·       Used tabs and accordions to cluster content and not display all at once.
·       Auto-complete when possible (database list, search, Summon, journal title)
·       Added help buttons
·       Added chat widgets across all sites/systems where possible.

In Summary:
·       Look at your usage patterns and try to find problem areas
·       Make changes/additions that make sense in the context of the system you are in.
·       Try to guide the user rather than showing every option and link on a given page.
·       Start small – what are some manual content additions and linking strategies that could help your users in a complex system?
·       Make small changes and add analytics code to measure whether users are trying them.

Friday, December 13, 2013

CLIC Board appoints Implementation Team

Newly appointed by the CLIC Board, the Implementation Team has already started work on its charge of coordinating the implementation of CLIC's library services platform (LSP).


Implementation Team members (left to right around the table): Rhonda Gilbraith, Greg Argo, Jon Neilson, Steve Waage, Ruth Dukelow, Mike Bloomberg, Nate Farley, Dani Roach, and Amy Shaw.

In addition to working on the deliverables section of the license, the Team is planning a Kick-Off Event in January for CLIC member library staffs to learn about the 18-month phased-in deployment of the LSP components.

Watch the CLIC Blog, clic-announce list, and CLIC's Facebook page for further developments.

Tuesday, December 10, 2013

2013 CLIC Awards Luncheon

Congratulations to the 2013 winners of CLIC Awards!



Greg Argo of Concordia University, St. Paul, won the CLIC User Service Award. This award is presented annually to an individual within CLIC who has done the most to improve service to users during the past year.

One nominator wrote in support of Greg: "His grasp of library and web technology is very sophisticated and he often notices things that could easily go unnoticed. He recognizes possibilities and takes the initiative to help make it happen. He has the ability to take complicated problems, break them down and analyze them and then proceed to explain the issues or problems to the greater community in a manner that is understandable to the rest of us. He has the dual gift of technological expertise and communications skills to solve the problem and then bring the rest of us along. He demonstrates great tenacity and patience in grasping, working through, and then explaining the issues and the solution." Another nominator wrote: "'What is this guy, a Swiss Army knife?' Well, he has proven himself to be so versatile and helpful that I think he really might be, in part!"



The Conference Planning Task Force won the CLIC Group Effectiveness Award for their stellar work in planning and hosting the CLIC annual conference, "Hush the Shushing! What Are Users Saying?" This award is presented annually to a group within CLIC which has best exemplified group action for the benefit of CLIC and its mission. Task Force members included: Augsburg College – Missy Motl, Bethel University – Erica Ross, Concordia University, St. Paul – Greg Argo, Hamline University – Siobhan DiZio, Macalester College – Jesse Sawyer, St. Catherine University – Emily Asch, University of Northwestern – St. Paul – Linda Rust, and University of St. Thomas – Kari Petryszyn.

One nominator wrote in support of the Task Force: "The 2013 Conference Planning Task Force is one of the most pleasant, collaborative, and hardworking committees that I have encountered. From day one, the group coalesced, set their goals, and worked well together throughout the process. The members were generous in allowing everyone to contribute to the decision-making process, and as a result, there was a high level of creativity in putting together a conference that was both fun and informative."



Our luncheon speaker was Joyce Sutphen, the Poet Laureate of Minnesota.  Joyce gave a delightful talk on poetry and stayed afterwards to sign books.



For more photographs taken at the luncheon (with thanks to CLIC photographer, Steve Waage!), see CLIC's Facebook page.

Citation Interpretation as Competition: Putting the Em-PHA-sis on the Learner

 [Today’s guest blogger is Kate Borowske, Academic Librarian at Bush Library, Hamline University. The following is a summary of her lightning round presentation given at CLIC’s Hush the Shushing Conference on October 24. During the lightning round session at the Conference, each CLIC member library presented an overview of a recent project focusing on the library’s users. Borowske designed a game to create a learning experience for students that allows them to practice this skill rather than simply read about it.  This approach puts the focus on the learner rather than on the content.]



One of the many skills our students need as researchers is the ability to identify the type of material cited in a citation.  They’re usually not too motivated to learn this.  Until they come across a citation to something they really want and can’t figure out how to get, it’s not something that’s even on their radar screen.  One of our librarians, Siobhan DiZio, makes a competition out of “decoding” citations when teaching first-year students.  We decided to turn this group activity into a game that would be available to librarians and faculty to embed in LibGuides or Blackboard.  Students can use it as practice on their own or instructors can use it as a class activity.
When a librarian determines there is information a student needs to learn, the first thing we usually think of is to write down all the information we know on a topic so that the student can know everything we know. The content/information drives what we include, how we arrange it, and how we describe it. However, for the learner, it’s not engaging and it’s overwhelming.

If, instead, we create an activity that mimics real life (in academia, anyway), we can focus on the “what do we want students to do?” rather than “what do we want students to know?”   Students don’t need to memorize information about citations; they need to know what to do when face-to-face with one.  We put the focus on their experience rather than the arrangement of content on the page.  This gives the student an opportunity to practice in a game environment, rather than in real life (no one has died from misidentifying citations, but it can lead to frustration and dead ends).   And games are more motivating than reading text; giving the learner control improves engagement and motivation.  And competition ramps it up even more.  

The game we developed is very simple:  students simply drag-and-drop a citation to the correct place:  books, book chapters, articles, and web pages.  If they drag it to the incorrect box, it bounces back to the starting point.  If they drag it to the correct place, it stays.   There is no information on how to tell these citations apart.  Students need to actively examine the elements in order to solve the puzzle themselves, with simple correct-or-not-correct feedback.  It allows them to make mistakes and work their way out of them.  Try it:  Decode the Citation.

When there is something you’d like your students to learn, put the learner at the center rather than the content.   Instead of starting with, “What information do they need to know,” start with, “What behavior are we trying to encourage?”  and “What is the context in which the learner needs to apply this behavior?”  When you approach learning from the learner’s point of view, it is, at first, rather disorienting.  It literally turns your project upside down and forces you to change how you approach it.   For example, if you were to do a lesson on Scholarly v Popular, with content-centered learning, you’d start with, perhaps, a list of characteristics for each.   What happens in real life, though, is that the student is faced with an article and needs to accurately identify it as one or the other.  If this lesson is designed with the learner at the center, it is just like real life:  there’s an article in front of her and she needs to identify it.  She learns by looking closely, making an attempt, and getting feedback.  
It’s actually more difficult and takes more time to design e-learning this way.  It’s also more fun.  In the end, what you’re doing is designing a learning experience rather than composing text.
One last note:  I used a product called ZebraZapps to make this.

Friday, December 6, 2013

Everything You Wanted to Know About the Library But Were Afraid to Ask: Meeting the Library Needs of Adjunct Faculty

[Today’s guest blogger is Anika Fajardo of St. Catherine University. The following is a summary of her lightning round presentation given at CLIC’s Hush the Shushing Conference on October 24. During the lightning round session at the Conference, each CLIC member library presented an overview of a recent project focusing on the library’s users. St. Kate's project focused on meeting the needs of the elusive adjunct faculty by creating an online video to introduce this user to the resources available from the library.]


When thinking about library users, the needs of adjunct faculty often fall through the cracks. Because of their part-time and/or temporary status, routing information to this group can be challenging. They often have gaps in their knowledge of library services and can end up feeling like second-class citizens. 

In order to address the needs of adjunct faculty, I knew I needed some method of reaching out to them that didn’t require on-campus attendance, would be easy to fit into their busy schedules, and—hopefully—would be relatively painless to endure. I chose to create an online video to introduce adjunct faculty to the library and its services.
Creating a Film for Library Users in 10 Easy Steps
  1. Determine the user type and their limitations/needs
  2. Get buy-in from stakeholders
  3. Set goals and objectives
  4. Create an outline
  5. Write a script
  6. Recruit “volunteers” and other resources available in your library (including talent and equipment)
  7. Film
  8. Set aside time for post-production
  9. Share, share, share
  10. Celebrate

The resulting 8-minute film was far from perfect but provided a way to push information to users. The film is available on YouTube and is linked from the St. Catherine University Library’s LibGuides.


Monday, December 2, 2013

Who Are You? Getting to Know Our Users at Concordia

[Today’s guest bloggers are Greg Argo and Zach Moss of Concordia University, St. Paul. The following is a summary of their lightning round presentation given at CLIC’s Hush the Shushing Conference on October 24. During the lightning round session at the Conference, each CLIC member library presented an overview of a recent project focusing on the library’s users. Concordia's project focused on updating their perceptions of their students using Concordia's Institutional Research reports. By moving the data into Google Drive and Google Sites, they were able to provide user-friendly visualizations and will be able to track changes to the student body.]





We’ve served a very diverse student body for some time, but we never definitively knew in what ways and to what extent. Our less visible student groups (online students, cohort students who come once a week for evening class, traditional students who just don’t come to the library) were mostly missing from our general impressions. It was difficult to tell which of the people in our library building were students and which were non-students. And even if we could, a general impression is only so reliable.
One day this spring, as we perused Concordia’s new internal-facing website, we happened upon the school’s Institutional Research reports. Curiosity about graduation rates led to poking around, which reacquainted us with this valuable data source, some of which was rather surprising. The need to share this with staff was obvious, but we thought this information should be moved out of yearly reports comprised of tables and reported the same way we were planning to report our usage statistics: longitudinally and visually. This treatment would allow us to update our perceptions, centralize data for better decision making, and reexamine processes.

We performed the data transfer and visualizations with Google Apps. Google is our school’s enterprise system, it is free, and its suite of software is highly integrated. That made it easy for us to create spreadsheets, graphs, and charts which we could then easily insert into a Google Site. Using Google Apps also helps us protect that info and decide how we will share it across our domain. The site we created ended up consisting mainly of charts and graphs, with a few data tables and annotations added for clarity. One of the harder parts of the process was maintaining consistent verbiage and coloration for variables across different charts.





Once the site was complete, we had reports for each node and subnode of this nested structure:
Age
  • Undergrad vs. Grad
Degrees Granted
  • By College
  • By Major
Departmental Enrollment and Degrees Granted Comparison
  • Art
  • College of Business and Organizational Leadership
  • Communications
  • Criminal Justice
  • College of Vocational Ministry
  • Education
  • English
  • History
  • Kinesiology and Health
  • Math
  • Music
  • Psychology
  • Science
  • Sociology
  • Theatre
Enrollment
  • By College
  • By Major
Faculty
Graduation and Retention Rates
  • National Comparison
Race/Ethnicity
Religious Affiliation
Sex
Traditional v. Non-Traditional Enrollment
Undergraduate vs. Graduate Enrollment
Undisclosed Responses

Here are some loose classes of findings, and examples:
  • Extent: How many students are LCMS Lutherans? How many earned a degree in Art last year?
  • Gradation: How many undergraduate students are below the age of 22? How many Education majors are graduate students?
  • Longitudinal: How has the proportion of graduate and undergraduate students changed over the last ten years?
  • Comparison: Does a drop in usage of a database that specializes in X coincide with a drop in majors enrolled to study X?
  • Trends: When might we expect graduate students to make up 50% of the student body headcount? FTE?
  • Headscratchers: Why have students become more likely to disclose their race, but less likely to disclose their religious affiliation in the last 3 years?
We look forward to having this on hand at all times to inform decision making. Already this work has helped us reshape our budget allocations in general, and it will continue to do so at an even more granular level as we go forward. Format considerations will be informed by the types of users in each program. Combined with our database and circulation statistics, the user data will aid us in estimating library cost for each program. This data will also inform our outreach strategies and could impact staffing decisions. As we add more areas to our Internal Research site – InterLibrary Loan, Serials, Library Instruction – more complex comparisons will be possible. As promising as all these analyses are, it also feels great to be able to say, “yes, we know who our users are.”