Tuesday, November 27, 2012

CLIC Circ Committee Tackles Loan Rules


Although the future of our ILS is undecided, the Circulation Committee is working to streamline the Loan Rules table (the system rules that control the circulation of all materials).  Many changes have accumulated over the past 10 years, and we are determined to make the current list of over 150 rules more manageable and better coordinated, rather than super-customized.



Becky showed a spreadsheet of the rules at this morning's meeting and talked about elements of the rules that each school could work on, making minor adjustments to existing rules rather than creating new ones.  The group will continue this work over the next year or so.
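
As a rough sketch of how such a consolidation pass might begin, the exported spreadsheet can be scanned for rules that share identical circulation behavior and are therefore candidates for merging. The file and column names below are invented for illustration; an actual ILS loan rules export would use different fields.

```python
import csv
from collections import defaultdict

# Invented column names; a real loan rules export would differ.
KEY_FIELDS = ("loan_period", "renewals", "fine_rate", "grace_period")

groups = defaultdict(list)
with open("loan_rules.csv", newline="") as f:
    for row in csv.DictReader(f):
        behavior = tuple(row[field] for field in KEY_FIELDS)
        groups[behavior].append(row["rule_number"])

# Rules with identical circulation behavior are merge candidates.
for behavior, rules in sorted(groups.items(), key=lambda kv: -len(kv[1])):
    if len(rules) > 1:
        print(f"{len(rules)} rules share {dict(zip(KEY_FIELDS, behavior))}: {rules}")
```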

Wednesday, November 14, 2012

Assessment at St. Catherine University


The following post is submitted by guest blogger Sue Gray of St. Catherine University Library, Minneapolis Campus, and summarizes her presentation at CLIC's Kick-off Program on October 26.

For the past two years, St. Catherine University has been preparing for accreditation by the Higher Learning Commission (the site visit will take place in February 2013). Carol Johnson, our library director, is on the team charged with preparing for the site visit and with the self-study report being written in advance of it.



As a part of this process, St. Kate's has developed a university-wide model of systematic program evaluation called the Outcomes Based Assessment Plan (OBAP).  This model is the recommended format for each academic department to describe its assessment process, and a method for aligning existing assessment activities with the University's mission, goals, and strategic plans.  Each department's OBAP has a mission statement, student learning and program effectiveness goals, measurable outcomes, and a delivery plan.

Each unit of the St. Kate's libraries, including archives and media services, created a departmental OBAP.  Goals ranged from preserving collections for future scholarship, to collaborating with academic departments to create a rigorous research agenda, to preparing classrooms for digital/lecture capture.  As an example of the OBAP goal-delivery-outcome process, one of our goals has been to develop a robust online library presence.  Our web librarian used User Experience (UX) design principles and practices to initiate a redesign of the library website.  User interviews were conducted in May 2011, and repeated in May 2012, to assess the library's homepage.  Comprehensive personas were developed to capture the core needs of users.  Based on the interviews and personas, target users and their needs were identified, and the homepage was redesigned to address those needs.



Many of our ongoing assessment activities address gaps in our collection.  One librarian used Journal Citation Reports as disciplinary benchmarks to analyze our holdings in Nursing, Physical Therapy, Occupational Therapy, and Public Health, to determine whether we have adequate access to high-impact journals in these disciplines.  This analysis created the foundation for a journal purchase priority list of titles not held that are a high priority for St. Kate's faculty and students.
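
A minimal sketch of that gap analysis, assuming two exported CSV files whose names and column headers are invented here (a JCR list with titles and impact factors, and a local holdings list), might look like this:

```python
import csv

# Titles we already hold, normalized for a rough string match.
with open("holdings.csv", newline="") as f:
    held = {row["title"].strip().lower() for row in csv.DictReader(f)}

# High-impact journals from the JCR export that we do not hold.
priorities = []
with open("jcr_nursing.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["title"].strip().lower() not in held:
            priorities.append((float(row["impact_factor"]), row["title"]))

# Highest-impact unheld titles first: the seed of a purchase priority list.
for impact_factor, title in sorted(priorities, reverse=True):
    print(f"{impact_factor:5.2f}  {title}")
```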

Circulation staff pulled 25 recent master's and doctoral theses from each department and gathered titles from the reference lists to see what we owned and what would have needed to be requested through ILLiad.  (The other not-so-surprising finding was that graduate students need more citation support; in response, two librarians will offer an intensive APA workshop at the beginning of winter semester.)  Evaluating copyright costs and VDX requests has also helped us identify frequently requested journals that we should purchase.
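
For that last step, a simple tally of requests per journal is enough to surface purchase candidates. A minimal sketch, assuming an ILL/VDX request log exported as CSV (the file and field names are invented):

```python
import csv
from collections import Counter

with open("ill_requests.csv", newline="") as f:
    requests = Counter(row["journal_title"].strip() for row in csv.DictReader(f))

# The most frequently requested journals are subscription candidates,
# especially once per-use copyright fees approach a subscription's cost.
for title, count in requests.most_common(10):
    print(f"{count:3d}  {title}")
```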

As for next steps, one of our priorities is to develop goals and learning outcomes for "The Reflective Woman," a core course for first-year students.  Five librarians, across both campuses, teach a library instruction component for each class.  While individual librarians have implemented their own assessment strategies (paper and online feedback surveys, and pre/post testing of students' confidence in their research skills), we do not have a standardized method for assessing student learning across all sections.  We hope to hold an assessment workshop, conducted by staff from the Office of Institutional Research, Planning & Assessment, to help us move forward on this goal.

Tuesday, November 13, 2012

LibQUAL+® at Northwestern

The following post is submitted by guest blogger Jessica Nelson Moore from Northwestern College.



In order to assess library service quality, the Berntsen Library at Northwestern College administered the LibQUAL+® survey to NWC’s students and faculty in 2008 and 2012. 

The LibQUAL+® survey assesses library patrons' perceptions of 22 aspects of library service quality, grouped into three major dimensions: Affect of Service, Information Control, and Library as Place.

  • Affect of Service refers to users' perceptions of the service given by library staff.  Questions include "Employees who understand the needs of their users" and "Giving users individual attention".
  • Information Control refers to users' perceptions of access to information.  Questions include "Making electronic resources accessible from my home or office" and "Print and/or electronic journal collections I require for my work".
  • Library as Place refers to users' perceptions of the library's environment.  Questions include "Library space that inspires study and learning" and "A comfortable and inviting location".

One of the benefits of LibQUAL+® is that it shows not only the perceived level of service for each dimension but also the desired level; the results show not just how patrons felt about each aspect of the library but whether it was important to them.  For example, in the 2012 survey undergraduates ranked their top five desired (most important) aspects of library service quality as "Employees who are consistently courteous", "Employees who deal with users in a caring fashion", "A library web site enabling me to locate information on my own", "Willingness to help users", and "Making information easily accessible for independent use".  Three of these five were also among the top five areas of highest perceived service quality.  To put it another way, of the top five things undergraduates think the library is doing well, three were among the five things they most cared about.

Survey results are plotted graphically on a circular chart, referred to as an "Antarctica chart" because of its resemblance to the polar continent.  The yellow band represents areas where the perceived level of service was less than the desired level (a wider yellow space indicates a wider "gap" between the desired and perceived levels for those attributes).  The blue band represents areas where the perceived level of service was greater than the minimum level, i.e., adequate service.  Green represents areas where the perceived level was greater than the desired level, i.e., superior service.  Any red on the chart would indicate that the perceived level of service fell below the minimum acceptable level.
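
As a rough illustration of how those color zones follow from the three scores LibQUAL+® collects for each item (minimum, desired, and perceived, on a 1-9 scale), here is a small sketch; the item scores below are invented for the example, not Northwestern's actual results.

```python
def classify(minimum, desired, perceived):
    """Map one item's mean scores to the chart's color zones."""
    if perceived < minimum:
        return "red: below the minimum acceptable level"
    if perceived >= desired:
        return "green: superior service (perceived exceeds desired)"
    # Between minimum and desired: adequate (blue), with a yellow
    # gap remaining between perceived and desired.
    return (f"blue, yellow gap: adequacy {perceived - minimum:+.1f}, "
            f"superiority {perceived - desired:+.1f}")

# Invented scores: (minimum, desired, perceived) per survey item.
items = {
    "Willingness to help users": (6.5, 8.2, 8.4),
    "Library space that inspires study and learning": (6.0, 7.8, 6.9),
    "A comfortable and inviting location": (6.2, 7.5, 5.9),
}
for name, (minimum, desired, perceived) in items.items():
    print(f"{name}: {classify(minimum, desired, perceived)}")
```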

A huge benefit of LibQUAL+® is the comments that respondents can offer with their responses; about a third of our respondents left one.  Many of the comments suggest specific areas for improvement, such as noise level.  Example: "Once in awhile it is too noisy, but mostly it works great!"  Other comments reveal the aspects patrons most appreciate about the library, such as reference service.  Example: "The reference librarians at NWC are amazing. Not only are they incredibly helpful, they are also very friendly."

After the 2008 survey, results were communicated to library staff and other stakeholders such as the campus Assessment Steering Committee.  Changes were made based upon the survey feedback, including updating library computers with new software and enhanced printing capabilities, and adding lighting and electrical outlets to study spaces.  A follow-up focus group was conducted to further explore what students wanted from their library.  The staff of the Berntsen Library looks forward to analyzing the results of the 2012 survey, especially in comparison to the initial benchmark survey from 2008.

Monday, November 12, 2012

Assessment at Concordia

The following post is submitted by guest blogger Greg Argo of Concordia University Library, St. Paul, and summarizes his presentation at CLIC's Kick-off Program on October 26.


The CLIC Year of Assessment has been a good motivator for the Concordia Library to prioritize assessment activities. We worked intensely from 2005 to 2008 on information literacy assessment initiatives for a university-wide grant project, and the intervening years have seen various independent assessments like website screenshot markups, Faculty Retreat "captive audience surveys," and Google Analytics. Now it is time for us to get back into the assessment mindset, and we will be applying assessment activities toward both (1) our users and (2) our student workers.

The overarching goal of our next phase of assessment with users is to build something more systematic, repeatable, and sustainable than what we've done in the past. More than measuring how much users like us or how good we are, we'd like to measure how the library is being used, users' familiarity with specific services, and how we might better meet users' expectations. To that end, we've created survey tools for both students and faculty.

Our student survey is short and varied; it was modeled on existing library assessment programs (the University of Washington Triennial Survey, for instance) and on design suggestions from student workers. One off-the-wall suggestion we will try, in hopes of a greater completion rate, is a less stilted tone: replacing Likert scale values like "very important" and "somewhat satisfied" with scales (still labeled with the familiar numerical continuum) bearing less formal values like "not too shabby" and "speed of light." We are hoping to get a clearer picture of where students start their research and which of our services they are aware of. Students also gave us good suggestions about how to increase our response rate: we will follow up with each user we touch via the various library processes (chat, email, interlibrary loan, account problems, and hold shelf notifications), as well as try a trickle-down effect through student groups.

Our faculty survey focuses on our partnerships with them, with a mix of open-ended questions and questions whose answer options may introduce ways of working with the library that we haven't been able to suggest before. We're hoping to learn their preferred strategies for communication and outreach, any and all ways they may be interested in integrating the library into their teaching, and the fail points in their use of the library. We are also treating assessment as an opportunity for outreach, so each faculty member will be contacted individually by their subject liaison librarian, both to ask them to take the survey and to start or rekindle a working relationship.

Starting this year, we've hired more student workers as student supervisors in hopes of increasing the productivity and accuracy of student work. These supervisors are playing an important role in new student-worker assessment initiatives: they track and correct the shelving mistakes of new student workers, leaving details about each mistake, and they submit "student supervisor end of shift" forms that record who was working, what tasks the student workers accomplished during the shift, whether homework was done, and so on. Mandatory student worker quizzes on pertinent topics are also being assigned every few weeks. All of this information will help Zach, the student worker supervisor, keep track of who has learned what, and will inform his evaluations.

All of our assessment activities are being executed with the help of Google Forms. Its functionality has gotten our creative juices flowing, and we like how it simplifies web authoring, data collection and management, workflow tracking, collaboration, and data presentation. Concordia has implemented Google Apps at the enterprise level for email and calendar, and the benefits of an integrated platform make Forms, together with the rest of Google Apps, a great tool for our assessment activities.
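
As one example of what that simplification looks like in practice: Google Forms collects responses into a linked spreadsheet, which can be downloaded as CSV and summarized in a few lines. The file name and question wording below are invented, not our actual survey.

```python
import csv
from collections import Counter

# Responses downloaded from the form's linked spreadsheet as CSV.
with open("student_survey_responses.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Tally answers to one question (wording is illustrative).
starts = Counter(row["Where do you usually start your research?"] for row in rows)
for answer, count in starts.most_common():
    print(f"{count:3d}  {answer}")
```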

Friday, November 2, 2012

UST Assessment Studies


The following post is submitted by guest blogger Marianne Hageman, University of St. Thomas (UST) Libraries, and summarizes one of UST's presentations at CLIC's Kick-off Program on October 26.

Have you ever wondered about the information literacy skill levels of your incoming freshmen?  At UST, we’ve asked that question several times over the years. 


We began the process of finding an answer when my colleague, Donna Nix, came back very excited from ALA in 2008. She'd attended a session by Kate Zoellner and Charlie Potter of the University of Montana, reporting on their study of high school media specialists' perceptions of high school students' preparedness for university-level research, looking particularly at information literacy skills.  They'd based their research on the methodology of Islam and Murno. [Islam, R. L., & Murno, L. A. (2006). From perceptions to connections: Informing information literacy program planning in academic libraries through examination of high school library media center curricula. College & Research Libraries, 67(6), 496-514. http://crl.acrl.org/content/67/6/491.full.pdf+html]


Donna wanted to replicate this research. It sounded cool to me, so I said, "Can I play, too?" That led to our first research project, in 2009-2010, in which we interviewed Catholic high school media specialists at 15 schools in the Midwest (ask us sometime about the January ice storm in Iowa).  The next year, we surveyed St. Thomas faculty who teach introductory research classes in their disciplines to get their take on the IL skills of their students.


Here are some things we learned:
  • Few of our faculty think our students do even "fairly well" on any of the skills.
  • By comparison, the Catholic high school librarians we surveyed think students are doing pretty well on IL skills.
  • Three of the skills our faculty rate as most important, they expect students to have mastered before they hit that first research class: plagiarism/citation style, developing a thesis statement, and brainstorming questions.
  • Two of the skills they rate as most important, they expect students to develop in that first research class: determining the authority, accuracy, timeliness, and bias of sources, and selecting appropriate resources.

Thursday, November 1, 2012

CLIC Assessment Workshop Topics Survey

CLIC is hosting a series of mini workshops on assessment in early 2013. The Assessment Program Planning Task Force is in the process of selecting topics for three mini workshops. Your answers on the survey will assist us in developing the workshop agendas. Please respond by Thursday, November 15, 2012.

Go to this link to take the survey: http://www.surveymonkey.com/s/8SHZ2SF

For a list of the upcoming workshop dates, please see the Assessment Program Planning Task Force's web page on CLIC Direct at:
http://clic.edu/newclic/dir/asmtplan/asmtplan.asp

If you missed our first program on October 26, "A View from the Field: Overview of Recent Assessment Activities in CLIC Libraries," the kick-off program for CLIC's Year of Assessment, we are posting summaries of the presentations on CLIC News over the next two weeks.  See http://clic.edu/newclic/dir/asmtplan/asmtplan20121026.asp for links to presentations, summaries, handouts, and resources.

Assessment in Action: Academic Libraries and Student Success – ACRL IMLS Grant-Funded Program


The following post is submitted by guest blogger Terri Fishel, CLIC Board member and Director of the DeWitt Wallace Library at Macalester College, and summarizes her presentation at CLIC's Kick-off Program on October 26.

On Friday, October 26th, I spoke to those attending the CLIC Kickoff on Assessment about an upcoming grant opportunity.  I currently serve as vice-chair/chair-elect of the ACRL Value of Academic Libraries committee; Value of Academic Libraries is one of three major areas of the ACRL Plan for Excellence.


I wanted to talk about the ACRL grant funded by the IMLS because of the opportunities it presents to individual CLIC libraries and their campuses, and to CLIC collectively.  The grant focuses on student learning: both demonstrating the value the library contributes to student learning and using our assessment methods to improve our work and its outcomes.  This is an opportunity to work closely with partners on your campus outside of the library.  If you haven't already started to work with your institutional research colleagues, or gotten the attention of your campus assessment committee or the committee working on student learning outcomes, this is your opportunity.  (On some campuses, assessment falls under the committee that oversees the accreditation process.)  At the session last Friday, only a few attendees knew the people who manage their campus learning assessment program, and only a very few CLIC institutions had librarians serving on campus committees that work on student learning outcomes.  So this grant also provides an opportunity to get to know the members of your campus assessment team.  You can read more about the grant here (Assessment in Action), but briefly, each participating institution will identify a team consisting of a librarian and at least two additional team members as determined by the campus (e.g., a faculty member, student affairs representative, institutional researcher, or academic administrator).



Please consider applying, and start now to prepare your strategy for campus involvement.  Institutions must apply individually, but if more than one CLIC institution is accepted, we will be able to build on each other's work as a cohort.  Having multiple CLIC institutions involved would also create opportunities to collaborate and to reinforce with our administrations the value we bring to our campuses.  It is also an opportunity to contribute to the broader audience of academic libraries and share our progress and stories of success.

You have time between now and January, when the instructions for applications will be made available, to do some background reading.  If you haven't already, peruse:

  • Standards for Libraries in Higher Education: sample outcomes along with performance indicators and measures; a valuable tool if you're hoping to demonstrate to campus administrators your value and how your work influences student learning outcomes
These resources may help you clearly articulate the value of the library by demonstrating that:
  • your programs help students succeed, which improves retention, which in turn helps the college's bottom line
  • your programs contribute to the positive experiences students have at your institution
  • your programs give students positive experiences that help make them supportive alums who give back to the college
  • and, most importantly, your programs provide the critical thinking skills that help students succeed in college and beyond as lifelong learners

Consider also the important role you play in the lives of the students who work in your library.  We shouldn't underestimate what that experience contributes to the lives of students on our campuses.

I recognize that these aren't your only contributions; we also contribute greatly to the work our faculty members do.  But from the perspective of keeping the library at the forefront of administrators' minds, these are a few points to consider as you develop a frame of reference and a plan for continuous improvement as part of an assessment program focused on student learning.  I hope that you will all consider submitting an application for the Assessment in Action program, and that we will be able to develop best practices and share our stories of success with a broader audience.

Assessment at Bethel University


The following post is submitted by guest blogger Michael Mitchell of Bethel University Library, and summarizes his presentation at CLIC's Kick-off Program on October 26.

At the CLIC Assessment: A View from the Field event, I presented the results of an ethnographic study conducted in the Bethel University Library.  Below are some highlights of what we did and what we found.  But first, let me provide a definition of "ethnography," taken from Oxford University Press's Dictionary of Sociology: ethnography is "a term usually applied to the acts both of observing directly the behaviour of a social group and producing a written description thereof."


We observe our libraries daily, but our ethnographic study aimed to make those observations more intentional.  We used two complementary tactics.  The first was a series of one-on-one interviews conducted in the Bethel University Library, asking users to tell us (1) what they were doing in the library, (2) why they chose the library, and (3) any other information they'd like to provide.  After gathering the data (more than 600 interviews in total), we analyzed the responses for themes about our library users and what they value.  They told us about the benefit of studying in a library that aids their concentration and productivity, and about how valuable it is to have computers and printers available.  They also told us to create more group study rooms.


The second part of our study was a "mapping" project that used a seating chart of the Bethel University Library to map where people were sitting and using computers within the library.  The mapping was done five times a day for a full calendar week, and the results were placed into a relational database.  From this we could see the popular places within our library, the busy and slow times of day, and the spots where people were likely to be using laptops or sitting in groups.  We found that the most heavily used spots were our computer labs, and we determined which days were the busiest.
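
A minimal sketch of the kind of queries such a database supports, using an invented schema with one row per observed person at each mapping pass:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE obs (day TEXT, slot TEXT, zone TEXT, activity TEXT)")
con.executemany("INSERT INTO obs VALUES (?, ?, ?, ?)", [
    ("Mon", "10:00", "computer lab", "desktop"),
    ("Mon", "10:00", "quiet floor", "laptop"),
    ("Mon", "14:00", "computer lab", "desktop"),
    ("Tue", "10:00", "group study room", "group"),
])

# Most heavily used zones across the week.
for zone, count in con.execute(
        "SELECT zone, COUNT(*) FROM obs GROUP BY zone ORDER BY 2 DESC"):
    print(f"{count:3d}  {zone}")

# Busiest observation times.
for slot, count in con.execute(
        "SELECT slot, COUNT(*) FROM obs GROUP BY slot ORDER BY 2 DESC"):
    print(f"{count:3d}  {slot}")
```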


Some of these results are intuitive, and we probably could have guessed most of the findings.  The benefit of the ethnographic study, however, is that now we can say those things with certainty: we have data to back up our gut feelings, and that data is much more convincing when talking to outside constituencies.  The results also challenged some of our assumptions going into the project, showing that it's important to talk to your users to find out what their actual habits and needs are.  We were constantly pushed to think about this data both holistically and case by case, to see what future improvements we can make to our space and the services we provide.