Monday, December 10, 2012

Assessment at Macalester

The following post is submitted by guest blogger Ginny Moran Heinrich, Assessment Coordinator & Instruction Librarian at DeWitt Wallace Library, Macalester College, and summarizes her presentation at CLIC's Kick-off Program on October 26.


At the DeWitt Wallace Library at Macalester College, formal and informal assessment has been going on for a long time, and this year’s CLIC focus on assessment is giving us a good opportunity to reflect on all that we have been doing over time. One of the strengths of the library has been that assessment is not something for which only one person has responsibility. The library has a group of staff involved in assessment planning, bringing expertise and varied perspectives to our approach to assessment. This has broken down many barriers that other institutions may experience, in that there is a spirit of contribution from all staff, using a variety of strategies. The following are some examples of the various ways we have captured information to assess our activities and inform ourselves and our communities.



In 2009, the library started publishing our “dashboard,” which acts a little like an annual report’s executive summary. Our particular implementation is based on a model created by the Indianapolis Museum of Art, and we use it to highlight different things each year, across the library’s operations and instruction programs. While initially this was developed primarily as a promotional vehicle for the library itself, we have found that it is used by other campus audiences in ways not specifically anticipated by the library. In particular, the campus Development Office has found it to be useful:

The library's dashboard offers a wonderful visual perspective on key areas of use. The statistics allow us to communicate specifically about the library's stable and strong relevance in a time of constant change for how information is collected and absorbed. With the dashboard snapshot, our reports become much more than a list of titles and authors; it's clear that students and faculty rely on the library as a vital and integral resource for advancing their scholarship.

In addition, the dashboard is used by the Admissions Office for tour information and by people applying for jobs and practicum positions in the library. 

While the dashboard provides a certain snapshot of library activities, it, of course, has limitations. Lack of context for the information is one of them. While it’s interesting to know the average number of people coming into the library in a week, for example, we don’t know how many of them are unique visitors or repeat visitors, and we don’t know what they’re doing once they’re here. Also, by its nature, it doesn’t include qualitative analysis; for that we look to other tools such as the MISO (Measuring Information Service Outcomes) survey for general satisfaction and individual usage patterns (to be implemented in Spring 2013) and the RPS (Research Practices Survey) for student research competency (administered in Fall 2012). Nonetheless, this tool helps us share our activities with the broader community and provides a touchpoint for us.


Looking at instruction, in Fall 2012 we started taking attendance at instruction sessions and research consultations by collecting student ID numbers and, when possible, course number and section information. For instruction sessions, we either use a Google form open on an iPad, passing the iPad around so students can enter their information, or we obtain a class list from the faculty member and enter the student information from that. To track consultations, we simply ask students for the information. Ultimately, we want to see how the number of information fluency instruction sessions or consultations a student attends affects their performance in various areas. Specifically, to start, we want to look at:
  • Overall grades
  • Grades/performance in capstone and honors projects
  • Critical thinking skills
Eventually we would like to examine any correlations between the number of information fluency instruction sessions or consultations a student attends and other indicators, such as academic department information fluency-related outcomes. Collecting this information will also tell us exactly how many students participate in information fluency instruction, whether as part of a class or as individuals, because the ID information lets us create an unduplicated count. Starting to collect attendance information now lets us begin with the first class to be asked critical thinking questions tied to Macalester’s Statement of Student Learning as part of a survey given to all first-year students. Collecting ID information also lets us gather additional data about student performance without resurveying students, helping us minimize student survey fatigue. That’s a “win” for everyone.
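For readers curious about the mechanics, here is a minimal sketch of how an unduplicated count might be produced from attendance data, assuming the Google form responses are exported as a CSV. The file name and column name below are hypothetical placeholders, not the library’s actual export format.

```python
import csv
from collections import defaultdict

def summarize_attendance(path):
    """Count unduplicated students and sessions per student.

    Assumes a CSV export with a 'student_id' column; both the column
    name and the file name are hypothetical placeholders.
    """
    sessions_per_student = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sessions_per_student[row["student_id"].strip()] += 1
    return len(sessions_per_student), sessions_per_student

if __name__ == "__main__":
    unique_students, per_student = summarize_attendance("attendance_fall2012.csv")
    print(f"Unduplicated students reached: {unique_students}")
    repeaters = sum(1 for n in per_student.values() if n > 1)
    print(f"Students attending more than one session: {repeaters}")
```

The same tally could just as easily be done in a spreadsheet; the point is simply that collecting IDs once lets us answer both “how many students did we reach?” and “how often did we reach them?” without asking students anything further.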



A less formal feedback mechanism we use is our Ask & Tell comment board. While we provide an online comment “box,” the regular pen-and-paper box is used the most. Through this tool, we receive a mix of collection purchase suggestions, comments about space use, and other random questions or comments. We usually receive about one comment per week; our goal is to respond within 24 hours. Responses are posted on the comment board along with the original question. Informal information collection strategies such as this one have helped us choose furnishings and evaluate iPad check-out services. Not every assessment tool has to be a “formal” one to be effective; it just needs the right attention and mindful reaction.

As we are developing our assessment plan for the next couple of years, and are looking across our various activities, it’s exciting to see all the things we’re doing and what we’re learning. Assessment is about asking questions in order to learn and then acting on what we learn; in many ways, isn’t that the heart of a liberal arts education?
