2016 Research Award winners!
Congratulations to 2016 Research Award winners Phil Jones, Julia Bauder, and Kevin Engel of Grinnell College! They were awarded this honor for their article:
Jones, P., Bauder, J., & Engel, K. (2016). Mixed or complementary messages: Making the most of unexpected assessment results. College & Research Libraries, 77(2), 197-211.
Below is a summary of and reflection on their research. Congratulations!
We start with thanks to the ILA/ACRL Awards Committee for reading and selecting our article for this honor. We appreciate, as well, the committee’s generous comments on our work; it is always helpful, and sometimes rare, to receive feedback on scholarship.
Our college was a member of the first cohort of Assessment in Action (AiA), ACRL’s 14-month professional development program that helps academic librarians design, implement, and evaluate a program or service. Our article shares the results of Grinnell’s AiA project exploring what our students learn in representative research literacy sessions from each of our college’s three academic divisions. During the three years we worked on this study and article, we were also reading and sending comments to ACRL on drafts of the Framework for Information Literacy for Higher Education. What we noticed at the time, and have continued to see, is that the ideas and practices emerging from our own research align well with the frames of ACRL’s new set of professional guidelines. We are pleased with this convergence. As you will see below, what our study and the Framework tell us is that we have been underestimating both our students as researchers and ourselves as teachers.
We worked with 200- and 300-level classes in Economics, Spanish, and Psychology (involving second-, third-, and fourth-year students). In each class, students prepared a draft bibliography of sources to be used in a later project. Those draft bibliographies were shared with the librarian, and a research literacy session was planned and delivered based on the course objectives and the questions raised by the students’ work. After the research literacy sessions, the students revised their bibliographies.
Toward the end of the semester, the students answered a survey in which they rated the items on their final bibliographies on a five-point scale for relevance, timeliness, and authority, and answered three open-ended questions. Faculty also answered a survey—rating the items that appeared on both the draft and final student bibliographies.
Contrary to what might have been expected, the results of the surveys showed that faculty rated the student bibliographies highly in each category, whether an item had appeared in the draft or the final bibliography. Students were less confident in what they had found, rating themselves lower, especially in the areas of timeliness and authority. And, based on the faculty ratings, the research literacy sessions did not measurably improve the quality of the student bibliographies.
That was disappointing; based on the quantitative data, it appeared that students were not learning much from the research literacy sessions. However, the qualitative data told a different story. The students stated quite clearly that they were learning and were able to describe in detail what they were learning. Student questions and comments focused less on skills than on concepts, especially source evaluation.
The conclusion is that today’s students know how to search and can find materials that faculty rate highly (though they may not search as efficiently and effectively as librarians would like, or in the places librarians would recommend). Students do have questions, though, about what they find: What is this item? Is it appropriate to use? How can I judge its quality? How can I use the information to bolster my project?
Based on these findings, we came up with a list of recommendations to guide our teaching and our interactions with faculty and students. One recommendation is to maintain the traditional research literacy session content (teaching the mechanics of searching, introducing databases, showing how to access full text, etc.) for courses primarily involving first-year students, and to ask faculty teaching these courses for additional class time to discuss and work on source evaluation.
A second recommendation is to focus on higher-level concepts (source identification, source evaluation, use of information to support an argument, database structure, and more) for courses at the 200 level and above, limiting the class time used for database introductions and searching mechanics.
Building these recommendations and the new Framework into our teaching and the objectives of our academic departments will be a gradual process and will involve continued and substantial communication and collaboration with faculty. Especially since fall 2015, we have reached out to faculty several times for discussion and feedback regarding our study, and we have been experimenting with our teaching.
In the end, the results of our study and the concepts of the Framework point toward the changing information practices of our students and faculty. All of this gets at the very heart of what we do as a library: what we buy, how we buy it, how we make it available, how we structure our services, how we design our spaces, and more. How we respond will determine how successful we are in keeping the library relevant on campus.
Julia Bauder, Kevin Engel, and Phil Jones
Grinnell College, Burling Library