TOWARD A MORE INFORMED PRACTICE:
A closer look at the Library’s contribution to the
research experience of our high school students
At the High School Library of The American School in Japan, we constantly reflect on our contributions to student learning and our collaborative efforts with faculty. But our reflections, although earnest and critical, are based mainly on our impressions and informal avenues of feedback. We have not made concerted efforts to gather outside data, such as examining student work or conducting formal interviews, surveys or other means of user needs assessment. We would like to develop a more evidence-based practice. This study is a first step in that direction. While taking a deeper-than-usual look at a particular research project, I utilized a variety of measures currently at our disposal to better assess our students’ learning experience and the library’s role in it.
A Chance to Experiment
The teacher of our high school’s Modern World History (MWH) course sought out library involvement for the launch of a new research component, the Revolutionary Movement Project. In addition to enthusiastically welcoming a new faculty-library partnership, we had just entered into a LibGuide trial and were actively looking to apply it to specific projects. We also saw the project as a chance to incorporate, for the first time, a user needs assessment into our collaboration on an in-class research project. Brief descriptions of both the research project and the library’s contribution will set the stage.
Description: The MWH Revolutionary Movement Project
Students chose a modern-day or historical social/revolutionary movement to research. Their goal was to determine to what extent their chosen movement fit the criteria of a revolution, a concept they had been studying over the course of the first semester of the year. The 62 students, from three sections of MWH, formed 20 research groups. Each group researched their chosen social/revolutionary movement. The group members collaboratively curated sources into a shared Google Doc and examined their evidence against the criteria of a revolution. As a culminating activity, each group gave an informal presentation to the rest of the class, sharing what they had discovered.
Library support for this project
After discussions with the teacher and a review of the assignment description, rubric and list of student topic choices, the head librarian, Linda Hayakawa, and I supported student research efforts in the following ways:
- LibGuide: We created a project-specific LibGuide, highlighting databases and websites selected for the students’ set of topics.
- Prototype: Linda created a prototype Google Doc repository for group resource discoveries. The intention was to give students a clearer picture of what was expected. The prototype explicitly modeled exploring a range of sources as well as the proper use of images and citations. Both the LibGuide and the prototype were embedded in the MWH course site on Blackboard.
- Small group conferencing: I visited the classroom to give a brief orientation to the LibGuide and prototype. Then, while the rest of the class went about their research, I conferenced with each group. The main goal of the conference was to model an effective search within at least one database well-suited for their topic. We hoped this more targeted database coaching would improve database usage for this project.
I combined three methods to take a post-project, or “after-the-fact,” look at the students’ research experience. I hoped to determine whether students found our project-specific LibGuide and additional support helpful, and simply to experiment for the first time with the assessment tools currently available to us.
1. Analysis of Google Docs
I sought permission to view the Google Doc of each research group; as a repository of their research discoveries, each document would provide a window into the group’s research process. Of the 20 research groups, nine shared their Google Doc. Using the students’ citation trail, I hoped to assess their usage of the databases and curated websites as well as their image use and citation practices.
2. Survey of Students’ Research Experience
I developed a survey to gauge student use and opinion of library support for this project. The survey included four rating-scale questions followed by three questions intended to inspire more detailed comments. The teacher posted the survey form on the course Blackboard site and allowed me 10 minutes of class time to conduct the survey, thereby ensuring a high response rate (85%), hampered only by student absence.
3. Group interviews
I conducted face-to-face interviews with two research groups, each consisting of three students, to ask these more open-ended questions:
Describe how you started your research.
What research paths proved effective?
Of the sources on your Google Doc, which did you showcase for the presentation portion of this assignment?
How did this research activity compare with others you have been involved in?
Can you see the Research Guide being of use in other courses?
Google Document Analysis Results
I conducted a quick scan of the citations in the nine Google Docs. Only two showed clear evidence of effective use of database resources. In both cases, the research strategies discussed in the reference conference had been employed. Five of the nine groups showcased information from the websites curated on the LibGuide. The citation trail in some Google Docs was not always obvious and, in some cases, citation conventions were not carefully followed. Wikipedia was a commonly cited source. Also notable, across the nine documents, was the clever use of images, cartoons and YouTube videos to enliven the final in-class presentations.
Survey Results
Survey results from the three rating scale questions were quite positive for the LibGuide, its database and website resources, as well as the group conferences for targeted database assistance.
A closer look at the written comments section of the survey, though, gave a more nuanced and more realistic portrait of student usage and opinion. A categorization of the written responses revealed a similarly positive overall reception of the LibGuide assistance, but this was tempered with very specific criticisms and insights into what made database searching and the curated websites less than helpful. Examples of such comments from students include:
“I thought the databases were a bit difficult and time-consuming to use compared to the websites on the Research Guide.”
“I found them [the databases] useful, but it was sometimes hard to find a lot of information because our movement was moderately new.”
“There weren’t many websites with information we hadn’t already seen.”
“The websites were useful but some content did not relate to what exactly we wanted to find.”
Group Interview Results
The interview sessions involved two groups.
- The first group stated that they started their research with Wikipedia, finding it a good place to become familiar with a topic about which they had almost no prior knowledge. They felt that Wikipedia gave them a good overall introduction and suggested keywords that they employed with success in other search venues. For their presentation, they used images as prompts to engage the audience. The interview revealed that they found the suggested websites from the LibGuide useful but confirmed that they felt no need to use the databases.
- The second group had not done any research prior to the unveiling of the LibGuide so they found the ABC-Clio World History database introduction in the group conference an extremely useful starting point. After that, they thought their Google searches were more productive. They did not find the rather esoteric set of curated websites for their topic very helpful. For their presentation, they followed an outline summary of their research which they enhanced with images. They felt that, although this research project had a tight timeframe, they were able to go into more depth on a fairly specific topic because they could divide the work with their research partners.
At first glance, the Google Docs of the nine groups were discouraging on several levels.
- The customized database help didn’t have the impact I had hoped. Using a project-specific LibGuide to showcase particularly useful databases and offering targeted database advice for each topic appeared to gain only a small amount of traction within the sample of documents.
- Tracking citations in student work isn’t easy! It became clear quite quickly that the provenance of sources wouldn’t be obvious without a bit more inspection, although this too brought only partial success.
- Citation tracking also laid bare that there is still a lot of work to be done with our students on proper image use and citation. Our prototype modeled a standard that many groups were unable to meet consistently.
- Students seemed to rely more on Wikipedia, YouTube and their own Google searching than on targeted LibGuide information, although there did seem to be good evidence that research groups found the curated websites helpful.
The more discouraging findings were particularly surprising because they conflicted so dramatically with my impression of the effectiveness of the LibGuide and in-class support. In line with our conventional insular reflection practices, I had marked our efforts as a clear success! But, already, my initial data gathering had jarred me out of my self-congratulatory mood!
The Google Doc analysis, while sobering, came into better focus when combined with the survey and interview results. The survey results were generally positive about the usefulness of the LibGuide in general, its curated websites specifically, and even its selected databases to some extent. The written comments from the survey, as well as the remarks by students in interviews, revealed that students often had very good reasons for de-selecting databases in favor of websites, Wikipedia and their own Google searches for their Google Doc final product.
- The LibGuide databases and websites acted as a launch pad to other finds.
- Solid information on their chosen topic was easily available on the Web or, in the case of more current or obscure social movements, not well-covered in our subscription databases.
- The assignment outcome — an in-class presentation meant to inform and engage fellow students — dictated a different set of sources, such as mixed media, than a traditional academic research paper.
- Several comments mentioned the difficulty of database searching. In retrospect, I think the individual group conferencing on databases might be more effective if students have something concrete to refer to later. Perhaps a quick screenshot of the database searches that were effective could be added to the LibGuide, or emailed to each group member, as an after-the-fact reminder.
- Finally, it is worth celebrating that their own discoveries, taken as a whole, often contributed to a forceful presentation of their topic.
The written comments section of the survey also elicited very thoughtful, practical advice from the student researchers about how to improve the LibGuide. For instance:
- “What would make it more useful is maybe adding some videos as well? Some people might have had trouble finding good videos” speaks directly to the fact that we did not adequately account for this project’s presentation format in our LibGuide design.
- “Maybe including the best possible database for each group’s revolution” combined with “putting the databases links on the same page as the … websites may come in handy” suggests a completely different organization of the LibGuide: by topic rather than by source type.
FINDINGS TO INFORM FUTURE ASSESSMENTS
Multiple outside assessment measures strengthen our previously insular reflective practice and help to disabuse us of easy, subjective, first impressions. This process let us experiment with the varied assessment tools that are at our fingertips. In future iterations, the ease of embedding surveys within LibGuides and utilizing LibGuide’s built-in analytics will provide frequent, relatively effortless feedback. Use of these ready analytics will allow us to be more nimble and user-centered in our adjustments. We’ll be able to direct our efforts to more time-intensive assessment methods, such as interviewing students and faculty, analyzing student work and, hopefully in the future, tracking more closely the research process of a sample of students.
Data gathering, such as that gleaned from this user assessment, helps us modify our services in ways that are truly responsive to the needs of our students. It makes learning an exchange, where students teach us as much as we teach them. In addition, it provides evidence to bring to the planning table with teachers. From this more informed position, we can lobby more convincingly for adjustments to assignment rubrics that would explicitly state citation and image expectations and, where appropriate, source usage requirements. We may be able to extend our collaboration with faculty into assessment of student work for information literacy outcomes that have previously gone unmeasured.