Micro User Needs Assessment

TOWARD A MORE INFORMED PRACTICE:
A closer look at the Library’s contribution to the
research experience of our high school students

INTRODUCTION

At the High School Library of The American School in Japan, we constantly reflect on our contributions to student learning and our collaborative efforts with faculty.  But our reflections, although earnest and critical, are based mainly on our impressions and informal avenues of feedback.  We have not made concerted efforts to gather outside data, such as examining student work or conducting formal interviews, surveys or other means of user needs assessment.  We would like to develop a more evidence-based practice. This study is a first step in that direction.  While taking a deeper-than-usual look at a particular research project, I utilized a variety of measures currently at our disposal to better assess our students’ learning experience and the library’s role in it.

A Chance to Experiment
The teacher of our high school’s Modern World History (MWH) course sought library involvement in the launch of a new research component, called the Revolutionary Movement Project.  In addition to enthusiastically welcoming a new faculty-library partnership, we had just entered into a LibGuide trial and were actively looking to apply it to specific projects.  We also saw it as a chance to incorporate, for the first time, a user needs assessment into our collaboration on an in-class research project.  Brief descriptions of both the research project and the library’s contribution will set the stage.

Description: The MWH Revolutionary Movement Project
Students chose a modern-day or historical social/revolutionary movement to research.  Their goal was to determine to what extent their chosen movement fit the criteria of a revolution, a concept they had been studying over the first semester of the year.  The 62 students, from three sections of MWH, formed 20 research groups.  Each group researched their chosen social/revolutionary movement.  The group members collaboratively curated sources into a shared Google Doc and examined their evidence against the criteria of a revolution.  As a culminating activity, each group gave an informal presentation to the rest of the class, sharing what they had discovered.

Library support for this project
After discussions with the teacher and a review of the assignment description, rubric and list of student topic choices, the head librarian, Linda Hayakawa, and I supported student research efforts in the following ways:

  1. LibGuide: We created a project-specific LibGuide, highlighting databases and websites selected for the students’ set of topics. [Home LibGuide]
  2. Prototype: Linda created a prototype Google Doc repository for group resource discoveries.  The intention was to give students a clearer picture of what was expected.  The prototype explicitly modeled exploring a range of sources as well as the proper use of images and citations.  Both the LibGuide and the prototype were embedded in the MWH course site on Blackboard. [Google Doc Prototype]
  3. Small group conferencing: I visited the classroom to give a brief orientation to the LibGuide and prototype.  Then, while the rest of the class went about their research, I conferenced with each group.  The main goal of the conference was to model an effective search within at least one database well-suited for their topic.  We hoped this more targeted database coaching would improve database usage for this project.

METHODS

I combined three methods to take a post-project, or “after-the-fact”, look at the students’ research experience. I hoped to determine whether students found our project-specific LibGuide and additional support helpful, and simply to experiment, for the first time, with the assessment tools currently available to us.

1.  Analysis of Google Docs
I sought permission to view the Google Doc of each research group; as a repository of their research discoveries, it would provide a window into their research process.  Of the 20 research groups, nine shared their Google Doc.  Using the students’ citation trail, I hoped to assess student usage of the databases and curated websites as well as their image use and citation practices.

2.  Survey of Students’ Research Experience
I developed a survey to gauge student use and opinion of library support for this project.  The survey included four rating-scale questions followed by three open-ended questions designed to inspire more detailed comments.  The teacher posted the survey form on the course Blackboard site and allowed me 10 minutes of class time to conduct the survey, thereby ensuring a high response rate (85%), hampered only by student absence.

3.  Group interviews
I conducted face-to-face interviews with two research groups, of three students each, to ask these more open-ended questions:

Describe how you started your research.
What research paths proved effective?
Of the sources on your Google Doc, which did you showcase for the presentation portion of this assignment?
How did this research activity compare with others you have been involved in?
Can you see the Research Guide being of use in other courses?

RESULTS

Google Document Analysis Results

I conducted a quick scan of the citations in the nine Google Docs.  Only two showed clear evidence of effective use of database resources.  In both cases, the research strategies discussed in the reference conference had been employed.  Five of the nine groups showcased information from the websites curated on the LibGuide.  The citation trail for some Google Docs was not always obvious and, in some cases, the citation conventions were not carefully followed.  Wikipedia was a commonly cited source.  Also notable, across the nine documents, was the clever use of images, cartoons and YouTube videos to enliven the final in-class presentations.

Survey Results
Survey results from the three rating-scale questions were quite positive for the LibGuide, its database and website resources, as well as the group conferences for targeted database assistance.

[Survey Summary chart]

A closer look at the written comments section of the survey, though, gave a more nuanced, and realistic, portrait of student usage and opinion.  A categorization of the written responses revealed a similar overall positive reception of the LibGuide assistance.  But this was tempered with very specific criticisms and insights into what made database searching and the curated websites less than helpful.  Examples of such comments from students include:

“I thought the databases were a bit difficult and time-consuming to use compared to the websites on the Research Guide.”

“I found them [the databases] useful, but it was sometimes hard to find a lot of information because our movement was moderately new.”

“There weren’t many websites with information we hadn’t already seen.”

“The websites were useful but some content did not relate to what exactly we wanted to find.”

Group Interview Results
The interview sessions involved two groups.

  • The first group stated that they started their research with Wikipedia, finding it a good place to get familiar with their topic, about which they had almost no prior knowledge.  They felt that Wikipedia gave them a good overall introduction and suggested keywords that they employed with success in other search venues.  For their presentation, they used images as prompts to engage the audience.  The interview revealed that they found the suggested websites from the LibGuide useful but confirmed that they found no need to use the databases.
  • The second group had not done any research prior to the unveiling of the LibGuide, so they found the ABC-Clio World History database introduction in the group conference an extremely useful starting point.  After that, they thought their Google searches were more productive.  They did not find the rather esoteric set of curated websites for their topic very helpful.  For their presentation, they followed an outline summary of their research, which they enhanced with images.  They felt that, although this research project had a tight timeframe, they were able to go into more depth on a fairly specific topic because they could divide the work with their research partners.

CONCLUSION

At first glance, the Google Docs of the nine groups were discouraging on several levels.

  • The customized database help didn’t have the impact I had hoped for.  Using a project-specific LibGuide to showcase particularly useful databases and offering targeted database advice for each topic appeared to gain only a small amount of traction within the sample of documents.
  • Tracking citations in student work isn’t easy!  It became clear quite quickly that the provenance of sources wouldn’t be obvious without a bit more inspection, and even that closer look brought only partial success.
  • Citation tracking also laid bare that there is still a lot of work to be done with our students on proper image use and citation.  Our prototype of model usage set a mark that few groups were able to meet consistently.
  • Students seemed to rely more on Wikipedia, YouTube and their own Google searching than on targeted LibGuide information although there did seem to be good evidence that research groups found the curated websites helpful.

The more discouraging findings were particularly surprising because they conflicted so dramatically with my impression of the effectiveness of the LibGuide and in-class support.  In line with our conventional insular reflection practices, I had marked our efforts as a clear success!  But, already, my initial data gathering had jarred me out of my self-congratulatory mood!

The Google Doc analysis, while sobering, came into better focus when combined with the survey and interview results.  The survey results were generally positive about the usefulness of the LibGuide in general, its curated websites specifically and, to some extent, even its selected databases. The written comments from the survey, as well as the remarks by students in interviews, revealed that students often had very good reasons for de-selecting databases in favor of websites, Wikipedia and their own Google searches for their Google Doc final product.

  • The LibGuide databases and websites acted as a launch pad to other finds.
  • Solid information on their chosen topic was easily available on the Web or, in the case of more current or obscure social movements, not well-covered in our subscription databases.
  • The assignment outcome — an in-class presentation meant to inform and engage fellow students — dictated a different set of sources, such as mixed media, than a traditional academic research paper.
  • Several comments mentioned the difficulty of database searching.  In retrospect, I think the individual group conferencing on databases might be more effective if students have something concrete to refer to later.  Perhaps a quick screenshot of the database searches that were effective could be added to the LibGuide, or emailed to each group member, as an after-the-fact reminder.
  • Finally, it is worth celebrating that their own discoveries, taken as a whole, often contributed to a forceful presentation of their topic.

The written comment section of the survey also elicited very thoughtful, practical advice from the student researchers about how to improve the LibGuide.  For instance:

  • “What would make it more useful is maybe adding some videos as well? Some people might have had trouble finding good videos” speaks directly to the fact that we did not adequately account for this project’s presentation format in our LibGuide design.
  • “Maybe including the best possible database for each group’s revolution” combined with “putting the databases links on the same page as the ….websites may come in handy” suggests a completely different organization of the LibGuide by topic rather than by source type.

FINDINGS TO INFORM FUTURE ASSESSMENTS

Multiple outside assessment measures strengthen our previously insular reflective practice and help to disabuse us of easy, subjective first impressions.  This process let us experiment with the varied assessment tools that are at our fingertips.  In future iterations, surveys embedded within LibGuides and LibGuide’s built-in analytics will provide frequent, relatively effortless feedback.  Use of these ready analytics will allow us to be more nimble and user-centered in our adjustments, freeing us to direct our efforts to more time-intensive assessment methods, such as interviewing students and faculty, analyzing student work and, hopefully in the future, tracking more closely the research process of a sample of students.

Data gathering, such as that gleaned from this user assessment, helps us modify our services in ways that are truly responsive to the needs of our students.  It makes learning an exchange, where students teach us as much as we teach them. In addition, it provides evidence to bring to the planning table with teachers.   From this more informed position, we can lobby more convincingly for adjustments to assignment rubrics that would explicitly state citation and image expectations and, where appropriate, source usage requirements.  We may be able to extend our collaboration with faculty into assessment of student work for information literacy outcomes that have previously gone unmeasured.


5 Responses to Micro User Needs Assessment

  1. ZemLee says:

    Ruth,
    It sounds like your library is very forward thinking in engaging faculty support and in making real and conscious efforts in gauging student needs through measurable research! The study you detailed in your assignment sounds huge—like had your library already thought about this and started implementation prior to you taking this class? I’m absolutely fascinated by what new insights you gleaned from your assessment!
    First of all, I’m a firm believer in teaching from a class-specific LibGuide because it serves two purposes: 1) introducing the resource to the students so they can find it again later, and 2) limiting the ocean of library resources to a “wading pool” so students can begin with a more targeted set of resources that are specific to them and their needs for that moment. However, I’m absolutely floored that the LibGuide didn’t prove to be more useful in helping to guide the students’ research “trail”! I think I’m just as surprised as you are.
    I like also that you employed a Google Doc to track these types of searches, but was a little unclear on the actual procedures for how the students were supposed to map out their search trails here. Was it more of a bibliography of sorts? Or maybe a research journal? Or were they asked to provide their step-by-step process for conducting research? Obviously if they relied heavily on Wikipedia, there really isn’t a very long trail to follow, but I’m curious as to what the expectations were in terms of using this document. I sympathize with you in commenting that tracking citations in student work isn’t easy. That whole process of your study seemed like an enormous undertaking.
    The student comments on the qualitative analysis portion of your survey were interesting—it’s usually what students say when they don’t understand how to “drive” a database. “It’s too hard to use, ergo, I’ll use something else.” I know a lot of us begin our research with Wikipedia to help give an overview of something we have absolutely no idea about—I actually use it to help me get started on my broad keyword searching anyway! But maybe starting in on Wikipedia as a search, coupled with the difficulties in using the databases, resulted in low database use? Interesting.
    Anyway—I really enjoyed reading your report. It makes me wonder, if this model of study were employed on students in higher education, whether the results would remain the same or not… Thanks for your post! 🙂

  2. embendered says:

    Thank you for your thoughtful comments, Zem! It is very interesting that you zeroed in on the timeframe because that was no small factor in this project, both in terms of the library piece as well as the students’ research experience.

    This project has had a short life. Not a long one! It began after we returned from the holidays so, from start to finish, it took about 6 weeks:

    Week 1: Prepare the prototype & LibGuide (our trial had started Jan. 13th) while the MWH teacher introduced the project, students selected topics & formed groups around a topic and embarked on their own self-directed research.
    Week 2: Introduce the LibGuide, after which students continued researching with that stepped-up support.
    Week 3: In-class presentations
    Weeks 4-6: My after-the-fact user needs assessment

    Being involved in this project from start to finish, just as the LIBR 220 Environmental Scanning Assignment was coming into focus, got me thinking about using it as the platform for data gathering. As I said in my write up, my impressions of the library contribution to this project were so self-congratulatory that I naively thought my data gathering would be just pulling bullets out of a bull’s-eye. Instead, I was just picking up buck-shot again…..to borrow your perfect analogy! ☺

    What is so great though, is that this time – thanks to this assignment – I am actually hanging in there long enough to hear what students have to say. This is something we do not do…..gather data.

    I think the short time frame of this research project was a factor for students. Students, in the interest of time, relied on their go-to strategies. What I saw on their Google Docs was very much in keeping with the ethnographic findings of Project Information Literacy studies on the information-seeking habits of young adults. I also now think that the assignment lent itself more to Web-based searching than database searching for many topics. Also, the lack of explicit expectations in the rubric meant that students felt no requirement to include a database-derived source or to follow citation conventions to the letter.

    The Google Doc analysis exposes my position on the “baby-steps” end of the learning curve. I have always thought we should look at student work as a measurement tool but I have never really thought it through. Even with perfect citation practices, the provenance of sources can only be nailed down by direct observation of the research process, a talk-aloud protocol of some sort or, as you say, a research journal.

    Overall, this process has been completely myth-busting, mentally engaging……and downright FUN! It is worthy of many longer discussions……and bottles of good wine! Let’s do that some day, Zem! ☺

  3. georgereads says:

    Hi Ruth,
    Great project to assess. You’ve got some very interesting results in terms of databases vs. website relevance. I completely agree with you when you mentioned that in projects past you would either toot your own horn or write the project off as not working. Like you, this assignment actually provided me with some concrete feedback to go off of. On a side note, I’m curious why you chose a Google Doc over Diigo to have the students record their sources? As always, excellent work! I’m learning so much from your posts this semester–thank you!

  4. embendered says:

    Thanks, George! I just read your environmental scan and learned a lot from you too!

    You’re right! Diigo could have been a good platform too. I think the Google Doc idea sprang from the presentation format for the final share-out. Some groups arrayed their finds on the Google Doc and scrolled down, showing images or videos to embellish their presentation. But other groups moved into the Google Presentation format for a more traditional PowerPoint-style presentation.

    Our students have used Diigo for other group work though and it has been great. Powerful even. It’s been particularly effective when groups have explored controversial topics, populated a Diigo group account with pro and con articles and then kept a dialog going via Diigo comments. I know that kids have really had their minds opened to new ways of looking at controversial topics that way and have learned to respect the fact that people (their peers, in this case….not some unknown strangers) can hold different views for very valid reasons.
