Low Instruction Numbers Call for More Aggressive Outreach

As the semester draws to its end, I find myself compiling instruction statistics. While the College of Arts and Sciences is undergoing a core curriculum revision, which will integrate an information literacy learning outcome into a required course, the current core curriculum lacks such a requirement. Two courses represent our instructional program’s bread and butter: the Ratio Studiorum Program (RSP) and Civic Engagement through Public Communication (COM 152).

COM 152 is a speech course that the majority of students take to fulfill a core requirement. This fall, I taught 11 of the 12 sections (92%) of COM 152. This percentage is higher than last fall, when it was only 58%. Anecdotally, I attribute the rise to two factors. The first is that I began working at Creighton two days before the start of the Fall 2012 semester; I imagine some instructors hesitated to contact me because of this. The second is that word of mouth about my instruction spread. I know this is true for two instructors whose sections I taught in the Spring 2013 and Fall 2013 semesters; they told me an information literacy session had been recommended by another faculty member. All in all, I’m pleased with the improvement in the raw numbers for this course.

RSP is a required one-credit course for freshmen. It focuses on advising, acclimating students to collegiate-level academics, and introducing students to the Jesuit values taught at Creighton. This fall, librarians taught only 20 of the 49 sections (41%) in the College of Arts and Sciences. In Fall 2012, by contrast, 22 of 39 sections (56%) received library instruction. This decrease unsettles me.
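For transparency, here is the simple arithmetic behind the coverage percentages quoted in this post. This is just an illustrative Python sketch using the raw section counts given above; the function name and term labels are my own.

```python
# Recompute the section-coverage percentages from the raw counts in this post.
def coverage(taught, total):
    """Percent of sections receiving library instruction, rounded to a whole number."""
    return round(100 * taught / total)

rsp_sections = {
    "Fall 2012": (22, 39),  # 22 of 39 RSP sections
    "Fall 2013": (20, 49),  # 20 of 49 RSP sections
}

for term, (taught, total) in rsp_sections.items():
    print(f"RSP {term}: {coverage(taught, total)}% ({taught}/{total})")

print(f"COM 152 Fall 2013: {coverage(11, 12)}% (11/12)")
```

Note that 22 of 39 works out to roughly 56%, and 20 of 49 to 41%, so the Fall 2012 to Fall 2013 drop in RSP coverage is about 15 percentage points.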

I analyzed these numbers to see if there are any patterns or conclusions to draw. Here is what I found:

  • More sections of RSP were taught in Fall 2013 to allow for smaller class sizes
  • Over half of the instructors who taught in Fall 2012 and requested library instruction did not teach a section in Fall 2013
  • No faculty members abandoned ship – if a faculty member requested instruction in Fall 2012, they also requested it in Fall 2013
  • There doesn’t seem to be a departmental pattern among faculty members who did or did not request instruction. Departments notably absent from Fall 2013 requests were Modern Languages and Chemistry

I’d like to note that I’m a firm believer that correlation does not equal causation, but I still see the value in examining observations and data to find patterns that may lead to further research.

I expect numbers to fluctuate each year; however, I have designed instruction for COM 152 to scaffold from instruction in RSP. Information literacy instruction in RSP focuses on Bloom’s lower-level skills: we teach students the building blocks of research and show them the various resources the library owns. COM 152 focuses almost exclusively on the evaluation and analysis of sources. Perhaps this scaffolded approach is not appropriate if information literacy instruction in RSP is reaching only 40-60% of students?

One of the biggest disappointments in the lowered numbers is that, as an instructor for a section of RSP this fall semester, I seem to have failed to recruit more faculty requests for instruction. Not only did I attend all of the instructor meetings and network with faculty, but I also presented to the faculty multiple times. I believed this would encourage more faculty to utilize our instructional program, but it did not happen.

Perhaps our liaisons, myself included, need to be more aggressive in encouraging faculty to bring their sections to the library. Another idea emerged from teaching a section for a Chemistry faculty member. He was very interested in RefWorks and wanted me to teach his students about it. I typically don’t mention RefWorks during instruction for RSP. Maybe we need to appeal more to the interests of each faculty member’s discipline. For example, faculty in the sciences may request sessions if we advertise teaching how to use RefWorks and the differences between primary and secondary sources.

With the new core curriculum beginning in Fall 2014, this all may be needless extrapolation. The information literacy outcome built into one of the new core courses holds promise for the future of the instructional program. Going forward, the Library needs to emphasize partnering faculty with librarians to achieve this outcome. I know there will be faculty who choose to go it alone, but it is our job to show that we can collaborate with them to help their students become information literate.

It has occurred to me that I’m putting a lot of emphasis on “usage” numbers. Student outcomes are the most important assessment piece when dealing with information literacy. If students are not learning anything, then the library is failing. But in order to teach students, we need to get them inside the doors, whether physically or virtually.

Do We Need to Teach This?

As Reference and Instructional Services Librarian, I devote a significant percentage of my work to assessment. One of my favorite classes to teach as the social science liaison is Communication Studies 152: Civic Engagement through Public Communication. Throughout the semester students produce a series of informative, persuasive, and group speeches. Instructors encourage students to speak about topics that not only interest them, but also engage with current civic discourse.

Because topics generally focus on current issues, sources run the gamut from open web sources to scholarly articles. Most speeches draw on an array of government statistical sources, local newspaper articles, and scholarly, peer-reviewed material. On top of this breadth lies the fact that a majority of COM 152 students are freshmen, many of whom have never been exposed to college-level research or library databases. This leaves 50 minutes to teach students the information cycle, the difference between scholarly and popular sources, how to effectively use databases (and a wide variety of them, since topics can fall under any discipline), and how to critically evaluate the information they find. A daunting task, but one that instruction librarians regularly face.

Working with course instructors, information literacy librarians make tough decisions about what to teach and what not to teach on a daily basis. Creating clear, relevant, and measurable learning outcomes helps us prioritize and focus our learning objectives. Assessing these outcomes shows whether students learn what we teach. We then close the assessment loop by using the results to inform our future teaching. But what do we do if the results continue to show students aren’t grasping a particular concept? We can incorporate different teaching techniques. We can ensure we’re teaching to various learning styles. We can develop different active learning exercises. In sum, we go back to the drawing board. We don’t give up. But how often do we take a step back and ask ourselves, “Is this a skill students (still) need? Do we need to move on and focus our attention elsewhere?” After all, time is precious, and we may be letting one piece distract us from the greater puzzle.

This spring I piloted a pre- and post-test assessment piece in four sections of COM 152. I learned a lot about the effectiveness of my teaching and how students conceptualize some of the material. The tests also revealed that students repeatedly failed to grasp the difference between keywords and subjects. A handful of conclusions can be drawn from the results, but I began to wonder whether this was an essential skill for students in this course to master. A large portion of students’ sources (for better or worse) come from the open web and newspapers, both of which typically do not use controlled vocabulary and often allow full-text searching. Recognizing the difference between subjects and keywords might prove useful when searching for scholarly articles or using the catalog, but since sources must be from the past five years, students often ignore the catalog, and their scholarly sources trend toward broad pieces on general issues.

Through working with students one-on-one in research consultations, I’ve come to think students often discover controlled vocabulary serendipitously. They’ll search a database and begin to look through the records. They notice that within each record are hyperlinked terms that often reflect their initial keyword search. Since Millennials are so accustomed to hyperlinks, they’ll click on the subject term and realize that the database has now returned more relevant records. Some students will ask why it happened; others don’t care why, but realize that they’ve “done something right.” They begin to notice the database limiters. Serendipity.

Is it possible to quantify which learning method is more valuable – learning within the classroom or learning by doing? Are the two separate? I can create active learning exercises that incorporate this same process, but the activity is divorced from the students’ point of need. Or is this all irrelevant? Knowing the difference between controlled vocabulary and keywords is a lower order skill. Should my efforts focus on higher order information literacy skills? Is this possible without knowing the difference between keyword searching and controlled vocabulary?

I believe it is possible. Teaching the difference between keywords and subjects may be a traditional learning objective, but its time may have run out. In an increasingly digital educational landscape where information overload is almost inevitable, this is one piece that I’m growing more comfortable removing from the puzzle.