Radical Information Literacy Assessment

It happened over falafel.

Once a week, I work from one of my university’s distributed campuses. The primary reason I work there is to collaborate regularly with the two subject librarians stationed there. Last week I went to lunch with one of them and, as usual, we talked a lot of shop. She asked me how assessment was going. I’m sure she heard more from me than she bargained for. But then she asked, “What would radical assessment look like?” I think she asked it to herself as much as she asked me.

We spitballed the first ideas that came to mind. What I landed on was this: radical assessment removes assessment from the institutional accountability prerogative. In other words, I don’t think it can exist as a standalone form of assessment; rather, I think it’s what assessment should aspire to be. Radical assessment is collecting evidence of student learning in order to improve instruction so that students have a higher-quality learning experience.

You might be saying to yourself, “But that is assessment.” I would argue that’s what assessment is supposed to be, not what it is in reality. Too often we assess because we’re told we must. We’re told by our accreditation agencies, our administrators, and our assessment librarians.

Don’t get me wrong, we need to be assessing. Accountability, and whatnot. And so for me, radical assessment is designing assessments that are meaningful first, and evidence for institutional accountability second.

What does “meaningful” mean in this context? I get asked this a lot, since my Library’s Student Learning Assessment Plan is built on the concept of meaningful assessment. Each librarian who teaches is asked to complete one meaningful assessment project over an academic year. What counts as meaningful is left to the librarian’s interpretation. I tell them: it should be meaningful to your instruction while mapping to one of our learning outcomes (the institutional, non-radical aspect). How will it improve your instruction? How can you use it as evidence to teach more (or less)? How can it help your students learn? To me, if you start and build from there, you’re assessing radically.

I know there are plenty of different interpretations of what radical assessment looks like. One of my colleagues has tweeted several interesting thoughts on the subject, which I encourage you to read, and one in particular has me thinking . . .

What is radical assessment, particularly in context of information literacy, to you? I’d love to hear your thoughts.


Learning Styles In and Out of the Classroom

Recently I’ve been thinking a lot about learning styles. Settling into a new job requires a lot of learning, and at this point in my career most of this learning involves internal policies, processes, and politics. Or as I’ve been told by a few, mostly just politics.

I’ve been reminded how often teachers teach to their own preferred learning style. Everyone is guilty of this, myself included. It takes careful, deliberate effort not to teach solely to your particular style.

If you believe that learning styles exist, there are plenty of models to choose from. Since I had a sip of ACRL’s Immersion Kool-Aid, I tend to keep Kolb’s learning styles in the back of my mind. I also think the VARK model has a lot to offer, especially with regard to active learning.

So let’s review Kolb’s Learning Style Inventory with the graphic below (if you adhere to the VARK model, sorry, auditory and kinesthetic learners)…

Kolb’s Learning Style Model

Another way to think of it:

  • Concrete experience = doing
  • Reflective observation = observing
  • Abstract conceptualization = thinking
  • Active experimentation = planning

You can take a fancy test to determine where you fall. I tested as an Assimilator. If you look at the career characteristics for Assimilators, it’s basically a librarian.

Whether you follow Kolb’s model or another, what’s important to remember is that everyone learns differently. Just as we try to teach to different learning styles in the classroom, new employee training and professional development should attempt to teach to different learning styles too. In my case, my new supervisor has created a flexible training environment that molds to how I learn best.

Some will argue that learning styles are a myth. They may be right, but I like to think there is a middle ground, and that ground is providing multiple ways to learn the same thing. Teach the same material in different ways. Knowing how each of your students learns most effectively would be ideal, but librarians usually don’t have the time to know students at that level. So we diversify how we teach to reach as many students as possible. We diversify to engage, and when students (or new employees) are engaged with the content, they will learn.

Low Instruction Numbers Call for More Aggressive Outreach

As the semester draws to its end, I find myself compiling instruction statistics. While the College of Arts and Sciences is undergoing a core curriculum revision that will integrate an information literacy learning outcome into a required course, the current core curriculum lacks such a requirement. Two courses represent our instructional program’s bread and butter: the Ratio Studiorum Program (RSP) and Civic Engagement through Public Communication (COM 152).

COM 152 is a speech course that the majority of students take to fulfill a core requirement. This fall, I taught 11 of the 12 sections (92%) of COM 152, up from 58% last fall. Anecdotally, I attribute the rise to two factors. The first is that I began working at Creighton two days before the start of the Fall 2012 semester; I imagine some instructors hesitated to contact me because of this. The second is that word of mouth about my instruction spread. I know this is true for two instructors whose sections I taught in the Spring 2013 and Fall 2013 semesters; they told me an information literacy session was recommended by another faculty member. All in all, I’m pleased with the improvement in the raw numbers for this course.

RSP is a required one-credit course for freshmen. It focuses on advising, acclimating students to collegiate-level academics, and introducing students to the Jesuit values taught at Creighton. This fall, librarians taught only 20 of the 49 sections (41%) in the College of Arts and Sciences. The reverse occurred in Fall 2012, when 22 of 39 sections (56%) had an encounter with library instruction. This decrease unsettles me.

I analyzed these numbers to see if there were any patterns or conclusions to draw (a rough sketch of the comparison follows the list). Here is what I found:

  • More sections of RSP were taught in Fall 2013 to allow for smaller class sizes
  • Over half of the instructors who taught in Fall 2012 and requested library instruction did not teach a section in Fall 2013
  • No faculty members abandoned ship – if a faculty member requested instruction in Fall 2012, they also requested it in Fall 2013
  • There doesn’t seem to be a pattern by department in which faculty members did or did not request instruction. Departments notably absent from Fall 2013 requests were Modern Languages and Chemistry

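For the curious, the comparison itself is simple enough to script. Here’s a rough sketch in Python; the instructor names are placeholders, since the real request records live in our own files:

```python
# Hypothetical data: names are placeholders for RSP instructors.
requested_2012 = {"Adams", "Baker", "Chen", "Diaz", "Evans"}  # requested instruction, Fall 2012
requested_2013 = {"Adams", "Diaz"}                            # requested instruction, Fall 2013
teaching_2013  = {"Adams", "Diaz", "Foster", "Garcia"}        # taught an RSP section, Fall 2013

# Over half of Fall 2012's requesters did not teach a section in Fall 2013
departed = requested_2012 - teaching_2013

# "No faculty members abandoned ship": every 2012 requester still teaching
# in 2013 requested again
still_teaching = requested_2012 & teaching_2013
abandoned = still_teaching - requested_2013

print(f"2012 requesters no longer teaching RSP: {sorted(departed)}")
print(f"Abandoned ship: {sorted(abandoned) or 'none'}")

# Coverage is simply sections taught by librarians over total sections
print(f"Fall 2012: {22 / 39:.0%}   Fall 2013: {20 / 49:.0%}")
```
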
I’d like to note that I’m a firm believer that correlation does not equal causation, but I still see the value in examining observations and data to find patterns that may lead to further research.

I expect numbers to fluctuate each year; however, I have designed instruction for COM 152 to scaffold from instruction in RSP. Information literacy instruction in RSP focuses on Bloom’s lower-level skills: we teach students the building blocks of research and show them the various resources the library owns. COM 152 focuses almost exclusively on the evaluation and analysis of sources. Perhaps this scaffolded approach is not appropriate if information literacy instruction in RSP reaches only 40-60% of students.

One of the biggest disappointments in the lowered numbers is that, as an instructor for a section of RSP this fall semester, I seem to have failed at recruiting more faculty requests for instruction. Not only was I at all of the instructor meetings, networking with faculty, but I also presented to the faculty multiple times. I believed this would encourage more faculty to utilize our instructional program, but it did not happen.

Perhaps our liaisons, myself included, need to be more aggressive in encouraging faculty to bring their sections to the library. Another idea emerged from teaching a section led by a Chemistry faculty member. He showed a strong interest in RefWorks and wanted me to teach his students about it. I typically don’t mention RefWorks during instruction for RSP. Maybe we need to appeal more to the interests of each faculty member’s discipline. For example, faculty in the sciences might request sessions if we advertised teaching RefWorks and the differences between primary and secondary sources.

With the new core curriculum beginning in Fall 2014, this may all be needless extrapolation. The information literacy outcome built into one of the new core courses holds promise for the future of the instructional program. Going forward, the Library needs to emphasize partnering faculty with librarians to achieve this outcome. I know some faculty will choose to go it alone, but it is our job to show that we can collaborate with them to help their students become information literate.

It has occurred to me that I’m putting a lot of emphasis on “usage” numbers. Student outcomes are the most important assessment piece when dealing with information literacy. If students are not learning anything, then the library is failing. But in order to teach students, we need to get them inside the doors, whether physically or virtually.

Coming Up for Air: Living in a Post-Immersion World

It’s been over two months since I attended ACRL Immersion Teacher Track in Seattle, and I have yet to post about my experiences because, quite frankly, I haven’t had time. In the weeks following Immersion my mind was racing at over 100 mph. How could I do justice to this transformative experience by merely summarizing it? And in the immediate weeks following Immersion, a summary seemed like the only post I could create. This past Friday gave me an opportunity to push beyond a summary.

Immersion provided great practical advice on how to improve what I was already doing. Let’s begin with what I incorporated immediately:

  • A vast network of kick-ass librarians – Honestly, this is probably the most valuable asset I gained from Immersion. The librarians I met and became friends with offer objective opinions on teaching strategies and tools I want to incorporate into my classes. A simple tweet about an idea garners responses from many of them. I have a network of information literacy professionals spread throughout the country at my fingertips.
  • More (and better) assessment – I was already assessing in many of my information literacy sessions, but now my assessment actually aligns with what I’m teaching. The new assessment pieces often take more work to analyze, but the students perform better and you can see real learning.
  • Hell hath no fury like learning outcomes scorned – I used learning outcomes before Immersion, but I wasn’t utilizing them to their fullest potential. I feel more confident constructing learning outcomes in order to assess whether or not my students are learning. (See what I did there.)

A whiskey rubric – because it makes assessment go down easier

There are also some much larger, programmatic components to Immersion that I brought home with me. This past week, I had the opportunity to present to faculty about how the Library can help teach and assess the information literacy learning outcome in a course they’re developing for Creighton’s new Magis Core Curriculum. It’s critical for the Library to become involved because it’s the only course in the new curriculum that is required to assess information literacy.

In preparation for the meeting, I presented to my coworkers in reference. Over the course of this preliminary meeting I found myself distinguishing learning outcomes from the tasks used to achieve them, something I have a distinct memory of Lisa Hinchliffe discussing. For example, learning Boolean operators isn’t a learning outcome, but it is a skill that can help students construct an effective search strategy. It was an opportunity to teach the teachers.
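
To make the task/outcome distinction concrete, here’s a toy sketch. The topic and terms are invented; the point is that nesting synonyms with OR inside concepts joined by AND is the task, while the effective search strategy is the outcome:

```python
# A toy illustration: Boolean operators are the task, an effective
# search strategy is the outcome. Topic and terms are invented.
concepts = [
    ["teenagers", "adolescents", "young adults"],  # synonyms, OR'd together
    ["social media", "Facebook", "Twitter"],
    ["self-esteem"],
]

# Quote multi-word phrases, OR terms within a concept, AND concepts together
query = " AND ".join(
    "(" + " OR ".join(f'"{t}"' if " " in t else t for t in group) + ")"
    for group in concepts
)
print(query)
# (teenagers OR adolescents OR "young adults") AND
#   ("social media" OR Facebook OR Twitter) AND (self-esteem)
```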

The faculty meeting went well. Most faculty seemed on board with incorporating the Library into their courses; however, only time will tell. It’s an exciting opportunity for the information literacy program at Creighton, and the experiences and lessons learned at Immersion will support me in leading the Library through this new chapter.

Do We Need to Teach This?

As Reference and Instructional Services Librarian, I devote a significant percentage of my work to assessment. One of my favorite classes to teach as the social science liaison is Communication Studies 152: Civic Engagement through Public Communication. Throughout the semester students produce a series of informative, persuasive, and group speeches. Instructors encourage students to speak about topics that not only interest them but also engage with current civic discourse.

Because topics generally focus on current issues, sources run the gamut from open web sources to scholarly articles. Most speeches draw on an array of government statistical sources, local newspaper articles, and scholarly, peer-reviewed material. On top of this breadth lies the fact that the majority of COM 152 students are freshmen, many of whom have never been exposed to college-level research or library databases. That leaves 50 minutes to teach students the information cycle, the difference between scholarly and popular sources, how to effectively use databases (and a wide variety of them, since topics can fall under any discipline), and how to critically evaluate the information they find. A daunting task, but one that instruction librarians face regularly.

Working with course instructors, information literacy librarians make tough decisions every day about what to teach and what not to teach. Creating clear, relevant, and measurable learning outcomes helps us prioritize and focus our learning objectives. Assessing these outcomes shows whether students learn what we teach. We then close the assessment loop by using the results to inform our future teaching. But what do we do if the results continue to show students aren’t grasping a particular concept? We can incorporate different teaching techniques. We can ensure we’re teaching to various learning styles. We can develop different active learning exercises. In sum, we go back to the drawing board. We don’t give up. But how often do we take a step back and ask ourselves, “Is this a skill students (still) need? Do we need to move on and focus our attention elsewhere?” After all, time is precious, and we may be letting one piece distract us from the greater puzzle.

This spring I piloted a pre- and post-test assessment piece in four sections of COM 152. I learned a lot about the effectiveness of my teaching and how students conceptualize some of the material. The tests also revealed that students repeatedly failed to grasp the difference between keywords and subjects. A handful of conclusions can be drawn from the results, but I began to wonder whether this is an essential skill for students in this course to master. A large portion of students’ sources (for better or worse) come from the open web and newspapers, which typically do not use controlled vocabulary and often allow full-text searching. Recognizing the difference between subjects and keywords might prove useful when searching for scholarly articles or using the catalog, but since sources must be from the past five years, students often ignore the catalog, and their scholarly sources trend toward broad pieces on general issues.
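
For what it’s worth, the arithmetic behind reading those results is nothing fancy. A minimal sketch with invented scores (not my actual pilot data) looks something like this:

```python
# Invented proportions of students answering correctly; not real pilot data.
pre  = {"scholarly vs popular": 0.55, "keywords vs subjects": 0.32, "source evaluation": 0.48}
post = {"scholarly vs popular": 0.81, "keywords vs subjects": 0.38, "source evaluation": 0.74}

# The gain per concept makes the stubborn spots easy to see
for concept in pre:
    gain = post[concept] - pre[concept]
    print(f"{concept:22s} pre {pre[concept]:.0%}  post {post[concept]:.0%}  gain {gain:+.0%}")
```

A concept whose gain stays flat semester after semester is exactly the kind of candidate for the “do we still need to teach this?” question.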

Through working with students one-on-one in research consultations, I’ve come to think students often discover controlled vocabulary serendipitously. They’ll search a database and begin to look through the records. They notice that each record contains hyperlinked terms that often reflect their initial keyword search. Accustomed to clicking links, they’ll click on a subject term and realize that the database has returned more relevant records. Some students will ask why that happened; others don’t care why, but realize they’ve “done something right.” They begin to notice the database limiters. Serendipity.
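
To see the difference in miniature, here’s a contrived example with invented records: a keyword search only matches the words an author happened to use, while a subject search matches the controlled term the database assigned.

```python
# Invented records; real database records are richer, but the idea holds.
records = [
    {"title": "Climate anxiety in the classroom",
     "subjects": ["Climatic changes", "Education"]},
    {"title": "Global warming and public opinion",
     "subjects": ["Climatic changes", "Public opinion"]},
    {"title": "Teens and the selfie culture",
     "subjects": ["Self-presentation", "Social media"]},
]

def keyword_search(term):
    # Free-text match against the title only
    return [r["title"] for r in records if term.lower() in r["title"].lower()]

def subject_search(term):
    # Match against the controlled vocabulary assigned to each record
    return [r["title"] for r in records if term in r["subjects"]]

print(keyword_search("climate"))           # misses the "global warming" article
print(subject_search("Climatic changes"))  # finds both climate records
```

That hyperlinked subject term in the record is the bridge students stumble across.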

Is it possible to quantify which learning method is more valuable: learning within the classroom or learning by doing? Are the two separate? I can create active learning exercises that incorporate this same process, but the activity is divorced from the students’ point of need. Or is this all irrelevant? Knowing the difference between controlled vocabulary and keywords is a lower-order skill. Should my efforts focus on higher-order information literacy skills? Is that possible without knowing the difference between keyword searching and controlled vocabulary?

I believe it is possible. Teaching the difference between keywords and subjects may be a traditional learning objective, but its time may have run out. In an increasingly digital educational landscape where information overload is almost inevitable, this is one piece that I’m growing increasingly comfortable removing from the puzzle.