Round Table Discussion: The Great CRAAP Debate

Since starting at George Mason University, my department has regularly held a Round Table group. One of my colleagues began it shortly before I arrived. Someone in the department selects a couple of articles, book chapters, etc., around a theme; we read them; and then we meet to talk about them. It’s not a particularly novel idea, but it has been instrumental in professional development in my department. Since my research often focuses on communities of practice, I’ve found it to be an effective way to develop library instructors who are not in professional positions, who have limited teaching experience, or who do not have an MLS – our legitimate peripheral participants in the profession. When applicable, we follow up these round tables with a workshop where we work on instructional ideas related to the readings.

As the spring instructional load wanes, we’ll be meeting for the first time since August to discuss a new set of readings. I chose the topic for next week (April 19th): teaching source evaluation to undergraduates. My (personal) goal as instruction coordinator is to begin pushing our library instructors beyond teaching to the CRAAP Test or other checklist-like devices. We are reading the following articles:

Caulfield, M. (2016, December 19). Yes, digital literacy. But which one? Retrieved from https://hapgood.us/2016/12/19/yes-digital-literacy-but-which-one/

Caulfield, M. (2017, April 4). How “news literacy” gets web misinformation wrong. Retrieved from https://medium.com/@holden/how-media-literacy-gets-web-misinformation-wrong-45aa6323829d

Radcliff, S., & Wong, E. Y. (2015). Evaluation of sources: A new sustainable approach. Reference Services Review, 43, 231–250. https://doi.org/10.1108/RSR-09-2014-0041

Seeber, K. (2017, March 18). Wiretaps and CRAAP. Retrieved from http://kevinseeber.com/blog/wiretaps-and-craap/

Do you teach students how to evaluate sources without using a checklist approach? If so, how? Do you have any readings you’d suggest for future round tables? Let me know!

Radical Information Literacy Assessment

It happened over falafel.

Once a week, I work from one of my university’s distributed campuses. The primary reason is to work regularly with the two subject librarians stationed there. Last week I went to lunch with one of them and, as usual, we talked a lot of shop. She asked me how assessment was going. I’m sure she heard more from me than she bargained for. But then she asked, “What would radical assessment look like?” I think she asked it of herself as much as she asked me.

We spitballed the first ideas that came to mind. What I landed on was this: radical assessment removes assessment from the institutional accountability prerogative. In other words, I don’t think it’s possible as a singular form of assessment; however, I think it’s what assessment should aspire to be. Radical assessment is collecting evidence of student learning in order to improve instruction so that students have a higher quality learning experience.

You might be saying to yourself, “but that is assessment.” I would argue that is what assessment is supposed to be, but not what it is in reality. Too often we assess because we’re told we must. We’re told by our accreditation agencies, our administrators, and by our assessment librarians.

Don’t get me wrong, we need to be assessing. Accountability, and whatnot. And so for me, radical assessment is designing assessments that are meaningful first and that serve as evidence for institutional accountability second.

What does “meaningful” mean in this context? I get asked this a lot since my Library’s Student Learning Assessment Plan is based on the concept of meaningful assessment. Each librarian who teaches is asked to complete one meaningful assessment project over an academic year. What counts as meaningful is left to the interpretation of the librarian. I tell them: it should be meaningful to your instruction while applying to one of our learning outcomes (the institutional, non-radical aspect). How will it improve your instruction? How can you use it as evidence to teach more (or less)? How can it help your students learn? To me, if you start and build from there, you’re assessing radically.

I know there are plenty of different interpretations of what radical assessment looks like. One of my colleagues had some interesting thoughts. He has several tweets on the subject, which I encourage you to read, but this one has me thinking. . .

What is radical assessment, particularly in context of information literacy, to you? I’d love to hear your thoughts.

Is a Theoretical Division Such a Bad Thing?

The ACRL Board of Directors has decided to adopt the Framework for Information Literacy for Higher Education. Taking a cue from professional sports, the Board has adopted the Framework with an asterisk: it decided not to sunset the current Information Literacy Competency Standards for Higher Education until the profession sees how the Framework plays out.

Much has been written about the new Framework, and within the past 24 hours there have been many blog posts and tweets about the Board’s decision. Although I’m not fully sold on threshold concept theory, I can support the new Framework. I could also support the Competency Standards if they were updated. But I’m not writing about that today. What really struck me during all of the Twitter chatter was a particular tweet from the open mic session at ALA Midwinter:

I read a handful of tweets and blog posts over the course of the Task Force’s activities that mirrored this sentiment. The Framework has the potential to divide instructional librarians across the country into those who support the new Framework and those who reject it. And last night, as I read the Board’s decision, which makes that split more likely, I thought to myself, “Is a theoretical division such a bad thing?”

Of course having a clear vision and direction for information literacy instruction has its benefits. It unites the profession; it gives us clear direction; we’re all on the same (theoretical) page. But if we want to continuously move forward as a profession, doesn’t a division into various camps help guide and accelerate that progress? Sure, there will be some camps that stagnate or even regress, but I’m optimistic enough to believe those would be in the extreme minority. A division into different theoretical or practical approaches to information literacy instruction might be exactly what our profession needs to keep pushing us forward.

Learning Styles In and Out of the Classroom

Recently I’ve been thinking a lot about learning styles. Settling into a new job requires a lot of learning, and at this point in my career most of this learning involves internal policies, processes, and politics. Or as I’ve been told by a few, mostly just politics.

I’ve been reminded how often teachers teach to their own preferred learning style. Everyone is guilty of this, myself included. It takes careful, deliberate effort not to teach solely to your particular style.

If you believe that learning styles exist, there are plenty of models to choose from. Since I had a sip of ACRL’s Immersion Kool-Aid, I tend to keep Kolb’s learning styles model in the back of my mind. I also think that the VARK model has a lot to offer, especially with regard to active learning.

So let’s review Kolb’s Learning Style Inventory with the graphic below (if you adhere to the VARK model, sorry, auditory and kinesthetic learners)…

Kolb’s Learning Style Model

Another way to think of it:

  • Concrete experience = doing
  • Reflective observation = observing
  • Abstract conceptualization = thinking
  • Active experimentation = planning

You can take a fancy test to determine where you fall. I tested as an Assimilator. If you look at the career characteristics for Assimilators, it’s basically a librarian.

Whether you follow Kolb’s model or another model, what’s important to remember is that everyone learns differently. Just as we try to teach to different learning styles in the classroom, new employee training and professional development training should attempt to teach to different learning styles too. In my case, my new supervisor has created a flexible training environment, capable of molding to how I learn best.

Some will argue that learning styles are a myth. They may be right, but I like to think there is a middle ground, and that ground is providing multiple ways to learn the same thing. Teach the same material in different ways. Knowing how each of your students learns most effectively is ideal, but librarians usually don’t have the time to know students at that level. So we diversify how we teach to reach as many students as possible. We diversify to engage, and when students (or new employees) are engaged with the content, they will learn.

[Insert Misleading Headline]

Websites need clicks to attract advertisers in order to survive in the constantly fluctuating internet marketplace. You’ve seen the headlines – “I’m Not Sure It’s Possible To See This Chart And *Not* Think It’s A Big Problem,” “Everyone I Talk To Says ‘Who Cares?’ Then I Show Them This And They Freak Out A Little,” “This Is The Personality Trait That Most Often Predicts Success,” and the list goes on and on. (I’m intentionally not linking to the related websites because, again, clickbait.) More often than not the headline fails to give you any indication of what the article is actually about. “I’m Not Sure It’s Possible To See This Chart And *Not* Think It’s A Big Problem” could be about the rise of poverty in the United States (unlikely), the continued belief in creationism (maybe), or something to do with Kim Kardashian (most likely). In any case, you probably need to click on the link to find out. And that’s what the websites want. I get it.

What I cannot accept is historically reliable news outlets following this trend. As an instruction librarian, I teach students how to critically evaluate information sources. What happens when a source such as The New York Times starts inserting misleading headlines?

The following is a tweet by a friend of mine, Daniel Victor, who is a social media editor at the NYT. Dan always posts interesting content from the NYT, especially when it’s sports-related, so I naturally clicked on the article.

Having worked a couple of years with social science students and faculty, who rely heavily on data for their research, I’ve become increasingly critical of the representation of data in news pieces. A few things struck me with this article:

  • “In Terms of Fans, the Heat Have Already Beaten the Spurs” – “Fans,” in the sense it’s used in the headline, is all-encompassing. How did the NYT collect data that accurately represents the demographic spread of NBA fans across the country?

  • Oh, that’s a pretty map!

  • “Based on estimates derived from which teams people “like” on Facebook…” Did you read enough of the article to get to this point? If you did, did you connect this sentence to their definition of “fans”?

So this is a map of fans who use Facebook – more specifically, fans who use Facebook enough to “like” their favorite sports teams. I feel as though we’re getting into a niche group here, a group that doesn’t fully represent NBA fans across the country. (You could also disprove that assertion with a different data set.)

After some tweets back and forth, Dan provided me with links to methodology and I told him my issue was really with the headline and not the data itself:

So what’s the big deal? It’s just a headline, and those who are truly interested will read the article and can work out how the NYT defines “fans.” I think it’s a big deal when I think about the students with whom I work. The majority will not read the entire article. They’ll stop at the map. They won’t bridge that final gap of interpreting what the map is actually representing. Maybe this is OK for daily, casual information digestion, but it’s a habit that seeps into the classroom as well. It breeds a society that is, at best, semi-information literate.

So what can librarians do? The same thing we’ve always done. Continue to push for information literacy instruction in college classrooms. Push for critical IL instruction and not database demonstrations. Use examples like this! It isn’t egregious, it’s from a reliable news source, it’s an interesting topic, and it demands higher level thinking. A critical evaluation of an article like this represents the lifelong information literacy skills the LIS field espouses. Embrace it!

When Do We Become Experts?

An ACRL committee membership recently placed me and a handful of colleagues in the position of choosing the recipients of two prestigious awards. As the secretary of the committee, I saw my role as a cross between an observer and a cautious participant. Here I was, thrown into the big leagues, having to judge whether a particular publication was significant enough in its advancement of the field, and I suddenly felt like a tee ball player stepping up to bat against Greg Maddux circa 1993. I felt as though I had a decent grasp of the field, but I began to wonder – at what point will I consider myself an expert?

I read through the nominations, all while taking diligent notes and creating my own scoring system. There were publications where I knew I’d read iterations of the same topics and projects, and then there were publications that blew me away – ones I knew were unique, innovative, and exactly what our profession needed. But then I’d wonder, “Do I really know? Sure, I read a lot of literature in graduate school and even more in the two and a half years I’ve worked in the profession, but am I qualified enough to assess whether it’s award worthy or not?”

When the committee met to make the final decisions, I saw that I was largely on target with the majority. A few of my top choices were on the periphery, but others aligned with the majority of the committee. Consensus is a difficult thing, even in small groups, but the discussions that emerged around professional disagreements highlighted the nuances in committee members’ individual expertise.

The entire experience was illuminating, not only as an introduction to ACRL committee work, but also for my own professional development. The nomination and decision process made me a bit uneasy at first, but I finished feeling more confident in my ability to critically examine literature in our field. I left understanding that although I’m not exactly an expert yet, I’m on my way and doing better than perhaps I thought.

So why does this matter in the bigger picture? My greatest struggle with my developing relationship with expertise centers on how I present my expertise to students and faculty. In the context of researching and helping students become information literate, I do feel as though I’m an expert. I don’t know where the disconnect is when it comes to knowledge of the literature within my own profession. I don’t have the answer, but I’ll keep looking. And isn’t that what experts do – continually look for more knowledge to better answer the types of questions that really have no answers?

Low Instruction Numbers Call for More Aggressive Outreach

As the semester draws to a close, I find myself compiling instruction statistics. While the College of Arts and Sciences is undergoing a core curriculum revision, which will integrate an information literacy learning outcome into a required course, the current core curriculum lacks such a requirement. Two courses represent our instructional program’s bread and butter – the Ratio Studiorum Program (RSP) and Civic Engagement through Public Communication (COM 152).

COM 152 is a speech course that the majority of students take to fulfill a core requirement. This fall, I taught 11 of the 12 sections (92%) of COM 152, up from only 58% last fall. Anecdotally, I attribute the rise to two factors. The first is that I began working at Creighton two days before the start of the Fall 2012 semester; I imagine some instructors hesitated to contact me because of this. The second is that word of mouth about my instruction spread. I know this is true for two instructors for whom I taught in the Spring 2013 and Fall 2013 semesters; they told me an information literacy session was recommended by another faculty member. All in all, I’m pleased with the improvement in the raw numbers for this course.

RSP is a required one-credit course for freshmen. It focuses on advising, acclimating students to collegiate-level academics, and introducing students to the Jesuit values taught at Creighton. This fall, librarians taught only 20 of the 49 sections (41%) in the College of Arts and Sciences. Compare that to Fall 2012, when 22 of 39 sections (56%) had an encounter with library instruction. This decrease unsettles me.

I analyzed these numbers to see if there are any patterns or conclusions to draw. Here is what I found:

  • More sections of RSP were taught in Fall 2013 to allow for smaller class sizes
  • Over half of the instructors who taught in Fall 2012 and requested library instruction did not teach a section in Fall 2013
  • No faculty members abandoned ship – if a faculty member requested instruction in Fall 2012, they also requested it in Fall 2013
  • There doesn’t seem to be a pattern by department in which faculty members did or did not request instruction. Departments notably absent from Fall 2013 requests were Modern Languages and Chemistry

I’d like to note that I’m a firm believer that correlation does not equal causation, but I still see the value in examining observations and data to find patterns that may lead to further research.

I expect numbers to fluctuate each year; however, I have designed instruction for COM 152 to scaffold from instruction in RSP. Information literacy instruction in RSP focuses on Bloom’s lower-level skills: we teach students the building blocks of research and show them the various resources the library owns. COM 152 focuses almost exclusively on evaluation and analysis of sources. Perhaps this scaffolded approach is not appropriate if information literacy instruction in RSP is reaching only 40–60% of students.

One of the biggest disappointments in the lowered numbers is that, as an instructor for a section of RSP this fall semester, I seem to have failed at recruiting more faculty requests for instruction. Not only did I attend all of the instructor meetings and network with faculty, I also presented to the faculty multiple times. I believed this would encourage more faculty to utilize our instructional program, but it did not.

Perhaps our liaisons, myself included, need to encourage faculty more aggressively to bring their sections to the library. Another idea emerged from teaching a session for a section led by a Chemistry faculty member. He showed great interest in RefWorks and wanted me to teach his students about it. I typically don’t mention RefWorks during instruction for RSP. Maybe we need to appeal more to the interests of each faculty member’s discipline. For example, faculty in the sciences may request sessions if we advertise teaching how to use RefWorks and the differences between primary and secondary sources.

With the new core curriculum beginning in Fall 2014, this may all be needless extrapolation. The information literacy outcome imposed on one of the new core courses holds promise for the future of the instructional program. Going forward, the Library needs to emphasize partnering faculty with librarians to achieve this outcome. I know there will be faculty who choose to go it alone, but it is our job to show that we can collaborate with them to help their students become information literate.

It has occurred to me that I’m putting a lot of emphasis on “usage” numbers. Student outcomes are the most important assessment piece when dealing with information literacy. If students are not learning anything, then the library is failing. But in order to teach students, we need to get them inside the doors, whether physically or virtually.

Dear Faculty, I Get It Now

Dear Faculty,

I never understood. What’s one class period? You’ve always said you want your students to produce the best research papers possible. When your students submit papers with only an assortment of open web sources, you shrug your shoulders, deduct points, and say, “Well, they should have known better.” But how could they include appropriate, scholarly sources when they know neither what they are nor where to find them?

But I get it now. I’ve spent time in the trenches. My one-credit class that meets only 50 minutes a week overflows with content. I struggled to fit it in. But I did. Granted, I had the advantage of being able to disperse the content throughout the course rather than in a one-shot session, but I did it. You can too! Even in your three-credit course that meets two and a half hours a week.

Break the mold of the one-shot session. Embed a librarian into your CMS. Invite the librarian for 15–20 minute segments throughout the semester. Partner with a librarian in a way that is meaningful for your class. Don’t just say, “I don’t have enough time.”

But I have now walked a small portion of your journey, and I can promise you this: The next time I hear that you don’t have enough time for librarians to teach information literacy skills, my eyes won’t roll quite as much.

Sincerely,

Your Friendly Instructional Services Librarian

Librarians and Academic Honesty: “Misusing Academic Resources”

Each fall semester I teach a number of information literacy sessions for freshmen in RSP 101: Introduction to the Culture of Collegiate Life. For the past two years, one of the faculty preceptors for this course has asked me to focus on academic honesty, plagiarism, and citing sources. His class session is always one of my favorites. I love a good debate, and nothing seems to fire students up like the topic of academic honesty and plagiarism. It’s a chance to witness confirmation bias at its finest; unless I reaffirm what a student already believes about academic honesty and plagiarism, they fight back.

Creighton University’s College of Arts and Sciences has a six-page policy outlining academic honesty. One of the examples of academic dishonesty is “misusing academic resources.” I included this in my presentation and asked students to think of activities that might fall under this ambiguous phrase. After students volunteered a series of examples, the faculty preceptor asked a very interesting question, one I have been thinking about for several weeks: “Could the use of a reference librarian ever fall under this category?”

After much thought, I remembered a student who recently came in for a research appointment. He had an unrefined topic for a political science literature review. All he told me was that he was writing about the causes of war. When I attempted a reference interview, he refused to give me any more to go on, insisting that his professor wanted the topic to be that broad. So I showed him a few databases and how many search results he would get with such a broad topic. I urged him to narrow it down and offered a few suggestions based on the results we found. As I prepared to send him on his way, he said this to me:

“I’m confused. I was told that if I came to a reference librarian, you would find all of my sources for me.”

The student needed 20 sources; I helped him get started with about seven. (As a side note, the student and I were both on a time crunch – we only had 20 minutes together.) Now, there are few things that irritate me more than a student who blurts out the untold secret. Sure, we help students find articles, but when you need 20, we’re not going to sit there and handpick them for you. We’re going to give you the skills and tools to discern between the results yourself.

Word of mouth: The preferred source of the 21st century

Then I compared this to the student who comes in asking for help and needs only three or four sources. Usually, we help them find all of their sources. Are we entering a grey area of academic honesty? Are we misusing academic resources – the resources being our own expertise?

After much thought and discussion with my colleagues, I’ve come to the conclusion that crossing the line into academic dishonesty is rare. The distinction aligns with the teaching mission of reference. We are working with the student, not for the student. We’re showing them how to search effectively. We’re not telling them how to use the information we help them find. We’re not working in a vacuum. We’re helping them learn the research process. We’re making them information literate students. Rare is the case where the librarian does it all for the student.

I’m aware this is a grey area. Some may think that we need to help the student find all 20 sources; however, I feel as though that’s ultimately doing them a disservice. As librarians, we are professionals and must use our best judgment to determine how far we’re comfortable pushing the line between academic dishonesty and genuinely helping a student.

I think of this line often, and I know where it’s placed in my mind. Whenever I work with a student, I’m very aware of how close we get to that line, and I know it will be crossed occasionally. As a librarian, I want students to succeed on their own and to know it’s OK to ask for help, but they also need to learn to work with me, not expect me to work for them.