Competency lists considered harmful

Can we rethink them?

Could we talk about skill and competency lists, please? They’re everywhere, inescapable as change. Professional organizations have made dozens. Dozens more come from the LIS literature as content analyses of collections of job ads or position descriptions. Whatever job you do or want to do in libraries, someone’s made a list of the skills you must supposedly have mastered.

I’m not convinced these lists are as useful as they could be. I’m completely convinced they do a lot of unnecessary harm. If we must have them, we could stand to rethink how we make and present them.

I avoid showing competency lists to my students because they reliably freak out over list length and complexity, never mind the highly prescriptive, even accusatory, tone in which the list’s surrounding text is often written. The more conscientious the students are, the worse they panic. Worse yet, I’ve seen their panic send them into Impostor Syndrome tailspins that sap their curiosity as well as their willingness to tackle exactly the new and growing areas where academic libraries most need them. I do my best to talk them down, but it doesn’t always work. Sadly, it’s commonly the brightest, most promising students who retreat fastest and furthest, afraid they’re in for nothing but harsh judgment and failure if they pursue jobs described in these lists. Frankly, I think these lists too often provide workplace grist for exactly the harsh judgments my students are desperate to avoid. Skill lists unaccompanied by information about available resources and job context make it easy to commit the fundamental attribution error when something goes wrong: blaming a student or working librarian for not having enough (or the right) skills instead of doing the broad, honest analysis of the situation that might implicate the library in some or all of the difficulty.

A number of competency-list interventions, some easier to implement than others, could stem the unproductive panic. Some sense of priority, some ranking by centrality to the job or association with specific job tasks, would be an enormous relief. A roadmap would be even better: “start here, expand into this, eventually pick up that, but only the die-hards find that other thing useful.” My students experience lengthy, unranked, unprioritized laundry lists of skills as accusations that they can never learn enough or be good enough, or even as subtextual gloating that they’ll never win jobs. Understandably wanting to dump the stress, they turn furiously on us instructors for yet another tired round of the theory-praxis wars. This is neither necessary nor useful. No one really expects students to pick up a lengthy career’s worth of knowledge in a mere twelve to fourteen three-credit courses! How tremendously insulting to longtime professionals such an expectation would be. The problem is that laundry lists of unranked skills imply precisely that expectation.

Another useful change, then, though it would take real research, would be an indication of how, and roughly when in their careers, practitioners acquire job-related skills and knowledge. Taking scholarly communication as an example, I learned to read journal-publication contracts through on-the-job experience, and I strongly doubt I’m alone among scholarly-communication specialists in that. The same goes for any number of technical chores, too numerous and boring to list, specific to the various roles I’ve undertaken. Not only would a sense of timing, optionality, and learning modality relieve my students’ (and consequently my own) stress, it would also help librarians who need to update their skills, cross-train in something new to them, or change their specialty. It doesn’t always make sense to learn some things in classrooms, much less to learn everything right away. It’d be awfully nice to know which skills belong where and when.

It doesn’t help that competency lists are written from the point of view of some sort of neo-Platonic universal library that does everything imaginable in-house and is simultaneously tiny, gigantic, and every size in between. In real academic libraries, the skills needed for what is nominally the same job are partial, context-based subsets of the whole. A library whose institutional repository runs on open-source software managed in-house needs different skills in its institutional-repository manager than does a library that pays for a vendor’s software-as-a-service offering. A library working toward a campus open-access policy needs different people skills from one whose faculty have already implemented such a policy. When competency lists do not clearly tie listed skills to real-world tasks and situations, they fail to heed the contexts that shape the need for particular skills, much less help list users winnow the list wisely in accordance with their local context.

Distinguishing between a skill or knowledge that must always be at the librarian’s fingertips and one that can be looked up as needed would also be nice. “Publisher self-archiving policies” is a frequent entry on scholarly-communication competency lists. Nobody in the field would ever go about memorizing them all, though, not least because they change on a whim. Looking them up as needed is what SHERPA/RoMEO is for, and when that service doesn’t come through, librarians investigate publisher websites or read example contracts at time of need. My students don’t know that, though, and it’s impossible for the inexperienced to tell the difference from the competency lists alone. The lack of differentiation between “know this” and “know where to look this up” doesn’t just panic my students, of course; library managers and search committees can be forgiven for letting competency lists send them on wild-goose chases for employees with encyclopedic knowledge of a topic that practitioners in the field actually just look up.

That leads me to job-ad content analyses in the LIS literature, a genre I honestly find exasperating. My problem isn’t so much with content-analysis technique as with the uncritical acceptance of job ads as realistic guides to employee skills. Stop me if you’ve heard this one before: a search-committee chair sends out a plea to librarian friends on social media, “We’re hiring a Library Shininess Specialist, which we’ve never had before. Somebody please tell me what I should put in the job ad!” Or this one: the skills and responsibilities sections of a job ad are nothing but giant laundry lists compiled from other job ads and content analyses from the LIS literature, coupled with stingy or even absent discussion of what resources the library will provide to whoever wins the job. When I see these social-media requests and patchwork ads, I make a mental note to warn my students against applying to the job. These ads come from libraries that have not thought hard enough about their context and their milestones, much less what a new Library Shininess Specialist needs from their library employer in order to succeed. I don’t want my new graduates to burn out and leave.

In other words, too many job ads are pure wishlists. Some are even wishlists patched together from other wishlists! Unfortunately, the cost of an unrealistic, naively compiled laundry list of a job ad does not become evident until a search fails or a hire doesn’t work out, which is not enough to keep bad ads from being written and published in the first place. If the LIS literature has any way to tell the difference between a thoughtful, carefully crafted job ad and a hasty, sloppy patchwork wishlist, I have yet to see it; bad ads are analyzed as though equivalent to excellent ones. Nor does the job-ad analysis literature assess job-ad outcomes. This is understandable, as gathering the data would be fraught with human-resources confidentiality pitfalls, but the unfortunate result is that no one actually knows how well job ads do at attracting viable candidates, much less at achieving successful hires. Why, then, do we grant job-ad analyses so much credence? In addition to feeding back into more bad job ads, these analyses also fuel competency lists, which is nothing if not troubling; a realistic competency list cannot be grounded in untested, assessment-free wishlists. In LIS education, these content analyses and the resulting competency lists become sticks to beat educators with, fueling staggeringly impractical expectations from students and practitioners about what a two-year master’s curriculum can realistically accomplish. Garbage in, garbage out, garbage everywhere!

I like the idea of competency lists, just not their present construction. In an ideal world, these lists would reduce anxiety in library-school students and practitioners committed to lifelong learning, channeling their energy productively by breaking down jargon-laden job titles into a sensible succession of digestible pieces. Properly coupled with task analysis, competency lists could also be useful professional-advocacy tools, expressing clearly what librarians really do with their days. Finally, competency lists ought to be much better tools than they are for libraries and librarians working out how to implement new initiatives. If we reconceive these lists as tools to help librarians and library-school students plan their learning, and libraries plan their evolution, we can perhaps escape the anxiety, censorious finger-pointing, and poor planning such lists far too often incite today.

Note: This post is copyright 2014 by Library Journal. Reposted under the terms of my author’s agreement, which permits me to “reuse the work in whole or in part in your own professional activities and subsequent writings.”