Saturday, January 31, 2009

Information Skills Needed

I've been doing a good deal of research/work lately with knowledge management. One of my concerns is that the focus so often seems to be only on output: where can we store knowledge? What sort of database can we build for it? Do we need more procedures manuals?

Here is a piece out of Millikin University on the information skills needed by those entering knowledge work roles. Apart from providing an opposite-side-of-the-coin view, it points to new tasks for educators and trainers in developing workers.

"The seven information skills highlighted are: (1) retrieving information; (2) evaluating information; (3) organizing information; (4) collaborating around information; (5) analyzing information; (6) presenting information; and (7) securing information. For each information skill, there is a discussion of its significance, the logical skills required for its effective use, and its technological components."

Monday, January 26, 2009

Final Version: E-Learning Buzzword Bingo Card

Here 'tis, with thanks to all those who contributed (see original post and comments). I had more suggestions than spaces (especially loved Bex's "needs more cowbell") so if I compile enough maybe there'll be a Card 2. I am scared to think there might be that many buzzwords associated with e-learning, but fear there probably are...

Saturday, January 24, 2009

The Collapse of a Community of Practice (CoP)

I have long been a subscriber to, first, the old TR-DEV listserv and, later, its revised format as a moderated Yahoo group. While the site shows 4,000 members, I would guess that truly active membership -- lots of posting, interaction, some argument -- is in the range of 50-100. Debates have been long and often spirited, and while I have not always found it all useful (too much parsing of semantics, too many side visits to politics last fall) it did keep me informed about current interests in the training field and what practitioners were really working on (as opposed to what the media often report). While a true community of practice is usually characterized by its lack of formal oversight, the moderators did a good job of blocking out blatant marketing attempts and people phishing for email addresses, and refocusing/refereeing discussions when needed.

ANYWAY, the announcement came from the moderators this week that the site will be shut down effective Tuesday, and they will not be entertaining any further discussion or answering responses about it. They did provide a long explanation, including acknowledgement of new social media technologies that did not exist back when the listserv was started. And, really, they said, they're tired. It is an often thankless job, with anyone with a beef about anything taking it out on the moderators who were doing this voluntarily in the first place. The moderators have already deleted all the materials in the archives, things like handouts and whitepapers and tools submitted by members.

The response has been, not unexpectedly, dramatic and emotional. People are shocked at the swiftness of the decision; comments on the board this week tend to alternate between "thanks for all the years of service" and "how dare you?" The conversations have raised some points to ponder on the matter of CoPs. Let's cogitate:

1. Who "owns" a CoP?
2. To whom does the material shared by, created by, and stored in a community repository belong?
3. Does the life of a community have such a definite end point? What will happen next?

While I am sad to see TR-DEV go I admit I have been fascinated at watching the drama play out this week. For those really interested in the philosophical side of all this, there is a small body of academic literature on power issues in CoPs; authors include Huzzard; Pemberton, Mavin, & Stalker; and Roberts.

Friday, January 23, 2009

E-Learning Buzzword Bingo Card

Clark Quinn, Cammy Bean, Steve Sorden and I have been having a Twitter discussion about buzzwords associated with e-learning. The conversation quickly showed that once-useful concepts are often cannibalized and reduced to little more than hype for marketers and the misguided. For more, read Clark's excellent post, "Less than Words."

Meanwhile, help me complete the "Official E-Learning Buzzword Bingo" card as we are still short a few terms -- but I know they're out there. What terms did we miss?

Saturday, January 17, 2009

Alternatives to Kirkpatrick

While the Kirkpatrick taxonomy is something of a sacred cow in training circles—and much credit goes to Donald Kirkpatrick for being the first to attempt to apply intentional evaluation to workplace training efforts—it is not the only approach. Apart from being largely atheoretical and ascientific (hence, 'taxonomy', not 'model' or 'theory'), several critics find the Kirkpatrick taxonomy seriously flawed. For one thing, the taxonomy invites evaluating everything after the fact, focusing too heavily on end results while gathering little data that will help inform training program improvement efforts. (Discovering after training that customer service complaints have not decreased only tells us that the customer service training program didn’t “work”; it tells us little about how to improve it.)

Too, the linear causality implied within the taxonomy (for instance, the assumption that passing a test at level 2 will result in improved performance on the job at level 3) masks the reality of transfer of training efforts into measurable results. There are many factors that enable or hinder the transfer of training to on-the-job behavior change, including support from supervisors, rewards for improved performance, culture of the work unit, issues with procedures and paperwork, and political concerns. Learners work within a system, and the Kirkpatrick taxonomy essentially attempts to isolate training efforts from the systems, context, and culture in which the learner operates.

In the interest of fairness I would like to add that Kirkpatrick himself has pointed out some of the problems with the taxonomy, and suggested that in seeking to apply it the training field has perhaps put the cart before the horse. He advises working backwards through his four levels more as a design, rather than an evaluation, strategy; that is: What business results are you after? What on-the-job behavior/performance change will this require? How can we be confident that learners, sent back to the work site, are equipped to perform as desired? And finally: how can we deliver the instruction in a way that is appealing and engaging?

An alternative approach to evaluation was developed by Daniel Stufflebeam. His CIPP model, originally covering Context-Input-Process-Product/Impact, and later extended to include Sustainability, Effectiveness, and Transportability, provides a different take on the evaluation of training. Western Michigan University has an extensive overview of the application of the model, complete with tools, and a good online bibliography of literature on the Stufflebeam model. Short story: this one is more about improving what you're doing than proving what you did.

More life beyond Kirkpatrick: Will Thalheimer endorses Brinkerhoff's Success Case evaluation method and commends him for advocating that learning professionals play a more “courageous” role in their organizations.

Enough already, Jane! More later on alternatives to the Kirkpatrick taxonomy. Yes, there are more.

(Some comments adapted from the 'evaluation' chapter in my book, From Analysis to Evaluation: Tools, Tips, and Techniques for Trainers. Pfeiffer, 2008.)

Tuesday, January 13, 2009

The First Help Desk Call

"Compared to the scroll, it takes longer to turn the pages of a book." And what about the manual?

Monday, January 05, 2009

Hemorrhaging Money

I've talked about this before and want to add a new voice to the choir. I get two kinds of calls from people wanting to "do" e-learning. The first come from those who are interested in expanding their scope to include more learners, to reduce travel and other costs, or to otherwise solve a business problem. The other calls come from those who want to know how to track and monitor and measure completions. They are always more interested in buying an LMS they don't yet need (and often don't even really know what it does) than in designing anything resembling effective online training.

The question of buying an LMS to track and monitor and yada yada recently came up on one of the Yahoo discussion groups to which I belong. Here are some fabulously in-your-face comments from Peter Hunter, quoted with his permission:

"If your training is not producing added value to your bottom line, then what is the point of tracking it?

All you are doing is measuring the exact rate that the training department is hemorrhaging money out of the company.

If your training is adding value, then measure the value it is adding.

When we train for the sake of training we are destroying the organisation we are supposed to be supporting.

Think carefully about why you need this software and if the reason turns out to be that your boss told you to get it, go ahead."

Friday, January 02, 2009

Tony Karrer's E-Learning Learning Community

Thanks to TechEmpower's Tony Karrer for including the Bozarthzone blog on his list of sources for eLearning Learning. It's "a community that tries to collect and organize the best information on the web that will help you learn and stay current on eLearning."

Be sure to check it out, and while you're at it be sure to also take a look at Tony's eLearning Technology blog.

(And for you Twitterers/Tweeters/Twitterpeeps types, he's well worth following there, too.)

And PS: Happy New Year!