Tuesday, February 22, 2011

Knowledge Exchange - the Assessment Method

When Jen Salvo-Eaton was designing the Knowledge Exchange program, she identified a need to assess its success. The Access Services Supervisors decided to create an anonymous assessment questionnaire that would be administered to every staff member and supervisor before the Knowledge Exchange training began, and again upon completion of the program. I was tasked with organizing and editing the questionnaire based on the Supervisors' input.

The questionnaire consisted of six questions for each of the Access Services departments (with the exception of Offsite). The first question was the same for every department: on a scale of 1 (low) to 10 (high), please rate how familiar you are with the following department's policies and procedures. The other five questions pertained to specific policies for that department that the Supervisors felt every staff member should know.
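For anyone curious about the mechanics, here is a minimal sketch in Python of how a questionnaire with this shape could be represented. The department list matches the results below, but the question texts and the build_questionnaire function are illustrative placeholders, not our actual survey.

    # A minimal sketch of the questionnaire structure: one familiarity
    # rating plus five policy questions per department. Question texts
    # here are placeholders, not the actual survey items.

    DEPARTMENTS = [
        "Circulation",
        "Delivery Services",
        "Interlibrary Loan",
        "Library Privileges",
        "Reserves",
        "Stacks",
    ]

    def build_questionnaire(departments):
        """Return a list of question dicts, six per department."""
        questions = []
        for dept in departments:
            # Question 1 is identical for every department.
            questions.append({
                "department": dept,
                "type": "rating",  # answered on a 1 (low) to 10 (high) scale
                "text": f"Please rate how familiar you are with {dept}'s "
                        "policies and procedures.",
            })
            # Questions 2 through 6 cover department-specific policies.
            for i in range(5):
                questions.append({
                    "department": dept,
                    "type": "policy",  # graded correct/incorrect
                    "text": f"[{dept} policy question {i + 1}]",
                })
        return questions

    if __name__ == "__main__":
        q = build_questionnaire(DEPARTMENTS)
        print(len(q), "questions")  # 6 departments x 6 questions = 36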

Thirty-four staff members and supervisors from Access Services took the questionnaire; the initial results are below (average familiarity rating and percentage of policy questions answered correctly):

Circulation: 6.12 familiarity, 47.6% correct
Delivery Services: 3.97 familiarity, 16.5% correct
Interlibrary Loan: 4.21 familiarity, 44.1% correct
Library Privileges: 5.79 familiarity, 84.7% correct
Reserves: 5.38 familiarity, 63.5% correct
Stacks: 5.81 familiarity, 67.6% correct
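For those wondering how the two figures were tallied, the arithmetic is simple: average the 1-10 ratings, and divide correct answers by questions answered, per department. Here is a minimal sketch of that tabulation with hypothetical field names; it is not the spreadsheet we actually used.

    # A minimal sketch of tabulating the results: the mean familiarity
    # rating and the percentage of policy questions answered correctly,
    # per department. Field names here are hypothetical.

    from collections import defaultdict

    def tabulate(responses):
        """responses: list of dicts like
        {"department": "Stacks", "rating": 7, "correct": 3, "answered": 5}
        where each dict is one respondent's results for one department."""
        ratings = defaultdict(list)
        correct = defaultdict(int)
        answered = defaultdict(int)
        for r in responses:
            dept = r["department"]
            ratings[dept].append(r["rating"])
            correct[dept] += r["correct"]
            answered[dept] += r["answered"]
        for dept in sorted(ratings):
            mean_rating = sum(ratings[dept]) / len(ratings[dept])
            pct = 100.0 * correct[dept] / answered[dept]
            print(f"{dept}: {mean_rating:.2f} familiarity, {pct:.1f}% correct")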

(Fun facts: the question answered correctly most often was “What is the shortest length of time someone can rent a locker for?”; the question missed most often was “What type of items can be requested through Delivery Services?” If you do not know the answers to these questions yourself, you should seek them out.)

The Knowledge Exchange is an entirely new approach to interdepartmental learning, and as we move further along we are adjusting the program as needed. Similarly, this is the first time we have attempted to assess one of our projects in this manner, and in retrospect there are changes I should have made to make the questionnaire more uniform. The test should have included at least one multiple choice question, one true or false question, and one long-form response for each department. Instead, one department's section was heavy with true or false questions, another was devoted to long-form responses, and a third asked questions solely with numeric answers. I suspect this disparity in question types accounts for part of the disparity between departments in the number of questions answered correctly (and it may also be reflected in how much the scores change when the questionnaire is administered the second time).
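When the second administration is complete, measuring that change will be a matter of subtracting the first results from the second, department by department. Here is a minimal sketch seeded with the initial results above; the report_change function and the idea of a post-training table in the same shape are hypothetical.

    # A minimal sketch of comparing pre- and post-training results.
    # Each table maps department name -> (familiarity, pct_correct);
    # the pre-training values below are the initial results above.

    PRE = {
        "Circulation": (6.12, 47.6),
        "Delivery Services": (3.97, 16.5),
        "Interlibrary Loan": (4.21, 44.1),
        "Library Privileges": (5.79, 84.7),
        "Reserves": (5.38, 63.5),
        "Stacks": (5.81, 67.6),
    }

    def report_change(pre, post):
        """Print the change in familiarity and percent correct per department."""
        for dept in sorted(pre):
            d_rating = post[dept][0] - pre[dept][0]
            d_correct = post[dept][1] - pre[dept][1]
            print(f"{dept}: {d_rating:+.2f} familiarity, {d_correct:+.1f} pts correct")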

So far there has been a lot of positive feedback from staff about the trainings they have received. It will be interesting to see how their experiences translate into improved knowledge and understanding of the various departments. Fortunately, we now have an assessment tool that will help us determine that. Stay tuned.
