E301: Keys to Success With Assessment & Evaluation #CILDC

Lincoln East

Wednesday, April 29, 2015
10:30 a.m. – 11:15 a.m.

Frank Cervone, Director of Information Technology and CISO, School of Public Health, University of Illinois at Chicago

Lecturer, School of Information, San Jose State University

Libraries and information agencies are under increasing pressure to demonstrate value. The key to demonstrating actual value is good data on services and programs. Cervone reviews the key considerations in an effective assessment and evaluation program and provides ideas and tools for your own assessment and evaluation efforts.

Dr. Frank Cervone – E301_Cervone.pdf (1 MB) Username/Password – CIL2015

Based on a course he teaches at San Jose State.

The research aspect is often not as rigorous as in other disciplines.

Assessment – increase quality

Evaluation – judge quality

Evaluation and assessment are grounded in research; political aspects need to be taken into account as well.

Evaluation and assessment are social research. Social research is different.

Must be able to deal with the politics of the organization being researched.

Major issues in conducting research in libraries

  • ability to perform data analysis – requires a background in statistics (see the sketch after this list)
  • data collection methods
  • culture of assessment
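
To illustrate the statistics-background point above, here is a minimal sketch of the kind of basic analysis involved, using Python's standard statistics module. The weekly gate counts and the normal-approximation confidence interval are assumptions for illustration, not data from the session.

```python
import statistics

# Hypothetical weekly gate counts collected during an assessment period.
gate_counts = [1240, 1315, 980, 1422, 1105, 1377, 1290, 1050]

mean = statistics.mean(gate_counts)
stdev = statistics.stdev(gate_counts)      # sample standard deviation
se = stdev / len(gate_counts) ** 0.5       # standard error of the mean

# Rough 95% confidence interval using a normal approximation.
low, high = mean - 1.96 * se, mean + 1.96 * se
print(f"mean={mean:.1f}, sd={stdev:.1f}, 95% CI ~ ({low:.1f}, {high:.1f})")
```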

Research Quality

  • Evidence-based librarianship
  • Hierarchy of evidence – levels include:
      • case studies, expert opinion
      • program evaluation, opinion surveys
      • case-controlled studies
      • cohort studies
      • further levels up the pyramid
  • Systems thinking will improve results

Goals

  • provide useful feedback

Research methodology is critical

  • Scientific-experimental model – quantitative, data-driven – e.g., prove the effect of the library on students' education
  • Qualitative/anthropological models – e.g., Margaret Mead; lots of qualitative data
  • Participant-oriented models – e.g., what do stakeholders want? Questions must come from the perspective of the stakeholders.

What should be assessed and evaluated?


Types

  • formative vs. summative
  • formative – looks to improve what is being evaluated, e.g., program review
  • summative – looks at outcomes and impact, e.g., meta-analysis, cost-benefit, cost-effectiveness, and impact analysis (a cost-benefit sketch follows this list)
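
As a hedged illustration of the cost-benefit side of summative evaluation, the sketch below computes a simple benefit-cost ratio for a hypothetical library program. The dollar figures and the idea of valuing each use at an assumed per-use value are illustrative assumptions, not from the session.

```python
# Minimal cost-benefit sketch for a hypothetical library program (illustrative numbers only).
annual_cost = 48_000.00   # staff time + licensing, assumed
uses_per_year = 12_500    # recorded uses of the service, assumed
value_per_use = 6.50      # assumed proxy value (e.g., avoided purchase) per use

total_benefit = uses_per_year * value_per_use
benefit_cost_ratio = total_benefit / annual_cost

print(f"Total benefit:      ${total_benefit:,.2f}")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")  # > 1.0 suggests benefits exceed costs
```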

Survey Research – at the bottom of the evidence pyramid

  • Problem – self-reported data reflects what people think they do, not what they actually do
  • A representative sample of the population is needed, and it needs to be selected randomly (see the sampling sketch after this list)
  • A standardized questionnaire is needed
  • Method of administration – self-administered or interviewer-administered
  • Need to pretest the survey
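
As a hedged sketch of the random-selection point above, the snippet below draws a simple random sample from a patron list using Python's standard library. The file name, column name, and sample size are assumptions for illustration only.

```python
import csv
import random

SAMPLE_SIZE = 400  # assumed target sample size

# Load the population frame (hypothetical export of patron email addresses).
with open("patron_emails.csv", newline="") as f:
    population = [row["email"] for row in csv.DictReader(f)]

# Simple random sample without replacement, so every patron has an equal chance of selection.
random.seed(2015)  # fixed seed so the draw can be reproduced and audited
sample = random.sample(population, k=min(SAMPLE_SIZE, len(population)))

print(f"Population: {len(population)}, sampled: {len(sample)}")
```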

Focus Groups

  • organized
  • Problems – group dynamics can produce invalid data; the population of the group needs to be clearly identified; qualitative data is hard to code

Open-ended versus closed-ended questions.

Voice of the customer

  • Who is the primary customer?
  • What are the pain points of different groups?
  • Which customers have the greatest worth?
  • Which issues have the largest impact?
  • What is the best way to rectify pain points and weaknesses?