Title of Article: Reflections on Information Retrieval Evaluation
Authors: Mei-Mei Wu and Diane H. Sonnenwald
URL : http://pnclink.org/annual/annual1999/1999pdf/wu-mm.pdf
Abstract:
The paper reviews major issues in IR evaluation under the systems and user paradigms, and discusses the strengths and weaknesses of each. The two paradigms are synthesized into a newly proposed framework for IR evaluation. The framework is based on attributes that have been shown to influence the adoption of innovations, and criteria and measures for these attributes are suggested.
Three things I learned from my reading assignments:
The attributes used in the proposed evaluation framework are: relative advantage, compatibility, complexity, trialability, and observability.
The criteria used are: system relevance, topical relevance, speed, economic gain, motivational relevance, organizational relevance, social relevance, usability, cognitive relevance, situational relevance, ease of experimentation, and degree of demonstration.
The measures used are: recall and precision, source contents (type of coverage), system response time, cost-benefit analysis, meets users' expectations, meets the organization's expectations, meets society's expectations, compatibility with public policy, task completion time, error rate, error correction time, task completion rate, user satisfaction, user satisfaction in problem or work contexts, availability, training time and other start-up costs, and cost of observation.
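Of these measures, recall and precision are the classic system-side metrics. Below is a minimal Python sketch of how they are computed, assuming binary relevance judgments; the function and variable names and the example document IDs are illustrative, not taken from the paper.

def precision(retrieved: set, relevant: set) -> float:
    """Fraction of retrieved documents that are relevant."""
    if not retrieved:
        return 0.0
    return len(retrieved & relevant) / len(retrieved)

def recall(retrieved: set, relevant: set) -> float:
    """Fraction of relevant documents that were retrieved."""
    if not relevant:
        return 0.0
    return len(retrieved & relevant) / len(relevant)

# Hypothetical example: 10 documents retrieved, 8 relevant overall,
# 6 documents in common between the two sets.
retrieved_docs = set(range(1, 11))          # doc IDs 1..10
relevant_docs = {1, 2, 3, 4, 5, 6, 12, 15}  # 8 relevant docs
print(precision(retrieved_docs, relevant_docs))  # 6/10 = 0.6
print(recall(retrieved_docs, relevant_docs))     # 6/8  = 0.75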
Implications for me and my work:
The evaluation framework in this reading will serve as a guideline for our plan to digitize the research of the medical consultants, fellows, and resident physicians at the National Kidney and Transplant Institute hospital.