
NALDC Record Details:

An overview of the BioCreative 2012 Workshop Track III: interactive text mining task

Permanent URL:
http://handle.nal.usda.gov/10113/56119
Abstract:
In many databases, biocuration primarily involves literature curation: retrieving relevant articles, extracting information that will translate into annotations and identifying new incoming literature. As the volume of biological literature increases, the use of text mining to assist biocuration becomes increasingly relevant. A number of groups have developed text mining tools from a computer science/linguistics perspective, and there are many initiatives to curate some aspect of biology from the literature. Some biocuration efforts already make use of a text mining tool, but there have been few broad-based systematic efforts to study which aspects of a text mining tool contribute to its usefulness for a curation task. Here, we report on an effort to bring together text mining tool developers and database biocurators to test the utility and usability of tools. Six text mining systems presenting diverse biocuration tasks participated in a formal evaluation, and appropriate biocurators were recruited for testing. The performance results from this evaluation indicate that some of the systems were able to improve the efficiency of curation by speeding up the curation task significantly (~1.7- to 2.5-fold) over manual curation. In addition, some of the systems were able to improve annotation accuracy when compared with performance on the manually curated set. In terms of inter-annotator agreement, the factors that contributed to significant differences for some of the systems included the expertise of the biocurator in the given curation task, the inherent difficulty of the curation task and attention to annotation guidelines. After the task, annotators were asked to complete a survey to help identify the strengths and weaknesses of the various systems.
The analysis of this survey highlights how important task completion is to the biocurators' overall experience of a system, regardless of the system's scores on design, learnability and usability. In addition, this task examined strategies for refining the annotation guidelines and system documentation, for adapting the tools to the needs and query types of the end user, and for evaluating performance in terms of efficiency, user interface, result export and traditional evaluation metrics. This analysis will help in planning a more intense study in BioCreative IV.
Author(s):
Cecilia N. Arighi, Ben Carterette, K. Bretonnel Cohen, Martin Krallinger, W. John Wilbur, Petra Fey, Robert Dodson, Laurel Cooper, Ceri E. Van Slyke, Wasila Dahdul, Paula Mabee, Donghui Li, Bethany Harris, Marc Gillespie, Silvia Jimenez, Phoebe Roberts, Lisa Matthews, Kevin Becker, Harold Drabkin, Susan Bello, Luana Licata, Andrew Chatr-aryamontri, Mary L. Schaeffer, Julie Park, Melissa Haendel, Kimberly Van Auken, Yuling Li, Juancarlos Chan, Hans-Michael Muller, Hong Cui, James P. Balhoff, Johnny Chi-Yang Wu, Zhiyong Lu, Chih-Hsuan Wei, Catalina O. Tudor, Kalpana Raja, Suresh Subramani, Jeyakumar Natarajan, Juan Miguel Cejuela, Pratibha Dubey, Cathy Wu
Subject(s):
Biological Sciences, accuracy, databases, design, guidelines, information retrieval, surveys, user interface
Source:
Database: The Journal of Biological Databases and Curation 2013 v.2012
Language:
English
Year:
2013
Collection:
Journal Articles, USDA Authors, Peer-Reviewed
Rights:
Works produced by employees of the U.S. Government as part of their official duties are not copyrighted within the U.S. The content of this document is not copyrighted.