Abstract

Researchers analyze text within electronic medical records to understand patient conditions and disease progression. Current chart review processes are slow and laborious, making it difficult to review large patient cohorts. This demonstration presents VBOSSA, a crowdsourcing framework built on top of PYBOSSA for scalable and efficient chart reviews. VBOSSA has been used by 18 workers to assist 10 researchers from a variety of clinical specialties in answering 22,726 unique questions of varying difficulty, saving these experts over 700 hours of manual chart review. Projects for which a gold standard was established had an average accuracy of 86%, while projects in which each question was answered by more than one worker had an average inter-worker agreement of 78%.