Explore crowdsourcing for ontology engineering, including collaborative approaches, challenges and competitions, experiments using MTurk and CrowdFlower, and open questions around quality assurance, incentives, and the choice of crowdsourcing approach.
Crowdsourcing ontology engineering
Elena Simperl
Web and Internet Science, University of Southampton
11 April 2013
Overview
• "online, distributed problem-solving and production model" [Brabham, 2008]
• Varieties: wisdom of the crowds/collective intelligence, open innovation, human computation...
• Why is it a good idea?
  • Cost and efficiency savings
  • Wider acceptance, closer to user needs, diversity
• Approaches
  • Collaborative ontology engineering
  • Challenges/competitions
  • Games with a purpose
  • Microtask/paid crowdsourcing
  • In combination with automatic techniques
Crowdsourcing ontology alignment
• Experiments using MTurk, CrowdFlower and established benchmarks
• Enhancing the results of automatic techniques (see the sketch below)
• Fast, accurate, cost-effective [Sarasua, Simperl, Noy, ISWC 2012]
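A minimal sketch of the hybrid pattern this slide describes: an automatic matcher proposes candidate mappings with confidence scores, high-confidence ones are accepted automatically, and only the uncertain middle band is routed to crowd workers for verification. The `CandidateMapping` shape, the `post_verification_microtask` stub (standing in for an MTurk/CrowdFlower API call), and the thresholds are illustrative assumptions, not details from the talk.

```python
from dataclasses import dataclass

@dataclass
class CandidateMapping:
    source_concept: str   # concept label/URI in ontology A
    target_concept: str   # concept label/URI in ontology B
    confidence: float     # automatic matcher's confidence in [0, 1]

def post_verification_microtask(mapping: CandidateMapping) -> None:
    """Hypothetical stub: publish a yes/no microtask asking workers
    whether the two concepts refer to the same thing."""
    print(f"Task: does '{mapping.source_concept}' mean the same as "
          f"'{mapping.target_concept}'?")

def triage(candidates, accept_above=0.9, reject_below=0.3):
    """Accept high-confidence mappings automatically, drop very
    low-confidence ones, and crowdsource the uncertain middle band.
    Thresholds are illustrative, not from the experiments."""
    accepted, crowdsourced = [], []
    for m in candidates:
        if m.confidence >= accept_above:
            accepted.append(m)
        elif m.confidence >= reject_below:
            post_verification_microtask(m)
            crowdsourced.append(m)
    return accepted, crowdsourced
```

The point of the triage step is the cost-effectiveness claim on the slide: the crowd is paid only for the mappings the matcher cannot decide on its own.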
Open questions
• Quality assurance and evaluation
• Incentives and motivators
• Choice of crowdsourcing approach and combinations of different approaches
• Reusable collection of algorithms for quality assurance, task assignment, workflow management, results consolidation etc. (a consolidation sketch follows below)
• Schemas recording provenance of crowdsourced data
• Descriptive framework for classification of human computation systems
  • Types of tasks and their mode of execution
  • Participants and their roles
  • Interaction with the system and among participants
  • Validation of results
  • Consolidation and aggregation of inputs into a complete solution
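As one example of the reusable consolidation algorithms called for above, here is a minimal sketch of majority voting over redundant worker judgements, with ties left unresolved. The data shapes and task ids are illustrative assumptions; real systems would typically weight votes by worker reliability.

```python
from collections import Counter

def consolidate(judgements):
    """Map each task id to its majority answer, or None on a tie.

    `judgements` maps a task id to the list of answers collected
    from the workers assigned to that task.
    """
    results = {}
    for task_id, answers in judgements.items():
        # most_common() sorts (answer, count) pairs by count, descending.
        (top, top_n), *rest = Counter(answers).most_common()
        results[task_id] = None if (rest and rest[0][1] == top_n) else top
    return results

# Example: three workers judge mapping 1, two workers judge mapping 2.
votes = {"map-1": ["yes", "yes", "no"], "map-2": ["yes", "no"]}
print(consolidate(votes))  # {'map-1': 'yes', 'map-2': None}
```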
Theory and practice of social machines: www.sociam.org http://sociam.org/www2013/