J.E.G. Oosterman
The successful execution of knowledge crowdsourcing (KC) tasks requires contributors to possess knowledge or mastery in a specific domain. The need for expert contributors limits the capacity of online crowdsourcing marketplaces to cope with KC tasks. While online social platform
...
CroKnow
Structured Crowd Knowledge Creation
This demo presents the CrowdKnowledge Curator (CroKnow), a novel web-based platform that streamlines the processes required to enrich existing knowledge bases (e.g. Wikis) by tapping into the latent knowledge of expert contributors on online platforms. The platform integrates a num
...
Annotations obtained by Cultural Heritage institutions from the crowd need to be automatically assessed for their quality. Machine learning using graph kernels is an effective technique to use structural information in datasets to make predictions. We employ the Weisfeiler-Lehman
...
Crowd vs. Experts
Nichesourcing for knowledge intensive tasks in cultural heritage
The results of our exploratory study provide new insights into crowdsourcing knowledge-intensive tasks. We designed and performed an annotation task on a print collection of the Rijksmuseum Amsterdam, involving experts and crowd workers in the domain-specific description of depicte
...