dc.identifier.uri http://dx.doi.org/10.15488/16917
dc.identifier.uri https://www.repo.uni-hannover.de/handle/123456789/17044
dc.contributor.author Oelen, Allard
dc.contributor.author Stocker, Markus
dc.contributor.author Auer, Sören
dc.contributor.editor Hammond, Tracy
dc.contributor.editor Verbert, Katrien
dc.contributor.editor Parra, Dennis
dc.contributor.editor Knijnenburg, Bart
dc.contributor.editor O'Donovan, John
dc.contributor.editor Taele, Paul
dc.date.accessioned 2024-04-08T06:46:43Z
dc.date.available 2024-04-08T06:46:43Z
dc.date.issued 2021
dc.identifier.citation Oelen, A.; Stocker, M.; Auer, S.: Crowdsourcing Scholarly Discourse Annotations. In: Hammond, T.; Verbert, K.; Parra, D.; Knijnenburg, B.; O'Donovan, J. et al. (eds.): IUI '21: 26th International Conference on Intelligent User Interfaces. New York, NY : Association for Computing Machinery, 2021, pp. 464-474. DOI: https://doi.org/10.1145/3397481.3450685
dc.description.abstract The number of scholarly publications grows steadily every year, and it becomes harder to find, assess, and compare scholarly knowledge effectively. Scholarly knowledge graphs have the potential to address these challenges. However, creating such graphs remains a complex task. We propose a method to crowdsource structured scholarly knowledge from paper authors with a web-based user interface supported by artificial intelligence. The interface enables authors to select key sentences for annotation. It integrates multiple machine learning algorithms to assist authors during the annotation, including class recommendation and key sentence highlighting. We envision the interface being integrated into paper submission processes, for which we define three main task requirements: The task has to be […]. We evaluated the interface with a user study in which participants were assigned the task of annotating one of their own articles. With the resulting data, we determined whether the participants were able to perform the task successfully. Furthermore, we evaluated the interface's usability and the participants' attitudes towards the interface with a survey. The results suggest that sentence annotation is a feasible task for researchers and that they do not object to annotating their articles during the submission process. eng
dc.language.iso eng
dc.publisher New York, NY : Association for Computing Machinery
dc.relation.ispartof IUI '21: 26th International Conference on Intelligent User Interfaces
dc.rights CC BY 4.0
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.subject Crowdsourcing Text Annotations eng
dc.subject Intelligent User Interface eng
dc.subject Knowledge Graph Construction eng
dc.subject Structured Scholarly Knowledge eng
dc.subject Web-based Annotation Interface eng
dc.subject.classification Konferenzschrift ger
dc.subject.ddc 004 | Informatik
dc.title Crowdsourcing Scholarly Discourse Annotations eng
dc.type BookPart
dc.type Text
dc.relation.isbn 978-1-4503-8017-1
dc.relation.doi https://doi.org/10.1145/3397481.3450685
dc.bibliographicCitation.firstPage 464
dc.bibliographicCitation.lastPage 474
dc.description.version publishedVersion eng
tib.accessRights frei zugänglich
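The abstract above names key sentence highlighting as one of the machine learning assists built into the annotation interface. As a minimal sketch of what such an assist could look like (a TF-IDF centroid heuristic standing in for the authors' actual models; the function name highlight_key_sentences and the top_k parameter are assumptions for illustration):

import re
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def highlight_key_sentences(text: str, top_k: int = 3) -> list[str]:
    # Hypothetical stand-in for the paper's highlighting model:
    # rank sentences by similarity to the document's TF-IDF centroid.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    if len(sentences) <= top_k:
        return sentences
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences)
    centroid = np.asarray(tfidf.mean(axis=0))            # (1, n_features)
    scores = cosine_similarity(tfidf, centroid).ravel()  # one score per sentence
    top = sorted(np.argsort(scores)[::-1][:top_k])       # keep reading order
    return [sentences[i] for i in top]

A submission-time interface would surface the returned sentences as pre-highlighted annotation candidates that the author can accept, adjust, or discard.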

