arbeiten:comparing_annotation_tools

  
---- dataentry StudentischeArbeit ----
Thema                                  : Systematic Comparison of Annotation Tools
Art_thesistypes                        : FPM
BetreuerIn_thesisadvisor               : Andreas Schmid
BearbeiterIn                           : Florian Kaindl, Philipp Schauhuber
ErstgutachterIn_thesisprofessor        : 
ZweitgutachterIn_secondthesisprofessor : 
Status_thesisstate                     : abgeschlossen
Stichworte_thesiskeywords              : Physical-Digital, Annotation, PDF, Paper
angelegt_dt                            : 2020-11-03
Anmeldung_dt                           : 
Antrittsvortrag_dt                     : 
Abschlussvortrag_dt                    : 
Abgabe_dt                              : 2021-04-15
Textlizenz_textlicense                 :  # #Lizenz|##
Codelizenz_codelicense                 :  # #Lizenz|##
  
  
=== Background ===

Annotating text documents is a common task among office workers.
Besides the "classical" method of highlighting and taking notes on paper, there is a variety of tools available, ranging from the built-in functions of PDF readers, through annotating LaTeX source code (e.g. in [Overleaf](https://overleaf.com)), to emulating the paper-based process with a tablet and stylus.
Each of these techniques offers different affordances to users, making some types of annotation easier and others harder.
  
=== Objective of the Thesis ===
  
This work aims to find out to what extent the tools available in a given annotation method influence which kinds of annotations are made and how many errors in a text are found.
This includes creating an overview of existing annotation methods beforehand.
In a controlled user study, representative annotation methods are compared in terms of qualitative characteristics and error recall.
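
To illustrate the error recall measure mentioned above, here is a minimal Python sketch of how recall per annotation method could be computed. The method names, participant counts, and error numbers are purely hypothetical assumptions for illustration, not data from the study.

```python
# Minimal sketch: mean error recall per annotation method.
# All names and numbers are hypothetical placeholders, not study data.

TOTAL_ERRORS = 20  # errors deliberately planted in the study text (assumed)

# errors found by each participant, grouped by annotation method (hypothetical)
found_errors = {
    "paper":      [14, 16, 12],
    "pdf_reader": [11, 13, 10],
    "tablet":     [15, 14, 13],
}

def recall(found: int, total: int = TOTAL_ERRORS) -> float:
    """Error recall: share of planted errors a participant marked."""
    return found / total

for method, counts in found_errors.items():
    mean_recall = sum(recall(c) for c in counts) / len(counts)
    print(f"{method:11s} mean error recall: {mean_recall:.2f}")
```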
  
=== Concrete Tasks ===
=== Expected Prior Knowledge ===
  
  * designing and conducting a user study
  
=== Further Reading ===
  
TBD