
Towards an Automated Measurement for Visual Complexity of User Interfaces: Concept and Implementation of a Software Prototype

Thema | Topic:
Towards an Automated Measurement for Visual Complexity of User Interfaces: Concept and Implementation of a Software Prototype
Art | Type:
MA
BetreuerIn | Supervisor:
Christian Wolff
BearbeiterIn | Author:
Arne Tiedemann
ErstgutachterIn | First Examiner:
Christian Wolff
Status:
in Bearbeitung (in progress)
Stichworte | Keywords:
Visual Complexity, User Experience, Image Compression, User Interface, Adaptive Systems
angelegt | Created:
2023-10-26
Anmeldung | Registration:
2023-10-31
Antrittsvortrag | Kick-off Presentation:
2023-11-13


Hintergrund | Background

In our daily lives, we are surrounded by multimodal, adaptive User Interfaces (UIs). Against this background, SRI International (SRI)1 has developed the HCI framework bRIGHT (Senanayake et al., 2018; Senanayake & Denker, 2019, 2022), which can be applied to a variety of use cases. User Experience (UX) plays a central role here, in particular the user’s cognitive load caused by visual stimulus overload, i.e., by the visual complexity of the User Interface (UI). Different methods and approaches have been used in the past to measure and quantify visual complexity. One of the most promising is Image Compression (Donderi, 2006; Forsythe, 2009). However, researchers are also pursuing other approaches: Rosenholtz et al. (2005) proposed Feature Congestion as a measure of display clutter and thus of visual complexity, and in a later paper compared it with other measures such as Subband Entropy and Edge Density (Rosenholtz et al., 2007). Recent research based on Deep Neural Networks and Machine Learning, such as the work of Machado et al. (2015) or Fernandez-Lozano et al. (2019), also shows promising results. Yet, none of these approaches has been implemented in running software that automatically measures a UI’s visual complexity.
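
The intuition behind the compression-based approach is that visually complex images contain more detail and therefore compress less well, so the size of a compressed screenshot relative to its raw pixel data can serve as a rough complexity score (Donderi, 2006; Forsythe, 2009). As a purely illustrative sketch (not part of the thesis itself), this idea could be expressed in the browser as follows; the function name estimateVisualComplexity and the JPEG quality of 0.8 are assumptions made here for illustration:

// Minimal sketch of a compression-based visual complexity estimate in the
// browser. It assumes the UI has already been rendered onto an
// HTMLCanvasElement (e.g., by a screenshot library; not shown here).
// Busy, detail-rich UIs compress worse and therefore score higher.
async function estimateVisualComplexity(canvas: HTMLCanvasElement): Promise<number> {
  // Size of the uncompressed RGBA pixel buffer in bytes.
  const rawBytes = canvas.width * canvas.height * 4;

  // Encode the canvas as JPEG; the achievable compression depends on
  // how much visual detail the image contains.
  const blob: Blob | null = await new Promise((resolve) =>
    canvas.toBlob(resolve, "image/jpeg", 0.8)
  );
  if (!blob) {
    throw new Error("Canvas could not be encoded");
  }

  // Normalized ratio in (0, 1]: higher means less compressible,
  // i.e., visually more complex.
  return blob.size / rawBytes;
}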


Zielsetzung der Arbeit | Objective of this work

In cooperation with SRI, the objective of this thesis is to investigate to what extent the visual complexity of a UI can be measured at runtime. Presumably, with a real-time complexity score, systems will be able to make more adequate decisions about the data they display, which in turn should result in a better UX. For this purpose, it is necessary to evaluate and prioritize known methods from the literature with respect to their efficiency and effectiveness. Three preliminary studies will then be used to verify the most promising method for integration into the bRIGHT framework. Thereafter, a proof-of-concept (PoC) in the form of a web-based software prototype (HTML) will be implemented and evaluated in a final study. The objective of the PoC is to (1) validate the conceptual design of such an automated measurement and (2) provide a first working implementation of a method for computing a visual complexity score for a UI that future work can build upon.
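
As a hypothetical illustration of what collecting visual complexity at runtime could look like in a web-based prototype, the following sketch captures a region of the page periodically (here with the html2canvas library, one possible choice that the thesis does not prescribe), scores it with the compression-based estimate sketched above, and reports the score to a callback that an adaptive system could react to; all identifiers are illustrative:

// Hypothetical runtime measurement loop: capture the UI region at a fixed
// interval, compute a complexity score, and hand it to a callback.
import html2canvas from "html2canvas";

function monitorComplexity(
  target: HTMLElement,
  onScore: (score: number) => void,
  intervalMs = 2000
): () => void {
  const timer = setInterval(async () => {
    const canvas = await html2canvas(target);        // render the UI region to a canvas
    onScore(await estimateVisualComplexity(canvas)); // report the current score
  }, intervalMs);

  // Return a stop function so the caller can end the measurement loop.
  return () => clearInterval(timer);
}

// Usage: log a score for the whole document body every two seconds.
const stop = monitorComplexity(document.body, (s) => console.log("complexity:", s));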


Konkrete Aufgaben | Tasks

To achieve these objectives, the following concrete steps and tasks have been defined:

  • Literature review on different approaches to measure and quantify visual complexity
  • Evaluation of the most promising approach through three preliminary studies
  • Conceptual development of an automated measurement approach
  • Implementation of a PoC for the above-mentioned approach
  • Conducting a conclusive study to evaluate the automated measurement concept


Erwartete Vorkenntnisse | Prior Knowledge

  • UX design & research
  • Scientific work & writing
  • Web development


Weiterführende Quellen | References

Donderi, D. C. (2006). Visual complexity: A review. Psychological Bulletin, 132(1), 73–97. doi: 10.1037/0033-2909.132.1.73

Fernandez-Lozano, C., Carballal, A., Machado, P., Santos, A., & Romero, J. (2019, July). Visual complexity modelling based on image features fusion of multiple kernels. PeerJ, 7, e7075. Retrieved 2023-08-10, from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6642794/ doi: 10.7717/peerj.7075

Forsythe, A. (2009). Visual Complexity: Is That All There Is? In D. Harris (Ed.), Engineering Psychology and Cognitive Ergonomics (pp. 158–166). Berlin, Heidelberg: Springer. doi: 10.1007/978-3-642-02728-4_17

Machado, P., Romero, J., Nadal, M., Santos, A., Correia, J., & Carballal, A. (2015, September). Computerized measures of visual complexity. Acta Psychologica, 160, 43–57. Retrieved 2023-08-07, from https://www.sciencedirect.com/science/article/pii/S0001691815300160 doi: 10.1016/j.actpsy.2015.06.005

Rosenholtz, R., Li, Y., Mansfield, J., & Jin, Z. (2005, April). Feature congestion: a measure of display clutter. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 761–770). New York, NY, USA: Association for Computing Machinery. Retrieved 2023-08-11, from https://dl.acm.org/doi/10.1145/1054972.1055078 doi: 10.1145/1054972.1055078

Rosenholtz, R., Li, Y., & Nakano, L. (2007, August). Measuring visual clutter. Journal of Vision, 7(2), 17. Retrieved 2023-08-10, from https://doi.org/10.1167/7.2.17 doi: 10.1167/7.2.17

Senanayake, R., & Denker, G. (2019). Workstations of the Future for Transformational Gains in Solving Complex Problems. In M. Kurosu (Ed.), Human-Computer Interaction. Design Practice in Contemporary Societies (pp. 476–488). Cham: Springer International Publishing. doi: 10.1007/978-3-030-22636-7_36

Senanayake, R., & Denker, G. (2022). bRIGHT – A Framework for Capturing and Adapting to Context for User-Centered Design. In C. Ardito et al. (Eds.), Sense, Feel, Design (pp. 158–173). Cham: Springer International Publishing. doi: 10.1007/978-3-030-98388-8_15

Senanayake, R., Denker, G., & Lincoln, P. (2018). bRIGHT – Workstations of the Future and Leveraging Contextual Models. In S. Yamamoto & H. Mori (Eds.), Human Interface and the Management of Information. Interaction, Visualization, and Analytics (pp. 346–357). Cham: Springer International Publishing. doi: 10.1007/978-3-319-92043-6_29


1 https://www.sri.com; Last access: September 26, 2023; 10:24 AM