User Experience Engineering

Jan 15-19: Adaptive Systems and Beyond
Page last edited by Per Bækgaard (pgba) 18/01-2018

Adaptive systems try to learn from user behaviour and biometric measurements, and adapt to our preferences and intents. This week focuses on designing systems that improve the UX by incorporating biometric measurements and learning from observed behaviour and metrics.
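As a minimal illustrative sketch (Python; all names and numbers are made up and not part of the hand-ins), one simple kind of adaptive system is a menu that learns from observed behaviour by promoting the most frequently used items, in the spirit of the Findlater & McGrenere paper in the literature list below:

  # Toy sketch of an adaptive menu: items the user selects most often
  # are promoted to the front of the rendered menu.
  from collections import Counter

  class AdaptiveMenu:
      def __init__(self, items):
          self.items = list(items)    # static, designer-defined order
          self.clicks = Counter()     # observed user behaviour

      def select(self, item):
          """Record one use of a menu item."""
          self.clicks[item] += 1

      def render(self, top_n=3):
          """Return the menu with the top_n most-used items promoted to the front."""
          promoted = [i for i, _ in self.clicks.most_common(top_n)]
          rest = [i for i in self.items if i not in promoted]
          return promoted + rest

  menu = AdaptiveMenu(["New", "Open", "Save", "Export", "Print"])
  for _ in range(5):
      menu.select("Export")
  menu.select("Print")
  print(menu.render())   # ['Export', 'Print', 'New', 'Open', 'Save']

A full adaptive system would of course combine such behavioural signals with the biometric measurements discussed this week.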

Notice: This page will be updated throughout the week with additional information and material.

Learning objectives
  • Incorporate biometric measurements into MVP UX prototypes
  • Create MVP UX prototypes that adapt to the cognitive or emotional state of the user
  • (Identify health-related aspects of MVP UX prototypes that can assist users in their personal life journey)

Monday

08:00: Wrap-Up and Introductory lecture on Adaptive Systems and Beyond

09:00: Group work and Hand-In of 3 ideas, each described by a Lean Canvas

12:00: Lunch break

13:00: Guest Lecture (Michael Kai Petersen)

14:00: Continued Group work, validating problems and markets and selecting one key idea

17:00: Hand-In of a Landing Page, Lean Canvas (and possibly initial USM and Annotated Wireframes) for the selected idea

The Fitts' law experiment will be available from around lunch and for the rest of the day
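For reference, the experiment is built around Fitts' law, which in one common (Shannon) formulation predicts the movement time of a pointing task from the target distance D and width W as MT = a + b * log2(D/W + 1). A small sketch (Python; the constants a and b below are placeholder values that would normally be fitted to your measured trials):

  import math

  def fitts_movement_time(distance, width, a=0.2, b=0.1):
      """Predicted movement time (s), Shannon formulation MT = a + b*log2(D/W + 1).
      a (intercept) and b (slope) are placeholders; fit them from experiment data."""
      index_of_difficulty = math.log2(distance / width + 1)   # in bits
      return a + b * index_of_difficulty

  # Example: a 512 px movement to a 32 px wide target
  print(round(fitts_movement_time(512, 32), 3))   # ~0.61 s with the placeholder constants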

Slides: Morning Lecture. Guest Lecture.

Tuesday

08:00: Feedback and lecture on Eye Tracking

09:00: Group work

12:00: Hand-In to CampusNet and Lunch break

13:00: Short feedback session followed by group work

15:59: Hand-In to Peergrade

16:00: Peergrade open until 20:00

The Fitts' law experiment will be available during most of the day

Slides: Morning Lecture.

Wednesday

08:00: Feedback and short discussion

08:30: Group work

12:00: Lunch break

13:00: Guest lecture on usability scoring and scoring of the MORIBUS app

15:59: Hand-In to Peergrade (partial focus on validation)

16:00: Peergrade open until 20:00

The Fitts' law experiment will be available during the day

Thursday

08:00: Feedback and final reflecting lecture

09:00: Group work

12:00: Hand-In to CampusNet and Lunch break

13:00: Short feedback session followed by group work and preparation for final presentation

17:00: Hand-In to CampusNet (your presentation for tomorrow)

The Fitts' law experiment will be available during the day

Slides: Morning Lecture

Friday

08:00: Group Presentation with Live Feedback Form (Results)

10:00: Group work

17:00: Hand-In of Report (0.5 page per group member) with Lean Canvas, Landing Page, User Story Map and Annotated (Micro-interactions) Wireframes, as well as documentation of your validations, as appendices, in one PDF file to CampusNet.

Name your file "GroupNN_title.pdf" (where title is a description of your work).

Also include links to POP/Marvel/InVision "executable prototypes" if relevant, but make sure all details of the wireframes are readable in the PDF file you hand in.

For the report content, please follow the guidelines given earlier (page 25 of this deck, also adding relevant validation documentation).

Literature (use findit.dtu.dk to read the articles not linked directly)

Findlater, L. and McGrenere, J., 2004, April. A comparison of static, adaptive, and adaptable menus. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 89-96). ACM.

Siefert, D.M. and McCollum, T.A., NCR Corporation, 1998. Predictive, adaptive computer interface. U.S. Patent 5,726,688.

Gomez-Uribe, C.A. and Hunt, N., 2016. The Netflix recommender system: Algorithms, business value, and innovation. ACM Transactions on Management Information Systems (TMIS), 6(4), p.13.

Bradley, M.M. and Lang, P.J., 1994. Measuring emotion: the self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25(1), pp.49-59.

Emotion Classification (wikipedia)

Beatty, J., 1982. Task-evoked pupillary responses, processing load, and the structure of processing resources. Psychological Bulletin, 91(2), pp.276-292.

affectiva.com (or AffdexMe)

Wallentin, M., Nielsen, A.H., Vuust, P., Dohn, A., Roepstorff, A. and Lund, T.E., 2011. Amygdala and heart rate variability responses from listening to emotionally intense parts of a story. Neuroimage, 58(3), pp.963-973.

iMotions on GSR

Khalfa, S., Isabelle, P., Jean-Pierre, B. and Manon, R., 2002. Event-related skin conductance responses to musical emotions in humans. Neuroscience Letters, 328(2), pp.145-149.

Eckstein et al. 2016: Beyond eye gaze: What else can eyetracking reveal about cognition and cognitive development

Bækgaard, Petersen, and Larsen 2016: Separating Components of Attention and Surprise

Bækgaard, Jalaliniya, and Paulin. 2016: Pupillary Measurement During an Assembly Task (DRAFT, published as part of PhD Thesis)

Bækgaard 2015: Eye Movements (short summary)

DL4J (http://deeplearning4j.org): Word2Vec

Rare Technologies: Word2Vec tutorial and demo app

Jon Kolko: "Design thinking comes of age" (Harvard Business Review, 2015)

