Title Page
Abstract
Contents
I. Introduction 15
1.1. Background 15
1.2. Thesis Statement 17
1.3. Thesis Structure 17
II. Literature Review 18
2.1. Problems of Online Meetings 18
2.2. Non-verbal Communication in Online Meetings 18
2.3. Non-verbal Expression Recognition 21
III. Interactive Design of NEAS 23
3.1. Analysis of Prior Online Meeting Experience 23
3.1.1. Participants & Method 23
3.1.2. Results 23
3.1.3. Discussion 26
3.2. Pilot Workshop 26
3.3. Workshop 1. The Moment Users Need Non-verbal Expressions in Online Meetings 27
3.3.1. Participants & Method 27
3.3.2. Results 27
3.4. Workshop 2. Non-verbal Expression & Visual Aid 29
3.4.1. Participants & Method 29
3.4.2. Results 29
3.5. Co-design for Non-verbal Expressions and Visual Aid in NEAS 30
IV. System Design of NEAS 32
4.1. Pilot Demonstration with Google Teachable Machine 32
4.1.1. Demonstration 32
4.1.2. Discussion 32
4.2. Pilot Demonstration with Google MediaPipe 33
4.2.1. System Flow 33
4.2.2. Data Collection and Preprocessing 33
4.2.3. Model Selection 34
4.2.4. Machine Learning System Design 35
4.2.5. Pilot Demonstration 37
4.3. Preliminary Experiment 37
4.3.1. Participants & Method 37
4.3.2. Experimental Environment & Apparatus 38
4.3.3. Results & Discussion 39
4.4. NEAS: Non-verbal Expression Assistant System 40
4.4.1. Data Collection and Preprocessing 41
4.4.2. Model Evaluation and Selection 41
4.4.3. Final NEAS Concept 41
V. User Evaluation 43
5.1. Experimental Design 43
5.2. Evaluation Methodology 43
5.2.1. Meeting Progress Analysis 43
5.2.2. Questionnaire 45
5.2.3. Exit Interview 45
5.3. Participants 45
5.4. Procedure 46
5.4.1. Control Group 46
5.4.2. Experimental Group 48
5.5. Results 49
5.5.1. Meeting Progress Analysis 51
5.5.2. Questionnaire 52
5.5.3. Exit Interview 54
VI. Discussion 60
6.1. Overcoming Restrictions on Non-verbal Communication in Online Meetings 60
6.2. Co-Design for Non-verbal Expressions and Visual Aid in Online Ideation Meetings 61
6.3. System Design to Recognize Non-verbal Expressions in Real Time for Online Ideation Meetings 62
6.4. Positive User Experience in Online Ideation Meetings through NEAS 64
VII. Conclusion 66
7.1. Conclusion 66
7.2. Limitations & Future Work 66
References 68
List of Tables
Table 1. Score and confidence interval (CI) of Likert-scale questions. 24
Table 2. Result of open-ended questions: pain points, attempt to overcome, desired points for online... 25
Table 3. The moment when non-verbal expressions are needed in online meetings.... 28
Table 4. Result (accuracy) of Hold-out validation: Logistic Regression for facial expressions and Random... 35
Table 5. Ideation guideline for preliminary study. 38
Table 6. Questionnaire list: novelty, meeting satisfaction and system evaluation. 39
Table 7. Classes where data from facial expressions are collected 40
Table 8. Classes in which data of body posture and gestures (the sum of pose landmarks and both hands... 41
Table 9. Result (accuracy) of 5-fold cross-validation: Logistic Regression for facial expressions and... 42
Table 10. Measures for quantitative user evaluation for main experiment (7-points Likert-scale). 44
Table 11. Workshop to present new Marvel superhero concept with double diamond process. 48
Table 12. Points to be aware of when MediaPipe recognizes landmarks and non-verbal expressions. 49
Table 13. Positive user experience of experimental group: the overall meeting atmosphere and experi-... 57
Table 14. Negative user experience of experimental group: technical problem, design problem, and meet-... 58
Table 15. Feedback on system improvements of experimental group: non-verbal expression, UX im-... 59
List of Figures
Figure 1. Restrictions on face-to-face communication in online classes. 19
Figure 2. Video Meeting Signals to overcome psychological issues in online meetings. 21
Figure 3. Google MediaPipe holistic pipeline overview. 22
Figure 4. Result of multiple-answer questions. 23
Figure 5. Result of Likert-scale questions. 24
Figure 6. The first workshop for eliciting the moment when users need non-verbal expressions in online... 28
Figure 7. The second workshop: eliciting non-verbal expression process. 29
Figure 8. Final non-verbal expression set with visual aid for interactive design. 30
Figure 9. The training data set of 4 classes: thumbs up, raise one hand, smile, and default pose & face. 32
Figure 10. System Schematic Diagram of NEAS. 33
Figure 11. Landmarks of body posture and gestures: pose (left) and hand (right) landmarks in Google MediaPipe. 34
Figure 12. The process of collecting data considering the expressive variation of individual non-verbal expressions. 35
Figure 13. Data flow chart: the model for body gestures is judged prior to the model for facial expressions. 36
Figure 14. The process of two models alternating at 1-second intervals and predicting the average value... 36
Figure 15. Landmarks visualization (left) and pilot demonstration for preliminary experiment (right). 36
Figure 16. Preliminary experiment with pilot demonstration with Google MediaPipe. 37
Figure 17. Data collection process for the final NEAS concept: data was collected from a total of 10 people. 40
Figure 18. Final NEAS concept with fingertips visualization. 42
Figure 19. Design ideation workshop of control group through ZOOM. 47
Figure 20. Exit interview of control group. 47
Figure 21. NEAS practice session of experimental group. 49
Figure 22. Design ideation workshop of experimental group through ZOOM and NEAS. 50
Figure 23. Exit interview of experimental group. 50
Figure 24. Progress analysis of experimental group: summary of the frequency of NEAS use. 52
Figure 25. Summary of NEAS emoji use frequency by the experimental group: hit, false alarm, and miss. 53
Figure 26. Positive experience through NEAS: positive atmosphere, decision making, and feedback &... 53
Figure 27. False alarm classification in cases where unintended misrecognition occurs and cases where... 54
Figure 28. Meeting experience of both control group and experimental group. 54
Figure 29. Concept and system evaluation of experimental group. 55