My research sits at the intersection of Social Psychology and Human-Computer Interaction. I apply novel methods and techniques to understand dyadic and group processes, to examine cultural patterns, and to integrate theories of the functions of emotion into social agents.
Nonverbal Synchrony as an Adaptation to Social Environments
Imagine traveling to a foreign country, or talking to a person from another place: what would you do to express yourself clearly? How do people use nonverbal coordination to adapt socially in situations where language cannot serve its usual function? To study this question, I created a set of laboratory tasks that elicit emotions and facilitate cooperation. Using automated facial and bodily analysis, I also developed state-of-the-art methods to measure and automate the analysis of nonverbal synchrony.
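A common way to quantify nonverbal synchrony in this literature is windowed cross-correlation between two partners' movement time series. The sketch below is purely illustrative of that general technique, not my actual pipeline; the function name and parameters are hypothetical.

```python
import numpy as np

def windowed_synchrony(series_a, series_b, win=30, max_lag=5):
    """Estimate nonverbal synchrony as the mean peak cross-correlation
    between two movement time series (e.g., frame-by-frame head motion),
    computed over non-overlapping sliding windows.

    win: window length in frames; max_lag: maximum lead/lag in frames.
    Returns a value in [0, 1]; higher means more synchrony.
    """
    peaks = []
    # Slide windows over series_a, leaving room for lags at both ends.
    for start in range(max_lag, len(series_a) - win - max_lag + 1, win):
        a = series_a[start:start + win]
        best = 0.0
        # Search lags in both directions, since either partner may lead.
        for lag in range(-max_lag, max_lag + 1):
            b = series_b[start + lag:start + lag + win]
            r = np.corrcoef(a, b)[0, 1]
            best = max(best, abs(r))
        peaks.append(best)
    return float(np.mean(peaks))

# Illustration with synthetic data: one "partner" mirrors the other
# with a 2-frame delay plus noise, versus an independent partner.
rng = np.random.default_rng(0)
x = rng.standard_normal(400)
y = np.roll(x, 2) + 0.2 * rng.standard_normal(400)  # delayed mirroring
z = rng.standard_normal(400)                        # no relationship
```

With this setup, `windowed_synchrony(x, y)` comes out much higher than `windowed_synchrony(x, z)`, reflecting the delayed mirroring built into `y`.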
To learn more about this project: I will be presenting my work at SAS 2019 in Boston, MA. Also, check out my poster at SPSP 2019 in Portland, OR [Poster].
Heterogeneity of Long-history Migration in Emotion Expressivity
Let’s conduct a little thought experiment: imagine that people from different countries are settlers who have just landed on a new continent. They do not share a language or any norms. How would they adjust to achieve coordination and cooperation?
Recent work on cultures of facial expression suggests that people from historically heterogeneous countries exhibit greater, and more recognizable, facial expressivity than people from historically homogeneous countries. We propose that these cultural norms originate from ecological pressure to cooperate and establish rapport with people who share little common ground (in the form of shared cultural norms) and little common language.
To study this, I am extending the models and procedures from my nonverbal synchrony research to participants from different nations.
Designing Socially Adept Agents
Human emotions are responses to momentary challenges and opportunities in the social environment that are relevant to a person’s immediate goals. Facial, vocal, and bodily expressions of emotion are functional insofar as they signal the person’s interpretation of their challenges or opportunities and inform others about how to respond.
My current research involves applying theories of dyadic interaction and emotion to human-robot interaction. I am interested in developing and synthesizing computational models that carry findings from dyadic interaction over to social agents. My goal is to enable the design of emotion-adept social agents: when endowed with emotional capabilities, smart agents will offer more rewarding interactions.
Human-robot collaboration has become increasingly prevalent in today’s industry and even in daily life. Inspired by the collaborative dynamics of human-animal teamwork, I am interested in studying the psychological outcomes of human-robot collaboration, with the goal of finding the optimal balance between human and robot contributions.