Nazanin Andalibi, Ph.D.
SOCIAL COMPUTING RESEARCHER
The overarching vision of my work is to make possible a more compassionate and inclusive world in which vulnerable individuals are empowered and their wellbeing is enhanced. I would be excited to work with students who share this vision.
Social technologies/media, self-disclosure/privacy, social support, and stigma
I use mixed methods to investigate how people use social media for self-disclosure and social support exchange, and to study the outcomes of disclosure behaviors and the responses they elicit. I concentrate on experiences that can be distressing, traumatizing, isolating, or stigmatized, and that contribute to poor wellbeing. Broadly, in these contexts, I ask how we can design social computing systems that facilitate beneficial sensitive disclosures and desired disclosure outcomes, such as (but not limited to) social support exchange, meaningful interactions, reciprocal disclosures, and reduced stigma.
Some contexts my work has focused on in the past include mental health, sexual abuse, and pregnancy loss.
The image below summarizes my past work in the pregnancy loss/miscarriage space, in which I have provided theoretical constructs and frameworks that explain self-disclosure/support seeking and response/support provision decisions, as well as their associated outcomes, in the social media context. I like it as a starting point because it brings together several important and interconnected processes I examine in my research.
These days, I am curious to examine how and to what extent these frameworks explain disclosure and response/support provision decisions for diverse populations and at scale across social media platforms. In my work, I also translate insights grounded in an in-depth understanding of people's needs and experiences into designs.
Some of my other ongoing/future work that broadly fits in this space includes:
designing technology to support coping and other needs for those who experience pregnancy loss and other reproductive health-related complications associated with stigma (e.g., infertility)
an intersectional approach to understanding stigmatizing experiences such as pregnancy loss by considering diverse identity facets such as genders and sexualities (e.g., through working with LGBTQ people experiencing loss)
peer/mental health support for young adults through social technologies
The implications of emotion artificial intelligence: privacy, ethics, and accountability
The research trajectory described above focuses on other social media users as information/disclosure recipients. I also investigate people's attitudes and concerns when companies and algorithms are the audiences or recipients of their sensitive information. This work goes beyond social media to include other types of technology, such as voice assistants. For example, I critically examine the ways emerging technologies such as emotion artificial intelligence may engage with humans in times of distress or in otherwise private and personal settings. I explore the extent to which designing these technologies is appropriate in different contexts, and investigate what it would take for them to be sensitive to, and to foreground, people's values, needs, and desires.