Hi! My name is Eureka.
I am a Ph.D. candidate in Technology and Social Behavior, a joint program in computer science and communication at Northwestern University in Evanston, IL. This year, I am also a Northwestern Design Research Cluster Senior Fellow. I work with Elizabeth Gerber in the Delta Lab to understand how online crowds can be leveraged to give useful feedback to creative freelancers. I recently led a project with Joy Kim and Mira Dontcheva at Adobe Research in San Francisco building on this work in an online community.
I have a bachelor's degree from Linfield College in cognitive psychology and media studies. While I was the lead UX researcher at Piktochart, I helped develop a culture of design thinking among graphic designers, web developers, and marketers. I am interested in understanding how we might design online technologies that support freelancers in obtaining equitable pay, relating to other professionals, designing their careers, and developing their skills.
- Completed a UX research internship with the AI User Research team at Facebook in Menlo Park
- Presented results of Adobe Research project to product teams at Behance in New York City on November 8, 2018
- Presented two first-author papers and volunteered at the ACM Conference on Computer-Supported Cooperative Work in Jersey City, early November 2018
Women (Still) Ask for Less: Gender Differences in Hourly Rate in an Online Labor Marketplace
Method: Big data analysis
In many traditional labor markets, women earn less on average than men. However, it is unclear whether this discrepancy persists in the online gig economy, which differs from traditional labor markets in important ways (e.g., more flexible work arrangements, shorter-term engagements, reputation systems). In this study, we collected self-determined hourly bill rates from the public profiles of 48,019 workers in the United States (48.8% women) on Upwork, a popular gig work platform. The median female worker set hourly bill rates that were 74% of the median man's, a gap that cannot be entirely explained by online and offline work experience, education level, and job category. However, in some job categories, we found evidence of a more complex relationship between gender and earnings: women earned more overall than men by working more hours, outpacing the effect of lower hourly bill rates. To better support equality in the rapidly growing gig economy, we encourage continual evaluation of the complex gender dynamics on these platforms and discuss whose responsibility it is to address inequalities.
Eureka Foong, Nicholas Vincent, Brent Hecht, and Elizabeth M. Gerber. 2018. Women (Still) Ask For Less: Gender Differences in Hourly Rate in an Online Labor Marketplace. Proc. ACM Hum.-Comput. Interact. 2, CSCW, Article 53 (November 2018), 21 pages. DOI: https://doi.org/10.1145/3274322
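As a rough illustration of the study's core comparison, the sketch below computes the female-to-male ratio of median hourly bill rates on a tiny invented table. The column names and numbers are hypothetical, chosen only so the ratio lands near the reported 74% figure; the actual analysis used 48,019 scraped Upwork profiles.

```python
import pandas as pd

# Toy stand-in for scraped worker profiles (invented data, not Upwork's).
profiles = pd.DataFrame({
    "gender": ["woman", "woman", "woman", "man", "man", "man"],
    "hourly_rate": [20.0, 25.0, 30.0, 30.0, 35.0, 40.0],
})

# Median self-set hourly bill rate within each gender group.
medians = profiles.groupby("gender")["hourly_rate"].median()

# Women's median rate as a fraction of men's median rate.
gap = medians["woman"] / medians["man"]
print(f"Women's median rate is {gap:.0%} of men's")
```

In the real study, a simple ratio like this is only the starting point; the reported gap was then tested against controls such as experience, education, and job category.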
Novice and Expert Sensemaking of Crowdsourced Design Feedback
Method: Think-aloud quasi-experiment with design experts and novices
Online feedback exchange (OFE) systems are an increasingly popular way to test concepts with millions of target users before going to market. Yet, we know little about how designers make sense of this abundant feedback. This empirical study investigates how expert and novice designers make sense of feedback in OFE systems. We observed that when feedback conflicted with frames originating from the participant's design knowledge, experts were more likely than novices to question the inconsistency, seeking critical information to expand their understanding of the design goals. Our results suggest that in order for OFE systems to be truly effective, they must support the nuanced sensemaking activities of both novice and expert users.
Eureka Foong, Darren Gergle, and Elizabeth M. Gerber. 2017. Novice and Expert Sensemaking of Crowdsourced Design Feedback. Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 45 (December 2017), 18 pages. DOI: https://doi.org/10.1145/3134680
Online Feedback Exchange: A Framework for Understanding the Socio-Psychological Factors
Method: Literature review, design-based research, user testing, experiment
To meet the demand for authentic, timely, and affordable feedback, researchers have explored technologies to connect designers with feedback providers online. While researchers have implemented mechanisms to improve the content of feedback, most systems for online feedback exchange do not support an end-to-end cycle, from help-seeking to sense-making to action. Building on extant literature in learning sciences, design, organizational behavior, and online communities, we propose a conceptual framework to highlight critical processes that affect online feedback exchange. We contribute research questions for future feedback systems and argue that online feedback systems must be able to support designers through five activities that happen before, during, and after the feedback exchange. Our framework suggests that systems should address broader socio-psychological factors, such as how intent should be communicated online, how dialogue can support the interpretation of feedback, and how to balance the tradeoffs of anonymizing feedback providers.
Eureka Foong, Steven P. Dow, Brian P. Bailey, and Elizabeth M. Gerber. 2017. Online Feedback Exchange: A Framework for Understanding the Socio-Psychological Factors. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, New York, NY, USA, 4454-4467. DOI: https://doi.org/10.1145/3025453.3025791
Eliciting curiosity through digital rails
Method: Field observations, surveys
From March to September 2016, I was part of Dr. Mike Horn's research team that redesigned digital reading rails at the Field Museum of Natural History in Chicago. I contributed design ideas, tested prototypes, collected video and audio data from museum visitors, and learned how to calculate key success metrics. Our goal was to prompt deeper learning conversations about artifacts in the Cyrus Tang Hall of China. The preliminary work on this project will be presented at the Annual Meeting of the American Educational Research Association (AERA '17) in June 2017.
Learning on the Job: Training crowdworkers to learn complex skills through micro-tasks
Method: Pilot studies, proposed online experiment
Many platforms exist today that make large tasks possible by crowdsourcing smaller micro-tasks. However, the people who engage in such crowdwork face unstable employment and less-than-enriching work environments. With Dr. Liz Gerber and Dr. Steven Franconeri, I am interested in how crowdwork could be made more valuable to workers. We started exploring a way for crowdworkers to learn about graphic design by being paid to give feedback on others' designs. We wanted to know whether workers can develop expertise on these platforms and what types of features would make this possible.
Crowds that hack: Problem solving at a civic hackathon
Method: Participant observation, interviews
For a class on field methods, I conducted participant observations and interviews at Chi Hack Night, a weekly civic hack night in downtown Chicago. I was interested in learning what motivates people to volunteer their time to solve tough societal problems with tech, as well as how a group of unrelated individuals explores a problem space. I was part of the Access to Justice group, which works to connect disparate resources across the city to help formerly incarcerated people adapt to life outside prison.
- Working toward proposing my thesis in Summer 2019
In high school, I trained in Chinese contemporary dance for 4 years, performing with the Eastern Dancer company in Penang, Malaysia. I've also trained in contemporary jazz for 2 years with Van Collins in Chicago and performed in a tribute performance for choreographer Duwane Pendarvis. I'm currently working on improving my ballet technique and recently went on pointe!
Segal Design Institute
Ford Motor Company Engineering Design Center
2133 Sheridan Road
Evanston, IL 60208