Tackling Algorithmic Biases
February 12, 2023
At Penn Engineering, Ira Globus-Harris is combining a passion for research with a desire to make the world a better place.
Machine learning has tremendous potential to improve lives in ways large and small. But it can also exacerbate inequities by using what it’s learned to make decisions that are fundamentally unjust.
Ira Globus-Harris aims to eliminate these unfair outcomes. A Ph.D. student in computer and information science who uses the pronoun they, Globus-Harris is exploring technological solutions to mitigate the harm these tools can cause. In this Q&A, Globus-Harris talks about their research and what it’s like to be a graduate student at Penn Engineering.
What got you interested in pursuing a Ph.D. in computer and information science?
After college I worked at Boston University as a research software engineer. I was mostly working with the Harvard Privacy Tools project, building privacy tools so that we could collect information about people in a way that would not violate anyone’s individual privacy.
It was fun, but I realized that the research side of things was where my heart was. In particular, I was interested in looking at the technical part of solutions to larger social problems. Technical solutions alone cannot solve these problems, of course, but they’re a part of it. And I was interested in tackling some of the theory behind that.
How did you choose Penn Engineering?
In looking at schools, I mostly looked at places with faculty whose work I admired and who I thought would be fun to work with. At Penn, Michael Kearns and Aaron Roth, who are now my advisors, are both doing wonderful research that spans many different fields.
Moreover, I thought they would be really good advisors and help me grow as a researcher. They’ve been absolutely wonderful, and I am very happy in my research group.
What is your research all about?
I’m looking at issues of bias and inequity in the performance of machine-learning algorithms — things like resume screening tools that inadvertently learn to reject all the female applicants, or prediction algorithms used in courts as part of sentencing recommendations that perform differently for white and Black defendants. These are well-documented problems.
Of course, a technical solution isn’t going to fix the underlying social issues, but it’s still an important part of it. If these algorithms are going to be deployed in practice, we need to think about ways to prevent bias from a technical standpoint.
What drew you to this particular area of research?
I wanted to do something that could lead to the world being a better place. At the same time, I love theory — I love math and thinking about puzzles. I wasn’t happy just implementing things and not getting into the fundamental questions of how we can tackle issues from a mathematical or computer science standpoint. That’s what gives me joy. Working on algorithm fairness gives me a good balance.
What is the student community like at Penn Engineering?
I came in during the pandemic, so it took me a while to make connections. But I am particularly grateful for the LGBTQ Center for being a wonderful support as I settled in at Penn. They connected me with some queer faculty members, and they helped me start an LGBT group for graduate students.
I also made a lot of friends through student government. Hiking and climbing have been my main social outlets, and I was introduced to both through the graduate student government. There are lots of opportunities at Penn to branch out, be social and find time to do the things that you love.
What’s next for you after you finish your Ph.D.?
I’ve generally imagined myself trying to go the academia route because I enjoy both research and teaching. But through my advisors, I was able to get an internship working with Amazon Web Services on a variety of problems related to algorithmic fairness. I was surprised by how much I enjoyed my time working in industry, so I’m leaving that open as a possibility.
Do you have any advice for prospective students?
For a Ph.D. student coming in, the single most important thing to look at is the research group environment, because these are the people who you’re going to be interacting with every single day. If your research community isn’t a good fit, you won’t have a good experience.