
Evaluations & Racial Equity: How Do We Eliminate Bias from Evaluations Intended to Address … Bias?

October 24, 2022

Challenges in areas ranging from education to the environment, gender to governance, health to housing don’t exist in a vacuum. Each month, Abt experts from two disciplines explore ideas for tackling these challenges in our monthly podcast, The Intersect. Sign up for monthly e-mail notifications here. Catch up with previous episodes here.

Social programs designed to promote equity—in housing, jobs, justice, health care, and more—rely on data to identify strategies and measure impact. But what happens when our data collection and analyses are inadvertently sabotaged by bias? Drs. Rucha Londhe and Eric Hedberg share tips and experiences.


Read the Transcript

Eric Tischler: Hi, and welcome to The Intersect, I'm Eric Tischler. Abt Global tackles complex challenges around the world, ranging from improving health and education to assessing the impact of environmental changes. For any given problem, we bring multiple perspectives to the table. We thought it would be enlightening—and maybe even fun—to pair up colleagues from different disciplines, so they can share their ideas and perhaps spark new thinking about how we solve these challenges.

Today I'm joined by two of those colleagues, Drs. Rucha Londhe and Eric Hedberg. Rucha helps Abt teams bring an equity perspective into their work, notably providing evaluation project teams with a framework for including equity-based principles in data collection, analysis, and reporting. Eric is a sociologist. His research includes investigating the design of evaluations in education and criminology.

Welcome!

Dr. Rucha Londhe: Thank you, Eric. Glad to be here.

Dr. Eric Hedberg: Glad to be here. Thanks, Eric.

Eric T: Many social policy programs are designed to help people who have historically been denied equitable opportunities and benefits as a result of long-standing systemic biases. We evaluate programs to determine if they work, if they're correcting and addressing those biases. But what happens when the evaluation itself is biased? How can we get the data we need to understand, to say nothing of solve, the problems we face? So, Eric, I'm going to start with you because the idea for this podcast came from something you said on another podcast (and I'm not jealous). You want to talk about the epiphany you had about this, about your own struggles with equity in evaluations?

Eric H: Thanks, Eric. I think it's not been one epiphany; it's been this growing recognition that just because I have a degree in sociology, and just because I tend to work a lot with numbers and statistics, those are not tickets out of the human biases that can enter any discussion or product or piece of research. And so this happened over the course of, well, the last 15 years of my career or so. When, you know, I've sort of been checked a couple of times, and, you know, when you get checked once, you're like, "Okay, maybe." But you get checked again, and then you get checked again...I think it's a responsible human thing to do to sort of say, "Okay, what are my assumptions that I'm bringing to this table" and sort of really, "Where am I right, where am I wrong?" And trying to understand that.

You know, one of the places that really stands out in my head is when I was designing a survey, it was a social network survey, I was asking Arizona high school students, I asked them to name friends, and family, and teachers. And, you know, the family section ... it was really important that I knew the type of relationship they had with the people that they named. And I was also, like, really worried about the length of the survey. And, I sort of said, "Okay, there's a parent, and there's siblings," and then I kind of collapsed everyone else into this other bin. And that really just came from my own experience in sort of urban Midwest of like ... there’s your parents, and your brothers, and then the bulk of everyone else you only see on the holidays. And that was dumb of me to do. But at least I was smart enough to put together a community group to review my survey before I put it in the field. And the community group represented members of the community that I was serving, which included a lot of people from Latino/Hispanic origin. And they checked me on it. They're like, "Okay, cousins and aunts and uncles are different and important and need to be their own categories in these questionnaires." And I realized that I almost deleted and destroyed a bunch of data before I even collected it.

One other story: I was doing an evaluation in recent years, as our culture has really worked to catch up to our population in terms of gender identity. Again, community members said, "You know, you really need to have more than just male and female check boxes in your survey," and they offered additional suggestions, and I had this sort of dual reaction: As a statistician, I'm like, "Okay, but I'm gonna kind of collapse them all into some other category and I'm not going to get enough individuals to be able to find an impact," and so on and so forth. But then I realized that there's a certain amount of being seen in the answer categories on a survey. Even though I may not analyze it as such, because I don't have enough information about any one particular gender category, it's respectful to the population that I'm trying to survey, and I think it's a way of individuals seeing themselves in the answer choices. And, if I didn't take that advice, I may have had non-response. I may have, again, lost data because of my choices. And so, realizing that, "Yeah, math is pretty immune to a lot of human biases, but what we feed the math is not. Therefore our results are not." So, this has been a growing epiphany, something I've been trying to get better at over my career.
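Eric's point can be sketched in code. The key design choice is to collect and store the granular, respectful answer categories, and to collapse them into coarse analysis bins only as a separate, reversible step at analysis time. This is a minimal hypothetical sketch; the category names and the mapping are illustrative, not from any real survey.

```python
# Hypothetical sketch: store the granular response as collected, and
# collapse into coarse bins only at analysis time -- never at collection
# time. Category names here are illustrative only.

COLLECTED_OPTIONS = [
    "woman", "man", "non-binary", "transgender woman",
    "transgender man", "prefer to self-describe", "prefer not to say",
]

def collapse_for_analysis(response: str) -> str:
    """Map a granular response to a coarse analysis bin.

    Because the raw response is stored unchanged, this mapping can be
    revisited later -- the opposite of deleting data before collecting it.
    """
    coarse = {"woman": "woman", "man": "man"}
    return coarse.get(response, "another gender")

# Raw responses keep their full detail...
raw = ["woman", "non-binary", "man", "prefer not to say"]
assert all(r in COLLECTED_OPTIONS for r in raw)

# ...and the coarse bins exist only as a derived view for analysis.
binned = [collapse_for_analysis(r) for r in raw]
```

The point of the sketch is the separation: if the bins later turn out to be wrong, the analyst can change `collapse_for_analysis` without having destroyed any data.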

Eric T: That's a great summation, and Rucha has been nodding vigorously throughout. Rucha, you want to jump in here? Obviously, looking at bias in evaluation is something you do regularly. What are things that you see? You can comment on what Eric said, but what else are you seeing?

Rucha: Sure. So, those are great examples, Eric, and what I wanted to do was take those examples and sort of take a step back and look at the process itself and how we need to be thinking about equity at each of the different steps in that process. Where I would like to start is that using an equity lens begins even before the researcher actually starts the research. We as researchers need to examine our own backgrounds, our own biases, both explicit and implicit, the ones we might not even be immediately aware of. We need to examine those as we take the first step into thinking about research. And what we really need to do is make a promise. This promise is to dive deeper into the data to be able to get to the root causes, to fully understand the context, before drawing any conclusions. What getting deeper into the data looks like will differ in different contexts, some of which Eric just talked about. Maybe it is about pulling together a group of stakeholders, a group of community members, to help us understand, help us make meaning out of those data.

So, it's going to look different in different contexts, but what's important is this promise we make, that we start with: yes, I'm going to dive deeper into these data to really understand where these data are coming from before I draw any conclusions. Once we dive into our studies, whether it's a study, evaluation, experiment, whatever it is, equity needs to be the cornerstone of each and every step in the process. So, right from what questions we are trying to answer, all the way to what instruments we are going to use to collect data. And, as in Eric's example, how are those instruments themselves constructed, and how are the questions on those instruments written? All of those, and then also the question of who the data are collected from. How are the data being collected? Is there equitable access for everybody to make sure their data are available to the researcher? All of those questions are also important.

Most importantly, it does not depend on whether we are conducting qualitative or quantitative research. An equity lens can be applied to all kinds of research methodologies, and should be applied to all research methodologies. It's basically an ideology, an approach, that is applicable and should be applied to each and every research project.

Eric T: So I was gonna ask, you know, there's that promise. Practically speaking, is that the community inclusion piece, is that what you're referring to?

Rucha: Yes, it is. It is the promise of community inclusion. But community inclusion, community involvement, is one part of it. The promise, ultimately, is to make sure that you are understanding the data to the fullest possible extent. And the community is going to help you understand those data. Involving the people that are most impacted by your research, engaging that community in every step of the way, is gonna help you understand the data.

Eric H: And I also think it's a promise to be curious. It's a promise to not assume that you know everything. It's a promise to be willing to be wrong. I mean, we are scientists and, you know, we have PhDs, a doctorate of philosophy, which is Greek for "love of wisdom." So, we're supposed to be lovers of learning here, and that means we have to be open to being wrong, to understanding the wide variety of humanity that we're working with. And I think sometimes I've had quantitative researchers in my orbit, and I've even been guilty of this myself, who think math buys us out of this responsibility. Or, somehow, because it's in a spreadsheet it doesn't inherit all the human crap, and, you know, that's just wrong. So, whether it's quantitative or qualitative, we need to be curious about where the data come from. We need to be curious about what it means to us, what it means to those we collect it from, and make that also part of the promise.

Rucha: I agree because, at the end of the day, it's not just the data that we are working with. The final goal is creating and sustaining the health and well-being of individuals and communities, right? That's the ultimate goal of data, the ultimate goal of research. Your ultimate goal is not just the collection of data; you have to be able to make meaning out of those data as they relate to the individuals and communities they came from.

Eric T: Oh, I love that. Another wonderful summation of what we're doing. So, what I was gonna ask is, how does that apply from a process perspective? Because you can promise, right? But how do you ensure that you live up to that promise? But let me ask you a different question that might incorporate the answer to that: How can we start implementing this, you know, in the processes we are using to conduct evaluations? Practically speaking.

Rucha: I would say there are a few different mindsets with regard to that. One is, obviously, starting with the individuals themselves. Yes, the data are what tell us about the individuals, but we can't lose sight of the fact that it's these individuals and communities that are important. And, also, that individuals are not unidimensional. So, as researchers, we have to understand intersectionality. This is a term, intersectionality, that was coined by Kimberlé Crenshaw, which helps us understand the multiple forms of inequity and disadvantage that compound themselves and create obstacles. If we evaluate from a single frame of reference without understanding these obstacles, then we have clearly not understood the individuals we are working with. And that is essential.

If we base our research on a single-axis theory and framework, then we are going to distort the multi-dimensional experience of under-resourced, marginalized populations, and it's going to limit our ability to fully understand the root causes behind the data that we collect. And, as we already established, the data themselves are not our final goal; creating and sustaining the health and well-being of individuals is our final goal. So I would definitely point to, in my mind, the consideration of intersectionality and, along with that, working with the communities, the engagement of communities, which we've been talking about since the beginning of this session. Both of those: making sure that we include folks with lived expertise in every step of the research, engaging them from deciding what the aim of the research or the study should be all the way to helping us make meaning out of the data that we collect. I think that is really essential, and that is something that we can be working towards now, in the next 10 years, in the next 20 years, forever, I would say.

Eric H: Mm-hmm. And I think, you know, taking that and bringing it to sort of my area of the shop, which is the analysis of the data, it's incredibly important to not only reinforce a mindset of intersectionality, but to reinforce that with intersectionality in the people working with the data. Unfortunately, when I go to conferences about statistics and evaluation, a lot of the heavy hitters that do the quantitative analysis kind of look like me. And that means there's a lot of people with world views and experiences similar to mine working with the data. And, you know, I can make a promise to try my best, but there's nothing better than having a wide variety of people in the room working on the data, checking each other. And, if I really think about, "What would I like in the next five years, what would I like in the upcoming years and the next generations of people doing our work?" I would like the people in the nerd room to reflect the people in the world that we're studying.

And I think there's a lot of stuff that needs to go into doing that as far as education policy and the pipelines and all the rest. But I think it needs to be a goal, and I think we need to come up with creative solutions to try to accelerate that process while we try to also reform the pipeline. To me, that's really, really important. Because, you know, when I'm recoding income, taking this very detailed column of information and trying to distill it, I'm making choices. Well, what is low-income? Is it less than 50k, is it less than 20k? Like, what does that mean? And, you know, it's so easy for me, alone, at my laptop, to write information down that I check with no one. And so, again, we need more and more people in the room on that.
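The recoding choice Eric describes can be made concrete in a short sketch: distilling a detailed income column into bins forces the analyst to pick cutoffs. The thresholds below are purely illustrative, not from any real study; the point is that they are choices, and making them named constants makes them visible enough to be documented and reviewed by others.

```python
# Hypothetical sketch of the recoding choice described above: collapsing a
# detailed income column into coarse categories. The cutoffs are
# illustrative assumptions -- the very choices that should be reviewed,
# not made alone at one laptop.

LOW_INCOME_CUTOFF = 20_000   # a choice: why 20k and not 50k?
HIGH_INCOME_CUTOFF = 75_000  # another assumption, made explicit and reviewable

def recode_income(dollars: float) -> str:
    """Collapse a detailed income value into a coarse category."""
    if dollars < LOW_INCOME_CUTOFF:
        return "low"
    if dollars < HIGH_INCOME_CUTOFF:
        return "middle"
    return "high"

incomes = [12_500, 48_000, 150_000]
categories = [recode_income(x) for x in incomes]
```

Because the cutoffs sit in named constants rather than buried in the logic, a reviewer can ask "why 20k?" about one line instead of reverse-engineering the whole analysis.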

Eric T: Great, thank you. Well, I love that, on this podcast, called The Intersect, you both brought up intersectionality repeatedly, and that's something that obviously Abt specializes in. So, you know, we're saying equity is the cornerstone of our work and, um, you guys have both given us some great...see, "guys," I'm already using gendered language.

Rucha: You can go with, "you folks." Mm-hmm. Thank you.

Eric T: Thank you! Thank you for discussing this with me, and thank you for joining me at The Intersect.
