How to collect social impact data and measurement tools to consider

In this video, you'll learn how to measure social impact, including developing a measurement strategy, how to collect data, and measurement tools to consider.
 
We chat with Nicole McPhail and Emily Hazell from Darwin Pivot and explore how to analyze and use social impact data, what metrics to consider and how to make program decisions from the data. 
 
This is part two of our three-part series on measuring social impact. 
 

Watch the episode:

 

Prefer to listen?


What we discussed:

Karl Yeh (00:00):

So today I've got two special guests on. Nicole McPhail and Emily Hazell, who are the co-founders of Darwin Pivot.

And this is part two of our discussion on social impact measurement. So thank you both for coming back on.

So let's recap, because in our previous episode we talked about introducing social impact measurement.

So today we're going to talk about the data itself, but let's recap a bit. How do we go about getting social impact data for our organizations?

How do we collect social impact data?

Nicole McPhail (01:01):

Before you can think about how you get it, just make sure you're thinking about why you're getting it and what you're trying to solve for with whichever aspect of your CSR strategy it supports.

And I say that because it's just an important reminder for especially someone that's new in the field, every CSR strategy or social impact strategy is different.

 

Therefore you can't look at another company and say, "You know what? We should be doing the same as them," because you're likely solving for something different and have a completely different strategy than that company.

All of that said, for obtaining the data there are a few ways you can do this, and you have to look at it from both sides: qualitative and quantitative.


Quantitative Data Collection

So from a quantitative standpoint, a lot of this can come from the technology that you use for running your program.

This can track things like your overall investments and break them down across geographies.

You can think about who is engaging in your program and why, and really look at it both at an aggregate level and at a more granular level.

So that's one really good place.

Qualitative Data Collection

And then the other is from a qualitative perspective. Emily, as our in-house data nerd, can maybe clarify the difference between qualitative and quantitative in a moment, but you can get that data from employee surveys and from interviews.

It's just a little bit less reliable than some of the quantitative data, but it's just as important. Here's one quick reason why.

So for instance, say you have a program that involves a lot of arm twisting, your employees feel obligated to do it, and your overall objective is really to help support employee engagement.

Well, I would argue that's already disengaging if someone feels that they have to participate, and you wouldn't know that until you looked at the qualitative data and found out that it's a detractor. It's not actually helping.

And so you can't use the quantitative data alone to reinforce your preconceived notions, if that makes sense.

Karl Yeh (03:12):

Well, when we talk about qualitative and quantitative, what is quantitative social impact data and what is qualitative social impact data? Can you go a little bit more in depth on those two?

Qualitative vs quantitative social impact data

Emily Hazell (03:23):

Yeah.

So I think with the quantitative, we can think about these as more of specific outputs of the program.

So things that you can measure around participation rates or dollars invested, time invested, et cetera, et cetera.

So we can kind of think of quantitative as more of that robust measure, but it's generally more on an aggregate level.

Qualitative data then kind of gets layered in on top of that, where it can zoom in on a team's or an individual's behaviors or sentiments.

So take the example of a high participation rate: if you layer a qualitative data set on top of it to get a sense of what the experience was from the employee or partner perspective, that gives you another added layer of information. You really want both in the social impact space so that you have the most visibility into your program health.
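To make that distinction concrete, here is a minimal Python sketch. The numbers and field names are purely illustrative, not from any real program: a quantitative output (participation rate) sits alongside a qualitative signal (average survey sentiment) so the two can be read together.

    # Illustrative only: pairing a quantitative output with a qualitative signal.
    participants = 250        # employees who joined the program (hypothetical)
    eligible = 1000           # employees invited (hypothetical)
    participation_rate = participants / eligible

    # Hypothetical survey sentiment scores (1 = very negative, 5 = very positive).
    sentiment_scores = [4, 5, 2, 3, 5, 1, 4]
    avg_sentiment = sum(sentiment_scores) / len(sentiment_scores)

    print(f"Participation rate: {participation_rate:.0%}")  # 25%
    print(f"Average sentiment: {avg_sentiment:.1f} / 5")    # about 3.4
    # A high participation rate with a low sentiment score can signal obligation rather than engagement.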

Nicole McPhail (04:24):

I think it's important to note as well.

So when you're thinking about collecting this information, it serves more than one purpose. So I've seen a bit of both.

The first is sometimes people think, "Okay, well, let's just use this because we have to put it in our quarterly business review and we want to get the support from our executives to get more money or just show the value that we're operating here."

But the other piece that I think sometimes we forget is the data you're collecting both qualitative and quantitative should be informing the decisions and the modifications on your strategy in the first place.

So it's not just to use as this is our homework.

This is what we've done. You should be really calculated and critical about how you can look at this data to make changes and to improve.

Karl Yeh (05:15):

And I guess to go a little bit further on qualitative and quantitative, then:

How would you go about calculating social impact value?

Nicole McPhail (05:23):

I wouldn't.

So I think this is a controversial one as well, because I feel like calculating value is an equation that requires a lot of objective inputs.

And that almost makes it impossible, in my opinion, to truly calculate this value.

So rather than trying to come up with a specific number, I would look inward at the strategy and what you're trying to prove, and then remove the obligation. I get that people want to measure the impact in the community, but remove the stress of trying to do something that is arguably pretty impossible when you think about it.

And even if you're thinking about how you associate value, it's so subjective in the sense that if someone is looking at a specific organization, they might have philosophical differences on the value that a certain not for profit can drive in the community based on their own morals.

So it can get really, really sticky.

So yeah, I would say maybe focus more on the metrics you can control and not to get too worked up on the actual social impact value as it stands in the formal way.

Karl Yeh (06:55):

So now that we have our social impact data, you gathered it all, kind of, yes, we have the quantitative, yes, we have the qualitative.

So what am I going to do with it now?

Do I start putting in the spreadsheets?

What are the steps that I'm going to go through to eventually get to that social impact measurement report that I can then share with everybody else?

Now I have my data...what do I do with it?

Emily Hazell (07:18):

Yeah, well, exactly. The step of actually taking the data and putting it into a spreadsheet sounds like a really basic step, but it's one that we often just don't do.

We don't explore our data enough or look at it through a bunch of different lenses.

And so get to know the data.

Become best friends with your data. The more you understand the data and actually start to analyze it, the more you know, for example, whether to move a metric around or recalculate how you want to measure something.

But a lot of the time we're focused on just collecting data, collecting data, collecting data, and we need to make sure that we're actually analyzing that data objectively, and that we're collecting what we want to collect and measuring what we want to measure. And so really you can start to break down that data.

 

The first thing that I would do, if somebody just gave me a big data set, is spend a day with it and start looking at it in a spreadsheet.

Start visualizing it spatially and temporally, looking at graphs and charts and just seeing, at a high level, whether I can start to see some correlation. Then, once some of those initial trends start coming out at the aggregate level, start asking more questions: now let's go in and slice and dice the data to see if there are more nuanced trends within the overarching trend we're seeing.

And so again, just explore the data, be curious, and, as we talked about in the previous episode, make sure that you're letting the data speak for itself initially.

That way you're not going in assuming what the data is going to say.

You want to come at it more open-minded, making sure that you're just looking at it to see what it's actually telling you, what story the data is telling you.
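As a rough sketch of that first pass, assuming a hypothetical CSV export of program activity (the file name and column names below are made up for illustration), the "spend a day with it" step might look like this in Python with pandas:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical export: one row per activity record.
    df = pd.read_csv("program_activity.csv")  # columns: site, month, hours_volunteered, dollars_donated

    # High-level first: total activity over time (a temporal trend).
    monthly = df.groupby("month")[["hours_volunteered", "dollars_donated"]].sum()
    monthly.plot(title="Program activity by month")

    # Then slice and dice: does the aggregate trend hold across sites?
    by_site = df.groupby("site")["hours_volunteered"].sum().sort_values(ascending=False)
    print(by_site.head(10))  # sites driving the most activity

    plt.show()

Even a handful of group-bys like this usually surfaces the next questions worth asking.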

Nicole McPhail (09:09):

And then, I think this sort of feeds nicely into step two.

So once you understand the data and start asking questions and really thinking about it, then you can start pulling out the right metrics based on what you are solving for with your strategy, which should definitely align with what your key stakeholders care a lot about. Looking at the data can inform your program, like Emily was talking about, but the next level is often: how are you sharing it with other people?

And this is where it gets really interesting from a dashboard or a framework perspective.

Because once you understand that data, then you have to be able to show it in a way that can tell your story visually.

And so infographics come into play: really knowing how to prioritize the most critical data points, understanding how to show this in a temporal or spatial way, showing it across geographies on a map.

That type of stuff is how you can really tell these stories, and you can include some of the qualitative information in there too.

 

So what are employees saying?

What are people in the community feeling about the work that you're doing? And then report out on that in a consistent way.

This is the tough thing: you have to be organized to be able to keep up with it so it's actually useful. If you're only looking at this data once a year, what's the point?

Emily Hazell (10:41):

When we're talking about that initial exploration of the data, I think too often we get siloed in our own organization or business unit, or we're working with similar types of professionals around us, so you have a bunch of CSR practitioners all looking at the same data.

It can often be helpful at the beginning to start having conversations with people who might not specialize in this and show them the trends that you're seeing.

Sometimes those challenge networks will ask questions about the data that will spark a new insight for you to think about.

So really broaden the conversation. Even just a casual conversation with colleagues who don't necessarily specialize in this can give you a much broader perspective, and then more nuance when they ask a question that makes you think, "Oh, I would never have thought to ask that question about the data," because you are the expert. So sometimes just speaking to non-experts within your own organization can be an interesting process as well.

Nicole McPhail (11:37):

Emily, that's awesome. This is why we're business partners.

I love that comment because when you think about the audiences that you're presenting to, especially your stakeholders, they aren't experts in this at all, and they have minimal time.

So not only do you need to have it in a way that anyone can understand at first glance on your dashboard, you also should have a glossary of terms and things like that, so people can really understand what you're talking about. Because if you have the ability to send out a dashboard that does not require you to be in the room, you can create so much more influence that way.

If you have a live dashboard that your executives can share with each other, and it's automatically updated, you don't have to rely on those once a quarter meetings to update your key stakeholders to get the influence that you need and the buy-in and all of that stuff.

So basically I love your answer and I think it's really relevant to this work.

Karl Yeh (12:41):

So let's shift a little bit to tools.

So what kind of tools would you use to measure, all the way from research tools through analysis and insights, and eventually even reporting tools?

What measurement tools to consider

 

Nicole McPhail (12:55):

So a lot of the tools: this is the great thing about employee engagement or CSR software, that can be your tool for collecting the data. Then, when you're pulling it over into Excel, or even looking at it in the back end of a technology that does this, that can be your tool for analysis.

 

I'll let Emily talk about different ways that you can look at data in Excel sheets, and then there are also surveying tools for collecting information from participants and employees.

And I should say it's as important to get feedback and employee sentiment from employees who are engaged in the program or strategy as it is to get it from people who are not. That can be really telling right there.

But then in terms of technology to share the data, I've seen a lot of people use Excel sheets to send it to their executives and partners, which I think is really limiting, because the visualization is part of how you can tell a story and make people understand it really quickly. I've also seen it done in PowerPoints, where people are just pulling over graphs and things like that.

Not to plug us, but we have a live and interactive version of creating a dashboard.

It's basically a URL that auto populates with the right data and information.

And as you scroll around, it pops up different insights. I don't want to be a tacky self-promoter, but I'm just explaining the different tools that exist.

And what are your thoughts: when someone is capturing and collecting the data, what are some really simple ways they can look at it in Excel?

Emily Hazell (14:33):

Right. So, within Excel, I guess the first thing, if we're just talking about a large database, is to actually get it into some sort of visualization, like a graph or a chart, something where you can take all those individual data points and aggregate them into trends.

You're trying to evaluate trends, and humans are just inherently very visual.

And we can understand things a lot better when we can see them in a pie chart or a line graph or something like that.

Moving beyond just taking the data from Excel and into something a little bit more robust: from my perspective, there are two things that are important for any visualization if you're trying to get buy-in and really show what you're doing, why it's important, and why we should care. It should be interactive and it should be dynamic.

So interactive in that you can play around with the data and select and query different parts of that data and then get new insights.

So the user should be able to go in and actually have a bit of control over the visualization that they're seeing. And then dynamic in that, as new information comes in, at whatever cadence that is.

So it could be daily.

That's probably a little bit too much. Weekly, monthly, quarterly.

You're getting that automatic dynamic update so that you can see the new visualization.

And so a lot of what we try to do is move beyond just Excel, and I love Excel, so I'm not trying to take Excel down, but I think you really want to move beyond it and start to layer data visually and temporally so that people can start to understand what this means.

And so with these interactive and dynamic dashboards, now we can layer on location, which then allows us to link to localized demographics, broader demographics, specific causes in that area, and temporal trends in giving and volunteering, all housed in one interactive, dynamic dashboard. Even though the user didn't create it, they can actually come up with a customized view of what they want to see, based on what they're looking for in the data.

Nicole McPhail (16:51):

And to give an example of this:

So imagine, again, let's just go back to participation rates. You're looking at participation in the United States and you see, "Oh, wow, okay, cool, it's 25% participation," and you're looking at a map, and then you scroll in and you realize, "Oh, this is all happening at one site."

So then you think, "Well, what is that site doing that these other sites are not doing?" So then you scroll in even closer.

And then you're able to look at the various types of causes being supported, and/or at trends happening in the community that require support, based on a needs assessment.

So all of a sudden you're able to go in closer and be like, "Oh, it's because this specific executive cares about this pet cause. His or her leadership might be driving the engagement. Okay. How can we replicate it?"

So just being able to zoom in and learning more and learning more can really help shape the decisions that you are making.
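A small sketch of that drill-down, again assuming a hypothetical export with illustrative column names: the aggregate participation rate can look healthy until you group it by site.

    import pandas as pd

    # Hypothetical export: one row per eligible employee.
    df = pd.read_csv("participation.csv")  # columns: site, employee_id, participated (0 or 1)

    # Aggregate view: the overall participation rate.
    print(f"Overall participation: {df['participated'].mean():.0%}")

    # Zoom in: participation by site, to see whether one site is driving the number.
    by_site = df.groupby("site")["participated"].mean().sort_values(ascending=False)
    print(by_site)  # e.g. one site near 60% while the rest sit in single digits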

Karl Yeh (17:57):

What if you don't have qualitative data? What can you do with anecdotal feedback?

Emily Hazell (18:06):

Well, anecdotal feedback, I think there's lots of different ways that you can incorporate it.

Even again, just to layer it on: say you had a little bit of other qualitative or quantitative data.

I think the anecdotal feedback can really provide an interesting narrative as well, so you can kind of layer it on top.

If all you have is anecdotal feedback, you could also just formalize that process a little bit and turn it into something where you have a couple of stories.

Maybe you then send out a more formal survey to actually start gathering more of that feedback, which can become a bit more of a structured qualitative data set.

And then you can always take qualitative data and convert it into quantitative values that you can actually use to trend and evaluate the program health as well.

Nicole McPhail (18:51):

If you're curious how to do that, it's basically tagging.

So say you use SurveyMonkey or something like that, you ask some questions, and you see a sentiment remark that keeps coming up and coming up and coming up. That's how you can quantify the qualitative data: by basically counting tags on words to find trends in how people are feeling.
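A minimal sketch of that tagging idea in Python, assuming free-text survey responses; the responses and the keyword-to-tag mapping below are invented for illustration, not pulled from any particular survey tool:

    from collections import Counter

    # Hypothetical free-text survey responses.
    responses = [
        "Felt obligated to participate, but the cause was great",
        "Loved volunteering with my team",
        "Honestly it felt mandatory",
        "Great way to connect with the community",
    ]

    # Illustrative tag dictionary mapping keywords to sentiment tags.
    tags = {"obligated": "pressure", "mandatory": "pressure",
            "loved": "positive", "great": "positive"}

    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for keyword, tag in tags.items():
            if keyword in lowered:
                counts[tag] += 1

    print(counts)  # Counter({'positive': 3, 'pressure': 2})

Counting how often each tag shows up turns free-text sentiment into a number you can trend over time.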

Karl Yeh (19:14):

So, is there anything else you want to add in terms of social impact measurement and data collection?

Emily Hazell (19:20):

I think really just start doing it.

So even if you're not an expert in data and metrics, you can start collecting some of the data once you have that framework set up and know what you're solving for, and once you've started listening to your people and gathering some of that anecdotal feedback.

Now, just start thinking about the metrics that you could really use to further support your case for the program.

And so I think a lot of the time people just get overwhelmed about where to start, so just start somewhere.

Maybe it's just one component, like collecting some sentiments from some of our people who are engaged in the programs to begin with.

You can start small and build from it, but start somewhere so that you can start to build that understanding of your data.

And then you can reevaluate over time.

But I think a lot of the time it's just getting the ball rolling.

You can reach Nicole and Emily at darwinpivot.com and nmcphail@darwininc.org.

Connect with Nicole McPhail

Connect with Emily Hazell