The following is a conversation with Allison Fine and Beth Kanter, Co-authors of the upcoming book, The Smart Nonprofit: The Human-Centered Approach to Artificial Intelligence for Social Good, and Denver Frederick, the Host of The Business of Giving.


Allison Fine and Beth Kanter, Co-authors of The Smart Nonprofit

Denver: AI for Good brings together the best minds and technologies to solve the world's most urgent challenges, such as reaching the Sustainable Development Goals by 2030. Well, what about AI4Giving? What possibilities does it provide to enhance and expand philanthropy? What do you need to know, but also need to be mindful of? Those answers were provided in a recent paper funded by the Bill & Melinda Gates Foundation and co-authored by my next two guests. 

They are Allison Fine, a pioneer in the use of digital technology for social good, and Beth Kanter, a recognized thought leader in digital transformation and well-being in the nonprofit workplace. And they will be coming out with a new book next spring on the subject titled The Smart Nonprofit: The Human-Centered Approach to Artificial Intelligence for Social Good. 

Welcome back to The Business of Giving, Allison and Beth! 

Allison: Thank you, Denver. It’s so nice to be here with you. 

Beth: Yes. Thank you so much. 

Denver: Let me start with you, Allison. What is AI, and how is that different from the other technologies that we’ve been interacting with over the last few years? 

Allison: AI is computer code built into products that is making decisions for people. And the reason why it’s different, Denver, is we’ve always had tools that help us work faster or bigger, but we’ve never had technology that is making decisions instead of human beings. That’s why it’s smart, and that’s why it’s scary. 

Denver: Yes. And it’s in the background, too, right? 

Allison: We think of it as the refrigerator humming in the background. Unlike social media, which Beth and I have been working in for many years, you can’t see it necessarily in action, so it’s making calculations and predictions in the background. But those calculations or predictions like, say, who gets bail in jail or who gets interviewed for a job are really, really important. 

Denver: No kidding. That humming in the background, the refrigerator, never used to bother me until somebody says, “Does it bother you?” And then you can’t stop hearing it. 

Where are we at regarding the adoption of AI4Giving? Are we in the middle of the innings, the early innings? How would you describe where we are in that order? 

Allison: So we talk about technology adoption in the shape of a hockey stick. Along the bottom axis is the blade of the hockey stick, the part that hits the puck, and that's where the technology gets developed. AI has been in development for decades, some 60 years. Then you hit the heel of the hockey stick. That's when the price point of the technology goes down, computational power goes up – like with mobile phones – and whoosh! Adoption takes off. And somewhere along the handle of the hockey stick, your mom buys a product.

So we’re at the heel of the hockey stick, this inflection point where commercial adoption is happening really quickly, and we need consumers, like nonprofits, to understand what the stuff is, why it’s so important, what the upside is, what the downside is, and how to integrate it into their organizations.

What AI can do for you is to give you the dividend of additional time. It can free you up from some of those boring, time-consuming types of administrative tasks. It’s not going to only give back to you, but that’s going to free you up to spend more time on building relationships with your donors.

Denver: Whoosh!

So, Beth, I’m a fundraiser and I’m hearing about this for the very first time. What’s the first thing you want me to know?

Beth: That’s a great question. The first thing I want you to know is that it’s not going to take over your job and that your boss shouldn’t fire you because the robots aren’t going to take the donors out to lunch. Your job is really important because you provide the humanness, the human-centeredness of that donor relationship.

What AI can do for you is to give you the dividend of additional time. It can free you up from some of those boring, time-consuming types of administrative tasks. For example, if you are the major gifts person and you have to do desk research to figure out: Who should I be talking to? Or, who should I be contacting this week? Who’s most likely to give us a gift? An AI algorithm can do that in a matter of minutes, versus you having to spend 20 hours of your time trying to synthesize many, many data points. 
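
To make that concrete, here is a minimal sketch of the kind of prospect scoring Beth describes: a simple model trained on past giving history that ranks donors by likelihood to give. The CSV file, column names, and features are hypothetical stand-ins for a CRM export, meant only to illustrate the idea rather than any particular fundraising product.

```python
# Minimal sketch of a "who is most likely to give" model.
# The file and column names are hypothetical stand-ins for a CRM export.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

donors = pd.read_csv("donor_history.csv")  # one row per donor

features = donors[["gifts_last_3_years", "avg_gift_amount",
                   "months_since_last_gift", "event_attendances",
                   "email_open_rate"]]
target = donors["gave_last_campaign"]  # 1 if the donor gave, 0 if not

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout accuracy:", model.score(X_test, y_test))

# Rank donors by predicted probability of giving, so a gifts officer can
# start the week with the warmest prospects instead of hours of desk research.
donors["give_probability"] = model.predict_proba(features)[:, 1]
print(donors.sort_values("give_probability", ascending=False)
            [["donor_id", "give_probability"]].head(10))
```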

Denver: So instead of taking time the way social media does, you’re saying this is going to give it back to me. 

Beth: Yes. It's not only going to give back to you, but that's going to free you up to spend more time on building relationships with your donors, which both Allison and I feel is really important because, let's face it, the renewal rates, the retention rates of donors – they're abysmal. Allison likes to say they're "cancerous." And that's because we're treating them like ATMs instead of building really good, solid relationships with the donors.

Denver: Beth, talking about AI4Giving – now, are we talking about a big, big organization? A mid-size organization? Are we really talking about the full spectrum of nonprofit organizations who can avail themselves of this?

Beth: We get that question a lot because I think the perception is: This is Library-of-Congress size data, and you need a data scientist. You need all these expensive tools and software to use it. That’s not necessarily true. 

Yes, there is a category of projects and tools that are expensive and may require more capacity for an organization to implement. But as Allison mentioned before, the tools are becoming more democratized, and they will become more and more available to smaller organizations. They may already be in use in some of the fundraising tools that you're already using. 

So, yes. It is applicable to organizations of all sizes.

Denver: Fantastic. 

Allison, I want to pick up on a little bit of what Beth just said, and that is a lot of people have this reaction when they hear AI4Giving, that we’re becoming even more impersonal. Here are the bots coming in. There’s even – “It’s just going to be more techno-centric.” And what she kind of alluded to, and I know you’ve spoken about as well, that this is an opportunity to humanize fundraising. Talk about that a little bit more, because that is such an important point. 

Allison: It might be the most important point that we talk about, Denver, that AI can't fix bad fundraising. Bad fundraising, as Beth mentioned, is when we are entirely transactional, communicating with donors only to get the next check, or only interested in people if they can write a check. That's why donor retention rates are so low. After year one, the average retention rate for an organization's donors is under 25%. You've already lost more than 75% of people, in large part because you're just churning through them. You're sending email after email. 

When AI takes the rote pieces of work – the busy work, the paperwork, the bureaucratic work – out of the equation by doing it for you, then we have the opportunity that our friend Steve MacLaughlin of Blackbaud calls the "AI dividend" – that gift of extra time. And that's a choice for organizations: Do you want to use the extra time to keep doing the bad work of transactional fundraising? Or do you want to use the gift of time to pivot to a deeply relational model where your organization is getting to know donors really deeply as people – understand why this cause is important to them, what moves them to give, what could turn them into ambassadors to recruit other donors to you? 

We’re looking at the wrong set of metrics as organizations, and AI opens up an opportunity to transition to a different way of thinking about fundraising. But it’s going to take a lot of leadership and a lot of courage to do that.

Denver: It’s a real cultural shift inside the organization and particularly in the development office. I want to pick up a little bit on that retention because… my goodness, that’s been the bane of my existence as well in all the years I spent in the nonprofit sector. 

And I got to tell you–and you know this better than I, Allison–this is what direct mail services are selling. It’s acquisition, acquisition, acquisition. And they always throw in a little bit of retention to kind of act as if they’re covering that, but it’s acquisition. You almost have to change that relationship with them as well to put the focus on retention, correct? 

Allison: So this is a leadership problem. This is not a technical problem. This is what Marty Linsky at Harvard calls an "adaptive problem," because the way we think about fundraising has been – I'll just say it – corrupted by the vendors selling services. They sell the services on the hopes of what, a 1% to 3% return? That means you haven't engaged 97% to 99% of the people who are out there, right? 

Denver: Right

Allison: Then the math clearly shows this is what we call “a leaky bucket problem.” You’ve got people pouring in, and they’re pouring out faster than they’re coming in, which makes fundraisers frantic to fill the bucket up again. Nobody does good work when they’re frantic. This is a leadership problem, though, because… I don’t know about you, Denver, but Beth and I, we’ve been on lots of boards over lots of years. We have never once sat in a discussion at the board table on donor retention. 

So, we’re looking at the wrong set of metrics as organizations, and AI opens up an opportunity to transition to a different way of thinking about fundraising. But it’s going to take a lot of leadership and a lot of courage to do that.

Denver: I really agree with you. I also think it’s a little bit of a mindset we have in our society. Too much emphasis on innovation, if you will, and not enough focus on replication of evidence-based practices. Because everybody wants something new. And for whatever reason, we seem to get more excited by new donors coming in than keeping old ones. And that is insane, to be quite honest. It’s just insane. 

Allison: And expensive. 

Denver: And expensive. Really expensive. 

Allison: It’s seven times more expensive to get a new donor than to keep an old one. So, you have to then think in whose best interest is it to keep getting new donors, and then you head towards the vendors. 

Denver: Absolutely. I’m glad you brought that up. 

Well, let’s start with the data, Beth, because you understand data better than just about anybody I know. What do you have to do to be prepared to have AI work for your organization when it pertains to the data that you’ve got to start with? 

Beth: That's a great question. I would say that you don't start there. Just as we were talking about the human-centered approach, I think that comes first. It starts with lots of discussion for fundraising applications, and particularly for programs: What makes sense for us to automate or predict or to communicate automatically, and what doesn't? Where are the situations where the human is working alongside the machines? That needs to be a discussion. 

Then the next set of discussions I think are essential are around ethics and bias, because there's lots of potential for us to be exclusionary in the way that we're applying AI. And then that starts to get me to the data, and there are a couple of points around data readiness… AI readiness when it comes to data. 

So the first thing, there is a technical piece, which is really about: You have to have enough data. It has to be a large data set. It has to be clean. It has to be free from bias. You may be working with third-party data sets. You have to really understand some of the nuances of how those things are labeled. So there’s the kind of technical piece of the collection and care and sorting of the data. 

And then there's some technical expertise around how you build these models, building those algorithms and understanding how you construct them. And you may not have a data scientist on staff. Not every single nonprofit does. In fact, very few of them do. But that might mean collaborating with a big corporation and bringing in some expertise. 

But then there’s this whole host of ethical issues like: Are you doing a threat analysis on how this data could potentially be weaponized? Especially if you’re using it for program delivery. Allison and I have come across a couple of interesting cases that we’re going to be writing about in our book, where people didn’t really think through the kind of do-no-harm pledge with the data. 

Then there are privacy concerns, and then concerns about the data set that you are training your algorithm on. Is there inherent bias in there? So, for example, if it's a mortgage data set: we know that mortgage lending is biased, right? Redlining of different neighborhoods and home values– 

Denver: It’s pretty much taking the bias from the analog world and bringing it to the digital world.

Beth: Yes. That's a really succinct way of saying it. And if we're unaware of it, if all you have are the coders and the data scientists, and you're not talking to other people – program people, people out in the community, and the end users – then you may not be able to spot those biases until they come up. And then it becomes, "You're being racist," right? 

An algorithm is basically opinions encased in code.

Denver: Yes. It's funny you say that, Beth, because I think there's something about artificial intelligence and algorithms that is more insidious than almost anything else – because what we do is we blame the algorithm. I don't blame my toaster when it doesn't work, but I blame an algorithm without any understanding that humans actually programmed that algorithm. 

Beth: Right. So an algorithm is basically opinions encased in code. 

Denver: Right. Another succinct way of putting it. 

Beth: Yes. Right. So if there is some white male tech dude doing this and creating your algorithm for you, and you haven’t involved end-users, then it may be his opinion embedded in the code. That’s how it happens. 

And that's why we're making a big point of saying that this isn't a technology project, and it's not something to throw down the hall to the IT department. Leadership needs to be involved in it, as well as program staff and end users, in the testing and development of this.
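
One simple, concrete version of the check Beth is describing is to compare how a screening or scoring model treats different groups before it is put in front of clients or donors. The data, groups, and threshold below are invented for illustration; a real audit involves program staff and the community, not just one metric.

```python
# Minimal sketch of a disparate-impact check on a model's decisions.
# The data and the 80% threshold are illustrative only.
import pandas as pd

decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   0,   1,   0,   1,   0,   0],
})

# Selection rate per group: the share of each group the model approves.
rates = decisions.groupby("group")["approved"].mean()
print(rates)

# "Four-fifths rule" style check: flag if the lowest group's selection
# rate falls below 80% of the highest group's.
ratio = rates.min() / rates.max()
if ratio < 0.8:
    print(f"Possible disparate impact: selection-rate ratio = {ratio:.2f}")
```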

Denver: Absolutely. You can’t outsource this to a machine. That’s for sure.

Allison, let me ask you about communications to donors. That’s always been pretty tricky in terms of doing it, making it too general, niching it. How can AI help us be more effective in not only automating it, but actually getting the right message out to the right person at the right time?

Allison: So what AI can do, Denver, is look through all of an organization’s engagements with donors – and some products will even look online and through social media and other third-party data sets – to identify the kinds of communications, the kinds of stories, the kinds of sentiments that an individual donor will respond to. 

And what that does is really democratize a kind of concierge service for donors. So donors at any level now get the personalization that usually only high-net-worth donors get. The AI product – Gravity is one – can help personalize emails and other communications, and can help your organization know which stories will particularly appeal to which donors. 

We know that the Rainforest Action Network used an AI tool to increase the conversion of one-time donors into monthly givers by over 800%. We double-, triple-, quadruple-checked that number because it seemed so crazy. But what they– 

Denver: Oh, that’s a typo, for sure. First time you see it. That can’t be right. 

Allison: We have checked that a bunch of times. And what they did was match up the kinds of stories with the preferences of individual donors, and it really resonated with them.
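
A rough sketch of what that story-to-donor matching can look like under the hood: score each story against a donor's inferred interests with simple text similarity. This is illustrative only and is not how Gravity or the Rainforest Action Network's tool is actually built; the stories and interest text are made up.

```python
# Minimal sketch of matching stories to a donor's interests with text similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

stories = {
    "forest_habitat": "Protecting orangutan forest habitat from palm oil logging",
    "river_cleanup":  "Volunteers restoring a polluted river watershed",
    "climate_policy": "Pushing banks to stop financing fossil fuel projects",
}
# In practice this would be inferred from past gifts, clicks, and surveys.
donor_interests = "cares about wildlife and protecting forest habitat"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(list(stories.values()) + [donor_interests])

# Compare the donor's interest vector against every story vector.
scores = cosine_similarity(matrix[len(stories)], matrix[:len(stories)]).flatten()
best_story, best_score = max(zip(stories, scores), key=lambda pair: pair[1])
print(f"Lead with the '{best_story}' story (similarity {best_score:.2f})")
```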

Denver: That’s one way to address your retention problem, isn’t it?

Allison: That’s exactly right. 

Denver: That's for sure. Allison, you brought this up, so I want to dig a little bit deeper… because we're always talking about nonprofits and how they can use AI to generate more money and more donors and more fans. But you're talking in terms of what AI can do for the donor.

Speak a little bit about that in terms of using it to match them up with their needs and their passions and organizations that they would be inclined to support. 

Allison: So the flip side of nonprofits using AI is AI tools like Philanthropy Cloud, which Salesforce has been developing, that can match up the individual preferences of the donor to its very large database of potential organizations. So I may want to give in a certain geographic area, on a certain issue, at a certain level, and it can provide me with all sorts of options to do that. 

The hope with that kind of system, Denver, is that donors can come across smaller organizations that they may never have known about before. So it democratizes that side of the equation as well. 
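
As a toy illustration of that kind of matching, a donor's stated preferences (issue, geography, giving level) can be filtered against a catalog of organizations. The catalog and preference fields below are made up; Philanthropy Cloud's actual matching is far richer and proprietary.

```python
# Minimal sketch of matching a donor's preferences to a catalog of nonprofits.
organizations = [
    {"name": "River Trust",     "issue": "environment", "region": "Hudson Valley", "min_gift": 25},
    {"name": "City Food Bank",  "issue": "hunger",      "region": "Hudson Valley", "min_gift": 10},
    {"name": "Forest Alliance", "issue": "environment", "region": "Pacific NW",    "min_gift": 100},
]

donor_preferences = {"issue": "environment", "region": "Hudson Valley", "budget": 50}

matches = [org for org in organizations
           if org["issue"] == donor_preferences["issue"]
           and org["region"] == donor_preferences["region"]
           and org["min_gift"] <= donor_preferences["budget"]]

for org in matches:
    print("Suggested match:", org["name"])  # prints: Suggested match: River Trust
```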

Denver: So I have my own personal donor advisor, don’t I? I feel like one of the big guys. That’s pretty cool! 

Allison: That's exactly right. You can call it something like Alfie, your AI bot that's going to help you find organizations and help them communicate better with you. 

Denver: That’s cool. 

Beth, let's pick up on bots. Tell me about chatbots. How are organizations using chatbots to better communicate and connect with their donors? And how is that, in turn, freeing them up to do other work? 

Beth: Sure. I just want to add on to one thing that Allison said. What she was really talking about, and I love this term, which came from Rhodri Davies, is called "phil-gorithms" – algorithms to help philanthropists make decisions.

So let's talk a little bit about chatbots. I think chatbots have been the most widely implemented form of AI. They work through something called natural language processing, which allows humans to talk with computers. Siri is an example of that, and so are chatbots. And Facebook has democratized Messenger chatbots, so you don't need to know how to code to use them. That's why we frequently see them in the nonprofit sector attached to Facebook Pages, but they're somewhat limited. 

The more sophisticated ones are interesting. One of my favorite examples is a fundraising chatbot on the website for Extra Life, an online gaming fundraiser benefiting Children's Miracle Network Hospitals. Basically, the chatbot serves as an information butler that's there 24/7, able to serve up the right information to donors who may have questions about making a gift or about the fundraiser. And chatbots don't need a lunch break. They don't need to sleep.

And the thing that’s nice about it is, first of all, it improves the donor’s experience because it makes information readily available to them without them having to search through a lot. And also, it’s there all the time. So if the donor is coming on like at two in the morning to make their gift, but the staff is sleeping, the chatbot is there to answer the query. 

On the Children's Miracle Network Hospitals website for this particular online fundraiser, they have both Canadian donors and American donors, for example. And so they were getting a lot of questions from Canadian donors saying, "If I make a donation, what does that equal in Canadian dollars?" The chatbot is something that can handle that question really well, and not only tell the donor what the conversion is, but also point them to a landing page where all the amounts have been converted into Canadian dollars. 

Denver: Fantastic.

Beth: So I think chatbots work when they're well-designed, because we've all been on sites where chatbots are awful and frustrate us. Really good design takes a lot of testing and what's called socialization of the bot: understanding the bot's persona, its script, and the type of information that end users want, and then going through a process of testing, iteration, and improvement. 
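
For readers curious what sits behind a donor-FAQ bot like the one Beth describes, here is a minimal keyword-rule sketch of the Canadian-dollar question. Production chatbots use trained natural language models and live exchange-rate data; the rate, URL, and answers below are placeholders.

```python
# Minimal sketch of a donor-FAQ chatbot handler. Keyword rules stand in for
# the natural language processing a real platform would use.
USD_TO_CAD = 1.35  # placeholder; a real bot would pull a live rate

def answer(message: str) -> str:
    text = message.lower()
    if "canadian" in text or "cad" in text:
        return (f"US gift amounts convert at roughly {USD_TO_CAD} CAD per USD. "
                "See the Canadian-dollar giving page: https://example.org/give-cad")
    if "receipt" in text or "tax" in text:
        return "Tax receipts are emailed within 24 hours of your gift."
    return "I'm not sure - a staff member will follow up in the morning."

# The bot is awake at two in the morning even when the staff is not.
print(answer("If I donate $50, what does that equal in Canadian dollars?"))
```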

Denver: Beth, is artificial intelligence being used to sort of better target in on where a need really exists? Because sometimes we know there’s a pervasive need in an area, but can we use artificial intelligence and better map exactly where it is so we can direct funds more effectively? 

Beth: The answer to that is yes. 

Denver: I’m hoping. 

Beth: And let’s say that– I think that we’re going to see more of that, especially in disaster and humanitarian fundraising. 

One of my favorite examples comes from a collaboration between Google Maps and Google AI machine learning algorithms and an organization called GiveDirectly, which does cash transfers. 

Denver: Oh, I know them very well. Yes. They’re great. 

Beth: Yes. And so what they've been testing, I think with the recent hurricanes in the Gulf region, I believe with Hurricane Harvey back in 2017, is that instead of deploying aid to people based on word of mouth or what they hear about in immediate reports, they're able to slice and dice data from weather information, poverty levels of different neighborhoods, and damage from past hurricanes, and pinpoint neighborhoods that might be missed, or really isolated rural areas and pockets of people who might have been missed in the distribution of aid. You mash that up with GiveDirectly, and they're able to get that cash right to the folks who need it most in the moment. 
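
A minimal sketch of the idea behind that kind of targeting: combine several data layers into a single priority score per neighborhood. The neighborhoods, values, and weights below are invented for illustration and are not drawn from the GiveDirectly or Google work.

```python
# Minimal sketch of ranking neighborhoods for cash aid by combining data layers.
import pandas as pd

areas = pd.DataFrame({
    "neighborhood":   ["East Bayou", "Pine Flats", "Harbor View"],
    "flood_depth_ft": [4.5, 1.2, 3.0],     # from storm and weather data
    "poverty_rate":   [0.38, 0.12, 0.27],  # from census data
    "prior_damage":   [0.60, 0.10, 0.40],  # share of homes damaged in past storms
})

# Normalize each signal to 0-1 so no single layer dominates, then combine
# with weights (arbitrary here) into a priority score.
for col in ["flood_depth_ft", "poverty_rate", "prior_damage"]:
    areas[col + "_norm"] = areas[col] / areas[col].max()

areas["priority"] = (0.5 * areas["flood_depth_ft_norm"]
                     + 0.3 * areas["poverty_rate_norm"]
                     + 0.2 * areas["prior_damage_norm"])

print(areas.sort_values("priority", ascending=False)[["neighborhood", "priority"]])
```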

Denver: Brave new world. I love it!

Allison, so what’s out on the market now? What are some of the commercial products or providers that a nonprofit organization might want to take a look at and engage with in terms of trying to bring all of what we talked about to life?

Allison: So right now, because it’s such a nascent field, Denver, it’s kind of a patchwork quilt of commercial products coming from very small companies. So there’s boodle.AI and Quilt.AI and Gravity. And we mentioned all of these in the report. There are larger organizations like Salesforce and Microsoft that are bringing out much larger, more powerful platforms, and chatbots are being commercialized very, very quickly now. 

It’s really important, as Beth mentioned earlier, Denver, that organizations have a really clear purpose in mind before jumping to a product. We don’t want product driving this process. And we need to be very, very careful about asking questions of the product designers about what data sets they’ve been using, what assumptions are built into the code, what testing for bias there’s been. 

For commercial vendors, as you mentioned before, "algorithm" has become the biggest of the buzzwords, and they are very protective of their algorithms. That doesn't mean that end users shouldn't ask really hard questions and shouldn't expect some peek under the hood to get a sense of what kind of thinking went into the development of the product. 

And then you have to ask hard questions about what happens to your data with this product. Is it drawn into the commercial product's future use? Who owns your data, and who owns the new data that's developed through the system?

So that's why we're writing this new book, The Smart Nonprofit, to help people walk through when, where, and how to use an AI solution, to be smart about it, to be smarter than the bots, and to make sure that it's used equitably. 

Denver: And listening to what you're saying, Allison and Beth, it does seem as if there's an opportunity for the nonprofit sector to set an example here, to raise the bar, because they're in early and they bring values. And I just wonder what you think about that, about leading by example, because I think the sector takes a little bit more time thinking about these things than maybe some others who are–I don't want to say more in a hurry, but you know what I mean–looking to hit a bottom line with a little more urgency.

Allison: This would be the first time in the history of the nonprofit sector, Denver, that we could lead on how to use technology because we’ve always taken a back seat to the for-profit sector and the commercial sector. We’re poor. We just take hand-me-downs. This is an incredible opportunity for the nonprofit sector, for someplace like the Independent Sector, to say, “We’re going to help create ethical standards for the use of this really advanced technology that is replacing human decision making.” 

Like if you’re a social worker, and your organization is now using AI to screen clients before you even have a chance to talk to them, you had better make sure that that screen isn’t preferencing white people. Our opening story is about a tool that does just that. So we feel really strongly that we are obliged as a sector to take a lead ethically on the use of this smart technology. And wouldn’t that be wonderful to become a model for the commercial sector to follow in how to use technology well? 

This is not about the technology itself. It’s not about the technology tools. It’s really about the potential impact, both negative and positive.

Denver: I really can't agree with you more. I had Ross Baird on the show recently. He's the founder of Village Capital. They're a wonderful organization, and they're into impact investing. I was asking him what their organization did with other impact investors and how they work together. And he said, "Well, the role of philanthropy and what we do is that philanthropy is the one that brings values to the table, and it helps influence everybody else." Because these impact investors are sometimes thinking more like the investors and a little less about the impact. And if we can bring that same sensibility to this whole artificial intelligence enterprise, it would be wonderful. 

So, what are your thoughts on that, Beth?

Beth: I couldn’t agree more with what Allison said, and she said it really, really well. Again, I just keep on coming back to: this is not about the technology itself. It’s not about the technology tools. It’s really about the potential impact, both negative and positive.

And because as we’ve been saying, it’s the hum of the refrigerator, it’s invisible. We have in our sector, maybe knee-jerk reactions – Oh, it’s not us. Let’s lean back. Artificial intelligence. Oh my God. Scary robots. Danger, Will Robinson! But we need to lean into this, particularly because of the equity implications that might happen if we don’t. 

Denver: Well, as you sort of alluded to before, AI is the last link of the chain. It's never going to fix bad practices. And all the things that come before it – who you are, what your values are, what you're looking to achieve – this is just a tool to serve that. And if you don't do all that preliminary work, it's not going to serve you well. 

Tell us about your paper, Beth. Where is it? What's it called? And how can people access it? 

Beth: Great. So if you go to ai4giving.org, you will find a fairly detailed paper called “Unlocking the Power of Generosity with Artificial Intelligence.” It’s based on several months of field scanning and interviews and then writing. 

So it is a landscape analysis of what tools are out there and, of course, what the benefits are and what the potential challenges are. We even got out our crystal balls and did some projections about the future and some actions that we can take. And it's really, again, focused on giving and fundraising. So it's of interest to both sides of the equation: the fundraisers, but also those in philanthropy. 

We want to get the best of people and the best of bots, but that always means that people are the ultimate decision makers in an organization. 

Denver: And just to show how much on the cutting edge you are, you’ve got the hashtag #AI4Giving. So that just shows you you’re early in the game. 

And Allison, you've got a book coming out. It's called The Smart Nonprofit: The Human-Centered Approach to Artificial Intelligence for Social Good. Tell us about it, and when can we get our hands on it? 

Allison: So we are writing this book right now. It covers the entire array of organizational functions that AI is going to touch – and that’s every function. So it is, in addition to fundraising, it’s comms. It’s marketing. It’s service delivery. It’s back office. AI is coming for every part of the organization. 

So we want organizational leadership, in the C-suite, in the board room: we want you to know what this stuff is, where it comes from, how you can decide what to use, how and when, and how to make sure that your people are always in charge. We want to get the best of people and the best of bots, but that always means that people are the ultimate decision-makers in an organization. 

Denver: Well, the two of you have made some extraordinary contributions to the sector, and this certainly promises to be another one. I want to thank you both for being here today. It was such a pleasure to have you on the program. 

Allison: Lovely to be here, Denver. We look forward to coming back when the book is out. 

Beth: Yes. Thank you so much! 

Denver: Fantastic. Thank you!

