Unusual Usecases for Serverless Part 1: Donor Engagement at Scale: How Serverless Makes It Affordable for Not-for-profits
Supercharge Donor Experiences, Not Costs: The Serverless Advantage for Not-for-profits
This post is the first in a series exploring where serverless architectures work really well in industries you'd not expect to find them.
Introduction
Not-for-profits often struggle to deliver personalised donor experiences due to limited budgets and small tech teams compared to their for-profit counterparts. This creates a crucial gap, especially during fundraising events when the need to scale and respond quickly is paramount. From the donor's perspective, this can lead to a less-than-ideal experience during a critical moment of support.
Serverless architectures enable organisations to automate and scale these interactions without the heavy costs or maintenance overhead which normally come with traditional web applications, such as the three-tier web architecture. Additionally, much of the operational burden is offloaded to the cloud provider, such as Google Cloud or AWS. Need to be highly available and scale to millions after a TV commercial goes to air? Need to send customised thank-you emails after a donation, with personalised follow-up calls to action? Let's find out how our fictional organisation Community Co implemented a serverless architecture to suit their business and saved on operational and maintenance overheads.
Use case scenarios
Thank You Email Automation: When a donation comes in, a trigger fires which invokes a Cloud Function. Cloud Functions are lightweight pieces of code which execute on demand, as needed, and are generally designed to do one specific thing. In this case, when a donation is made and the donor is new, the function fires off a personalised email based on their donation amount, donation history or any interests they may have indicated.
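To make that concrete, here's a minimal sketch of what the personalisation step could look like. The `buildThankYou` helper and its field names (`donorName`, `amount`) are hypothetical; a real function would also pull donation history from your database:

```javascript
// Hypothetical helper: builds a personalised thank-you message from a
// donation record. Field names here are illustrative, not a real schema.
function buildThankYou(donation) {
  const major = donation.amount >= 100;
  let message = `Dear ${donation.donorName}, thank you for your $${donation.amount} gift!`;
  if (major) {
    // Larger gifts get an extra, more personal call to action.
    message += ' We would love to show you the impact of your generosity.';
  }
  return message;
}

module.exports = { buildThankYou };
```

The same branching idea extends naturally to donation history or indicated interests.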
Targeted Follow-up Campaigns: After a big fundraising event, you may want to keep the momentum and buzz going, so an avenue to consider is to segment donors based on behaviour (recurring donations, event participation, etc.). Using a combination of Cloud Scheduler and Cloud Functions you could set up some automation to send different content sequences over time or when there’s news from your org which may be of interest to different segments.
"Thank You" Goes Beyond Email: A third example use-case could be to explore further communication and engagement with your donors and/or audience. With a similar approach to personalisation as in the other two use cases, you could generate personalised thank-you postcards, merch or even short videos using serverless functions and external APIs (image manipulation, templating tools). With the explosion of generative AI, a tuned model could generate compelling content to directly engage each of your recipients.
Technical Deep Dive
Ok cool, so we have some scenarios and ideas starting to bubble. What could this look like in practice? Let's take the second scenario from the examples above and work through it.
First, we’ll need to design a few Cloud Functions, each responsible for sending a tailored content sequence (emails, announcements, etc.) to a specific donor segment. Cloud Scheduler triggers these functions at designated intervals, or they can be triggered in response to relevant events (new organisational announcements, upcoming donation opportunities) via Cloud Pub/Sub. This approach leverages serverless technology for efficient automation and keeps costs to an absolute minimum, since the functions only run when triggered. By targeting donors with content aligned to their past engagement, you increase the chance of continued support and build stronger donor relationships.
Sample config and code
To start, it’s best to get your function built and deployed so that we can configure it as a target in the next step when we use Cloud Scheduler.
```javascript
const { donorSegmentation, generateContentForSegment } = require('./helpers');
const { sendContent } = require('./emailing');

// HTTP-triggered Cloud Function: sends tailored content to each donor segment.
exports.donorFollowUp = async (req, res) => {
  try {
    const donorSegments = await donorSegmentation();
    // Iterate through segments, sending each its tailored content.
    for (const segment of donorSegments) {
      const tailoredContent = generateContentForSegment(segment);
      await sendContent(segment.emailAddresses, tailoredContent);
    }
    res.status(200).send('Follow-up emails sent');
  } catch (error) {
    console.error('Error in donorFollowUp:', error);
    res.status(500).send('Something went wrong');
  }
};
```
Worth noting: this is a simplified example to show the concept; a real implementation would naturally be more involved.
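For completeness, the helpers referenced above might look something like this. These are hypothetical sketches: in practice donorSegmentation would query your donor database or CRM rather than return hard-coded data.

```javascript
// Hypothetical sketch of ./helpers from the example above.
// A real donorSegmentation would query your donor database or CRM.
async function donorSegmentation() {
  return [
    { name: 'recurring', emailAddresses: ['a@example.org'], monthsSinceLastGift: 1 },
    { name: 'lapsed', emailAddresses: ['b@example.org'], monthsSinceLastGift: 14 },
  ];
}

function generateContentForSegment(segment) {
  // Tailor the message to how recently the segment last gave.
  return segment.monthsSinceLastGift > 12
    ? "We miss you! Here's what your past support has made possible."
    : 'Thank you for your continued support this year.';
}

module.exports = { donorSegmentation, generateContentForSegment };
```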
The Cloud Scheduler setup would involve:
Defining a schedule (cron expression) for when to trigger the functions.
Selecting a target of the invocation (HTTP endpoint or Pub/Sub topic for the Cloud Function).
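As a sketch, creating that job from the CLI could look like the following. The job name, schedule and URL are placeholders for your own values:

```shell
# Run the follow-up function every Monday at 09:00 (placeholder values).
gcloud scheduler jobs create http donor-follow-up-job \
  --schedule="0 9 * * 1" \
  --uri="https://REGION-PROJECT_ID.cloudfunctions.net/donorFollowUp" \
  --http-method=POST
```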
There are many alternatives to Cloud Scheduler, but the ones most worth mentioning are event-based: events occur in other parts of the system and are emitted to invoke the function, which can then follow up with donors on engagement campaigns. Some examples include:
Database changes in Firestore, which invoke a function whenever a document changes. Let's say there's an auction and you want to inform a user they're no longer the highest bidder, or target a segment of users once a new item goes up for auction.
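The decision logic for that auction example could be as simple as the sketch below. The field names are hypothetical; in a real deployment this function would run inside a Firestore onUpdate trigger, comparing the document's data before and after the change:

```javascript
// Hypothetical outbid check, intended to run inside a Firestore
// onUpdate trigger. `before` and `after` are the document's data
// before and after the change; field names are illustrative.
function outbidNotification(before, after) {
  // Notify only when someone else has taken the top spot.
  if (before.highestBidder && before.highestBidder !== after.highestBidder) {
    return { notify: before.highestBidder, newLeader: after.highestBidder };
  }
  return null; // no change of leader, nothing to send
}

module.exports = { outbidNotification };
```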
Pub/Sub events in response to pretty much anything. Pub/Sub is widely considered the glue that connects many distributed programs together to form a system in GCP. Downstream subscribers of Pub/Sub topics can be Cloud Functions: you publish a message to a topic, and the function subscribed to it is invoked to act on that message. For example, an analytics job running in BigQuery could fire off a message to a Pub/Sub topic, which then notifies your donors downstream when an interesting analytical insight occurs.
Going further
Ok so, we've discussed how serverless can save you money and hassle, but let's take it to the next level! Here are a couple of ideas to streamline your processes even further and really get to know your donors better:
CRM Integration: Imagine if your serverless setup talked directly to your CRM. That's a powerhouse of donor information, ready to make your outreach smarter!
Automated Feedback Loops: Think about quick post-donation surveys, sent out as needed. With fine-grained control over your databases or systems, you can control what data is available and build a constantly updating picture of what your donors love.
Conclusion
So, why does all this tech stuff really matter as someone running Community Co? Because when you make the process of donating and contributing easy for your donors, they're more likely to keep giving, volunteer their time, and spread the word! Streamlining processes and adding that personal touch to the experience can really advance the mission.
The best part? Serverless tech is accessible to organisations of every size, from the one-person operation to the conglomerate enterprise. This democratisation of technology allows even the smallest not-for-profits to create awesome donor experiences without breaking the bank. And with the pay-per-use model, you only pay when your code runs: a huge advantage at small scale, or whenever the infrastructure or software needs to run as needed instead of always on.