INSIGHTS - FREE ME UP AI
Published March 2026 - 7 min read
Most NFP boards aren't yet discussing AI formally. But the reality is that AI is already in your organisation - whether you know it or not.
Staff are using ChatGPT to draft reports. Volunteers are using AI tools to create content. Management is exploring automation to reduce admin overhead.
Without board-level oversight, AI adoption in not-for-profits happens informally, inconsistently, and sometimes in ways that create real risk - to beneficiary data, to funder relationships, and to organisational reputation.
This article is designed to help NFP boards understand what responsible AI adoption looks like - and what questions to ask.
Let's start with the positive case. AI, when implemented thoughtfully, can draft routine reports and correspondence, support content creation, and automate repetitive administrative work.
For resource-constrained NFPs, these time savings are genuinely significant. A staff member recovering 5-10 hours a week can redirect that time to the mission.
The risks of ungoverned AI adoption are real:
Staff using free AI tools (such as personal ChatGPT accounts) with client or beneficiary data may be creating privacy breaches: free-tier AI tools can use inputs for model training, and the Australian Privacy Act still applies to that data.
AI can generate plausible-sounding content that is factually incorrect. If that content is included in a funder report, an advocacy submission, or a client communication without human review, the organisation is accountable for the error.
If an NFP is seen to be using AI in ways that feel impersonal or inappropriate, or that compromise community trust - particularly in the disability, mental health, or child services sectors - the reputational damage can be severe.
Boards and executives who aren't aware of how AI is being used in their organisation can't provide meaningful oversight. That's a governance failure in itself.
Responsible AI adoption for NFPs is built on five principles:
Responsible AI for NFPs isn't about slowing down adoption. It's about making sure adoption is safe, accountable, and aligned with your organisation's values.
If your board hasn't yet discussed AI, these questions are a good starting point:
You don't need detailed technical answers to these questions. You need to know that management has thought through them and has a plan.
The most practical first step for most Australian NFPs is to commission a simple AI governance review - understanding what tools are currently in use and whether appropriate governance is in place.
This doesn't need to be an expensive exercise. A focused review typically takes two to four weeks and delivers:
From there, the board has visibility - and the organisation has a foundation for confident, responsible AI adoption. Our AI automation for not-for-profits service includes governance setup as a core component. You can also explore our responsible AI governance approach for more detail.
Book a free 15-minute AI clarity call. We'll help your board understand the risks, the opportunities, and the practical next steps for responsible AI adoption.