INSIGHTS · FREE ME UP AI
AI Policy Template for Not-for-Profits: A Plain-English Starting Point
Published March 2026 · 7 min read
Why your NFP needs an AI policy (and why it doesn't need to be complicated)
An AI Use Policy tells your staff three things: what AI tools they're allowed to use, what data they can put into those tools, and how to handle AI-generated outputs responsibly.
That's it. It doesn't need to be a lengthy legal document. For most Australian NFPs, one to two pages is sufficient.
This article gives you a starting-point template you can adapt for your organisation. It is designed to be practical and readable - written to guide staff behaviour day to day, not to impress a compliance auditor.
Before You Adapt This Template
A few things to confirm before you finalise your policy:
- Identify which AI tools are currently in use across your organisation - both formally approved and informal personal use
- Confirm which version of each tool is being used (e.g. ChatGPT Free vs. Team vs. Enterprise)
- Check with your IT support or Microsoft admin whether Copilot is already included in your Microsoft 365 licence
- Consider whether your major funders have any requirements or preferences regarding AI use
- Confirm that your board or appropriate governance body approves the final policy
AI Use Policy Template - [Organisation Name]
Version: 1.0
Approved by: [Board / CEO / Leadership Team]
Date: [Date]
Review date: [12 months from approval]
Purpose
This policy describes how [Organisation Name] uses artificial intelligence (AI) tools, and the standards we apply to protect our data, maintain our integrity, and uphold our responsibilities to the people we serve.
Principles
We approach AI use with these principles:
- Assistive only - AI supports our people; humans make decisions
- Privacy first - beneficiary, client, and community data is protected at all times
- Human oversight - any AI-assisted output shared externally is reviewed by a human
- Transparency - we are honest about how we use AI when asked
Approved Tools
The following AI tools are approved for use by staff and volunteers:
- Microsoft Copilot - for tasks using internal documents, email, and Teams (organisational account only)
- ChatGPT Team - for individual drafting and research tasks (organisational account only)
- [Add/remove tools as appropriate]
All AI tools must be used through an organisational account, not a personal account.
Data Privacy Rules
The following data must NEVER be entered into any AI tool without specific approval from [relevant role]:
- Names, contact details, or identifying information of beneficiaries or clients
- Confidential funder information or financial data
- Personnel information
- Legally privileged or confidential documents
The following data may be used in approved AI tools:
- Internal draft documents and working notes
- Public information and research
- Non-sensitive administrative content
- Anonymised data, with all identifying details removed
Human Oversight
Staff using AI tools are responsible for:
- Reviewing all AI-generated content for accuracy before use
- Not sending AI-generated content to external parties without human review and approval
- Being accountable for the content they send - the fact that AI drafted it does not reduce individual responsibility
Reporting
If you become aware of AI being used in a way that may breach this policy or create a privacy risk, report it to [relevant role] immediately.
Review
This policy will be reviewed annually, or sooner if there are significant changes to AI technology or our organisation's AI use.
What to Customise
The sections that require your attention:
- Approved tools - replace the example tools with the tools your organisation actually uses or plans to use
- Data privacy rules - the examples given are common; review against your specific data categories and funder obligations
- Relevant role - fill in who is responsible for oversight and reporting
- Review process - decide who reviews the policy and how often
If you're unsure about any of these elements, this is where an AI governance consultant can add real value.
After the Policy Is Approved
A policy on its own isn't governance. Once approved:
- Communicate it to all staff and volunteers - don't just upload it to SharePoint
- Brief your management team on what it means in practice
- Check that only approved tools and accounts are in use
- Set a calendar reminder for your review date
- Consider a brief training session - 30 minutes on what the policy means for daily work
Free Me Up AI can help you with the communication and training component if needed. Learn more about our AI automation for not-for-profits service.
Book a free clarity session