While AI has useful applications that can enhance your communications and fundraising efforts, it is critical to understand how these technologies work and when to use them.
In a recent conversation on Artificial Intelligence (AI) and grant writing, a key takeaway was the importance of using AI tools with intention. In this article, we share a guide for community-based organisations (CBOs) seeking to use AI tools for communications and fundraising.
AI and Communications
Artificial intelligence (AI) is technology that enables computers and machines to imitate human learning, comprehension, problem-solving, decision-making, creativity, and autonomy. By analysing large amounts of existing data, computers ‘learn’ how to execute communications tasks such as note-taking, graphic and motion design, text generation, and even customer service.
Many organisations use AI to draft grant applications, generate outreach materials, and analyse donor trends. This is particularly useful when you need to develop proposals in languages you are not fluent in, or when quickly adapting messages for different audiences and platforms. AI can also generate layouts for branded materials such as reports and social media graphics, even for users with minimal design knowledge.
While AI can offer efficiencies and compensate for skill gaps in your organisation, it cannot substitute for human judgment, strategy, and authentic storytelling. Your organisation should be clear about why it is using AI, and ensure that AI serves your broader goals without replacing human insight and authenticity.
How to use AI effectively
Refine ideas, don’t generate them
AI should support human thinking, not replace it. Use it to clarify and strengthen ideas that originate from the people who best understand the issues you are addressing and the realities on the ground. Original ideas are key to developing well-thought-out initiatives and programmes that respond to community needs. Once you have clarified your ideas and gathered relevant evidence, use AI tools to help you refine your communications.
Use context-rich prompts
AI works best when given clear, detailed prompts. You can specify your audience, tone, and even desired length of output. The more context provided, the more relevant and useful the output. We tested this out with Gemini, and here’s what we found:
Vague Prompt:
Write a social media post about Youth Day.
Response:
Happy Youth Day! 🎉 Celebrating the power and potential of young people everywhere. Your voices matter! #YouthDay
Detailed prompt:
Write a 200-character Instagram post for Youth Day. Target: university students in Mwanza, Tanzania. Tone: energetic and engaging. Content: share a statistic about youth volunteerism in Mwanza (25% of students volunteer in community activities). #YouthDay #StudentVolunteers
Response:
Did you know 25% of students in Mwanza volunteer in their communities? 🤯 This Youth Day, let’s make an impact! Get involved & shape the future! What causes are YOU passionate about? 👇 #YouthDay #StudentVolunteers
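One way to make this habit repeatable is to treat a prompt as a small set of fields you fill in before pasting the result into a tool like Gemini. The sketch below is illustrative only: the `build_prompt` helper and its field names are our own invention, not part of any AI tool's API.

```python
def build_prompt(task, audience=None, tone=None, length=None,
                 content=None, hashtags=None):
    """Assemble a context-rich prompt from optional details.

    Any detail left as None is simply omitted, so the same helper
    produces both vague and detailed prompts.
    """
    parts = [task]
    if length:
        parts.append(f"Length: {length}.")
    if audience:
        parts.append(f"Target audience: {audience}.")
    if tone:
        parts.append(f"Tone: {tone}.")
    if content:
        parts.append(f"Content to include: {content}.")
    if hashtags:
        parts.append("Hashtags: " + " ".join(hashtags))
    return " ".join(parts)

# The detailed Youth Day prompt above, rebuilt from its parts:
prompt = build_prompt(
    task="Write an Instagram post for Youth Day.",
    length="200 characters",
    audience="university students in Mwanza, Tanzania",
    tone="energetic and engaging",
    content="a statistic about youth volunteerism in Mwanza "
            "(25% of students volunteer in community activities)",
    hashtags=["#YouthDay", "#StudentVolunteers"],
)
print(prompt)
```

Keeping the fields separate makes it easy to reuse the same structure across campaigns: swap the audience and statistic, and the rest of the prompt stays intact.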
Use AI for editing and structuring
AI can improve grammar, tone, and readability, but the substance must come from human experience and expertise. As one contributor noted in our grant-writing webinar, AI should support the clarity and structure of your communications, but it should not be their primary driver.
Check for bias and verify authenticity
AI operates on the “garbage in, garbage out” principle: it can reproduce the biases in its training data, produce misleading information, or even generate deliberate misinformation when deployed for harmful purposes such as fuelling conflict. It is particularly important to be cautious when AI creates content about underrepresented groups, as highlighted in this article on AI for aid.
“The datasets that AI is built on, the research driving AI progress and the teams that build cutting-edge AI tools do not yet represent large parts of the world. Far too often, development practitioners and voices from the Global South…are rarely a part of the conversation.”
Shachee Doshi, Christopher Burns
Be transparent about AI use
Be transparent about your use of AI; this is key to maintaining trust with your audiences. For example, if you use AI to generate a realistic, photo-like image for a proposal, clearly indicate that it is not a real photograph. Note that funders and partners often use tools to check content originality, so it is best to be clear from the outset.
Test and Refine
AI output should never be accepted as final. Review and tweak it, and ensure the content reflects your organisation’s values and goals.
How not to use AI
Do not rely on AI to create core messages
AI lacks lived experience, so it cannot replicate the authenticity and contextual richness of grassroots storytelling. Using AI to fully draft your communications results in generic, impersonal messaging.
Don’t let AI dictate strategy
While AI can offer helpful insights, strategic direction must come from leadership and community priorities. Remember, you have unique knowledge of your community’s challenges and resources, making you best equipped to shape your organisation’s strategy.
Don’t share sensitive information
Because the field of generative AI is still evolving, the companies that develop these tools may use your data to improve their products. Protect the privacy and security of your community: do not share sensitive information with AI tools.
In summary
While AI-driven technologies have a wide range of applications that can improve your organisation’s efficiency and expand your capacity for impact, it is important to understand how these technologies work and when it is suitable to use them. As a community leader, view AI-driven technologies as assistive tools: decision-making, storytelling, and strategic thinking should always be led by people.

