Data is only powerful if people understand and act on it. That’s why just pulling numbers isn’t enough. A good report tells a story, answers key business questions, and helps decision-makers take action. To ensure your analysis actually gets used: ✅ Start with the right question – If you don’t understand what stakeholders really need, you’ll spend hours on the wrong metrics. It’s okay to ask clarifying questions. ✅ Make it simple, not just accurate – Clean tables, clear charts, and insights that anyone (not just data people) can understand. ✅ Provide context, not just numbers – A 20% drop in sales is scary… unless you also show seasonality trends and explain why it’s normal. ✅ Anticipate follow-up questions – The best reports answer the next question before it's asked. ✅ Know your audience – A C-suite executive and a product manager don’t need the same level of detail. Tailor accordingly. Your work should make decision-making easier. If stakeholders are confused, they won’t use your report, no matter how technically correct it is. The best data professionals don’t just crunch numbers. They translate data into impact. Have you ever spent hours on an analysis only for no one to use it?
Writing For Nonprofit Reports
-
Reporting is like sewing—it's about making sense of complexity and weaving together threads of impact. While I've never been a seamstress, I can certainly stitch stories with my words. As a Reporting Officer with UNHCR, the UN Refugee Agency, I like to think that the core of my work is "making complex simple". Only when people understand something can they support it or look for ways to improve it. Here are some things I've learned along the way: ✨ Structured and Clear Presentation of Data: Think of it like giving directions with clear signposts. Organize your content with headings and subheadings to make it easily navigable. A well-structured report helps readers quickly find the information they need. 📊 Use of Key Figures and Visuals: Numbers are great, but they come to life with visuals! Incorporate statistics and visual aids like photos and charts to make your data more impactful. These visuals bring reports to life and make complex information easier to grasp at a glance. 👩👧👦 Showcasing Human Impact: Behind every statistic is a human story. Whether it's Angelina in Mozambique or Alisa in Sudan, real stories of impact illustrate the heart of our work. Sharing these narratives brings meaning and urgency to our reports. 🤝 Highlighting Collaboration and Impact: Partnerships are the secret sauce! Showcasing collaborative efforts and their real-life impacts can really drive home the effectiveness and reach of our initiatives. Writing reports isn't just about numbers—it's about painting a picture of progress and impact. Let's keep telling these stories that matter, making every word count! What about you guys, any tips? #Reporting #HumanitariansAtWork #Communication
-
Your project ended three years ago. The donor moved on. The team disbanded. The report gathered dust. But what if the impact is still happening, and you just stopped looking? International Initiative for Impact Evaluation (3ie) asked that question... then went digging. They revisited 146 closed research projects to see if the evidence had influenced policies, programmes, or public debates after the projects officially ended. What they found challenges how we think about timelines, impact, and follow-up. Here are the takeaways 👇 1️⃣ Impact doesn’t follow your project calendar. 3ie found that evidence from “closed” projects was often used years later, sometimes up to five years after completion. Policy windows opened later, new champions emerged, and findings found fresh relevance. Stop equating project end date with impact expiry date. 2️⃣ Attribution isn’t the goal, contribution is. Instead of trying to “prove causation,” 3ie used Contribution Analysis to test how plausible it was that the research influenced specific outcomes. They checked: ↳What change occurred? ↳Who made that change? ↳How did the evidence feed into it (directly, indirectly, or symbolically)? The result was a transparent, evidence-backed contribution story, not a claim of ownership. 3️⃣ Evidence use is relational, not transactional. The study found that uptake depended on trust, relationships, and timing, not just the quality of the research. Where researchers had maintained contact with policymakers or embedded findings in networks, influence lasted long after funding stopped. 4️⃣ Revisit the ‘cold cases.’ Just because a project file is closed doesn’t mean the story is over. 3ie’s team used document reviews, interviews, and triangulation to trace evidence pathways, discovering new impacts that weren’t visible during project reporting. The lesson? Impact doesn’t stop when projects do.
If you stop looking too soon, you might miss the most meaningful part of your story: how your evidence keeps working long after you’ve logged off. 🔥 If this resonates with how you think about learning, evidence, and systems change, you’ll find more reflections like this on my LinkedIn timeline. Hit the Follow button to be notified when I post similar content so we keep the conversation going. #ImpactEvaluation #ContributionAnalysis
-
I’ve been eyes deep in reports and briefings for the last few weeks, and here are some things I’ve noticed about the ones I’ve found most compelling: 1) ‘How does this work?’ guides are incredibly useful for policymakers. Establishing the foundations before showing what’s new and needed is a great service. 2) I’ve really appreciated people stating their operating assumptions. Trying to work out an organisation’s hidden premises wastes a lot of time - if you have some first-order principles it’s really helpful to share them. 3) Being clear who you’re speaking for stops me trying to guess. Is this analysis based on your frontline work, the involvement of people with experience of the issue, or your analysis of the literature and international examples? Please tell the reader! I’m always particularly impressed when people are honest about the limitations of their research too. It's great to acknowledge other experts, organisations and resources. 4) Using statistics is wonderful, but please be clear what timeframe they relate to and whether they are UK-wide. 5) Disaggregated data is the best data! Highlighting gaps between different groups of people - and how your proposals will close them if they are unfair - helps to focus minds. 6) Beautiful design, data visualisation and proper editing really make things memorable and therefore impactful. Please don’t scrimp on this bit! 7) And finally, please be clear about what readers can and should do. If you can’t imagine the reader putting something on their to-do list as a result of reading your report, then you’re probably not clear enough about what you’re asking for. Huge thanks to everyone who is generating evidence, doing analysis and generating policy recommendations. It’s so appreciated and hugely important.
-
Most non-profits struggle with impact measurement. The reason is simple: their work is multi-dimensional and inter-related, but measurement frameworks tend to reduce everything to a single axis: reach. Reach, as measured in the number of lives touched, has been the cornerstone of impact measurement because it is simple, measurable, objective and easy to comprehend. Sometimes, reach has been supplemented by the scale of impact on the life of each individual reached. But even then, it flattens the story. In practice, this creates three problems: 1. It reduces diverse types of work to the same metric. A think tank that shifts policy, a digital platform that reaches millions, and a grassroots NGO working deeply with 200 families are not comparable. 2. It creates pressure to show scale in numbers, rather than outcomes or systemic influence. 3. It makes it harder for funders and nonprofits to have a shared language of what impact looks like. During my six years at Omidyar Network, I worked on its impact measurement framework, one that I still find incredibly valuable. It captures both the direct impact that organisations have (as measured by reach, depth and inclusion), as well as the indirect impact that perpetuates their influence in other ways (e.g., capital mobilised, replication of practices, institutional and policy shifts). I have found this framework incredibly useful because: 1. It moves beyond “just reach,” allowing nonprofits to tell their story in multiple ways. A for-profit impact start-up may focus on reach, but may also want to document its policy engagements with governments. 2. It works across organisational models, including grassroots NGOs, digital-first orgs, or policy think tanks. Each model may emphasize a different part of the framework but can still be placed on it. 3. It creates multiple valid pathways to being a high-impact organisation (e.g., low reach but high depth, or a pioneering idea that gets widely replicated). 4. It allows nonprofits to adapt the “indirect impact” dimension to their own context. For example, a think tank may customise the policy impact pathway based on its theory of change. Impact is rarely linear. A holistic framework like this creates space for nonprofits to be seen in their full richness, while still giving the ecosystem a common language to work with. #SocialImpact #ImpactMeasurement #Nonprofits #Philanthropy
-
If you want your NGO to grow, learn how to track what you are doing (here is a simple guide to Monitoring and Evaluation for beginners). When I started my nonprofit journey, I thought passion alone would carry every project. I thought if you show up, distribute relief items, support communities and work hard, the impact would speak for itself. It took me a while to realize something important. Impact does not speak for itself. You must track it. You must measure it. You must show it clearly. That is where Monitoring and Evaluation comes in. A lot of new NGO founders avoid M&E because they think it is complicated or only for big INGOs. But if you want donors to trust your work, if you want communities to benefit more, and if you want your organization to grow, you must understand the basics. Here is a simple guide for beginners: 1. Know what you want to achieve Before you start any project, write down your goals. Are you trying to improve school attendance? Give shelter? Reduce hunger in a community? If you are not clear on your goal, you cannot measure progress. 2. Set simple indicators An indicator is just a way to track your progress. Examples: • Number of children who now attend school • Number of households who received clean water • Number of caregivers trained Keep the indicators realistic and connected to your goals. 3. Collect the right data Your data does not need to be complicated. You can use: • Short surveys • Attendance sheets • Photos • Lists of beneficiaries • Interviews • Field observations Good data makes your work believable. 4. Track changes over time Do not wait until the end of the project. Monitor every week or every month. Ask yourself: Are things improving? Is something going wrong? Should we change our approach? Monitoring helps you fix problems early. 5. Talk to the community Sit with people. Ask questions. Listen to their feedback. Sometimes what you are measuring is not what they truly need. Real impact comes from real listening. 6. Evaluate honestly At the end of the project, sit down and ask: What worked? What failed? What will we do differently next time? Honesty is how NGOs grow. 7. Share your results Donors want to see numbers. Communities want to see improvements. Your team wants to feel proud. Share success stories, lessons learned, and clear evidence. Transparency builds trust. Final thought: M&E is not about big words or complex tools. It is simply documenting your work, learning from it, and using the lessons to do better next time. If you take it seriously, it will transform how you run your organization and how the world sees your impact. If you want a part two that breaks down how to create a simple M&E plan, let me know.
-
Two little words most charities don’t use enough: So that. They might seem small, but they make a big difference. They’re often the missing link in your website copy, strategy and impact reports, and in how you talk about what you do. Most charities are great at saying 𝘸𝘩𝘢𝘵 they do. And not bad at explaining 𝘩𝘰𝘸 they do it. But not always great at explaining 𝘸𝘩𝘺 it matters. Let’s be honest - your process is probably quite boring. But your impact? That’s way more exciting! Too many charities keep banging on about 𝘸𝘩𝘢𝘵 and 𝘩𝘰𝘸, without helping people understand 𝘸𝘩𝘺. You just need to add “so that”, like this: “We support older people through our befriending service… SO THAT they feel less isolated and can live independently at home.” “We run a group for parents with young children… SO THAT they build confidence and parenting skills, which benefits the whole family.” “We support people experiencing homelessness at our day centre… SO THAT they feel safe, build self-esteem and reduce isolation.” It works. And don't just take my word for it. I’ve posted versions of this for years, and I’ve heard from charities who say it's improved their bids, strategies, impact reports, website copy, and more. So give it a whirl! SO THAT your messaging explains your impact. SO THAT people finally get why it matters. SO THAT I didn't waste my time writing this 🤣 PS: If this is helpful, tap save SO THAT you can use it in your next report or repost SO THAT it helps others.
-
What is not measured is invisible. What is invisible is lost. What is lost cannot be acted on or remedied. This is not just a play on words—it is the stark reality of impact-driven work. Too often, in the nonprofit space, Monitoring & Evaluation (M&E) is treated as an afterthought, something to be retrofitted at the end of a project cycle. Some organisations see it as a donor requirement rather than a strategic tool. But here is the truth: - If you don’t measure it, how do you know it’s working? - If you can’t see the gaps, how do you improve? - If your impact remains undocumented, how do you secure future funding? M&E is not just about ticking boxes for reports—it is the lifeline of every project. It tells you who you are truly reaching, what change is happening (or not), and where you need to pivot. Without it, your work is a shot in the dark—it may feel good, but is it making a difference? Imagine a doctor prescribing treatment without diagnosing the patient. That is what happens when an organisation launches a programme without clear metrics, baseline data, and impact measurement frameworks. It’s not just inefficient; it’s a waste of resources and opportunity. Nonprofits that embed M&E from the design phase are the ones that: ✅ Prove their impact with evidence, not assumptions. ✅ Secure funding more easily because donors trust verifiable results. ✅ Pivot and improve in real time, rather than waiting until it’s too late. ✅ Scale sustainably, because they understand what works and why. It is time to stop seeing M&E as an external obligation and start treating it as an internal advantage. It is not just about data—it is about learning, adapting, and amplifying impact. So, before you roll out your next big initiative, ask yourself: "Are we building with insight or just good intentions?" Because what is not measured is invisible, what is invisible is lost, and what is lost cannot be acted on—and that is a risk no mission-driven organisation should take.
-
The Dangerous Illusion: Counting Activities Instead of Creating Change The most pernicious myth in the nonprofit sector isn't about resources but about results. When a foundation director posed this question to me—"That's what you DID. But what HAPPENED as a result?"—it revealed a fundamental misconception plaguing our field. We've mastered counting outputs while failing to measure transformation. Consider the stark contrast between these two paradigms: Activity-Focused Organizations: *️⃣ Count volunteers trained, meals served, and events held *️⃣ Report on expenditures and activities completed *️⃣ Present busy-ness as evidence of effectiveness *️⃣ Measure what's easy rather than what matters *️⃣ Lose donor confidence through lack of demonstrated impact Impact-Focused Organizations: ✴️ Document specific changes in client conditions ✴️ Connect activities directly to measurable outcomes ✴️ Establish clear judgment criteria before collecting data ✴️ Form creative partnerships to overcome resource constraints ✴️ Convert data into compelling narratives of transformation I've seen small nonprofits transform their entire approach through this single paradigm shift—moving from counting activities to measuring change. Their once-faltering donor relationships evolved into genuine partnerships. Resources began flowing toward their most effective programs rather than their most visible ones. The truth is, most funders will accept imperfect measurement if it demonstrates a commitment to genuine learning. They know the difference between organizations that count things and organizations that create change. What might emerge if your next board report began not with what you did, but with what happened as a result? #ImpactfulPhilanthropy #TransformationalMetrics #DonorConfidence