<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Artificial Intelligence &#8211; EFA | European Fundraising Association</title>
	<atom:link href="https://efa-net.eu/tag/artificial-intelligence/feed/" rel="self" type="application/rss+xml" />
	<link>https://efa-net.eu</link>
	<description>One Voice, One Goal, Better Fundraising</description>
	<lastBuildDate>Wed, 23 Jul 2025 12:18:43 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://efa-net.eu/wp-content/uploads/2018/08/cropped-EFA-4colours-square-1-32x32.jpg</url>
	<title>Artificial Intelligence &#8211; EFA | European Fundraising Association</title>
	<link>https://efa-net.eu</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Swissfundraising issues guidance to help fundraisers use AI responsibly</title>
		<link>https://efa-net.eu/news/swissfundraising-issues-guidance-to-help-fundraisers-use-ai-responsibly/</link>
		
		<dc:creator><![CDATA[Melanie May]]></dc:creator>
		<pubDate>Wed, 06 Aug 2025 10:31:57 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Switzerland]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<guid isPermaLink="false">https://efa-net.eu/?p=14015</guid>

					<description><![CDATA[Clear review processes, appropriate training and an environmental conscience are key to the responsible use of artificial intelligence (AI) in fundraising, says a new guide created by EFA<span class="excerpt-hellip"> […]</span>]]></description>
										<content:encoded><![CDATA[<p>Clear review processes, appropriate training and an environmental conscience are key to the responsible use of artificial intelligence (AI) in fundraising, says a new <a href="https://swissfundraising.org/de/news/detail-news/?id=97727a64-d251-f011-9b54-f01d87606edb" target="_blank" rel="noopener">guide</a> created by EFA member <a href="https://swissfundraising.org/" target="_blank" rel="noopener">Swissfundraising</a>.</p>
<p>Swissfundraising says that the guidelines are designed to enable both innovation and responsibility, ensuring fundraisers can “effectively utilise the potential of AI for fundraising without neglecting ethical, legal, or environmental aspects”.</p>
<p>The guide’s opening section on ethics reminds readers that AI should “serve as a tool to support people” but that important decisions must “always remain the responsibility of humans”.</p>
<p>The guide goes on to urge charities to “establish clear guidelines” to ensure that AI is used in an inclusive manner, to empower staff to recognise hallucinations and bias, and to develop strategies to reduce them.</p>
<p>It also advises that all content created by AI “should be reviewed for accuracy and compliance with the principles of fairness and diversity” before being used internally or externally. Staff training and a “clearly-defined review process” are key to making this happen, it says.</p>
<p><strong>Legal and green issues</strong></p>
<p>In addition, the guide urges fundraisers to be mindful of copyright and data protection rules, both in terms of materials uploaded to AI models, and the content produced by them, and tells the sector to remember the environmental impact of AI and prioritise tools which are energy-efficient or powered by renewable energy. The guide says:</p>
<p><em>“The benefits of using AI should always be weighed against the resource consumption.”</em></p>
<p>While the five-page guideline document is relatively high-level, Swissfundraising points out that charities can choose to adopt other, more detailed frameworks to guide their use of AI.</p>
<p>The working group which created the guidelines included two members of the Swissfundraising board – Christine Bill and Oliver Graz – as well as its director Roger Tinner.</p>
<p>&nbsp;</p>
<p>Picture by ThisIsEngineering via Pexels</p>
<p>&nbsp;</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Investment, training and transparency needed in fundraising AI adoption</title>
		<link>https://efa-net.eu/news/investment-training-and-transparency-needed-in-fundraising-ai-adoption/</link>
		
		<dc:creator><![CDATA[Melanie May]]></dc:creator>
		<pubDate>Wed, 25 Jun 2025 10:39:42 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[United Kingdom]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<guid isPermaLink="false">https://efa-net.eu/?p=13827</guid>

					<description><![CDATA[UK fundraising experts have published a report which is “intended as a first point of call” for fundraisers looking to safely implement AI in their work.<span class="excerpt-hellip"> […]</span>]]></description>
										<content:encoded><![CDATA[<p>UK fundraising experts have published a report which is “intended as a first point of call” for fundraisers looking to safely implement AI in their work.</p>
<p><a href="https://www.york.ac.uk/media/digital-innovation-philanthropy-fundraising/Shaping%20the%20Future%20of%20Fundraising%20with%20AI.pdf" target="_blank" rel="noopener">Shaping the Future of Fundraising with AI</a> was published earlier this month at the Chartered Institute of Fundraising (CIOF)’s annual Fundraising Convention. Based on surveys and interviews with 100 fundraisers, it is published jointly by the CIOF and the University of York’s Research Centre for Digital Innovation in Philanthropy and Fundraising (CDIPF).</p>
<p>Of those surveyed, 47% are currently using AI in some form in their work, with 62% of those saying it has helped improve their written communications.</p>
<p><strong>Budget advice</strong></p>
<p>More than a third (37%) of survey respondents say that their organization spends less than £1,000 per year on AI, while 31% don’t know how much is spent. The report says that while it may be “tempting” to try to adopt free AI tools and digital innovations, it will require investment to ensure they are used effectively. It advises:</p>
<p><em>“Fundraising leaders need to evaluate where in the development of their fundraising practice AI can make the greatest impact and add the greatest value for their donors and their beneficiaries.”</em></p>
<p>This is likely to include training – three-quarters of respondents, regardless of whether they are using AI or not, say that they lack knowledge of this technology, while only 37% of respondents have received relevant training in the past year. However, the sector also feels there is a lack of trusted sources of training, sector guidance and best practice cases, the report says.</p>
<p>While AI adoption was sometimes portrayed as a generational issue, with younger fundraisers more keen or able to pick up AI tools, one fundraiser cautioned against this view. They told the researchers:</p>
<p><em>“We have an amazing young workforce who are digitally native, and I think they’re very excited. But no, they haven’t got a clue how to do this, or how we can best do this in a fundraising context.”</em></p>
<p>In addition, 60% of all respondents recognised three or more different risks or concerns around the use of AI – and just 5% said they saw no risk at all. Fundraisers already using AI were more likely to see risks than those not using it.</p>
<p><strong>Risks aplenty</strong></p>
<p>The three most commonly cited risks were data bias; ethical concerns (cybersecurity risks, privacy &amp; data breaches); and keeping pace with legal and regulatory requirements.</p>
<p>The report says that there is still work to be done on these as a sector, noting:</p>
<p><em>“Complex issues such as the responsible handling of donor data, legal and regulatory compliance, and the implications of microtargeting can be difficult for the sector to navigate. There is a real need to develop more transparent and consistent AI-powered fundraising approaches cognisant of the human rights implications of emergent technologies and data-driven processes.”</em></p>
<p>Dr Marta Herrero, lead researcher and director of the CDIPF, hopes the report will give fundraisers confidence to begin tackling those points, saying:</p>
<p><em>“We want fundraisers to feel confident about using AI, feeling that they understand what it can do well, the challenges it poses, and – most importantly – that they trust in their own abilities to identify how they can use the tools responsibly for the benefits of the communities and donors they serve.”</em></p>
<p>Ceri Edwards, executive director of engagement at the CIOF and EFA president, added:</p>
<p><em>“As a sector, I believe we can work together to shape a future where AI enhances the impact of fundraising efforts, driving positive change in our communities for years to come.”</em></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Quarter of French charity professionals suggest donors should get AI veto</title>
		<link>https://efa-net.eu/news/quarter-of-french-charity-professionals-suggest-donors-should-get-ai-veto/</link>
		
		<dc:creator><![CDATA[Melanie May]]></dc:creator>
		<pubDate>Wed, 02 Apr 2025 10:47:17 +0000</pubDate>
				<category><![CDATA[France]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<guid isPermaLink="false">https://efa-net.eu/?p=13297</guid>

					<description><![CDATA[Artificial intelligence (AI) could play an important role in fundraising strategies, say French charity professionals, although some believe donors should be able to tell charities to<span class="excerpt-hellip"> […]</span>]]></description>
										<content:encoded><![CDATA[<p>Artificial intelligence (AI) could play an important role in fundraising strategies, say French charity professionals, although some believe donors should be able to tell charities to stop using it.</p>
<p>This is according to a new study by <a href="https://www.francegenerosites.org/" target="_blank" rel="noopener">France générosités</a> based on 228 respondents, a fifth of whom are fundraising specialists, working at 102 different nonprofits.</p>
<p>While 83% of survey respondents are already using AI, three quarters of those are doing so on an individual basis, rather than through a team- or organisation-wide initiative or policy.</p>
<p>And just 10% of those already using AI are doing so as part of a fundraising strategy – compared with 82% using it in writing newsletters, articles and other documents; 48% using it for research and monitoring; and 44% using it for communications.</p>
<p>However, 66% of respondents said they could imagine using AI for fundraising strategies – such as donor analysis or target identification – in future.</p>
<p>The survey also asked what nonprofits should do to ensure that any use of AI is consistent with their organization’s values. Nearly a quarter (23%) said that donors should be able to request that there be no use of AI in their relationship with the organization.</p>
<p>More common answers to this question were:</p>
<ul>
<li>88% said that the use of AI must always be complemented by human input</li>
<li>84% said that the organization should only authorise the use of AI tools which meet certain criteria such as data protection controls, or minimization of their environmental impact</li>
<li>60% said that the use of such tools should be made transparent, internally as well as externally</li>
</ul>
<p><strong>Lack of policies</strong></p>
<p>However, the majority of respondents said their nonprofit was still in the early stages of their relationship with AI.</p>
<p>Most said their organization was either considering its position and strategy towards AI (37%) or keeping an eye on it (28%). Only 20% said that AI had been deployed internally, but no respondent said that their organization had specifically taken a position against AI.</p>
<p>Larger organizations were slightly more likely than smaller organizations to be advanced in their AI thinking.</p>
<p>In addition, the majority of respondents (65%) said their organization did not have any sort of ethical charter governing their use of AI. Others said they did not know if such a document existed (14%), that one was currently being developed (17%) or that they already had one (4%).</p>
<p>Among those with a charter, or with one being developed, half (47%) said that this could be a useful way of ensuring transparency and building trust with donors.</p>
<p>Among current AI users, 61% say they are using it every week, with a quarter of those using it every day. ChatGPT is by far the most common tool being used, indicated by 84% of respondents, followed by Microsoft Copilot (26%) and then Gemini (10%).</p>
<p>&nbsp;</p>
<p>Picture by Alexandra Koch on Pexels</p>
<p>&nbsp;</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Danish fundraisers want to use AI – but must think carefully about objectives</title>
		<link>https://efa-net.eu/news/danish-fundraisers-want-to-use-ai-but-must-think-carefully-objectives/</link>
		
		<dc:creator><![CDATA[Melanie May]]></dc:creator>
		<pubDate>Wed, 28 Aug 2024 10:55:36 +0000</pubDate>
				<category><![CDATA[Denmark]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<guid isPermaLink="false">https://efa-net.eu/?p=12363</guid>

					<description><![CDATA[Fundraisers in Denmark are keen to implement artificial intelligence (AI) and other new tech in their work – but have been told that they must think<span class="excerpt-hellip"> […]</span>]]></description>
										<content:encoded><![CDATA[<p>Fundraisers in Denmark are keen to implement artificial intelligence (AI) and other new tech in their work – but have been told that they must think carefully about what they are trying to achieve.</p>
<p>This warning comes from <a href="https://www.cbs.dk/en" target="_blank" rel="noopener">Copenhagen Business School</a> lecturer Per Østergaard, following a study conducted by Østergaard and the Danish fundraising body <a href="https://isobro.dk/" target="_blank" rel="noopener">ISOBRO</a>.</p>
<p>Currently, 89% of the 74 respondents to the study are using ChatGPT 4.0, and 37% use Microsoft’s Copilot. The most common areas in which AI is used are communication (74%), fundraising (47%) and administration (29%).</p>
<p>When asked which departments they expect will grow in their use of tech and data in the next three years, 66% said fundraising. Three-fifths (59%) say that they hope their organisation will be able to use data to predict retention and churn in the future &#8211; in addition to 27% of organisations who say that they already do this.</p>
<p>However, many organisations are only using data in a tactical rather than strategic way, the report suggests &#8211; just 13% of respondents agreed with the statement ‘technologies are becoming a central part of our business strategy’.</p>
<p>Additionally, 64% of respondents to the survey say that there is a lack of understanding in their organisation on how to use new technology and data to improve their operations.</p>
<p>Østergaard says that organisations must adopt a “strategy focus” rather than a “technology focus”, in order to avoid the same mistakes that are often seen with projects such as CRM implementations &#8211; and that organisations need to “start with ‘why’ and consider ‘how’ afterwards”. He says:</p>
<p><em>&#8220;Overall, AI presents great opportunities for fundraising organisations. This is both in relation to increasing the quality of work, and to efficiency &#8211; but it must be anchored strategically.”</em></p>
<p>Østergaard adds that, in addition, organisations risk falling into the OO + NT = EOO trap &#8211; where OO means ‘old organisation’, NT means ‘new technology’ and EOO refers to ‘expensive old organisation’. He first described this formula in 1999 to illustrate how “new technology does not automatically lead to improvements”, and says that it has proven accurate on many occasions in the intervening quarter of a century.</p>
<p>&nbsp;</p>
<p>Photo by Junior Teixeira on Pexels</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Using AI in fundraising requires ethical &#038; data literacy upskilling, warns report</title>
		<link>https://efa-net.eu/news/report-warns-using-ai-in-fundraising-requires-ethical-data-literacy-upskilling/</link>
		
		<dc:creator><![CDATA[Melanie May]]></dc:creator>
		<pubDate>Thu, 08 Feb 2024 13:33:49 +0000</pubDate>
				<category><![CDATA[Europe]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Research]]></category>
		<guid isPermaLink="false">https://efa-net.eu/?p=11896</guid>

					<description><![CDATA[AI offers many opportunities for charities and nonprofits, but is not yet ready to make ethical fundraising decisions, a report from the international fundraising think tank<span class="excerpt-hellip"> […]</span>]]></description>
										<content:encoded><![CDATA[<p>AI offers many opportunities for charities and nonprofits, but is not yet ready to make ethical fundraising decisions, a report from the international fundraising think tank Rogare has warned. This means ethical and data literacy upskilling is necessary to ensure the complex ethical considerations unique to the fundraising sector can be navigated when using it.</p>
<p>The report, <a href="http://www.rogare.net/ai-ethics" target="_blank" rel="noopener"><em>Artificial intelligence and fundraising ethics: A research agenda</em></a><em>,</em> examines ethical issues arising both from applying AI in fundraising, and using AI to resolve fundraising dilemmas. It starts from the assumption that generic concerns and guidance about the use of AI in any context can’t just be transferred and overlaid onto fundraising – because its use in this particular context will throw up unique ethical challenges.</p>
<p>It has been put together by a multinational project team led by American fundraising consultant Cherian Koshy, who is also a member of Rogare’s Critical Fundraising Network.</p>
<p><strong>AI and the ethics of fundraising</strong></p>
<p>Two overarching themes emerged from the project group’s work. The first is that AI does not currently have access to sufficiently sophisticated knowledge of the ethics of fundraising to be able to make ethical decisions. However, it can be used to guide fundraisers through the process of making ethical decisions, such as priming them about what questions to ask, as might be the case in gift acceptance/refusal dilemmas.</p>
<p>The second emergent theme is that because AI lacks sufficient knowledge of fundraising ethics, human oversight is needed to ensure its use in fundraising practice is done ethically and in accordance with best practice and regulatory codes.</p>
<p>Cherian Koshy commented:</p>
<p><em>“Not only does this oversight require a high degree of ethical literacy on the part of human fundraisers, it also requires a high degree of data literacy.</em></p>
<p><em>“However, it is questionable whether both the ethics and data skills, knowledge and competencies exist to the required degree across the entirety of the fundraising workforce that will be tasked with oversight of the use of AI in fundraising.</em></p>
<p><em>“As AI enters and becomes widespread in fundraising practice, we must upskill the human overseers with this knowledge and these competencies. Skilled and knowledgeable human oversight of AI in fundraising is absolutely essential.”</em></p>
<p><strong>Other considerations for fundraisers</strong></p>
<p>Other issues considered in the report and addressed in the research agenda include:</p>
<ul>
<li>The need to balance transparency around AI with potential negative impacts on donations. Supporters increasingly expect clarity on whether they are interacting with a bot or human. But revealing the use of AI could decrease engagement or giving. More research is required to navigate this tension.</li>
<li>Data ethics and potential biases are also examined. Relying on flawed data risks amplifying discrimination through automated decisions. Thorough analysis should scrutinise existing fundraising data sets and AI systems for embedded biases.</li>
<li>Maintaining an inclusive sector is a priority. Cost barriers could concentrate AI capabilities among larger nonprofits. The project team advocates for “shared data infrastructure and open standards” to ensure equal access for smaller organisations.</li>
<li>The report cautions against over-reliance on AI that could cause a “loss of fundraising expertise through deskilling”.</li>
<li>Unique intellectual property issues require clarification as AI enters fundraising, such as who owns outputs like synthetic media. And determining accountability for potential harms from opaque AI systems presents challenges.</li>
<li>As charities increasingly adopt AI, maintaining public trust will require transparency, assessing workforce impacts, and developing governance aligned with supporter expectations and sector values. With careful oversight, AI presents opportunities to advance nonprofit missions. But as this report emphasises, we must proactively address the ethical dimensions.</li>
</ul>
<p>To address the issues, the report presents a 10-point research agenda tailored to the use of AI in fundraising. It calls for gathering stakeholder perspectives, auditing data and algorithms, developing ethical frameworks, and assessing oversight mechanisms.</p>
<p>&nbsp;</p>
<p>Picture by Gerd Altmann on Pixabay</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Charlène Petit: The New Deal of Digital, AI &#038; the Donor Experience – &#038; how to survive it</title>
		<link>https://efa-net.eu/features/charlene-petit-the-new-deal-of-digital-ai-donor-experience/</link>
		
		<dc:creator><![CDATA[Melanie May]]></dc:creator>
		<pubDate>Wed, 04 Oct 2023 11:15:18 +0000</pubDate>
				<category><![CDATA[Expert View]]></category>
		<category><![CDATA[Features]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Digital]]></category>
		<guid isPermaLink="false">https://efa-net.eu/?p=11493</guid>

					<description><![CDATA[Ageing supporters, rising online giving, and donation volatility mean nonprofits are tackling the triple challenge of renewing their donor base, conversion and retention. From now on,<span class="excerpt-hellip"> […]</span>]]></description>
										<content:encoded><![CDATA[<p><em>Ageing supporters, rising online giving, and donation volatility mean nonprofits are tackling the triple challenge of renewing their donor base, conversion and retention. From now on, says Charlène Petit, </em><em>founder of Facteur Digital, and the</em> <em>FILantropio podcast</em><em>, their resilience will depend on their ability to embrace the New Deal of Digital, AI and Donor Experience Design 2.0.</em></p>
<p>Digital technology in the charitable sector is a bit like the Loch Ness monster: it’s exciting fodder for discussion, but nearly impossible to spot in action. While most charities are aware that they need to develop their digital culture, they don’t know how to handle the &#8220;beast&#8221; due to a lack of training and skills.</p>
<p>Seven deadly sins are preventing organizations from achieving digital maturity:</p>
<ul>
<li><strong>A short-sighted, ROI-based approach</strong>, preventing organizations from fully investing in a digital long-term transformation. Usually this is due to a lack of resources, expertise, time and talent. Interestingly, a good long-term decision often yields disappointing short-term results.</li>
<li><strong>A lack of digital expertise on boards of directors, </strong>leading to antagonism between directors and employees over digital’s place. Directors often fail to show leadership on this subject, and don’t understand the resources and skills required. Digital campaigns are often treated as one-offs, which – unless financial results are positive in the short term – renders any attempt to invest in digital superfluous.</li>
<li><strong>Major internal discord </strong>caused by a lack of understanding about the opportunities offered by digital marketing and how to use the tools.</li>
<li><strong>Digital goals that aren’t linked to any strategy</strong>. In most cases, these goals are not formally presented in a strategic plan, let alone an action plan. Without precise objectives or performance indicators, organizations often make do with &#8220;quick wins&#8221;.</li>
<li><strong>Absence of a budget dedicated entirely to digital</strong>, which means organizations work on a piecemeal basis, and digital projects are budget lines within other budget items with no way to easily manage the digital budget as a whole.</li>
<li><strong>Data analysis is lacking</strong>, with analysis mostly limited to viewing basic statistics such as open or click-through rates. Generally speaking, organizations don&#8217;t know their conversion statistics, and lack the tools and knowledge to measure them.</li>
<li><strong>Too much siloing between different organizational functions</strong>, particularly between fundraising, marketing and communications activities. This is a considerable hindrance towards growth, as failure to ensure communication between all the building blocks of an organization’s ecosystem prevents the adoption of a cross-functional and iterative approach.</li>
</ul>
<p>These major pitfalls are accompanied by challenges that cannot be met with half-truths or half-measures. It’s becoming vital for many organizations to move out of reactive mode and into proactive mode – future generations of donors are 100% digital.</p>
<p>The other major challenge is to remain visible in a hyper-competitive and disrupted market, marked by hyper-choice and the arrival of new trusted third parties. Knowing how to communicate one&#8217;s promise and value proposition online is a minimum requirement for existence nowadays. Becoming a media brand is an undeniable asset. Successful organizations know how to continuously tell their story to stay top of their audience’s mind.</p>
<p>Case in point is <a href="https://www.thediaryoflouise.ca/" target="_blank" rel="noopener">The diary of Louise</a> by Relief. To raise awareness of its services, the organization created an online platform featuring the fictional Louise, who lives with anxiety and shared her ups and downs every week for a year. Brand content is definitely a powerful ally for digital projects.</p>
<p>Grant-making foundations will have a decisive role to play in this conversion, by providing adequate funding to accelerate the sector&#8217;s digital transformation. The cost of inaction is all too often overlooked, and will likely be even higher with the advent of artificial intelligence.</p>
<p>&nbsp;</p>
<p><strong>The rise and pitfalls of AI</strong></p>
<p>In the space of a few months, AI has become the talk of the town with the arrival of ChatGPT making its impact on our lives and businesses tangible and concrete. We’re entering a civilizational revolution that’s as exciting as it is terrifying, and the challenge will be to strike a balance between innovation and the preservation of human dignity.</p>
<p>The philanthropic world is no exception to this technological tsunami, and those involved must seriously consider the risks and opportunities of integrating algorithmic tools into their practice. Here are the 10 commandments to take into account:</p>
<ol>
<li><strong>Conduct an impact analysis before implementing AI</strong>.</li>
</ol>
<p>It’s important to identify the upstream consequences of a major organizational change on teams, operational processes and even corporate culture, to ensure adoption is accepted and understood.</p>
<ol start="2">
<li><strong>Define the place and function of AI within your organization</strong>.</li>
</ol>
<p>We all agree automation of repetitive, low value-added tasks is desirable, but what about the generation of text and images? If you&#8217;re looking to stand out from the crowd and add heart and spirit to your content, humans will do a better job. On the other hand, ChatGPT can be a great help in developing personas and marketing strategies. Tasks based on information retrieval should also be treated with caution. Doing trend monitoring or researching potential donors can be less risky than informing customers via a chatbot equipped with AI.</p>
<ol start="3">
<li><strong>Adopt a policy for the responsible and supervised use of AI</strong>.</li>
</ol>
<p>Use this to guarantee the protection of donors&#8217; privacy, stating the nature and scope of the tasks entrusted to it, and ruling on the type of data that can be submitted to AI. This must be drawn up in collaboration with the employees responsible for applying it, and made public and accessible to donors, in the same way as a confidentiality policy.</p>
<ol start="4">
<li><strong>Gauge your relationship to the dispossession of your content.</strong></li>
</ol>
<p>Everything you inject into AI no longer belongs to you. If you ask it to put itself in the shoes of a donor and give its opinion on your brochure or communications, the elements submitted will certainly clarify your request, but also train the tool and feed future answers to other people&#8217;s questions. Google has said that any content made public on the internet could be used to feed its AI assistant Bard, including websites, social networks and YouTube videos. It’s plausible that organizations will have to prioritize the content they distribute, from the most generic (and copyable) to the most differentiating, according to the degree of exclusivity they wish to have. It&#8217;s a safe bet that audio content formats, which are harder to copy than text, will become increasingly popular.</p>
<ol start="5">
<li><strong>Learn to avoid bias and utilitarian criteria</strong>.</li>
</ol>
<p>Using algorithmic tools that are faster and more efficient than humans in terms of execution and data cross-referencing doesn&#8217;t mean you don&#8217;t need to think critically. We need to be more practical in our approach to AI, entrusting it with workflow optimization rather than decision-making. Take the case of a community foundation seeking to analyze several hundred grant applications within a tight timeframe with an external selection committee. Going through each application individually, entering the essential information in a pre-established criteria grid, is a colossal task. To escape the manual hell of cut-and-paste, the combination of AI and automation tools is a lifeline.</p>
<ol start="6">
<li><strong>The tool must not become an avatar of magical thinking for charities in need of everything</strong> <strong>(resources, talent, time)</strong>.</li>
</ol>
<p>Using ChatGPT as an ace up your sleeve, yes. Making it the master of your strategy, no. AI is a skill that comes on top of business expertise, and it&#8217;s essential to have that beforehand. Otherwise you&#8217;ll be undercutting AI’s full potential for your organization.</p>
<ol start="7">
<li><strong>Play the transparency card to preserve trust.</strong></li>
</ol>
<p>The last thing a donor expects is for &#8220;love of humankind&#8221; to be embodied in artificially infused exchanges. At a time when we&#8217;re already worrying about the influence of AI on human relationships, isn&#8217;t it better to reveal the contexts in which we use it? Who knows, we may one day see the appearance of a &#8220;Not generated by AI&#8221; label.</p>
<ol start="8">
<li><strong>Only use AI to do better than before</strong>.</li>
</ol>
<p>The tool must provide you with a leap in growth or productivity. Otherwise, you&#8217;re wasting your time.</p>
<ol start="9">
<li><strong>Don&#8217;t put all your eggs in one basket.</strong></li>
</ol>
<p>Over-reliance on third-party applications and services can limit your ability to function without them. AI is said to be the new electricity, but there&#8217;s no guarantee against blackouts. Once mass adoption and dependency have set in, the rules may change to our disadvantage, such as increasing subscription costs.</p>
<ol start="10">
<li><strong>What to do with the &#8220;time dividend&#8221;?</strong></li>
</ol>
<p>That&#8217;s the question every organization climbing on the AI bandwagon needs to answer. Productivity gains are good. Knowing what to do with them is better. The time saved in production can be invested in creation, iteration, training or simply in maintaining human relations through a donor experience strategy.</p>
<p>&nbsp;</p>
<p><strong>DXD (Donor Experience Design) reinvented</strong></p>
<p>DXD, also known as Donor Experience (DX), takes its inspiration from Customer Experience (CX) by focusing on the design of a positive and meaningful experience for donors to build loyalty, further engage them and encourage them to support our cause on a regular basis.</p>
<p>In its early 2020s version, DXD is marked by a shift from audience to community. The difference between the two is engagement. The case study that best illustrates this trend is <a href="https://team-planet.com/">Team for the Planet</a> (TFTP), except that donors are replaced by the company&#8217;s own nonprofit investors. Its aim is to raise a billion euros by 2030 to finance around a hundred innovations to combat greenhouse gases. TFTP excels at communicating its value proposition and mobilizing its associates to form a highly effective taskforce. The company ticks all the community-building boxes, enabling it to raise nearly €24 million in just three and a half years.</p>
<p>Canadian entrepreneur and Internet community builder, Greg Isenberg, developed the T.R.I.B.E. test to find out whether an organization (TFTP in our case) has created a community:</p>
<p><strong>Togetherness</strong> — <strong>people have a space to be together</strong>. Collective intelligence is one of TFTP&#8217;s core values. Important decisions, such as which innovations to finance, are taken at the Annual General Meeting with all shareholders, enabling everyone to contribute in addition to their financial participation.</p>
<p><strong>Rituals</strong> — <strong>people can participate in routines</strong>. With TFTP, associates are regularly asked to relay TFTP posts on social media using a turnkey kit.</p>
<p><strong>Identity</strong> — <strong>people feel they’re with like-minded people</strong>. Many people connect on LinkedIn because they have TFTP in common, implying they share the same values of preserving the planet.</p>
<p><strong>Belonging</strong> — <strong>people feel they’re part of something bigger than themselves</strong>. TFTP shareholders display their participation in the project on social networks with an &#8220;I joined&#8221; banner. They also list it as an additional position in their LinkedIn CVs.</p>
<p><strong>Engagement</strong> — <strong>people consistently add to the conversation without need for the brand</strong>. It&#8217;s even more powerful when the content is memorable. To celebrate the 100,000-associate milestone, TFTP released the <a href="https://youtu.be/x7iQ50XDSYk">first rap video for the climate</a>, which caused quite a stir on social media.</p>
<p>The charitable sector has plenty of good ideas to draw on from TFTP to assist in reinventing the donor experience in light of the major generational and behavioural transformations underway.</p>
<p>In a nutshell, nonprofits are caught in a double race against time: the inevitable catching up of their digital transformation in an ever-shortening timeframe, and preparing to leap into the AI vortex while there&#8217;s still time to have their say on ethics and usage. The great challenge for charities in the coming years will be to navigate the complex and ever-changing environment that sits at the intersection of virtual and real worlds.</p>
<p>&nbsp;</p>
<p><strong><img fetchpriority="high" decoding="async" class="size-medium wp-image-11495 alignright" src="https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-300x300.jpg" alt="Charlene Petit" width="300" height="300" srcset="https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-300x300.jpg 300w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-1024x1024.jpg 1024w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-150x150.jpg 150w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-768x768.jpg 768w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-75x75.jpg 75w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-480x480.jpg 480w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-24x24.jpg 24w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-36x36.jpg 36w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug-48x48.jpg 48w, https://efa-net.eu/wp-content/uploads/2023/09/Portrait_charlene_mug.jpg 1368w" sizes="(max-width:767px) 300px, 300px" />About Charlène Petit </strong></p>
<p><a href="https://www.linkedin.com/in/chpetit/">Charlène Petit</a> demystifies and popularizes philanthropy through her <a href="https://filantropio.com/" target="_blank" rel="noopener">podcast FILantropio</a> and her monthly newsletter <a href="https://www.subscribepage.com/vitamine-g">Vitamine G</a>. Based in Montreal since 2013, and after an eight-year career as a fundraising professional, she founded Facteur Digital. Her mission is to support charitable brands (nonprofits, foundations, NGOs) in their digital growth strategies and brand awareness by offering growth marketing consulting and brand content production services (podcast, storytelling, copywriting)<em>.</em></p>
<p>&nbsp;</p>
<p>Picture by Ekaterina Bolovtsova on Pexels</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Patrick Gibbels: New rules are on the way for Artificial Intelligence</title>
		<link>https://efa-net.eu/features/patrick-gibbels-new-rules-are-on-the-way-for-artificial-intelligence/</link>
		
		<dc:creator><![CDATA[Melanie May]]></dc:creator>
		<pubDate>Wed, 10 Mar 2021 10:00:56 +0000</pubDate>
				<category><![CDATA[Features]]></category>
		<category><![CDATA[View from Brussels]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Digital]]></category>
		<category><![CDATA[policy]]></category>
		<guid isPermaLink="false">https://efa-net.eu/?p=7746</guid>

					<description><![CDATA[The use of Artificial Intelligence in fundraising is becoming more and more widespread. With a legislative proposal for AI due from the European Commission imminently, our<span class="excerpt-hellip"> […]</span>]]></description>
										<content:encoded><![CDATA[<p><i>The use of Artificial Intelligence in fundraising is becoming more and more widespread. With a legislative proposal for AI due from the European Commission imminently, our columnist Patrick Gibbels explores how nonprofits are using such technology and what this could mean for the sector.</i></p>
<p>The European Commission is about to propose a legislative package which aims to regulate Artificial Intelligence (AI) in Europe. More precisely, it aims to harmonise such rules across the EU.</p>
<p>There seems to be consensus amongst the EU Member States that a new regulatory framework for AI is needed to complement the applicable legislation, including those relating to consumer protection, data protection and privacy regimes.</p>
<p><a href="https://efa-net.eu/features/patrick-gibbels-new-eu-privacy-laws-are-gaining-momentum">Back in February</a>, I wrote about the upcoming new EU rules on e-privacy and how these might affect civil society organisations and fundraising. At a time when public fundraising is limited, reliance on electronic communication is high. Any new measures that further increase the privacy and data protection of citizens might reduce the ways in which charities and NGOs reach out to and keep track of existing and potential donors. Whilst the e-Privacy Regulation aims at protecting the privacy of EU citizens vis-à-vis digital communications in general, the upcoming framework on AI seeks to regulate an important element of these digital communications: Artificial Intelligence. But how do fundraisers use Artificial Intelligence?</p>
<p>To some, AI might seem like something out of a sci-fi movie, but it is applied more broadly than one might think. The most common example of AI within civil society organisations is most likely the use of chatbots. These are AI-driven automated chat machines which gather data from the user and rapidly develop &#8216;smart&#8217; answers based on these data. They are used on organisations&#8217; websites to answer frequently asked questions, but they can also be used within messenger apps or on social media platforms like Facebook.</p>
<p>AI powered tools can be used to identify prospective donors, and chatbots can offer them tailored information about a charity and engage in conversations by mining data from previous responses, before asking for donations. AI is also used within charities to analyse donor data and suggest how to personalise appeals. With all of this, a question of ethics arises, which the EU intends to tackle.</p>
<p>In the examples above, it is important to ensure that AI conversation does not turn into AI manipulation. The EU Commission proposes a &#8220;human-centric&#8221; approach to AI that respects EU values and principles, and that features non-discrimination, fairness, accountability, transparency, and privacy. Under the new rules, citizens would have to be informed whenever they are interacting with an AI system, but organisations would also be asked to keep track of the data used to train the algorithms, and to ensure that EU values are respected when using this data, which could of course add to the regulatory and administrative burden.</p>
<p>Perhaps more importantly, there does not seem to be consensus at EU level about how all of this should be enforced. Earlier versions of the proposal suggested creating a new regulator at EU level, but the current version expresses a preference for existing national enforcement bodies to enforce the regulations. However, these bodies are already overloaded supervising the GDPR and other privacy regulations. This could lead to fragmented implementation and enforcement, and market distortions between the Member States.</p>
<p>It is important to find a balance between safeguarding EU values and applying ethics, whilst leaving flexibility for AI systems to be used for good causes. The proposal is at relatively early stages, expected to be tabled in April, and civil society organisations have an important role to play in the discussions regarding ethics in AI. So, watch this space. EFA will monitor further developments and report back.</p>
<p>&nbsp;</p>
<p>Main photo (above) by Gerd Altmann from Pixabay</p>
<div id="attachment_5398" style="width: 310px" class="wp-caption alignright"><img decoding="async" aria-describedby="caption-attachment-5398" class="wp-image-5398 size-medium" src="https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-300x200.jpeg" alt="" width="300" height="200" srcset="https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-300x200.jpeg 300w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-768x512.jpeg 768w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-219x146.jpeg 219w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-50x33.jpeg 50w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-113x75.jpeg 113w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-24x16.jpeg 24w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-36x24.jpeg 36w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels-48x32.jpeg 48w, https://efa-net.eu/wp-content/uploads/2020/02/Patrick_Gibbels.jpeg 900w" sizes="(max-width:767px) 300px, 300px" /><p id="caption-attachment-5398" class="wp-caption-text">Patrick Gibbels, Gibbels Public Affairs</p></div>
<p>&nbsp;</p>
<p><strong>About Patrick Gibbels</strong></p>
<p>Patrick is EFA’s public affairs columnist in Brussels. He is the director of Gibbels Public Affairs. Follow Patrick <a href="https://twitter.com/gpa_brussels?lang=en" target="_blank" rel="noopener noreferrer">@GPA_Brussels.</a></p>
<p>Read more from Patrick in our <a href="https://efa-net.eu/category/features/view-from-brussels">View from Brussels</a> column here.</p>
<p>&nbsp;</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
