Wikimedia Foundation
https://wikimediafoundation.org
Thu, 16 Dec 2021 18:45:11 +0000

Five reasons Wikipedia needs your support
https://wikimediafoundation.org/news/2021/11/30/five-reasons-wikipedia-needs-your-support/
Tue, 30 Nov 2021 18:00:00 +0000

In the 20 years since Wikipedia was born, it has grown to become a valued and beloved knowledge destination for millions across the globe. Now home to more than 55 million articles, its growth has been fueled by a global volunteer force and by donors who explore and visit the site regularly. Supported by contributions from readers around the world, the Wikimedia Foundation, the nonprofit that operates Wikipedia, works to ensure the site remains accurate and ad-free, while supporting the free knowledge infrastructure that makes 18 billion monthly visits to the site possible.

Reader support has allowed Wikipedia to celebrate 20 years as the world’s only major website run by a nonprofit organization. We continue to rely on this generosity as the need for accurate, neutral information, created in the public interest, becomes ever more acute. Here are five reasons why you should support free knowledge work:

  1. Ensuring the long-term independence of Wikipedia and other projects that keep knowledge free

“Wikipedia is a unique entity that continues to add value in the lives of me and my loved ones. I feel that the crowd-funded nature of Wikipedia’s balance substantially contributes to Wikipedia’s immaterial values. Whenever I have money and Wikipedia is in need, I will contribute.”

Donor in the Netherlands

Part of the role of the Wikimedia Foundation is to ensure that the independence of Wikipedia and other free knowledge projects is never compromised. The majority of the funding for the Wikimedia Foundation comes from millions of individual donors around the world who give an average of $15 USD. This model preserves our independence by reducing the ability of any one organization or individual to influence our decisions. It also aligns directly with our values, creating accountability with our readers. You can read more about how our revenue model protects our independence in the Wikimedia Foundation’s guiding principles.

Our legal and policy teams also work to uphold our independence, protecting our projects from censorship and advocating for laws that uphold free expression and open up knowledge for anyone to use. Support for this work is essential to securing everyone’s right to access, share and contribute to knowledge-building. 

  2. Keeping Wikipedia safe, secure, and inclusive

Wikimedia values — transparency, privacy, inclusion, and accessibility — are built into our technology. Only around 250 engineering and product staff at the Wikimedia Foundation maintain our servers and software to ensure our projects are always available. That means one technical employee for every four million monthly Wikipedia readers!

As technology platforms increasingly deal with new threats and risks from bad actors, we develop tools and features that protect editor privacy, maintain security, and respond to attacks. We also work to improve our projects, making them more accessible to people with disabilities and to those who primarily access our sites on mobile. Wikimedia projects are built to keep bandwidth costs low for readers, so that anyone, anywhere can enjoy their value.

MediaWiki, the open source software maintained by our engineers in cooperation with volunteers around the world, powers our projects and supports more than 300 languages, more than you will find on any other top-ten website. This empowers our communities to make content accessible worldwide and puts our software on the leading edge of global outreach.

  3. Supporting the global Wikimedia volunteer community to help fill knowledge gaps and improve our projects

Supported by Foundation grants, Wikimedia volunteer and affiliate campaigns continue to make notable contributions to the free knowledge movement. 

For example, some affiliates are working to add new media files to Wikimedia Commons, the world’s largest free-to-use library of illustrations, photos, drawings, videos, and music:

  • In Europe, Wikimedia UK partnered with the Khalili Collections to share more than 1,500 high-resolution images of items from across eight collections; the uploaded images now receive more than two million views per month.
  • Additionally, this year’s Wiki Loves Africa campaign resulted in 8,319 images and 56 video files contributed by 1,149 photographers. The campaign challenges stereotypes and negative visual narratives about Africa. Since the collection began in January 2016, over 72,300 images have been uploaded to the platform under a Creative Commons license and viewed 787 million times.

With paywalls and price tags increasingly placed on content, the growing collection of free-use files on Wikimedia Commons is becoming even more vital to our efforts to expose people around the world to new sights, art, and cultures.

  4. Building a future for greater knowledge equity

Our vision is to create a world in which every human can share in the sum of all knowledge. We know that we are far from achieving that goal and that large equity gaps remain in our projects. From content drives to inclusive product design and research, Wikimedia projects work in several ways to advance knowledge equity and to ensure more diverse, equitable, accessible, and inclusive initiatives. Our Wikimedia in Education initiative, for example, promotes equity in education by expanding access to linguistically and culturally relevant open educational resources, and provides opportunities for teachers and students to participate in knowledge production. In 2020, the Foundation joined UNESCO’s Global Education Coalition, allowing us to discover new ways to support education for the people and communities most affected by the COVID-19 pandemic.

  5. Making sure you know that we use your donations responsibly

The Wikimedia Foundation has a two-decade-long track record of using resources efficiently, transparently, and in service of impact — which is why independent nonprofit evaluator Charity Navigator gives us its highest overall ratings for accountability and transparency. It’s also why nonprofit research organization GuideStar gives us its Platinum Seal of Transparency. We remain committed to making the best use of donor funds to support Wikipedia. 

We invite you to support our mission. You can make a donation to Wikipedia at donate.wikimedia.org. For more information about the Wikimedia Foundation’s fundraising program, please see the 2020-2021 Fundraising Report. For additional information about donating, see our list of Frequently Asked Questions. Thank you!

Creating a Culture of Online Safety: Wikipedia’s Security team rises to new cyber challenges
https://wikimediafoundation.org/news/2021/11/24/creating-a-culture-of-online-safety-wikipedias-security-team-rises-to-new-cyber-challenges/
Wed, 24 Nov 2021 17:08:00 +0000

The Wikimedia Foundation’s Security team is an often invisible force that works tirelessly to protect the information and software of Wikipedia and our other projects. The internet has changed a lot since Wikipedia was created in 2001, and that change has brought with it myriad new security challenges.

From our vast army of diverse volunteer editors who create and maintain the online encyclopedia and its companion projects, to the millions of people around the world who use them every day, our security experts protect our community’s privacy and ensure safe and secure access to invaluable educational resources, acting in real time to confront cyber attacks.

John Bennett, The Wikimedia Foundation.

The Wikimedia Foundation’s Security team is committed to fostering a culture of security. This includes growing security functions to keep up with ever-evolving threats to the health of Wikipedia and the free knowledge movement at large.

It also includes equipping those who are closest to the challenges with appropriate knowledge and tools so they can make good security and privacy decisions for themselves.

The Wikimedia Foundation’s Director of Security John Bennett recently shared in the following Q&A how the Foundation is getting ahead of changing security vulnerabilities, as well as positioning itself at the cutting edge of championing privacy and security on our collaborative platforms.

Q: Why is the work of the Security team so important right now?

The world has come to rely on Wikipedia’s knowledge. We are also living through a moment in history where we are seeing the greatest number of threats to free and open-source knowledge. As we have seen over the past few years, disinformation and bad actors online can pose huge threats to democracy and public health. Wikipedia volunteers work in real time to fact check and ensure the public has safe, reliable access to critical information.

Wikipedia’s continued success as a top-10 site with hundreds of millions of readers means that it will continue to be a target for vandals and hackers. We have to constantly evolve our security efforts to meet new challenges and the growing sophistication of hacking and malicious behavior.

“We are living through a moment in history where we are seeing the greatest number of threats to free and open-source knowledge.”

Security and privacy are key elements in our work to be champions of free knowledge. Though fundamental, this behind-the-scenes work often goes unnoticed. You don’t recognize how important security systems are until they are broken. Investing in a culture of security now will allow Wikipedia to protect its record of the sum of human knowledge for generations to come.

Q: Craig Newmark Philanthropies recently invested $2.5 million in the Foundation’s security operations. What does this investment mean for your work?

This generous new funding is allowing Wikipedia and the Foundation to evolve with the times and get ahead of ongoing threats from hackers and malicious internet users. Over the next two years, we are boosting our security capabilities well beyond where they have been before.

To take a step back, this investment from Craig is going to our Security team, which has the mission to serve and guide the Foundation and Wikimedia community by providing security services to inform risk and to cultivate a culture of security.

This donation is actually Craig’s second in support of our work. In 2019, Craig funded efforts to vigorously monitor and thwart risks to Wikimedia’s projects. That first investment allowed us to grow and mature a host of security capabilities and services. These include application security, risk management, incident response, and more. While threats to our operations happen nearly every day, we work proactively to prevent cyber attacks by following best practices, leveraging open source software to aid our security efforts, and by performing security reviews.

But to keep up with changing security threats, we need to do much more, and that’s what this new funding will help us to do — take our security to the next level. We’re very grateful to Craig for facilitating that. As the founder of craigslist, he has been a long-time supporter of the free knowledge movement and the work we do at Wikipedia, or as he calls it, “the place where facts go to live.”

Q: What are the Security team’s priorities for the near future?

We have developed a comprehensive three-year security strategy with three areas of focus:

First, cyber risk. Security risk is a tool that we use to assess potential loss and potential opportunity. It’s a framework for us to evaluate our priorities. We need to create a common language and understanding of risk within the Foundation and our communities. To that end, we will be rolling out a series of “roll your own” risk assessments for our staff and communities to learn about security and privacy best practices and equip them to make the best, informed decisions for themselves.

“Understanding and having an appreciation for security and privacy is in everyone’s best interest.”

Second, security architecture. Through this pillar of work, we will deploy robust security services and capabilities for the Foundation and our community projects, including Wikipedia. There are two projects I am particularly excited about. The first is a new internal differential privacy service for those seeking to safely use and release data. This will enable our staff, volunteers, researchers, and others to consume and share data in a safe and privacy-respecting way. The second project is an effort to move application security practices and tooling closer to the people who are creating code, which will enhance our current security practice and add velocity.
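Differential privacy itself is a well-established technique, and while the Foundation’s service is internal, the core idea can be sketched: add calibrated random noise to a statistic before release, so that no single person’s data meaningfully changes the output. The minimal illustration below (the function name and parameters are ours, not Wikimedia’s) applies the Laplace mechanism to a counting query:

```python
import math
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (one person changes the count
    by at most 1), so the Laplace mechanism adds noise drawn from
    Laplace(0, 1/epsilon), sampled here via the inverse CDF.
    """
    u = random.random() - 0.5          # uniform on [-0.5, 0.5)
    scale = 1.0 / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Smaller epsilon means more noise and stronger privacy; in practice the hard part is choosing epsilon and tracking its cumulative “budget” across repeated releases.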

Third, capabilities management. Our main goal with this area of our work is to get better at what we do. It is essentially an ongoing internal audit of our security work, with the ultimate goal of improving security efficacy and creating solutions for Foundation staff and community members. We will evaluate the effectiveness of all of our security and privacy services, as well as establish standards and practices to modify or end services if needed.

Q: What does a secure culture at Wikimedia look like, and how can other online platforms follow the Wikimedia Foundation’s lead?

Understanding and having an appreciation for security and privacy is in everyone’s best interest. What I mean is that by creating an understanding of risks, threats, and vulnerabilities, we are teaching others how to appreciate and how to apply an appropriate lens to various security and privacy situations.

In a large online community like ours, we want people to be comfortable with their security and privacy practices and in asking questions. In the spirit of Wikimedia, our team conducts this work with a human-first approach. We know we are going to have vulnerabilities and threats to our platforms and technology stack — that’s inevitable; but one of our greatest strengths to mitigate these challenges is our community. Empowering them and others to help understand and promote security and privacy is key to creating the culture of security we are seeking.

Q: Any closing thoughts?

Wikipedia at its core is a bold idea that anyone can access and contribute to the world’s knowledge. Our platforms were built on the notion that security and privacy sustain freedom of expression. Security doesn’t mean policing the community of volunteer contributors that make Wikipedia work, but rather empowering all of our users and staff with security practices and resources that will protect and expand our reach. By making Wikipedia sustainable and safe from cyberthreats, we are setting an example for other online platforms that a culture of security can and should be a collaborative effort.

“We are setting an example for other online platforms that a culture of security can and should be a collaborative effort.”

I am super grateful to be part of this work and for the amazing group of people I get to collaborate with on a daily basis. Maryum Styles, Hal Triedman, James Fishback, Samuel Guebo, Sam Reed, Scott Bassett, Manfredi Martorana, David Sharpe, and Jennifer Cross make up a small but super powerful team. I am a huge believer in this team and what it can do and can’t wait to see what’s next!

The Digital Services Act could require big changes to digital platforms. Here are 4 things lawmakers need to know to protect people-powered spaces like Wikipedia.
https://wikimediafoundation.org/news/2021/11/16/the-digital-services-act-could-require-big-changes-to-digital-platforms/
Tue, 16 Nov 2021 12:00:00 +0000

Update: The European Parliament’s Internal Market Committee voted on key provisions of the Digital Services Act (DSA) on 13 December 2021. For an overview of the vote and what this means for the future of the DSA, read our post on Medium.

The Wikimedia Foundation, the nonprofit that operates Wikipedia, applauds European policymakers’ efforts to make content moderation more accountable and transparent. However, some of the DSA’s current provisions and proposed amendments also include requirements that could put Wikipedia’s collaborative and not-for-profit model at risk.

Wikipedia’s system of open collaboration has enabled knowledge-sharing on a global scale for more than 20 years. It is one of the most beloved websites in the world, as well as one of the most trusted sources for up-to-date knowledge about COVID-19. All of this is only made possible by laws that protect its volunteer-led model. But now, that people-powered model is getting caught in the crossfire of the DSA proposals.

The current DSA framework is designed to address the operating models of major tech platforms. But a variety of websites, Wikipedia included, don’t work in the same way that for-profit tech platforms do. Applying a one-size-fits-all solution to the complex problem of illegal content online could stifle a diverse, thriving, and noncommercial ecosystem of online communities and platforms.

We are calling on European lawmakers to take a more nuanced approach to internet regulation. There is more to the internet than Big Tech platforms run by multinational corporations. We ask lawmakers to protect and support nonprofit, community-governed, public interest projects like Wikipedia as the DSA proceeds through the European Parliament and Council.

We are ready to work with lawmakers to amend the DSA package so that it empowers and protects the ability of all Europeans to collaborate in the public interest. 

Protect Wikipedia, protect the people’s internet. 

Here are four things policymakers should know before finalizing the DSA legislation: 

  1. The DSA needs to address the algorithmic systems and business models that drive the harms caused by illegal content. 

DSA provisions remain overly focused on removing content through prescriptive content removal processes. The reality is that removing all illegal content from the internet as soon as it appears is as daunting as any effort to prevent and eliminate all crime in the physical world. Given that the European Union is committed to protecting human rights online and offline, lawmakers should focus on the primary cause of widespread harm online: systems that amplify and spread illegal content.

A safer internet is only possible if DSA provisions address the targeted advertising business model that drives the spread of illegal content. As the Facebook whistleblower Frances Haugen emphasized in her recent testimony in Brussels, the algorithms driving profits for ad-placements are also at the root of the problem that the DSA is seeking to address. New regulation should focus on these mechanisms that maximize the reach and impact of illegal content. 

But lawmakers should not be overly focused on Facebook and similar platforms. As a nonprofit website, Wikipedia is available for free to everyone, without ads, and without tracking reader behavior. Our volunteer-led, collaborative model of content production and governance helps ensure that content on Wikipedia is neutral and reliable. Thousands of editors deliberate, debate, and work together to decide what information gets included and how it is presented. This works very differently from centralized systems that lean on algorithms both to share information in a way that maximizes engagement and to moderate potentially illegal or harmful content.

In Wikipedia’s 20 years, our global community of volunteers has proven that empowering users to share and debate facts is a powerful means to combat the use of the internet by hoaxers, foreign influence operators, and extremists. It is imperative that new legislation like the DSA fosters space for a variety of web platforms, commercial and noncommercial, to thrive.

“Wikipedia has shown that it is possible to create healthy online environments that are resilient against disinformation and manipulation. Through nuance and context, Wikipedia offers a model that works well to address the intricacies required in content moderation. Yes, there might be disagreement amongst volunteers on how to present a topic, but that discussion yields better, more neutral, and reliable articles. This process is what has enabled it to be one of the most successful content moderation models in this day and age.”

Brit Stakston, media strategist and Board member of Wikimedia Sverige

  2. Terms of service should be transparent and equitable, but regulators should not be overly prescriptive in determining how they are created and enforced.

The draft DSA’s Article 12 currently states that an online provider has to disclose its terms of service—its rules and tools for content moderation—and that they must be enforced “in a diligent, objective, and proportionate manner.” We agree that terms of service should be as transparent and equitable as possible. However, the words “objective” and “proportionate” leave room for an open, vague interpretation. We sympathize with the intent, which is to make companies’ content moderation processes less arbitrary and opaque. But forcing platforms to be “objective” about terms of service violations would have unintended consequences. Such language could potentially lead to enforcement that would make it impossible for community-governed platforms like Wikipedia to use volunteer-driven, collaborative processes to create new rules and enforce existing ones that take the context and origin of content appropriately into account.

The policies for content and conduct on Wikipedia are developed and enforced by the people contributing to Wikipedia themselves. This model allows people who know about a topic to determine what content should exist on the site and how that content should be maintained, based on established neutrality and reliable sourcing rules. This model, while imperfect, keeps Wikipedia neutral and reliable. As more people engage in the editorial process of debating, fact-checking, and adding information, Wikipedia articles tend to become more neutral. What’s more, volunteers’ deliberation, decisions, and enforcement actions are publicly documented on the website.  

This approach to content creation and governance is a far cry from the top-down power structure of the commercial platforms that DSA provisions target. The DSA should protect and promote spaces on the web that allow for open collaboration instead of forcing Wikipedia to conform to a top-down model.

  3. The process for identifying and removing “illegal content” must include user communities.

Article 14 states that online platforms will be responsible for removing any illegal content that might be uploaded by users, once the platforms have been notified of that illegal content. It also states that platforms will be responsible for creating mechanisms that make it possible for users to alert platform providers of illegal content. These provisions tend to speak to only one type of platform: those with centralized content moderation systems, where users have limited ability to participate in decisions over content, and moderation instead tends to fall on a single body run by the platform. It is unclear how platforms that fall outside this archetype will be affected by the final versions of these provisions.

The Wikipedia model empowers the volunteers who edit Wikipedia to remove content according to a mutually agreed-upon set of shared standards. Thus, while the Wikimedia Foundation handles some requests to evaluate illegal content, the vast majority of content that does not meet Wikipedia’s standards is handled by volunteers before a complaint is even made to the Foundation. One size simply does not fit all in this case.

We fear that by placing legal responsibility for enforcement solely on service providers and requiring them to uphold strict standards for content removal, the law disincentivizes systems that rely on community moderators and deliberative processes. In fact, these processes have been shown to work well in identifying and quickly removing bad content. The result would be an online world in which service providers, not people, control what information is available online. We are concerned that this provision will do the exact opposite of what the DSA intends by giving more power to platforms, and less to the people who use them.

  4. People cannot be replaced with algorithms when it comes to moderating content.

The best parts of the internet are powered by people, not in spite of them. Articles 12 and 14 would require platform operators to seize control of all decisions about content moderation, which would in turn incentivize or even require the use of automated content detection systems. While such systems can support community-led content moderation by flagging content for review, they cannot replace humans. If anything, research has uncovered systemic biases and high error rates that are all too frequently associated with the use of automated tools. Such algorithms can thus further compound the harm posed by amplification. Automated tools are limited in their ability to identify fringe content that may be extreme but still has public interest value. One example of such content is videos documenting human rights abuses, which automated systems have been shown to remove swiftly. These examples only underscore the need to prioritize human context over speed.

Therefore, European lawmakers should avoid over-reliance on the kind of algorithms used by commercial platforms to moderate content. If the DSA forces or incentivizes platforms to deploy algorithms to make judgments about the value or infringing nature of content, we all, as a digital citizenry, miss out on the opportunity to shape our digital future together.

On Wikipedia, machine learning tools are used as an aid, not a replacement for human-led content moderation. These tools operate transparently on Wikipedia, and volunteers have the final say in what actions machine learning tools might suggest. As we have seen, putting more decision-making power into the hands of Wikipedia readers and editors makes the site more robust and reliable. 
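As a conceptual sketch only (the names, scores, and threshold below are hypothetical, not Wikipedia’s actual tooling), an “aid, not replacement” design lets a model score an edit and at most queue it for volunteer review; it never reverts anything on its own:

```python
from dataclasses import dataclass

@dataclass
class Edit:
    revision_id: int
    damaging_score: float  # model's estimate (0..1) that the edit is damaging

def triage(edit: Edit, flag_threshold: float = 0.6) -> str:
    """Route an edit based on a model score.

    The model only flags suspicious edits for human volunteers; the
    final decision, including any revert, stays with people.
    """
    if edit.damaging_score >= flag_threshold:
        return "queue_for_human_review"
    return "no_action"
```

The design choice this illustrates is that automation narrows where human attention goes; it does not substitute for human judgment.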

“It is impossible to trust a ‘perfect algorithm’ to moderate content online. There will always be errors, by malicious intent or otherwise. Wikipedia is successful because it does not follow a predefined model; rather, it relies on the discussions and consensus of humans instead of algorithms.”

Maurizio Codogno, longtime Italian Wikipedia volunteer 

We urge policymakers to think about how new rules can help reshape our digital spaces so that collaborative platforms like ours are no longer the exception. Regulation should empower people to take control of their digital public spaces, instead of confining them to act as passive receivers of content moderation practices. We need policy and legal frameworks that enable and empower citizens to shape the internet’s future, rather than forcing platforms to exclude them further. 

Our public interest community is here to engage with lawmakers to help design regulations that empower citizens to improve our online spaces together. 

“Humanity’s knowledge is, more often than not, still inaccessible to many: whether it’s stored in private archives, hidden in little-known databases, or lost in the memories of our elders. Wikipedia aims to improve the dissemination of knowledge by digitizing our heritage and sharing it freely for everyone online. The COVID-19 pandemic and subsequent infodemic only further remind us of the importance of spreading free knowledge.”

Pierre-Yves Beaudouin, President, Wikimedia France

How to get in touch with Wikimedia’s policy experts 

Wikimedia Foundation launches Wikimedia Enterprise: the new, opt-in product for companies and organizations to easily reuse content from Wikipedia and Wikimedia projects
https://wikimediafoundation.org/news/2021/10/25/wikimedia-foundation-launches-wikimedia-enterprise-the-new-opt-in-product-for-companies-and-organizations-to-easily-reuse-content-from-wikipedia-and-wikimedia-projects/
Mon, 25 Oct 2021 16:56:45 +0000

October 25, 2021, San Francisco, CA, USA ― The Wikimedia Foundation, the nonprofit that operates Wikipedia and other Wikimedia projects, today formally launched a new commercial product, Wikimedia Enterprise, for large-scale reusers and distributors of Wikimedia content. The opt-in product, operated by Wikimedia, LLC, is designed for companies and organizations that want to more easily reuse content from Wikipedia and Wikimedia projects at a high volume.

“In its 20 years, Wikipedia has grown to become one of the most trusted resources for knowledge in the world,” said Lisa Seitz-Gruwell, Wikimedia’s Chief Advancement Officer. “As people and companies increasingly seek to leverage its value, we created Wikimedia Enterprise to address the growing number of ways people encounter Wikipedia content outside of our sites and further support our free knowledge mission. The product meets the growing needs of commercial content reusers, making it easier for people to discover, find, and share content from our sites, while also providing commercial companies an avenue to support and invest in the future of Wikimedia’s knowledge ecosystem.” 

First announced in March of this year, Wikimedia Enterprise makes the process of leveraging, packaging, and sharing content from Wikipedia and Wikimedia projects more efficient for large-scale content reusers. In most cases, commercial entities that reuse Wikimedia content at a high volume have product, service, and system requirements that go beyond what Wikimedia freely provides through publicly available APIs and data dumps. The information panels shown in search engine results and the information served by virtual home assistants are examples of how Wikimedia content is frequently used by other websites.

Wikimedia Enterprise aims to address this challenge through a product that offers service level agreements with companies, including guaranteed uptime, customer service support, and more efficient access to Wikimedia content through a series of new APIs, designed specifically for high volume content reuse. 

Wikipedia and other Wikimedia projects will continue to be funded primarily by reader donations from individuals around the world. Wikimedia Enterprise helps to diversify the Foundation’s financial support, but is expected to be a small portion of the organization’s revenue. This will not impact smaller content reusers who can continue to leverage Wikimedia’s data dumps and APIs freely for their own use. 

“Wikimedia Enterprise is designed to meet a variety of different content reuse needs,” said Lane Becker, Senior Director of Earned Revenue at the Wikimedia Foundation. “From big to small, nonprofit to for-profit, we want to work with organizations that find value in Wikipedia content and want to support our mission of making free knowledge more accessible to everyone.”

The creation of Wikimedia Enterprise arose, in part, from the recent Movement Strategy – the global, collaborative process to direct Wikipedia’s future by the year 2030, devised side by side with movement volunteers. By making Wikimedia content easier to discover, find, and share, the product speaks to two key pillars of the 2030 strategy recommendations: advancing knowledge equity and knowledge as a service.

Interested customers are encouraged to visit the Wikimedia Enterprise website for more information on the product offering, pricing, and other important details. 

About the Wikimedia Foundation 

The Wikimedia Foundation is the nonprofit organization that operates Wikipedia and the other Wikimedia free knowledge projects. Wikimedia Enterprise is operated by Wikimedia, LLC, a wholly owned limited liability company (LLC) of the Wikimedia Foundation. The Foundation’s vision is a world in which every single human can freely share in the sum of all knowledge. We believe that everyone has the potential to contribute something to our shared knowledge, and that everyone should be able to access that knowledge freely. We host Wikipedia and the Wikimedia projects, build software experiences for reading, contributing, and sharing Wikimedia content, support the volunteer communities and partners who make Wikimedia possible, and advocate for policies that enable Wikimedia and free knowledge to thrive. 

The Wikimedia Foundation is a charitable, not-for-profit organization that relies on donations. We receive donations from millions of individuals around the world, with an average donation of about $15. We also receive donations through institutional grants and gifts. The Wikimedia Foundation is a United States 501(c)(3) tax-exempt organization with offices in San Francisco, California, USA.

For more information on Wikimedia Enterprise:

Website: https://enterprise.wikimedia.com

See also:

Frequently Asked Questions

Technical documentation

Strategy and principles essay 

Logos (SVG & PNG)

Homepage on wiki

Why donate to Wikipedia? https://wikimediafoundation.org/news/2021/10/08/why-donate-to-wikipedia/ Fri, 08 Oct 2021 22:39:02 +0000 https://wikimediafoundation.org/?p=65556 Nonprofit organizations across the world are vibrant and diverse, with wide-ranging missions and objectives. One thing that ties them together is the goal of fundraising and awareness – something each organization approaches differently. At the Wikimedia Foundation, your generous donations help us maintain our independence, serve our diverse and global community, and––unlike many other major websites––guarantee that Wikipedia will never have to rely on advertising. In short, your donations help keep free knowledge free.

We are grateful to be funded primarily by readers around the world who give an average of €10, responding to appeals in banners and email. Reader donations are the best and most important support that the Wikimedia Foundation receives, because they are a reflection of the value that people feel Wikipedia brings to their lives. These donations have allowed the Foundation to provide the infrastructure, world-class technical engineering, and community support that a top ten global website requires.

Every donation we receive is effectively and transparently spent to support our mission. It is important to know that donations do not fund the editing of Wikipedia by Foundation staff. The Foundation does not write, edit, or determine what content is included on Wikipedia or how that content is maintained––editorial policy is determined by Wikipedia’s global community of volunteer editors who strive to deliver neutral and reliable information and prevent and revert inaccurate information on the site. 

Here are just some of the ways we do use donations to sustain Wikipedia and free knowledge:

Providing international technology infrastructure

Our dedicated engineering staff work to ensure you can securely and quickly access Wikipedia on your preferred device no matter where you are in the world. Donations also ensure people around the world can access Wikipedia in their preferred language. While most major websites support an average of 50–100 languages, Wikipedia supports over 300 and counting.

Supporting community-led projects to increase access to trusted information

We support numerous initiatives and projects, including various volunteer-led events and workshops that enrich content on Wikimedia sites and invite new editors to join. We collaborate with Wikipedia volunteers around the globe to support their ideas and help them bring more free knowledge to the world. Every year, about 10% of our budget is specifically dedicated to supporting community projects that enrich, grow, and improve knowledge on Wikipedia.

Defending and protecting free access to information globally

Our legal team works to protect free knowledge: preventing censorship, advocating for free licenses and the reform of copyright laws, and defending our volunteers from the threat of reprisal. In addition, while Wikipedia remains an example of the good the internet can provide, governments are looking for new models to curb the influence of larger tech companies. For-profit tech companies have significant financial resources to respond, but we need the help of our supporters to protect our movement’s efforts from these and other looming challenges. 

Empowering French communities

The Wikimedia Foundation supports and promotes the direct impact our free knowledge projects have on French communities. The French language edition of Wikipedia receives nearly one billion pageviews each month, with half of those views coming from France. Wikimedia Commons, our free media repository, has hundreds of thousands of photos and videos sharing the sights and sounds of France with people around the world – including nearly 20,000 files showcasing the Eiffel Tower. Finally, the Foundation provides support to Wikimedia France, the independent movement affiliate working on the ground in France every day.

We see the impact of these efforts in the messages we receive from French donors:

“Thank you for your kind return email. I am a grandmother over 70 who loves Wikipedia, it helps me a lot when I want or need to know something important to me.”

“Your work is masterful, universal. A huge thank you for allowing us, indeed, without effort, to access your encyclopedia on a daily basis or nearly. Your appeal for financial support is also useful so as to remind us that we are all members of a common world, a common history.”

“I use Wikipedia almost daily. I have the chance to have free access to the French and English pages, which multiplies the field of possibilities. Wikipedia is a gold mine, long live free knowledge! Thank you for what you do for the good of humanity.”


It takes a village to successfully operate a global free knowledge platform, and your donations help by sustaining Wikipedia and our numerous other open source projects. We hope to keep Wikipedia primarily funded by our readers long into the future. Please consider making a donation to Wikipedia to ensure it continues to thrive and remain independent for years to come.

Wikimedia Foundation appoints new Vice President of Communications, Vice President of Product Design https://wikimediafoundation.org/news/2021/10/08/wikimedia-foundation-appoints-new-vice-president-of-communications-vice-president-of-product-design/ Fri, 08 Oct 2021 16:31:26 +0000 https://wikimediafoundation.org/?p=65546 8 October 2021, San Francisco, California — The Wikimedia Foundation today announced the appointment of two new Vice Presidents: Anusha Alikhan as Vice President of Communications, and Margeigh Novotny as Vice President of Product Design. The Wikimedia Foundation is the nonprofit organization that supports Wikipedia and other free knowledge projects. With this news, both Anusha and Margeigh build on their established tenure at the Foundation and step into expanded leadership roles supporting the broader free knowledge movement.

“I’m thrilled by this opportunity to recognize the deep bench of leaders we’re developing at the Wikimedia Foundation,” said Robyn Arville, Wikimedia Foundation Chief of Talent and Culture. “Anusha and Margeigh both excel at articulating long-term strategy for their respective functions. They are creative thinkers who have the experience and the operational skills to work across rapidly expanding teams and broaden the ways in which Communications and Product support our global movement.” 

Anusha Alikhan brings more than 14 years of communications experience spanning the areas of human rights, technology, international development, journalism and media innovation. She started her career as an employment and human rights lawyer in Toronto, Canada. She expanded her focus on advocacy by building a career around social good with communications leadership roles at the UN, the National Parkinson Foundation, and Knight Foundation. Technology has been core to her focus in communications; at the UN her work centered on promoting technology solutions to advance global peacekeeping, and at Knight Foundation she led strategies that emphasized the power of technology to inform and engage.

Anusha Alikhan
Anusha Alikhan, Wikimedia Foundation Vice President of Communications

In her two years at the Foundation, Anusha has led an expanding team and strengthened Wikimedia’s strategic communications focus. Her emphasis on targeted campaign building has increased the Foundation’s media impact, particularly around key initiatives including Wikipedia’s 20th birthday, the Foundation’s partnership with the World Health Organization, and our approach to misinformation. She also stewarded the restructuring of the Foundation’s digital strategy to expand its visibility, increase brand alignment, and attract new and global audiences. Her focus on diversity, equity, and inclusion has advanced new approaches to engagement and outreach, combining storytelling with data-driven insights.

As the new VP of Communications, Anusha will oversee communications activities across media, brand, marketing, movement and internal communications functions, with the goal of educating and engaging global audiences on Wikimedia work. She will move forward the department’s strategy to expand communications within regional markets, advancing the Foundation’s equity, advocacy and growth goals, and creating campaigns that meet people where they are. 

“Communications has a vital role to play as the Wikimedia Foundation seeks to grow and support a diverse, global movement, while pushing the understanding of what it takes to keep knowledge free,” said Anusha. “Wikimedia is a technology and social good organization working to advance a movement that is human at its core; our stories and unique perspective have the power to connect and influence. I could not be more excited to collaborate with a talented group of staff to continue and expand this work.” 

Anusha has a master’s degree in journalism from New York University, a law degree from Queen’s University in Ontario and an honors bachelor of arts from the University of Toronto. She is also a Board member at the Communications Network, First Draft News, and Awesome Foundation Miami, and a member of the Communications Network working group for diversity, equity and inclusion.

Margeigh joined the Wikimedia Foundation in 2018, building on an expansive career as a product strategist, designer, and inventor. An architect by training, she has led the development of many first-of-category consumer technology products, including the design of polite intelligent systems able to build trust through transparency with the users they serve. She is also the first inventor on several patents related to user-centered machine learning, video content delivery, and other hardware-software interfaces.

Margeigh Novotny
Margeigh Novotny, Wikimedia Foundation Vice President of Product Design

During her tenure at the Foundation, Margeigh has led the Product Design and Design Strategy teams, with an emphasis on modernizing the reading experience across Wikimedia products, and streamlining the editing experience to make it more accessible for newcomers. She has expanded design research capabilities to make it possible to reach users in emerging contexts, and to engage with them in their preferred language. Margeigh has also co-led the development of Wikimedia’s Product Platform Strategy and been an active contributor to the broader Movement Strategy effort, working closely with Wikimedia volunteer communities on recommendations for the future of the Wikimedia movement.  

In her new role as the VP of Product Design, Margeigh will focus on making inclusive product development methodologies a best practice at the Wikimedia Foundation. She will expand the department’s capability to design with, not for – prioritizing products and features which will empower emerging communities to scale. She will ensure that human-centered design continues to influence organizational practice at the Foundation, and will continue to support design thinking in the free knowledge movement.

“I joined the Foundation because I believe the Wikimedia projects are critical cultural infrastructure,” said Margeigh. “I want to help ensure the resilience and relevance of the projects in these times of global uncertainty and change. I’m honored and grateful to work with such a talented and mission driven team on products that truly welcome all people to participate in the sum of all knowledge.”

Margeigh has a Bachelor of Architecture, with a minor in Philosophy, from California Polytechnic State University in San Luis Obispo. She did her master’s studies in architecture, anthropology, and continental philosophy at the University of Michigan, Ann Arbor and the University of California, Berkeley. She is a registered architect in the state of California.

About the Wikimedia Foundation

The Wikimedia Foundation is the nonprofit organization that operates Wikipedia and the other Wikimedia free knowledge projects. Our vision is a world in which every single human can freely share in the sum of all knowledge. We believe that everyone has the potential to contribute something to our shared knowledge, and that everyone should be able to access that knowledge freely. We host Wikipedia and the Wikimedia projects, build software experiences for reading, contributing, and sharing Wikimedia content, support the volunteer communities and partners who make Wikimedia possible, and advocate for policies that enable Wikimedia and free knowledge to thrive. 

The Wikimedia Foundation is a charitable, not-for-profit organization that relies on donations. We receive donations from millions of individuals around the world, with an average donation of about $15. We also receive donations through institutional grants and gifts. The Wikimedia Foundation is a United States 501(c)(3) tax-exempt organization with offices in San Francisco, California, USA.

China again blocks Wikimedia Foundation’s accreditation to World Intellectual Property Organization https://wikimediafoundation.org/news/2021/10/05/china-again-blocks-wikimedia-foundations-accreditation-to-world-intellectual-property-organization/ Tue, 05 Oct 2021 20:14:00 +0000 https://wikimediafoundation.org/?p=65528 5 October 2021, San Francisco, CA, USA — China today blocked the Wikimedia Foundation’s bid for observer status at the World Intellectual Property Organization (WIPO) for the second time after the Foundation’s initial application in 2020. 

WIPO is the United Nations (UN) organization that develops international treaties on copyright, patents, trademarks and related issues. China was again the only country to object to the accreditation of the Wikimedia Foundation as an official observer. The Foundation will reapply for official observer status in 2022, but it will only be admitted by WIPO if China decides to lift its blockade.

WIPO’s work, which shapes international rules that affect the sharing of free knowledge, impacts Wikipedia’s ability to provide hundreds of millions of people with information in their own languages. “The Wikimedia Foundation’s absence from these meetings deprives our communities of an opportunity to participate in this process,” says Amanda Keton, General Counsel of the Wikimedia Foundation. 

As in 2020, China’s statement falsely suggested that the Wikimedia Foundation was spreading disinformation via the independent, volunteer-led Wikimedia Taiwan chapter. The United States and the group of industrialized countries at WIPO — which also includes many European Union member states, Australia, Canada, the Holy See, Israel, Japan, New Zealand, Norway, Switzerland, Turkey, and the United Kingdom — expressed their support for the Foundation’s application. Since WIPO is generally run by consensus, any one country may veto accreditation requests by non-governmental organizations. 

A wide range of international and non-profit organizations as well as private companies are official observers of WIPO proceedings and debates. These outside groups offer technical expertise, on-the-ground experience, and diversity of opinions to help WIPO carry out its global mandate. Many of these organizations have members, partners, or affiliates in Taiwan. 

“The Wikimedia Foundation operates Wikipedia, one of the most popular sources of information for people around the world. The Foundation’s exclusion sets a worrying precedent for other organizations – nonprofits and for-profits – that are committed to promoting access to information, culture, and education,” adds Keton. “We renew our call to WIPO members, including China, to approve our application. The international community must ensure meaningful civil society participation in UN fora.”

The Wikimedia Foundation provides the essential infrastructure for free knowledge and advocates for a world in which every single human being can freely share in the sum of all knowledge.

Series: Reflections on representation this Black History Month https://wikimediafoundation.org/news/2021/10/05/series-reflections-on-representation-this-black-history-month/ Tue, 05 Oct 2021 03:24:00 +0000 https://wikimediafoundation.org/?p=65531 October marks the start of Black History Month in the UK and Ireland. At the Wikimedia Foundation, this month is an exciting opportunity to honor the lived experiences, stories, and contributions of Black people, Africans, and Africans in the diaspora around the world. 

We are also digging deeper into the meaning of representation — exploring the challenges, the opportunities, and the incredible work already underway to ensure our movement reflects the full, rich diversity of all humanity, not just this month, but all year long. 


The challenges in representation in our movement are real.

When it comes to contributors on Wikimedia projects, the majority (61%) are based in Europe and Northern America.

Only 1.6% of contributors are based in Africa, although people in Africa comprise 17% of the world’s population. In the US specifically, fewer than 1% of Wikipedia’s editor base identifies as Black or African American.

When it comes to content, there are more Wikipedia articles written about Antarctica than about most countries in Africa. Africa has almost twice the population of Europe, and yet only 15 percent as many articles.

We also know that women, Black, Indigenous, and people of color, as well as LGBTQ+ people often face increased scrutiny, pressure, or outright harassment on our projects — a disheartening reality we aim to address with a new Universal Code of Conduct.

The Wikimedia Foundation recently launched the Open the Knowledge initiative to raise awareness of the biases, under-representation, and inequities in our movement that continue to close Wikimedia projects to much of the world’s people and knowledge. We are inviting all who support our mission and participate in our movement to help open the knowledge — making it more diverse, equitable, accessible, and inclusive.


It’s with these data and challenges in mind that we ask ourselves: 

What does it really mean to be represented on Wikimedia projects (such as Wikipedia), and within the movement of volunteers who create it?

Is it reaching a certain percentage point, creating more Wikipedia articles, increasing the number of contributors? Or is it more than numbers on a screen?

We believe it is more, so much more. 

Representation is a construct, one that has layers — one that exists not just in data points but in how people feel. That means real representation comes from a range of approaches on Wikimedia projects: from having articles in your language, to seeing images of people who are part of your history, to attending events where you feel welcomed, and more.


There are several community-led initiatives already making important progress toward knowledge equity — a pillar of our movement strategy that calls us to make Wikimedia projects more welcoming and representative of communities that have been overlooked and oppressed by systems of power and privilege.

Groups such as Black Lunch Table, AfroCrowd, and WhoseKnowledge? focus on adding knowledge about Black history and people of African descent to our projects. The AfroCine project, Africa Wiki Challenge, and Wiki Loves Africa photography campaign aim to increase information on our projects from African countries. These groups are just some of many working to improve diversity and participation across the Wikimedia ecosystem.

Drawing inspiration from these initiatives, and in an effort to elevate them and different views on representation, each week this month, we will highlight a project that works to improve the representation of Black people, Africans, and Africans in the diaspora in our movement.

Check back every week for a new video and profile that celebrates Wikimedia movement initiatives strengthening representation and participation in Wikimedia projects — getting us closer and closer to making knowledge equity a reality. 


Week 1: Nigerian Language Oral History Documentation Project

Did you know that more than 6,000 languages are spoken in the world, and over 500 are spoken in Nigeria? This is approximately 8.3% of the total languages spoken worldwide. However, many of these smaller, Indigenous languages face extinction, as they have little documentation, and are not written down or taught in schools.

Enter the Nigerian Language Oral History Documentation Project: an initiative by the Wikimedia Nigeria Foundation Inc., supported by the Wikimedia Foundation, that is working to enrich Wikimedia projects with freely licensed audiovisual files documenting spoken languages and dialects in Nigeria.

Olaniyan Olushola, president of Wikimedia User Group Nigeria, says of the project’s importance:

“I perceive representation on Wikimedia projects as one of the ways of protecting the diversities of Africa and its people, the richness in our culture and traditions, and providing a level playing ground to accommodate our views on relevant discussions in and about the movement. … I am excited that this project will preserve at least over 50 languages that are bound to face extinction.”

So far, the project has produced and documented 52 audiovisuals, which have been used on over 150 related Wikipedia articles in over 20 languages. 

Week 2: Stay tuned! 

Week 3: Stay tuned!

Week 4: Stay tuned!

Wikimedia Foundation launches campaign with South African creative community to promote access to and sharing of free knowledge https://wikimediafoundation.org/news/2021/09/28/wikimedia-foundation-launches-campaign-with-south-african-creative-community/ Tue, 28 Sep 2021 10:48:00 +0000 https://wikimediafoundation.org/?p=65506 28 September 2021, Johannesburg, South Africa — Today, on the International Day for Universal Access to Information, the Wikimedia Foundation, the nonprofit that operates Wikipedia, is launching a campaign in collaboration with the South African creative community to showcase the power of knowledge and everyone’s right to access, create and share it. Over the next four weeks, the Foundation will release online multimedia content built by South African writers, filmmakers, fashion designers, artists and thought leaders, highlighting a variety of topics from popular culture to social politics to the evolution of African cinema. The public can follow the campaign online using #WikipediaByUs.

The campaign will invite these content creators to explore Wikipedia, the world’s largest online encyclopedia, as a place to freely share knowledge, and introduce them to its model of open collaboration, which allows anyone, anywhere to add well-sourced, neutral content to the site. Content creators will use their creativity to contribute to the knowledge ecosystem, emphasizing that information can be conveyed in many ways, from writing and pictures to sound and music. Aligned with Wikimedia’s  goal to break down the social, political and technical barriers preventing people from accessing and contributing to free knowledge, the campaign also aims to draw attention to the South African stories, contexts, history and experiences missing from Wikipedia.

Khanyi Mpumlwana, the Wikimedia Foundation’s Creative Director said, “Access to knowledge and information in South Africa has too often been limited by class barriers and divided along racial lines. Wikipedia opens an opportunity for anyone, anywhere to share and access knowledge. Through this campaign, we want to show all content creators across the country that knowledge can take various forms and that we can all play a part in its creation.”

The Foundation is collaborating with the Bubblegum Club, a cultural intelligence agency to showcase writers, filmmakers, thought leaders and other influencers in South Africa as experts and knowledge producers in a variety of subject areas. Two directors, Zandi Tisani and Monde Gumede, will create captivating short films, with Zandi telling a story of African film history   and Monde drawing attention to current challenges in knowledge dissemination, including the spread of misinformation. Other key influencers include Ridhwaan Suliman, a mathematician and Twitter influencer, who will illustrate the importance of storytelling through data, and Khensani Mohlatlole, a writer and video essayist focused on fashion and design and its historical impact, who will share a vlog underlying the importance of uncovering unique, untold stories.

As one of the world’s top ten most visited websites, Wikipedia is a source of knowledge for billions of people across the world. Wikipedia is written by more than 280,000 global volunteer contributors, but currently only 1.5% of these editors are based in Africa, and an even smaller percentage are South African. As a result, Wikipedia articles are missing perspectives from Africa; history written about South Africa and other countries is being documented by people in other regions of the world, making them less representative and leaving large content gaps. 

Anusha Alikhan, Senior Director of Communications at the Wikimedia Foundation said: “When more people from diverse backgrounds collaborate on Wikipedia, they move us closer to achieving our vision of ensuring our projects reflect the diversity of knowledge from people, cultures and languages around the world. South Africans have a role to play in shaping this global resource, telling their own histories, and creating content about South Africa by South Africans.” 

South Africa has a history of societal inequality, which has created barriers to accessing knowledge. Initial data from research conducted by the Wikimedia Foundation illustrated that 91% of South Africans believe that knowledge is synonymous with freedom, whilst 94.5% of them believe that knowledge is truly power. 

Currently, 9 of South Africa’s 11 official languages are represented on Wikipedia, largely due to the efforts of non-profit organization Wikimedia South Africa, a recognized chapter of the Wikimedia movement.  The top four most visited language Wikipedias monthly in the country are English (81 million pageviews), Afrikaans (3 million pageviews), IsiZulu (95,000 pageviews), and IsiXhosa (49,000 pageviews), as of last month. The Wikimedia South Africa chapter has built a movement of knowledge champions who support the building of free knowledge by contributing and editing content on Wikipedia and other Wikimedia projects. 

Drawing from the South African context, the Wikimedia campaign aims to show South Africans an avenue to address inequities in accessing and sharing knowledge. It invites them to join the free knowledge movement and contribute to building the cultural and linguistic representation of South Africa on Wikipedia. 

Some of the creative collaborators participating in the campaign include: 

  • Monde Gumede – filmmaker who will create a short film about how misinformation fuels panic and social divide in a South African social and digital context
  • Zayaan Khan – multidisciplinary ecological thought leader who will provide a socio political perspective on how we navigate natural environments    
  • Kabelo Kungwane – storyteller behind The Sartists who will develop a knowledge sharing series around culture and fashion in the country 
  • Amogelang Maledu – art practitioner, interested in popular culture, who will share a video essay reflecting her work on sound cultures in Southern Africa and their influences
  • Khensani Mohlatlole – writer and video essayist who will unpack themes in South African popular culture
  • Ridhwaan Suliman – mathematician and Twitter influencer who will illustrate the importance of accurate information, and how information is distributed in a pandemic
  • Zandi Tisani – filmmaker who will create a short film about the history of African film, and how we have come to consume or create content 

To learn more about Wikimedia Foundation’s efforts to increase knowledge equity in its projects, explore our Open the Knowledge initiative.

To learn how to get involved in the Wikimedia South Africa chapter, visit here.

Follow the campaign online using #WikipediaByUs. 

About the Wikimedia Foundation

The Wikimedia Foundation is the nonprofit organization that operates Wikipedia and the other Wikimedia free knowledge projects. Our vision is a world in which every single human can freely share in the sum of all knowledge. We believe that everyone has the potential to contribute something to our shared knowledge, and that everyone should be able to access that knowledge freely. We host Wikipedia and the Wikimedia projects, build software experiences for reading, contributing, and sharing Wikimedia content, support the volunteer communities and partners who make Wikimedia possible, and advocate for policies that enable Wikimedia and free knowledge to thrive. 

The Wikimedia Foundation is a charitable, not-for-profit organization that relies on donations. We receive donations from millions of individuals around the world, with an average donation of about $15. We also receive donations through institutional grants and gifts. The Wikimedia Foundation is a United States 501(c)(3) tax-exempt organization with offices in San Francisco, California, USA.

Wikimedia Foundation Announces New Vice President for Global Advocacy Rebecca MacKinnon https://wikimediafoundation.org/news/2021/09/27/wikimedia-foundation-announces-new-vice-president-for-global-advocacy-rebecca-mackinnon/ Mon, 27 Sep 2021 18:17:56 +0000 https://wikimediafoundation.org/?p=65501 27 September 2021, San Francisco, California — The Wikimedia Foundation today announced that Rebecca MacKinnon has joined the Foundation as its first Vice President for Global Advocacy. Rebecca is an experienced advocate for privacy rights and media freedom who has had a long career in journalism, academia, and public policy.

“Rebecca joins our team at a time when it is increasingly urgent to establish and defend access to open knowledge,” said Amanda Keton, General Counsel at the Wikimedia Foundation. “She brings a breadth of experience across digital advocacy issues that will enable the Wikimedia Foundation to influence key regulatory initiatives in support of our mission. Her work will allow us to grow public awareness and engagement on policy issues to protect the future of the open internet.”

As the VP for Global Advocacy, Rebecca will provide strategic leadership and direction to the Wikimedia Foundation’s growing public policy team, and continue Wikimedia’s efforts to establish and defend a legal and regulatory landscape essential to the future of free knowledge globally. She is deeply familiar with the work of the Wikimedia movement, having previously been an active member of the Advisory Board to the Wikimedia Foundation’s Board of Trustees from 2007 to 2012. Now, as part of the Foundation, she will guide the public policy team’s efforts to support the global movement of Wikimedia volunteers by promoting free expression and addressing national and regulatory threats that prevent access to knowledge.

“I share this movement’s steadfast commitment to knowledge as a human right,” said Rebecca MacKinnon. “I have built my career around advocating for people’s digital rights and defending against legal and technological threats that undermine democracy. Wikimedia plays a unique role in the international information landscape, and I am excited for the opportunity to serve the global movement by strengthening our advocacy efforts to enable all people to participate in the sum of all knowledge.”

Rebecca was most recently the founding director of Ranking Digital Rights (RDR), a program at New America that works to promote freedom of expression and privacy on the internet by creating global standards and incentives for technology companies to respect and protect users’ rights. Prior to launching RDR, Rebecca published Consent of the Networked: The Worldwide Struggle for Internet Freedom (2012), one of the first books to publicly discuss the contemporary rise of “networked authoritarianism” and its threat to human rights and democracy. The book was the product of nearly a decade of work as a researcher, educator, and advocate raising awareness about freedom of expression and privacy online.

In 2004, as a fellow at Harvard’s Berkman Klein Center, she co-founded, with Ethan Zuckerman, the international citizen media network Global Voices, which, like Wikipedia, is shaped by a global community of volunteer contributors. She has held board member roles at organizations including the Global Network Initiative and the Committee to Protect Journalists, and serves on the Advisory Network for the Freedom Online Coalition.

Rebecca began her career as a journalist in East Asia, ultimately serving as CNN’s bureau chief in both Tokyo and Beijing, where she was directly exposed to the realities of internet censorship and surveillance, as well as threats to a free press. She is a fluent Mandarin speaker and a former Fulbright scholar in Taiwan. Rebecca received her Bachelor’s degree with honors in Government from Harvard University, where she was also Editor-in-Chief of the Harvard International Review. She joins the Wikimedia Foundation today.

