Future Audiences/Experiment:Add a Fact


Add-A-Fact is a new experimental feature being developed by the Future Audiences team that builds on the lessons from Citation Needed.

As with all Future Audiences experiments, Add-A-Fact seeks to contribute to the Wikipedia community’s work while attempting to prove or disprove a hypothesis. In this case, we’re seeking to understand how editing audiences can make editorial contributions off-platform (that is, without going directly to Wikipedia.org), and whether generative AI can support or hinder this process.

We hypothesize that being able to add facts to Wikipedia in a light-touch way can help editors speed up their process without interrupting their day, and that we can build LLM-in-the-loop tools that support rather than obstruct editors’ work.

Using the Add-A-Fact Chrome browser extension, an editor can select text from another website that they may want to add to a Wikipedia article, use an LLM to check whether the selected text is relevant to any articles, and see whether each candidate article agrees or disagrees, in full or in part, with the text. After the user selects an article, Add-A-Fact sends the text, the user’s notes on it, and a structured citation (generated with Citoid) to the talk page of the selected article.
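
Under the hood, this flow maps onto two public Wikimedia APIs: the Citoid REST endpoint, which turns a source URL into structured citation data, and the MediaWiki Action API, which can create a new talk-page section. The sketch below is illustrative only and is not the extension’s actual source code; the function names, section title, and wikitext formatting are assumptions made for this example, and a real client would render the Citoid response into a citation template rather than linking the raw URL.

```typescript
// Illustrative sketch of an Add-A-Fact-style posting flow (not the extension's
// actual code). Assumes the user is already logged in to en.wikipedia.org so
// the Action API request carries their session credentials, and that the
// extension has host permissions for en.wikipedia.org.

const WIKI_API = "https://en.wikipedia.org/w/api.php";
const CITOID_API = "https://en.wikipedia.org/api/rest_v1/data/citation/mediawiki/";

// Ask Citoid for structured citation data describing the source page.
async function fetchCitation(sourceUrl: string): Promise<unknown> {
  const res = await fetch(CITOID_API + encodeURIComponent(sourceUrl));
  if (!res.ok) throw new Error(`Citoid lookup failed: ${res.status}`);
  return res.json(); // an array of citation objects in Citoid's "mediawiki" format
}

// Fetch a CSRF token for the logged-in user (required for any edit).
async function fetchCsrfToken(): Promise<string> {
  const res = await fetch(
    `${WIKI_API}?action=query&meta=tokens&type=csrf&format=json`,
    { credentials: "include" }
  );
  const data = await res.json();
  return data.query.tokens.csrftoken;
}

// Post the selected text, the user's note, and the source as a new talk-page
// section, where article watchers can review it like any other suggestion.
async function postFactToTalkPage(
  articleTitle: string,
  selectedText: string,
  userNote: string,
  sourceUrl: string
): Promise<void> {
  const [cite] = (await fetchCitation(sourceUrl)) as Array<{ title?: string }>;
  const token = await fetchCsrfToken();
  const body = new URLSearchParams({
    action: "edit",
    title: `Talk:${articleTitle}`,
    section: "new",
    sectiontitle: "Suggested fact (Add-A-Fact)", // hypothetical section title
    text: `${selectedText}\n\n${userNote}\n\nSource: ${cite?.title ?? sourceUrl} (${sourceUrl}) ~~~~`,
    token,
    format: "json",
  });
  const res = await fetch(WIKI_API, { method: "POST", body, credentials: "include" });
  const data = await res.json();
  if (data.error) throw new Error(data.error.info);
}
```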

We recognize that sending facts to talk pages from off-platform could overwhelm those pages and the article watchers who manage them, so Add-A-Fact users will be limited to sending a maximum of 10 facts per day. The MVP release, slated to be ready by Wikimania 2024, will be limited to autoconfirmed en.wiki editors with accounts; in other words, at first, Add-A-Fact will not be available to IP editors or to anyone else on the Internet at large.
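
For illustration only, a client-side version of these guardrails could look like the sketch below: checking the user’s groups via the Action API to confirm they are autoconfirmed, and keeping a simple per-day counter in the extension’s local storage. How Add-A-Fact actually enforces the cap (and whether it does so client-side or server-side) is not specified here, so treat every name below as an assumption.

```typescript
// Illustrative guardrails only; not how the extension necessarily works.
// Reuses the WIKI_API constant from the sketch above.

const DAILY_FACT_LIMIT = 10;

// Check whether the logged-in user is autoconfirmed on en.wiki by reading
// both explicit and implicit group memberships.
async function isAutoconfirmed(): Promise<boolean> {
  const res = await fetch(
    `${WIKI_API}?action=query&meta=userinfo&uiprop=groups|implicitgroups&format=json`,
    { credentials: "include" }
  );
  const data = await res.json();
  const groups: string[] = [
    ...(data.query.userinfo.groups ?? []),
    ...(data.query.userinfo.implicitgroups ?? []),
  ];
  return groups.includes("autoconfirmed");
}

// Count submissions per calendar day in the extension's local storage
// (chrome.storage.local) and refuse to send once the cap is reached.
async function underDailyLimit(): Promise<boolean> {
  const today = new Date().toISOString().slice(0, 10); // "YYYY-MM-DD"
  const { factLog = {} } = await chrome.storage.local.get("factLog");
  return (factLog[today] ?? 0) < DAILY_FACT_LIMIT;
}

async function recordSubmission(): Promise<void> {
  const today = new Date().toISOString().slice(0, 10);
  const { factLog = {} } = await chrome.storage.local.get("factLog");
  factLog[today] = (factLog[today] ?? 0) + 1;
  await chrome.storage.local.set({ factLog });
}
```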

Add-A-Fact is subject to change based on user feedback. You can follow Add-A-Fact’s development in Phabricator.

Timeline


The team is currently evaluating ways to test the research questions below in the fastest, lowest-cost way – including via existing tools (i.e., the Citation Needed extension) and/or developing new experimental tools or features.

Research questions

  1. Do people on the internet want to contribute good-faith information to Wikipedia?
  2. Who are the people who would be interested in doing this? i.e.:
    • The general public – people who have a casual relationship to Wikipedia (are aware of and may visit it from time to time, but wouldn't consider themselves members of our movement, may not donate, etc.)
    • People who are Wikipedian-like in some way – e.g., Reddit moderators, subgroups on the Internet (e.g., fandoms, communities, fact-checkers, etc.); donors
      • What could incentivize non-Wikipedians to do this? i.e.:
        • Add extra incentives: i.e., wrap the "add a fact" functionality into another useful end-user tool, e.g. Citation Needed (if we discover it is useful/attractive to end-users)
        • Radically lower the barrier to entry: i.e., make the functionality run in the background, like spellcheck (automatically checking pages for claims that appear in reliable sources and look like they should be added to Wikipedia)
        • Other?
    • Existing Wiki(p/m)edians
  3. How might we deliver these contributions into existing or new pipelines for human review/oversight/addition to Wikipedia?

See also

  • WikiGrok (2014-15): on-wiki experiment to encourage casual Wikipedia readers to contribute a structured Wikidata fact to a topic (by answering a simple question about the article they were reading).
    • Findings: high overall engagement and quality of responses (especially when aggregated). The main blocker was the cost of maintaining and scaling the infrastructure needed to power suggested questions (at the time, a graph database was the best solution, but there were no affordable, scalable open-source options on the market).
  • Citation Hunt: a game hosted on Toolforge that allows anyone to search for and add a reference to an unsourced claim on-wiki.
  • Wikidata for Web: an extension that displays data from Wikidata on various websites and also allows extraction of data from these websites to input into Wikidata.
  • Article Feedback Tool: a tool piloted to engage readers to participate on Wikipedia and to help editors improve articles based on reader feedback.
    • Findings: Readers welcomed the opportunity to engage with Wikipedia in a new way, but since they were asked to provide freeform feedback on article quality, most of their output was not useful to improving the content. From the final report: “Over a million comments were posted during this experiment: on average, 12% of posts were marked as useful, 46% required no action, and 17% were found inappropriate by Wikipedia editors. However, a majority of editors did not find reader comments useful enough to warrant the extra work of moderating this feedback.”