We risk a deluge of AI-written ‘science’ pushing corporate interests—here’s what to do about it

September 8, 2025
Lisa Lock, scientific editor; Andrew Zinin, lead editor

Editors' notes: This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, peer-reviewed publication, trusted source, written by researcher(s), proofread.

Image: robot hands typing. Credit: Pixabay/CC0 Public Domain

Back in the 2000s, the American pharmaceutical firm Wyeth was sued by thousands of women who had developed breast cancer after taking its hormone replacement drugs. Court filings revealed the role of "dozens of ghostwritten reviews and commentaries published in medical journals and supplements being used to promote unproven benefits and downplay harms" related to the drugs.

Wyeth, which was taken over by Pfizer in 2009, had paid a medical communications firm to produce these articles, which were published under the bylines of leading doctors in the field (with their consent). Any medical professionals reading these articles and relying on them for prescription advice would have had no idea that Wyeth was behind them.

The pharmaceutical company insisted that everything written was scientifically accurate and—shockingly—that paying ghostwriters for such services was common in the industry. Pfizer ended up paying out more than US$1 billion (£744 million) in damages over the harms from the drugs.

The articles in question are an excellent example of "resmearch"—bullshit science in the service of corporate interests. While the overwhelming majority of researchers are motivated to uncover the truth and check their findings robustly, resmearch is unconcerned with truth—it seeks only to persuade.

We've seen numerous other examples in recent years, such as soft drinks companies and meat producers funding studies that are less likely than independent research to show links between their products and health risks.

A major current worry is that AI tools reduce the costs of producing such evidence to virtually zero. Just a few years ago, it took months to produce a single paper. Now a single individual using AI can produce multiple papers that appear valid in a matter of hours.

The public health literature is already seeing a slew of papers that draw on data optimized for use with AI to report single-factor results. A single-factor result links one factor to some health outcome, such as a link between eating eggs and developing dementia.

These studies lend themselves to specious results. When datasets span thousands of people and hundreds of pieces of information about them, researchers will inevitably find misleading correlations that occur by chance.
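The scale of that multiple-comparisons problem is easy to demonstrate. The short Python simulation below (an illustrative sketch with made-up numbers, not code from any study discussed here) correlates hundreds of random, unrelated "factors" with a random health outcome and counts how many clear the conventional p < 0.05 bar purely by chance.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_people, n_factors = 5000, 300  # hypothetical cohort size and number of recorded factors

factors = rng.normal(size=(n_people, n_factors))  # diets, habits, exposures... all pure noise
outcome = rng.normal(size=n_people)               # health outcome, unrelated to every factor by construction

significant = 0
for j in range(n_factors):
    r, p = stats.pearsonr(factors[:, j], outcome)  # test each factor against the outcome
    if p < 0.05:
        significant += 1

print(f"'Significant' factors at p < 0.05: {significant} of {n_factors}")
# Roughly 5% (about 15 factors) will look "linked" to the outcome despite zero real effect.

Each such spurious hit is a ready-made single-factor paper unless reviewers demand corrections for multiple testing or the robustness checks described below.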

A search of the leading academic databases Scopus and PubMed showed that an average of four single-factor studies were published per year between 2014 and 2021. In the first ten months of 2024 alone, a whopping 190 were published.

These weren't necessarily motivated by corporate interests—some could, for example, be the result of academics looking to publish more material to boost their career prospects. The point is more that with AI facilitating these kinds of studies, they become an added temptation for businesses looking to promote products.

Incidentally, the UK has just given some businesses an additional motivation for producing this material. New government guidance asks baby-food producers to make marketing claims that suggest health benefits only if supported by scientific evidence.

While well-intentioned, the guidance will incentivize firms to find results showing that their products are healthy. This could increase their demand for the sort of AI-assisted "scientific evidence" that is ever more readily available.

Fixing the problem

One issue is that research does not always go through peer review prior to informing policy. In 2021, for example, US Supreme Court justice Samuel Alito, in an opinion on the right to carry a gun, cited a briefing paper by a Georgetown academic that presented survey data on gun use.

The academic and gun survey were funded by the Constitutional Defense Fund, which the New York Times describes as a "pro-gun nonprofit."

Since the survey data are not publicly available and the academic has refused to answer questions about this, it is impossible to know whether his results are resmearch. Still, lawyers have referenced his paper in cases across the US to defend gun interests.

One obvious lesson is that anyone relying on research should be wary of any that has not passed peer review. A less obvious lesson is that we will need to reform peer review as well. There has been much discussion in recent years about the explosion in published research and the extent to which reviewers do their jobs properly.

Over the past decade or so, several groups of researchers have made meaningful progress in identifying procedures that reduce the risk of specious findings in published papers. Advances include getting authors to publish a research plan before doing any work (known as preregistration), then transparently reporting all the research steps taken in a study, and making sure reviewers check this is in order.

Also, for single-factor papers, there's a recent method called a specification curve analysis that comprehensively tests the robustness of the claimed relationship against alternative ways of slicing the data.
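To make the method concrete, here is a minimal sketch of a specification curve analysis in Python, using synthetic data and the statsmodels library; the variable names, control sets and the "eggs" exposure are illustrative assumptions, not the procedure from any particular paper. The focal effect is re-estimated under every reasonable combination of analysis choices, and the whole distribution of estimates is inspected rather than a single, possibly hand-picked, specification.

from itertools import combinations
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "eggs":     rng.normal(size=n),   # hypothetical exposure of interest
    "age":      rng.normal(size=n),
    "income":   rng.normal(size=n),
    "exercise": rng.normal(size=n),
    "smoking":  rng.normal(size=n),
})
# The outcome depends on some controls but, by construction, not on eggs.
df["outcome"] = 0.5 * df["age"] - 0.3 * df["exercise"] + rng.normal(size=n)

controls = ["age", "income", "exercise", "smoking"]
results = []
for k in range(len(controls) + 1):
    for subset in combinations(controls, k):  # every plausible control set
        formula = "outcome ~ eggs" + "".join(f" + {c}" for c in subset)
        fit = smf.ols(formula, data=df).fit()
        results.append({"spec": formula,
                        "coef": fit.params["eggs"],
                        "pvalue": fit.pvalues["eggs"]})

curve = pd.DataFrame(results).sort_values("coef")
print(curve[["coef", "pvalue"]].describe())
print(f"Specifications with p < 0.05: {(curve['pvalue'] < 0.05).sum()} of {len(curve)}")

A robust relationship keeps roughly the same sign and size across the whole curve; an effect that only appears under a handful of favorable specifications is exactly the kind of specious single-factor result reviewers should flag.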

Journal editors in many fields have adopted these proposals, and updated their rules in other ways too. They often now require authors to publish their data, their code and the survey or materials used in experiments (such as questionnaires, stimuli and so on). Authors also have to disclose conflicts of interest and funding sources.

Some journals have gone further. In response to the finding about the use of AI-optimized datasets, for instance, they now require authors to cite all other published secondary analyses similar to theirs and to disclose how AI was used in their work.

Some fields have definitely been more reformist than others. Psychology journals have, in my experience, gone further to adopt these processes than have economics journals.

For instance, a recent study applied additional robustness checks to analyses published in the top-tier American Economic Review. This suggested that studies published in the journal systematically overstated the strength of evidence contained within the data.

In general, the current system seems ill-equipped to cope with the deluge of papers that AI will precipitate. Reviewers need to invest time, effort and scrupulous attention checking preregistrations, specification curve analyses, data, code and so on.

This requires a peer-review mechanism that rewards reviewers for the quality of their reviews.

Public trust in science remains high worldwide. That is good for society because the scientific method is an impartial judge that promotes what is true and meaningful over what is popular or profitable.

Yet AI threatens to take us further from that ideal than ever. If science is to maintain its credibility, we urgently need to incentivize meaningful peer review.

Journal information: American Economic Review

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: We risk a deluge of AI-written 'science' pushing corporate interests—here's what to do about it (2025, September 8), retrieved 8 September 2025 from https://techxplore.com/news/2025-09-deluge-ai-written-science-corporate.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
