DeFi Daily News

‘Catastrophic overtraining’ could harm large language AI models that are trained on more data for the sake of training

by Wayne Williams
April 13, 2025
in Tech

  • Researchers from top US universities warn that extending pre-training can be detrimental to performance
  • Too much pre-training can deliver worse performance due to something akin to the butterfly effect
  • The more models are pre-trained, the more sensitive they become to small changes that can disrupt the end result

Researchers from Carnegie Mellon, Stanford, Harvard, and Princeton are challenging one of AI development’s accepted core beliefs: that more pre-training data always means better performance.

As reported by HPCwire, a new paper discusses the concept of “catastrophic overtraining,” whereby extended pre-training can harm a model’s performance after fine-tuning.

The researchers compared two versions of the OLMo-1B model, one trained on 2.3 trillion tokens and another on 3 trillion. Despite the larger training set, the more extensively trained model reportedly performed up to 3% worse on benchmarks like AlpacaEval and ARC.


Reaching the inflection point

This performance drop, the study claims, is linked to a phenomenon called “progressive sensitivity.”

As the token count increases, the model becomes more fragile. Even small tweaks, such as adjustments during fine-tuning or the introduction of noise, can reverse earlier gains.

The authors demonstrated this by injecting Gaussian noise into pre-trained models, noting that performance degraded more sharply the longer the model was trained.
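
As a rough illustration of that perturbation test, here is a minimal PyTorch sketch. It assumes a Hugging Face causal language model checkpoint; the checkpoint name and probe text below are placeholders, and the paper’s actual protocol is more involved:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder checkpoint; in practice you would compare checkpoints of the
# same architecture pre-trained on different token budgets.
MODEL_NAME = "allenai/OLMo-1B-hf"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

# A tiny probe text stands in for a real evaluation set.
inputs = tokenizer("Decentralized finance runs on smart contracts.",
                   return_tensors="pt")

def loss_under_noise(sigma: float) -> float:
    """Add i.i.d. Gaussian noise (std sigma) to every weight, measure the
    language-modeling loss, then restore the original weights."""
    backup = {name: p.detach().clone() for name, p in model.named_parameters()}
    with torch.no_grad():
        for p in model.parameters():
            p.add_(torch.randn_like(p) * sigma)
        loss = model(**inputs, labels=inputs["input_ids"]).loss.item()
        for name, p in model.named_parameters():
            p.copy_(backup[name])  # undo the perturbation
    return loss

# If "progressive sensitivity" holds, a longer-trained checkpoint's loss
# should climb faster with sigma than a shorter-trained one's.
for sigma in (0.0, 1e-4, 1e-3, 1e-2):
    print(f"sigma={sigma:g}  loss={loss_under_noise(sigma):.3f}")
```

The signal is the shape of the degradation curve rather than any single loss value: the paper’s claim is that the curve steepens the longer a model has been pre-trained.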

The point where this additional training starts to degrade performance is called the “inflection point.”

Once that point is reached, the benefits of further training are outweighed by the risk of internal instability. The study found that this tipping point often occurs beyond 2.5 trillion tokens in smaller models such as OLMo-1B.
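
To make the tipping point concrete, here is a toy sketch with invented numbers; only the shape of the curve, rising and then falling past roughly 2.5 trillion tokens, mirrors the paper’s claim:

```python
# Invented (budget in trillions of tokens, post-fine-tuning score) pairs;
# only the rise-then-fall shape reflects the paper's finding.
budgets = [1.0, 1.5, 2.0, 2.5, 3.0]
scores = [61.0, 63.5, 65.0, 65.4, 63.4]

# The inflection point is the last budget before marginal gains turn negative.
gains = [later - earlier for earlier, later in zip(scores, scores[1:])]
inflection = next((budgets[i] for i, gain in enumerate(gains) if gain < 0),
                  budgets[-1])  # fall back if no decline appears in the sweep

print(f"Gains flatten and reverse around {inflection:g}T tokens.")
```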

“Catastrophic overtraining may be inevitable… especially when the pre-training and fine-tuning tasks are misaligned,” the authors warn in their paper, which you can access through the arXiv pre-print server.

While the researchers are not suggesting an end to pre-training, they do feel that developers should consider just how much pre-training is enough. As the paper concludes, “Our findings call for a renewed focus on model scaling that considers the entire training pipeline.”

For AI developers chasing scale, the message seems clear: sometimes, less really is more.

Tags: Catastrophic, Data, Harm, Language, Large, Models, Overtraining, Trained, Training

Copyright © 2024 Defi Daily.
Defi Daily is not responsible for the content of external sites.
