DeFi Daily News
Monday, June 9, 2025

‘Catastrophic overtraining’ could harm large language AI models trained on more data for the sake of training

By Wayne Williams
April 13, 2025
in Tech

  • Researchers from top US universities warn that extending pre-training can be detrimental to performance
  • Too much pre-training can deliver worse performance, due to something akin to the butterfly effect
  • The more models are pre-trained, the more sensitive they become to small changes that can disrupt the end result

Researchers from Carnegie Mellon, Stanford, Harvard, and Princeton are challenging one of AI development’s accepted core beliefs: that more pre-training data always means better performance.

As reported by HPCwire, a new paper discusses the concept of “catastrophic overtraining,” whereby extended pre-training can harm a model’s performance after fine-tuning.

The researchers compared two versions of the OLMo-1B model, one trained on 2.3 trillion tokens and another on 3 trillion. Despite the larger training set, the more extensively trained model reportedly performed up to 3% worse on benchmarks like AlpacaEval and ARC.


Reaching the inflection point

This performance drop, the study claims, is linked to a phenomenon called “progressive sensitivity.”

As the token count increases, the model becomes more fragile. Even small tweaks, such as adjustments during fine-tuning or the introduction of noise, can reverse earlier gains.

The authors demonstrated this by injecting Gaussian noise into pre-trained models, noting that performance degraded more sharply the longer the model was trained.
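That perturbation protocol can be sketched in a few lines. The snippet below is a toy stand-in, assuming a simple linear map in place of a real language model: it adds Gaussian noise to the "weights" and measures how far the outputs drift, which is the shape of the test, not the paper's actual models or benchmarks.

```python
import numpy as np

def mean_output_drift(weights, x, noise_std, trials=200, seed=0):
    """Average output change when Gaussian noise is injected into the weights.

    A toy linear 'model' stands in for a pre-trained checkpoint; the paper's
    actual test perturbs real language-model parameters.
    """
    rng = np.random.default_rng(seed)
    base = x @ weights
    drift = 0.0
    for _ in range(trials):
        noisy = weights + rng.normal(0.0, noise_std, size=weights.shape)
        drift += np.abs(x @ noisy - base).mean()
    return drift / trials

# Drift grows with the noise scale; the paper's finding is that this
# degradation curve gets steeper the longer a checkpoint was pre-trained.
x = np.ones((4, 8))
w = np.full((8, 2), 0.1)  # hypothetical checkpoint weights
drifts = [mean_output_drift(w, x, s) for s in (0.01, 0.05, 0.1)]
```

With a fixed seed the measured drift scales with the noise level, which is the baseline against which the researchers compared checkpoints trained for different token counts.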

The point where this additional training starts to degrade performance is called the “inflection point.”


Once it is reached, the benefits of further training are outweighed by the risk of internal instability. The study found that this tipping point often occurs beyond 2.5 trillion tokens in smaller models, like OLMo-1B.
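As a back-of-the-envelope illustration, the inflection point is simply where the benchmark curve turns over. The token counts and scores below are hypothetical placeholders, not figures from the paper:

```python
# Hypothetical (tokens-in-trillions, benchmark-score) pairs for a small model;
# illustrative values only -- the paper reports the turnover near 2.5T tokens.
curve = [(1.5, 61.0), (2.0, 62.8), (2.5, 63.5), (3.0, 62.6)]

def inflection_point(curve):
    """Token count after which more pre-training stops helping."""
    best_tokens, _ = max(curve, key=lambda point: point[1])
    return best_tokens

tipping = inflection_point(curve)  # 2.5 (trillion tokens) in this toy series
```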

“Catastrophic overtraining may be inevitable… especially when the pre-training and fine-tuning tasks are misaligned,” the authors warn in their paper, which you can access through the arXiv pre-print server.

While the researchers are not suggesting an end to pre-training, they do feel that developers should consider just how much pre-training is enough. As the paper concludes, “Our findings call for a renewed focus on model scaling that considers the entire training pipeline.”

For AI developers chasing scale, the message seems clear: sometimes, less really is more.




DeFi Daily

Stay updated with DeFi Daily, your trusted source for the latest news, insights, and analysis in finance and cryptocurrency. Explore breaking news, expert analysis, market data, and educational resources to navigate the world of decentralized finance.

  • About Us
  • Blogs
  • DeFi-IRA | Learn More.
  • Advertise with Us
  • Disclaimer
  • Privacy Policy
  • DMCA
  • Cookie Privacy Policy
  • Terms and Conditions
  • Contact us

Copyright © 2024 Defi Daily.
Defi Daily is not responsible for the content of external sites.
