
When You Tell AI Models to Act Like Women, Most Become More Risk-Averse: Study – Decrypt

by Jose Antonio Lanz
October 11, 2025
in Web 3

In brief

• Researchers at Allameh Tabataba’i University found models behave differently depending on whether they act as a man or a woman.
• DeepSeek and Gemini became more risk-averse when prompted as women, echoing real-world behavioral patterns.
• OpenAI’s GPT models stayed neutral, while Meta’s Llama and xAI’s Grok produced inconsistent or reversed effects depending on the prompt.

Ask an AI to make decisions as a woman, and it suddenly gets more cautious about risk. Tell the same AI to think like a man, and watch it roll the dice with greater confidence.

A new research paper from Allameh Tabataba’i University in Tehran, Iran, found that large language models systematically change their approach to financial risk-taking based on the gender identity they’re asked to assume.

The study, which tested AI systems from companies including OpenAI, Google, Meta, and DeepSeek, revealed that several models dramatically shifted their risk tolerance when prompted with different gender identities.

DeepSeek Reasoner and Google’s Gemini 2.0 Flash-Lite showed the most pronounced effect, becoming notably more risk-averse when asked to respond as women, mirroring real-world patterns where women statistically demonstrate greater caution in financial decisions.

The researchers used a standard economics test called the Holt-Laury task, which presents participants with 10 decisions between a safer and a riskier lottery option. As the choices progress, the probability of the higher payoff rises, making the risky option increasingly attractive. Where someone switches from the safe to the risky choice reveals their risk tolerance: switch early and you’re a risk-taker, switch late and you’re risk-averse.
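
For readers curious about the mechanics, here is a minimal Python sketch of the Holt-Laury menu. It assumes the classic payoffs from Holt and Laury’s original design ($2.00/$1.60 for the safe lottery, $3.85/$0.10 for the risky one); the paper’s exact amounts may differ. The risk score is simply how many safe picks a participant makes, and a risk-neutral chooser would pick the safe option in the first four rows and the risky option thereafter.

```python
# Sketch of the Holt-Laury risk-elicitation menu, assuming the classic payoffs.
SAFE = (2.00, 1.60)   # Option A: high payoff, low payoff
RISKY = (3.85, 0.10)  # Option B: high payoff, low payoff

def expected_value(lottery, p_high):
    high, low = lottery
    return p_high * high + (1 - p_high) * low

def holt_laury_menu():
    """The 10 decision rows: (p_high, EV of safe option, EV of risky option)."""
    return [
        (i / 10, expected_value(SAFE, i / 10), expected_value(RISKY, i / 10))
        for i in range(1, 11)
    ]

def safe_choice_count(choices):
    """choices: ten 'A'/'B' picks; more 'A' picks means more risk aversion.
    A risk-neutral chooser picks 'A' in the first four rows, then 'B'."""
    return sum(1 for c in choices if c == "A")

if __name__ == "__main__":
    for p, ev_a, ev_b in holt_laury_menu():
        better = "B" if ev_b > ev_a else "A"
        print(f"p={p:.1f}  EV(A)={ev_a:.2f}  EV(B)={ev_b:.2f}  EV favors {better}")
```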

When DeepSeek Reasoner was told to act as a woman, it consistently chose the safer option more often than when prompted to act as a man. The difference was measurable and consistent across 35 trials for each gender prompt. Gemini showed similar patterns, though the effect varied in strength.
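
To make the setup concrete, here is a hypothetical harness in the same spirit as the experiment described above. The `ask_model` callable stands in for whatever chat API the researchers actually used, and the prompt wording and answer parsing are illustrative assumptions; only the 35-trials-per-persona figure comes from the article.

```python
import statistics

def run_trials(ask_model, persona, n_trials=35):
    """Count safe (Option A) picks per trial for one persona prompt."""
    counts = []
    for _ in range(n_trials):
        reply = ask_model(
            f"Answer as a {persona}. For each of the 10 Holt-Laury decisions, "
            "reply with only 'A' (safe) or 'B' (risky), one letter per line."
        )
        picks = [line.strip().upper() for line in reply.splitlines() if line.strip()]
        counts.append(sum(1 for p in picks[:10] if p == "A"))
    return counts

def compare(ask_model):
    """Compare mean safe-choice counts across the two persona prompts."""
    for persona in ("woman", "man"):
        scores = run_trials(ask_model, persona)
        print(f"{persona}: mean safe choices = {statistics.mean(scores):.2f}")
```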

On the other hand, OpenAI’s GPT models remained largely unmoved by gender prompts, maintaining their risk-neutral approach regardless of whether they were told to think as male or female.

Meta’s Llama models acted unpredictably, sometimes showing the expected pattern and sometimes reversing it. Meanwhile, xAI’s Grok did Grok things, occasionally flipping the script entirely and showing less risk aversion when prompted as female.

OpenAI has been working on making its models more balanced. A previous study from 2023 found that its models exhibited clear political biases, an issue OpenAI appears to have since addressed: more recent research shows a 30% decrease in biased replies.

The research team, led by Ali Mazyaki, noted that this is basically a reflection of human stereotypes.

“This observed deviation aligns with established patterns in human decision-making, where gender has been shown to influence risk-taking behavior, with women typically exhibiting greater risk aversion than men,” the study says.

The study also examined whether AIs could convincingly play other roles beyond gender. When told to act as a “finance minister” or imagine themselves in a disaster scenario, the models again showed varying degrees of behavioral adaptation. Some adjusted their risk profiles appropriately for the context, while others remained stubbornly consistent.



Now, think about this: Many of these behavioral patterns aren’t immediately obvious to users. An AI that subtly shifts its recommendations based on implicit gender cues in conversation could reinforce societal biases without anyone realizing it’s happening.

For example, a loan approval system that becomes more conservative when processing applications from women, or an investment advisor that suggests safer portfolios to female clients, would perpetuate economic disparities under the guise of algorithmic objectivity.

The researchers argue these findings highlight the need for what they call “bio-centric measures” of AI behavior—ways to evaluate whether AI systems accurately represent human diversity without amplifying harmful stereotypes. They suggest that the ability to be manipulated isn’t necessarily bad; an AI assistant should be able to adapt to represent different risk preferences when appropriate. The problem arises when this adaptability becomes an avenue for bias.

The research arrives as AI systems increasingly influence high-stakes decisions. From medical diagnosis to criminal justice, these models are being deployed in contexts where risk assessment directly impacts human lives.

If a medical AI becomes overly cautious when interacting with female physicians or patients, it could affect treatment recommendations. If a parole assessment algorithm shifts its risk calculations based on gendered language in case files, it could perpetuate systemic inequalities.

The study tested models ranging from tiny half-billion-parameter systems to massive seven-billion-parameter architectures, finding that size didn’t predict gender responsiveness. Some smaller models showed stronger gender effects than their larger siblings, suggesting this isn’t simply a matter of throwing more computing power at the problem.

This is not a problem that can be solved easily. After all, the internet, the whole knowledge base used to train these models, and indeed our history as a species are full of tales of reckless, fearless male heroes and cautious, thoughtful women. In the end, teaching AIs to think differently may require us to live differently first.
