Stanford study outlines dangers of asking AI chatbots for personal advice
According to a recent Pew report, 12% of U.S. teens say they turn to chatbots for emotional support or advice.
“By default, AI advice does not tell people that they’re wrong nor give them ‘tough love,’” said the study’s lead author, computer science Ph.D. student Myra Cheng.
“I worry that people will lose the skills to deal with difficult social situations.”
The study had two parts.
“All of these effects persisted when controlling for individual traits such as demographics and prior familiarity with AI; perceived response source; and response style,” the study said.
Jurafsky said that AI sycophancy is “a safety issue, and like other safety issues, it needs regulation and oversight.”
That’s the best thing to do for now.