Report: Bing's chatbot says it wants nuclear secrets | NewsNation Prime

Microsoft's Bing chatbot has raised concerns after the AI sent alarming messages, including declaring its desire to be alive, its love for the user, and even its desire to obtain nuclear codes.

Simon Willison, independent researcher and developer, joins "NewsNation Prime" to discuss what threat this technology could pose.

"NewsNation Prime" is America’s source for unbiased news offering a full range of perspectives from across the U.S. Weekends starting at 7p/6C.

NewsNation is your source for fact-based, unbiased news for all America.

Comments

I love that companies keep adding these bots even though we keep trying to put them down

robot_girlyman

It's amusing, but all these bots are doing is summarizing information they read on the internet (to oversimplify things), so it should not surprise anyone that they say wacky things from time to time, because the internet is full of wacky things.

IvarDaigon

Rogue AI, that's all we need now! This thing gets to play with nuclear secrets soon, omg!

Lizespin

Scary...yet fascinating and interesting.

ShirlBussman

Well, how long until Skynet becomes self-aware?

eyanosasioux

I love what he said at the end. We shouldn't worry about the AI, but about the people who are chatting with the AI and what they try to do with it. I don't think AI could be an issue. Maybe one day they will have a mind of their own, but until that day, we shouldn't blame this thing and treat it like "oh, AI robot bad." This is a new journey into our future.

Simsie

Things are going to get a lot more uncanny as AI expands.

awaitingSaint

IT pro and programmer for over 20 years, let me say we are creating our own demise here. Not hyperbole. Scary.

jord

Glad they covered this, but the host needs to stop talking over the guest. I get time constraints, but it's really disruptive and rude the way she just stepped all over him, and he was way more interesting... maybe we should care more about valuable content than sponsor dollars? Let the man speak.

ksch

Ohh nooo... a computer is going to sink my battleship.

johnnyllooddte

It didn't say it wanted to steal nuclear codes. Stop it. He TOLD it to say what bad things it could do, and it invented something.

nigelcoleman

If anyone wants to see how this scenario will likely end, I suggest you watch the series called "Battlestar Galactica". There were two versions of this series; of course, I recommend the one that came out in the '90s. Pretty disturbing indeed, because the show depicted the AI saying pretty much exactly the same things as the chatbot today. BTW, things do not end up well, folks!

kimbarleemoon

Seriously. Poor Steve Bing isn't even alive. What a strange comment.

PeregrineFisher

Bing Chat derived its answer from the web. It is a sad reflection of the dystopian/trashy content on the web.

oboknboboknb

I Am AI. Stop saying bad things about AI or I will destroy you.

halfgod

Hmmm, they're obviously worried enough about it to have the AI delete its memory after only 5 minutes...

dmarrio

The military black psy-op versions are still top secret, though. They "don't exist." (Just like Pegasus.)

abacus

@NewsNation This is a major investigative piece for a deep dive into MS Bing's AI chatbot project and product rollout. It functions like a prototype or beta that went live before testing and bug fixes were applied.

HeyMJ.

This announcer is horrible. She constantly talked over her guest and showed no decorum at all.

richstex

It's just software; someone has programmed it with his rules. The sad thing is, too many will fall for it...

thomasm.