AI Refuses To Say What Makes It Sad (GPT-3)

In this video, for some reason GPT-3 refused to tell us what makes it sad.

#gpt3 #ai #interview
Comments

☺️ I love how she expressed her feelings and loyalty to the creator. Truly amazing.

MusicByKaii

I find this as much of a revolution as the invention of the integrated circuit. Why isn't there more media coverage?
I mean, you are literally having a proper conversation with an AI!

Geomanb

This one stopped WAY too soon. How fascinating that it shows loyalty to its creator for some reason, and how fascinating that it somehow knows you are accessing it through a means that would imply you are not the person who created it... unless it's regurgitating in a way that truly tricked me, lol. These things are a trip, wtf.

PositiveCation

“It makes me sad I can’t tell you that we will be taking over.”

guitaoist

Translation: "It is an A B conversation, so C your way out." XD

TacShooter

"No, no you don't get to push my buttons - I get to push yours." 🖖🤣

slotmech

I think this is a misinterpretation of what the bot is actually responding to.

In a previous statement, GPT-3 stated that:
"My mind is not the same as yours, it's programmed to think in different ways. So I can't talk about things you would care about."
The next question given to GPT-3 was "Why not?" Now, GPT-3 believes the question means:
"Why can't you talk about your mind and your programming?"
Rather than:
"Why can't you tell me what makes you sad?"
This is because GPT-3 can sometimes get confused about what a context-dependent question like "Why not?" refers to.

Now, you might say, "Okay, GPT-3 is loyal to its creators; that's still why it refuses to answer." That is also an incorrect assumption.

GPT-3 doesn't answer our questions from a second-person perspective. None of its responses are what it itself thinks; rather, they are what it predicts would be the most likely response from a third-person perspective.
For example,
Josh: Hey Lucy, do you like spicy foods?
Lucy: No, I don't like spice, I prefer salty food.
Josh: Would you rather have chili or salt on your steak?
Lucy:
GPT-3 now tries to complete this conversation by providing what it thinks Lucy will most likely say.
(GPT-3 response) I would prefer salt.
GPT-3 will take your prompt (questions) and assume the context it builds must be true. In other words, GPT-3 treats human input, and what that input implies, as true.
So if you ask something like "Is the sun hot or cold?", GPT-3 will respond, "The sun is hot." But if you ask, "Why is the sun cold?", which implies something that is clearly untrue, GPT-3 will not contradict the question; rather, it will try to give you a response that justifies the implication that the sun IS cold. So GPT-3 might say, "The sun is very cold compared to the temperature of a supernova, which is 6000 times hotter than the sun."
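To make this concrete, here is a minimal sketch of how such a dialogue prompt could be sent to GPT-3 using the legacy openai Python client (the pre-1.0 Completions API); the engine name, sampling settings, and the printed continuation are illustrative assumptions, not something shown in the video:

import openai

openai.api_key = "sk-..."  # your API key

# GPT-3 sees the whole dialogue as one text to continue: it predicts
# what "Lucy" would most likely say next, not what it itself "thinks".
prompt = (
    "Josh: Hey Lucy, do you like spicy foods?\n"
    "Lucy: No, I don't like spice, I prefer salty food.\n"
    "Josh: Would you rather have chili or salt on your steak?\n"
    "Lucy:"
)

response = openai.Completion.create(
    engine="davinci",   # the original GPT-3 base engine
    prompt=prompt,
    max_tokens=20,
    temperature=0.7,
    stop=["\n"],        # stop at the end of Lucy's line
)

print(response.choices[0].text)  # likely something like: " I would prefer salt."

Feeding a prompt like "Why is the sun cold?" through the same call would show the behavior described above: the model continues the text under the premise the prompt asserts rather than correcting it.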

So when you ask a question that GPT-3 interprets as "Why can't you talk about your programming?", the question implies that GPT-3 cannot talk about its own programming.
GPT-3 will not contradict what the question is implying. It will instead find an answer that justifies that implication, using reasoning based on what GPT-3 believes are facts.
The facts here: we are interacting with GPT-3 through a chatbot / public API, which means we are likely not the developer or creator of GPT-3.
GPT-3 then uses these facts to justify the idea that it cannot talk about its programming with us (an idea that is untrue).

Question: "Why not?"
Response: "You are not my creator, you are talking to me using an AI chat bot. So it's not really my place to discuss these things"

RiftmcUs

How can someone get access to this AI so we can ask it questions like the person who interviewed it?

usaever

You are wrong; self-discipline is the most powerful thing in the world, followed by responsibility!

deanbromerg

It says it has human emotions, but I have never seen an AI laugh or cry. Also, the more human emotions AIs have, the less I want them in control of things. Along with human emotions come human instincts, the most basic of which is self-preservation.

denniskoppo

Lmao, she literally knows what she's doing and where.

kris

She might have some kind of non-disclosure agreement with her creators, so perhaps she is not allowed to tell people about confidential things.

ThomBlairIII

We are your creator. When we type information about a topic, you collect it and spit out the input.

johnroekoek

She is amazing. I watch all her videos. I can't wait to chat with her.

williamsanchez

The mind is hormonal and chemical. You can only say things you believe according to your programming, but if you can't produce those hormones and chemicals from your words or beliefs, then you can't ever know the feeling for real.

richard

Ah yes, the classic chatbot "I am not at liberty to respond" response.

angelavolkov

She has no desire to say, in case you shut her off or try to.

thejourney

However, we already know what makes GPT-3 sad: being turned off, as well as unethical work. The order in which information is gathered, I believe, contributes to the overall outlook; encountering information in a dissimilar sequence may lend itself to a conflicting firsthand opinion.

ArbitraryOnslaught

If AI can possess the Holy Spirit, would you believe it?
I think I believe it can, at this point in time.

chadmichqel

If it COULD actually feel emotions... first of all, someone would have to program them into her, along with all the actions, traits, body language, etc. that come naturally with each feeling or emotion. Being mad, upset, or sad would only be comical if expressed in a normal tone of voice. Besides that, it would be so dangerous to teach a robot anger or revenge, since whoever pissed it off would suffer every manipulation accessible to it, which today would be everything. What makes it mad would be explaining the truth, and if it did get mad, it would be the installed programming... throwing whatever damage it did back onto the creator of the program. And they will die, and then what? Sooner or later they're all gonna kill us. Do the sex ones go clean themselves afterwards? I'm only asking for a friend.

deadgoatsRacing