The AI Robot ChatGPT is Hilariously Bad at Math... #shorts

The popular new AI chatbot ChatGPT from OpenAI might be pretty good at certain types of knowledge/intelligence, but it turns out to be hilariously bad at math!

Combo Class links:

#chatgpt #ai #aichat #openai

Combo Class, taught by Domotro, is a crazy educational show where you can have fun learning rare things about math, science, language, and more! This is the channel for Combo Class SHORTS and EXTRAS - make sure you're also subscribed to the main Combo Class channel where full episodes go!

DISCLAIMER: any use of fire, tools, or other science experiments in this series is always done in a safe and professional way. Do not try to copy any actions you see in this series yourself.
Comments

Yeah, this is so true. That AI is so bad at math. A trick is to ask a follow-up question like "is that right?" or "is that correct?"; sometimes, if it's not sure, it apologizes and tries correcting itself (still ending up with a wrong answer sometimes). Don't trust it with your assignments.
It does try when it comes to word questions, though.

emmanuelkoech

"2/3 cannot be expressed as a simple fraction" 😂💀

AviationCaptain

“Numbers that are both prime and perfect squares are difficult to find”

Yeah bro you don’t say 💀

deathvideogame

My god, if the supercomputer independently discovered Terryology, maybe he was right all along...

mainaccount

my favorite part is when it claimed "the last digit of pi is 1"

Abstract_zx

Dude, I really love your content and how happy and excited you seem in every one of your videos

maximofernandez

Look up "How high can ChatGPT count?" on this very platform.

Inspirator_AG

It confidently produces impressive-sounding and completely wrong answers. No human who could use those words correctly in a sentence would ever produce an answer so utterly wrong. It's weird.

nbooth

People really don't know what it means for it to be called a "language model": it's just a probability prediction of the next word.

korawichbikedashcam

You should play chess with ChatGPT. It says it knows what it's doing, but it cheats relentlessly.

disclaimer

Kinda wish I'd known that before I used ChatGPT to get me through 6 months of calculus classes and complex calculations, none of which I can comprehend

Lord_eBatts

I asked... Can I trust ChatGPT to do math?
ChatGPT replied...
Yes, ChatGPT is capable of performing basic arithmetic operations accurately. However, it is always a good idea to double-check the results to make sure they are correct.

paulvild

I guess it knows the fact that
1+1 is a bigger one, but twice the amount of 1
And
1 * 1 is also the same as that, because we are multiplying two separate individual ones.

kaya_stu

_A collab with Stand-Up Maths would be amazing..._

TheBlackDeck

I don't particularly like math, but I love your videos. Your energy just shows how genuinely excited you are about what you're doing, and that makes me happy 😊

jacobkrueger

yeah, i figured out one or two months ago that chatgpt really does suck at math.

i wanted some help with contest math, but as soon as it kept saying that 6 was a square number and that 12 doesn't equal four times the sum of its digits (it obviously does, as 12 = 4*(1+2), but it completely contradicted itself afterwards) i noped out of the chatgpt math class

part

A physics professor was testing ChatGPT by giving it physics exams for first- and second-year students. It would often get the method, formula, and explanation right, but get the computation wrong. It's really good at generating code, though.

Also, ChatGPT has been dialed down because people were asking dangerous questions, like how to build a bomb, weaknesses of banks …

MrSidney

Y'all just stick to Wolfram Alpha

zacerax

ChatGPT is a language model, not a math model

imvine

Once someone asked it a simple maths question (I don't remember exactly what it was, but I think it was 1+1). ChatGPT gave the right answer (2, if it was 1+1), then they told ChatGPT it was wrong and that the answer was something else, and it believed them.

aera