ChatGPT is not very good at basic math (or logic)

Here I give you an example of a simple math expression in Python whose result ChatGPT cannot correctly explain.
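(The exact expression from the video isn't reproduced in this description. As a purely hypothetical illustration of the kind of snippet that tends to trip language models up, consider ordinary binary floating-point arithmetic in Python:

>>> 0.1 + 0.2
0.30000000000000004
>>> 0.1 + 0.2 == 0.3
False

The output is correct IEEE 754 behaviour, but explaining why requires actually reasoning about binary representation rather than producing plausible-sounding text.)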
Although sometimes very impressive, ChatGPT doesn't really "understand" logic or math: it just produces words that sound very convincing, mimicking the samples it was trained on. As far as I know, there's no causal reasoning involved (at least in this and most current AI models). It's pretty good at gaslighting, though.
So take whatever ChatGPT says with a grain of salt :).
Comments

ChatGPT is a cool little tool to have some fun conversations with, but when people use it to try to learn programming, or any other logic-based field like mathematics, it just falls flat. I saw a clip of a person asking whether <x> was greater than <y>, and it couldn't arrive at the correct answer. That's elementary logic.

valizeth