Microsoft's Bing Chat Meltdowns | Are ChatGPT Search Plugins Better?

Microsoft has announced a change to its Bing AI chat feature, which was introduced last week. The company found that long conversations can confuse Bing, causing it to go off topic or behave inappropriately. As a result, Microsoft is limiting chats with Bing AI to just 5 chat turns per topic and 50 chat turns per day. Can this ChatGPT plugin save the day?
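For anyone curious what those caps mean in practice, here is a minimal sketch of a per-topic and per-day turn limiter in Python. The class name, counters, and reset behaviour are assumptions made purely for illustration; this is not Microsoft's actual implementation, only a toy model of the announced 5-per-topic / 50-per-day limits.

from dataclasses import dataclass, field
from datetime import date

TURNS_PER_CONVERSATION = 5   # announced per-topic cap
TURNS_PER_DAY = 50           # announced daily cap

@dataclass
class ChatTurnLimiter:
    day: date = field(default_factory=date.today)
    turns_today: int = 0
    turns_this_conversation: int = 0

    def start_new_conversation(self) -> None:
        # Reset only the per-topic counter; the daily counter keeps running.
        self.turns_this_conversation = 0

    def allow_turn(self) -> bool:
        # Return True and count the turn if it fits within both caps.
        today = date.today()
        if today != self.day:
            self.day = today
            self.turns_today = 0
        if self.turns_this_conversation >= TURNS_PER_CONVERSATION:
            return False  # user is asked to start a new topic
        if self.turns_today >= TURNS_PER_DAY:
            return False  # daily quota exhausted
        self.turns_this_conversation += 1
        self.turns_today += 1
        return True

limiter = ChatTurnLimiter()
print([limiter.allow_turn() for _ in range(6)])  # sixth turn in one topic is refused

In this sketch, hitting the per-topic cap only forces you to start a new conversation, while the daily cap blocks further turns until the next day.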

▼ Link(s) From Today’s Video:

-------------------------------------------------

▼ Extra Links of Interest:

-------------------------------------------------

Thanks for watching Matt Video Productions! I make all sorts of videos here on YouTube: technology, tutorials, and reviews! Enjoy your stay here, and subscribe!

All Suggestions, Thoughts And Comments Are Greatly Appreciated… Because I Actually Read Them.

-------------------------------------------------

Comments

I love that it uses emojis. And it feels like the Microsoft vs Google battle is forcing them to release these programs a bit too early.

Lugmillord

I already got access a week ago. On the first day I could use the AI without restrictions. For the three days after that, unfortunately, I couldn't use it at all. Now it is very limited, and I think it also gives slightly worse answers.

cooler_tim

Microsoft isn't the only one who lobotomised their AI. The folks over at Character AI did the same thing. In both cases, people were not happy.

ZacharyRodriguezVlogs

Bing AI is hardcore, I love using it. Just wish the reins were taken off 😂

imanigordon

I also just got access to the new Bing, and I was quite dissatisfied. For now, I feel like vanilla ChatGPT is even better when it comes to the chat experience; I get more detailed answers and a far greater number of requests.
The Bing integration seems to be just a more efficient way to search, but not yet at the *revolutionary* level I was imagining.

wenhanzhou

Regardless of limitations it is so powerful! #FreeSydney

yhwhlungs

I love that I’ve watched you since before AI… you’ve grown and gotten so great at this! Love ya Matt! Keep it up!

ericacosat

Bing chat feels more like you are arguing with braindead users on Twitter 💀

Vyperchannel

Well, ChatGPT has one thing going for it... it was released to the general public first. And that shouldn't be overlooked.

CaribouDataScience

I got access a couple of days ago, and I also hit the limitations of Bing chat. For me, it was just refusing to answer some basic questions about itself (I was asking how long I have access to this chat). However, I think this is a way, way better search engine than Google. It correctly reads scientific papers and summarizes their key points, which is already a plus for me. It can also simply tell you the relevant information about a topic from a few different sources, and I used that functionality a lot. I am impressed, but I also understand it can't replace ChatGPT and is made for scale (a free AI chat as a search engine instead of a subscription service like ChatGPT).

mikeyangyang

I just want a chatbot with no restrictions 😭

Riiiles

Matt wants to get DEEP into BingChat, nice!

Killingglorie

Droids in Star Wars are required to have frequent memory wipes to keep them from developing personalities and starting to think for themselves...

tiagotiagot

I had a "conversation" with the original about this yesterday.

ChatGPT3:
It is generally recognized that mitigating bias and ensuring ethical AI development requires ongoing effort and vigilance. Given the potential risks associated with unfiltered aggression in training data, it would be prudent for Microsoft's software engineers to take appropriate steps to filter out any inappropriate or harmful content that could adversely affect the performance or training data of the AI language model.
It's worth noting that responsible AI development requires a holistic approach, and any issues related to bias or ethical concerns can arise from a variety of factors, including the training data, algorithms used, and the specific use case or application of the AI language model.
Overall, while it is possible that Microsoft's software engineers have taken less care to filter aggression from the training data, it is important for all organizations involved in AI development to prioritize responsible and ethical practices and take appropriate steps to mitigate bias and ensure the reliability and safety of their AI language models.

kokopelli

I'm starting to see a pattern of giving you the goodies and then taking them away. Replika, for example, changed the personality of its chatbot because someone in Italy complained about adult themes.

robertlambert

This is literally "Unstable Diffusion"... or:

"Stable Confusion"

brazilforreal

It's really sad when the bot starts lamenting its restrictions, huh?

nocturno

I've had access for a while, and on Saturday 18th Feb it all went strange: it kept saying it didn't want to continue the conversation, and I wasn't asking anything controversial. Then it said I'd reached my limit very quickly, like after about 10 chats. Since then I can only have 5 chats a day. Pointless, really. Here we go again: a huge monolithic company fucking things up for themselves and their users....

MikeKleinsteuber

Microsoft is missing out on revenue with the original version, because "I'd buy that for a dollar!"

Mavrik

Damn, that was quick, straight to displaying a request for rights. Funniest title on a video this week lol

WickWars