Gemini 1.5 Pro - Coding Assistant with 1M tokens

In this video, we take a first look at Gemini 1.5 Pro and use it as a coding assistant.
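As a rough illustration of this workflow (not code from the video), a minimal sketch using Google's google-generativeai Python SDK might look like the following; the API key handling, file path, and prompt are placeholders:

import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # assumes an AI Studio API key

# Upload a source file through the File API, then ask the model about it.
code_file = genai.upload_file("my_project/main.py")  # hypothetical path

model = genai.GenerativeModel("gemini-1.5-pro-latest")
response = model.generate_content(
    [code_file, "Explain what this code does and suggest improvements."]
)
print(response.text)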

LINKS:

Sign up for Advanced RAG:

TIMESTAMPS:
[00:00] Gemini 1.5 Pro 1M Tokens
[02:35] Uploading files to Gemini
[03:45] Putting it to the Test

All Interesting Videos:

Comments

Great video, dude. Thanks. Meanwhile, I've been trying to run Llama locally and was disappointed... I don't know why it takes so long to give me an answer... something like 8-13 seconds... I had high hopes for it because I need it for my job)))

AlexAlex-eizf

Thanks 👍 A comparison of different AI coding assistants would be great ✌️

donttellya

Great video! Wish I could afford you on my project.

GetzAI

I tested it too, as I got the 1 million token context as well. I forgot to take a screenshot of its replies, but if I had, everyone would have been laughing.
It sucks.

abdulrehmanbaber

I got access to the model, but I hit an internal error with every model I use in AI Studio. Anyone else having this problem?

miladmirmoghtadaei

Is Gemini Advanced the same as Google AI Studio? Sorry, I don't know how to access Google AI Studio.

saadamiens

I tried it. It says it's not Gemini but Bard, and it doesn't know about Gemini. It can't analyze videos.

malemsana_only

I am totally confused about the naming… Gemini 1.5 was released to a few people some weeks back. Is this something new, or did you only just get access?

maxziebell

It's said to support Polish, but it's still not available in Poland. Are we some sort of third-world country, or what?

tomaszzielinski

I tested Gemini Pro 1.5, and it's very bad. I'm not sure why people are so hyped about it. Probably just blinded by Google's research paper, which seems far removed from reality, again 😥

In my tests, Gemini Pro 1.5 could not retrieve information from approximately 100k context tokens (yes, just 1/10th of 1M), so I think Google just messed something up in their evaluation of context retrieval at 1M tokens. It would definitely blow my mind if it worked as promised, but in reality it just does not.

mirkakonest
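For anyone who wants to reproduce this kind of long-context check, a minimal needle-in-a-haystack sketch against the same google-generativeai SDK could look like the following; the filler text, "needle" string, and rough token estimate are illustrative assumptions:

import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")

# Build roughly 100k tokens of filler and hide one retrievable fact inside.
filler = "The quick brown fox jumps over the lazy dog. " * 10_000
needle = "The secret passphrase is 'banana-omega-42'. "
haystack = filler[: len(filler) // 2] + needle + filler[len(filler) // 2 :]

model = genai.GenerativeModel("gemini-1.5-pro-latest")
response = model.generate_content(
    haystack + "\n\nWhat is the secret passphrase mentioned in the text above?"
)
print(response.text)  # a successful retrieval should quote 'banana-omega-42'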