How ChatGPT Can Help You Get Super Fast Internet 🤯

Comments

This is simple encryption and decryption. Don't glorify everything if you don't know anything about it.

souravghosh

In order to understand the encoded output you need another decoder or the relevant compiler, which is different and difficult to build, and the processing speed depends not on the code size but rather on complexity and many other things...

anujchourange

Richard did it first with pied piper before ChatGPT 😂

urajrajput

It's called GZIP, a compression algorithm that most servers apply to HTML and text documents. You will get significant compression ratios of around 80%, compressed by the server and decoded by the browser. There are many text compressors available; nothing new.

acedigibits
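The GZIP round trip described above can be seen with Python's standard library. This is a minimal sketch; the repetitive HTML string is invented for illustration, and real pages usually compress less dramatically:

```python
import gzip

# A rough sketch of what servers do with text responses: gzip on the
# way out, transparent decompression in the browser.
text = ("<html><body>" + "<p>Hello, compression!</p>" * 200 + "</body></html>").encode("utf-8")

compressed = gzip.compress(text)
restored = gzip.decompress(compressed)

ratio = 1 - len(compressed) / len(text)
print(f"original: {len(text)} bytes, compressed: {len(compressed)} bytes, saved: {ratio:.0%}")
assert restored == text  # lossless: the round trip is byte-exact
```

Highly repetitive text like this compresses far better than the ~80% typical for real pages, but the mechanism is the same.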

This person's mind will be blown when he converts binary numbers to decimal... Very cool compression. And if he ever learns about Fourier analysis and the corresponding natural biases...

TheEllod

So are you and your team trying to do this level of compression in your code??? Waiting for that ✌🏼

gogetamui

We already use compression behind the scenes while transmitting packets over the network.

Sarthakz

Woooww, so innovative, really cool! This may change the world one day.

wensong

A YouTuber's daily routine:
1. Wake up
2. Talk about ChatGPT
3. Sleep
4. Repeat

chiru

ChatGPT is wonderful. I know how to code in Python; I learned it just for fun, and I'm not from an engineering background.
But I needed to write HTML for a personal task, although I didn't know how to code in HTML.
So I entered my Python code into GPT and it did the work for me, with design elements as well.
With a few tweaks my webpage was ready, thanks to GPT.

jisnudeepmandal

So in the normal case you transfer the file once: 50KB up and 50KB down.

Now with GPT you first upload 50KB (send to GPT), then download 5KB (compressed); then if you send the 5KB, that's 60KB total instead of 50KB.

And whoever wants to decompress it will download 5KB, upload 5KB to GPT, and download 50KB from GPT.

This might be good for long-term storage,

but surely not for data transfer.

let_it_b_x
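The tally above can be written out explicitly. The 50 KB and 5 KB figures are the commenter's hypothetical numbers, not measurements:

```python
# Hypothetical figures: a 50 KB file that ChatGPT "compresses" to 5 KB.
ORIGINAL_KB = 50
COMPRESSED_KB = 5

# Plain transfer: the sender just sends the file once.
plain = ORIGINAL_KB  # 50 KB

# Via GPT: upload the original, download the compressed version,
# then transmit the compressed version to the receiver.
sender = ORIGINAL_KB + COMPRESSED_KB + COMPRESSED_KB  # 60 KB

# The receiver downloads 5 KB, uploads it to GPT, and downloads
# the 50 KB reconstruction.
receiver = COMPRESSED_KB + COMPRESSED_KB + ORIGINAL_KB  # 60 KB

print(f"plain: {plain} KB; via GPT: {sender} KB (sender) + {receiver} KB (receiver)")
```

Either side moves more bytes than the plain 50 KB transfer would, which is the comment's point.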

Hey Varun, can you make a full-length video on AI and its impact on the Indian economy? Unlike China, a huge portion of our economy is service-sector based, which may be heavily affected by AI, unlike manufacturing, and I see no one in India talking about it.

harshvardhansingh

*The prompt* 👇

Compress the following text in a way that fits in a tweet, and such that you (GPT-4) can reconstruct it as close as possible to the original. This is for yourself. Do not make it human readable. Abuse language mixing, abbreviations, and symbols (unicode and emojis) to aggressively compress it, while keeping all the information needed to fully reconstruct it.

## Text to compress:

sumitroy

For all my non-tech guys, this is nothing.
It's a simple encryption- or hashing-based method.
There are predefined algorithms to achieve it; in fact, if you understand the concept, you can make your own algorithm.

AdityaSingh-qlke
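As an example of "making your own algorithm", here is a minimal run-length encoder, one of the simplest compression schemes. This is purely illustrative and is not what ChatGPT does internally:

```python
def rle_encode(s):
    """Run-length encoding: collapse runs of repeated characters."""
    runs = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))              # start a new run
    return runs

def rle_decode(runs):
    """Invert the encoding by repeating each character by its count."""
    return "".join(ch * n for ch, n in runs)

encoded = rle_encode("aaaabbbcca")
print(encoded)  # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(encoded) == "aaaabbbcca"
```

It only pays off on data with long repeated runs, which is why real compressors are much more sophisticated.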

People should talk about lossy vs lossless compression before talking about using this in places where real data is involved.

Good demonstration
Bad use case

omkardeokar
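The lossy-versus-lossless distinction raised above can be demonstrated in a few lines. The temperature readings are made-up sample data:

```python
import zlib

# Lossless: zlib round-trips to the exact original bytes.
data = b"temperature readings: 21.37, 21.38, 21.37, 21.39"
assert zlib.decompress(zlib.compress(data)) == data

# Lossy: rounding the values saves space, but the originals are gone
# for good -- no decompressor can recover them.
readings = [21.37, 21.38, 21.37, 21.39]
lossy = [round(r) for r in readings]
print(lossy)  # [21, 21, 21, 21]
assert lossy != readings  # information irreversibly discarded
```

An LLM "reconstructing" compressed text is closer to the lossy case: it may paraphrase rather than restore the exact bytes, which matters wherever real data is involved.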

Ohh Bhai amazing thank you so much for the daily dose❤❤❤

azamali

Everyone should know about autoencoders and decoders, and also variational autoencoders. They will surprise you, because this is the mathematics behind this kind of compression and decompression. It's actually useful for projecting higher-dimensional data into lower dimensions.

xbinarylol

ChatGPT works on the input data that we give it. So if you ask GPT a question and later copy-paste the answer back, asking if GPT wrote it, it says YES.

ashdhuri

Man, you are conflating cool with geeky. I'm loving it 😊👍🏽

AdityaBarrela

This is not new and literally done everywhere. I’ll just leave this here, “Modern work on data compression began in the late 1940s with the development of information theory. In 1949 Claude Shannon and Robert Fano devised a systematic way to assign codewords based on probabilities of blocks. An optimal method for doing this was then found by David Huffman in 1951.”

howard
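The Huffman construction mentioned in that quote still fits in a short sketch today. This is a minimal illustration using Python's `heapq`, not production code:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Assign prefix-free codewords by symbol frequency (Huffman's method)."""
    # Heap entries: (frequency, tiebreaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, right = heapq.heappop(heap)
        # Prefix the subtrees' codes with 0 and 1 and merge them.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
print(codes)
# Frequent symbols get shorter codewords than rare ones:
assert len(codes["a"]) < len(codes["d"])
```

This frequency-based assignment of short codes to common symbols is exactly the idea Shannon, Fano, and Huffman formalized, and it underlies the gzip/DEFLATE compression the other comments mention.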