GZIP is not enough!

Compression matters! Heavy page weight hurts companies (in cost to transfer) and users (in cost to download). With the boom of mobile devices, especially in countries with lower-speed connectivity, reducing your page weight is critical to success with mobile users. In this talk, Colt will discuss a plethora of research into alternate ways to compress, minify, munge, and represent CSS, JSON, HTML, XML, and JavaScript data on the web, and how GZIP sometimes helps but oftentimes can get in the way. Attendees will walk away with a suite of options for reducing their page transfer sizes, and ideas about how to integrate these topics into their development pipelines.
Comments

Seems I'm a couple of years late watching this. Nevertheless, very interesting stuff! :) Thanks for posting!

thAttempt

It's been years since I graduated as an engineer, but watching this made me realize I should've taken mathematics even more seriously.

SalmanRavoof

My vote is always for middle-out. I can't believe you Hooli guys can't get that figured out :-\

bcbigb

Very interesting, good to see I'm not the only person interested in reducing overheads.

etmax

I totally agree. I don't watch most of the videos Google Developer uploads because there are too many; I only watch the ones I'm interested in.

acolombo

Really great video; learned a lot and am excited to try out some of these methods.

FranciscoLopez

Could you please explain why the kiwis in the PNG with the extra two columns of pixels were not compressed? I would've expected the second kiwi to fit in the 32 KB window, and perhaps part of the third kiwi as well.

DavidTanGQ

There is a mistake: the sorted list does not have the same cardinality as the source set. The correct result would be [0, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1]. He wanted an ideal list, [0, 1, 1, 1, 1, 1, 1, 1, 1, 1], to demonstrate the compression. In a real-world example it would be fine to have multiple entries that are the same; those would lead to a value of zero in the delta output, as I showed.
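
A minimal sketch of that delta encoding (the example list below is invented so the output matches the corrected sequence above):

```python
# Delta encoding of a sorted integer list: store each value as its
# difference from the previous one. Duplicate entries in the source
# show up as zeros in the output.

def delta_encode(values):
    """Delta-encode a sorted list of integers (first delta is from 0)."""
    out, prev = [], 0
    for v in values:
        out.append(v - prev)
        prev = v
    return out

def delta_decode(deltas):
    """Reverse the encoding with a running sum."""
    out, total = [], 0
    for d in deltas:
        total += d
        out.append(total)
    return out

source = sorted([0, 1, 2, 2, 3, 3, 4, 5, 6, 7, 8, 9])
encoded = delta_encode(source)
print(encoded)  # [0, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1] -- duplicates become 0s
assert delta_decode(encoded) == source
```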

fashnek

There's a problem with the delta approach: latency. Sure, you might save a few kilobytes, but those additional requests each incur latency. Raw transfer savings are not simply linear either, because of packet sizes (saving 1 byte inside a packet isn't nearly as important as saving 1 byte that would overflow into a new packet), so whatever transfer-speed improvements you get would almost certainly be eaten up by the latency of those requests, especially given the way consumer connections are throttled on upstream. Not that requests for these things would be large enough to run into the throttling themselves, but consider how that throttling behaves when the connection is already in use, such as in a family home.
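
A back-of-envelope sketch of this trade-off (the RTT, bandwidth, and file sizes below are assumptions for illustration, not measurements from the talk):

```python
# Rough model: fetch time = per-request round trips + raw transfer time.
# An extra round trip for a delta patch can cost more than the bytes saved.

RTT_S = 0.10          # assumed round-trip time: 100 ms
BANDWIDTH_BPS = 1e6   # assumed downstream: ~1 Mbit/s mobile link

def fetch_time(size_bytes, round_trips=1):
    return round_trips * RTT_S + (size_bytes * 8) / BANDWIDTH_BPS

full_file = 100_000   # fetch the whole resource: 100 KB, one request
patch = 90_000        # delta patch: saves 10 KB but costs an extra request

print(f"full fetch : {fetch_time(full_file):.3f} s")           # ~0.900 s
print(f"delta fetch: {fetch_time(patch, round_trips=2):.3f} s") # ~0.920 s
```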

DustinRodriguez_

Love it! Is there more? Can you please make a playlist for these educational videos?

gfetco

The encode times are a much larger part of latency than I would have thought.

The figure of 50 ms of encode time per 100 KB of data is non-negligible for files that rarely change, like CSS or JS. Serving pre-compressed files makes a lot of sense; for CMS frameworks we just need the tools to do it for us.

Is the given Amazon example based on the default gzip DeflateCompressionLevel (6)? And was it measured on an average server, or a local machine?
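
A minimal sketch of that pre-compression idea (the `dist` directory and `.css` glob are invented for illustration): compress rarely-changing assets once at build time, at the slowest/best level, so the server never pays the per-request encode cost.

```python
# Pre-compress static assets at build time; the web server can then serve
# styles.css.gz whenever the client sends Accept-Encoding: gzip.

import gzip
import shutil
from pathlib import Path

def precompress(path):
    """Write path.gz next to the original, using the slowest/best level."""
    src = Path(path)
    dst = src.with_name(src.name + ".gz")
    with open(src, "rb") as f_in, gzip.open(dst, "wb", compresslevel=9) as f_out:
        shutil.copyfileobj(f_in, f_out)
    return dst

# e.g. run over the build output once per deploy:
for asset in Path("dist").glob("**/*.css"):
    precompress(asset)
```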

hexanet

At 25:00, how do you sort the combination of digits, since 3 and 2 are listed twice?

LaurentPerche

"Minification" reminds me of code obfuscation which companies used back in the day to distribute source code (back when the world was much more than Windows, OSX and Linux) while making it difficult for the user to read it, and -- even before that -- the tokenezation performed by the MS BASIC interpreter.

RonJohn

So I guess this history lesson is the reason Chrome is being cautious and hiding brotli behind a flag in Canary, even though brotli was also invented at Google, while Firefox is going ahead and releasing it into the wild in the very update I'm downloading right now.

That said, assuming the proxy issues haven't disappeared (they may not be as big an issue after all these years), it does tell me those bugs won't be as big a problem for bzip2 or brotli in HTTPS traffic, which is a growing share of all traffic.

danfr

Wow, how about a 90% compression system that is math-based rather than Huffman-based? What effect would that have on the net... video streaming, etc.?

michaelmoore

I believe this is because people like the idea of being an expert in one thing, but only a few have the eagerness and interest to discover more about what is said to be important.

hay

The section around 27:00 is incorrectly stated and misleading: "GZIP is inflating the smaller file" is /wrong/. Those red numbers are not an indictment of GZIP at all, and they do not indicate that GZIP is harmful or "scary". They're an indictment of the genetic-algorithm-based minifier tools, which make the data inherently /less compressible/. In other words, they make GZIP a little bit less helpful, NOT harmful. GZIP is no less of a "silver bullet" by this argument. GA minification is.
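
The practical takeaway, as a sketch (both payloads below are invented for illustration): compare candidate representations by their size /after/ gzip, since a representation that wins on raw bytes can lose on the wire.

```python
# Judge a transform by post-compression size, not raw size.

import random
import zlib

def gzip_size(data: bytes, level: int = 6) -> int:
    """Payload size after DEFLATE at gzip's default level."""
    return len(zlib.compress(data, level))

repetitive = b"0123456789" * 100                       # 1000 raw bytes, compressible
rng = random.Random(0)
dense = bytes(rng.getrandbits(8) for _ in range(900))  # 900 raw bytes, incompressible

print("raw   :", len(repetitive), len(dense))               # dense "wins" raw
print("gzip'd:", gzip_size(repetitive), gzip_size(dense))   # repetitive wins on the wire
```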

fashnek

It's an official Google channel; I guess a lot of people subscribed over the years, but not many watch every vid?

JensVanHerck

MY HEAD~ My head.... I need ibuprofen & compression.

Mrbarracuda

📺💬 I have been working with Unreal Engine for 3 years, and we found that the problem in games is that objects exist at in-game sizes, so moving pixels means a lot of data.
🥺💬 I understand. In games, too, GZIP is involved in compressing pixels, and some functions can operate on the data while it is still in compressed format.

📺💬 Huffman codes assign bit codes by how often each word appears in the context: the more frequently a piece of data occurs, the fewer bits are used to represent it, i.e. it gets priority when encoding characters or words.
📺💬 Yui, you should correct the comment first. 🥺💬 It is true that with Huffman encoding, frequent words like "a", "an", or "the" get the priority (shorter) codes, but longest-match searching is about finding how often sequences recur in the context. Huffman works well for text because few words repeat once the table is built, but what about recording the match locations and the bits that represent them⁉
🥺💬 I also read that in WinZip, DAT-format compression reaches over 70 percent for some text formats because it supports both longest-match search and Huffman coding.
🧸💬 An advantage of GZIP is that you can still work with the data while it is in compressed format.
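
A minimal sketch of the Huffman idea paraphrased above (the input string is arbitrary): more frequent symbols receive shorter bit codes.

```python
# Build a Huffman code table with a heap of (weight, tiebreak, code-table)
# entries; merging two subtrees prefixes their codes with 0 and 1.

import heapq
from collections import Counter

def huffman_codes(text):
    """Map each symbol to a bit string; frequent symbols get fewer bits."""
    heap = [(freq, sym, {sym: ""}) for sym, freq in Counter(text).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, t1, c1 = heapq.heappop(heap)   # lightest subtree
        f2, t2, c2 = heapq.heappop(heap)   # second lightest
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, min(t1, t2), merged))
    return heap[0][2]

codes = huffman_codes("the theme of the thread")
# Shortest codes first: frequent symbols like 'e', 't', 'h' come out on top.
for sym, code in sorted(codes.items(), key=lambda kv: (len(kv[1]), kv[0])):
    print(repr(sym), code)
```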

📺💬 Delta compression: we can send patches that update the data on the client while the client keeps working from the original file.
🧸💬 There are applications beyond updating a text file of JSON parameter values: visualization of objects, screen-transform data, mouse-pointer movement, and keyboard input and rules.
🐑💬 It is a good application but also a security concern. It was invented many years back, but an attacker can replay this transmission by reading the encrypted packages; given today's real-time constraints, long encryption algorithms help protect this method.
📺💬 Horizontal delta compression? 🧸💬 By transferring data grouped by field and by priority, you can reduce the size of the communication packages, because the data a reply needs to process arrives at the same time.
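
A sketch of that horizontal (column-wise) grouping (the record layout and field names are invented): grouping the same field from every record keeps similar bytes near each other, which can help DEFLATE's 32 KB match window. Measure both layouts and keep whichever compresses smaller.

```python
# Compare row-major vs column-major JSON layouts after compression.

import json
import zlib

records = [{"id": i, "x": i * 10, "label": "node"} for i in range(500)]

row_major = json.dumps(records).encode()
col_major = json.dumps({
    "id":    [r["id"] for r in records],
    "x":     [r["x"] for r in records],
    "label": [r["label"] for r in records],
}).encode()

print("row-major:", len(row_major), "->", len(zlib.compress(row_major)))
print("col-major:", len(col_major), "->", len(zlib.compress(col_major)))
```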

Jirayu.Kaewprateep