AV1 is disappointing.

For now...

My Spotify:

The AV1 codec is in a tough spot compared to H.264 (AVC) right now. It's extremely powerful, but not much supports it. Let's see how it goes in the next few years.
Comments

AV1 might have a slow start, but being a royalty-free codec will allow every company to freely add AV1 to their products, allowing for a large user base.

In the next 5 years AV1 will probably be the new standard for most devices, and will be incorporated into most people's graphics cards for encoding.

lifelesshawk

Things changed pretty quickly. RTX 3000-series and newer GPUs, RDNA 3, Intel Arc, etc. all support AV1 hardware decode now (and some, like Arc, even do AV1 encode), so CPU usage is like ~5% when playing a 1080p AV1 vid on my 8700K (because my RTX 3080 is doing the decoding). The majority of the videos I get on YouTube when using my PC are in AV1 now, I'm guessing because I have an RTX 3080, which has AV1 hardware decode. Meanwhile my Note 20 Ultra still gets VP9 since its SoC does not have AV1 hardware decode. Surprised it's taken Qualcomm till the Snapdragon 8 Gen 2 to finally add AV1 decode.

sean

You could buy the Intel Arc A380 for around $100 and throw it into a lower PCIe slot under your main GPU and use it for AV1 encoding. I hear it works great but I have not tried it. :)
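
For the curious, a minimal sketch of what driving the Arc's AV1 encoder from ffmpeg could look like, via Python's subprocess (av1_qsv needs an ffmpeg build with QSV/libvpl support; the file names and bitrate are placeholders):

import subprocess

# Encode with Intel's QSV AV1 hardware encoder (Arc GPUs).
# Assumes an ffmpeg build compiled with QSV/libvpl support.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",      # placeholder source file
    "-c:v", "av1_qsv",                # Arc's hardware AV1 encoder
    "-b:v", "8M",                     # example bitrate, tune to taste
    "-c:a", "copy",                   # leave audio untouched
    "output_av1.mp4",
], check=True)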

bodasactra

This video is especially relevant NOW, because YouTube just decided to use AV1 for streaming in beta.
I think this video is mostly aimed at content creators and streamers, not at mere viewers. I don't understand why so many people disliked it (yep, using the 'Return YouTube Dislike' extension. Works great).

rocvan

One interesting fact is that YouTube has started using AV1 heavily, and there is a good chance this very video is served in AV1, because decoding is quite easy and most devices can handle it even without hardware decoding.

It's still hard for creators to encode without native hardware acceleration, but I think it will go the same way as H.265, where hardware acceleration existed before native support (sort of like a translation layer).

And a good, based video.

vithorcicka

What an expert, he doesn't even know how many threads the 5900X has. It's 24, NOT 32.

limitlesswave

Even though YouTube has to store AV1, VP9 AND H.264, they see this mainly as a bandwidth-saving measure. Once enough devices support the newer codecs and they drop support for older devices, they will be able to see the storage-saving benefits as well.

BlakeB

The main predicament is that most phones won't decode AV1, unless there's an alternative YT player. But that's where the largest benefit is for users to save bandwidth. If they got software decode support, it would drain batteries faster, so people would have to choose between shorter battery life and spending more on data. As of now, we can save battery and data by defaulting to a lower resolution / keeping data saver mode on. Any idea when AV1 hardware support will be normal in phones?

uncrunch

I've been using AV1 for years. It's far from disappointing. From YouTube videos to AV1-encoded movies, the compression is amazing.

TabalugaDragon

I can just smell the misinformation.

1. You can use software decoding just fine on any relatively modern CPU.
I can decode 4K30 10-bit video on a laptop 2C/4T Skylake trash CPU, so anything more modern should easily be able to cope with 1080p60 or whatever with not much load (a quick way to test this yourself is sketched at the end of this comment).

2. Comparing HW encoding vs software encoding is not really fair, especially considering that game streaming is an area where HW encoding has an advantage due to zero-copy. For archival recording, use HW encoding all you want. For low-bitrate streaming/recording, you might as well use software encoding to get the efficiency gains.
Let's not forget that OBS, on Windows specifically, seems to have problems with SVT-AV1 performance for some reason. Even discounting that, 1440p60 game footage is by default hard-to-encode material, making your CPU work decently hard :)

I feel this video would have been better if it took a deeper technical dive on the subject.
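
As for point 1, here's a rough way to check it yourself: a minimal Python/ffmpeg sketch that decodes an AV1 file flat out and reports the time taken (the file name is a placeholder):

import subprocess

# Rough software-decode benchmark: decode the AV1 stream as fast as
# possible and discard the frames. ffmpeg's -benchmark flag prints
# CPU and wall-clock time at the end.
subprocess.run([
    "ffmpeg", "-benchmark",
    "-i", "input_av1.mkv",   # placeholder AV1 test file
    "-f", "null", "-",       # null muxer: decode only, write nothing
], check=True)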

neutronpcxt

Nowadays it has become possible to stream AV1 on the CPU using the SVT-AV1 encoder. If anyone's wondering, I have a 1440p 60fps streaming test in a playlist called "unlisted video" on my channel. And if anyone wonders what the point of CPU streaming is when we have GPU encoders: the image quality is actually a little bit better than with the GPU encoder.
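
If you want to gauge whether your own CPU could keep up, a minimal Python/ffmpeg sketch (assuming a build with libsvtav1; file names and settings are placeholders) is to encode a clip and watch the speed= readout, where speed >= 1.0x means real-time is feasible:

import subprocess

# Encode a test clip with SVT-AV1 at a fast preset; if ffmpeg reports
# speed >= 1.0x, the CPU could sustain a live stream at these settings.
subprocess.run([
    "ffmpeg", "-i", "gameplay_1440p60.mkv",  # placeholder test clip
    "-c:v", "libsvtav1",
    "-preset", "10",     # fast presets (8-12) are the realistic live range
    "-crf", "40",        # example quality target for low-bitrate streaming
    "-c:a", "copy",
    "svtav1_speed_test.mkv",
], check=True)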

leucome

I believe YouTube should transcode everyone's videos to VP9, instead of requiring users to upload in 1440p or rack up tons of views to get VP9. And anyone who isn't able to play VP9 videos can get the AVC copy. AV1 has similar requirements, but you'll need to upload the video in 8K, or get a video to millions of views.

mianlo

I think the main takeaway is that software encoding (on the CPU) isn't good for gaming streams or recordings, which is also the case for x265. The main reasons people use software encoding are quality and file size. For things like talking streams, cooking streams, whatever, AV1 software encoding should be fine. Also, if your CPU is struggling, you can choose a lower preset. Depending on resolution and so on, SVT-AV1 preset 9 or 10 can be the sweet spot, and they are probably still slightly better than hardware-encoded x265.
Just for comparison, SVT-AV1 preset 6 is about as slow as x265 preset slow and looks just as nice, with AV1 maybe even looking better.
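
Just as a concrete illustration, a minimal Python/ffmpeg sketch of that comparison (assuming builds with libsvtav1 and libx265; the clip name and CRF values are placeholders, not matched-quality settings):

import subprocess

SRC = "clip.mkv"  # placeholder test clip

# SVT-AV1 at preset 6 ...
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libsvtav1",
                "-preset", "6", "-crf", "32", "-an", "svtav1_p6.mkv"],
               check=True)

# ... versus x265 at preset slow, for a side-by-side look.
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx265",
                "-preset", "slow", "-crf", "24", "-an", "x265_slow.mkv"],
               check=True)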

petouser

4:55 Actually, YouTube once DID re-encode a huge chunk of the videos stored on their platform. In 2014 they added 60fps support for new and existing content, and did so by re-encoding all videos that had been uploaded at 60fps. To be fair, YouTube was significantly smaller at the time, and it took far less computing power than it would have today. But it definitely was a huge task, and YouTube was willing to do it. Kinda interesting.

WaveSmash

Even after watching this video, I am still confused. Can you explain why Twitch is still using only H.264?

ksouvenir

Intel Arc's version of AV1 encoding is pretty awesome tho...

AlphaYellow

While it's true that encoding in software (on the CPU) is slower than encoding in hardware (like on a GPU's dedicated encoding circuitry), you could still use your old graphics card to hardware-encode in an older codec like HEVC at a higher bitrate, or even record raw video (provided you have enough disk space), so that you don't get those high CPU usage percentages while gaming. Then, once you are done recording, you can convert that HEVC video to AV1 overnight using a software encoder at a lower bitrate. This way you get low CPU usage while recording your gaming session but also get to enjoy the benefits AV1 has to offer, albeit later.
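
Roughly like this, as a Python/ffmpeg sketch (hevc_nvenc is just one example hardware encoder; paths, bitrates, and CRF are placeholders):

import subprocess

# Step 1, while gaming: let the GPU's HEVC encoder take the load, at a
# generous bitrate so quality survives the later re-encode.
subprocess.run([
    "ffmpeg", "-i", "capture_input.mkv",   # placeholder capture source
    "-c:v", "hevc_nvenc", "-b:v", "50M",   # hardware HEVC, high bitrate
    "-c:a", "copy",
    "session_hevc.mkv",
], check=True)

# Step 2, overnight: re-encode the HEVC master to AV1 in software at a
# much lower bitrate, when CPU load no longer matters.
subprocess.run([
    "ffmpeg", "-i", "session_hevc.mkv",
    "-c:v", "libsvtav1", "-preset", "6", "-crf", "35",
    "-c:a", "copy",
    "session_av1.mkv",
], check=True)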

Also, AV1 decoding isn't as taxing as encoding; this is true for most codecs, including H.264/H.265. While it is true that most systems might not yet support AV1 hardware decoding, it is not strictly necessary to decode in hardware. It can be done in software as well, and that's as simple as downloading a codec. So you don't need a new device to decode AV1: if it can already decode H.264/H.265 well, then it should have little problem decoding AV1.

noblessus

Great video, very useful information. Just a little correction: the Ryzen 5900X has 12 cores and 24 threads. I have the same processor, and it's sad that AV1 takes so many resources from the processor.

stawsky

I use AV1 transcoding for video streaming from my home media server, and all H.264 content is being converted into AV1 automatically now. H.265 is "good enough", but half the space for the same quality as H.264 is only part of the benefit.
If you are streaming multiple videos at once to several family members while away from the house, that compression really matters.
Also, AV1 playback is much less demanding than encoding, so lots of devices support it.
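
The automation is nothing fancy; a rough Python sketch of the idea (assuming ffmpeg/ffprobe with libsvtav1 on PATH; the library path and settings are placeholders):

import subprocess
from pathlib import Path

LIBRARY = Path("/srv/media")  # placeholder media-library root

for src in LIBRARY.rglob("*.mp4"):
    dst = src.with_suffix(".av1.mkv")
    if dst.exists():
        continue  # already converted
    # Ask ffprobe for the video codec; only touch H.264 files.
    codec = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", str(src)],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    if codec != "h264":
        continue
    subprocess.run(
        ["ffmpeg", "-i", str(src), "-c:v", "libsvtav1",
         "-preset", "8", "-crf", "32", "-c:a", "copy", str(dst)],
        check=True,
    )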

syspowertools

GPU encoding has left CPU encoding in the dirt... AV1 encoding is not meant to be done on a CPU if you want real-time results. It's the same with video games: they used to run on the CPU alone, until graphics cards started to help with the load. Nowadays you simply wouldn't imagine playing a game like Cyberpunk without a graphics card. The same has happened to real-time video encoding. I'm sure your 5900X can stream with the x264 'fast' preset just fine, but unless CPUs get specific hardware acceleration for AV1, it's simply not a good idea for real-time video encoding.

This video feels like a "meh, I can't use the new technology with what I already have." I don't really understand why this is surprising.

Also, a lot of hardware already supports AV1 decoding, including phones. The tech world has been preparing for its mass adoption for a couple of years now.

Morris-