No, we’re definitely talking about the same thing. Your assertion is that decoding HEVC is harder than ENCODING H.264. My assertion is that it is not.
The numbers you’re showing above are dealing with different media formats; 4K HDR and 4K SDR are the only two instances of HEVC in that list.
A 1080p HEVC file (one you’ve encoded yourself) is easier to decode than that same file is to encode into x264.
Edit: To clarify about the FPS drop: if something is harder to decode, putting more strain on the CPU, it will show up as an FPS drop when the CPU is already taxed at 100%. Since Handbrake’s job is to use as much CPU as possible to get the job done, if one were harder to decode than the other, it would translate into a drop in FPS, since more CPU cycles would be given over to the decoding side of the process.
x264 -> x264
x265 -> x264
One of the main reasons it’s harder to transcode 4K is the sheer number of pixels.
1920x1080 = 2,073,600
3840x2160 = 8,294,400
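That’s 8,294,400 ÷ 2,073,600 = 4, so every 4K frame carries exactly four times the pixels of a 1080p frame before the codec choice even comes into it.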
But when you’re dealing with a file at the same resolution, and the only difference is the codec, it’s easier to decode the file out of x265 than it is to encode that file into x264.
I updated the post above to include the x265 → x264 vs. x264 → x264 comparison, and again, no real difference. In fact, if anything, the x265 source pulls ahead in every test so far.
Plex’s encoder, as far as I’m aware, works the same as any other encoder: it will use as much of the CPU as possible, except that it only does so in short bursts until it fills its buffer, no?
Sure there can: you play the file, it decodes, and you watch how much CPU is being used. If it’s a film-based file, we know it’s playing at 24 frames per second (23.976). In this case, CPU usage sits at 3-5%. Then you encode that same file into x264, and you get 66 frames per second at 100% CPU usage.
To get up to 100% CPU usage on playback, the file would have to run at roughly 480 fps. Set that 480 fps of decode headroom against the 66 fps the encode manages at 100% CPU, and decoding x265 works out to roughly 7 times cheaper than encoding x264 at the same resolution.
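If you want to put numbers on that with one tool instead of eyeballing Task Manager, here’s a rough sketch of how I’d script it (assuming ffmpeg is on your PATH; the filename, preset, and bitrate are just placeholders, not the exact settings from my tests): time a decode-only pass, then time a full x264 encode of the same file.

```python
import subprocess, time

SOURCE = "input_1080p_hevc.mkv"  # placeholder: any 1080p HEVC file you want to test with

def timed_pass(label, extra_args):
    """Run one ffmpeg pass over the source, discard the output, report wall-clock time."""
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, *extra_args, "-f", "null", "-"],
        check=True, capture_output=True,
    )
    print(f"{label}: {time.time() - start:.1f}s")

# Decode only: read and decode every frame, then throw the pictures away.
timed_pass("decode 1080p HEVC", [])

# Decode + encode: same decode, but every frame also goes through libx264.
timed_pass("decode HEVC + encode x264",
           ["-an", "-c:v", "libx264", "-preset", "medium", "-b:v", "8M"])
```

Divide the clip’s frame count by each of those times and you get a decode fps and an encode fps on the same hardware, which is the same comparison as the 66 fps figure above.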
The process of decoding anything is much easier than the process of encoding, because encoding is where the real math happens. Decoding is where you have the answer key and you’re just displaying it.
Yeah, I don’t know how you’re arriving at 66 fps at 100%, or why it’s relevant, because that is going to be entirely dependent on the CPU itself, and there are vast differences in raw math speed across generations.
Here is what I see, with no HW decode or encode…
4K HEVC → x264 = kills people’s CPUs
1080p HEVC → x264 = kills people’s CPUs a little less
1080p x264 → any x264 = people’s CPUs kicking back, smoking a cigarette.
When I decode (playback) a 1080p HEVC file, we’re looking at 3-5% CPU usage with hardware decode disabled.
When I encode that same file, it encodes at 66 frames per second, maxing out my processor at 100%.
This has more to do with the number of pixels than it does with HEVC.
This is not my experience. My entire library is x265 now; I made that choice based on testing between the two, found nearly no difference in transcoding processor usage, and decided the benefits outweighed the minimal drawbacks.
Like I said, there’s nearly no difference on my machine in transcoding speed whether the input file is x264 or x265. It’s only when you transcode INTO x265 vs. INTO x264 that there’s a real difference in speed (as you can see above, the x265 encodes ran at half the speed of the x264 encodes).
Edit: But we’re not really talking about CPU A vs. CPU B. We’re talking, specifically, about decoding versus encoding. Decoding 1080p HEVC is not nearly as intensive as encoding 1080p x264. There’s about a 2% difference in decoding x264 over x265 on my processor: where an x264 movie decodes in the 1-3% range, an x265 movie decodes in the 3-5% range. That’s it. Which was the point I was making way up in the thread. Decoding is a relatively easy job, so a CPU can handle decoding x265 as long as the GPU is handling the encode into x264 or x265, and things will go smoothly. It’s the encoding that is the real pain in the rear, and it’s what needs the horsepower in most cases.
Now, going from 2160p HEVC → 1080p x264 is quite the task, but so is 2160p x264 → 1080p x264, because the input file has over 8 million pixels per frame to decode, and the CPU then needs to decide which ones to keep and which ones to throw away. Input resolution is the largest part of the problem when it comes to decoding, not the chosen codec.
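If you want to see that split out, here’s roughly how I’d test it with ffmpeg (just a sketch; the filename, preset, and bitrate are placeholders): time the 4K decode on its own, then decode plus downscale, then the full transcode, and see where the time actually goes.

```python
import subprocess, time

UHD_SOURCE = "movie_2160p_hevc.mkv"  # placeholder: any 2160p HEVC file

def bench(label, extra_args):
    """Time one ffmpeg pass over the 4K source, discarding the output."""
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", UHD_SOURCE, *extra_args, "-f", "null", "-"],
        check=True, capture_output=True,
    )
    print(f"{label}: {time.time() - start:.1f}s")

bench("decode 2160p only", [])
bench("decode + scale to 1080p", ["-vf", "scale=1920:-2"])
bench("decode + scale + encode x264",
      ["-vf", "scale=1920:-2", "-an", "-c:v", "libx264", "-preset", "medium", "-b:v", "8M"])
```

Whichever step adds the most time is where the horsepower is actually going.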
Your CPU is already powerful enough to decode HEVC in real time or faster.
I suspect the disconnect between us is this: when using CPUs that cannot decode HEVC in real time, the load of decoding HEVC is greater than the load of encoding x264.
I don’t think either one of us is going to convince the other at this point.
In my mind, we need to isolate and benchmark each thing…
The problem is, decode load depends on the complexity of the encode (which goes back to quality vs. size).
Do you compare the decode/encode load and FPS of HEVC at a 1 GB file size vs. x264 at a 1 GB file size?
Or compare the encode/decode load and FPS of HEVC at quality X vs. x264 at quality X (and can quality X ever really be equal between the two)?
I mean, you can set x264 encoding anywhere from basically no compression at all to "take all night, see you in the morning" quality.
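For what it’s worth, the "same file size" version of that comparison is easy enough to script. This is just a sketch (assuming an ffmpeg build with both libx264 and libx265; the filenames and bitrate are placeholders): encode the same clip to both codecs at the same bitrate, then time a decode-only pass over each result.

```python
import subprocess, time

SOURCE = "test_clip.mkv"  # placeholder source clip
BITRATE = "8M"            # same target bitrate for both, standing in for "same file size"

ENCODES = {"clip_x264.mkv": "libx264", "clip_x265.mkv": "libx265"}

# Encode the same clip to both codecs at the same bitrate.
for out, codec in ENCODES.items():
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-an", "-c:v", codec,
         "-preset", "medium", "-b:v", BITRATE, out],
        check=True, capture_output=True,
    )

# Time a decode-only pass over each result.
for out in ENCODES:
    start = time.time()
    subprocess.run(["ffmpeg", "-i", out, "-f", "null", "-"],
                   check=True, capture_output=True)
    print(f"decode {out}: {time.time() - start:.1f}s")
```

Swapping -b:v for -crf values would give you the "quality X" variant instead, with the usual caveat that a given CRF number doesn’t mean the same quality in x264 and x265.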
Anyway, I’m done for the night.
You’ve given me food for thought.
If I get some time, I will try to measure how well (or poorly) 1080p HEVC decodes on my CPU, and how that compares to various 1080p encodes.
Hehe, no worries man. I like the debate, and it’s always good to have some back and forth.
This doesn’t seem to matter as much, to be honest.
I’ve tried everything from 1080p x264 @ 8 Mbit and x265 @ both 4 and 8 Mbit, all the way up to 2160p at similar intervals (original bitrate vs. 70 Mbit vs. 35 Mbit), leaving all other settings at their defaults. The conclusion I came to after transcoding at a bunch of different resolutions and bitrates was that the biggest factor in encoding speed (given all other settings are the same) is input resolution.
I can bust out StaxRip and a UHD Blu-ray and run a bunch of tests if you want, something like the following:
2160p HEVC HDR Original Bitrate → 2160p x264 Tonemapped Double Bitrate
2160p HEVC HDR Original Bitrate → 2160p x264 Tonemapped Same Bitrate
2160p HEVC HDR Original Bitrate → 2160p HEVC HDR 1/2 Bitrate
Or whatever combinations you want, really, and I’ll capture all of the numbers for you. I like experimenting and trying new things out.