I was looking at transcodes on my server and wondering why Plex doesn’t transcode to HEVC/H.265 to optimize bandwidth/quality, especially since I have hardware transcoding?
What Plex transcodes to depends on what is supported by the client device…
But I’ve seen my Fire Stick 4K (which supports HEVC) play movies transcoded down from 20 Mbps to 2 Mbps while still using H.264 instead of switching to H.265.
Plex transcodes all video to H.264. 4K is transcoded to 1080p or lower, depending on client settings.
Two reasons:

- Without hardware acceleration, real-time transcoding to H.265 is a big challenge for the CPU. Even on a reasonably high-end processor, you wouldn’t get many simultaneous transcodes.
- Newer graphics hardware can handle accelerated encoding to H.265, which would surely allow for more simultaneous encodes (though I don’t know how many; there’s a rough sketch below). Plex devs haven’t treated this feature as a priority in previous threads about the topic. You can see for yourself right here:
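Separately, for a rough sense of that software-vs-hardware gap, here’s a minimal sketch that shells out to ffmpeg directly. Assumptions: ffmpeg is on the PATH, the input filename is hypothetical, and the hevc_nvenc case needs an NVIDIA GPU with HEVC encode support; Plex’s internal transcoder uses its own settings.

```python
import subprocess
import time

SOURCE = "sample_4k.mkv"  # hypothetical input file

def encode(codec_args, output):
    """Downscale to 1080p at ~8 Mbps with the given video codec and time it."""
    cmd = [
        "ffmpeg", "-y", "-i", SOURCE,
        "-vf", "scale=1920:-2",   # 4K -> 1080p, preserve aspect ratio
        "-b:v", "8M",             # target video bitrate
        *codec_args,
        "-c:a", "copy",           # leave audio untouched
        output,
    ]
    start = time.time()
    subprocess.run(cmd, check=True)
    print(f"{output}: {time.time() - start:.1f}s")

# Software x265: excellent quality per bit, but very CPU-hungry in real time.
encode(["-c:v", "libx265", "-preset", "medium"], "out_x265.mkv")

# NVENC HEVC: much faster, leaving the CPU free for more simultaneous streams.
encode(["-c:v", "hevc_nvenc", "-preset", "p5"], "out_nvenc.mkv")
```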
Didn’t see that feature request. At least the issue has been identified. Thanks
How may I be of assistance?
One can optimize for bandwidth or optimize for quality. Please choose one and only one as they are mutually exclusive.
Not necessarily, as I see it. At a fixed bandwidth, an HEVC stream should look better than an H.264 stream. If the client supports HEVC and the stream is already being transcoded, why not transcode to HEVC if the CPU can handle it via hardware encoding?
Just hoping that, between the server and client, Plex uses HEVC when it can.
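To put a number on “should look better”: a commonly cited rule of thumb is that HEVC reaches similar quality at roughly half the H.264 bitrate. That factor is an assumption (real savings vary with content and encoder, and it’s not a figure Plex publishes), but it makes the tradeoff concrete:

```python
# Rule-of-thumb comparison; the 0.5 factor is assumed, not Plex-published.
HEVC_EFFICIENCY = 0.5  # HEVC bitrate needed for ~H.264-equivalent quality

def equivalent_hevc_mbps(h264_mbps: float) -> float:
    """Bitrate at which HEVC roughly matches a given H.264 stream's quality."""
    return h264_mbps * HEVC_EFFICIENCY

for h264 in (2, 4, 8, 12):
    print(f"H.264 at {h264} Mbps  ~  HEVC at {equivalent_hevc_mbps(h264):.0f} Mbps")
```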
Based on:

- the client supporting the HEVC stream (HDR presumed), and
- the video stream not being altered in any way,

25 Mbps of HEVC will look better than 25 Mbps of H.264. There is no argument there.
Whenever data is transcoded, information is lost. The codecs are not lossless; previously crisp edges become blurred.
At what point, in a home environment with gigabit Ethernet at the ready and 100 Mbps-capable Ethernet adapters in the televisions, does it become necessary to save bandwidth?
While the peak video bitrate of a stream might be 135 Mbps, that is not a sustained bitrate.
So why is it necessary to transcode just to save bitrate?
Call me a purist if you wish, but if I have 2160p rips, I want to see them as originally encoded on professional equipment. I don’t want to watch a second-generation encode, because I can see the difference on my television and, to me, it’s junk. I also have 50 TB of online storage for media, plus another 50 TB as cold storage.
I can play any of my rips to every device in the house concurrently and never run out of bandwidth.
I assert that maybe the problem isn’t bitrate but rather infrastructure.
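For what it’s worth, the arithmetic behind that assertion is simple. A minimal sketch: the 135 Mbps peak is the figure cited above, while the 50 Mbps “typical” sustained rate for a UHD rip is an assumption.

```python
# Concurrent full-bitrate streams a gigabit home LAN can carry.
LINK_MBPS = 1000        # gigabit Ethernet
PEAK_UHD_MBPS = 135     # worst-case peak bitrate cited above
TYPICAL_UHD_MBPS = 50   # assumed sustained rate for a UHD rip

print(f"Worst case: {LINK_MBPS // PEAK_UHD_MBPS} simultaneous streams")
print(f"Typical:    {LINK_MBPS // TYPICAL_UHD_MBPS} simultaneous streams")
```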
I can’t speak for him, but for me hardware HEVC encoding support is most useful over LTE (limited monthly data allotment) and coffee shop/hotel Wi-Fi (traffic shaping that limits bandwidth). If HEVC is available, the transcoded video quality should be better for the same number of bits used.
For me, home transcoding is nearly irrelevant. Heck, I leave my TV recordings in MPEG2, now that almost every TV-connected device can direct play it.
Maybe it comes down to use case?
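A quick sketch of why that matters on metered connections: the 15 GB cap below is a hypothetical allotment, and the 4-vs-2 Mbps pairing leans on the same ~50% HEVC rule of thumb as above.

```python
# Viewing hours a monthly data cap buys at a given stream bitrate.
DATA_CAP_GB = 15  # hypothetical monthly LTE allotment

def hours_of_viewing(bitrate_mbps: float, cap_gb: float = DATA_CAP_GB) -> float:
    """Hours of streaming: cap in megabits divided by megabits per hour."""
    cap_megabits = cap_gb * 8 * 1000
    return cap_megabits / (bitrate_mbps * 3600)

# H.264 at 4 Mbps vs. HEVC at ~2 Mbps for roughly the same picture quality.
for mbps in (4, 2):
    print(f"{mbps} Mbps -> {hours_of_viewing(mbps):.1f} h on a {DATA_CAP_GB} GB cap")
```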
I have 1.5 Mbps upload. External streaming is impossible. Not even HEVC can help me haha.
Yeah, I’m not referring to home transcoding, as most things (except after the last Fire Stick update) direct play. The transcodes are really needed for remote plays. If it’s going to transcode, shouldn’t it use the best codec (H.265)?
I agree. This would be really helpful when you want to remote play. H.265 would give you better quality for the same bandwidth. Right now most of my family can only play movies at 720p because of bandwidth limitations.