I was looking at the Wikipedia page for Quick Sync and noticed:
Version 7 (Ice Lake)
Ice Lake adds VP9 4:4:4 decoding, VP9 encoding (up to 10-bit and 4:4:4), HEVC 4:2:2 and 4:4:4 decoding and encoding, HDR10 Tone Mapping and Open Source Media Shaders. HEVC hardware encoding quality has also been improved.
Does that mean we need Ice Lake for HW tone mapping, or does Plex do it differently?
So, as I’ve learned, my low-end Gemini Lake system can’t do HDR tone mapping well, though it handles UHD transcoding just fine. Given the native tone mapping support in 11th-gen graphics, is it fair to assume that Tremont-based systems would make a killer low-powered Plex box (i.e. basically able to handle anything one would really want to play today, to any client), or could they have a similar issue to my Gemini Lake system?
I have a Kaby Lake. It is capable of 6 simultaneous HW transcodes with tone mapping.
All of my videos are Blu-ray rips (ripped from my own media).
There must be another issue. Burning subtitles, perhaps? Image-based subtitles are problematic for any CPU because of the process:
HW decode
CPU burn subtitle into raw image
HW Tone map
HW encode
You will run out of “CPU” long before you run out of “QSV ASIC” when subtitles are involved at high (50+ Mbps) bitrates.
That having been said, with no subtitles involved, unless you’re trying to transcode more than 8 concurrent streams you should have more than enough QSV ASIC bandwidth in that Gemini Lake.
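To illustrate why the burn-in step is the choke point: a transcode pipeline runs at the speed of its slowest stage, and inserting a CPU compositing stage between HW stages caps the whole thing at the CPU’s rate. This is a hypothetical sketch, and the throughput numbers are invented for illustration, not measured on any real hardware:

```python
# Illustrative only: stage names follow the pipeline described above,
# but the frames-per-second figures are made-up placeholder values.
pipeline = [
    ("HW decode",            "QSV ASIC", 600),
    ("CPU subtitle burn-in", "CPU",       80),  # raw-frame compositing on the CPU
    ("HW tone map",          "QSV ASIC", 400),
    ("HW encode",            "QSV ASIC", 300),
]

def bottleneck(stages):
    """Return the slowest stage; it caps the throughput of the whole pipeline."""
    return min(stages, key=lambda stage: stage[2])

name, unit, fps = bottleneck(pipeline)
print(f"Pipeline capped at {fps} fps by '{name}' (runs on the {unit})")
```

With numbers like these, the QSV stages sit mostly idle while the CPU stage throttles everything, which is exactly the “run out of CPU before QSV ASIC” effect.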
It could be that Gemini Lake isn’t as well supported (it’s not as well covered by the Intel compute runtime as Kaby Lake itself). It handles non-tone-mapping transcoding with ease (I literally did nothing besides enable it), but I can’t use tone mapping at all, and that’s with no subtitles whatsoever, let alone burned-in ones. Playback just hangs for a long time, plays a few moments of video/audio, then hangs again for an extended period (rinse and repeat).
Looking at intel_gpu_top (and regular top), the GPU is much less busy than in the non-tone-mapping case (as if it’s not being fed), but the Plex Transcoder process is using more than a full core (130% or so) vs. around 30% in the non-tone-mapping case. To me that looks like it’s trying to software tone map for some reason. I don’t know how to debug this; any information I should provide?
I tried both the library packaged in Ubuntu’s apt repo and the .debs from Intel’s GitHub page (latest release) and didn’t see a difference.
OK, so I’ve played with this a bit more, and it seems tone mapping works perfectly fine on my Gemini Lake box (playing to an Nvidia Shield TV 2019 Pro, and for testing purposes to a Sony X940E TV).
But this is only when I select an explicit transcoding profile (i.e. 1080p medium, etc.). If I select “convert automatically” it just hangs with HDR tone mapping enabled, and that is what I was doing before: always selecting “convert automatically”.
When OpenCL is used for the tone mapping, is the GPU asked to do that generically (as opposed to via QSV-specific instructions)? Or does it become a raw CPU compute task?
I have an i7-8700 and used to run Plex on Linux but never had great success with HEVC HDR transcoding to lower bit rate SDR. Will have to rebuild and try again. Right now I’m on Windows and simply have tone mapping disabled.
Per GitHub - intel/media-driver, only the following support HDR10TM as a hardware-accelerated feature:
ICL (Ice Lake)
TGLx (TGL: Tiger Lake, RKL: Rocket Lake, ADL-S/P: Alder Lake)
Interestingly, although JSL (Jasper Lake) / EHL (Elkhart Lake) have 11th-gen GPUs, they don’t support the same feature.
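The support list above boils down to a small lookup table. Here is a convenience sketch of it (the codenames come from the media-driver list quoted above; the `GLK` entry and the helper function are my own additions for illustration, not anything the driver exposes):

```python
# HW-accelerated HDR10 tone mapping (HDR10TM) per the intel/media-driver
# feature list quoted above. JSL/EHL lack it despite having Gen11 GPUs.
HDR10TM_SUPPORT = {
    "ICL":   True,   # Ice Lake
    "TGL":   True,   # Tiger Lake
    "RKL":   True,   # Rocket Lake
    "ADL-S": True,   # Alder Lake S
    "ADL-P": True,   # Alder Lake P
    "JSL":   False,  # Jasper Lake: Gen11 GPU, but no HDR10TM
    "EHL":   False,  # Elkhart Lake: same story
    "GLK":   False,  # Gemini Lake: not in the supported list (my addition)
}

def supports_hw_tone_mapping(platform: str) -> bool:
    # Unknown codenames default to False (assume no HW tone mapping).
    return HDR10TM_SUPPORT.get(platform.upper(), False)

print(supports_hw_tone_mapping("GLK"))  # False: tone mapping must go elsewhere
```

The notable gotcha this captures is that GPU generation alone doesn’t tell you whether HDR10TM is there: Jasper Lake and Elkhart Lake carry Gen11 graphics yet still come up False.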
I know VERY little about Windows. What I do know -
The Windows server version is 32 bit and not 64 bit;
Windows actually handles part of the transcoding (a DXVA2 (?) device is what I’ve seen in logs for decoding, but QSV itself for encoding);
Your having tone mapping disabled on Windows does not surprise me.
I just helped a friend, who’s not Linux-savvy at all, set up Ubuntu on an NUC9QNX (i9) box. He’s getting along very well with it.
I run an i7-8809 (the 8700 w/ AMD Radeon embedded as well)
You’ve expressed HDR → SDR at low bit rate.
Bitrate of the source material?
Target bitrate ?
I wouldn’t be too quick to jump on the latest from Intel. Linux support generally lags behind Windows for these things. In December, Intel added Alder Lake to their compute runtime (ICR) support and broke everything below it.