Server Version#: 1.24.2.4973
Player Version#: Unknown, latest version on Roku Premiere+
Hi,
I’ve had a successful Plex server for more than 4 years. Two remote users with high-speed connections Direct Play every piece of content, from 720p to 4K HDR, and it works flawlessly.
Lately though, I have started to see this: [screenshot: sessions showing Transcode (hw) instead of Direct Play]
The CPU usage doesn’t seem to rise much, so is anything really being transcoded? What gives?
Thanks in advance.
Thank you for your reply. However, that is not the case; everything is set to original quality. A 4K file will transcode to 4K (hw), a 1080p file to 1080p (hw), etc. This behavior is new, and no hardware or settings have been tweaked since it started, on either the server side or the player side.
Does a hardware transcode mean that the server’s video card is doing the work?
You’ll need to post your server logs from when you’re playing a video, and the Roku client log would probably be good too. They’ll say why the server chose to transcode instead of direct playing.
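If it helps while digging through them, here’s a minimal sketch for filtering the server log down to the lines that usually explain the playback decision. The log path is the default Windows location, and the “MDE”/“transcode” markers are assumptions about what the relevant lines contain; adjust both for your install:

```python
# Sketch: pull media-decision lines out of the Plex Media Server log.
# Assumptions: default Windows log location, and that decision lines
# mention "MDE" (Media Decision Engine) or "transcode". Adjust as needed.
from pathlib import Path

LOG = Path.home() / "AppData/Local/Plex Media Server/Logs/Plex Media Server.log"

for line in LOG.read_text(encoding="utf-8", errors="replace").splitlines():
    if "MDE" in line or "transcode" in line.lower():
        print(line)
```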
All right, thanks. I understand that there isn’t a Plex update or Windows update that is known to cause such changes? I will see about posting logs and such; perhaps I can figure it out myself if the information they contain is revealing. Thank you.
Is the bitrate of the file higher than the upload speed you set under Remote Access? If it is, or is really close to it, that will force a transcode. The H.264 level set in the app can also force a transcode if it isn’t set properly or the file is encoded at a higher level than the player supports.
Good insight; neither is the case here, though. The upload speed is set rather high, and in the picture example I posted, the file is a mere 720p 2 Mbps file. I checked the H.264 level of the file(s), and they match the player’s max level at High 4.1.
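For reference, this is roughly the kind of check I mean: a quick sketch that reads a file’s bitrate and H.264 profile/level, assuming ffprobe from FFmpeg is on PATH (the file name is just an example):

```python
# Sketch: report a file's overall bitrate and H.264 profile/level via ffprobe.
# Assumes ffprobe (from FFmpeg) is installed and on PATH.
import json
import subprocess

def probe(path: str) -> None:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,level:format=bit_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    data = json.loads(out)
    stream = data["streams"][0]
    # ffprobe reports the H.264 level as an integer, e.g. 41 for level 4.1.
    level = stream.get("level")
    bitrate_mbps = int(data["format"]["bit_rate"]) / 1_000_000
    print(f"{stream['codec_name']} {stream.get('profile')} "
          f"level {level / 10 if level else '?'} at {bitrate_mbps:.1f} Mbps")

probe("example.mkv")
```

If the reported level were above what the Roku app is set to accept, or the bitrate were near the configured upload limit, that would point back to the causes described above.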
Here is a little piece of the log where I believe the transcode starts. Do you guys see any red flags?