If the client is set to Convert Automatically, Plex chooses the wrong codec, but not if set manually

Server Version#: 1.41.8.9834
Player Version#: Apple TV

My wife was complaining that movies were erroring out when she tried to watch a high-bitrate movie at our lake house. I have the latest-gen Apple TV set up there, and it streams from our house in the city. My server is a dual-Xeon machine with 384 GB of RAM and two GTX 1080 Ti GPUs, running on Ubuntu 24.04, so no Docker involved. Plex is set so that if it needs to transcode, it transcodes everything to HEVC.

I thought it was working fine, since I tested it by manually selecting resolutions at the lake house and it showed HW encoding with HEVC. But whenever my wife complained, I would check the dashboard and see that Plex had chosen H.264 and for some reason couldn't keep up, even though the card dedicated to Plex hardly flinches. The Apple TVs at the lake house are set to Convert Automatically so she doesn't have to do anything. When I manually changed it to Convert to 4K High, lo and behold, it transcoded to HEVC and the movie played fine.

So I ran a test, since I thought it might have to do with subtitles. I uploaded a very-high-bitrate 4K HEVC game recording I had that has no subtitles or other fluff. She tried it and the same thing happened: on Automatic, Plex chooses H.264 and playback stops after a few seconds, but if she picks a resolution herself, Plex transcodes in HEVC.
So it can't be subtitles. I have the bandwidth to direct stream, but some of my DVD rips are freaking huge, and I just want to make the experience seamless for her and the kids.
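In case anyone wants to check the same thing without watching the dashboard, something like this against the server's /status/sessions endpoint shows what the transcoder decided for each active stream. This is a minimal sketch: PLEX_URL and PLEX_TOKEN are placeholders for your own server address and token, and the exact attribute names can vary a little between server versions.

```python
# Minimal sketch: ask the server which codec the transcoder picked for
# each active session. PLEX_URL and PLEX_TOKEN below are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

PLEX_URL = "http://127.0.0.1:32400"   # your server's address
PLEX_TOKEN = "YOUR_X_PLEX_TOKEN"      # see Plex's docs on finding your token

url = f"{PLEX_URL}/status/sessions?X-Plex-Token={PLEX_TOKEN}"
with urllib.request.urlopen(url) as resp:
    root = ET.fromstring(resp.read())

for video in root.iter("Video"):
    title = video.get("title", "?")
    for ts in video.iter("TranscodeSession"):
        # videoDecision is e.g. "transcode" or "copy"; videoCodec is the
        # codec the transcoder is producing (h264 vs hevc is the question
        # here); transcodeHwRequested indicates HW encoding was asked for.
        print(
            f"{title}: decision={ts.get('videoDecision')}, "
            f"{ts.get('sourceVideoCodec')} -> {ts.get('videoCodec')}, "
            f"hw={ts.get('transcodeHwRequested')}"
        )
```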
Can anyone think of why Plex would do this?
If you need any more info, let me know.

Thanks.

Not sure, but I'll file a report for the devs and QA to check. If you can grab your server logs to add to the report, it would be appreciated.
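On a standard bare-metal Ubuntu install, the logs live under the server's default data directory. Here's a rough sketch to zip them up, assuming the default path; adjust it if you've relocated the data directory, and run it as a user that can read it.

```python
# Rough sketch: zip up the Plex server logs on a default (non-Docker)
# Ubuntu install. The path below is the standard data location; it will
# differ if PLEX_MEDIA_SERVER_APPLICATION_SUPPORT_DIR was changed.
import shutil
from pathlib import Path

LOGS = Path(
    "/var/lib/plexmediaserver/Library/Application Support/"
    "Plex Media Server/Logs"
)

# Writes plex-logs.zip in the current directory; may need sudo or the
# plex user to have permission to read the directory.
shutil.make_archive("plex-logs", "zip", str(LOGS))
print("Wrote plex-logs.zip")
```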

Does it only do this on Apple TV?
