Until the latest update, this was not an issue. Everything streaming from my server is now transcoding to SD resolution or low-bitrate 720p. I hadn’t changed any settings to cause this (I wasn’t even at my computer!). Previously my server handled 5-8 1080p streams without breaking a sweat. Not sure what’s up, and I’m in way over my head digging around in the logs (attached below).
Server Version#: 1.18.7.2457
Internet speed: 900 Mbps
Use hardware acceleration when available: Enabled
Use hardware-accelerated video encoding: Enabled
Limit remote stream bitrate: Original (no limit)
Maximum simultaneous video transcode: Unlimited
I've also noticed, going back many versions (to a beta), that almost everything plays at a lower bitrate/resolution. I can't really figure out why; maybe the problem is automatic quality? Even with a single stream it chooses a low resolution/bitrate.
Have you checked the video quality settings in your client apps?
According to the OP’s logs, the transcoding happens because the average bitrate exceeds the limit set by the client (which in this case is 2 Mbit/s).
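For illustration only – this isn’t Plex’s actual decision code, just the gist of it: the client advertises a maximum streaming quality, and the server transcodes whenever the source’s average bitrate exceeds that limit. The 20,000 kbps figure below is just a ballpark for a high-bitrate 1080p file:

```python
# Minimal sketch (not Plex's real logic) of the quality decision:
# direct play if the source fits under the client's limit, else transcode.

def playback_decision(source_kbps: int, client_limit_kbps: int) -> str:
    """Return the expected server decision for a source/limit pair."""
    if source_kbps <= client_limit_kbps:
        return "direct play"
    return f"transcode down to <= {client_limit_kbps} kbps"

# A high-bitrate 1080p file (~20,000 kbps) against the 2 Mbit/s
# (2,000 kbps) remote default will always transcode:
print(playback_decision(20_000, 2_000))   # transcode down to <= 2000 kbps
print(playback_decision(8_000, 12_000))   # direct play
```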
All (most?) clients have a setting for video streaming quality. Usually this is set to Original/Max quality for home streaming and 2 Mbit/s for remote/internet streaming (I believe this default was recently bumped to 3 Mbit/s, but I cannot confirm that at the moment).
Another potential bottleneck is the bandwidth limit of an indirect (relayed) connection, which kicks in when your remote access isn’t working properly (1 Mbit/s, or 2 Mbit/s for Plex Pass members). That scenario doesn’t appear in the screenshots the OP provided, though.
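To make the relay case concrete, here’s a hedged sketch (again, not Plex’s code): when the connection is relayed, the relay cap becomes a second ceiling, so the effective limit is the minimum of the client’s quality setting and the relay cap.

```python
# Sketch of the relay bottleneck: the effective bitrate limit is the
# lower of the client's quality setting and the relay's bandwidth cap
# (1 Mbit/s, or 2 Mbit/s with Plex Pass).

def effective_limit_kbps(client_limit_kbps: int, relayed: bool,
                         plex_pass: bool = False) -> int:
    relay_cap_kbps = 2_000 if plex_pass else 1_000
    if relayed:
        return min(client_limit_kbps, relay_cap_kbps)
    return client_limit_kbps

print(effective_limit_kbps(20_000, relayed=False))                 # 20000
print(effective_limit_kbps(20_000, relayed=True))                  # 1000
print(effective_limit_kbps(20_000, relayed=True, plex_pass=True))  # 2000
```

So even with the client set to Original/Max, a relayed session alone would force a transcode down to 1-2 Mbit/s.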