pshanew:
Nah, it’s using hardware-accelerated decoding (which is all that matters in this case). Specifically, via the VideoToolbox API:
(460032) 🔧 PMKMPVClient.m:519 | [cplayer] v: List of enabled features: asm audiounit avfoundation bsd-fstatfs build-date cplugins debug-build ffmpeg gl glob glob-posix iconv ios-gl lcms2 lgpl libass libdl libm libmpv-shared optimize osx-thread-name plain-gl posix posix-or-mingw pthreads stdatomic tvos vector videotoolbox-hwaccel zlib
(459977) 🔧 PMKMPVClient.m:519 | [libmpv_render] v: Loading hwdec driver 'videotoolbox'
(459927) 🔧 PMKMPVClient.m:519 | [vd] v: Looking at hwdec h264-videotoolbox...
(459927) 🔧 PMKMPVClient.m:519 | [vd] v: Trying hardware decoding via h264-videotoolbox.
(459976) 🔧 PMKMPVClient.m:519 | [vd] v: Pixel formats supported by decoder: videotoolbox_vld yuv420p
(459976) 🔧 PMKMPVClient.m:519 | [vd] v: Requesting pixfmt 'videotoolbox_vld' from decoder.
(459673) 🔧 PMKMPVClient.m:519 | [ffmpeg/video] debug: h264: Reinit context to 720x416, pix_fmt: videotoolbox_vld
(459790) 🔧 PMKMPVClient.m:519 | [vd] v: Pixel formats supported by decoder: videotoolbox_vld yuv420p
(459790) 🔧 PMKMPVClient.m:519 | [vd] v: Requesting pixfmt 'videotoolbox_vld' from decoder.
(459964) 🔧 PMKMPVClient.m:519 | [ffmpeg/video] debug: h264: Reinit context to 720x416, pix_fmt: videotoolbox_vld
(459964) ➖ PMKMPVClient.m:511 | [vd] info: Using hardware decoding (videotoolbox).
(459964) 🔧 PMKMPVClient.m:519 | [vd] v: Decoder format: 720x404 [404:405] videotoolbox[nv12] auto/auto/auto/auto/auto CL=mpeg2/4/h264
(459964) 🔧 PMKMPVClient.m:519 | [vf] v: [in] 720x404 [359:360] videotoolbox[nv12] bt.601/bt.709/bt.1886/limited/display SP=1.000000 CL=mpeg2/4/h264
(459964) 🔧 PMKMPVClient.m:519 | [vf] v: [userdeint] 720x404 [359:360] videotoolbox[nv12] bt.601/bt.709/bt.1886/limited/display SP=1.000000 CL=mpeg2/4/h264
(459964) 🔧 PMKMPVClient.m:519 | [vf] v: [autorotate] 720x404 [359:360] videotoolbox[nv12] bt.601/bt.709/bt.1886/limited/display SP=1.000000 CL=mpeg2/4/h264
(459964) 🔧 PMKMPVClient.m:519 | [vf] v: [convert] 720x404 [359:360] videotoolbox[nv12] bt.601/bt.709/bt.1886/limited/display SP=1.000000 CL=mpeg2/4/h264
(459964) 🔧 PMKMPVClient.m:519 | [vf] v: [out] 720x404 [359:360] videotoolbox[nv12] bt.601/bt.709/bt.1886/limited/display SP=1.000000 CL=mpeg2/4/h264
(459927) ➖ PMKMPVClient.m:511 | [cplayer] info: VO: [libmpv] 720x404 => 720x405 videotoolbox[nv12]
(459927) 🔧 PMKMPVClient.m:519 | [vo/libmpv] v: reconfig to 720x404 [359:360] videotoolbox[nv12] bt.601/bt.709/bt.1886/limited/display SP=1.000000 CL=mpeg2/4/h264
Maybe there’s some confusion between video decoding and the use of OpenGL vs. Metal for rendering?
What you posted appears to be an H.264 file, while many of the problematic 4K HDR files are HEVC (H.265).
But whether or not the Plex player uses hardware optimization for some parts of the playback stack and some types of decoding, it’s fairly obvious that in this common stuttering scenario, playing the identical file produces far more CPU use (and heat) with the Plex player than with the player in apps like Infuse.
I suppose there’s some value in being precise about what we mean when we say the Plex player isn’t using hardware acceleration, but in the context of this discussion, that value is fairly low.
When folks (rightly) point out the difference between players and wonder why, the reason is that the Plex player isn’t using (the same amount of) the hardware acceleration available on the Apple TV that other players are.
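For anyone who wants to experiment outside of Plex, the same behavior the log shows can be requested from a stock mpv build through its config. This is just a sketch, assuming an mpv build with the videotoolbox-hwaccel feature enabled (as in the feature list above); `hwdec` and `hwdec-codecs` are real mpv options, but the exact codec list to allow is up to you:

```ini
# mpv.conf -- request VideoToolbox hardware decoding
hwdec=videotoolbox

# Allow hardware decoding for both H.264 and HEVC, since the
# problematic 4K HDR files discussed here are typically HEVC.
hwdec-codecs=h264,hevc
```

If VideoToolbox can’t handle a given stream, mpv falls back to software decoding, which is exactly the kind of silent fallback that can make “is it hardware accelerated?” a confusing question; running with `-v` and looking for the “Using hardware decoding (videotoolbox)” line, as in the log above, removes the guesswork.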