Hardware transcoding automatically falls back to CPU transcoding

Server Version#: 1.32.1.6999
Player Version#: Windows 1.67.2.3705-db506a00
My GPU is a GTX 1650 with the latest drivers. When I play a video (4K, HEVC, SDR, 25 Mbps) remotely with transcoding to 1080p at 8 Mbps, the server first uses the GPU to transcode for about six minutes, then switches to the CPU. Eventually, because of the sustained 100% CPU load, playback stops automatically on the client. Notably, while the GPU is transcoding, its load is not high.
How do I solve this problem?

After comparing files, I think it is probably because the video's frame rate mode is VFR (variable frame rate); the staff could experiment with this aspect. (The above content was translated by Google.)
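To check whether a file is VFR before reporting or re-encoding it, a common heuristic is to compare the stream's declared rate (`r_frame_rate`) with its measured average rate (`avg_frame_rate`) from ffprobe's JSON output: when they differ, the stream is likely VFR. The sketch below is a minimal, assumed example that parses a sample ffprobe result inline; the JSON values shown are illustrative, not taken from the file in this report.

```python
import json
from fractions import Fraction

# Illustrative sample of ffprobe JSON (assumed values); in practice produce it with:
#   ffprobe -v quiet -print_format json -show_streams input.mkv
sample = """
{
  "streams": [
    {
      "codec_type": "video",
      "codec_name": "hevc",
      "r_frame_rate": "50/1",
      "avg_frame_rate": "25/1"
    }
  ]
}
"""

def likely_vfr(probe_json: str) -> bool:
    """Heuristic VFR check: declared vs. measured average frame rate."""
    for stream in json.loads(probe_json)["streams"]:
        if stream.get("codec_type") != "video":
            continue
        declared = stream.get("r_frame_rate", "0/1")
        average = stream.get("avg_frame_rate", "0/1")
        # avg_frame_rate can be "0/0" when unknown; skip that case.
        if average.endswith("/0"):
            continue
        if Fraction(declared) != Fraction(average):
            return True
    return False

print(likely_vfr(sample))  # True for this sample, since 50/1 != 25/1
```

If the file turns out to be VFR, one way to test the hypothesis is to remux or re-encode it to constant frame rate (for example with ffmpeg's `-vsync cfr` together with `-r`) and see whether hardware transcoding then runs to completion.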

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.