for anyone curious about gtx 1650
[image]
2x 4K transcodes work
+1x 4K transcode falls back to CPU decode + GPU encode
note: 4 GB of GPU memory is not quite enough for 3 GPU decodes
Unless they can squeeze a lot more optimization out of the Linux NVDEC memory usage, figure on ~1.4 GB of video RAM per 4K transcode.
2 GB P400 = 1x 4K transcode
4 GB 1650 = 2x 4K transcodes
5 GB P2000 = 3x 4K transcodes
8 GB 1080 or similar = 5x 4K transcodes
for whatever reason, VIDEO RAM is the bottleneck for Linux transcoding.
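Going by my ~1.4 GB of VRAM per 4K transcode estimate, the per-card numbers above fall out of simple division. A quick sketch (the function name and the no-overhead model are my own simplification, not anything Plex exposes):

```python
# Rough estimate of simultaneous 4K hardware transcodes by VRAM,
# assuming ~1.4 GB of video RAM per 4K stream and no other overhead.
VRAM_PER_4K_GB = 1.4

def max_4k_transcodes(vram_gb: float) -> int:
    """Floor of available VRAM divided by the per-stream estimate."""
    return int(vram_gb // VRAM_PER_4K_GB)

# Matches the card list above:
for card, vram in [("P400", 2), ("GTX 1650", 4), ("P2000", 5), ("GTX 1080", 8)]:
    print(f"{card}: {max_4k_transcodes(vram)}x 4K transcodes")
```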
If you need more transcod…
Linux with the hack, and with the process issue I am having
[image]
Sidebar: Win10 rips with a 2080 and 9900K. I can’t get enough clients to saturate the CPU or GPU.
Long post, but scroll down to the bottom if you’re just looking for what I found in my testing.
W10 Test
@TeknoJunky already confirmed what I found: ~300 MB of VRAM and about 20% decode utilization on each 50 Mbps transcode. The stream looks pretty good, but I’m locked at two sessions due to the GeForce limit NVIDIA imposes. It sucks, but it is what it is. I did notice the occasional hiccup once I kicked up the second transcode, even when I had 20+ seconds of buffer, which I found odd. Every 20 seconds or so, I’d have a seco…
Just wanted to share my experience. Looks good! I put together a rig with a Core i5-4690, 16 GB RAM, and an Nvidia GTX 1070. My movies live on a separate box, accessed from this rig via an NFS share. As you can see below, the rig easily handles simultaneous full hardware decoding and encoding of 5 videos, each a 4K video transcoded to 1080p (the CPU only does audio decode/encode), and there is plenty of room for more simultaneous transcodes.
[Plex_transcode_gpu_nvidia-smi]
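The screenshots above come from nvidia-smi. A hedged sketch of pulling the same memory figures programmatically; the query flags are standard nvidia-smi options, but the sample line below is illustrative, not real measurements from this rig:

```python
# Parse per-GPU memory figures from nvidia-smi's CSV query output.
# Live data would come from:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
def parse_memory_csv(line: str) -> tuple[int, int]:
    """Return (used_mib, total_mib) from one 'used, total' CSV line."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total

# Illustrative sample line (values in MiB), not an actual measurement:
used, total = parse_memory_csv("2870, 4096")
print(f"{used} MiB used of {total} MiB")
```

Watching memory.used climb as you add streams is the easiest way to see the ~1.4 GB-per-4K-transcode pattern on your own card.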
finally I see someone else’s research on this @ nVidia Hardware Transcoding Calculator for Plex Estimates
in particular, the section about 4K transcoding VRAM requirements
for reference, all my own screenshots are from full 4K Blu-ray rips I made myself via mkv, played to max-profile iOS clients (typically iPhone 7+ and iPad mini 4).
I can only assume that others who see ~1 GB of VRAM usage per stream are doing 4K > 720p or lower-bitrate 1080p (or their source 4K is not a full 4K Blu-ray remux), or that something specific differs between their system and mine which causes different VRAM usage.