Using Multiple GPUs in a Plex Media Server

I have a bit of a unique use case: my Plex media server (latest Ubuntu, Plex through Docker) sits in the living room and doubles as the HTPC, and I also want to use it for gaming. I currently have a 1070 in my gaming PC that I plan on upgrading sometime in the next year, and my roommate has a spare R9 290 he’s willing to throw into the Plex server.

Looking up transcode ability, I can’t find any evidence that the 290, or AMD cards in general, are good for transcoding, but the 1070 is capable of running 6 simultaneous 4K → 1080p transcodes according to this: https://www.elpamsoft.com/?p=Plex-Hardware-Transcoding.

The 290 should also be decent for gaming, so I was thinking of having both cards in the system at once, using the 1070 as a dedicated transcoder and the 290 for display out/gaming. My assumption is that with the 1070 doing all the transcoding, CPU utilization will stay low enough that I could game even while streams are running, but finding examples/guides for this sort of setup (especially with two different brands of cards) is tough. Does anyone know if this is possible/have any resources I could look into?

  1. Plex does not support AMD GPUs. Engineering never wanted to do it.

  2. If you have both cards in the system at the same time, you must be careful about where they map in /dev/dri, as PMS defaults to renderD128; that can be changed by editing your Preferences.xml.

  3. The loading on your CPU will be:
    a. Audio conversion
    b. Subtitle burning (if required), which can be substantial.
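To see which render node belongs to which GPU before touching Preferences.xml, the /dev/dri/by-path symlinks tie each renderD node to a PCI address. A minimal sketch, simulated in a throwaway directory since real device nodes vary per machine (the PCI addresses below are just examples; on a real box you’d list /dev/dri/by-path directly):

```shell
# Simulated /dev/dri layout -- real systems populate this via udev
mock=$(mktemp -d)
touch "$mock/renderD128" "$mock/renderD129"
mkdir "$mock/by-path"
# Each by-path symlink encodes the GPU's PCI address in its name
ln -s ../renderD128 "$mock/by-path/pci-0000:00:02.0-render"   # e.g. the iGPU
ln -s ../renderD129 "$mock/by-path/pci-0000:01:00.0-render"   # e.g. a discrete card

# Resolve each symlink to see the PCI-address -> render-node mapping
for l in "$mock"/by-path/*-render; do
  printf '%s -> %s\n' "${l##*/}" "$(readlink "$l")"
done
```

On real hardware, replace `"$mock"` with `/dev/dri` and compare the PCI addresses against your lshw or lspci output to pick the right renderD node.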

Is there a guide on how to do this properly? I’ve always wanted to do this but could never get the second low-profile GPU to be used as the transcoder.

It’s very simple:

  1. Stop PMS
  2. Look at what you have in /dev/dri.
  3. Open Preferences.xml
  4. Before the closing /> on the last line, add:
  5. HardwareDevicePath="/dev/dri/renderD129" – to use the second enumerated render node
  6. Save & exit
  7. Start PMS
  8. Run a simple test to confirm HW transcode is discovered & engaging
  9. If not – switch to the other render node and test again.
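The Preferences.xml edit in steps 3–5 can be sketched in shell. This is a hedged example on a throwaway copy of the file (the sample attributes below are made up; your real Preferences.xml lives in your Plex data directory and should be backed up before editing):

```shell
# Work on a scratch copy -- never sed your live Preferences.xml untested
PREFS=/tmp/Preferences.xml
cat > "$PREFS" <<'EOF'
<?xml version="1.0" encoding="utf-8"?>
<Preferences MachineIdentifier="example" HardwareAcceleratedCodecs="1"/>
EOF

# Insert HardwareDevicePath just before the closing "/>" if it isn't
# already present (the XML declaration line ends in "?>", so only the
# Preferences element matches)
grep -q 'HardwareDevicePath=' "$PREFS" || \
  sed -i 's|/>| HardwareDevicePath="/dev/dri/renderD129"/>|' "$PREFS"

# Confirm the attribute landed
grep -o 'HardwareDevicePath="[^"]*"' "$PREFS"
```

Remember to stop PMS before editing and start it again afterwards, per the steps above.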

On my NUC, I have both Intel and AMD graphics in the CPU package (i7-8809G).

The hardware enumerates as this:

[chuck@lizum ~.2003]$ sudo lshw -C Display
  *-display                 
       description: VGA compatible controller
       product: Polaris 22 XT [Radeon RX Vega M GH]
       vendor: Advanced Micro Devices, Inc. [AMD/ATI]
       physical id: 0
       bus info: pci@0000:01:00.0
       logical name: /dev/fb0
       version: c0
       width: 64 bits
       clock: 33MHz
       capabilities: pm pciexpress msi vga_controller bus_master cap_list rom fb
       configuration: depth=32 driver=amdgpu latency=0 mode=3840x2160 visual=truecolor xres=3840 yres=2160
       resources: iomemory:200-1ff iomemory:210-20f irq:191 memory:2000000000-20ffffffff memory:2100000000-21001fffff ioport:e000(size=256) memory:db500000-db53ffff memory:c0000-dffff
  *-display
       description: Display controller
       product: HD Graphics 630
       vendor: Intel Corporation
       physical id: 2
       bus info: pci@0000:00:02.0
       version: 04
       width: 64 bits
       clock: 33MHz
       capabilities: pciexpress msi pm bus_master cap_list
       configuration: driver=i915 latency=0
       resources: iomemory:2f0-2ef iomemory:2f0-2ef irq:188 memory:2ffe000000-2ffeffffff memory:2fa0000000-2fafffffff ioport:f000(size=64)
[chuck@lizum ~.2004]$ ls -la /dev/dri
total 0
drwxr-xr-x   3 root root        140 Nov 23 03:48 ./
drwxr-xr-x  21 root root       5140 Nov 29 01:46 ../
drwxr-xr-x   2 root root        120 Nov 23 03:48 by-path/
crw-rw----+  1 root render 226,   0 Nov 23 03:48 card0
crw-rw----+  1 root render 226,   1 Nov 28 12:15 card1
crw-rw----+  1 root render 226, 128 Nov 23 03:48 renderD128
crw-rw----+  1 root render 226, 129 Nov 23 03:48 renderD129
[chuck@lizum ~.2005]$ 

The Intel GPU comes first in the hardware addresses at pci@0000:00:02.0, but you can see the kernel still found and enumerated the AMD as well (both drivers are resident in the kernel).
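If lshw isn’t handy, each render node’s driver can also be read straight from sysfs, which is another quick way to tell the Intel and AMD nodes apart. A sketch using a mock sysfs tree so it runs anywhere; on a real box you’d read /sys/class/drm/renderD129/device/uevent directly (paths assumed from the typical Linux DRM layout):

```shell
# Mock of /sys/class/drm/<node>/device/uevent -- real systems expose
# DRIVER= and PCI_SLOT_NAME= lines for each DRM device
sysfs=$(mktemp -d)
mkdir -p "$sysfs/class/drm/renderD128/device" "$sysfs/class/drm/renderD129/device"
printf 'DRIVER=i915\nPCI_SLOT_NAME=0000:00:02.0\n'   > "$sysfs/class/drm/renderD128/device/uevent"
printf 'DRIVER=amdgpu\nPCI_SLOT_NAME=0000:01:00.0\n' > "$sysfs/class/drm/renderD129/device/uevent"

# On a real box, drop the "$sysfs" prefix and read /sys directly
for n in renderD128 renderD129; do
  printf '%s: %s\n' "$n" "$(grep '^DRIVER=' "$sysfs/class/drm/$n/device/uevent")"
done
```

The driver name (i915 vs amdgpu) tells you which renderD node to put in HardwareDevicePath.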

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.