The Plex documentation says that modern Intel CPUs produce better quality with hardware encoding. However, it only refers to Intel CPU generations:
A recent Intel CPU meeting these requirements:
2nd-generation Intel Core (Sandy Bridge, 2011) or newer (we recommend 5th-gen Broadwell or newer for the best experience; Sandy Bridge, in particular, is known to sometimes have poor visual output on some systems)
I’m planning to buy a Ryzen 3900X CPU and I was wondering if choosing AMD will bring me issues or drawbacks for hardware transcoding. I’ll be using an RTX 2070 Super, by the way.
You will gain nothing from using an RTX 2070, as Nvidia gaming cards are restricted to 2 concurrent hardware transcodes. You are better off (if you’re not gaming much or competitively) with a Quadro P2000. Then, my friend, you will have a killer Plex media server, probably capable of 25+ transcodes. If you just want the AMD CPU, it should be reasonable for Plex, but I believe (just for Plex, mind you) the bang for the buck is definitely in Intel’s court.
If you’re comfortable using the hacked/modded Nvidia drivers, there’s no need to pay for a P2000; you could just buy a sub-$100 card and mod the drivers instead to remove the 2-transcode limit.
That is very true, but it depends on which OS you will be using: Linux, Windows, or something else. The Linux hack method you can find on YouTube; the Windows method I haven’t been able to locate anywhere, but I know it’s out there. I personally don’t know of any other hacks for NVIDIA.
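The session cap being discussed here can be sketched in a few lines. This is only a toy model of the limit, assuming the 2-session GeForce cap from the time of this thread; the constant and function names are made up for illustration, not a real NVIDIA or Plex API.

```python
# Toy model of the NVENC encode-session cap (illustrative names only):
# consumer GeForce drivers limited concurrent encode sessions to 2 at the
# time of this thread; patched drivers and Quadro cards lift that cap.
GEFORCE_SESSION_LIMIT = 2

def can_start_transcode(active_sessions: int, limit: int = GEFORCE_SESSION_LIMIT) -> bool:
    """True if another NVENC encode session can start under the driver cap."""
    return active_sessions < limit

print(can_start_transcode(1))     # True: one session active, under the stock cap
print(can_start_transcode(2))     # False: already at the 2-session GeForce limit
print(can_start_transcode(2, 8))  # True: a patched/Quadro-style higher limit
```

This is why a second concurrent Plex transcode works on a stock GeForce card but a third gets refused, while a P2000 or a patched driver keeps accepting sessions.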
Hi, thanks for your replies guys, but my question was about the CPU. I’ll be using a 2070 because I need this for video games.
The documentation from Plex I quoted says the CPU generation has an impact on video quality even if the transcoding is done by the graphics card. I am surprised, but that’s what I read. So my guess is there’s some kind of instruction set on recent CPUs that helps with the GPU “connection” or something…
I was wondering if AMD allows the same level of image quality as Intel.
When I researched a while back, people seemed to say “no” to that question. But I haven’t personally tested it.
BTW, the reason the generation has an impact on video quality is that the chip makers are continually tweaking and making improvements. There’s very little difference between the 7th and 8th Intel generations (no major leap), but they’re both significantly better than the 3rd or 4th generations. Newer generations also decode more codecs, such as HEVC and HEVC 10-bit.
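The growth in decode support by generation can be sketched roughly like this. The table is my own simplified summary of publicly documented Quick Sync capabilities, not an official API, and exact support varies by specific model:

```python
# Rough sketch (my own summary, not an official table): fixed-function
# Quick Sync decode support by Intel Core generation. Treat the sets as
# an approximation; real support varies by SKU.
QSV_DECODE = {
    2: {"H.264"},                                # Sandy Bridge (2011)
    6: {"H.264", "HEVC"},                        # Skylake adds HEVC 8-bit decode
    7: {"H.264", "HEVC", "HEVC 10-bit", "VP9"},  # Kaby Lake adds 10-bit HEVC / VP9
}

def supports(gen: int, codec: str) -> bool:
    """True if `gen` (or the nearest earlier listed generation) decodes `codec`."""
    known = [g for g in sorted(QSV_DECODE) if g <= gen]
    return bool(known) and codec in QSV_DECODE[known[-1]]

print(supports(8, "HEVC 10-bit"))  # True: 8th gen inherits Kaby Lake's decoders
print(supports(4, "HEVC"))         # False: Haswell-era chips lack HEVC decode
```

This is the practical upshot for Plex: a 7th-gen or newer chip can hardware-decode a 10-bit HEVC file that an older chip would have to decode in software.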
I’ll second what cafe diem said… There shouldn’t be any discernible difference between the chip makers. The age of the chip is the difference, due to the tweaks. BTW cafe diem, thanks for the GitHub link; I’ve been trying to find that everywhere.