Plex Movie Optimization on Multi-Server Setup

So I have a question that I haven't been able to find an answer to while searching. My setup is as follows: I have a system I use as my main Plex server, and I have my desktop, which also runs Plex Media Server and typically optimizes movies to 4 Mbps 720p for remote access. My main Plex server runs in a Docker container and is therefore on Linux. My desktop runs Windows 10. I'm on the latest beta build, 1.11.2.4772. This may have been happening before as well, but I only recently replaced the CPU in my main Plex server and have been monitoring its temps.

My question/observation is this: when I start a movie optimization on my Windows PC, the process starts up and Plex begins using that machine's CPU. I also noticed that, for whatever reason, the temps started going up on my other server, which led me to notice that Plex's CPU usage had gone up there as well. I stopped the movie optimization and CPU use went down on both the desktop and the server. I started the optimization again and observed the same thing: CPU usage went up on both the desktop PMS and the server PMS.

What is going on here? Why does Plex start working on my other server when I kick off a transcode on my desktop?

I have tested it the other way around: when I transcode on the source machine (the one that holds the storage), I don't see any Plex CPU usage on the other machine. It only seems to happen when I'm optimizing on the machine that reads from the source machine's storage. I probably won't get an answer, but I figured I would ask in case anyone else has seen this.
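For reference, this is roughly how I've been watching Plex's CPU usage on each box while reproducing it. It's just a quick sketch using psutil; matching any process whose name contains "plex" is an assumption on my part, so adjust the filter if your transcoder process is named differently.

```python
# Rough sketch: periodically report total CPU % of Plex-related processes,
# to confirm which machine is actually doing the transcode work.
# Assumes psutil is installed (pip install psutil); the "plex" name match
# is an assumption -- tweak it for your own process names.
import time
import psutil


def plex_cpu_percent(interval=2.0):
    """Total CPU % across all Plex-related processes, sampled over `interval` seconds."""
    procs = [p for p in psutil.process_iter(['name'])
             if p.info['name'] and 'plex' in p.info['name'].lower()]
    for p in procs:
        try:
            p.cpu_percent(None)  # prime the per-process counter
        except psutil.NoSuchProcess:
            pass
    time.sleep(interval)
    total = 0.0
    for p in procs:
        try:
            total += p.cpu_percent(None)  # % of CPU used since the priming call
        except psutil.NoSuchProcess:
            pass
    return total


if __name__ == "__main__":
    while True:
        print(f"Plex CPU: {plex_cpu_percent():.1f}%")
```

Running this on both the desktop and the server at the same time is how I noticed usage climbing on both machines during an optimize.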

Edited: hopefully this clarifies what I am asking.