Local AI for Plexamp (or Plex)

I would like to suggest allowing local (Ollama) AI endpoints for use with Sonic Sage. This would be nice for generating movie and TV playlists/recommendations as well. As I understand it, Ollama and others mirror the OpenAI API, so I feel that letting users add an API key and point the URL at a local host might be a light lift.

Another direction might be to use Open WebUI or similar as an intermediary, using something like their Pipelines integration.

43 Likes

I was also going to ask about this, but thought it was too niche to post about. Agreed, this would be a great feature, and more privacy-preserving.

5 Likes

I would have to agree with this too. If you’re running Plex, you likely already self-host other apps as well.

Services like Ollama and LocalAI are insanely simple to set up. Just having the option would be nice.

3 Likes

Besides the URL and the API key, I think there’d also be a need to feed the PMS media contents into the model somehow, and that could require some engineering. If you have anything like a lot of titles in your PMS, you would quickly run out of token space if you tried to feed the titles in as prompt context. So you would need to do some RAG, and keeping that indexed RAG DB up to date with additions and deletions from your media library would also need to be dealt with. I don’t think any of this is insurmountable, but for a local AI to be useful, there’s more to it than just letting you connect to a locally hosted model.
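To illustrate the retrieval step: a toy sketch that indexes titles and pulls only the closest matches into the prompt instead of sending the whole library. A real setup would use embeddings and a vector store; simple keyword overlap stands in here so the example runs without a model, and all names and titles below are made up for illustration:

```python
# Toy RAG-style retrieval: rather than stuffing every library title into
# the prompt (which blows the token budget), index the titles and
# retrieve only the top-k most relevant ones as context. Keyword overlap
# is a stand-in for real embedding similarity.

def tokenize(text):
    return set(text.lower().split())

def retrieve(query, titles, k=3):
    """Return the k titles with the greatest token overlap with the query."""
    q = tokenize(query)
    scored = sorted(titles, key=lambda t: len(q & tokenize(t)), reverse=True)
    return scored[:k]

# Hypothetical slice of a PMS library
library = [
    "Blade Runner 2049",
    "The Grand Budapest Hotel",
    "Blade Runner",
    "Budapest by Night",
]

context = retrieve("movies like Blade Runner", library, k=2)
prompt = "Recommend from these titles only:\n" + "\n".join(context)
```

Keeping that index in sync is then an incremental job: re-index a title on add, drop it on delete, rather than rebuilding from scratch.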

3 Likes

Agreed; I assume whatever mechanism is in place for OpenAI to access PMS data would remain. I imagined RAG.

2 Likes

Yes, I agree. I wonder how easy it would be to allow Ollama to be used in lieu of, or in addition to, the OpenAI API. I’ve been trying out Sonic Sage so I have that set up, but I’ve also been experimenting with Ollama, Open WebUI, and Stable Diffusion locally. So I think the suggestion by AlieFoSho is a good one.

3 Likes

I don’t really care about AI BS, but if it’s an option in Plex, I would rather it at least be a self-hosted option than ChatGPT. :roll_eyes:

6 Likes

This should be fairly simple to implement. Ollama offers an OpenAI-compatible API. Since the option is already available for OpenAI, adding an editable endpoint and making the API key optional should solve the issue.

More info on the Ollama API is here: ollama/docs/openai.md at main · ollama/ollama · GitHub

2 Likes

I’d also appreciate this feature. I won’t sign up for or pay for OpenAI, but I already have Open WebUI/Ollama running on my server.

3 Likes

I’m very interested in this feature as well. Has anyone seen or heard anything moving in that direction from Plex?

2 Likes

Squeaky wheel here. Seriously, this is not a difficult feature to ask for.

With some of the moves Plex has been making lately, I’m starting to wonder if they have a deal with OpenAI, which of course in 2025 would include selling data, as always.

4 Likes

This would really be ideal for Plexamp as a “play similar songs” feature, but it could also be used more generally as a recommendation engine for media you may enjoy based on the content you’ve viewed. It would be really nice to have.

3 Likes

It would be great to have this yesterday.

Would be great if the Ollama API could be implemented.

1 Like

This would be so great, and would put Plexamp even further ahead of other players.

If I were a programmer, I’d give it a go!

I would also like to be able to configure Plexamp to use local AI.

Bumping this thread with a laser-focused request.

This entire discussion boils down to one simple, powerful change:

Please make the AI endpoint URL in Plexamp a configurable setting.

That’s it. That’s the entire “ask.”


The “Why” (This is a Sunk Cost, Not a New Feature)

All the hard work is already done. A massive investment was made to build the client-side UI and logic in Plexamp for the OpenAI/TIDAL integration.

Right now, that is a sunk cost. It’s a premium feature, part of the paid Plex Pass offering, that is currently “rotting” on the vine because its original backend connection is gone.

As paying customers, we see the broken potential every time we use the app.

The “Easy Effort” (The Win-Win Business Case)

This is the lowest-hanging fruit on your roadmap, and it’s a perfect “win-win.”

Plex’s Cost (The “Low Effort”):

  • One text field: Add a single “Custom AI Endpoint URL” field in the Plexamp advanced settings.

  • One disclaimer: Make it a Plex Pass feature and add a note: “This is an unsupported, advanced feature. Use at your own risk.”

Plex’s Gain (The “High Return”):

  1. Instantly Salvage an Investment: You immediately turn that “rotting” code into one of the most powerful, active features in Plexamp.

  2. Add Massive Plex Pass Value: This is the ultimate power-user feature. It’s a huge, unique selling point for your most dedicated, paying subscribers.

  3. Zero Support Burden: The community will take on 100% of the work. We will build the RAG backends, set up our local LLMs, and write the guides for each other. You just need to give us the “key” (the configurable field).

Plex is trying to build a sustainable business, and we’re here to support it. Please don’t let a major, valuable, premium-tier feature die. This simple change unlocks it for all your paying users and costs you almost nothing.

7 Likes

Not judging your request in any way, but if that endpoint existed right now and you were able to have it start querying your local LLM, what context would your model have to respond to queries?

1 Like

Would love to see this feature implemented! Experimenting with a locally hosted AI for my Plexamp library sounds like the perfect winter project.

1 Like

Definitely a +1 from me. It was nice with TIDAL, and it would be perfect for my local library.

1 Like