OpenAI (and compatible APIs, or Ollama) on the server side

It would be really nice if we could consciously decouple from OpenAI: support other OpenAI-compatible backends, handle the calls on the server side for us, and integrate that everywhere AI is used.

Right now only OpenAI is supported, but I’d love to run Ollama locally and use it with Plexamp (or with Plex for recommendations), with the server passing the requests through to it. That would let me use open-source models that aren’t expensive.
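For what it’s worth, Ollama already exposes an OpenAI-compatible endpoint at `/v1`, so the server-side change might amount to a configurable base URL and model name. A minimal sketch in Python using the official `openai` client; the model name is just whatever a local Ollama install happens to have pulled:

```python
# Minimal sketch: point the standard OpenAI client at a local Ollama
# instance instead of api.openai.com. Assumes Ollama is running locally
# and the model has been pulled (e.g. `ollama pull llama3.1`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="llama3.1",  # any locally pulled model
    messages=[
        {"role": "user", "content": "Recommend five high-energy driving tracks."}
    ],
)
print(response.choices[0].message.content)
```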

Bonus if we could have built-in Siri/Google Assistant control, plus a button tap that brings up voice chat or sends the request to Plexamp, so I could do something like: “Siri, have Plexamp play high-energy hits. No back catalog. I’m in the mood for driving! Use my favorited music.” and it would just go and do it for me. Or: “Siri, have Plexamp create a playlist from this mix. Name it Hard Driving.”
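To make the “just go and do it” part concrete: the voice request could be reduced to structured filters by whatever model the server is pointed at, then handed to the player. A rough sketch using the same client setup as above; the filter schema and field names here are entirely hypothetical, purely for illustration:

```python
# Hypothetical flow: turn a free-form voice request into structured
# playlist filters that the music service could then act on.
import json
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

utterance = ("Play high energy hits. No back catalog. "
             "I'm in the mood for driving! Use my favorited music.")

response = client.chat.completions.create(
    model="llama3.1",
    messages=[
        {"role": "system", "content": (
            "Extract playlist filters as JSON with keys: "
            "mood, energy, favorites_only, exclude_back_catalog."
        )},
        {"role": "user", "content": utterance},
    ],
    # Ask for strict JSON; recent Ollama versions honor this field too.
    response_format={"type": "json_object"},
)

filters = json.loads(response.choices[0].message.content)
print(filters)  # e.g. {"mood": "driving", "energy": "high", ...}
```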


Bonus points for almost referencing a Gwyneth Paltrow / Chris Martin thing so soon after the Astronomy thing.

And yeah, once we have a real use for AI again, we’d definitely like to make it at least somewhat agnostic.

