Automatically Generated Subtitles [Whisper Neural Net | OpenAI]

Continuing the discussion from Failed to download subtitle:

Seeing that downloading subtitles has been a recurring problem throughout Plex’s long history, I propose considering the Whisper neural net for on-the-fly subtitling and captioning. This can be accomplished via the OpenAI API. If that approach is not feasible due to overhead costs, then finding, funding, and supporting a free and open-source model that is compact and can run locally would be ideal; think of a transformer language model or another natural language processing (NLP) model.
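To give a sense of how little glue code this would take, here is a minimal sketch of local, on-device transcription with the open-source whisper Python package (pip install openai-whisper). The file names are placeholders, and ffmpeg must be on the PATH for Whisper to decode the audio; this is only an illustration of the idea, not how Plex would actually have to wire it up:

```python
import whisper


def format_timestamp(seconds: float) -> str:
    """Convert seconds to the SRT timestamp format HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    hours, ms = divmod(ms, 3_600_000)
    minutes, ms = divmod(ms, 60_000)
    secs, ms = divmod(ms, 1_000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d},{ms:03d}"


def transcribe_to_srt(audio_path: str, srt_path: str, model_name: str = "base") -> None:
    """Transcribe an audio track with a local Whisper model and write an SRT subtitle file."""
    model = whisper.load_model(model_name)   # downloads the model weights on first run
    result = model.transcribe(audio_path)    # ffmpeg is used internally to decode the audio

    lines = []
    for i, seg in enumerate(result["segments"], start=1):
        lines.append(str(i))
        lines.append(f"{format_timestamp(seg['start'])} --> {format_timestamp(seg['end'])}")
        lines.append(seg["text"].strip())
        lines.append("")  # blank line terminates each SRT cue

    with open(srt_path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines))


if __name__ == "__main__":
    # Placeholder file names: extract the audio track from the media file first
    # (e.g. with ffmpeg), then point the transcriber at it.
    transcribe_to_srt("episode_audio.wav", "episode.en.srt")
```

The smaller checkpoints ("tiny", "base") run comfortably on CPU, which is what makes the local, open-source route plausible for a media server.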

Let’s not ‘stream’ around the bush; we need better subtitles, and we need them now! :popcorn:

Thank you for your suggestion. There’s already an existing suggestion thread discussing an option to automatically create subtitles from the audio tracks via transcription. Please comment and vote in that thread to help us avoid splitting or cannibalizing votes.
Unless I’m missing some unique aspect of your suggestion, I’ll close this thread as a duplicate.


Sounds good, @tom80H. I’ll go ahead and close this thread. Thank you for bringing this to my attention.