Plexamp gives up loading Tracks folder

Navigating into the “Tracks” folder initially shows a spinning wheel; after approximately 90 seconds the message “Unable to load data. This makes me sad.” is displayed…

Is Plexamp timing out because the server is too slow or there are too many items in the library, or does it simply give up if items don’t load within a certain time frame?

I’m also not seeing summaries on some folders:

[Screenshot: Screen Shot 2021-09-18 at 11.49.58 am]

Server Version: 1.24.3.5033
Plexamp Version: 3.7.1

[Attachment: Plex Media Server Logs_2021-09-18_12-19-23.zip (1.9 MB)]

You’ve got some incredibly slow requests in there.

Sep 18, 2021 12:10:12.546 [0x7fa72ba64b38] DEBUG - Completed: [10.0.1.100:55948] 200 GET /library/sections/1/all?type=10&excludeFields=summary&sort=titleSort&includeFields=thumbBlurHash (7 live) TLS GZIP Page 0-49 323558ms 8945 bytes (pipelined: 1)

Getting the first page of tracks is taking 323 seconds.

Some combination of:

  1. Slow spinning disk
  2. Massive database
  3. Need to optimize database
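
If you want to see where the time goes yourself, here’s a minimal sketch that replays the same request outside Plexamp and times it. The endpoint and query parameters are taken from the log line above; the server address, token, and paging values are placeholders you’d substitute for your own setup:

```python
import time
import requests

# Placeholders -- substitute your own server address and X-Plex-Token.
PLEX_URL = "http://YOUR_SERVER:32400"
PLEX_TOKEN = "YOUR_TOKEN_HERE"

# The same first-page-of-tracks request Plexamp issues (from the log above).
params = {
    "type": 10,                    # Plex metadata type 10 = track
    "excludeFields": "summary",
    "sort": "titleSort",
    "includeFields": "thumbBlurHash",
    "X-Plex-Container-Start": 0,   # Page 0-49, as in the log
    "X-Plex-Container-Size": 50,
    "X-Plex-Token": PLEX_TOKEN,
}

start = time.monotonic()
resp = requests.get(f"{PLEX_URL}/library/sections/1/all",
                    params=params, timeout=600)
elapsed = time.monotonic() - start

print(f"HTTP {resp.status_code}, {len(resp.content)} bytes in {elapsed:.1f}s")
```

If this takes minutes even when run on the same machine as the server, the bottleneck is PMS itself rather than Plexamp or the network, which matches the 323558 ms (~323.6 s) recorded in the log.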

Thanks for confirming that, @elan.

  1. If I find that I fall into the “massive database” category, does my issue stem from a lack of processing power (CPU/RAM) or from disk speed (mechanical RAID vs. SSD)?

  2. Is there any documentation I can reference on what constitutes a “massive database” for Plex?

  3. Yes, I regularly optimise the database and clean bundles.

I’d also like to confirm something: as storage becomes cheaper and internet connections faster, user databases will inevitably grow.

  1. Moving forward, is SQLite still adequate for “massive databases”?

Lastly, since installing PMS on my Synology NAS I have been concerned by how little RAM it utilises (on average 200 MB of 4 GB).

  1. Would it be beneficial to increase the database cache for users with “massive databases”?

All of the above, but usually a spinning disk is the limiting factor for DB access.
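
To get a feel for whether the disk is the bottleneck, here’s a rough sketch that times random 4 KiB reads across the library database file, roughly what SQLite does when it walks a large index. The path is an assumption (a typical Synology package location); run it when the file isn’t already sitting in the OS page cache (e.g. shortly after a reboot), otherwise cached reads will hide the disk latency:

```python
import os
import random
import time

# Assumed path -- a typical Synology package install keeps the library DB here.
DB_PATH = ("/volume1/Plex/Library/Application Support/Plex Media Server/"
           "Plug-in Support/Databases/com.plexapp.plugins.library.db")

PAGE = 4096      # read in 4 KiB chunks, on the order of a SQLite page
SAMPLES = 500

size = os.path.getsize(DB_PATH)
with open(DB_PATH, "rb", buffering=0) as f:
    start = time.monotonic()
    for _ in range(SAMPLES):
        f.seek(random.randrange(0, max(size - PAGE, 1)))
        f.read(PAGE)
    elapsed = time.monotonic() - start

print(f"{size / 1e6:.0f} MB file, {SAMPLES} random reads, "
      f"{elapsed / SAMPLES * 1000:.2f} ms average per read")
```

As a rough yardstick, spinning disks tend to land around 5-15 ms per random read while SSDs come in well under 1 ms; sustained high averages support the slow-disk theory.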

I mean, I have ~5,000 albums and consider that a decent-sized library. You should be able to go up to 5-10× that without much trouble as long as you have a fast disk, but there is no specific limit.

Make sure the optimization is successful; if your database is subtly corrupt, it may not complete. How big is it?
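
One way to sanity-check the file directly is SQLite’s built-in integrity check. Note that Plex ships its own “Plex SQLite” binary and its database-repair guide uses that, since the library DB relies on Plex-specific SQLite extensions; the stock-sqlite3 sketch below is just an illustration of the idea, run against a copy of the database, and the path is a placeholder:

```python
import os
import sqlite3

# Placeholder path -- point this at a *copy* of the Plex library database,
# e.g. com.plexapp.plugins.library.db copied out while PMS is stopped.
DB_PATH = "/tmp/com.plexapp.plugins.library.db"

print(f"Database size: {os.path.getsize(DB_PATH) / 1e6:.0f} MB")

conn = sqlite3.connect(DB_PATH)
try:
    # PRAGMA integrity_check returns a single row reading "ok" when the
    # file is sound; anything else describes the corruption it found.
    for (result,) in conn.execute("PRAGMA integrity_check;"):
        print(result)
finally:
    conn.close()
```

If the check reports anything other than “ok”, the scheduled “Optimize database” task may well be the thing that isn’t completing.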

SQLite has great performance. If there are performance issues, they’re usually application-level. We don’t need to scale to 100M-track databases :laughing:

Almost certainly not (regarding increasing the database cache).
