Plex scanning slowing down after 30min

Running a scan on a TS-231+ against a huge music library (7,000 albums). At the start all seems fine: about 40 albums are added in the first 10 minutes, which is not super fast, but acceptable. After 1 hour, though, only about 100 albums have been added and I can no longer access the server settings or the music. I rebooted the NAS (I didn’t know how to kill the scanner processes - they kept running even after I stopped Plex), retried, and the same thing happened.
It now takes about 4 minutes to add an album. I see a lot of errors in the log file:

May 23, 2018 19:31:40.552 [0x73682000] ERROR - Error issuing curl_easy_perform(handle): 28
May 23, 2018 19:31:40.552 [0x73682000] DEBUG - HTTP simulating 408 after curl timeout
May 23, 2018 19:31:40.553 [0x73682000] ERROR - HTTP 408 downloading url http://127.0.0.1:32400/library/changestamp
May 23, 2018 19:31:43.381 [0x73682000] ERROR - Exception inside transaction (inside=1) (…/Library/MetadataItem.cpp:3023): Unable to allocate a changestamp from the server
May 23, 2018 19:31:44.712 [0x73682000] ERROR - Exception assimilating media item in [TITLE OF ALBUM]: Unable to allocate a changestamp from the server

and in the server log file:

May 23, 2018 19:31:38.558 [0x4faff400] ERROR - Failed to begin transaction (…/Statistics/StatisticsManager.h:191) (tries=1): Cannot begin transaction. database is locked

I also use MusicBee on my PC, and rescanning those same files there takes less than an hour, so I don’t know why Plex is struggling.

When you add a lot of media all at once, the database fragments and slows down. PMS uses SQLite3, which is an ISAM-style database engine and therefore fragments in the natural course of inserting records into multiple tables.

The solution is:

  1. Cancel the scan.
  2. Run “Optimize Database”.
  3. Resume scanning.

It will briefly scan over what’s already been indexed and then continue with the work to be done.
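For what it’s worth, my understanding is that “Optimize Database” essentially boils down to SQLite’s VACUUM (that mapping is my assumption, not something Plex documents in this thread). A minimal sketch on a throwaway scratch database, not the real Plex one, of why it helps:

```shell
# Illustrative only: churn a scratch SQLite DB the way a big scan churns the
# library DB, then reclaim the dead space with VACUUM.
DB=$(mktemp /tmp/plexdemo.XXXXXX.db)
sqlite3 "$DB" "CREATE TABLE t(x); INSERT INTO t VALUES (randomblob(200000));"
sqlite3 "$DB" "DELETE FROM t;"                          # rows gone, pages kept
free_before=$(sqlite3 "$DB" "PRAGMA freelist_count;")   # unused pages piling up
size_before=$(stat -c%s "$DB" 2>/dev/null || stat -f%z "$DB")
sqlite3 "$DB" "VACUUM;"                                 # rebuild the file compactly
free_after=$(sqlite3 "$DB" "PRAGMA freelist_count;")
size_after=$(stat -c%s "$DB" 2>/dev/null || stat -f%z "$DB")
echo "free pages: $free_before -> $free_after, bytes: $size_before -> $size_after"
rm -f "$DB"
```

On a churned file VACUUM rewrites the database compactly and empties the freelist, which is presumably why resuming the scan after optimizing goes faster.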

I tried running the manual scan, and that was much faster - it finished in a few hours. I’ll keep your suggestion in mind if I ever run into this issue again.

As a secondary solution, which some have adopted as well:

  1. Break the top-level directory into a few smaller chunks (adds 1 level of directories). Most use alphabetical blocks (e.g. A-H, I-M, etc.).
  2. Add one ‘block’ folder at a time.
  3. As each finishes, run Optimize Database, then edit the library section to add the next block.
  4. It’s a bit annoying, but you have a lot more control over how things get done.
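The reshuffle in step 1 can be scripted. A rough sketch - the artist names, block boundaries, and temp-directory path are all made up for illustration, so adapt them to your layout:

```shell
# Sketch of step 1: move each top-level artist/album folder into an
# alphabetical "block" folder, so blocks can be added to Plex one at a time.
SRC=$(mktemp -d)    # stand-in for your music root
mkdir -p "$SRC/Abba" "$SRC/Beatles" "$SRC/Kraftwerk" "$SRC/Radiohead" "$SRC/ZZ Top"
for d in "$SRC"/*/; do
  name=$(basename "$d")
  first=$(printf '%s' "$name" | cut -c1 | tr '[:lower:]' '[:upper:]')
  case "$first" in
    [A-H]) block="A-H" ;;
    [I-Q]) block="I-Q" ;;
    *)     block="R-Z" ;;    # catch-all: R-Z plus digits/punctuation
  esac
  mkdir -p "$SRC/$block"
  mv "$d" "$SRC/$block/"
done
ls "$SRC"    # now just the block folders: A-H  I-Q  R-Z
```

After that, point the library section at one block folder, let it finish, optimize, and add the next.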