Support Doc for Database Cache Size (MB)?

I didn’t notice this thread before I asked a related question: Database Cache Size Behavior Question

Before folks go crazy increasing the cache size, the new 40MB default is a 20x increase over the previous value.

Previously, Plex’s SQLite cache_size was 2000 pages, and the page_size of Plex’s database is 1024 bytes. The end result was a SQLite cache of about 2 MB.
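That arithmetic (pages × page_size = effective cache) can be checked with Python’s built-in sqlite3 module. This is a self-contained sketch against a throwaway in-memory database, not Plex’s actual library file:

```python
import sqlite3

# Throwaway in-memory DB purely to illustrate; Plex's real library DB lives elsewhere.
con = sqlite3.connect(":memory:")

# page_size must be set before any data is written (or via VACUUM).
con.execute("PRAGMA page_size = 1024")
page_size = con.execute("PRAGMA page_size").fetchone()[0]

# The old Plex default: a cache_size of 2000 pages.
con.execute("PRAGMA cache_size = 2000")
cache_pages = con.execute("PRAGMA cache_size").fetchone()[0]

# Effective cache = pages * page_size
print(cache_pages * page_size)  # 2048000 bytes, i.e. roughly 2 MB
```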

Still wondering if this setting affects the memory footprint of the server executable. I’ve seen mine slowly increase since I made the change earlier today.

Yes. SQLite only caches pages that have been used. It doesn’t greedily suck the whole database file in.

Huge values are unlikely to be helpful - every OS already has file read caching.

Right, that’s why I asked earlier if the cache size changes dynamically. Set at 1024 MB, I’ve watched the RAM usage climb into the mid-400s (MB) so far.

I’m just thinking, if I have a lot of available memory, why not use it? Don’t see how it could negatively affect anything.

I’m also pretty sure that if you set the value larger than the database size, it won’t use more memory.
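That matches how SQLite documents the pragma: cache_size is only an upper bound, and pages are allocated lazily as they’re read, so a limit larger than the database file doesn’t pre-allocate anything. A quick sketch of the units (not Plex-specific):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# A positive value is a limit in *pages*...
con.execute("PRAGMA cache_size = 2000")
print(con.execute("PRAGMA cache_size").fetchone()[0])  # 2000 (pages)

# ...while a negative value is a limit in KiB, independent of page size.
con.execute("PRAGMA cache_size = -40960")  # i.e. a 40 MB cap
print(con.execute("PRAGMA cache_size").fetchone()[0])  # -40960

# Either way it is only a ceiling: cache pages are allocated on demand as
# they are read, so an oversized limit costs nothing until pages are used.
```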

It’s often a trap to turn dedicated caches all the way to 11 just because “bigger is better”. Any memory used for this can’t be used dynamically by the OS to cache other files, for instance.

@OttoKerner mentioned increasing by 200MB at a time - even that sounds like a BIG increment to me. I suspect that the new default of 40MB is already bigger than really necessary.

I’ll be very surprised if anybody can show a value over 40MB to have performance benefits, but I’d really love to see it!
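For anyone who wants to actually measure it, here’s a rough micro-benchmark sketch using Python’s sqlite3 module against a throwaway database. The table, row count, and file name are all made up for illustration; a real test would run against a copy of the actual Plex DB, and the OS file cache will usually mask any difference on repeat runs:

```python
import os
import sqlite3
import tempfile
import time

# Build a small throwaway DB (hypothetical schema, not Plex's).
path = os.path.join(tempfile.gettempdir(), "cache_bench.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE IF NOT EXISTS t (id INTEGER PRIMARY KEY, v TEXT)")
con.execute("DELETE FROM t")
con.executemany("INSERT INTO t (v) VALUES (?)", (("x" * 100,) for _ in range(50_000)))
con.commit()
con.close()

def bench(cache_size_pragma_value):
    """Time repeated full-table reads under a given cache_size setting."""
    c = sqlite3.connect(path)
    c.execute(f"PRAGMA cache_size = {cache_size_pragma_value}")
    start = time.perf_counter()
    for _ in range(5):
        c.execute("SELECT count(*), max(v) FROM t").fetchone()
    c.close()
    return time.perf_counter() - start

print(f"2000-page cache: {bench(2000):.3f}s")
print(f"40 MB cache:     {bench(-40960):.3f}s")  # negative = limit in KiB
os.remove(path)
```

On a warm OS file cache the two timings will often be indistinguishable, which is the point being made above.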

I get your point. For now, I’m keeping an eye on the .exe to see where it ends up. Does the fact that it keeps increasing mean that the larger cache is actually being used? And if so, what would trigger such an increase in memory usage? You can tell I’m a noob when it comes to databases :wink:

@VBB

When I offered to help with the script for Windows, would it help if I wrote out the command sequence?

Is that easy to put into Windows and have it error-check the results?

I appreciate it, but I don’t think I need it at this time. Sounded like the other guy really wanted it, though :slight_smile:

@ChuckPa
I did the update and increased the cache to 900 MB. I have a large 800 MB DB and a fast system with lots of RAM. I’ve been chasing this issue for a long time.

Does your tool work on Hotio plex docker?

I used this tool last week but I’m still having issues.

Yes, it should work inside any Docker container.
I know I have given explicit instructions for stopping Plex in the official Plex and Linuxserver.io images.

As long as Plex is stopped and you’re in there as “root” ( uid=0 ), it should be fine.

My tool isn’t fancy. I wrote it as a ‘get the job done’ type of tool.

Exactly my thinking also. If your DB is 500 MB, then why increase this cache to 1 GB? This is a DB cache (at least by its name), not a cache for the server process or some other background process; it’s a cache specifically for the DB.

Say more please :slight_smile:

I wonder why Plex would add this setting if there is no benefit in changing it beyond the default :thinking: Anyhow, for now I’ve changed my cache to 128 MB and will see if I notice any performance change.

Because it can help some users with particularly big media collections, who until now had to resort to binary patching: Suggested SQLite3 DB Optimizations

Sure, and I’m aware of that thread, but I was directing the question more at @Volts and his comments :slight_smile:

Anyhow, we’ve gone round the houses, so I’ll head back to my OP and look for a recommendation from Plex on when to change this and what to change it to.
Everyone has a different perspective on what defines a big media collection.

Any way to run this on windows?

It’s doing precisely the same as the “low level database recovery” in this article: https://support.plex.tv/articles/repair-a-corrupted-database/

Well that wouldn’t make any sense now, would it? :wink:

My DB is 1.98GB, and I’ve had the cache setting at 1024 since yesterday. So far, I haven’t seen the executable use more than 531MB.

1 Like

This is not a DB file cache; it is a DB page cache. It caches only parts of the DB, typically those used frequently or recently.

Yeah, I understand that it’s dynamic, which is great. Just wanted to give an update on what I’ve been observing.

@ChuckPa I am having a heck of a time getting this to run on Unraid using the hotio image. I put the .sh in the main Plex folder in appdata, i.e. /mnt/cache/appdata/plex/.

root@9900K:/# docker exec -it plex /bin.bash
bash: docker: command not found

root@9900K:/# s6-svc -d /var/run/service/svc-plex
s6-svc: fatal: unable to control /var/run/service/svc-plex: No such file or directory

That’s what I am getting.