@Craig_Holliday:
This is great news!
To help that even further, would you be able to emit an event like the one below every few seconds when in verbose log mode:
Sample Event:
{
  "timeStamp": "2025/01/08 12:50:54.792",
  "sessionID": "UNIQUE GUID IDENTIFYING THIS PLAY SESSION",
  "event": "playState",
  "client": {
    "player": "experimental",
    "appBuild": "1.0.0",
    "videoRenderer": "new;someversionstring",
    "audioRenderer": "old;somethingelse",
    "DeviceName": "X-Plex-Device-Name",
    "DeviceModel": "X-Plex-Model",
    "DeviceScreenDensity": "X-Plex-Device-Screen-Density",
    "DeviceScreenResolution": "X-Plex-Device-Screen-Resolution",
    "ClientPlatform": "X-Plex-Client-Platform",
    "ClientPlatformVersion": "X-Plex-Platform-Version",
    "ClientCapabilities": "X-Plex-Client-Capabilities",
    "ClientLocation": "local|remote"
  },
  "PlaySession": {
    "sessionKey": "131",
    "guid": "",
    "ratingKey": "312995",
    "url": "",
    "key": "/library/metadata/312995",
    "viewOffset": 5209000,
    "playQueueItemID": 2256739,
    "playQueueID": 74690,
    "state": "playing",
    "transcodeSession": "ba046284-92a6-41e3-9029-cff0781f9ee4-13"
  },
  "TranscodeSession": {
    "key": "/transcode/sessions/4c3d1ed8-15da-4c88-b7a8-56107f9aa68e-42",
    "throttled": false,
    "complete": false,
    "progress": 64.0,
    "size": -22,
    "speed": 6.0,
    "error": false,
    "duration": 9102177,
    "context": "streaming",
    "sourceVideoCodec": "hevc",
    "sourceAudioCodec": "eac3",
    "videoDecision": "transcode",
    "audioDecision": "transcode",
    "protocol": "http",
    "container": "mkv",
    "videoCodec": "h264",
    "audioCodec": "aac",
    "audioChannels": 2,
    "transcodeHwRequested": true,
    "transcodeHwDecoding": "nvdec",
    "transcodeHwEncoding": "nvenc",
    "transcodeHwDecodingTitle": "NVIDIA (NVDEC)",
    "transcodeHwFullPipeline": true,
    "transcodeHwEncodingTitle": "NVIDIA (NVENC)",
    "timeStamp": 1736102547.1930953,
    "maxOffsetAvailable": 5819.236,
    "minOffsetAvailable": 0.0
  },
  "mediaItem": {
    "url": "https://something.plex.direct/library/something/file.mkv",
    "size": 123456789,
    "duration": 12345678,
    "videoStream": {
      "codec": "HEVC",
      "quality": "3840x1920",
      "criteria": "SDR",
      "bitrate": 123123123123,
      "fps": 12.123123123,
      "playMethod": "DirectPlay|DirectStream|Transcode"
    },
    "audioStream": {
      "codec": "EAC3",
      "channels": 2,
      "bitrate": 123123123123,
      "playMethod": "DirectPlay|DirectStream|Transcode"
    }
  },
  "performance": {
    "droppedFrames": 123,
    "droppedPackets": 123,
    "avSync": 0.0032423324,
    "thermalState": "nominal"
  }
}
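To make the "every few seconds" idea concrete, here's a minimal sketch of what the emitting side could look like. It's purely illustrative: the player object, its snapshot helpers, and the log.verbose flag are names I've made up, not actual player internals.

import json
import threading
import time

def emit_play_state(player, log):
    # Serialize the current state of the play session into one JSON log line.
    event = {
        "timeStamp": time.strftime("%Y/%m/%d %H:%M:%S"),
        "sessionID": player.session_id,              # hypothetical accessor
        "event": "playState",
        "PlaySession": player.play_session(),        # hypothetical snapshot helpers
        "TranscodeSession": player.transcode_session(),
        "performance": player.performance(),
    }
    log.debug(json.dumps(event, separators=(",", ":")))  # one compact line per event

def start_verbose_heartbeat(player, log, interval_s=5.0):
    # Emit a snapshot every few seconds, but only while verbose logging is on.
    def loop():
        while player.is_playing and log.verbose:     # both flags are assumptions
            emit_play_state(player, log)
            time.sleep(interval_s)
    threading.Thread(target=loop, daemon=True).start()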
Sources:
I imagine the transcodeSession.{start|end|update} and playing events would be particularly helpful sources for this, and the rest of the info is either in the overlay or already available elsewhere in the app, so it shouldn't really be anything net new.
PII/Privacy Concerns:
I selected these really carefully from the data I was able to see in my own log output. I don't believe any of them are personally identifiable on their own, though since the user would be posting the log output themselves, some traceability is inevitable if you wanted to backtrack it.
It might be possible to use some of the media item information to work out what item is being played, but you'd need a fairly large database of stream information to match the exact file by its specific bitrate, size, duration, codecs, etc.
Overall, then, I'm not especially concerned that this data is overly risky to collect, but if need be I can imagine it being enabled/disabled per debug build, though I'd love to see it permanently incorporated into the app (and server!).
Overall Rationale: Why do this?
- I believe it contains all the elements someone would need to understand the whole context of a play session.
- When aggregated, patterns should quickly emerge as to which scenarios are causing issues.
- With a consistent source file (test data), it becomes even simpler both to verify that the source file is accurately analyzed/defined by PMS and to understand what happens when you play it.
- Using JSON for the event data makes parsing relatively simple: if you just want a small slice of the data you can pull it out, and if you want the whole context, it's there (see the sketch after this list).
- It would also be possible to ingest these events into a time-series database/viewer, which would make understanding what's going on even more productive.
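As a rough illustration of that "small slice" parsing, here's a sketch that pulls the events back out of a verbose log; the log file name and the assumption of one compact JSON object per line are mine, not anything the apps actually do today:

import json

# Scan a verbose log for JSON playState events and print a small slice:
# timestamp, transcode speed, and dropped frames.
events = []
with open("Plex Media Server.log", encoding="utf-8") as f:  # assumed file name
    for line in f:
        start = line.find('{"timeStamp"')  # assumes one compact JSON object per line
        if start == -1:
            continue
        try:
            events.append(json.loads(line[start:]))
        except json.JSONDecodeError:
            pass  # not one of our events after all

for e in events:
    speed = e.get("TranscodeSession", {}).get("speed")
    dropped = e.get("performance", {}).get("droppedFrames")
    print(e["timeStamp"], speed, dropped)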
Right now, all* of these attributes are available to the player, whether through events sent to or from PMS, local metrics (e.g. the X-Plex-* values), or overlay data. They're just not condensed into one snapshot, so trying to compile the timeline of a play session is near impossible. If each of these events were written to the logs, we'd be able to interpret the whole picture of a play session very easily and very quickly.
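And to make the "whole picture" point concrete, here's a hedged sketch of turning those parsed events into per-session timelines (reusing the hypothetical events list from the sketch above):

from collections import defaultdict

def timelines_by_session(events):
    # Group parsed playState events by sessionID and sort each group by time,
    # so every play session reads top-to-bottom as a timeline.
    grouped = defaultdict(list)
    for e in events:
        grouped[e["sessionID"]].append(e)
    return {
        sid: sorted(evts, key=lambda ev: ev["timeStamp"])
        for sid, evts in grouped.items()
    }

# Example: print state and view offset per session.
for sid, evts in timelines_by_session(events).items():
    print("session", sid)
    for e in evts:
        ps = e.get("PlaySession", {})
        print(" ", e["timeStamp"], ps.get("state"), ps.get("viewOffset"))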
I hope this helps, and that it's taken in the spirit it's intended!
*(I think one or two items are not available yet…)