Script seems to wrongly disable HDR in certain situations #7

@Dev-Egor

Description
I have two monitors and a VDD, all with HDR enabled by default.
The VDD has a default resolution of 2712x1220.

When the Android client requests an SDR stream at 2712x1220, the script correctly disables HDR for the duration of the stream.
When the stream ends, both physical monitors correctly still have HDR enabled.

When the Android client requests an SDR stream at 2560x1440, the script again correctly disables HDR for the duration of the stream.
But when the stream ends, the main physical display remains stuck in SDR.

The requested resolution is the only difference between the two cases, and I can reproduce the issue 100% of the time.
I am not sure what causes it, but it could be related to either the main display not supporting the requested resolution, or to that resolution being the VDD's default. I am also not sure why HDR is disabled on the main display in the first place. I have tried adjusting the startDelay values in the different scripts, but nothing seems to help.

I am also running these other scripts, in the listed order:
MonitorSwapper
ResolutionMatcher
AutoHDR
RTSSLimiter
