macOS did have an arbitrary UI scale factor that developers could change with an older version of Quartz Debug.app. Then Apple settled on a fixed x2 scale.
The ability to use other scaling factors probably still exists in macOS, but developers have tested their software only with x2, so it might not work well even if you can figure out how to enable it.
Quartz Debug.app v3.0 from the Leopard (10.5.8) developer tools (Performance Tools) works on my Power Mac G5 (Quad). It supports an arbitrary scale factor from x1 to x3. Here's a screenshot of x3 on a 1920x1200 display:
My Mac Pro (Early 2008) (MacPro3,1) has 64 GB of RAM and an EVGA Nvidia GeForce GTX 680 Mac Edition.
To boot Leopard (10.5.8), I have to use
maxmem=32768
because the kernel is only 32-bit. I also have to remove NVDAResman.kext, otherwise the GTX 680 causes a kernel panic. Leopard has no driver support for the GTX 680, so there's only the one boot resolution, but I can still enable x3 scaling. I can't take screenshots in that configuration, though.
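In case anyone wants to reproduce this, here's roughly how I make those changes stick (a sketch; paths assume a stock Leopard install):

# Persist the kernel flag so it applies on every boot:
sudo nvram boot-args="maxmem=32768"
# (alternatively, set the "Kernel Flags" key in
# /Library/Preferences/SystemConfiguration/com.apple.Boot.plist)
# Move the conflicting driver aside and force the kext cache to rebuild:
sudo mv /System/Library/Extensions/NVDAResman.kext /NVDAResman.kext.disabled
sudo touch /System/Library/Extensions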
Snow Leopard (10.6.8) is similar - I either need to set
maxmem=32768
for the 32-bit kernel, or boot the 64-bit kernel with
maxmem=63488 arch=x86_64
(maxmem is set to 62 GB because the CPU doesn't like the full 64 GB - it slows things down for some reason). NVDAResman.kext does not need to be removed. Quartz Debug.app v4.0 from the Snow Leopard developer tools works, but I prefer v3.0 because it better honours disabling the "Restore scale factor to default (1.0) on quit" option. With that option disabled, you can log out and log back in to make the Finder use x3 scaling (arbitrary scaling is per-app in Leopard and Snow Leopard).
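As far as I know, Quartz Debug just writes the scale factor into a per-application preference, so it can also be set from Terminal. This is a sketch from memory - I believe the key is AppleDisplayScaleFactor, but treat the key name as an assumption to verify:

# Set the Finder's UI scale to x3 (key name assumed from the Leopard-era mechanism):
defaults write com.apple.finder AppleDisplayScaleFactor -float 3
# Log out and back in for the Finder to pick it up; delete the key to revert:
defaults delete com.apple.finder AppleDisplayScaleFactor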
Lion (10.7.5) supports multiple resolutions with the GTX 680, even though real GTX 680 support isn't supposed to arrive until Mountain Lion (10.8.5). Lion supports HiDPI x2 modes but drops the arbitrary scale factor: Quartz Debug.app v4.2 has a new "Enable HiDPI display modes" option that replaces it.
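If I remember correctly, the switch that Quartz Debug v4.2 flips can also be set directly from Terminal on Lion and later (log out or reboot afterwards) - a sketch, in case you don't want to install the developer tools:

# Ask the WindowServer to offer HiDPI display modes:
sudo defaults write /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled -bool true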
Anyway, the point is that the ability to use an arbitrary scale factor existed. I believe macOS today still contains much of that code - an app is told to use a scale factor number; it's not a boolean that is on or off. It might be possible to change that number. Since Lion (10.7.5), macOS creates HiDPI display modes, and the display mode includes a number for the scale factor. Perhaps that number can be changed. For example, where macOS adds a HiDPI x2 mode for each resolution, maybe we could patch it to add an x3 mode as well. Maybe this could be done with a Lilu + WhateverGreen patch... With high-resolution 8K displays, you might want to use scale factors up to x4 and beyond.
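For anyone who wants to dig into that idea: the hook that third-party HiDPI injector scripts use for exactly this is the per-display override plists, which can seed extra scaled modes. A rough sketch of where to look - the paths and the reported data layout are assumptions to verify, not a recipe:

# Find your display's EDID vendor/product IDs (Intel Macs):
ioreg -l | grep -E "DisplayVendorID|DisplayProductID"
# Override plists live here (path on 10.11+; older systems reportedly use
# /System/Library/Displays/Overrides):
ls /System/Library/Displays/Contents/Resources/Overrides
# HiDPI injector scripts reportedly add a "scale-resolutions" array of
# <data> entries (said to be big-endian width/height pairs) to the matching
# DisplayVendorID-xxxx/DisplayProductID-xxxx plist; a Lilu + WhateverGreen
# patch could presumably populate the same data at boot.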
I remember this being discussed earlier (though I can't quite remember where), and the consensus was that macOS didn't know how to handle an 8K image ...