> Forum is deleting posts where an EDIT has been made. Generally when done shortly after the post was made.
Why wouldn’t they fix that trivial issue?
Ten months of nVidia silence on the topic.
Shift+Ctrl+Win+B does nothing except show a black screen for a moment. The output remains (blurry) GPU scaling, and nVidia CP still shows Display as the scaling device.
Yes, the issue may be on the Windows side. But if GPU scaling were disabled completely, Windows could not silently switch to it.
Also (this does not resolve the main issue, it's just a related usability/UX thing): nVidia could keep track of Windows-level changes regarding GPU/display scaling and reflect the actual scaling device in nVidia CP.
[Take two.]
Same. Once I switch the resolution from 4K to FHD via Windows display settings (the window opened via the “Display settings” item in the context menu of the Windows desktop), GPU scaling silently starts being used instead of display scaling, even though the display is _formally_ doing the scaling and is shown as the scaling device in nVidia CP. For example, if I switch to FHD, Windows shows two resolutions in its display settings:
> Desktop resolution: 1920×1080
> Active signal resolution: 3840×2160
So I’m forced either to temporarily “switch” to GPU scaling via nVidia CP and immediately cancel the change to make the display _actually_ do the scaling, or to switch resolutions via the classic “List All Modes” window available via a button in the “Advanced display settings” window.
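(For anyone who wants to automate that workaround: below is a rough Python sketch using the pywin32 package that sets a mode directly via ChangeDisplaySettings, which as far as I understand is the same kind of direct mode set that “List All Modes” performs. Whether a mode set this way reliably avoids the silent switch to GPU scaling is an assumption I haven't verified beyond my own setup.)

```python
# Sketch: switch the primary display to 1920x1080 with a direct mode set,
# bypassing the Settings app. Requires the pywin32 package.
# Assumption: a direct mode set behaves like "List All Modes" and does not
# trigger the silent switch to GPU scaling.
import win32api
import win32con

# Start from the current mode so colour depth and refresh rate are preserved.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.PelsWidth = 1920
devmode.PelsHeight = 1080
devmode.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT

result = win32api.ChangeDisplaySettings(devmode, 0)
if result != win32con.DISP_CHANGE_SUCCESSFUL:
    print(f"Mode change failed, code {result}")
```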
Most people just have no clue which scaling (GPU or display) is in use because all monitors and most TVs have always added blur when scaling. But I’m currently testing the Eve Spectrum 4K monitor, the world’s first computer monitor with built-in pixel-perfect (integer) scaling.
The monitor is able to display e.g. FHD with each logical pixel as a perfect square 2×2 group of same-color physical pixels, and the difference is obvious.
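(Just to spell out the arithmetic behind the 2×2 claim; the helper name and the sketch below are mine, not anything from the monitor’s firmware:)

```python
# The largest integer ratio at which a source mode fits the 3840x2160 panel.
def integer_scale_factor(src_w, src_h, panel_w=3840, panel_h=2160):
    return min(panel_w // src_w, panel_h // src_h)

print(integer_scale_factor(1920, 1080))  # 2 -> each FHD pixel maps to a 2x2 block
```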
Also, my GPU is limited to 30 Hz at 4K via HDMI 1.x (and even via DP with this specific monitor for some reason, but that’s another story), so GPU-scaled FHD is not just blurry but also runs at 30 Hz like true 4K (because for the monitor, the GPU-prescaled signal _is_ 4K). When scaling via the monitor instead, I get 120 Hz via HDMI and 144 Hz via DP, with zero blur.
And yes, I’m aware that nVidia supports some limited form of integer scaling (incompatible with HDR, tiled mode, etc.), but it’s inapplicable to my GPU due to the way it’s implemented, and even if it were implemented in a backward-compatible way, integer scaling via GPU would result in 30 Hz instead of the 120/144 Hz I get with the monitor’s own scaling. Using integer scaling via GPU is _not_ a subject for further discussion.
My question is how to disable GPU scaling completely, as if the GPU/driver were not able to do scaling at all. There were nVidia GPUs unable to do scaling, e.g. the 7600 GS (or the latest driver available for it did not support scaling via GPU?). I need to disable GPU scaling as a feature, as if the GPU did not support it at all.
Disabling GPU scaling completely would also potentially solve the issue where GPU scaling to 800×600 is forcibly applied to 640×480 before the signal is actually output to the monitor; this issue occurs via HDMI but not via DP. At 640×480 via HDMI, Windows shows two resolutions in its display settings:
> Desktop resolution: 640×480
> Active signal resolution: 800×600
So even if the GPU scaling mode is switched to “No scaling”, the monitor receives an 800×600 signal it can only integer-scale by 3× (600×3 = 1800 ≤ 2160), so we get 3×3 pixels and a 33% loss of screen height instead of the 4×4 pixels and just 11% loss of screen height that a true 640×480 signal would allow (480×4 = 1920 ≤ 2160).
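(Here is the arithmetic behind the 3×3/33% vs 4×4/11% figures as a small Python sketch; it assumes the monitor simply picks the largest integer factor that still fits the 2160-line panel, which is how the Spectrum appears to behave:)

```python
PANEL_W, PANEL_H = 3840, 2160

def height_loss(content_h, signal_w, signal_h):
    # Largest integer factor the monitor can apply to the incoming signal.
    factor = min(PANEL_W // signal_w, PANEL_H // signal_h)
    displayed_h = content_h * factor  # the 640x480 content carried inside the signal
    return factor, 1 - displayed_h / PANEL_H

print(height_loss(480, 640, 480))  # (4, 0.11...) -> 4x4 pixels, ~11% height loss
print(height_loss(480, 800, 600))  # (3, 0.33...) -> 3x3 pixels, ~33% height loss
```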
Where is my reply?
Desktop. GTX 650 Ti Boost.
Why do you still add dots before and after the actual text of your comments?
Yeah, 2×2=4.
> Try the 471.11 driver.
Same with the latest 471.11.
> Then Use the previous working NVIDIA driver that you did not have this issue on.
I don’t know the exact version in which the bug was introduced. And it’s a bug anyway and should be fixed. And I still don’t know how to disable GPU scaling completely, which is what this topic is about in the first place.
A user reports that “enabling 160Hz mode disables Integer Scaling in the Nvidia Control Panel”. This may be another limitation of the nVidia implementation of integer scaling, besides incompatibility with HDR, custom resolutions, 4:2:0, tiled mode, etc.
https://www.reddit.com/r/Monitors/comments/mgcfbx/
It’s still unclear what makes integer scaling fundamentally different from DSR in terms of supporting older GPUs, given that both integer scaling and DSR do resolution virtualization transparently to the OS and apps. So if implementing integer scaling for older GPUs would need “ongoing continuous support” (as Manuel Guzman said), doesn’t DSR already need such “ongoing continuous support” too?
If nVidia has nothing to say, this is sort of bad news.
Well, maybe RTX 4000 will solve the issues.
There is a user report that with RTX 3000, integer scaling works fine together with DSR:
https://twitter.com/RedSwirl/status/1340423964452139009
This may mean that nVidia did fix the incompatibility of integer scaling with DSR in RTX 3000, and so maybe with HDR and 4:2:0 too.
That said, I’m not sure whether that user really understood the point, whether their test was clean, or whether I interpreted what they said correctly.
Manuel@NVIDIA, could you confirm or deny whether the integer-scaling limitations were resolved in the RTX 3000 series?
I’m also interested in whether the new Ampere GPUs are free of the integer-scaling limitations that Turing GPUs suffered from, such as incompatibility with HDR (and therefore with pretty much all modern games and monitors), custom resolutions and 4:2:0.
There is also a report from an RTX 3080 FE user that integer scaling does not work at all with the new RTX 3000 series:
https://www.reddit.com/r/nvidia/comments/izhis6/
Another attempt as a reply to my own comment:
The main thread about integer scaling was closed just about a month ago, probably because an nVidia employee decided that the original feature request had been fulfilled. I’m considering creating a new thread specifically requesting a limitation-free integer-scaling implementation (the current one does not support HDR, for example, unlike the other scaling methods). There would be no point in using related-but-different threads like this one for that.
I simply search for “integer scaling” in the nVidia forums from time to time, and recently found this thread with an ambiguous official comment that needs clarification.
If my current understanding is correct, the current nVidia integer-scaling implementation is fundamentally NOT the integer scaling we needed at all (aside from its previously known limitations like HDR incompatibility).