
Trying to get gamescope working on Linux with Nvidia card


The issue with BloodRayne was the same as with Dredd: the game seems to use a mix of DirectX 8 and 9.

 

Installed it from the Lutris WINE GOG script: https://lutris.net/games/bloodrayne/

Extracted the dgVoodoo2 dlls to the folder with the rayne.exe

Set d3d9 and d3d8 as native overrides in winecfg

Enabled gamescope, mangohud, and dgvoodoo in Lutris

 

Screenshot:

 

Spoiler

(Pardon the low FPS; I'm away from my desktop at the moment and this laptop only has integrated HD 4400 graphics.)

Screenshot at 2023-05-26 13-52-33.png

 


I don't get it; I still can't get downsampling working at all. I'm attaching some screenshots; maybe you can point out what I'm doing wrong:


 

Spoiler

Settings:

qOOldqr.png

 

In-game settings:

ih9IBii.png

 

In-game screenshot:

gPZbjyA.png

 


You're not doing anything wrong; I'm afraid the reason it isn't working is that gamescope doesn't support proper downsampling. There's a work-in-progress version with good-quality downsampling at https://github.com/ValveSoftware/gamescope/pull/740, but that patch will only be merged at the discretion of the maintainers (Valve and their contractors).
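(For background, since the distinction matters later in this thread: bicubic filtering reconstructs each output pixel from a 4x4 neighbourhood of source texels weighted by a cubic curve, versus the 2x2 neighbourhood bilinear uses. A sketch of typical cubic B-spline weights in GLSL, not necessarily the PR's exact code:

    // Sketch only, not the PR's code: uniform cubic B-spline weights
    // for the four texels along one axis, given the fractional sample
    // position v in [0, 1]. The four weights always sum to 1.
    vec4 cubicWeights(float v)
    {
        vec4 n = vec4(1.0, 2.0, 3.0, 4.0) - v;
        vec4 s = n * n * n;
        float x = s.x;
        float y = s.y - 4.0 * s.x;
        float z = s.z - 4.0 * s.y + 6.0 * s.x;
        float w = 6.0 - x - y - z;
        return vec4(x, y, z, w) / 6.0;
    }

The same weights are applied on both axes, so 16 texels contribute to each output pixel.)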

 

I've compiled a version which should work on your machine with that patch (copy it to /usr/local/bin and Lutris should pick it up). If you press (super/windows key)+k while the game's running it should toggle the downsampling filter.

 

Spoiler

Nicely anti-aliased example (UI text will be a bit soft, but that's unavoidable without native in-game SSAA):

Screenshot_20230527_165401.png

 

 

 

gamescope.zip



I tried that gamescope version @ReflexiveTransativeClosure posted. The bicubic filter is doing something.

 

Screenshot_20230527_202133.png

 

The full-size screenshots went over the upload limit, so I've uploaded them to imgur (pardon any compression artifacts):

 

Spoiler

Native 2K
Rfsc6Cj.png

 

4K>2K (no filter)
eqymPtY.png

4K>2K (Bicubic Filter)

5KWx4vQ.png

 

 


Also, I forgot to say: it's off by default, so you'll have to press meta+k at least once to see any difference (in particular, those alpha-tested shrubs in the distance are a good tell that it's working).

 

If it looks a bit muddy, make sure you're downscaling from an exact multiple of the display resolution, up to 4x per axis (beyond that it will run into other problems), so up to 7680x4320 from a 1080p output.
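To illustrate the idea, here's a hypothetical GLSL sketch (not gamescope's actual code): at an exact integer factor N, each output pixel can simply average the N x N block of source texels it covers, so no sample ever lands between texels:

    // Hypothetical sketch, not gamescope's code: box-filter downsample
    // at an exact integer scale factor N. Each output pixel averages
    // the N x N block of source texels it covers.
    vec4 boxDownsample(sampler2D src, ivec2 outCoord, int N)
    {
        vec4 acc = vec4(0.0);
        for (int y = 0; y < N; ++y)
            for (int x = 0; x < N; ++x)
                acc += texelFetch(src, outCoord * N + ivec2(x, y), 0);
        return acc / float(N * N);
    }

At a non-multiple resolution, each output pixel covers a fractional number of source texels, which is presumably where the mud comes from.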

 

On 5/27/2023 at 2:47 PM, Ross Scott said:

I don't get it; I still can't get downsampling working at all. I'm attaching some screenshots; maybe you can point out what I'm doing wrong:

The only real difference from what I'm doing on my end is that I didn't set the resolution in dgvoodoo.conf, only in the fields in Lutris and in game.

 

On 5/27/2023 at 9:30 PM, ReflexiveTransativeClosure said:

If it looks a bit muddy, make sure you're downscaling from an exact multiple of the display resolution, up to 4x per axis (beyond that it will run into other problems), so up to 7680x4320 from a 1080p output.

It makes the textures a lot sharper as well, but arguably makes the aliasing even more obvious, which is what Ross is trying to reduce.

On 5/27/2023 at 6:48 PM, ReflexiveTransativeClosure said:

You're not doing anything wrong; I'm afraid the reason it isn't working is that gamescope doesn't support proper downsampling.

 

Well then, how was @Unaccounted4 able to get it working? His earlier screenshots definitely seem to be showing it.

 

On 5/27/2023 at 6:48 PM, ReflexiveTransativeClosure said:

I've compiled a version which should work on your machine with that patch (copy it to /usr/local/bin and Lutris should pick it up). If you press (super/windows key)+k while the game's running it should toggle the downsampling filter.

 

This is the first thing that DID work; however, I think bicubic is the wrong way to go here. Besides having a bit of an oversharpened effect in areas, overall it introduces noticeable extra blur across the board; it kind of reminds me of old-school Quincunx AA. Is there any way to have it just use a basic bilinear resampling filter instead? I think that would recreate the normal SSAA effect.

 

On 5/27/2023 at 10:50 PM, Unaccounted4 said:

The only real difference from what I'm doing on my end is that I didn't set the resolution in dgvoodoo.conf, only in the fields in Lutris and in game.

 

I tried it without the dgvoodoo.conf line added. The result was the same; the only difference was that the resolution on the wheel in the menu was lower. Any more ideas why it's not working on my end? Do you think this is an Nvidia vs. AMD issue or something else?

On 5/28/2023 at 10:34 AM, Ross Scott said:

I tried it without the dgvoodoo.conf line added. The result was the same; the only difference was that the resolution on the wheel in the menu was lower. Any more ideas why it's not working on my end? Do you think this is an Nvidia vs. AMD issue or something else?

I think that's the million-dollar question; if we can figure that out, we can probably report the bug and it might get fixed pretty quickly.


Does Ross have both an Nvidia and an AMD rig? If so, does gamescope behave differently with Nvidia than it does with AMD?

 

Could it be that gamescope simply honors the game's request to run at a lower resolution? There is no verbose flag to run gamescope with, so despite various local tests I am unable to tell what is going on or to know the true resolution a game is rendering at. For instance, Neverwinter Nights will run at a higher resolution, but Diablo (through DevilutionX) doesn't look different at all, and FTL just doesn't like gamescope at all.

On 5/29/2023 at 6:25 PM, xrogaan said:

Does Ross have both an Nvidia and an AMD rig? If so, does gamescope behave differently with Nvidia than it does with AMD?

 

Could it be that gamescope simply honors the game's request to run at a lower resolution? There is no verbose flag to run gamescope with, so despite various local tests I am unable to tell what is going on or to know the true resolution a game is rendering at. For instance, Neverwinter Nights will run at a higher resolution, but Diablo (through DevilutionX) doesn't look different at all, and FTL just doesn't like gamescope at all.

I plan to do tests on an AMD card later, but I was under the assumption I could get downsampling working on an Nvidia card also. If that is, in fact, not possible, I'll abandon it, but I was trying to get a clear answer one way or the other.

 

As for your question, I don't know where the request would be coming from; it was specified at 4K for the sample game I ran.

 

With your examples, that's easily explained. Neverwinter Nights is a 3D game and can be rendered at different resolutions. Both Diablo and FTL are 2D games, so there's no 3D data to render at a higher resolution.


Just in case this is helpful, I took a screenshot of the blur introduced in the alternate gamescope ReflexiveTransativeClosure uploaded. Below it is a screenshot without AA at 4K which I took in Windows, then resized to 1080p using bilinear filtering to get 2x2 downsampling. If you look at the edge of the car or the "checkpoint" text, the introduced blur is more obvious.


 

Spoiler


alternate gamescope:

NFHrzFe.png

 

 

4K on Windows, downsampled to 1080p using bilinear filtering:

sdDWBPW.png

 

On 5/28/2023 at 9:34 AM, Ross Scott said:

Well then, how was @Unaccounted4 able to get it working? His earlier screenshots definitely seem to be showing it.

Okay, I made a mistake here: gamescope can (and, in my testing, appears to) apply a very limited form of bilinear filtering in very specific circumstances, which isn't really by design. I think you would have to be downscaling from an internal resolution which is a non-integer factor of the output. This likely won't be as good as proper bilinear (not least because of the resolution requirements), but it might be that this effect is sufficient for you -- choosing the right internal resolution would be a matter of trial and error.

 

(Another possibility is that dgVoodoo2 is doing the downsampling -- but I don't think that's what's happening given the configuration described.)

 

On 5/29/2023 at 5:01 PM, Ross Scott said:

I plan to do tests on an AMD card later, but I was under the assumption I could get downsampling working on an Nvidia card also.

You're right; given gamescope is working at all, the behaviour of the downsampling should be identical on AMD and Nvidia.

 

I also attach a gamescope build with a linear-ish filter. It's something I hacked together in half an hour though, so it won't be perfect (I'm not a graphics programmer), and it only works for exactly 2x scaling. The result is somewhere in between unfiltered and bicubic for aliasing and softness.

The keybinds are different too (it's based on my downstream tree, sorry): it's meta+b for no filtering (blit) and meta+j for bilinear; meta+k gives a somewhat broken bicubic. (Also, if you press meta+n (sample nearest) by accident, press meta+l (sample linear) to get everything back to normal.)
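For the curious, the usual trick for a cheap linear-ish 2x downsample (a sketch of the general technique; the one-line patch later in the thread does essentially this inside the bicubic compute shader) is to lean on the hardware's bilinear filter: with GL_LINEAR enabled, a single texture() tap placed exactly at the corner shared by a 2x2 block of source texels returns the average of all four, i.e. a 2x2 box filter:

    // Sketch of the standard single-tap trick (an assumed approach,
    // not necessarily this exact build's code): with GL_LINEAR
    // filtering, a sample taken exactly at the corner shared by four
    // texels weights each of them 0.25, giving a 2x2 box average.
    // Only valid for exactly 2x scaling.
    vec4 linear2x(sampler2D src, ivec2 outPixel)
    {
        vec2 srcSize = vec2(textureSize(src, 0));
        // Output pixel p covers source texels 2p and 2p+1 on each axis;
        // their shared corner sits at (2p + 1) / srcSize in UV space.
        vec2 uv = (2.0 * vec2(outPixel) + 1.0) / srcSize;
        return texture(src, uv);
    }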

 

Unfiltered:

Spoiler

aliased.png

Hacky bilinear:

Spoiler

linear.png

 

gamescope.zip

On 5/30/2023 at 2:01 AM, ReflexiveTransativeClosure said:

I also attach a gamescope build with a linear-ish filter. It's something I hacked together in half an hour though, so it won't be perfect (I'm not a graphics programmer), and it only works for exactly 2x scaling. The result is somewhere in between unfiltered and bicubic for aliasing and softness.

 

Well, in my limited testing, this pretty much does it. It was a much cleaner image than the bicubic one, and while there was some very slight blurring on the text, I noticed part of the HUD was cleaner on your version than in the 4K 2x2 downsampling I posted earlier, and that was about as pure as it gets. I tested it in 1NSANE at 4K to 1080p, and in Judge Dredd from 1600x1200 to 800x600. It worked about how it should. Thanks a ton for the help.

 

So the problem was that the filtering only kicks in for non-integer resolution factors? That would explain why Unaccounted4's worked, since he was going from 4K to 1440p (a 1.5x factor per axis). That is quite bizarre, but I'm glad you found a solution! I'll see later whether your theory holds up using vanilla gamescope and whether I can replicate the results.

 

So, for the record, you found a way to apply it to 2x integer resolutions specifically and made a custom build based on that? It's still weird that they didn't support that in the first place.

On 5/30/2023 at 2:01 AM, ReflexiveTransativeClosure said:

I also attach a gamescope build with a linear-ish filter. It's something I hacked together in half an hour though, so it won't be perfect (I'm not a graphics programmer), and it only works for exactly 2x scaling. [...]

Very cool.
Would it make sense to make a git repo for this, so that other interested people can find and contribute to it?
Maybe it will get to the point where it's merged back upstream.

On 5/30/2023 at 12:21 PM, testman said:

Would it make sense to make a git repo for this, so that other interested people can find and contribute to it?

Or better, submit it as a PR to the main project and get a discussion going on how to improve the software in that way.

On 5/30/2023 at 1:01 AM, ReflexiveTransativeClosure said:

I think you would have to be downscaling from an internal resolution which is a non-integer factor of the output.

Yup, that seems to be exactly what's happening. Very good catch!

 

On 5/30/2023 at 1:01 AM, ReflexiveTransativeClosure said:

I also attach a gamescope build with a linear-ish filter. It's something I hacked together in half an hour though, so it won't be perfect (I'm not a graphics programmer), and it only works for exactly 2x scaling. The result is somewhere in between unfiltered and bicubic for aliasing and softness.

The keybinds are different too (it's based on my downstream tree, sorry): it's meta+b for no filtering (blit) and meta+j for bilinear; meta+k gives a somewhat broken bicubic. (Also, if you press meta+n (sample nearest) by accident, press meta+l (sample linear) to get everything back to normal.)

As the others suggest, you should absolutely make this a PR for gamescope. That is actually awesome!


On 5/30/2023 at 9:50 AM, Ross Scott said:

So, for the record, you found a way to apply it to 2x integer resolutions specifically and made a custom build based on that? It's still weird that they didn't support that in the first place.

Yes, more or less (although, now that I think about it, larger factors should probably work fine too). The omission is probably explained to some extent by the fact that gamescope was primarily developed for the Steam Deck (so, limited battery and a small screen), where users are very unlikely to be downsampling in most cases.

 

For anyone who wants this in their own builds, the simplest version (which I'm not sure is strictly linear, but is linear-ish, and should work with any scaling factor supported by the bicubic PR) is a one-line patch on the bicubic branch: just open src/shaders/cs_bicubic.comp and add the following to the start of textureBicubic():

    return texture(splr, texCoords);
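In context, the patched function would then start something like this (a sketch: the signature is inferred from the patch line above, and the rest of the body is the PR's existing bicubic code):

    // Sketch of the patched function; only the early return is the
    // actual patch, and the signature is inferred from it.
    vec4 textureBicubic(sampler2D splr, vec2 texCoords)
    {
        // Return the plain hardware-filtered (bilinear) sample straight
        // away, making the bicubic weighting below unreachable.
        return texture(splr, texCoords);

        // ... the PR's bicubic sampling code follows here ...
    }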

 

Hopefully, if/when the bicubic PR is merged, it shouldn't be too hard to add this as an extra mode.
