
Linux gaming technical discussion



cgalves: Thanks a bunch for your testing; you may have saved me a bunch of time.  I have a few questions based on your results.



•Since AA is a cornerstone of what I'll be testing for, unless the game I'm testing is DX9 or higher AND supports in-game AA options, does it make any sense for me to even use D9VK?  Obviously it has its uses, but if it can't force AA, that makes it less relevant for my testing, except for a few games.


•What's the advantage of using Lutris over PlayOnLinux for DX5 - DX8 games?  Also, many of the games I'm testing are a little less common; I typed in a dozen titles of games I was going to test that didn't even have entries in the Lutris database.


•Maybe qptain Nemo could assist with this also.  Are there any games that give better results with dgvoodoo under DXVK?  I think you use it to tell it to render the game under DX11, but whether it works with a specific game is another question.



RaTcHeT302: Regarding the fence test, that has only 3 solutions:


•Mixed-mode MSAA / SSAA (called TSAA by Nvidia, Adaptive AA by AMD, but they may have changed the naming).  This uses MSAA for polygon edges and SSAA for alpha textures.  It won't clear up all aliasing (like some shader effects), but it will clear up the fence (when it works).  This technique doesn't work on most modern games.

•SSAA.  Costly in performance, but gives the cleanest picture (when it works).

•Downsampling.  Similar to SSAA and can be even more costly, but it has much higher compatibility (assuming the game isn't resolution-capped).


I'll talk a little about this when I make the video.


@qptain Nemo Thank you for the instructions on dgvoodoo, I'll have to try it out whenever I get the chance. In regards to the patch in question, I'm not fully sure if it does integer scale or just proportional stretch to full screen. Ross may correct me if I'm wrong, but I believe he was looking for the former, since it provides a sharper look. More info here.


@RaTcHeT302 I wasn't aware of that issue on UE3. Is there a reliable way of telling which games run on which engine? I'd like to cross-compare the games I tested with what engine to better see where the issues I encountered lie.

I had zero luck with FXAA, but I agree that a more bespoke solution would be nice. Unless we manage to make a fund for some programmer to design something to our exact criteria, our best bet is to probably make a feature request on the D9VK GitHub for some option to more reliably add AA in the middle of the rendering pipeline, rather than some endpoint defined by Lutris/Nvidia.


@Ross Scott You're welcome! I'm glad I could be of assistance. As for your questions...

It's hard to say, since these projects evolve so quickly. If you were to release the video tomorrow, I'd say it's probably not worth using in most games. If you were to release it 6 months from now, then the lack of forced AA might well have been solved by then. I'd say that at a minimum it deserves a mention, since (as my testing shows) the performance boost can be huge, leading to games that might not have been playable before being playable now.

The advantages of Lutris aren't just that DXVK/D9VK are a lot easier to use. The whole interface feels newer and more modern, complete with interacting with Wine through drop-down menus rather than messing with winecfg or the wine registry. Then there's the inclusion of the Protonified Wine builds, which use patches made by Valve to improve game compatibility and performance across the board. The one-click game installs are also a lot better than what PoL had, with tons more games available since the scripts are a lot easier to make. Lutris is also under more active development, so new features get added fairly frequently (they recently added GOG integration!). The only downside I can think of with Lutris vs PoL is that PoL had a nicer method for making new Wine prefixes, while Lutris's is a little more convoluted. I'd still say the other benefits outweigh that drawback, though.

I hadn't really heard of dgvoodoo until this thread, but from their description, it seems like they're doing for DirectX 1 through 8.1 and Glide what WOGL, D9VK and DXVK do for DirectX. Basically, taking instructions that might not have support on modern GPUs/Windows drivers and translating them into modern DX11 instructions. I've yet to try it, and it's hard to say whether or not we'd have better luck with forced AA down that route. At that point you're taking code and pushing it through two translation layers, so all sorts of weird things could happen. Or it could all work perfectly with zero issues. I'll figure it out soon enough, I suppose.

Posted (edited)
1 hour ago, Ross Scott said:

Mixed mode MSAA / SSAA (called TSAA by Nvidia, Adaptive AA by AMD, but they may have changed the naming).  This uses MSAA for polygon edges and SSAA for alpha textures.  Won't clear up all aliasing (like some shader effects), but will clear up the fence (when it works).  This technique often doesn't work on most modern games.

I don't think that'll work. Having both MSAA and SSAA on a deferred shading engine is probably not going to happen, and most games have AA techniques implemented depending on the rendering engine.


It's all very engine-dependent, so one universal AA solution might not be possible; it'll have to be a mix of techniques for now, sadly. Even if we had MSAA, it would probably be faked.


Most post-process AA techniques are not actually anti-aliasing; they act more like filters \ blur, or something like that. It's more like fake anti-aliasing: it tries to mimic AA, but it's not close to the real deal yet. My information could be outdated or wrong though, feel free to correct me.


Some games don't even bother with normal anti-aliasing; they do some fancy processing in-engine, and it sorta looks good enough for them.


It gets to the point where the terminology loses all meaning; the MSAA you know of might not actually be the MSAA being used in a given game.

Edited by RaTcHeT302

Posted (edited)


I don't really know, sorry.


There are probably some DirectX-specific tools you can use; I know Microsoft released some (for driver-related stuff) but I forgot the name. I don't know how that would help you find out what engine the game is running on, though. I guess Wikipedia, build data, or metadata in the compiled binaries might have some information.

Beyond trial and error and some random guessing, I don't know. You can probably go for the easy guesses: if the game uses .upk files it's probably an Unreal Engine game, and Unity games always have an Assembly-CSharp.dll file in the Managed folder.
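Those two file heuristics are easy to script, for what it's worth. A rough sketch (the function name and the demo folder layout are my own inventions; real installs will vary):

```shell
# Guess a game's engine from tell-tale files, per the heuristics above.
guess_engine() {
    dir="$1"
    if find "$dir" -iname '*.upk' | grep -q .; then
        echo "Probably Unreal Engine (found .upk packages)"
    elif find "$dir" -ipath '*Managed/Assembly-CSharp.dll' | grep -q .; then
        echo "Probably Unity (found Assembly-CSharp.dll)"
    else
        echo "Unknown engine"
    fi
}

# Demo against a fake Unity-style install folder:
demo=$(mktemp -d)
mkdir -p "$demo/Game_Data/Managed"
touch "$demo/Game_Data/Managed/Assembly-CSharp.dll"
guess_engine "$demo"    # -> Probably Unity (found Assembly-CSharp.dll)
rm -r "$demo"
```

Obviously this only covers the easy cases; heavily modified engines won't always leave these fingerprints.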


You can drag and drop random .dlls into dnSpy; if they decompile, there's most likely some data you can find in there. Anything made in C# should be easy to tell.


For C++ \ x86 stuff, x64dbg. Anything else, I've got no clue. Maybe that NSA reverse engineering tool could be useful, buuut I'm staying away from that for now. Ghidra?


Maybe CFF Explorer can help, but I don't know where to download that anymore. Or just right-click random .dlls in the game folders and check the properties for information. Maybe there are some console commands you can run in Unreal Engine games to show the build \ engine version. Most Unreal Engine 3 games have the @ key bound to the console. You can just look at the console and probably tell what engine it's on by what the console looks like.


Games like Rocket League most likely use heavily modified versions of the engine though, so knowing what engine we are dealing with is not very helpful on its own, I think. Knowing how each game renders stuff is probably more useful. I really don't know, I'm really ignorant on this topic; I wish someone more knowledgeable could help you guys out.


I think these are things the Vulkan/DirectX people probably know more about. I think you should ask them for more help. I'm probably talking out of my ass, so please dismiss everything I said if it's all wrong.


Edit: Oh, it's DirectX 12 only? I guess I was remembering it wrong. Not very useful, I suppose. https://devblogs.microsoft.com/pix/


There are some methods that can be used to translate DX9 calls to DX12, which might make the tool less useless in that case.


Unreal Tournament 2004 has a DX8-to-DX9 (or DX11) conversion; the framerate was great, but I was no longer able to force AA after changing the DX level.


Was it this one? I lost the link sadly.



I randomly stumbled on this too. First time I ever heard of it.








I don't know why going to a different version of DirectX broke forced AA (I don't really know how any of this works on a technical level), so that's a little thing you should probably write down.


You can translate DX calls to more modern versions of DX, but the existing AA techniques will probably stop working (probably some driver-level shenanigans? maybe the conversion was incomplete? I really don't know).



I think this is mostly Windows-based though; probably not very Linux-friendly.

Edited by RaTcHeT302

Posted (edited)
9 hours ago, Ross Scott said:

•What's the advantage to using Lutris over PlayonLinux for DX5 - DX8 games?  Also, many of the games I'm testing are a little less common, I tried typing in a dozen titles of games I was going to test that didn't even have entries in the Lutris database.


•Maybe qptain Nemo could assist with this also.  Are there any games that give better results with dgvoodoo under DKVK?  I think you use it to tell it to render the game under DX11, but whether it works with the specific game is another question.

Regarding using Lutris for DX5-8 games: well, it provides you with a lot of useful configuration options per game that are easy to change quickly, but if you don't need that, then you can do without Lutris.


Regarding dgvoodoo + dxvk: I'd say, at least in some cases, yes. It supports games up to DX 8.1. See for yourself: an imgur album is worth a thousand words.

Couldn't get it working with: Need for Speed 3, Vampire the Masquerade: Redemption.

I also kept the "apply Phong shading when possible" option on for all of them.

Edited by qptain Nemo

Posted (edited)

Wow, quite a bit of info here.  I can offer some advice in other areas, as I (unfortunately) have quite a bit of experience getting things working on Wine.  If you know what a 'debugger' is then you can have a taste of my latest hellish story :D (on the plus side: it seems that all of the W3Dhub games now work on Wine).


A note about drivers, ATI vs Nvidia


I'm a staunch ATI person on Linux.  AMD moved their driver development away from their proprietary drivers (Fglrx/Catalyst) and onto the open-source ones (Radeon, AMDGPU) years ago, so now the open source ones (the ones you get by default on Linux, zero effort) are absolutely amazing.  Extremely stable, feature-complete and fast.  It used to be so much worse, trust me.


Nvidia are still sitting on their proprietary drivers (NVIDIA), with the open-source community effort (nouveau) getting little/no help and so still being pretty crap (despite the best & probably tireless efforts of the devs).


Proprietary drivers interfering with other tools/features



If you are using any form of proprietary driver: it may interfere with standard Xorg utilities such as xrandr.  They typically want you to use their proprietary GUI application for configuring screens instead.


Please comment if it causes you issues.  I recall I used to have to force a larger virtual screen in a setting somewhere for FGLRX when I used to use that, otherwise I could not make my displays much bigger or have many of them side-by-side.  It ran out of pixels :D



Driver-level FPS/performance overlays


The Radeon and Nouveau drivers use some shared code called "Gallium" that can provide driver-level overlays for performance monitoring.  Here's an example:


    GALLIUM_HUD=.c120fps,.c32frametime wine Game.exe




These overlays have a lot of options, including tracking stuff like drawcalls and poly counts.  Above I've chosen fps (with a graph cap of 120) and frametime (graph cap of 32-ish).  To see all of the options run:


    GALLIUM_HUD=help glxgears

    GALLIUM_HUD=help any_program_that_uses_opengl_here


(Of course, DirectX calls get converted to OpenGL calls by Wine, so any DirectX game also works.  The shots above are from Interim Apex.)



Xrandr: upscaling and downscaling


xrandr is the standard Xorg utility for managing screens and displays.  Resolutions, refresh-rates, orientations, arrangements, etc.  I use it instead of any GUI tools for arranging & setting up my monitors because (IMHO) it's simpler and more predictable.


Sidenote: you can add your own modes (resolutions + refresh rates), something that I used to use to try and push my monitors to slightly higher rates (eg 60->72Hz).  Every monitor has different tolerances, some are much nicer than others.  Not covered here.


First you will want to familiarise yourself with Xrandr.  Run it without any options and it will dump your list of displays + their supported resolutions:

** I can't paste here, it makes a mess.  there's no option in the forum replies for monospace text or code anymore?  What? **


I typically use xrandr like this:


    xrandr --output DVI-0  --mode 1920x1080

    xrandr --output DisplayPort-1 --right-of DVI-0


Today we're going to use it to perform monitor scaling.

Upscaling: low-res games pixel-perfect




Sorry for hijacking the thread, but if there's any info on how to turn off bilinear filtering \ get pixel-perfect scaling (I forgot the term for it), it would be nice to have some info on it; it would be useful for games like StarCraft, where the game is just a blurry mess. 

This is my particular method.


Essentially what we want is:

  • Game provides a low resolution image
  • We get our computer to nearest-neighbour (pixel-perfect) scale this up a few times
  • We get our computer to then place this scaled version on the picture to send to a monitor, already in its native resolution
  • The monitor displays a native-resolution picture

ie the monitor does not perform scaling, we do, and we have full control over it.



If you try and use xrandr's scaling function you will find it provides crappy bilinear scaling by default:


    xrandr --output DisplayPort-1 --scale 0.5x0.5





Xorg is perfectly capable of proper nearest-neighbour scaling, it's just that the xrandr utility doesn't give you an easy way to use it (unless you like writing out your own matrices).  There's a patch to fix this, but patching a popular moving target can be a pain.
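One caveat before compiling anything: I believe the --filter option was eventually merged into stock xrandr (around version 1.5.1), so it's worth checking whether yours already has it before bothering with a patched build. The version number is from my memory, so verify on your system:

```shell
# Does your xrandr already understand --filter?
xrandr --help 2>&1 | grep -i filter

# If so, nearest-neighbour scaling works out of the box
# (replace DisplayPort-1 with your output name from plain 'xrandr'):
xrandr --output DisplayPort-1 --scale 0.5x0.5 --filter nearest
```

If the grep prints nothing, carry on with the build below.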


Luckily someone else has a patched copy of the code handy.  It took me less than 5 minutes to:


    git clone git://anongit.freedesktop.org/xorg/app/xrandr

    cd xrandr

    ./autogen.sh

    make




Note that I'm not installing this copy of xrandr system-wide by running 'sudo make install'.  I already have a system-wide copy of xrandr and I don't want them to fight for supremacy.  Instead I go back to this folder and run this special copy of xrandr from here whenever I need it.  Which method you use is up to you.


I also had to install an 'xorg-util-macros' package, otherwise the compilation process complained these macros were not available.  Your distro may have this package under a slightly different name.  You'll also need things like git, gcc, autotools and xorg-devel.  If you are on a Debian-based distro (like Ubuntu or Mint) you can probably get most of this with "sudo apt-get install build-essential".


Now that that is all done:


   ./xrandr --output DisplayPort-1 --scale 0.5x0.5 --filter nearest




Ooh, much better.  The blurriness in the picture is due to it being scaled down in my image editor + jpeg, in reality it's razor-sharp.


   ./xrandr --output DisplayPort-1 --scale 0.25x0.25 --filter nearest




*melting at the knees*




As you go to really high divisors (eg 0.2 and below) you probably no longer need to worry about your resolution dividing neatly.  ie you probably won't notice if every 5th pixel is 21 pixels high instead of 20 (for example).



Our next step is to prevent the game from changing our screen resolution (and ruining all of our work).


Open up winecfg and enable the "emulate virtual desktop" option.  Set it to the new divided resolution of your screen (eg 1280x1024 at 0.5x0.5 is 640x512).
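The division is trivial, but easy to fumble in your head. A two-line sketch of the arithmetic, using the 1280x1024 example above (the variable names are just for illustration):

```shell
# Wine virtual-desktop size for a given scale, written as a divisor
# (--scale 0.5x0.5 means divisor 2, 0.25x0.25 means divisor 4).
native_w=1280; native_h=1024; divisor=2
echo "$((native_w / divisor))x$((native_h / divisor))"   # -> 640x512
```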


Next set Wine's background colour to black, rather than the default Redmond-blue.  This is under the 'Desktop Integration' tab of winecfg.


Sidenote: you can also import Windows XP Luna theming here if you want :P Others probably work too, I've never tested them.  Sadly the titlebar of windows does not get themed, but all of the widgets do.


Things should be good to go now.  Your game will not be centred on your screen, unfortunately, as the black bars will be on the right and bottom.  You may also need to make your wine-desktop window fullscreen (your window-manager should hopefully have a way of doing this, try right-clicking the titlebar). 


To disable all of this shenaniganry:


    xrandr --output yourdisplay --scale 1x1


Ask for help if you need it.



Downscaling: a dirty way of forcing full AA on games that refuse to support it



xrandr also supports scaling in the other direction:


    xrandr --output yourdisplay --scale 2x2


I use this to take screenshots of programs & things that are bigger than my screen.  It's really fun to turn your 1366x768 screen into something "higher DPI" than your friend's 4K laptop, just make sure they are not wearing their glasses and you'll be able to fool them.


Of course this method is not as efficient as things like MSAA, but it at least works.


Again you may want to enable wine's virtual-desktop feature to prevent your game from changing the screen res, OR it might work fine already (with the game detecting the new virtual res as an actual monitor mode), YMMV.  For games that only support a small list of resolutions the virtual-desktop feature will allow you to force black (blue by default) bars as per the last section.



Edited by Veyrdite

Posted (edited)

Slowing games down


Some games can't handle your fast CPU.  I have some solutions that partially work or help, but no perfect solutions.


A perfect solution should be possible on Linux, as it's "just" a case of getting the kernel scheduler to only provide your game with timeslots every x scheduling slots or so, but I'm not aware of any pre-existing solutions to let you do that.


Example games I've encountered include:


  • Interstate76
    • Sky moves too quickly
    • Wheels glitch in/out of the ground (visual issue)
    • Game seems 100% playable, until you get to the mission where you have to jump over a bridge.  You can *never* make it with a high FPS.
    • As it turns out: higher FPS dramatically reduces your vehicle's max speed.   Dramatically.


  • No-One Lives Forever (NOLF) 1
    • Cutscenes cut each scene too early
    • Long audio (speech) tends to get cut



CPUlimit is a tool that I think works by sending SIGSTOP and SIGCONT signals to your process, ie suspending and unsuspending it very quickly.


It works, but it's ugly and takes some fiddling to get the numbers right.  It does not run at a high enough frequency to keep things nice & smooth to play.
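For the curious, the SIGSTOP/SIGCONT trick is easy to demonstrate by hand. This is only a toy sketch of the idea (a real tool like cpulimit tunes the duty cycle to hit a target CPU percentage; here 'sleep 5' stands in for the game process):

```shell
# Duty-cycle a process: freeze it with SIGSTOP, thaw it with SIGCONT.
# While stopped the process gets zero CPU time, so its average speed drops.
sleep 5 &
pid=$!
i=0
while [ "$i" -lt 3 ]; do
    kill -STOP "$pid"   # freeze
    sleep 0.05
    kill -CONT "$pid"   # thaw
    sleep 0.05
    i=$((i + 1))
done
kill "$pid" 2>/dev/null
wait "$pid" 2>/dev/null
echo "throttling demo done"
```

With a game you'd obviously loop forever and tune the two sleep intervals, which is exactly the fiddling I complained about above.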



Last time I tried: I didn't have luck compiling this myself and getting it to work.  Did it require special cross-compilation to make a 32-bit version when you are on a 64-bit OS?  Wine games are almost always 32bit, so you need a 32bit version for them.



These tools let you manually choose your processor's frequency scaling levels and governors.  You can use them to force your cores to stay in their lowest speed.


On my laptop this works great, I can go down to 480MHz.  My desktop unfortunately only goes down as far as 1.6GHz, which is still too fast for some of these games, but it is still a noticeable and dramatic improvement.


If you want to record video whilst doing this: only change the frequency of one CPU core and look into how to make a process only use that core on Linux.  That way the rest of your cores run at full speed for your capture software + encoding.
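A sketch of that setup, assuming the 'cpupower' (kernel tools package) and 'taskset' (util-linux) commands are available; exact governor names depend on your CPU driver, so treat these as illustrative:

```shell
# Slow down only core 0, then pin the game to that core, leaving the
# remaining cores at full speed for capture + encoding.
# The cpupower line is commented out because it needs root:
#   sudo cpupower -c 0 frequency-set -g powersave

# In practice you'd then launch the game like:
#   taskset -c 0 wine Game.exe
taskset -c 0 echo "this command ran on core 0 only"
```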


Force software rendering

(MESA only, ie not proprietary graphics drivers)


The Mesa part of the driver stack has some pixel-perfect software renderers included as a fallback.  You can force-enable them through one of Mesa's many flags (worthwhile skim-reading):


    LIBGL_ALWAYS_SOFTWARE=true  wine game.exe


YMMV.  Sometimes this is perfect.  Other times it's overkill and your FPS shatters.



Edited by Veyrdite


Veyrdite: I have an older AMD card, but I initially disqualified AMD from the running since it looks like on Linux, they dropped all support for forcing Antialiasing on the driver level.  Downsampling is of course an option, but I'm not sure that applies to older games that may not support modern resolutions (if it does, let me know).  


Correct me if I'm wrong, but my research before led me to this conclusion:


Nvidia drivers = forcing AA only works on some WINE titles, it's hit or miss, have to test it.

AMD drivers = forcing AA USED to only work on some WINE titles, it was hit or miss, so AMD got rid of the option altogether.


> Downsampling is of course an option, but I'm not sure that applies to older games that may not support modern resolutions (if it does, let me know).   

Modern resolutions == higher resolutions or actual modern 16:9 ones?


While you don't need exact modern-resolution support, you do at least need "higher" resolution support in your game for this to work.  Any sort of much-higher-res windowed mode will do if you are happy to live with some black bars around the edges.


Another problem worth mentioning: HUD scaling :D  Many games keep text, HUD and menus 1:1 with the pixels.  Depending on the game this can be anything from a somewhat harmless nuisance (eg on-screen health and ammo in an FPS) to completely frustrating (inventory management).  




> they dropped all support for forcing Antialiasing on the driver level


Raining anisotropic fridges, it looks like the Mesa devs did drop the overrides:


> Some users don't understand that these variables can break OpenGL. The general rule is that if an app supports MSAA, you mustn't use GALLIUM_MSAA.

> [...]

> In a nutshell, it does more harm than good.


I'm presuming Phoronix used them wrong, Phoronix's forum community kicked up a stink, and the Mesa developers got sick of it.  Whilst Phoronix is a great resource, it has some reasonably hellish forums & community.




There will always be ways of hacking this support back in through shim shared objects (the same thing as 'hacked' dlls for games).  Unfortunately I've only ever seen random bits of code that claim to do this but never compile right or otherwise seem to work.




Posted (edited)

... so I got thinking.  Wine provides a DirectX->OpenGL shim layer.  That layer would have to convert sampling config (AA) from one world to the other.  Perhaps I can fix this problem at the Wine level?

I just gave it a whirl on my laptop, and it looks like you can: there's a registry key called SampleCount that seems to do the job.  You have to manually create the key (and its folder) with 'wine regedit', they are not there by default.
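For reference, here's the .reg fragment I'd expect to do the same thing non-interactively. The Direct3D key path is from my own setup, so double-check it in your registry before importing it with 'wine regedit samplecount.reg':

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"SampleCount"=dword:00000004
```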


Before (no reg key) and after (SampleCount=4) on GTA San Andreas:




Note that transparent objects are still shite, but all other objects and terrain are much smoother.  I suspect all games will react differently.


This game supports AA natively, so it's probably a really bad example.  It might be forcing off the features that would fix the AA on the alpha blend/test textures (transparent wires and trees). 


@Ross Scott: name a few games (that you have particular troubles with and/or are interested in) and I'll see if I can get copies to give them a whirl.  I'm not a rich guy, so please don't ask for anything too new :P


Hope this helps.  

Sidenote that may be useful: the proprietary Nvidia drivers replace openGL libraries with their own versions, so you (unfortunately) have to fully uninstall the Nvidia drivers if you want to switch your cards out for an ATI/AMD one.  Just keep this in mind if you have any issues (eg no graphical environment, or a slow 1024x768 graphical environment) when you test an ATI card.


Edited by Veyrdite

