Linux gaming technical discussion


I'm personally a fan of LMDE... Stable, but stays up to date at the same time. Supports anything that would work in Debian, plus it has a more Windows-like UI. I use it on 2 of my personal systems.



Ross: Ignore Ubuntu's stance.  They have announced lots of bad decisions before and had to backtrack later, or never gone through with them.  It sounds like the people who made this decision don't play games (luddites, I say!).

 

Keep using whatever Linux distro you prefer.  If and when you encounter problems then (and only then) consider changing to one of the almost-identical-distros-with-a-different-name-and-colour-scheme that abound.

 

On 6/22/2019 at 11:10 PM, Ross Scott said:

The GTA screenshot looks like a classic case of MSAA-only with no alpha-texture supersampling.  The easiest solution (if possible) is to force full SSAA instead.  Although this is assuming the wires are textures and not polygon models.  If that's the case, then it's some shader mess interfering with everything.

The wires are indeed alpha textures on large rectangular polys.  I think this game is too old to be a shader mess; it more likely uses some fixed-function pipeline variant.

 

There are a few more registry keys in the doc I linked that could have an effect; I'll give them a try.

 

On 6/22/2019 at 11:23 PM, RaTcHeT302 said:

uhh well i had issues with 8-bit installers, or whatever they are called, not working on Windows anymore, so it'll likely have a major impact 

 

but maybe wine is somehow built with that in mind? i don't know, personally it has affected me before on Windows, I don't see why it wouldn't on Linux, but someone who has used Linux + Wine more in depth should look into it

 

16-bit.  I.e. games and programs from around the Windows 1.x/2.x/3.x/95 era.

 

There are quite a few, as even into the reign of Win98 many games would still use 16-bit executables to provide compatibility with older systems.  32-bit executables gave programmers a lot less headache (no 64K segment limit, proper memory management done for you, etc.), but it was still a new option.

 

I believe 64-bit Windows has dropped 16-bit support completely now (that was NTVDM, the old "Windows on Windows" layer; WOW64 proper only covers 32-bit applications).  On 32-bit versions you might still be able to enable it manually?  But on 64-bit I believe you are SOL.

 

The Linux world is unfortunately almost as bad at the moment.  Wine supports 16-bit executables perfectly, but you also need your kernel to support them.  This is an optional feature chosen when the kernel is compiled; to change it you have to recompile your kernel (easier these days, but still not something most ordinary users will want to do).

 

Different distros choose different options when compiling their kernels.  I'm unsure what Ubuntu does (try it to find out), but I know my distro (Void) has it disabled.  If you are lacking 16-bit LDT support then you will get messages like this when trying to run old 16-bit Windows programs:

 

$ wine esheep.exe

[...]

0032:err:winediag:build_module Failed to create module for "krnl386.exe", 16-bit LDT support may be missing.
0032:err:module:LdrInitializeThunk "krnl386.exe16" failed to initialize, aborting

$

 

Once upon a time 16-bit LDT was enabled by default in basically all distros, then some security flaws were found in it and as a result many distros changed their build settings to disable it.  I've heard that the problems have since been fixed, but few distros have bothered changing the settings back. 
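If you want to check your own distro's kernel, here's a quick sketch.  To my understanding the relevant build option is CONFIG_MODIFY_LDT_SYSCALL (that name is my assumption, not something from Wine's docs), and where the config file lives varies by distro:

```shell
#!/bin/sh
# Look for the 16-bit LDT option in the running kernel's build config.
# CONFIG_MODIFY_LDT_SYSCALL is my best guess at the relevant option name;
# the config may be exposed at /proc/config.gz or under /boot.
if [ -r /proc/config.gz ]; then
    zgrep 'CONFIG_MODIFY_LDT_SYSCALL' /proc/config.gz
elif [ -r "/boot/config-$(uname -r)" ]; then
    grep 'CONFIG_MODIFY_LDT_SYSCALL' "/boot/config-$(uname -r)"
else
    echo "kernel config not exposed on this system" >&2
fi
```

If it prints `CONFIG_MODIFY_LDT_SYSCALL=y` you should be fine; a `# CONFIG_MODIFY_LDT_SYSCALL is not set` line means you'll hit the krnl386.exe error above.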

 

After all, who cares about old software?  Pfft!

Edited by Veyrdite (see edit history)

On 6/23/2019 at 5:21 AM, BTGBullseye said:

I'm personally a fan of LMDE... Stable, but stays up to date at the same time. Supports anything that would work in Debian, plus it has a more Windows-like UI. I use it on 2 of my personal systems.

Ooh, thank you for the suggestion.  The Mint installs on my folks' laptops are now ancient and I want something else to put on them.


yes, 16-bit installers, i don't know why i said 8-bit (guess i had win 98 in my mind), i forgot the exact number.  anyway, i had issues with Future Cop: LAPD and some other games i can't recall right now.  the PC release of Future Cop kinda sucks anyway, i'm just better off emulating the PlayStation 1 version

 

Edited by RaTcHeT302 (see edit history)


That kill script can kill processes that are not Wine-related.

#!/bin/bash
sleep 30
for KILLPID in `ps ax | grep -i '.exe' | awk ' { print $1;}'`; do
  kill -9 $KILLPID;
done

killall -I -s KILL -v wineserver 

 

The problem is the regular expression in the grep command.  In '.exe' the dot is a regex wildcard matching any single character, so the pattern matches 'exe' with any prefix character.

For example, on my system it could kill also these processes:

 

 2000 ?        Sl     0:00 /usr/lib/gvfs/gvfsd-trash --spawner :1.6 /org/gtk/gvfs/exec_spaw/0
 7739 ?        Sl     0:00 /usr/lib/gvfs/gvfsd-http --spawner :1.6 /org/gtk/gvfs/exec_spaw/1

 

For a start, I would suggest editing the script in this manner (adding a backslash, i.e. grep -i '\.exe'):

#!/bin/bash
sleep 30
for KILLPID in `ps ax | grep -i '\.exe' | awk ' { print $1;}'`; do
  kill -9 $KILLPID;
done

killall -I -s KILL -v wineserver 

In this case, it will only match lines that contain a literal '.exe'.

But it would be much better to test whether a process is actually Wine-related.

 

Maybe changing the script to something like this would be enough:
 

#!/bin/bash
sleep 30

# kill the currently running wineserver (and the Wine processes attached to it)
wineserver -k
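And if you still want a per-process kill loop as a fallback, here's a rough sketch of what "test if a process is Wine-related" could look like.  It checks /proc/&lt;pid&gt;/maps for Wine's runtime before killing anything; the 'wine|ntdll\.dll\.so' pattern is my assumption about what a Wine process maps, so treat it as a starting point:

```shell
#!/bin/bash
# Return success if the given PID looks like a Wine process: the Wine
# runtime (e.g. ntdll.dll.so) should be mapped into its address space.
is_wine_process() {
    grep -qE 'wine|ntdll\.dll\.so' "/proc/$1/maps" 2>/dev/null
}

# Safer kill loop: only processes with ".exe" in the command line AND
# Wine libraries mapped get killed.
for pid in $(pgrep -f '\.exe'); do
    if is_wine_process "$pid"; then
        kill -9 "$pid"
    fi
done
```

Note that pgrep -f matches the full command line, so "C:\Games\whatever.exe" style invocations are caught even though the process name itself is truncated by the kernel.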

 

 

2 minutes ago, testman said:

Is there anything we can do to help Ross with the issues he is encountering in Linux?

post solutions, idk what else


Another question:

 

I've found a way to essentially have 2x2 SSAA for many old games on Windows.  If the game will run with dgVoodoo, I can force it at higher resolutions than it will normally run.  If I combine that with Nvidia's DSR (Dynamic Super Resolution), I can pump the resolution higher than what my monitor runs at, essentially downsampling the game and getting a cleaner image that way.

 

Is there any way to tell Linux or WINE to run at a resolution higher than the native one and then downsample to native?  In other words, if you had a 1080p monitor, could you tell a game to render at 4K, but then downsample the output back to 1080p?


I suggest expanding the test range to include the open-source nouveau driver for Nvidia hardware.  The GeForce 770 is Kepler architecture, so it should be supported by nouveau.  What makes nouveau special is the ability to run DX9 natively in hardware, with no translation to GL or VK.  You may have to refer to your distro manual to figure out how to switch the display driver to nouveau, and it's possible your distro's Mesa builds don't include Nine support, so you may have to compile the graphics stack yourself (recommended, and worth it :)).  One good thing about this option, specifically for you since you want to play pre-DX9 games, is boosting games from pre-9 to 9: take, say, a DX7/8 game, and with a DX1-8-to-9 conversion wrapper you get "native" DX9 for pre-DX9 games.

 

I am also looking for ways to get SS/MS (supersampling/multisampling) in WINE/Proton + Nine.

so far I only have the general nvidia operational phrase: "you get blur and you get blur and everyone gets more blur"

but I'm getting there slowly :x one hack at a time

 

I would also recommend trying the open-source Radeon stack: that is, get a Radeon card and see if anything changes (radeon is VASTLY superior to nouveau in open-source driver-stack terms, which could lead to improved results).  I know this is not exactly what people usually want to hear, since it means money going out, but if it fixes the rendering problems... it's worth it.

 

I know you mentioned you don't like post-process "AA", but... it can look good when set to its highest level.  I had very good results even at 2K (native) resolution with FXAA set to preset 39 in ReShade (you simply have to edit the shader code and uncomment it).

One more thing: a 770 can only go so far, performance is very limited.  Performing translations plus exotic post shaders or AA, not to mention supersampling and friends... can't be done, not at any normal framerate.  You could mitigate this by compiling a fully preemptive kernel and setting its tick rate to 1000 Hz, but even the lowest latencies cannot give you throughput... you just need more graphics resources :x  If you really prefer to stick with Nvidia, I highly recommend a 780 Ti, a good one with high clocks like 1200 MHz on the core (such as the EVGA Classified Kingpin); it's expensive, around $500 (the same price since 2014), but it's the best there is.

The problem is that Radeon cards fail badly in Windows and Nvidia cards fail badly in Linux.  So if you buy a Radeon card for Linux and Linux fails you, your Windows experience will be compromised by Radeon bugs (AMD is a small, low-budget company and cannot afford as much software development...), and if you buy a modern Nvidia card you won't be able to drive it with open-source drivers in Linux at all.  I cannot recommend using Nvidia's closed-source drivers, not really so much on principle, but because Nvidia has proven in the past to do dirty things; one such thing is proactively withholding the firmware images needed to drive their modern hardware with the open-source nouveau driver.  They are actively trying to force Linux users onto their closed drivers...

 

I also recommend testing some winecfg DLL overrides for various DLLs, and don't forget to install DirectX fully (download the DirectX End-User Runtime, June 2010, and install it)

 

try xrandr --output yourdisplay --scale 2x2

but I think it might look bad, I haven't tested this yet myself

 

Edited by Manoa (see edit history)


Hi @Ross Scott ! A lot has happened since you last had a look at this; there have been some exciting developments, and I'll try to summarize the highlights:

  • D9VK got merged into DXVK. This makes setup for everything much simpler, as you don't need to separately maintain two different layers. Additionally, since version 5.0 of Steam's Proton, DirectX 9 games run through DXVK by default. To override that, use "PROTON_USE_WINED3D=1 %command%" in Steam's advanced launch options.
  • Proton has Integer Scaling available out of the box. That's right! You can get some sharp ass pixels now. Just use "WINE_FULLSCREEN_INTEGER_SCALING=1 %command%" in Steam's advanced launch options. I've not personally tested this yet, but I'm fairly sure it should work in Lutris also (provided you use a Proton-ified version of Wine, at least).
  • vkBasalt has been released, adding SweetFX-esque post-processing effects, including CAS (contrast-adaptive sharpening) and AA. It only hooks Vulkan, but since almost everything now runs through DXVK/Vulkan rather than OpenGL, it more or less works in 90% of games. The CAS feature is particularly impressive, and can really help out with games that just have a blurry look, no matter what.
  • Lutris now has built-in frame-rate limiting support. I'm pretty sure you still need to install libstrangle for it to work, but now you just need to input the desired framerate into a box in your launcher's System options.
  • new-lg4ff is a new driver for Logitech Force Feedback wheels (G29 and older) that implements previously missing Force Feedback effects. It's still a work in progress, but it should help quite a bit for Wine games that previously had missing Force Feedback. You can also now use the Oversteer GUI to configure your wheel.
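For concreteness, the two Proton toggles above both go in the same launch-options box (or the environment-variables section in Lutris); the combination below is just an illustration, not a recommendation:

```
Integer scaling on, DXVK left enabled:
  WINE_FULLSCREEN_INTEGER_SCALING=1 %command%

Fall back from DXVK to WineD3D for a troublesome DX9 title:
  PROTON_USE_WINED3D=1 %command%
```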

Regarding super-sampling, there are a couple of ways to go about it, but no sure-fire way as far as I'm aware. First, as @Manoa mentioned, you can use xrandr to increase your resolution (you don't necessarily have to use integer scaling either). You should be able to automate the resolution switching in Lutris, using the pre-launch script and post-launch script options. The downsides are that it won't work for all games, and your whole desktop will change resolution, not just the game. Your second option is to use a custom resolution. This can be done either by making a custom EDID for your monitor, by using Custom Resolution Utility in Windows, or by using xrandr to make a new modeline. I'm not super well versed in this, so I don't want to give you a tutorial that could contain wrong information, but there are a few guides out there and additional info on the Arch Wiki if you're so inclined. This should work for all games, but has the drawback of being a little riskier, as inputting the wrong values could damage your monitor over time. Personally, I'd try using vkBasalt's CAS and seeing if I really need more than what that offers.
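To make the xrandr route concrete, here's a sketch of the commands involved (the output name "HDMI-1" and the 1080p panel are assumptions; check your own `xrandr` output for the real names):

```
# Render at twice the panel's 1920x1080 in each axis and let X scale it
# back down, approximating 2x2 supersampling of the whole desktop:
xrandr --output HDMI-1 --mode 1920x1080 --scale 2x2

# Revert to normal when you're done:
xrandr --output HDMI-1 --scale 1x1
```

This is the whole-desktop approach described above, so the same caveat applies: every window gets scaled, not just the game.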

 

Now, regarding nouveau and native DX9...

 

12 hours ago, Manoa said:

I suggest expanding the test range to include the open-source nouveau driver for Nvidia hardware.  The GeForce 770 is Kepler architecture, so it should be supported by nouveau.  What makes nouveau special is the ability to run DX9 natively in hardware, with no translation to GL or VK.  You may have to refer to your distro manual to figure out how to switch the display driver to nouveau, and it's possible your distro's Mesa builds don't include Nine support, so you may have to compile the graphics stack yourself (recommended, and worth it :)).  One good thing about this option, specifically for you since you want to play pre-DX9 games, is boosting games from pre-9 to 9: take, say, a DX7/8 game, and with a DX1-8-to-9 conversion wrapper you get "native" DX9 for pre-DX9 games.

 

While academically interesting, I highly disagree that native DX9 would be of any benefit. DXVK has been benchmarked as faster in many cases than native Windows + DX9, due to a combination of poor D3D9 performance in modern drivers and the better multi-threading afforded by DXVK. Using DXVK you can still use wrappers to bring DX7/8 up to DX9 and subsequently Vulkan (which then allows you to use vkBasalt). Combine this with the bad performance nouveau provides elsewhere, and I fail to see the case for actually daily-driving it. If you want open-source drivers, stick to AMD. Until Nvidia changes their ways, if you buy their cards, you have to use their drivers.


Hey, I just want to recap what I'm doing here, since I'm not sure how many of the suggestions apply to what I'm testing.  So here it is:

 

I'm going to be testing many 3D-accelerated older games from the late 90s to the late 2000s and, besides seeing which ones I can get running, also seeing if I can force "real" antialiasing on (either SSAA or mixed-mode MSAA + SSAA for alpha textures).  These are most likely going to range from DirectX 5 to DirectX 9.  If you're talking about enhancements beyond that, that's interesting, but it's beyond the scope of what I'm looking at, at this point.

 

Manoa: 

Several notes:

-I have an old AMD card that's slower than the 770, but I originally wrote AMD off, since my understanding was they got rid of the option to force AA in games at the driver level on Linux.  In other words, they removed the option to force AA (it had to be supported by the game itself), so I wasn't considering them for my test. Has that changed?

-I don't plan to be testing demanding games performance-wise.  My most recent game will probably be from 2009.  Is a Geforce 770 really not enough muscle to run 15-20 year old games on Linux?

-I'm not interested in shader-based AA methods for this test.  Many of these older games are resolution-locked, so AA becomes the only method of cleaning up the image.  Shader-based ones can help somewhat, but don't attack the worst elements of aliasing like shimmering and pixel-level flicker; they mostly just smooth easily defined edges.  The idea behind this test is to see if it's possible to get the games to look as good now as they could have in, say, 2008.

-I have to definitely disagree with your claim that SSAA can't be done on a 770 with a playable framerate, as I've literally done that on WINE with PlayOnLinux last year.  Remember, I'm talking about a lot of 15-20+ year old games; the polygon count just isn't that high.

-A sharpen filter could be of use if SSAA is working, but is blurry

 

I'm getting the feeling that you're trying to be helpful, but you may not be understanding the parameters of what I'm trying to test here.

 

 

cgalves:

-I'm not worried about integer scaling for this test, I figure that's something separate from game compatibility.  In other words, if it works, it can work on basically any game.

-Same for force feedback, I actually have that turned off on my own steering wheel anyway.

-I'm not going the custom-resolution route that can send rates the monitor may be incompatible with.  If the scaling can be done on the card / OS itself and it downsamples the image to be sent back at native resolution, great.  If it involves forcing a monitor to try to take a higher resolution, I'm not doing that.  In addition to not necessarily being good for the monitor, it's not a reliably reproducible solution; it's too dependent on the monitor itself.

-You're honestly confusing me with what I should be running now.  Say I have a DirectX 8 game and a DirectX 9 game that's not in any Linux game database.  What steps would you recommend I take for:

 

1. Getting each game to run

2. Attempting to force antialiasing on said games.

 

For step #1, if it's the sort of thing where I should try method A first, then if that doesn't work, try method B, that's fine, but I'm a little lost as to the suggested path now.  In the past, I was using what I mentioned in the first post in this thread.  Apparently that's outdated now, so I'm trying to get my bearings.

 

 

 

