Birthday thread: Software

Interestingly, no widescreen hack for Diablo 1 here, but there is a fairly new mod:

http://www.rockpapershotgun.com/2014/09/24/diablo-1-hd-resolution-mod/

 

Yeah, but I honestly think that Sizer, and sometimes DxWnd, do a great job of upscaling to higher resolutions.

 

Also, I think Ross wants one solution for this problem, and hacking the engine of every game he plays doesn't seem reasonable (to me).


Interestingly, no widescreen hack for Diablo 1 here, but there is a fairly new mod:

http://www.rockpapershotgun.com/2014/09/24/diablo-1-hd-resolution-mod/

 

Yeah, but I honestly think that Sizer, and sometimes DxWnd, do a great job of upscaling to higher resolutions.

 

Also, I think Ross wants one solution for this problem, and hacking the engine of every game he plays doesn't seem reasonable (to me).

 

Neither of those programs can make 2D games fullscreen properly, and DxWnd can cause glitches, as we've shown :/

 

It seems Nvidia and AMD aren't adding scaling options beyond bilinear any time soon, so this is the best we've got.


Neither of those programs can make 2D games fullscreen properly though, as we've shown :/

 

Unfortunately. At least Sizer can probably make Pharaoh's UI work at higher resolutions.


Neither of those programs can make 2D games fullscreen properly though, as we've shown :/

 

Unfortunately. At least Sizer can probably make Pharaoh's UI work at higher resolutions.

 

Indeed you can, since the game actually has the courtesy of including a windowed mode; no need for DxWnd for that one :P

 

 

I was curious to see how well Borderless Gaming could remove stubborn window title bars and borders compared to my shitty script, despite the program being pretty limited in what it can do.

 

https://www.youtube.com/watch?v=l6PoDwPYnEE
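
As an aside, the core trick these border-removal tools (and my script) rely on is just the Win32 API. Here's a rough, hypothetical sketch in Python with ctypes; the window title is a placeholder, and this is not the actual script from my tests:

    import ctypes

    user32 = ctypes.windll.user32

    GWL_STYLE = -16
    WS_CAPTION = 0x00C00000      # title bar
    WS_THICKFRAME = 0x00040000   # resizable border
    SWP_NOZORDER = 0x0004
    SWP_FRAMECHANGED = 0x0020

    hwnd = user32.FindWindowW(None, "Diablo")  # placeholder window title
    style = user32.GetWindowLongW(hwnd, GWL_STYLE)
    user32.SetWindowLongW(hwnd, GWL_STYLE, style & ~(WS_CAPTION | WS_THICKFRAME))

    # Stretch the now-borderless window over the whole desktop.
    w, h = user32.GetSystemMetrics(0), user32.GetSystemMetrics(1)
    user32.SetWindowPos(hwnd, 0, 0, 0, w, h, SWP_NOZORDER | SWP_FRAMECHANGED)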

 

- BG does a great job at doing what the program's name says, much better than my shitty script, but the price apparently is mouse input delay.

- Borderless Gaming adds some mouse input lag to any game you run through it. Not horrible, but you'll notice it.

- DxWnd plays nice with BG, but still causes menu glitches in Diablo.

(Also, interestingly, this menu issue has annoyed the DxWnd dev for quite some time: http://sourceforge.net/p/dxwnd/discussion/general/thread/37dbd78d/?limit=50 )

- DxWnd combined with BG correctly hides the Windows background, unlike with Sizer.

- BG can toggle hiding the taskbar, which is nice, and the program won't let you close it until you unhide the taskbar. Smart; this dev thinks ahead :)

- The combo successfully fakes "fullscreen" without worrying about Nvidia forcing bilinear filtering on you, which happens if you run the game in actual fullscreen.

- Test setup: DxWnd set to a 1400x1050 window, centered on the desktop, with the "hide windows background" option toggled on.

 

You can get Borderless Gaming for free on the creator's GitHub page; no need to buy it through Steam.

https://github.com/Codeusa/Borderless-Gaming/releases

 

edit:

You can minimize some of that input delay by disabling Windows Aero and using a basic theme. The fancy Aero theme makes it worse.

https://helpx.adobe.com/x-productkb/global/disable-windows-aero-windows-7.html

 

Getting Windows to use a basic theme in Win 8/10 is more annoying, but there be tutorials on the googles.

 

 

edit 2:

 

Found another handy program on the Steam forums that you could use with DxWnd.

http://steamcommunity.com/groups/WindowedBorderlessGaming/discussions/0/846966336037436101/

 

Guide for how to use it:

https://steamcommunity.com/discussions/forum/1/864980278005937412/

 

Obviously, it's not going to work on EVERY game we throw at it, just like everything else we've tested here.

edit 3: download link was broken, here is a direct link that works.

https://steamcommunity.com/linkfilter/?url=http://westechsolutions.net/sites/WindowedBorderlessGaming/downloading/WindowedBorderlessGaming_2.1.0.1.zip

If you put the DxWnd folder inside the program's folder (the folder must be named "dxwnd"), it'll detect it and show it in the options menu, which is handy.

 

This one I couldn't get to work with Diablo, but it might work for other games.



Ross, please be careful with nearest-neighbor filtering. If the scale factor is not an integer (2x, 3x, 4x, ... n times), you will get distortion: some rows or columns will appear bigger than others.

 

For example, here we have a 256 by 256 pixel image with a test pattern:

[image: eaNQSac.png]

 

When scaled to 1080p (a factor of 4.21875) with bilinear filtering, we get a smooth but blurry image, as you noted in your video:

 

[image: HL1EcxR.png]

 

 

But when scaled with no filter, even though we get a much crisper image, you may notice that some rows and columns are bigger than others:

 

[image: hhD1zXm.png]

 

Some of the example footage in your video scaled with this method shows this distortion.
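
To put a number on that distortion (my own quick check, not from the video): map each of the 1080 output rows back to a 256-row source with nearest neighbor and count how many times each source row gets drawn:

    from collections import Counter

    # Nearest-neighbor row mapping for a 256-row source stretched to 1080 rows.
    counts = Counter(int(y * 256 / 1080) for y in range(1080))
    print(sorted(set(counts.values())))  # [4, 5] - some rows end up 4px tall, others 5px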

 

You can avoid this problem by always scaling by integer factors. If your target screen is 1920x1080 and your source is 256x256, you must resize to 1024x1024 and pad/letterbox everything else. If you have a 320x240 source, resize to 1280x960. The math is simple: take the smallest dimension of the target resolution (1080), divide by the corresponding source dimension (240), then floor the result. You get 4. Now multiply the source resolution by that factor and you get the resolution you really want.
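
Here's that rule as a small Python helper (a sketch of the math above; the function name is mine):

    def integer_prescale(src_w, src_h, dst_w, dst_h):
        """Largest integer-scaled size that still fits inside the target."""
        factor = min(dst_w // src_w, dst_h // src_h)  # floor division does the flooring
        if factor < 1:
            raise ValueError("source is larger than the target, integer upscale impossible")
        return src_w * factor, src_h * factor

    print(integer_prescale(256, 256, 1920, 1080))  # (1024, 1024) - 4x
    print(integer_prescale(320, 240, 1920, 1080))  # (1280, 960)  - 4x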

 

Here is what you get:

 

[image: rpnyrTs.png]

 

 

What if you want to fill the entire screen? Use bilinear filtering! Since you have already pre-scaled with nearest neighbor, the bilinear filter will have much less of a blurring effect on the image:

 

[image: 0jrKM1k.png]

 

We went from 256x256 -(nearest neighbor)-> 1024x1024 -(bilinear)-> 1080x1080.
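
If you want to try that chain offline, it's just two resize calls in any image library. A sketch with Pillow (my choice of library; filenames made up):

    from PIL import Image

    src = Image.open("source_256.png")               # 256x256 source
    pre = src.resize((1024, 1024), Image.NEAREST)    # 4x integer pre-scale, stays crisp
    out = pre.resize((1080, 1080), Image.BILINEAR)   # final stretch, only mildly blurred
    out.save("scaled_1080.png")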

 

If you use MAME, there is an option called "pre-scale" that does just that: it scales by an integer factor with nearest neighbor and then applies bilinear filtering to fill the screen.

Unfortunately, I have no idea how you could implement this behavior in Windows...

If you use MAME, there is an option called "pre-scale" that does just that: it scales by an integer factor with nearest neighbor and then applies bilinear filtering to fill the screen.

Unfortunately, I have no idea how you could implement this behavior in Windows...

I've used MAME before, and that's exactly what I've been looking for a way to do with non-emulated games, but I can't seem to find anything. I also don't think Ross has used MAME that much.



I know this isn't exactly what you want to hear, but I'd like to elaborate on why AA in emulators is such a headache:

 

I'm assuming you are talking about MSAA, and not FXAA or other screen-space AA techniques which are usually a little easier to implement.

 

Computer speed is (usually) not the issue. It doesn't matter if current computers are 10x faster, because applying AA in a low-level emulator (it's a lot easier with high-level emulators, but they aren't as accurate) usually isn't a problem of not having enough cycles; it's a problem of finding the information.

 

To make this a little easier to understand, let me explain exactly what a low-level emulator does. A low-level emulator takes in the original binary data that represents the game's code (the original platform's equivalent of an .exe) and runs it through a virtual CPU. However, it has no in-depth understanding of the meaning of the data. Just like the CPU in our computers, the virtual CPU in an emulator isn't really smart: it doesn't understand why it does something, it's just really fast at doing it.

 

MSAA works by finding the edges of the 3D shapes and sampling them at a higher rate, before blurring them slightly, which makes edges look smoother. The issue is that this requires the code to have direct access to, and an understanding of, where in RAM the 3D shapes are and the format they are stored in. A CPU doesn't think to itself "let's do transformations on 3D shapes"; it thinks "let's do some abstract maths on abstract data representing a certain type of number".

 

If somebody passed you a spreadsheet of binary data and told you to perform MSAA on it, you'd have no idea where to start. The emulator has the same issue. Through pretty complex code analysis you could determine what data is of a certain type (floating-point numbers, usually, for 3D data), but determining which data represents 3D shapes, and how to anti-alias it, just isn't really feasible. The only true way to do it would be to have a human reverse-engineer the game beforehand and write game-specific hacks to add MSAA.

I know this isn't exactly what you want to hear, but I'd like to elaborate on why AA in emulators is such a headache:

...

The only true way to do it would be to have a human reverse-engineer the game beforehand and write game-specific hacks to add MSAA.

Well, I can also elaborate a bit on that. Actually the problem is a bit more complex - a virtual CPU works _exactly_ like a normal one, so if a normal one can perform an MSAA/FXAA pass, then a virtual one can do it as well, no problem. That's the whole point. So if we are talking about software-rendered emulation here, we're in luck: anti-aliasing in an emulator (a "hard-mode" emulator, such as VirtualBox, VMware and the like) should work out of the box! Except that no anti-aliasing works in software rendering (no hardware acceleration).

 

Sooo, we are moving to the root problem: hardware acceleration. How does one emulate hardware acceleration? That's right - by not emulating it! The only way to hardware-accelerate an emulated program is to pass all the GPU commands and data back to the host OS (the one running the emulation software, such as VirtualBox, etc.) and perform them on the host OS, using host drivers on the host's real GPU. And it works fine, except for DirectX games, because no one knows exactly what DirectX actually does under the hood. So all DirectX games run through some sort of API/driver emulation in the guest system which tries to behave like DirectX but usually fails miserably (for instance, VirtualBox's hardware emulation actually translates all DirectX calls into OpenGL using code from the Wine project), due to numerous undefined behaviours, uses of undocumented or poorly documented DirectX features, driver gotchas and stuff like that. And driver-mode MSAA/FXAA (and we are talking about OLD games here, way before the shaders era) is just black magic that generally does not work even natively, so most emulators just ignore it, because it's such a pain in the ass.

 

That said, it might be possible to enforce anti-aliasing for a game within a guest system by the same means as you enforce anti-aliasing for a game within the host system: via drivers. To the host driver it's the same thing as a native application (it doesn't really care what's coming in or how it got there), so it will enforce anti-aliasing for it anyway. MXAA is a fat chance - it's a bit too complex to enforce - but a fullscreen one is just fine: it's just rendering at n times the resolution and then scaling the image down. But you may want to rename the emulator's exe files to something else, or somehow tell your drivers not to apply whatever stupid hacks and quirks they might be using for a given application (usually detected by exe name), such as disabling anti-aliasing.
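
For what it's worth, the "render at n times the resolution, then scale down" part is simple to show in isolation. A toy numpy version that averages 2x2 blocks of a supersampled frame (my illustration, not how a driver actually implements it):

    import numpy as np

    hi = np.random.rand(2160, 3840, 3)  # stand-in for a frame rendered at 2x 1080p
    lo = hi.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))  # average each 2x2 block
    print(lo.shape)  # (1080, 1920, 3)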

 

I generally prefer playing old games on Linux under Wine. I find it much easier and more convenient than running VirtualBox with Windows XP inside Windows whatever-else. I also think it should be possible to enlarge a Wine window to match your resolution with whatever filter you want (it sounds like a more convenient "Screen Zoom" application). Never tried it, though. But if you are interested in running Wine, I may try to find or patch it.

 

And BTW, if we are talking about Windows' "fullscreen mode" - it actually changes your resolution to a lower one, which means the filtering isn't your video card's job anymore but your monitor's. You should check your monitor's menu/drivers for a filtering option. Mine doesn't have such a feature, but I know some monitors do.

Ross, please be careful with nearest-neighbor filtering. If the scale factor is not an integer (2x, 3x, 4x, ... n times), you will get distortion: some rows or columns will appear bigger than others.
I'm fully aware of this; the thing is, I'm thinking down the road. Say you have a 4K monitor where the source image doesn't divide evenly. The off-integer scaling is going to look closer and closer to the original the higher your resolution is. With bilinear filtering, more resolution makes it WORSE. Regardless, if I'm just playing the game for myself, I would rather have the slightly less accurate but sharper image than the bilinear blur everywhere. I would still keep it accurate for official videos.

 

And yes, I did leave the distortion in the video to show the difference more clearly. For Game Dungeon I upscale past the display resolution with nearest neighbor, then use bicubic or Lanczos to bring it to 1080. So if the source is 320x200, I scale it to 1920x1200, then back to 1728x1080.
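
In Pillow terms (my guess at a library; this is not necessarily what Ross's actual video pipeline uses), that chain looks like:

    from PIL import Image

    src = Image.open("capture_320x200.png")
    big = src.resize((1920, 1200), Image.NEAREST)  # 6x integer overshoot past 1080p
    out = big.resize((1728, 1080), Image.LANCZOS)  # Lanczos (or Image.BICUBIC) down to 1080
    out.save("final_1728x1080.png")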

 

Well, I can also elaborate a bit on that. Actually the problem is a bit more complex...

...

You should check your monitor's menu/drivers for a filtering option. Mine doesn't have such a feature, but I know some monitors do.

I'm not expecting miracles, but we have NOTHING right now. Here are some thoughts:

 

-I was actually thinking more of dumb supersampling rather than MSAA. MSAA doesn't do anything for alpha textures anyway. It may not make any difference at all (I don't know), but I would think SSAA would be easier to implement, since it doesn't have to hunt for edges; it just renders the whole scene at a higher resolution, then scales it back down.

-You mentioned how VirtualBox translates DirectX calls into OpenGL anyway and that hardware acceleration isn't emulated, but is instead passed on to the host system. Fine! I haven't used VirtualBox in a long time, so maybe this does work, but in VMware, if I force AA through the hardware control panel on the host system, it STILL doesn't do anything. Is this not the case with VirtualBox? If the virtual machine would just accept the overrides from the host system, this wouldn't be an issue for me at all.

-While it's my last choice, being able to apply FXAA or SMAA to the virtual machine would at least be another option beyond what we have now. Again, I've tried forcing this to no avail. If it's just being applied broadly, the content being pre-shaders shouldn't matter, since these methods don't use any geometry to apply the "AA" anyway; they just look at colors and brightness (see the sketch below).
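
Sketch of that last point: a toy FXAA/SMAA-style detector needs only per-pixel brightness and local contrast, with no geometry at all (my own illustration; real FXAA does considerably more):

    import numpy as np

    frame = np.random.rand(1080, 1920, 3)           # stand-in for an RGB frame
    luma = frame @ np.array([0.299, 0.587, 0.114])  # brightness only, no geometry
    edges = np.abs(np.diff(luma, axis=1)) > 0.1     # strong horizontal contrast jumps
    # A real FXAA pass would now blend pixels along these detected edges.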

MXAA

There is no such thing! (You might want to fix the typo, and keep an eye out for it in the future; a lot of sites get very hostile over things like this.)


I know this isn't exactly what you want to hear, but I'd like to elaborate on why AA in emulators is such a headache:

...

The only true way to do it would be to have a human reverse-engineer the game beforehand and write game-specific hacks to add MSAA.

Well, I can also elaborate a bit on that. Actually the problem is a bit more complex - a virtual CPU works _exactly_ like a normal one, so if a normal one can perform an MSAA/FXAA pass, then a virtual one can do it as well, no problem. That's the whole point. So if we are talking about software-rendered emulation here, we're in luck: anti-aliasing in an emulator (a "hard-mode" emulator, such as VirtualBox, VMware and the like) should work out of the box! Except that no anti-aliasing works in software rendering (no hardware acceleration).

 

I don't think you quite understand the issue. The issue isn't about performing anti-aliasing - all Turing machines can do that - it's about figuring out WHAT data to apply AA to. The virtual CPU is just a dumb CPU that speeds through instructions; it doesn't understand any high-level concepts about the stuff it's doing. The virtual CPU is no help whatsoever in applying anti-aliasing, because it doesn't understand where in memory the frame buffer is or where the 3D data is stored, and it can't find the depth buffer either (which may not even exist, since older games tended to use the painter's algorithm or other techniques). It just performs calculations; it has no understanding of what it is actually doing.

 

Ultimately, adding AA to an emulated game (again, talking about low-level emulation) is as hard as adding AA to a native game - you need to add it to the game's code itself. That requires per-game hacks.

 

Typically, it's *easier* to add AA if the emulator uses hardware acceleration. In that case, your emulation code is already capable of deconstructing the 3D data the game uses, as it needs to construct the various buffers to send to the GPU. In software emulation, you can very easily get away with a dumb pixel write from the emulated device's memory-mapped regions and let the game's code perform all the 3D calculations in software. However, because these calculations happen within the game itself, your emulator has no understanding of what the data is, and it cannot easily apply AA in that case.
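
A toy picture of that "dumb pixel write" display path (purely illustrative; the addresses follow the classic VGA mode 13h layout):

    import numpy as np

    guest_ram = np.zeros(1 << 20, dtype=np.uint8)  # pretend 1 MB of guest memory
    FB_BASE, W, H = 0xA0000, 320, 200              # VGA mode 13h framebuffer region

    # The emulator just copies whatever the game wrote there out to the screen;
    # it never learns what the bytes mean.
    frame = guest_ram[FB_BASE:FB_BASE + W * H].reshape(H, W)  # palette indices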


So far Sizer doesn't help me much. It completely glitched out on a DX9 game I tried (in windowed mode), and with VMware, it forces the host system to change its resolution (rather than zooming in on the image).


Hey Ross, check out Lossless Scaling on Steam. It's what you wanted all along! It does lossless integer scaling and puts the game in fullscreen!

 

Another user and I posted reviews covering the app's compatibility, and it works with almost all games so far!

 

Here is a video of me running DOSBox with this app:


https://www.youtube.com/watch?v=Z-TSiOEvoSY

 

And here is a video of me running ScummVM games with this app:


https://www.youtube.com/watch?v=hM5DN2-wa2I

 

I also posted this in the January 2019 Videochat Official Questions Thread so you'd see it.



I found a video which may be one step towards Ross's dream software.

If you watch the video you'll see what I mean.

