We're finally getting integer-ratio scaling!


...but only on the most advanced CPUs and graphics cards in the world.

Official Announcements:
Intel: https://software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics
nVidia: https://www.nvidia.com/en-us/geforce/news/gamescom-2019-game-ready-driver/

A better summary: http://tanalin.com/en/articles/lossless-scaling/#h-progress

TL;DR: Intel is only supporting integer-ratio scaling on the integrated graphics of their 10th and 11th gen CPUs. nVidia is only supporting it on graphics cards with the new Turing architecture.

 

I can't believe that such a simple algorithm won't be implemented on my GTX 1080. That means this feature will be unavailable to me for a long time, because I'm not buying another top-of-the-line graphics card for several years.

Quote

Not available for older GPUs

The feature is not available in the Windows version of the nVidia GeForce driver for previous generations of nVidia GPUs. On March 25, 2019, an nVidia official stated that they have no plans to implement scaling with no blur in their graphics driver for Windows, while admitting that the algorithm itself is straightforward.

As a reason, they cite the impossibility of implementing this feature in the Windows driver in a way that wouldn't require "ongoing continuous support", given that the WDDM specifications for graphics drivers in Windows 10 change constantly.

That said, nVidia officials ignore questions about whether the same ongoing continuous support is required for the existing DSR technology, which likewise virtualizes resolution transparently to the operating system and full-screen applications.
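For context, the "straightforward" algorithm the quote refers to really is just nearest-neighbour pixel duplication at a whole-number ratio. A rough sketch of the idea in Python/NumPy (my own illustration, not Intel's or nVidia's actual driver code; the function name is made up and centering/letterboxing is left out for brevity):

```python
import numpy as np

def integer_scale(frame: np.ndarray, target_w: int, target_h: int) -> np.ndarray:
    """Upscale a frame by the largest whole-number factor that fits the target
    resolution, duplicating each source pixel into an N x N block.
    `frame` is an (H, W, channels) array; letterboxing is omitted here."""
    src_h, src_w = frame.shape[:2]
    # Largest integer ratio that fits both dimensions (e.g. 1080p -> 2160p gives 2).
    factor = min(target_w // src_w, target_h // src_h)
    if factor < 1:
        raise ValueError("target resolution is smaller than the source frame")
    # Nearest-neighbour at an integer ratio: every source pixel becomes a
    # factor x factor block, so no interpolation (and no blur) is involved.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 640x360 frame on a 1920x1080 display scales cleanly by 3x.
frame = np.zeros((360, 640, 3), dtype=np.uint8)
scaled = integer_scale(frame, 1920, 1080)
print(scaled.shape)  # (1080, 1920, 3)
```

That's the whole trick: because every source pixel maps to an exact N x N block, there is no interpolation and therefore no blur.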

 



From what I hear from people in AMD circles, they will be implementing it in the Radeon Settings software shortly after they finish stabilizing the 5700 drivers.

32 minutes ago, BTGBullseye said:

From what I hear from people in AMD circles, they will be implementing it in the Radeon Settings software shortly after they finish stabilizing the 5700 drivers.

 

On all currently supported cards? If so, that's fantastic.

7 hours ago, UsefullPig said:

 

On all currently supported cards? If so, that's fantastic.

Yes, and it only took a billion dollars and two decades' worth of progress just to essentially turn off a filter.

 

TECHNOLOGY

 

Wait wait wait. What do you mean I need to spend 2000 bucks on it? CAN'T YOU GUYS JUST... MOVE THE MATH OVER THERE.

 

Sometimes I wonder if I'm just insane. Maybe I should keep on wondering... I just feel stupid when I try to wrap my head around why something as simple as this took so fucking long.

 

I mean, how can the same company that releases millions of dollars' worth of computer technology struggle with what is, essentially, blurry pixel art?

 

I don't get it.

 

Oh right, money, that's why. BLAH. I mean, I'm kinda bitter that I need to pay more for what is essentially some really dumb math algorithm, and it just annoys me how essential this thing is. I mean, I REALLY COULD'VE USED THIS TEN YEARS AGO.

 

Even if they had a reasonable explanation for not doing this sooner, I'm not sure I'd honestly care that much about it at this point. It just seems really puzzling to me. Was this something they were struggling with? Somehow???

 

NVIDIA: We can make a bunch of rocks think, but making a buncha pixels not look like a blurry mess? no no no that's future technology guys, we can't do that - hey now check out this real time raytracing, YEAH

 

no sorry guys, blurry pixels are too much of a challenge for us, that'll clearly take us twenty years and an R&D budget of one trillion dollars

 

yep


21 hours ago, RaTcHeT302 said:

NVIDIA: We can make a bunch of rocks think, but making a buncha pixels not look like a blurry mess? no no no that's future technology guys, we can't do that - hey now check out this real time raytracing, YEAH

Even their raytracing is a myth... AMD cards can do it too, if it's not coded specifically to prevent AMD from running it. It's just like the PhysX stuff.

On 9/5/2019 at 7:55 AM, BTGBullseye said:

Even their raytracing is a myth... AMD cards can do it too, if it's not coded specifically to prevent AMD from running it. It's just like the PhysX stuff.

Ah, PhysX, a gimmick that was so short-lived. What the point was in implementing it and buying it from Ageia, no one knows...

OpenCL, on the other hand, barely gets used for physics on the GPU, even though Bullet Physics supports it.
I think BeamNG.Drive uses OpenCL; that's about the only game I know of, lol
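Just to show how little vendor lock-in GPU physics actually needs: here's a rough sketch of a one-line Euler integration step as an OpenCL kernel, driven from Python via pyopencl. The kernel and names are my own toy example, not Bullet's or BeamNG's actual code, and it assumes pyopencl plus a working OpenCL runtime are installed:

```python
import numpy as np
import pyopencl as cl

# Vendor-agnostic "physics" kernel: one Euler step for a batch of particles.
KERNEL = """
__kernel void euler_step(__global float *pos, __global float *vel,
                         const float dt, const float gravity)
{
    int i = get_global_id(0);
    vel[i] += gravity * dt;   // accelerate
    pos[i] += vel[i] * dt;    // move
}
"""

ctx = cl.create_some_context()      # picks whatever OpenCL device is available
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL).build()

n = 100_000
pos = np.zeros(n, dtype=np.float32)
vel = np.zeros(n, dtype=np.float32)

mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)

# Run one 16 ms step and read the positions back.
prog.euler_step(queue, (n,), None, pos_buf, vel_buf,
                np.float32(0.016), np.float32(-9.81))
cl.enqueue_copy(queue, pos, pos_buf)
print(pos[:5])
```

The same kernel runs unmodified on AMD, nVidia, and Intel devices, which was always the appeal of OpenCL over PhysX.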

On 9/4/2019 at 1:55 PM, BTGBullseye said:

Even their raytracing is a myth... AMD cards can do it too, if it's not coded specifically to prevent AMD from running it. It's just like the PhysX stuff.

So real-time ray tracing doesn't really have anything to do with the new Turing architecture? I've noticed that my 1080 has a perfectly fine time doing path-traced shaders in Minecraft with the new SEUS PTGI shaders.

25 minutes ago, UsefullPig said:

So real-time ray tracing doesn't really have anything to do with the new Turing architecture? I've noticed that my 1080 has a perfectly fine time doing path-traced shaders in Minecraft with the new SEUS PTGI shaders.

Raytracing is just math. It's often used for AI too. There's no technical reason why it should only work on a specific graphics card.

It's literally 1980s math. This is mostly just NVIDIA marketing bullshit at its finest. Pathtracing is mostly just a different algorithm.
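To be fair to the "it's just math" point, the core of any ray tracer is textbook geometry: intersect a ray with a surface, shade, repeat. A toy ray-sphere intersection test, my own sketch, nothing vendor-specific about it:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Classic textbook test: solve the quadratic |o + t*d - c|^2 = r^2 and
    return the nearest positive hit distance, or None for a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2.0 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4.0*a*c
    if disc < 0.0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0*a)  # nearest intersection
    return t if t > 0.0 else None

# A ray from the origin along +z hits a unit sphere 5 units away at t = 4.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

The math has been around since the 1980s; what changed is how many of these tests per frame the hardware can afford.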

 

Really, there's no actual point to real-time ray tracing when we could be using a faster, simpler algorithm instead.

 

It's all there just to get the cash out of suckers who are willing to throw money at what is essentially VERY FANCY MATH.

It's neat but... we've been able to fake it for decades anyway. I think NVIDIA tech demos tend to look overly exaggerated for the most part anyway. Everything is not so goddamn reflective in real life. Good god, when was the last time a BRICK was REFLECTIVE?

 

OUR WORLD IS NOT MADE OUT OF CHROME DAMMIT.

 

I'm willing to say that TrackMania 2 looks more lifelike than... well, anything raytracing could ever do. I mean, I'm just joking, but man, if anything, I often think to myself, "hey, this game looks BETTER than real life does".

 

It just goes into uncanny territory too often for me, and it's very easy for me to tell what's a computer-generated image and what's real.

 

Nothing about this says "real life" to me. Is real life grey, with a constant bloom on everything? Sometimes, yes.

Why is everything so damn consistent? It just looks like a videogame.

 

[embedded raytracing screenshot]

 

I'm not sold. Something in my brain tells me there's something strange going on here.

I don't know, these images make me feel sick for some reason. Something about them feels horribly wrong to me.

 

Besides, real-life graphics are ugly; they suck. Why do people aim for photorealism so badly? I don't even like how daytime looks in real life once in a while. If it were up to me, it would be sunsets all around, all the time. Do we really need to simulate a drab, depressing, boring-looking grey foggy-ass day? Ehhh... no. We can do BETTER, we can NOT copy the ugly stuff from real life.

 

I like videogame graphics more. It might be fake, but... it looks way more pleasant, far prettier than pure raytracing will ever look to me. I mean, what, is every game that has raytracing as its main focus going to be a mostly grey, bland-looking, overly reflective mess? I'll pass; NVIDIA can keep their graphics cards, I don't care.

 

I wish they could focus on achieving better draw distances instead. NOW THAT WOULD BE AWESOME, YEAH.


10 hours ago, UsefullPig said:

So real-time ray tracing doesn't really have anything to do with the new Turing architecture? I've noticed that my 1080 has a perfectly fine time doing path-traced shaders in Minecraft with the new SEUS PTGI shaders.

A 38-minute video of it in action here...

 

