  1. Ah, yeah, that's about how to force the game to give you all the resolution options, because it doesn't recognize modern graphics cards and defaults to settings meant for trash-tier processors from 2004. A lot of things about this game are broken on modern systems.
  2. The Sims 2 was clearly not programmed to work on high-resolution screens. I feel like there's some value I could change in a config file or some obscure program that could fix this problem. Would you guys know of any programs, or how to search for that value? I know I can just set the resolution lower, but then everything ends up blurry and I feel like I'm not wearing my glasses. This is TS2 on a 4K display.
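     (A commonly cited fix for the resolution question above, from community guides rather than anything verified here: The Sims 2 caps its resolution list in a plain-text config file, `Graphics Rules.sgr`, under the game's `TSData/Res/Config` folder. The exact path varies by edition, and the property names below are assumptions taken from those guides. Raising the caps to match the display is said to make the full list of resolutions appear in-game:)

     ```
     # In "Graphics Rules.sgr" — raise the default 1600x1200 caps
     # to the native resolution of the display, e.g. 4K:
     uintProp maxResWidth      3840
     uintProp maxResHeight     2160
     ```

     (Back up the original file first; the game may overwrite or ignore a malformed config.)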
  3. So real-time ray tracing doesn't really have anything to do with the new Turing architecture? I've noticed that my 1080 has a perfectly fine time doing path-traced shaders in Minecraft with the new SEUS PTGI shaders.
  4. On all currently supported cards? If so, that's fantastic.
  5. I've been playing Hitman™ 2. I could go on about the superb mechanics and sleek aesthetics, but I'll sum it all up with a quote from the Deus Ex: Invisible War video: "I love games that involve you hiding bodies from the authorities and watching people freak out if they see you." Also, in the Deus Ex: Human Revolution episode, Ross talks about how Eidos Montréal shouldn't be making Deus Ex and some other studio should. I think Hitman™'s studio, IOI, is the perfect studio to pick up the torch. Hitman™'s mechanics are similar to Deus Ex's from what I've seen, so it shouldn't be too hard for them to make a Deus Ex game with good mechanics. All they would need to do is hire good writers, because Hitman™'s story sucks. (Also, they could do a reboot of the series and call it "Deus Ex™" lol.) (Also also, I noticed that IOI slightly changed the HUD elements in Hitman™ 2 recently, and the latest Game Dungeon episode has me paranoid that they're gonna mess with the graphics in a bad way one of these days.)
  6. ...but only on the most advanced CPUs and graphics cards in the world. Official announcements: Intel: https://software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics nVidia: https://www.nvidia.com/en-us/geforce/news/gamescom-2019-game-ready-driver/ A better summation: http://tanalin.com/en/articles/lossless-scaling/#h-progress TL;DR: Intel is only supporting integer-ratio scaling on the integrated graphics of its 10th- and 11th-gen CPUs, and nVidia is only supporting it on graphics cards with the new Turing architecture. I can't believe that such a simple algorithm won't be implemented on my GTX 1080, and as such this feature will be unavailable to me for a long time, because I'm not buying another top-of-the-line graphics card for several years.
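     (To show what "a simple algorithm" means here: integer-ratio scaling is just nearest-neighbour pixel duplication by a whole-number factor, with no interpolation, which is why the blurriness of ordinary upscaling doesn't appear. A minimal sketch in Python, on a flat row-major pixel buffer; this is an illustration of the technique, not any vendor's driver code:)

     ```python
     def integer_scale(pixels, width, height, factor):
         """Upscale a row-major pixel buffer by an integer factor.

         Each source pixel becomes a factor x factor block of identical
         pixels (nearest-neighbour duplication, no interpolation), so
         edges stay perfectly sharp.
         """
         out = []
         for y in range(height * factor):
             for x in range(width * factor):
                 # Map each output pixel back to its single source pixel.
                 out.append(pixels[(y // factor) * width + (x // factor)])
         return out

     # A 2x2 image scaled 2x becomes a 4x4 image of 2x2 blocks:
     print(integer_scale([1, 2, 3, 4], 2, 2, 2))
     ```

     (Real drivers do this on the GPU per frame, but the per-pixel mapping is exactly this integer division, which is why it seems odd to gate it on new hardware.)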