  1. Ah, yes: the limited anti-aliasing options that come with deferred rendering -- love the increase in the number of dynamic light sources it enables, but the aliasing is indeed an eyesore, and the blurry post-process AA options that remain are IMHO almost universally *worse* than the aliasing itself.
  2. Had a Logitech... I think it was an MX1100: wireless, a huge heavy chunk, but as if moulded after my hand, and it ran for months off two AA batteries. Lost it when I accidentally dropped it into a cup of tea (yes, really). Replaced it with an MX Master, since the 1100 was out of production, and that thing is too small and badly angled in every way possible for my hand, and its built-in rechargeable battery barely lasted a day when it was new; at this point in its life, the thing has become a de facto wired device.
  3. Neuschwanstein castle is among the "featured" bookmarks in Google Earth VR, by the way... :7
  4. Although "move as soon as possible, unless the landlord completely guts and renovates the entire building" seems like the inevitable thing to do, I thought I'd mention that there are combination washer/dryer units -- got one myself, after having once too often forgotten (for several days) to empty the washing machine. It takes longer, tends to cost more than two separate machines, and has limited capacity, but it should suffice for a household of two, and the one-step procedure is rather convenient. EDIT: Quite terrified about seeing the washer that close to the bathtub, by the way.
  5. Not the animation side of things as such, but a good watch nonetheless (EDIT: pertinent to the rendering end): 8ecfZF-IuSI
  6. The best I can offer is to keep trying to demo both at some length. :7 Demoing things like Valve's "The Lab" and the "Oculus Home" environment may not be entirely representative, because their respective art direction is balanced around overall scene lighting levels that happen to be optimal for minimising the apparent artefacts caused by the fresnel lenses: a pleasant, slightly-above-medium grey, low on saturation and contrast -- think watercolour, but a tiny bit darker. :7 Go, instead, into a high-contrast environment -- a dark place dotted with bright accent lights, such as many cockpits in Elite Dangerous while out in dark space -- and the "god rays" may jump out at you and smear themselves intrusively all over your vision, millimetres from your eyes.
Overall, the Rift (Consumer Version 1) chooses more pixels per degree of your field of view (EDIT3: ...for a significantly sharper image), at the natural cost of a reduction in the width of that field. This is mitigated a bit by a secondary sacrifice in binocular overlap: you'll still see almost as far to the sides as in a Vive, but the last 5-10 degrees out to either side can only be seen by the eye on that side, as if you had the largest nose in the world. By all accounts most people never seem to even notice this, but to me, personally, it is very annoying. The Rift is also a more convenient experience: it is lighter, has adjustable headphones mounted on its rigid frame -- which makes putting it on much like putting on a baseball cap, without any fiddling with straps and cables and separate headphones whatsoever -- and the Home environment starts right up, automatically, just from donning it. Its lenses arguably offer a clearer and more consistent image, where text out to the sides is still pretty damn legible, whereas with the Vive, sharpness falls off and you can get a bit of "double vision" between the outermost fresnel lens rings. A few of us, however, experience a bit of distortion in the Rift when looking around, which can cause some "brain strain".
The Rift will get its own motion controllers, like the Vive's wands, sometime in the autumn. They are designed more to be "part of your hand", so to speak, taking shape from a hand in resting position, so that you can almost forget you are holding anything at all. They also have analogue buttons with capacitive touch sensing for groups of fingers, which allows them to infer different hand gestures; e.g., squeeze the buttons under all four fingers and take your thumb off all of the spots where it may rest, and that is interpreted as a thumbs up, or down, depending on where you are pointing the device.
The Vive is brighter, but possibly may not conform to the sRGB standard -- this appears to result in significantly higher overall contrast ratios in most content, compared to the Rift. On one hand, this kind of adds a bit of a "realistic" feel to the imagery; on the other, it causes a lot of loss in visual detail, with things in shadow crushing toward black and bright things blowing out. If we had a good HDR and wide-colour-gamut mastering standard, maybe an adaptation of BT.2020, together with colour profiling for each HMD, things could be reined in and those extra lumens exploited... (EDIT: Such a standard cannot come soon enough, as far as I am concerned; I want today's content to look as right as possible on tomorrow's hardware, regardless of manufacturer, or API, or capability.) (EDIT2: ...and I also have some strong, disagreeable opinions on the act of tying drivers and hardware/metaverse framework APIs to vendor frontends.)
Despite its numerous shortcomings (...and, by most accounts untypically, I experience more fresnel glare in it than in the Rift), so far I find myself constantly turning to my Vive first -- for standing and sitting experiences both -- but that's me; people have wildly differing experiences with these devices. How well they fit the shape of each individual's face, alone, and how this affects not only comfort but, more importantly, the optical path, tends to be rather important. :7
"Why is this the first guy to think of this? It's basic human anatomy..." Not the first to think of it, I am pretty sure, but certainly one of the few to try it, in the face of rather condescending, self-proclaimed know-it-all detractors. :7 The method has a few shortcomings -- most notably, it cannot overcome the lack of vestibular feedback from velocity changes -- but it does give you a helpful amount of sensory input from the physical treading, and it is the method I favour at this time (EDIT4: ...as a complement to room-scale free movement, mind you -- not a replacement); it just needs tracking of the user's feet, and perfect walk-cycle phase matching. :7
  7. VorpX does offer the 3D-projection-in-cinema mode, for the games it can deal with. Unfortunately, at some version it stopped supporting older revisions of the Oculus runtime, and it comes with a fantastically clever auto-update-online-at-startup system, which rules out rolling back to an older version that can be used with a DK1.
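The sRGB point in post 6 can be made concrete. The standard sRGB transfer function (per IEC 61966-2-1) is piecewise -- a linear toe plus a 2.4-exponent power segment -- so a display that applies a different curve to sRGB-encoded content shifts midtones and shadows, which is one plausible mechanism for the shadow-crushing and blow-out described above. A minimal sketch in Python (function names are my own):

```python
def srgb_to_linear(s: float) -> float:
    """Decode an sRGB-encoded value in [0, 1] to linear light (sRGB EOTF)."""
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l: float) -> float:
    """Encode linear light in [0, 1] back to an sRGB value (inverse EOTF)."""
    if l <= 0.0031308:
        return l * 12.92
    return 1.055 * (l ** (1 / 2.4)) - 0.055

# An sRGB mid-grey of 0.5 is only about 21% linear light; a panel that
# decodes with a steeper curve pushes such midtones further toward black.
print(round(srgb_to_linear(0.5), 3))   # ~0.214
```

Per-HMD colour profiling, as wished for above, would amount to characterising each panel's actual transfer curve and primaries so content mastered against a standard like this (or a BT.2020-derived HDR one) could be remapped correctly.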