The Dark Mod Forums

woah

Member
  • Posts: 389
  • Joined
  • Days Won: 1

woah last won the day on July 18 2010

woah had the most liked content!

Reputation

76 Excellent

1 Follower

Recent Profile Visitors

2964 profile views
  1. Heck, even 50% would be an incredible boost--that's like getting a new generation of GPUs. The numbers I've typically seen are between 30% and 200%, but they will likely improve over time (a rough sketch of where figures in that range can come from is below, after the post list). However, I don't think it's going to be anywhere close to the >1000% that Abrash was originally predicting. At the last Oculus Facebook Connect, Carmack reined in those expectations. More recently he said this:
  2. Varifocal adjustments, i.e. basically allowing each eye to correctly accommodate in a way that's matched to vergence (a sketch of that geometry is below, after the post list). This should make VR much more comfortable, more immersive, and less limiting (especially when it comes to near-field interactions). Right now the focus depth is fixed and it sucks. EDIT: Foveated rendering may be a thing as well, but it seems to have been way over-hyped in terms of realistic performance gains.
  3. Curious things happening on the PCVR front. A new headset, codenamed "Deckard", was found in the SteamVR files and Ars Technica has confirmed that it's real. There are also other things suggesting a split rendering system (mixing parts of rendering between a PC and processing within the headset), "inside out" tracking that also works with lighthouse tracking, wireless, and possibly standalone functionality. Seems awfully similar to Valve's patents for split rendering, a new kind of tracking sensor, and headstraps with integrated wireless and processing. The most interesting thing to me, though, is that based on public filings from a 2020 lawsuit, it's been revealed that in 2018 Valve invested $10m in the company ImagineOptix. IO is an optics manufacturer that uses photolithographic processes to pattern thin-film (a few micrometers thick) liquid crystal optics that can be layered and electrically controlled. IO entered a contract with Valve which included a Master Supply Agreement and the construction of a factory to mass produce these optics for Valve. The factory was finished in mid 2020, and in early 2021 Valve filed ~10 patents describing the application of the technology to a variety of things important for VR. From Valve's patents, they want to use the technology not just for variable focus but also optical/eye correction in the headset (i.e. no glasses/contacts), optical blur and dynamic angular resolution adjustment in different fields of the FOV, pancake lens-like features, and many other things (a toy sketch of how a stack of switchable lens elements could be driven is below, after the post list). The varifocal aspect is similar to Facebook's "Half Dome 3" prototype (which they showed in 2019 through a short video), but apparently Valve was already making moves to mass produce similar liquid crystal lenses a year prior. They also recently filed a patent for a form of eyetracking I haven't seen used much, which would be necessary for most of this stuff to work at all. Of course patents and leaks don't necessarily mean anything about actual products: Valve could cancel it, accurate eyetracking is hard, and if they actually release a VR headset that's this advanced it would fundamentally change the VR experience--it seems too good to be true. A solid state lens that can be electrically controlled to perform so many dynamic optical functions is like something out of science fiction. On the other hand, they built a factory to mass produce these lenses and Ars Technica says speculations about these lenses are "on the right track", so it's somewhat tantalizing.
  4. It definitely seems like a neat device. I personally have no interest in actually using something like it, so I see no reason to get it. However, there are two things that stick out to me: (1) If I were a kid, this would be the perfect on-ramp to PC gaming. It's cheap (a decent GPU costs more than this) and has everything you need integrated (screen, battery, IO)--imagine if in the 90s you could get a fully capable PC for just $225 ($400 adjusted for inflation). It's powerful enough to play the latest games, and FSR will extend its life span. It's simple/streamlined through SteamOS, but you can hack around with it as you can with any PC. You can upgrade the storage and bring it to a friend's house (which the kids love, I guess--mobility doesn't matter to me). And when you're ready to take off the training wheels, you can connect a mouse, keyboard, and external monitor. And then, longer term, you'll be primed to buy a desktop PC. I see a lot of Valve enthusiasts lining up with reservations to buy this thing "just because it's a cool device", but I hope Valve goes out of their way to market this to kids--that's where this could be really successful (I'm thinking a lot of these existing Steam users won't actually use it much because ... why not just use your PC?). (2) This will hopefully warm AMD up to the idea of making cheap PC gaming SoCs. Consoles are cheap not just due to subsidization but also due to the efficiencies that come with the tight integration of mass-produced SoCs. And most PC gamers don't even upgrade their PCs--they just buy a whole new system all at once, so the direct benefits of modularity and specialization are lost on them. So if Valve could convince AMD to make SoCs with performance in line with the major consoles (that's the target games are designed around anyway), that could go a long way toward making decent PC gaming systems cheaper. Right now a decent GPU alone costs as much as a modern console. From what Gabe has stated, they're probably even subsidizing this thing--which helps justify their 30% take if this is the direction they're going.
  5. I never had a good experience when setting other people up with Google Photos. The desktop syncing applications were unreliable--they would constantly stall and need to be restarted--and they were also CPU intensive. The duplicate detection didn't really work. The behavioral documentation was cryptic--I had to go to 3rd party websites just to get a good grasp on it. Support was practically nonexistent (as is typical for Google "products").
  6. By the way, here is another company working on the VAC (vergence-accommodation conflict) problem, but using lightfields, and it looks like they have some neat prototypes: https://www.roadtovr.com/creal-light-field-ar-vr-headset-prototype/ I guess coincidentally the company, "CREAL", is also pronounced "See Real".
  7. Thanks, that makes a lot of sense. I imagine determining the eyeball center is a very difficult problem, with the eye not being a rigid body.
  8. Quick question, because I'm having trouble finding more information on this: What is the difference between eye tracking and pupil detection? Is one just trying to determine direction, while the other is trying to determine the complex deformation of the eye? (A toy illustration of how pupil detection can feed a gaze estimate is below, after the post list.)
  9. Awesome, thanks so much for the impressions on it. I'm not going to pretend that I understand anything more than the high-level concepts, but I don't get to hear directly from actual researchers (former or otherwise) very often, so it's hard to ground myself. A few years ago the impression I got from Abrash was that we'd have varifocal by now or very soon, but every subsequent year he's pushed his predictions further into the future (and the latest is basically "I don't know when"). Sometimes I get the impression that his optimistic predictions are as much targeted at higher-ups in the company (who may not want to wait 15 years for a technology to develop) as they are at developers and some of the public. And I'm sure Valve is also working on something targeted at consumers (well, enthusiasts), but nobody can get a word out of them about anything. However, it now seems that if there is any short-term progress here, it will involve a major breakthrough in eyetracking, or something more radical/unexpected. In addition, it seems there will be many iterations on varifocal. If we could just get to the point where the visual comfort of a VR headset is comparable to an 800x600 CRT monitor from 1995, I'd be pretty satisfied and would feel good about the state of the tech. Honestly, most of the friends I've coaxed into buying headsets rarely ever use them anymore due to a variety of issues like this. The thing about the company in Dresden is interesting. Do you know if they were approaching a computational complexity suitable for real-time rendering?
  10. I'm curious then what you think about Facebook's approach to the problem as detailed in the video below. Not because I'm doubting you, but Facebook has been drumming up a ton of hype, claiming that varifocal is "almost ready for prime time" and such, and I want to hear a researcher's perspective on it. The gist I get from this is that eyetracking is the major thing holding this technology back. Granted, FB has a vested interest in creating the impression that this problem is on the verge of being solved, given its necessity for their enormous loss-leading investment into VR. At the same time, Michael Abrash has said that eyetracking may never be good enough for varifocal, and Carmack recently expressed significant doubts about eyetracking being anywhere near as accurate or as robust as people are hoping for.
  11. I think I'm going to quit VR until the fixed-focus / vergence-accommodation conflict is solved. Or maybe just take a long break. It seems to be getting more and more difficult for my eyes to tolerate it, perhaps because I'm ever more conscious of it. It's also harder to tolerate when I'm tired. There are times in games where I get immersed enough to tune it out a bit, but for me it's the #1 reason that flat gaming is 10x more comfortable than VR. For a VR nut like me this is actually a big deal, but I'm just tired of feeling like I'm crossing my eyes while playing games.
  12. The interactions in this game almost seem like they were designed for VR but limited by the M&K interface. I hope someone makes a VR mod.
  13. If you've got a VR headset and like electronic music, check out The Wave and, at times, VRChat. Various art exhibitions are being held in VR this year, e.g. through The Museum Of Other Realities: https://store.steampowered.com/app/613900/Museum_of_Other_Realities/ https://twitter.com/museumor For future sporting events and such, Google has figured out how to stream lightfield videos over a 300Mbps connection. Essentially, lightfield video gives you a volume (e.g. 70cm^3) in which you can move your head around, and the image is rendered correctly from every position and orientation (e.g. even mirrors work). Once headsets have variable focus, I could see there being a huge market for this: https://uploadvr.com/google-lightfield-camera-compression/ Of course nothing is quite like actually being present, but this is really the next best thing and, for some people, it's probably good enough.
  14. Good talk on how to solve VR's last major visual issue
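Re the foveated rendering numbers in post 1: here's a minimal back-of-the-envelope sketch of where gains in the 30%-200% range can come from. It is purely illustrative--the two-zone foveation model, the 20 degree foveal region, and the assumption that only ~60% of frame time scales with shaded pixel count are all made-up parameters, not measurements from any headset or talk cited above.

```python
def foveated_speedup(fov_deg=100.0, fovea_deg=20.0, periphery_scale=0.5,
                     resolution_bound_fraction=0.6):
    """Back-of-the-envelope estimate of a foveated-rendering speedup.

    Two-zone model: a central square region of `fovea_deg` is shaded at full
    resolution, the rest of a square `fov_deg` field at `periphery_scale`
    resolution per axis. Only `resolution_bound_fraction` of the frame time is
    assumed to scale with shaded pixel count (the rest is fixed cost such as
    geometry and CPU work). All parameter values are illustrative guesses.
    """
    foveal_area = (fovea_deg / fov_deg) ** 2
    shaded_fraction = foveal_area + (1.0 - foveal_area) * periphery_scale ** 2
    # Amdahl's-law style: only the resolution-bound part of frame time shrinks.
    new_frame_time = (1.0 - resolution_bound_fraction) + resolution_bound_fraction * shaded_fraction
    return 1.0 / new_frame_time


if __name__ == "__main__":
    for scale in (0.5, 0.25):
        gain = (foveated_speedup(periphery_scale=scale) - 1.0) * 100.0
        print(f"periphery at {scale}x resolution -> roughly {gain:.0f}% faster")
```

With these guesses the gain lands around 75-120%, i.e. well short of ">1000%": the fixed, non-resolution-bound part of the frame caps how much aggressive peripheral downsampling can buy.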
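Re the varifocal point in post 2 (and the vergence-accommodation complaints in post 11): a minimal sketch of the geometry a varifocal system has to reproduce, assuming a simplified 2D eye model. Real eye trackers work with full 3D gaze rays and per-user calibration; the 63 mm IPD and the gaze angles below are just example values.

```python
import math

def vergence_distance_m(ipd_m, left_inward_deg, right_inward_deg):
    """Distance of the point both eyes converge on, in a simplified 2D model.

    The eyes are `ipd_m` apart and each is rotated inward (toward the nose)
    by the given angle from straight ahead. Returns infinity for parallel gaze.
    """
    vergence_rad = math.radians(left_inward_deg + right_inward_deg)
    if vergence_rad <= 0.0:
        return math.inf
    # Isosceles triangle: half the IPD over the tangent of half the vergence angle.
    return (ipd_m / 2.0) / math.tan(vergence_rad / 2.0)


def accommodation_target_diopters(distance_m):
    """Optical power a varifocal display would dial in so focus matches vergence."""
    return 0.0 if math.isinf(distance_m) else 1.0 / distance_m


if __name__ == "__main__":
    d = vergence_distance_m(ipd_m=0.063, left_inward_deg=3.0, right_inward_deg=3.0)
    print(f"converged at ~{d:.2f} m -> set display focus to ~{accommodation_target_diopters(d):.2f} D")
```

The point of the exercise: a couple of degrees of gaze error changes the implied vergence distance (and therefore the focus the display should present) by a large amount at near range, which is why the posts above keep coming back to eyetracking accuracy as the bottleneck.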
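Re the layered, electrically controlled liquid crystal optics in post 3: one common way such stacks are described is as a set of binary-switchable elements whose optical powers add, giving a grid of discrete focal states. That framing is my own assumption for illustration, not something taken from Valve's patents or the leaks. A toy controller might simply pick the state closest to the requested focus:

```python
from itertools import product

def nearest_focal_state(target_diopters, element_powers):
    """Choose the on/off pattern of a stack of switchable lens elements whose
    summed optical power best matches the requested focus.

    `element_powers` lists the power (in diopters) each element adds when
    switched on; a switched-off element contributes nothing. The powers used
    below are hypothetical example values.
    """
    best_state, best_power = None, None
    for state in product((0, 1), repeat=len(element_powers)):
        power = sum(on * p for on, p in zip(state, element_powers))
        if best_power is None or abs(power - target_diopters) < abs(best_power - target_diopters):
            best_state, best_power = state, power
    return best_state, best_power


if __name__ == "__main__":
    # A hypothetical 4-element stack with binary-weighted powers: 0 to 3.75 D in 0.25 D steps.
    stack = [0.25, 0.5, 1.0, 2.0]
    state, power = nearest_focal_state(1.66, stack)
    print(f"requested 1.66 D -> switch pattern {state} giving {power:.2f} D")
```

With binary-weighted powers, N elements give 2^N evenly spaced focal states, so even a handful of thin layers can cover the usual accommodation range in fairly fine steps.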
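Re the eye tracking vs. pupil detection question in post 8, here is a toy illustration under my own simplified framing: pupil detection only locates the pupil center in a camera image, while gaze estimation (eye tracking) maps that image-space position to a gaze direction, usually via a per-user calibration. The quadratic polynomial fit below is one classic, very simplified way to do that mapping; the calibration data is made up.

```python
import numpy as np

def fit_gaze_mapping(pupil_xy, gaze_deg):
    """Fit a quadratic polynomial mapping from pupil center (image px) to gaze angles (deg).

    pupil_xy: (N, 2) pupil centers reported by a pupil detector during calibration.
    gaze_deg: (N, 2) known gaze angles (yaw, pitch) of the calibration targets.
    Returns the least-squares coefficient matrix.
    """
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    # Quadratic feature basis: 1, x, y, x*y, x^2, y^2
    features = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(features, gaze_deg, rcond=None)
    return coeffs


def predict_gaze(coeffs, pupil_xy):
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    features = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    return features @ coeffs


if __name__ == "__main__":
    # Made-up calibration: a 9-point target grid and fake pupil-detector output for it.
    rng = np.random.default_rng(0)
    gaze = np.array([[yaw, pitch] for yaw in (-15, 0, 15) for pitch in (-10, 0, 10)], float)
    pupil = 320 + gaze * np.array([3.0, -2.5]) + rng.normal(0, 0.5, gaze.shape)
    coeffs = fit_gaze_mapping(pupil, gaze)
    print(predict_gaze(coeffs, pupil[:1]).round(1))  # should be close to [-15, -10]
```

This kind of regression breaks down as soon as the headset shifts on the face or the eyeball model is wrong, which lines up with post 7's point that the eye not being a rigid body makes robust tracking genuinely hard.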