The Dark Mod Forums

Everything posted by woah

  1. Right there with you with respect to that desire to have good black levels again. Also looking forward to Micro-OLED displays that may be coming to VR headsets next year, assuming they've addressed mura and black smear. Unfortunately I doubt they'll be dynamic focus displays
  2. https://www.youtube.com/watch?v=Cz8ObjoYhLQ This is also a really neat Half-Life themed short
  3. Really enjoyed this STALKER fan film https://www.youtube.com/watch?v=GvJ91D-N29g
  4. Continuing on with my previous posts about this topic, Carmack actually addressed the state of their varifocal technology in the above talk. I guess the much-coveted Half Dome varifocal prototypes they demonstrated (well, "demonstrated" as in "showed a through-the-lens video of it") still have lots of problems and didn't really work well outside of the lab. There were also problems with cost and glasses. Unsurprisingly, varifocal that isn't "perfect" is worse than fixed focus. It seems quite premature to me, then, that Lanman claimed varifocal was "almost ready for prime time" 1.5 years or so ago. Carmack hopes they can collect a bunch of eyetracking data across wider populations with their next headset (to increase accuracy, I suppose), but this seems to confirm varifocal isn't going to be a feature from them any time soon. But just how accurate and robust does eyetracking have to be for this to work in a consumer product? E.g. if eyetracking running at >200 Hz screws up your focus once every minute, does that create an unacceptable user experience? https://skarredghost.com/2021/10/22/creal-ar-vr-lightfields Then there's this demo of CREAL, which approximates a light field: "CREAL states that its innovation is in not calculating low-resolution lightfields for many positions, but few high-quality resolution lightfields in a few selected positions around the eye of the user". The impressions are very exciting to me because it sounds like it's another step along the path of addressing the primary issues I have with VR. However, there's no actual eyetracking, and the lightfield portion of the display is limited to a mere 30 degrees (the display is foveated, there's a standard fixed-focus display around the perimeter, and the transition between the two is abrupt).
I have to wonder if it's possible to use a similar kind of display but with eyetracking so the lightfield region follows your eye--sort of similar to what @STiFU mentioned a while back (though that was with a holographic display).
  5. Heck, even 50% would be an incredible boost--that's like getting a new generation of GPUs. The numbers I've typically seen are between 30% and 200% but it will likely improve over time. However, I don't think it's going to be anywhere close to the >1000% that Abrash was originally predicting. At the last Oculus Facebook Connect, Carmack reined in those expectations. More recently he said this:
  6. Vari-focal adjustments, i.e. basically allowing each eye to correctly accommodate in a way that's matched to vergence. Should make VR much more comfortable, more immersive, and less limiting (especially when it comes to near-field interactions). Right now the focus depth is fixed and it sucks. EDIT: Foveated rendering may be a thing as well but it seems to have been way over-hyped (in terms of realistic performance gains)
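One thing worth untangling in the foveated rendering numbers above: a throughput boost and a frame-time saving are not the same percentage. A quick sketch of the conversion (my own arithmetic, not from any of the talks):

```python
def frame_time_saving(throughput_boost_pct: float) -> float:
    """Convert a rendering-throughput boost (e.g. 50 for +50%)
    into the fraction of frame time saved."""
    speedup = 1 + throughput_boost_pct / 100
    return 1 - 1 / speedup

# A +50% throughput boost ("a new GPU generation") trims a third off frame time:
print(round(frame_time_saving(50), 3))    # 0.333
# The optimistic >1000% figure would have cut frame time by over 90%:
print(round(frame_time_saving(1000), 3))  # 0.909
```

So even the low end of the reported range is a meaningful chunk of a frame budget, which is why "even 50% would be an incredible boost".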
  7. Curious things happening on the PCVR front. A new headset, codenamed "Deckard", was found in the SteamVR files and Arstechnica has confirmed that it's real. There are also other things suggesting a split rendering system (mixing parts of rendering between a PC and processing within the headset), "inside out" tracking that also works with lighthouse tracking, wireless, and possibly standalone functionality. Seems awfully similar to Valve's patents for split rendering, a new kind of tracking sensor, and headstraps with integrated wireless and processing. The most interesting thing to me, though, is that based on public filings from a 2020 lawsuit, it's been revealed that in 2018 Valve invested $10m in the company ImagineOptix. IO is an optics manufacturer that uses photolithographic processes to pattern thin-film (a few micrometers thick) liquid crystal optics that can be layered and electrically controlled. IO entered a contract with Valve which included a Master Supply Agreement and the construction of a factory to mass produce these optics for Valve. The factory was finished in mid-2020, and in early 2021 Valve filed ~10 patents describing the application of the technology to a variety of things important for VR. From Valve's patents, they want to use the technology not just for variable focus but also optical/eye correction in the headset (i.e. no glasses/contacts), optical blur and dynamic angular resolution adjustment in different fields of the FOV, pancake lens-like features, and many other things. The varifocal aspect is similar to Facebook's "Half Dome 3" prototype (which they showed in 2019 through a short video), but apparently Valve was already making moves to mass produce similar liquid crystal lenses a year prior. They also recently filed a patent for a form of eyetracking I haven't seen used much, which would be necessary for most of this stuff to work at all.
Of course patents and leaks don't necessarily mean anything about actual products, Valve could cancel it, accurate eyetracking is hard, and if they actually release a VR headset that's this advanced it would fundamentally change the VR experience--it seems too good to be true. A solid-state lens that can be electrically controlled to perform so many dynamic optical functions is like something out of science fiction. On the other hand, they built a factory to mass produce these lenses and Arstechnica says speculation about these lenses is "on the right track", so it's somewhat tantalizing.
  8. It definitely seems like a neat device. I personally have no interest in actually using something like it so I see no reason to get it. However, there are two things that stick out to me: (1) If I were a kid this would be the perfect on-ramp to PC gaming. It's cheap (a decent GPU costs more than this) and has everything you need integrated (screen, battery, IO)--imagine if in the 90s you could get a fully capable PC for just $225 ($400 adjusted for inflation). It's powerful enough to play the latest games and FSR will extend its life span. It's simple/streamlined through SteamOS but you can hack around with it as you can any PC. You can upgrade the storage and bring it to a friend's house (which the kids love, I guess--mobility doesn't matter to me). And when you're ready to take off the training wheels, you can connect a mouse, keyboard, and external monitor. And then longer term you'll be primed to buy a desktop PC. I see a lot of Valve enthusiasts lining up with reservations to buy this thing "just because it's a cool device" but I hope Valve goes out of their way to market this to kids--that's where this could be really successful (I'm thinking a lot of these existing Steam users won't actually use it much because ... why not just use your PC?) (2) This will hopefully warm AMD up to the idea of making cheap PC gaming SoCs. Consoles are cheap not just due to subsidization but also due to the efficiencies that come with the tight integration of mass-produced SoCs. And most PC gamers don't even upgrade their PCs--they just buy a whole new system all at once, so the direct benefits of modularity and specialization are lost on them. So if Valve could convince AMD to make SoCs with performance in line with the major consoles (that's the target games are designed around anyway), that could go a long way toward making decent PC gaming systems cheaper. Right now just a decent GPU costs as much as a modern console.
From what Gabe has stated, they're probably even subsidizing this thing--which helps justify their 30% take if this is the direction they're going.
  9. I never had a good experience when setting other people up for Google Photos. The desktop syncing applications were unreliable--would constantly stall and need to be restarted--and also CPU intensive. The duplicate detection didn't really work. The behavioral documentation was cryptic--I had to go to 3rd party websites just to get a good grasp on it. Support was practically nonexistent (as is typical for Google "products")
  10. By the way, here is another company working on the VAC problem but using lightfields and it looks like they have some neat prototypes https://www.roadtovr.com/creal-light-field-ar-vr-headset-prototype/ I guess coincidentally the company, "CREAL", is also pronounced "See Real"
  11. Thanks, that makes a lot of sense. I imagine determining the eyeball center is a very difficult problem, with the eye not being a rigid body.
  12. Quick question, because I'm having trouble finding more information on this: What is the difference between eye tracking and pupil detection? Is one just trying to determine direction, while the other is trying to determine the complex deformation of the eye?
  13. Awesome, thanks so much for the impressions on it. I'm not going to pretend that I understand anything more than the high-level concepts, but I don't get to hear directly from actual researchers (former or otherwise) very often so it's hard to ground myself. A few years ago the impression I got from Abrash was that we'd have varifocal by now or very soon, but every subsequent year he's pushed his predictions further into the future (and the latest is basically "I don't know when"). Sometimes I get the impression that his optimistic predictions are as much targeted at higher-ups in the company (that may not want to wait 15 years for a technology to develop) as they are at developers and some of the public. And I'm sure Valve is also working on something targeted at consumers (well, enthusiasts) but nobody can get a word out of them about anything. However, now it seems that if there will be any short-term progress here it will involve a major breakthrough in eyetracking, or something more radical/unexpected. In addition, it seems there will be many iterations on varifocal. If we could just get to the point where the visual comfort of a VR headset is comparable to an 800x600 CRT monitor from 1995 I'd be pretty satisfied and would feel good about the state of the tech. Honestly, most of the friends I've coaxed into buying headsets rarely ever use them anymore due to a variety of issues like this. The thing about the company in Dresden is interesting. Do you know if they were approaching a computational complexity suitable for real-time rendering?
  14. I'm curious then what you think about Facebook's approach to the problem as detailed in the video below. Not because I'm doubting you but Facebook has been drumming up a ton of hype and claiming that varifocal is "almost ready for prime time" and such, and I want to hear a researcher's perspective on it. The gist I get from this is that eyetracking is the major thing that's holding this technology back. Granted, FB has a vested interest in creating the impression that this problem is on the verge of being solved given its necessity for their enormous loss leading investment into VR. At the same time Michael Abrash has said that eyetracking may never be good enough for varifocal and Carmack recently expressed significant doubts about eyetracking being anywhere near as accurate or as robust as people are hoping for.
  15. I think I'm going to quit VR until the fixed focus / vergence accommodation conflict is solved. Or maybe just take a long break. It seems to be getting more and more difficult for my eyes to tolerate it, perhaps because I'm ever more conscious of it. It's also harder to tolerate when I'm tired. There are times in games where I get immersed enough to tune it out a bit but for me it's the #1 reason that flat gaming is 10x more comfortable than VR. For a VR nut like me this is actually a big deal but I'm just tired of feeling like I'm crossing my eyes while playing games.
  16. The interactions in this game almost seem like they were designed for VR but limited by the M&K interface. I hope someone makes a VR mod.
  17. If you've got a VR headset and like electronic music, check out The Wave and, at times, VRChat. Various art exhibitions are being held in VR this year, e.g. through The Museum Of Other Realities https://store.steampowered.com/app/613900/Museum_of_Other_Realities/ https://twitter.com/museumor For future sporting events and such, Google has figured out how to stream lightfield videos over a 300Mbps connection. Essentially lightfield video gives you a volume (e.g. 70cm^3) in which you can move around your head and the image is rendered correctly from every position and orientation (e.g. even mirrors work). Once headsets have variable focus I could see there being a huge market for this https://uploadvr.com/google-lightfield-camera-compression/ Of course nothing is quite like actually being present but this is really the next best thing and, for some people, it's probably good enough.
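To put that 300 Mbps figure in perspective, here is a back-of-envelope sketch of what such a stream consumes (my own arithmetic; assumes a constant bitrate and decimal units):

```python
def stream_size_gb(mbps: float, hours: float) -> float:
    """Data volume of a constant-bitrate stream, in decimal gigabytes."""
    bytes_per_second = mbps * 1e6 / 8
    return bytes_per_second * hours * 3600 / 1e9

# One hour of a ~300 Mbps lightfield stream:
print(stream_size_gb(300, 1))  # 135.0
```

So a feature-length lightfield broadcast would run to hundreds of gigabytes transferred, which is why getting it down to 300 Mbps at all counts as a compression breakthrough.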
  18. Good talk on how to solve VR's last major visual issue
  19. the psychology of US investors in the current market
  20. I'll be happy to get on just 100 Mbps fiber optic in the next few weeks. This will be up from an average of 50 kB/s through Verizon Wireless with a data cap. I've been trying to get them to install this for 4 years.
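That jump is bigger than it looks, because the two figures use different units (megabits per second vs. kilobytes per second). A quick conversion (my own arithmetic; kB taken as 1000 bytes, assuming full link utilization):

```python
# Rough speedup of a 100 Mbps fiber link over a 50 kB/s wireless connection.
fiber_kBps = 100e6 / 8 / 1000   # 100 Mbps -> 12500 kB/s
old_kBps = 50                   # average throughput quoted in the post
print(fiber_kBps / old_kBps)    # 250.0
```

In other words, roughly a 250x improvement in raw throughput, before even counting the removal of the data cap.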
  21. I doubt we'll see very many games at this fidelity for a while but it's nonetheless quite exciting. Especially for developers, assuming it's really that easy. Also, no cut up to $1 million is really cool for a fully open source top of the line game engine. However what I found comical was how we're being shown these incredible graphics but in terms of interactions we're still stuck at "Press X To Interact"
  22. My review: The game is quite amazing in terms of production quality, atmosphere, immersion and the mechanics that they have implemented. It's hard to convey without actually experiencing it but I've never felt so "in" a virtual world before--I've played plenty of VR games but none of them have done anything close to this. The best way I can describe it is "dense". The graphics are often near photorealistic and nearly everything is intricately detailed. The audio is like nothing I've experienced before--almost every sound is accurately mapped spatially and feels so "correct". The environments are fleshed out to an absurd degree, so if you're the type of gamer that likes to spend a lot of time exploring and getting immersed in an environment, this game is a dream come true. You can pick up and prod just about anything, the physics are more well behaved than anything I've seen before, and the hand/finger mapping is so good that it makes you want to reach out and "feel" things--the way the haptics respond and the way your virtual hand conforms to the surfaces kind of compels you to do this (and parts of the environment will respond, e.g. the Xen fauna is a delight to interact with). It's an extraordinary work of art. In a recent interview Valve said that they had tried to do this in the past but what would happen on the desktop is that, with the exception of a small minority, gamers would just speed right past everything and never look back. That meant many hours of developer time just being wasted so it couldn't be justified. However with VR they realized people were spending a lot of time interacting with environments at a higher resolution and this actually persisted over time (they originally assumed it would subside with the VR spectacle). After playing the game this makes total sense and has me pretty excited about the future of gaming. 
If developers can justify adding more depth to their games and the medium itself motivates players to actually experience that depth, that is only a good thing. The mechanics, interactions, and AI they have implemented are done very well. Everything feels very rewarding to use and interact with--it's all polished to an absurd degree. Especially the gravity gloves--they are a joy to use. The combat is less about scale and more about small encounters. Every shot you take feels consequential. It was already clear to me that e.g. aiming a gun in real life is much harder than on a desktop, but what they've tried to do is take advantage of this to add intensity to what would otherwise have seemed banal (so e.g. an encounter with a few headcrabs becomes a big deal). The AI itself is actually more reminiscent of the HL1 AI (e.g. the grunt AI), and this is a good thing. This failed on some accounts due to the teleportation focus and some design decisions that followed from it--which brings us to the downsides. Where the game leaves much to be desired is in the variety of mechanics implemented, the forms of locomotion, and the teleport focus. Much of what they've done *is* impressive, but only for a teleport game. The game supports smooth locomotion (and the actual movement is not a bad implementation at all), but it's clearly a game designed around the constraints of teleportation. The interactions, the AI, the combat, and the types of locomotion are all within the confines of what is viable with teleportation. E.g.: There is no melee combat in the game whatsoever--perhaps one of the most obvious affordances of VR--because any compelling melee combat requires movement (being able to quickly dodge and advance on the enemy). The AI is much less difficult than it should be because teleport gamers can't strafe and back-step, which means the AI moves slowly, does not overwhelm you, and gives you plenty of time to take aim at it.
To make things more difficult for smooth-locomotion players, on the higher difficulties they just increased enemy hit points (Combine can take like 6+ shots to the head on the highest difficulty), but I still didn't find it difficult because--as a smooth-locomotion player--I can just strafe to cover and carefully take shots at enemies that give you plenty of time to aim. You can't climb, swim, jump, run, drive vehicles, or do anything that doesn't map well to teleportation. There is a sort of "teleport mantle"--which works--but it's still underwhelming. Given the nature of this game--that tons of new VR gamers are going to be jumping straight into this with no prior VR experience--it does make sense to design the game this way. However, if they make Half-Life 3 VR (see Anderson's link if you want to be spoiled), they can't do it like this, because the majority of regular VR users get used to smooth movement in short order and then never look back. Rather, they need to take full advantage of what are now "standard" forms of comfortable smooth movement and then port back to teleport for the minority that can't adapt. E.g.: adding near-field interactions and melee combat with AIs (thankfully it seems likely for HL3VR if you've spoiled the ending of HLA). They need to increase AI difficulty and tension not through hit points but rather through the AI's ability to take cover and maneuver quickly (VR is actually much better suited to low TTK due to the higher difficulty of aiming--it makes each shot feel more rewarding and consequential). They need to incorporate the common forms of locomotion we enjoy in flat gaming--climbing, swimming, platforming, vehicles--and then expand on them in ways that take advantage of VR input (for example, geometry- and physics-based climbing mechanics with motion controllers).
If, after they've implemented these things, the teleport counterparts are lame or clunky then that's unfortunate for that minority that can't adapt, but it is much better than leaving so much of what makes gaming compelling off of the table. Overall the game is an undeniably incredible experience. I can't imagine what it would be like to experience this as your first VR game--it would probably be akin to that magic feeling that HL1 and HL2 gave me as a kid. The game deserves its praise. However, for a VR gamer that's seen a lot of cool VR mechanics from indie developers and that is not locomotively gimped, it's quite limited mechanically and I don't think this approach will work for their next VR Half-Life game. HLA needs to serve as the introductory experience--a beautiful work of art at that--but what comes next should retain this production quality but show the value proposition of VR mechanically.
  23. That's certainly not what I am personally seeing among the VR users I've interacted with, and it's not what I'm reading from those familiar with the actual usage numbers. E.g. Palmer himself (the VR poster boy) wrote an article on this very issue where he points to large-scale real-world market testing that demonstrates very poor retention outside of the hardcore users (of course you're not going to hear this first hand from Valve, Facebook, Sony, etc. directly, because that would not instill confidence in the medium they've invested so heavily in). The Steam Hardware Survey shows that after 4 years there are only about 1.3 million connected VR headsets on Steam (and that's just connected, it doesn't mean they're being used), despite many more headsets actually having been sold. The total number of concurrent VR users across all games that are actually using their headsets on Steam is roughly only 10k (up from ~5k in 2017), which is something that a single game in the top 50 on Steam is easily capable of. You have, for instance, Pavlov VR, which is essentially the CS:GO counterpart for VR (the gameplay, the behavior of the weapons, the game modes--people have literally ported all of the maps), but it typically gets 500 to 1K users. Anecdotally I basically see the same thing: probably about 80% of the people I've gotten into VR (with cheap Rifts off of eBay) and those I've met online hardly use their headsets anymore despite the content improving year after year--they would rather just play flat games. There is definitely a hardcore kind of user that sticks around, but they are far from the majority. I predict you're going to see a boost from HLA but the same general result. During the honeymoon phase many people will claim that they can't possibly go back to flat games (you hear this over and over again), but a few months to a year later most will. As the collective experience improves, fewer and fewer will drop off.
In the early days of VR that's something I actually thought myself--that people would be willing to tolerate it for what VR adds to the experience--but what I've come to learn is that there's a big difference between something just being "tolerable" and actually wanting to use it every day when you have much more comfortable/less clunky and "good enough" alternatives (the alternative in this case being flat gaming). E.g. I can tolerate an annoying glare from a nearby window on a high-quality desktop monitor, but if we assume that glare is for whatever reason unavoidable, eventually I'd almost certainly tire of it and even opt for a much worse monitor that doesn't suffer from such a glare. VR has tons of issues like this that are constantly nagging you--not just glares but distortion, large parts of the scene that are out of focus, clarity problems, etc. (many of the things I listed above)--and it takes a certain kind of user to actually put up with them at this point. I actually see the opposite effect with respect to casuals: the more casual the user, the less likely they are to put up with all of this. It's hard enough just to get them to the point of gaining their "VR legs" (resilience to sim sickness). It's similar to how in the early days of the smartphone you had a small market of individuals interested in e.g. PalmPilots, and while the average person might genuinely think it was a cool device, they wouldn't actually use it due to the collective problems resulting from immature hardware and software. It was appropriate for a certain kind of user, but the more casual consumer needed a user experience that was improved on basically all fronts.
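The usage figures quoted above make the retention point concrete. A quick back-of-envelope ratio (the inputs are the numbers from this post; the calculation itself is just my illustration):

```python
# Back-of-envelope ratio from the figures quoted above (circa the post's era).
connected_headsets = 1_300_000   # VR headsets seen by the Steam Hardware Survey
concurrent_vr_users = 10_000     # rough peak concurrent VR users across all games

# Fraction of connected headsets in active use at peak, as a percentage:
print(round(concurrent_vr_users / connected_headsets * 100, 2))  # 0.77
```

Under one percent of even the connected headsets in use at peak--and that denominator already excludes headsets that were sold but never plugged in.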