The Dark Mod Forums

Share your Status / What's on your mind?


STiFU

Recommended Posts

I think I'm going to quit VR until the fixed focus / vergence accommodation conflict is solved.  Or maybe just take a long break.  It seems to be getting more and more difficult for my eyes to tolerate it, perhaps because I'm ever more conscious of it.  It's also harder to tolerate when I'm tired.  There are times in games where I get immersed enough to tune it out a bit but for me it's the #1 reason that flat gaming is 10x more comfortable than VR.

 

For a VR nut like me this is actually a big deal but I'm just tired of feeling like I'm crossing my eyes while playing games.


7 hours ago, woah said:


You can wait a long time for that. I've written a whole PhD thesis on this matter! 😄 There were some ideas to use lenses with adjustable focal length, but that approach is rather impractical. There were also some advances in real-time holographic displays by a research facility in Dresden, but that company eventually lost its funding, which is a huge bummer, because on the one hand their approach was really promising, and on the other, holography is the only way the accommodation-vergence conflict can truly be solved. I haven't heard of anyone else pursuing real-time holography research since then.

In my thesis, I developed two methods that try to reduce the conflict in real time.

  1. I used an eye tracker to detect where the user is looking and slowly shifted that content back to the display plane, where the conflict does not exist. Obviously, this approach has its limitations, and experimental results did not show a significant improvement.
  2. In approach 1, I slowly shifted convergence. In approach 2, I dropped the idea of eye tracking and solely investigated how shifting convergence can be improved. When you apply a convergence shift, you implicitly also get a distortion of depth. So in this approach, I countered that distortion with a dynamic adjustment of the camera baseline (i.e., the distance between the virtual eyes). This approach yielded a significant improvement over the regular method, but was way too complex to be adopted by the film industry. (Fun fact: the image content I used in this experiment was generated by a modified version of Doom 3 BFG. 😄 )
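
To make the depth distortion and the baseline compensation from the second approach a bit more concrete, here is a tiny numerical sketch. It uses the textbook perceived-depth model (eye separation e, viewing distance D, on-screen disparity d, perceived depth Z = e·D/(e − d)). The gradient-preserving choice of the scale factor is my illustrative assumption for this post, not the actual optimization from the thesis, and all numbers are made up:

```python
# Toy model of approach 2: shift convergence so the fixated point lands on
# the screen plane (no accommodation-vergence conflict there), then scale
# disparities (a stand-in for changing the camera baseline) so the local
# depth gradient around fixation is preserved.
# All parameters are illustrative; this is NOT the thesis's actual method.

E = 65.0     # eye separation in mm
D = 700.0    # viewing distance to the screen in mm

def perceived_depth(d):
    """Perceived depth of a point with on-screen disparity d (mm)."""
    return E * D / (E - d)

def depth_gradient(d):
    """dZ/dd: sensitivity of perceived depth to disparity at d."""
    return E * D / (E - d) ** 2

def compensated_transform(d0):
    """Disparity mapping d -> k*d - delta that puts the fixated
    disparity d0 on the screen plane while preserving dZ/dd there."""
    k = E ** 2 / (E - d0) ** 2   # baseline-like scale preserving the gradient
    delta = k * d0               # shift bringing d0 to zero disparity
    return lambda d: k * d - delta

fix = 10.0                       # fixated point: 10 mm uncrossed disparity
t = compensated_transform(fix)

print(perceived_depth(fix))      # original vergence distance of the fixated point
print(perceived_depth(t(fix)))   # after the shift: exactly the screen distance D
```

With shift only (k = 1), the rest of the scene would flatten around the fixation; the baseline-like scale keeps the local depth impression intact while the fixated point moves onto the screen plane.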

As it currently stands, you have to rely on content producers to design their content such that the AV conflict is minimized. This is of course incredibly difficult with games, where stuff regularly flies out of the screen towards the viewer, which is the most critical case as far as visual fatigue is concerned.

By the way, accommodation-vergence conflict is the correct term for this, as accommodation already means focus adjustment, while vergence relates to the convergence of the eyes on a point of interest.
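
To put some quick numbers on why the two cues come apart in a headset: accommodation demand is fixed by the optics, while vergence follows the virtual depth of whatever you fixate. A minimal sketch (the 2 m focal plane and the object distances are illustrative assumptions; real headsets place the focal plane at various distances):

```python
# Quick numbers on the accommodation-vergence conflict: in a headset the
# display's focal plane is fixed, so accommodation demand is constant,
# while vergence follows the virtual depth of the fixated point.
# The distances below are illustrative.
from math import atan, degrees

E = 0.065            # eye separation in metres
FOCAL_PLANE = 2.0    # assumed fixed focal distance of the optics, in metres

def vergence_deg(z):
    """Vergence angle (degrees) when fixating a point at distance z."""
    return degrees(2 * atan(E / (2 * z)))

def accommodation_diopters(z):
    """Accommodation demand in diopters for focusing at distance z."""
    return 1.0 / z

for z in (2.0, 1.0, 0.5):
    print(f"virtual object at {z} m: "
          f"vergence {vergence_deg(z):.2f} deg, "
          f"matching focus demand {accommodation_diopters(z):.2f} D, "
          f"actual focus fixed at {accommodation_diopters(FOCAL_PLANE):.2f} D")
```

A virtual object at 0.5 m asks the eyes to converge as if it were half a metre away while the lens must stay focused on the fixed plane; the growing gap between the two columns is the conflict.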

Further reading: 😉
https://eldorado.tu-dortmund.de/bitstream/2003/36031/1/Dissertation.pdf


6 hours ago, STiFU said:


I'm curious then what you think about Facebook's approach to the problem as detailed in the video below. Not because I'm doubting you, but Facebook has been drumming up a ton of hype, claiming that varifocal is "almost ready for prime time" and such, and I want to hear a researcher's perspective on it. The gist I get from this is that eye tracking is the major thing holding this technology back. Granted, FB has a vested interest in creating the impression that this problem is on the verge of being solved, given its necessity for their enormous loss-leading investment in VR.

At the same time, Michael Abrash has said that eye tracking may never be good enough for varifocal, and Carmack recently expressed significant doubts about eye tracking being anywhere near as accurate or robust as people are hoping.

 

Edited by woah

15 hours ago, woah said:


I left academia three years ago and started working in industry, so I might've missed some of the most recent developments (especially related to VR, as I absolutely cannot stomach it), but I will gladly listen to that talk.

So, the concepts shown up until minute 50:00 have pretty much all existed for quite some time. However, it was nice to see some working prototypes of them, especially the liquid crystal varifocal lens (dynamically adjustable focal length without moving parts). Afterwards, the focal surface was introduced, which was totally new to me and looks really exciting, to be honest. Using a spatial light modulator to implement what is essentially per-pixel varying focal length is an ingenious idea!! At my previous institute, we had only used SLMs in the context of holography-based 2D image projection, which really serves no purpose at all except seeing whether it works or not. 😄 The captured results of the focal surface looked somewhat off to me, though. I can't really put my finger on what it is, but the blur etc. didn't look right. Possibly non-linear distortions because multiple wavelengths pass through the spatial light modulator, which is really only designed for a single wavelength? If it is that, it could easily be solved by using one SLM per color component and laser light as the illuminator. If it is not, I hope they get it sorted, as the idea seems really promising.

I am not even sure one would still need eye tracking with good focal surfaces. You are correct in your assessment that eye tracking is extremely error-prone. For my experiments, I had to go to extreme lengths to get eye tracking working properly for my first approach. I ended up carefully modeling all possible 3D-viewing eye movements in a Kalman filter to get rid of major tracking noise, and augmented that with a real-time stereo disparity estimation (which I implemented in CUDA) to be able to tell what point in 3D space the user most likely looked at. It worked fairly well, but might still not be accurate enough for vergence-driven accommodation control.
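
As an illustration of the noise-suppression part (and only that part — the full model with 3D-viewing eye movements and disparity fusion is far more involved), here is a minimal scalar Kalman filter on one gaze coordinate during a fixation. All tuning values are made up for the sketch:

```python
# Minimal 1-D Kalman filter smoothing one gaze coordinate during a fixation.
# The actual tracker modeled full 3D-viewing eye movements and fused a
# stereo disparity estimate; this stripped-down sketch only shows how a
# Kalman filter suppresses tracker noise. All tuning values are made up.
import random

def kalman_smooth(measurements, q=1e-4, r=1.0):
    """Random-walk state model: x_t = x_{t-1} + w,  z_t = x_t + v."""
    x, p = measurements[0], 1.0     # initial state estimate and variance
    out = []
    for z in measurements:
        p += q                      # predict: variance grows by process noise
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with the new measurement
        p *= (1 - k)
        out.append(x)
    return out

random.seed(0)
true_x = 10.0                                        # fixated gaze position (deg)
raw = [true_x + random.gauss(0, 1.0) for _ in range(500)]
smooth = kalman_smooth(raw)

def rmse(xs):
    return (sum((v - true_x) ** 2 for v in xs) / len(xs)) ** 0.5

print(rmse(raw), rmse(smooth))   # smoothed error is much lower than raw error
```

The small process noise q encodes the assumption that gaze is nearly stationary during a fixation; a real tracker needs saccade detection on top, since a filter tuned like this would badly lag any rapid eye movement.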

Considering the problems with eye tracking, I still think that holography is the most promising approach. Researchers have long demonstrated that it is possible to create actual holographic displays, but the problem is the computational side of things. It is incredibly costly to calculate a hologram in real time. That company in Dresden I mentioned earlier had a brilliant idea to solve this problem. If you want to construct the full optical wavefront of a point in 3D space, the resulting hologram occupies every pixel of your display. So, to render a full scene, you'd pretty much need to render a full frame (of extremely high resolution) for each 3D point in your scene, and all those sub-holograms are then superimposed on each other to form the final result. However, what if you don't need the full optical wavefront? After all, the wavefront only has to be accurate where the pupils of the viewer are located. If you only calculate the sub-holograms for a very narrow range of angles, they actually only span a significantly reduced number of pixels locally, reducing the computational complexity by a HUGE margin. Yes, you would still need pupil detection, but that is far more robust than eye tracking. That company (can't remember its name right now) actually had a working (albeit small) prototype as well. I never learned why they closed. 😞
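
A bit of back-of-the-envelope arithmetic shows why restricting the wavefront to the pupils helps so much. All numbers here are my assumptions for illustration, not the company's actual parameters:

```python
# Back-of-the-envelope arithmetic for the sub-hologram idea: a scene point's
# full hologram covers every display pixel, but if the wavefront only has to
# be correct within a narrow cone of angles aimed at the viewer's pupil, the
# sub-hologram shrinks to a small local patch. All numbers are assumptions.
from math import tan, radians

display_px  = 7680 * 4320        # assumed full-resolution hologram plane
pixel_pitch = 100e-6             # assumed pixel pitch in metres
point_depth = 0.5                # scene point 0.5 m from the panel
cone_deg    = 1.0                # assumed angular range toward the pupil

def subhologram_pixels(z, angle_deg, pitch):
    """Pixels covered by a sub-hologram limited to a narrow cone."""
    width = 2 * z * tan(radians(angle_deg) / 2)   # patch size on the panel
    side = max(1, round(width / pitch))
    return side * side

sub_px = subhologram_pixels(point_depth, cone_deg, pixel_pitch)
print(display_px / sub_px)   # per-point savings factor
```

With these made-up numbers the per-point cost drops by three to four orders of magnitude, which is the difference between hopeless and merely hard for real-time computation.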

So there you have it, my opinion on Facebook's new technology. Thanks for sharing that nice talk. 

I also gave a talk at Electronic Imaging in San Francisco in 2016 and had some nice holidays exploring the west coast afterwards. So, seeing something from Electronic Imaging always gives me some nice warm nostalgia. 🙂

Edit: After writing up this summary, I realize how much I miss doing research and writing about it. Maybe I should return to the academic world eventually. 🙂 


3 hours ago, STiFU said:


Awesome, thanks so much for your impressions. I'm not going to pretend that I understand anything more than the high-level concepts, but I don't get to hear directly from actual researchers (former or otherwise) very often, so it's hard to ground myself. A few years ago, the impression I got from Abrash was that we'd have varifocal by now or very soon, but every subsequent year he's pushed his predictions further into the future (and the latest is basically "I don't know when"). Sometimes I get the impression that his optimistic predictions are as much targeted at higher-ups in the company (who may not want to wait 15 years for a technology to develop) as they are at developers and some of the public. And I'm sure Valve is also working on something targeted at consumers (well, enthusiasts), but nobody can get a word out of them about anything.

However, it now seems that any short-term progress here will involve a major breakthrough in eye tracking, or something more radical/unexpected. In addition, it seems there will be many iterations on varifocal. If we could just get to the point where the visual comfort of a VR headset is comparable to an 800x600 CRT monitor from 1995, I'd be pretty satisfied and would feel good about the state of the tech. Honestly, most of the friends I've coaxed into buying headsets rarely ever use them anymore due to a variety of issues like this.

The thing about the company in Dresden is interesting. Do you know if they were approaching a computational complexity suitable for real-time rendering?

Edited by woah

Atm I'm giving the Master Chief Collection a workout 😆 and then some Cyberpunk 2077 to top it off. X-Mas will be on the short side anyway due to this darn virus (everything is closed down...). I'm also waiting on a replacement board for my old RIVE, so I might be offline for a little while when it comes, because it probably wouldn't hurt to get some of the crud out of my PC. So: reinstalling Windows and my build environment. Luckily most of my games are installed on separate hard drives, so there is that.


10 hours ago, woah said:

The thing about the company in Dresden is interesting. Do you know if they were approaching a computational complexity suitable for real-time rendering?

Last I heard, they had a working, very small prototype. I went digging for that company again and found it: SeeReal. To my surprise, the website is up again and has recent entries on it, so it seems like they didn't go bankrupt after all!

https://www.seereal.com/technology/

They even seem to have entered a partnership with Volkswagen. This is very exciting news!! They gave a talk about their holographic head-mounted display at SPIE Photonics Europe 2018. I'll try and see if I can find that talk somewhere, to see whether they made any progress.


I wish you all a happy new year!

1 hour ago, STiFU said:

Happy new year, everybody! It's been the most boring new year's eve ever! 😄

Not sure how the rules were for you. Here in Lüneburg it was not much different than usual. Slightly fewer fireworks, since they were not sold this year, but apparently people had enough old stock, and lighting them was only forbidden in specific parts of town, so there was still plenty.


Happy New 2021 Year.

 

2 hours ago, Destined said:


*Laughing in Chinese counterfeit fireworks sold everywhere.*

Edited by Anderson

"I really perceive that vanity about which most men merely prate — the vanity of the human or temporal life. I live continually in a reverie of the future. I have no faith in human perfectibility. I think that human exertion will have no appreciable effect upon humanity. Man is now only more active — not more happy — nor more wise, than he was 6000 years ago. The result will never vary — and to suppose that it will, is to suppose that the foregone man has lived in vain — that the foregone time is but the rudiment of the future — that the myriads who have perished have not been upon equal footing with ourselves — nor are we with our posterity. I cannot agree to lose sight of man the individual, in man the mass."...

- 2 July 1844 letter to James Russell Lowell from Edgar Allan Poe.



1 hour ago, Destined said:


I mean, regardless of the rules, you are still supposed to socially isolate instead of throwing a big party, but I don't want to start another Corona discussion.

So, my girlfriend and I spent the evening in front of the TV and went to bed at half past twelve, because our little one would wake at 7 am regardless of whether we partied or not. 😄


4 hours ago, STiFU said:

I mean, regardless of the rules, you are still supposed to socially isolate instead of throwing a big party, but I don't want to start another Corona discussion.

I agree. I was only meeting with the two people I see on a regular basis anyway. We went for a walk at midnight, and I was just surprised how many fireworks there were.


1 hour ago, Destined said:

I agree. I was only meeting with the two people I see on a regular basis anyway. We went for a walk at midnight, and I was just surprised how many fireworks there were.

I was also positively surprised. I live in a sleepy little town of 3,000 inhabitants, and they put on quite a show around midnight. My GF said it seems like everyone is celebrating that the trainwreck of a year that was 2020 is finally over ^^, but hopefully the festivities won't translate into an increased infection rate. We still don't have much in the way of laws or mandates for Corona here, mostly sternly worded guidelines. No restrictions on selling fireworks either. I did notice a lot of teens/tweens gathering outdoors without masks, but no sign of indoor parties, thankfully.


My Fan Missions:

Series:

Chronicles of Skulduggery 0: To Catch a Thief
Chronicles of Skulduggery 1: Pearls and Swine
Chronicles of Skulduggery 2: A Precarious Position
Chronicles of Skulduggery 3: Sacricide

Standalone:

The Night of Reluctant Benefaction
Langhorne Lodge


I am scared, both for the innocent people of Iran, who are oppressed by their government and murdered when they protest for democracy, and for the people of Israel, whom the Iranian rulers have threatened with nuclear annihilation if they get their nuclear bomb. I consider myself liberal and voted for Biden, but hearing that he wants to overly idealistically retry the 2015 nuclear deal, which, let's be honest, was never going to be followed through by Iran, makes me have second thoughts about voting for Biden. I despise Trump, but he seems right about putting an end to the Iranian regime, which funds much terrorism and torments those of its people who wish to return to democracy. The fact that cowardly EU powers are trying to make a Chamberlain-style appeasement with Iran, whilst leaving the new Israel-Middle East alliance out to dry, is infuriating. They're willing to let all the liberal ideals of human rights and secular democracy be trampled under the tread of Iran, in exchange for another term from their constituents, at the cost of what could be Israel's nuclear annihilation?


Ha ha! Linus Torvalds never fails to impress with his ability to brutalize hardware manufacturers for their poor practices:

https://www.realworldtech.com/forum/?threadid=198497&curpostid=198647

Linus needs to teach Carmack how to properly insult IHVs so they can go on an angry comic tour together.

 

Please visit TDM's IndieDB site and help promote the mod:

 

http://www.indiedb.com/mods/the-dark-mod

 

(Yeah, shameless promotion... but traffic is traffic folks...)


Since Arkane has already made a spiritual successor to Thief (Dishonored) and to System Shock (Prey), they should be due for one for Deus Ex, to complete the gaming golden age's trinity hat trick.

Recently I've been playing Hades, among some other games. It's slick: a cool aesthetic, and a quick gameplay loop for each room (which is what was sorely lacking in, e.g., Void Bastards trying the same model). The story is cool. The tone is a bit overbearing, but it's easy to get caught up in the gameplay and in trying to make it a few more rooms each round. I don't think it's an all-time masterpiece, but I think it's a great game, one of the best of 2020.


What do you see when you turn out the light? I can't tell you but I know that it's mine.




  • Recent Status Updates

    • Nort

      I did it! I finally did it! After 48 hours of frustration and confusion, I managed to finally sort out how to attach equipment to NPCs properly, and to write down coherent instructions on how to do it, both in my manual and in this forum.
      I can finally rest. I can finally eat breakfast.
      · 0 replies
    • Nort

      Spent most of the day learning and writing about AI navigation. It's stuff I already know, but the magic of writing something down as clearly as possible is that you then see it that much more clearly. You also find what's unclear to you, and then you figure out exactly how things work. I was hoping to get through it all within just a day, but it looks like this will take tomorrow as well, at this rate. There's also plenty of other nonsense I have to get through. Everything's a chore.
      · 0 replies
    • Nort

      I want to be a fish, but I'm not a fish. My mom wants to be a fish too, and so she's killing herself eating plankton and sh***ng blood, and tells me to stop nagging her about taking her medication. I spent several hours today, just lying in bed, having anxiety over just existing, and not being a fish. You fish, you don't know how lucky you are. You can eat all the plankton you want. I tried eating plankton again yesterday, and I just got sick. I hate being a shark.
      · 0 replies
    • Nort

      I just gave myself vertigo. A pleasant kind of vertigo, like the world has been lifted off my shoulders. I'll explain:
      Yesterday I saw to my dismay that I had made my entire map two - two - units too short on every level: every set01 piece was sticking 2 units into the ceiling. That's basically 402 brushes that need to be realigned (minus the ground floor brushes).
      I knew enough about selections to do all of that in a very tense five minutes, and it compiled without leaks. (Thank you so much, Dark Radiant devs, for making an editor with such care for precision that you can align hundreds of brushes perfectly at once - which is not something I can say for Valve's Hammer editor, which has some serious issues on that front that actually made me quit it in disgust.) However, the result is that the entire level has now been stretched a barely noticeable 2 units, and it will take some getting used to psychologically.
      · 0 replies
    • Nort

      My workflow is basically running from a chain of disasters, eventually trying to seek shelter in former disasters. It's not ideal - it's just my life.
      When I abandoned my first map, it was out of a typical mental breakdown, and so I returned to find a skybox void where the kitchen door should have been (due to a misplaced visportal) and two overlapping brushes Z-fighting on the kitchen floor.
      I've now cleaned up the last bit of mess, by cleanly separating every floor into its own layer. Now I can finally work on each floor in peace.
      ...not that I really needed to. Once you get skilled enough, the orthographic messes, well, I'll let this video speak for itself:
       
      · 1 reply