The Dark Mod Forums

Carmack's DIY VR-Headset at E3


plasticman

Recommended Posts

Seems to be going viral, even in non-techie media: behind closed doors, John Carmack showed a VR headset at E3.

 

There are some videos available; I'd pick this one for its sound quality ([giantbomb.com]).

 

The device itself seems to be a prototype of an upcoming DIY kit developed by someone called Palmer Luckey. There's a long thread about it (with Carmack posting occasionally).

 

One technically interesting part is that the optics are kept simple while the software compensates for the lens distortion. In this case, the software is none other than the upcoming BFG Edition of Doom 3.
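As a rough illustration of what that software-side compensation could look like (not the actual Doom 3 BFG code; the coefficients below are invented): the renderer pre-warps the image with a radial polynomial so that the lens's pincushion distortion cancels it back out.

    # Sketch of a software pre-warp for a simple magnifier lens. For every output
    # pixel the renderer samples the scene texture slightly further from the lens
    # center, producing a barrel-distorted image that the lens straightens out.
    # k1 and k2 are placeholder coefficients, not values from any real headset.

    def prewarp(u, v, k1=0.22, k2=0.24):
        """u, v are normalized coordinates (-1..1) centered on the lens axis;
        returns the texture coordinates to sample from the rendered frame."""
        r2 = u * u + v * v                        # squared distance from the lens center
        scale = 1.0 + k1 * r2 + k2 * r2 * r2      # radial distortion polynomial
        return u * scale, v * scale

    # A pixel near the edge of the view samples noticeably further out:
    print(prewarp(0.8, 0.0))   # about (0.99, 0.0)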

 

The dream is: Carmack will succeed; in about a year there will be affordable consumer headsets that don't suck; the code pieces that are needed will become open source; and "someone"™ will put it into The Dark Mod.

 

:rolleyes:

 

I don't really think we will get there...

  • Like 4
Link to comment
Share on other sites

For the record, I've always loved the idea of VR. I still remember my first experience with a VR headset at some expo thing back in the early-90s. Loved it then; love it now. Dark Mod would be great for it!

 

My dream game for it would probably be IL-2, since you're already mousing your "head" around constantly as it is. I already wanted to get an IR head-tracking headset, but going full-on VR would be ideal.

What do you see when you turn out the light? I can't tell you but I know that it's mine.

Link to comment
Share on other sites

I was playing around with head tracking a bit recently using free software called FaceTrackNoIR. There's no need for infrared reflectors or LEDs; all that's required is a webcam. The latency isn't the greatest and it's a bit of a pain getting everything set up properly, but it works.

 

I didn't own any games that support head tracking to test it in, so I used more freebies: The Hunter and Rise of Flight. From what I experienced, I can see the value in it, but I'm an old-school FPS junkie and it's really weird trying to separate head movement from mouse movement.

Link to comment
Share on other sites

I was reading an article that some groups are developing a whole augmented reality (AR) setup, where you have the glasses and they augment the world around you ... like a kind of HUD that interfaces with the real world geometry as you walk or drive around. If they're going that far, no reason they couldn't throw in a VR mode while they're at it.

What do you see when you turn out the light? I can't tell you but I know that it's mine.

Link to comment
Share on other sites

I imagine that moving your head all the time will be a bit more stressful than just moving the mouse a tiny bit. Whole new illnesses spring up :D

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man." -- George Bernard Shaw (1856 - 1950)

 

"Remember: If the game lets you do it, it's not cheating." -- Xarax

Link to comment
Share on other sites

Slightly off topic: the next generation of mobile phones will be glasses, no longer a handheld device. Maybe these glasses could be modified into VR headsets.

I guess you're referring to Google's Project Glass. This is just a vision of Google's, not a prediction of what is to come. ;) I gave this a lot of thought when I first heard about Project Glass, and my conclusion was: what Google illustrates is impossible using conventional optical imaging. Your eyes cannot focus on objects that close to you, so they would need a lens to alter the focal distance of the display (a rough calculation is sketched below), and that would completely mess up the user's visual system. Furthermore, the HUD is presented to one eye only, which again messes up the user's visual system because there will be binocular rivalry. There are basically only three options to achieve the desired effect:

  1. Using semitransparent mirrors that let the light beams of both the display (with altered focal distance) and the actual world reach the eye. Of course this approach wouldn't allow for as compact a design as illustrated in the videos, but this is the way these VR displays usually work. Using such glasses has the side effect of darkening the world like sunglasses.
  2. Implants that project the HUD elements directly onto the retina. Unlikely in the near future.
  3. Holographic displays that generate a wave field as if the HUD elements were well in front of the user. This could actually become possible in something like 5 to 10 years, as real-time holographic displays are currently being researched. Check out the company SeeReal (Dresden, Germany), for instance, if you're interested in this topic.

http://www.youtube.com/watch?v=9c6W4CCU9M4
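To put a rough number on the focal-distance point above (purely illustrative figures, not taken from Project Glass or any real headset): a micro-display a few centimetres from the eye sits well inside the eye's near point of roughly 25 cm, so a magnifier lens has to move its image out to a distance the eye can actually accommodate.

    # Thin-lens sketch of why a near-eye display needs a lens: the lens forms a
    # virtual image of the display at a comfortable viewing distance.
    # The 4.5 cm / 5 cm figures below are made up for illustration.

    def virtual_image_distance_cm(display_dist_cm, focal_length_cm):
        """Thin-lens equation 1/f = 1/d_o + 1/d_i; a negative d_i means a virtual
        image on the same side as the display, which is what a magnifier produces."""
        inv_di = 1.0 / focal_length_cm - 1.0 / display_dist_cm
        return 1.0 / inv_di

    # Display 4.5 cm behind a lens with a 5 cm focal length:
    print(virtual_image_distance_cm(4.5, 5.0))   # about -45 -> virtual image ~45 cm away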

 

I was reading an article that some groups are developing a whole augmented reality (AR) setup, where you have the glasses and they augment the world around you ... like a kind of HUD that interfaces with the real world geometry as you walk or drive around. If they're going that far, no reason they couldn't throw in a VR mode while they're at it.

The problem with AR is that the visions are again way too sophisticated. People dream about highlighting world geometry precisely (like emphasizing the edges of an important building). Of course this would be cool, but there is one huge problem: the pupil sits on the eyeball, so its position depends on the viewing direction. Hence you would need a complex and precise eye-tracking mechanism in the AR glasses for both eyes, or the highlighted display elements would be offset. I have actually seen AR glasses like this at this year's CeBIT (Hannover, Germany), and yes, the AR overlay was offset in the display. :)

An easy way to achieve a similar effect would be to have the user look at small displays that show the real world as captured by two cameras at the front of the glasses. In that case, standard 2D image-processing algorithms could be used to highlight the desired objects, add text, et cetera. But this approach would not be true to the original AR concept.
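To get a feeling for the size of that offset, here is a back-of-the-envelope parallax sketch. All the numbers (pupil travel, virtual-image distance, object distance) are my own illustrative assumptions, not measurements from any real AR device.

    import math

    # Rough estimate of overlay misregistration when the eye rotates but the
    # glasses don't track it. Assumption: the pupil sits roughly 12 mm in front
    # of the eye's center of rotation, so turning the gaze shifts the pupil
    # sideways, causing parallax between the overlay (drawn at a fixed
    # virtual-image distance) and the real object behind it.

    def overlay_error_deg(gaze_deg, image_dist_m, object_dist_m, pupil_radius_m=0.012):
        pupil_shift = pupil_radius_m * math.sin(math.radians(gaze_deg))
        # Parallax between two points at different depths, seen from a shifted viewpoint:
        error_rad = pupil_shift * abs(1.0 / image_dist_m - 1.0 / object_dist_m)
        return math.degrees(error_rad)

    # Glancing 20 degrees off-axis, overlay at 0.5 m, building edge at 10 m:
    print(overlay_error_deg(20.0, 0.5, 10.0))   # roughly 0.45 degrees of offset

Half a degree doesn't sound like much, but against a sharp real-world edge it is clearly visible, which is why precise eye tracking (or the camera pass-through workaround) is needed.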

Link to comment
Share on other sites

I think the trick with close-up elements in HUD displays is to "defocus" them. That way they are at the same "virtual distance" as whatever else you are looking at, so they appear to be in focus. The same is done with HUDs in car windshields: if you focus on the road, you wouldn't be able to read a distance shown on the windshield at the same time, unless it is projected to appear "on the road ahead".

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man." -- George Bernard Shaw (1856 - 1950)

 

"Remember: If the game lets you do it, it's not cheating." -- Xarax

Link to comment
Share on other sites

Do I understand you right? You want to defocus elements on a 2D screen and assume that they magically appear focused at another focal distance? =) It's not nearly that easy. I saw some publications a while ago about defocus inverse filtering in the context of projecting imagery onto non-planar surfaces, but this only works to a certain extent. You would need "negative light" for it to work correctly, and as we all know, the lowest possible light intensity is zero (black). :) The only way to do it properly is to use a holographic screen.
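A tiny numerical toy to illustrate the "negative light" point (my own example, not from those publications): pre-compensating for a known blur is inverse filtering, and the compensated signal inevitably dips below zero, which no display can emit; clipping it to zero destroys the cancellation.

    import numpy as np

    # Toy 1D version: try to pre-compensate a signal so that a 3-tap box blur
    # (standing in for optical defocus) reproduces a sharp target. The inverse
    # filter needs negative values; clipping them to zero breaks the trick.

    N = 32
    blur = np.array([0.25, 0.5, 0.25])            # the "defocus" the optics apply
    target = np.zeros(N); target[16] = 1.0        # the sharp image we want seen

    H = np.fft.rfft(blur, N)                      # blur in the frequency domain
    precomp = np.fft.irfft(np.fft.rfft(target) / (H + 1e-3), N)  # regularized inverse

    def apply_blur(x):
        return np.fft.irfft(np.fft.rfft(x) * H, N)  # circular convolution with the blur

    print(precomp.min())                                   # strongly negative
    print(np.abs(apply_blur(precomp) - target).max())      # small residual: works on paper
    clipped = np.clip(precomp, 0.0, None)                  # what a real display can show
    print(np.abs(apply_blur(clipped) - target).max())      # huge error: cancellation gone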

 

I just did some research on the topic and found a nice free book. I also found out that the problem I described above is commonly known as the "fixed focal length problem".

Link to comment
Share on other sites

Reminds me of those cameras that save the entire light field, so you can focus on any part after the fact. You can move the focal length within the 'image' easily enough with an algorithm, but I imagine it's a whole other problem to move the focal length to the virtual 'screen' itself, the problem you're talking about here.

 

Or I don't know; what if the screen just projected the whole light field back? It's like all the light passing through a window. You'd want a screen that projects a lightfield exactly like the field passing through the window or glasses (so they'd be physically indistinguishable on the "inside" part). Then let the viewer focus on whatever they want to, close or far... So it seems theoretically possible. I mean they can already save a whole lightfield. The issue is projecting a lightfield.

 

But it is an issue; I recall that your eyes do get tired focusing so close up for long periods.

What do you see when you turn out the light? I can't tell you but I know that it's mine.

Link to comment
Share on other sites

What you're thinking of are holographic displays, and as I said, they are being researched but not quite there yet. Holographic displays are capable of reproducing the light wavefronts emitted by an object and hence allow for REAL 3D visualization. So your focal distance isn't limited to the distance of the display; it can actually vary according to the depth of the presented objects.

 

The light-field cameras (a.k.a. plenoptic cameras) you're talking about don't actually capture the complete light field, unlike holograms. They capture the direction in which each light beam reaches the sensor, but only for a limited number of quantized angles on the vertical and horizontal axes. This enables people to refocus the image and estimate depth. I believe light-field cameras will eventually be used for capturing 3D content, once they have reached an acceptable resolution. You could possibly convert the recorded data to holograms (using a massive amount of computational power), but there would still be quite some quantization error.
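For what it's worth, the refocusing itself is conceptually simple once you have those quantized angular samples. A minimal shift-and-add sketch on synthetic stand-in data (this is the textbook idea, not the actual Lytro pipeline):

    import numpy as np

    # Toy synthetic refocus of a light field L[u, v, y, x]:
    # u, v index the (quantized) viewing angles, y, x the image pixels.
    # Shifting each angular view in proportion to its offset from the center
    # view and averaging brings a chosen depth plane into focus. Illustrative only.

    def refocus(light_field, alpha):
        U, V, H, W = light_field.shape
        out = np.zeros((H, W))
        for u in range(U):
            for v in range(V):
                dy = int(round(alpha * (u - U // 2)))   # shift proportional to angle
                dx = int(round(alpha * (v - V // 2)))
                out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
        return out / (U * V)

    # 5x5 angular samples of a 64x64 scene (random stand-in data):
    lf = np.random.rand(5, 5, 64, 64)
    near = refocus(lf, alpha=1.0)   # focus on a nearer plane
    far = refocus(lf, alpha=0.0)    # focus on the plane the views are aligned to
    print(near.shape, far.shape)

The fewer angular samples you have, the coarser this synthetic aperture gets, which is exactly the quantization limitation mentioned above.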

Link to comment
Share on other sites

From the looks of it, they are using a virtual retinal display for the Google Glass project. This video kinda explains the tech, and as you can see, it doesn't look much different in this recent presentation of Google Glass.

 

Since the picture is projected more or less directly onto the retina, there seems to be no problem with it being too close to the eye. However, technical problems aside, when this augmented reality stuff takes off and merges mobile phones, social networking and GPS tracking, it will surely become an interesting technology for completely new kinds of games, as well as surveillance.

:P

 

But all this stuff is completely different from VR. The fun part of VR is that you don't see the real world. I tried one of these devices (as some of us have, it seems) at a fair in the late 90s (CeBIT, Hannover). It had the usual three ingredients: headphones, motion sensors and two LCDs (640x480), which appeared like a large stereoscopic screen about 2 meters in front of you.

 

It was fascinating, although there were clearly issues to be addressed. You could play some hacked version of Descent as a demo; it was nice to see it in "real" 3D but completely awkward to control a spaceship via a head-tracking device. Also, it was not that immersive, since the screen only covered a limited area of the natural field of view.

 

What excites me about Carmack's take on it is how he addresses all those little things and takes it to what it should have been in the first place. A problem that's less easy to fix is that you can't see your input devices when the display completely covers your view.

 

It would be fine for me: my left hand knows its way around WASD, and I could move any beverages far away from my mouse before playing, but for wide adoption that's a no-go.

 

For head tracking as an input, I think it comes down to how (and if) it is used. Controlling anything other than in-game head movement does not sound like a good idea, and as Rich pointed out, in an FPS it adds another level of difficulty since head movement compromises your mouse aim. But when it comes to flight sims: when I wore those glasses back then, I wanted nothing more than to play an X-Wing game with them.
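A minimal sketch of the decoupling a game could do (my own illustration, not anything from Doom 3 BFG or TDM): the mouse steers the body/aim orientation, the tracker adds a view-only offset on top, and the aim direction ignores the tracker entirely.

    # Illustrative split of "where you look" vs. "where you aim" for head tracking.
    # body_yaw/body_pitch come from the mouse, head_yaw/head_pitch from the tracker.

    def compose_view(body_yaw, body_pitch, head_yaw, head_pitch):
        """Camera orientation = body (mouse) orientation plus the tracked head offset."""
        return body_yaw + head_yaw, body_pitch + head_pitch

    def compose_aim(body_yaw, body_pitch, head_yaw, head_pitch):
        """Aim (weapon/crosshair) ignores the head tracker, so glancing around
        doesn't drag your shot off target."""
        return body_yaw, body_pitch

    view = compose_view(90.0, 0.0, 15.0, -5.0)   # looking slightly left and down
    aim = compose_aim(90.0, 0.0, 15.0, -5.0)     # still aiming straight ahead
    print(view, aim)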

:smile:

Link to comment
Share on other sites

Ah thanks, plasticman. That is very interesting. I have some reading to do now... =) I always thought retinal projection would be a good way to achieve this effect (see my point 2 from above), but I couldn't imagine they'd be able to do it with non-invasive technology, hence my idea of implants. After all, there is still the problem of the moving pupil on the eyeball. I would really like to know how this virtual retinal display behaves when the pupil is not "in the sweet-spot" for that reason.

Link to comment
Share on other sites

I'd think it'd be kind of self-enforcing, since the person is going to adjust the glasses and look where they have to in order to get the effect, or the image dissolves; so their main job would just be to make the sweet spot as large and convenient as they can.

What do you see when you turn out the light? I can't tell you but I know that it's mine.

Link to comment
Share on other sites

I guess you're referring to Google's Project Glass.

 

Oh dear... Kids nowadays wear earphones while they stroll on the streets, totally deaf, and get hit by cars. And the people driving those cars are talking on their cell phones. Everyone is distracted from the important things around them.

 

Add that kind of HUD technology and you have people, not only deaf and distracted, but blind as well when they are reading some HUD message.

 

But then again, YouTube (or its future equivalent) might get filled with interesting dude-gets-hit-by-a-160 km/h-train-while-reading-his-emails videos, all streamed from the on-device camera, so you get the first-person experience in high-definition video and audio.

  • Like 2

Clipper

-The mapper's best friend.

Link to comment
Share on other sites

I'd think it'd be kind of self-enforcing, since the person is going to adjust the glasses and look where they have to in order to get the effect, or the image dissolves; so their main job would just be to make the sweet spot as large and convenient as they can.

Hmm, I guess we'll have to wait and see, but I'd bet it'll feel pretty unnatural, because you don't know exactly where to look for the display to appear. You probably don't have any visual indicator whatsoever, and usually you can still perceive things in peripheral vision.

Link to comment
Share on other sites

It's funny, I've watched so many incarnations of these "immersive" displays and all have failed. I am overjoyed the godfather of 3D is tinkering with this, and holy $&*# it's going to be in the next version of Doom 3, which happens to run my most favorite game, Thief... holy god, thank you. I dreamed of "taffering" in virtual reality, designed a projection room and everything. Man, I'm sooo happy to know this is real and not far away....

Link to comment
Share on other sites

I have to admit if this gets up and running for Dark Mod and it's inexpensive, I might invest some time in making some cool scenes just to have the pleasure of getting those glasses on and feeling like I'm standing in the middle of some epic hall, and walking around. I'm getting goosebumps just thinking about it. :wub:

Yeah I remember Spar talking about his love for this kind of stuff, so this might inspire him to give it a go.

What do you see when you turn out the light? I can't tell you but I know that it's mine.

Link to comment
Share on other sites

I played part of a T2X mission in 3D (using NVidia shutter glasses) and it was extremely cool! There was too much ghosting of bright lights though (the shutters weren't up to the task, or my monitor at the time, or something), so I ended up returning the system. But to use head tracking to look around, that would add another level of immersion that would be incredible!

Link to comment
Share on other sites

  • 1 month later...

Looks awesome! Wanna try that out :o

Heh, and the guy at 1:27 on the right (Brendan Iribe) looks like how I imagine Springheel looks in real life, inspired by his avatar drawing :laugh:

"Einen giftigen Trank aus Kräutern und Wurzeln für die närrischen Städter wollen wir brauen." - Text aus einem verlassenen Heidenlager

Link to comment
Share on other sites

Really neat video. I didn't expect it to look that professional since it seemed more like a hobby/enthusiast project at first.

 

However, this is the fastest-skyrocketing Kickstarter I have ever seen. They got funded in less than a day and still have a month left for people to throw their money at them.

 

:smile:

Link to comment
Share on other sites
