The Dark Mod Forums

nbohr1more

Development Role
  • Posts

    12127
  • Joined

  • Last visited

  • Days Won

    199

Everything posted by nbohr1more

  1. Congrats on the completion of your non-linear level-transition SDK work! (And thanks to solarspace specifically for helping to fix that TDM Arrow bug.) I hope someone comes through and makes those monsters for you; that Rat-man concept art is great!
  2. I would give it a try, Biker. For Dark Radiant your map should compile down to far fewer resources in most cases, but you could end up creating something too large for 32-bit without noticing... For Doom 3 it may give you the ability to compile larger maps. You could end up creating a map that will only dmap on a similarly configured environment, so 32-bit end-users might not be able to dmap it for trouble-shooting purposes. But it's not usually a bad thing to be over-powered. Look at the game studios that use a super-computer to run their games until the consumer hardware is fast enough.
  3. "Rushing hands and Roaming Fingers" If I get the joke correctly...
  4. Yeps... The only limits are the Entity Limit (8192) and Event Limit (8192) AFAIK... (And there may be upcoming stuff that breaks even those limits?) But right now, as you have seen, 4000 entities is a more practical limit due to complexity issues... One entity could have thousands and thousands of polys though...
  5. Thanks for the clarifications Rich. I think I have again used the wrong terminology somewhere, but pixel density is one area that appears to be preserved in the method I described: the height map would hold all the coarse values, leaving the normal map free for the high-resolution changes. I suspect, however, that DDS compression (at the very least) has its own hierarchical resolution method, so my concept is probably redundant. I see that at one time they (id) were considering supporting "world space" normal maps but decided to postpone that method. Perhaps that is what is being used in TDS and has caught Melan's attention? (Not sure whether tangent space or world space is the default normal map format for Unreal engines...)
  6. Have you visited the official id site? Perhaps it recognizes browser nationality settings. http://www.idsoftware.com/
  7. Yeah, Dark Radiant is a kind of spiritual return to playing with game design software. Just like Gary Kitchen, the TDM team has created a wonderful tool for creativity. But also like Gary Kitchen, you might need to pick up a little coding/scripting to make your output really shine http://www.mts.net/~...maker/info.html
  8. Your humor is evident here and there LEGION; it takes a while to decode sometimes ...
  9. I've been thinking about these posts for a while because I had just been reading the following thread about height maps at D3W: http://www.doom3worl...&t=1154&start=0 This has brought me to the following questions:
     1. Can a height map be used to amplify the precision of a normal map? Process:
        1) Create a High Poly model (no ultra-fine details or wrinkles etc.)
        2) Create a Low Poly model
        3) Render a height map
        4) Re-render the high poly model by reversing the height map (I've only seen this done once and it was a cumbersome process AFAIK...) http://www.doom3world.org/phpbb2/viewtopic.php?t=23265
        5) Create an Ultra High Poly model with wrinkles etc.
        6) Bake normals (normal map) from the Ultra-High to the re-rendered High poly
        7) Use the addnormals function to combine the normal map and height map, and thus have all the normal map resolution dedicated to fine details while the height map carries the coarse height variation (a proper resolution hierarchy?) (see the material sketch at the end of this post)
     2. Would that process really increase the available precision, or are normal maps defined on an absolute scale in reference to the surface rather than a relative scale?
     3. Since the process above is the opposite of what appears to be the intended usage, what about baking the ultra high poly model to the height map rather than using the height map for "hand drawn" displacements?
     4. Is all this moot because addnormals degrades(?) normal map and height map quality?
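     For reference, here is a minimal sketch of how step 7 might look in an idTech 4 material definition. The texture paths and the height scale value are hypothetical placeholders, not actual TDM assets:

        textures/custom/example
        {
            // combine the baked fine-detail normal map with a normal map
            // derived from the coarse height map (scale value 6 is a guess)
            bumpmap     addnormals( textures/custom/example_local.tga, heightmap( textures/custom/example_h.tga, 6 ) )
            diffusemap  textures/custom/example_d.tga
        }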
  10. What happens if you remove the ".tga" extensions from the material definition? Have you tried using a converted heightmap stage (for trouble-shooting purposes)?

      textures/heightmap
      {
          {
              blend bumpmap
              map   heightmap(textures/custom/image_1.tga, 10)
          }
      }
  11. You are working on OverDose! (That is some quality stuff! ) I'll take a dumb blind guess... is the Normal Map non-power-of-two? (I would post your material definition code so smart folk can provide better questions and answers...)
  12. Commodore 64!!!!! I recall spending an entire summer with Gary Kitchen's "Game Maker" trying to create a Zelda clone, and then the floppy disk got corrupted. I have never mourned lost pixels more... (This was after becoming somewhat of a C64 veteran before the NES revitalized the console market...). It wasn't 'til well after college that I bought my first PC... a P4 1.8 GHz with a GeForce 3 Ti 200 and 256 MB of DDR266 RAM... Windows 2000.
  13. Nothing would stop you from scouring the forums for "requests and replies" and then creating a wiki article just like the "Upcoming Fan Missions" wiki I created. Maybe some team members would see it and chime in with suggestions and corrections?
  14. Priorities for v1.04
      1) Machine Gun weapons. Lord So-and-So won't know what hit him!
      2) More day-time decor. Picnic blankets, breakfast food, roosters, etc.
      3) Slower performance. How will High-End PC users brag about their FPS if everyone plays smoothly?
      4) "Silent take-down" button. (in preparation for TDM console edition)
      5) 0.0005% brighter grass texture. (Pending 50 more arguments about whether it should only be 0.0004% brighter.)
      6) AI sexual behaviors (now that there are female AI, why not)
      7) Realistic spillable milk glass. (at least 5 months of scripting and physics hacks, major decision...!)
      (hope this lightens things up for those grinding away at the really tough stuff... )
  15. Like a list of commonly requested changes and tweaks that are either "in consideration" or "rejected"?
  16. From my understanding many of these details are internal because the team have some major improvements in the works and want to surprise and delight the community (although some of these "surprise features" have slipped out a bit if you know where to look... )
  17. Now that I think of it, maybe that "painting" workflow is something akin to recording procedural actions from the artists, as was done in the 96 KB .kkrieger demo. (If you can't come up with a better image compression scheme, why not crowd-source the "compression" from the artists, right? Not that anything out there has been shown to be more effective than .theprodukkt's method of squeezing down file size, ...DDS eat your heart out... it just seems like a pretty extreme measure to solve the storage issues for non-tiled unique textures...) I guess the closest to this I could imagine for TDM would be having DR collect the sequence of where you place repetitive grime decals in a scene in relation to some point of origin, then compress all the vectors, then use a particle effect or script to play back the vectors and paint the decals into the scene at run-time. It would reduce the map size and memory footprint in exchange for some amount of processing to decode the placement sequence... Probably not worth the trade-off though... Unless collecting the model "placement sequence" could be beneficial for a similar reason... I think we'll see artists being hands-on for asset creation for a while still. I just don't think those Mandelbrot/fractal engineers can design equations to cover all the possible aesthetic requirements for the majority of scenes right now. Things are advancing quickly though...
  18. The only reliable thing I've seen with regard to 3rd party calibration is "Adobe Gamma". It seems that ViewSonic is the only CRT maker that bothers to put detailed phosphor persistence specs on their page. But after reviewing the topic, lowering the brightness is the only thing I could think of. (It would be cool if you could increase the blanking period to double the ratio of the draw period and then crank the refresh way up... but I don't think the DDC spec allows for ratios.)
  19. I will do a little searching on my break but... (off the top of my head)... The phosphor chemistry will affect this behavior, so you may need to lower your overall brightness (if you are past the point of being able to return this monitor and do comparison shopping against others...)
  20. The problem with many of the modern Bloom and HDR effects is that they are intended to be subtle enhancements that remove artifacts and make the scenes more accurate. The only way to have a dramatic effect with Bloom and HDR is to either 1) have a lighting artist carefully paint dramatic lighting differences into the same scene (still pretty subtle) or 2) go gonzo and exaggerate the effect until Bloom artifacts are visible. When marketing folks have trouble showing the benefits of a new graphics technique via extreme screenshots, they choose the most crass hit-you-over-the-head options. So it has become trendy to abuse Bloom (it's the new Lens Flare). The same thing applies to some of the recent SSAO demos from ATI where the effect was exaggerated so badly that it looked like the objects had a hard outline. One of the worst outcomes is when an actual game has some of these bad decisions built into its look, like the new Wolfenstein title where every object is coated in specular highlighting.
  21. Yep. That's his real name AFAIK. At least that's how he's billed in all his press releases. (It'd be funny if he chose that moniker then had to keep it professionally because ATI felt that his reputation under that pseudonym was valuable from a client relations POV... JC Denton better watch out for the same scenario )
  22. Catalyst AI can be beneficial for other games (primarily DirectX games). I also recall that at one point dual video-card (CrossFire) configurations required Catalyst AI to use both cards. Unfortunately, ATI seems to be utterly ignoring OpenGL as there aren't many "hot new games" for the API. Dave Baumann, the ATI product manager, recently asked the question "What needs to be fixed?" regarding ATI's OpenGL implementation. I asked that he pose that question at Doom3world... http://forum.beyond3d.com/showpost.php?p=1478830&postcount=34