Everything posted by nbohr1more
-
Did you also try this patching application, as was recommended previously...? http://www.ntcore.com/4gb_patch.php
-
Yeah, it's a comparison of T3 to TDM. The Unreal build used for T3 is very customized, so it wouldn't surprise me too much, except that it would be tough to pull off Parallax Mapping on an Xbox... but that could just be a PC-exclusive or patched-in feature...
-
Well, that applies to UE3, but thanks for clarifying, because it was in dispute among the sources I googled... TDS is UE2.5, so was that back-ported, or is it again just perception (placebo)?
-
Did you load up Sikkpin's test map or were you trying this in the Doom 3 game? (I think you need to add height maps to the alpha channel of the normal maps according to some comments...?)
-
Don't mind me, Rich. I just have this superstition that the Unreal engines are doing "something" to improve normal map rendering over what Doom 3 has done. Maybe it's just the artists, maybe it's just the higher-poly assets, but every time someone raves about normal maps in Unreal or TDS my superstition comes back. To make matters worse, UE3 has something called "Virtual Displacement Mapping", which most in the industry claim is just another name for normal mapping, but I KNOW that can't be right. After pondering further, the only other thing I can think of is that the normal map renderer in Unreal does some further interpretation or correction of the underlying vertex normals. Maybe they create new vertex normals at run-time and thus make the tangent-space normal maps look more like the world-space ones. Yeah, POM or tessellation are the ultimate surface detail techniques, so this is probably a lot of useless wheel-spinning. It's just the implication of a "hidden factor" that bothers me (and there probably isn't one).
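To make the "new vertex normals at run-time" speculation concrete: a tangent-space normal map only records deviations from whatever per-vertex TBN basis the renderer builds, so an engine that regenerated smoother vertex normals at load or run time would change the shaded result without touching the map at all. A minimal sketch of that dependency (all function names are hypothetical, not engine code):

```python
import math

def normalize(v):
    # Scale a 3-vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def tangent_to_world(n_ts, tangent, bitangent, normal):
    """Rotate a tangent-space normal into world space using the
    per-vertex TBN basis. If an engine recomputed smoother vertex
    normals, this basis (and thus the lighting) would change even
    though the normal map texels did not."""
    t, b, n = normalize(tangent), normalize(bitangent), normalize(normal)
    x, y, z = n_ts
    world = tuple(x * t[i] + y * b[i] + z * n[i] for i in range(3))
    return normalize(world)

# A "flat" tangent-space normal (0,0,1) simply returns whatever the
# vertex normal is -- so changing the vertex normal changes the result:
flat = tangent_to_world((0.0, 0.0, 1.0),
                        (1.0, 0.0, 0.0),   # tangent
                        (0.0, 1.0, 0.0),   # bitangent
                        (0.0, 0.0, 1.0))   # vertex normal
```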
-
Congrats on the completion of your non-linear level-transition SDK work! (And thanks to solarspace specifically for helping to fix that TDM Arrow bug). I hope someone comes through and makes those monsters for you, that Rat-man concept art is great!
-
Whoosh... (the perils of online humor)
-
I would give it a try, Biker. For DarkRadiant, your map should compile down to far fewer resources in most cases, but you could end up creating something too large for 32-bit without noticing... For Doom 3, it may give you the ability to compile larger maps, though you could end up with a map that will only dmap on a similarly configured machine, so 32-bit end-users might not be able to dmap it for troubleshooting purposes. But it's not usually a bad thing to be over-powered. Look at the game studios that use a super-computer to pilot their games until the consumer hardware is fast enough.
-
So, what are you working on right now?
nbohr1more replied to Springheel's topic in TDM Editors Guild
"Rushing hands and Roaming Fingers" If I get the joke correctly... -
Yeps... The only limits are the Entity Limit (8192) and the Event Limit (8192), AFAIK... (And there may be upcoming stuff that breaks even those limits?) But right now, as you have seen, 4000 entities is a more practical ceiling due to complexity issues... A single entity can have thousands and thousands of polys, though...
-
Thanks for the clarifications, Rich. I think I have again used the wrong terminology somewhere, but pixel density is one area that appears to be preserved in the method I described, since the height map would hold all the coarse values, leaving the normal map all the high-resolution changes. I suspect, however, that DDS compression (at the very least) has its own hierarchical resolution scheme, so my concept is probably redundant. I see that at one time they (id) were considering "world space" normal maps but decided to postpone support for that method. Perhaps that is what is used in TDS and has caught Melan's attention... (I'm not sure whether tangent space or world space is the default normal map format for the Unreal engines...)
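For anyone following the height-map-vs-normal-map part of this: Doom 3's heightmap() material keyword converts a grayscale height image into a normal map at load time. The exact filter the engine uses is internal, but the standard approach is a central-difference slope estimate like the sketch below (function name and clamping behavior are my assumptions):

```python
def height_to_normal(height, scale=1.0):
    """Derive tangent-space normals from a 2-D height field using
    central differences, roughly what a heightmap-to-bumpmap
    conversion does. 'height' is a list of rows of floats in [0,1];
    samples clamp at the borders; 'scale' plays the role of the
    bumpiness parameter."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # Slope in x and y, clamped at the edges of the image.
            dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
            dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
            n = (-dx * scale, -dy * scale, 1.0)
            length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
            row.append(tuple(c / length for c in n))
        normals.append(row)
    return normals

# A constant height field yields flat (0,0,1) normals everywhere:
flat = height_to_normal([[0.5, 0.5], [0.5, 0.5]])
```

Note how only height *differences* survive the conversion, which is why the coarse/fine split discussed above matters: absolute height is thrown away either way.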
-
Have you visited the official id site, perhaps it recognizes browser nationality settings? http://www.idsoftware.com/
-
Yeah, DarkRadiant is a kind of spiritual return to playing with game design software. Just like Gary Kitchen, the TDM team has created a wonderful tool for creativity. But also like Gary Kitchen, you might need to pick up a little coding/scripting to make your output really shine http://www.mts.net/~...maker/info.html
-
Your humor is evident here and there, LEGION; it just takes a while to decode sometimes...
-
I've been thinking about these posts for a while because I had just been reading the following thread about height maps at D3W: http://www.doom3worl...&t=1154&start=0 It brought me to the following questions:

1. Can a height map be used to amplify the precision of a normal map? The process:
1) Create a high-poly model (no ultra-fine details, wrinkles, etc.)
2) Create a low-poly model
3) Render a height map
4) Re-render the high-poly model by reversing the height map (I've only seen this done once, and it was a cumbersome process AFAIK...) http://www.doom3world.org/phpbb2/viewtopic.php?t=23265
5) Create an ultra-high-poly model with wrinkles etc.
6) Bake a normal map from the ultra-high-poly model to the re-rendered high-poly model
7) Use the addnormals function to combine the normal map and height map, so that all the normal map resolution is dedicated to fine details while the height map carries the coarse height variation (a proper resolution hierarchy?)

2. Would that process really increase the available precision, or are normal maps defined on an absolute scale in reference to the surface rather than a relative scale?

3. Since the process above is the opposite of what appears to be the intended usage, what about baking the ultra-high-poly model to the height map rather than using the height map for "hand-drawn" displacements?

4. Is all this moot because addnormals degrades(?) normal map and height map quality?
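On questions 2 and 4: addnormals() is usually described as summing the two bump vectors and renormalizing. If that description is right, the combination cannot add precision, because the result gets squeezed back onto a unit sphere regardless of how much information the inputs held. A sketch under that assumption (the exact internal formula is not confirmed here):

```python
def add_normals(n1, n2):
    """Combine two tangent-space normals the way addnormals() is
    commonly described: component-wise sum, then renormalize.
    Because the output is forced back to unit length, the two
    inputs share one unit vector's worth of range rather than
    stacking extra precision."""
    s = tuple(a + b for a, b in zip(n1, n2))
    length = (s[0] ** 2 + s[1] ** 2 + s[2] ** 2) ** 0.5
    return tuple(c / length for c in s)

# Combining a flat normal with a tilted one just yields the tilt,
# renormalized -- no detail is gained, only blended:
tilted = add_normals((0.0, 0.0, 1.0), (0.6, 0.0, 0.8))
```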
-
What happens if you remove the ".tga" extensions from the material definition? Have you tried using a converted heightmap stage (for troubleshooting purposes)?

textures/heightmap
{
	{
		blend bumpmap
		map heightmap(textures/custom/image_1.tga, 10)
	}
}
-
You are working on OverDose! (That is some quality stuff!) I'll take a dumb blind guess... is the normal map non-power-of-two? (I would post your material definition code so smart folk can provide better questions and answers...)
-
Commodore 64!!!!! I recall spending an entire summer with Gary Kitchen's "Game Maker" trying to create a Zelda clone, and then the floppy disk went corrupt. I have never mourned lost pixels more... (This was after becoming somewhat of a C64 veteran before the NES revitalized the console market...). It wasn't 'til well after college that I bought my first PC... a P4 1.8GHz with a GeForce 3 Ti-200 and 256MB of DDR266 RAM... Windows 2000.
-
@LEGION: Bonetown?
-
Nothing would stop you from scouring the forums for "requests and replies" then creating a wiki article just like the "Upcoming Fan Missions" wiki I created. Maybe some team members would see it and chime-in with suggestions and corrections?
-
Priorities for v1.04:
1) Machine-gun weapons. Lord So-and-So won't know what hit him!
2) More daytime decor. Picnic blankets, breakfast food, roosters, etc.
3) Slower performance. How will high-end PC users brag about their FPS if everyone plays smoothly?
4) "Silent take-down" button. (In preparation for the TDM console edition.)
5) 0.0005% brighter grass texture. (Pending 50 more arguments about whether it should only be 0.0004% brighter.)
6) AI sexual behaviors. (Now that there are female AI, why not?)
7) Realistic spillable milk glass. (At least 5 months of scripting and physics hacks, a major decision...!)
(Hope this lightens things up for those grinding away at the really tough stuff...)
-
Like a list of commonly requested changes and tweaks that are either "in consideration" or "rejected"?
-
From my understanding, many of these details are kept internal because the team has some major improvements in the works and wants to surprise and delight the community (although some of these "surprise features" have slipped out a bit if you know where to look...)
-
Now that I think of it, maybe that "painting" workflow is something akin to recording procedural actions from the artists, as was done in the 96kb .kkrieger demo. (If you can't come up with a better image compression scheme, why not crowd-source the "compression" from the artists, right? Not that anything out there has been shown to squeeze down file size more effectively than theprodukkt's method... DDS eat your heart out... it just seems like a pretty extreme measure to solve the storage issues for non-tiled unique textures...)

I guess the closest thing to this I could imagine for TDM would be having DR record the sequence of where you place repetitive grime decals in a scene relative to some point of origin, compress all the vectors, then use a particle effect or script to play back the vectors and paint the decals into the scene at run-time. It would reduce the map size and memory footprint in exchange for some amount of processing to decode the placement sequence... Probably not worth the trade-off, though... unless collecting a model "placement sequence" could be beneficial for a similar reason...

I think we'll see artists staying hands-on for asset creation for a while yet. I just don't think the Mandelbrot\fractal engineers can design equations to cover all the possible aesthetic requirements for the majority of scenes right now. Things are advancing quickly, though...
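The record-and-replay idea above can be sketched in a few lines: store each decal placement as a small tuple relative to an origin, compress the whole sequence, and decompress it at load time for whatever would paint the decals. Everything here is hypothetical illustration, not a DR or TDM API:

```python
import json
import zlib

def record_placements(placements):
    """Serialize a list of (x, y, z, decal_id) placement tuples
    and deflate them -- the 'crowd-sourced compression' idea:
    ship the placement recipe instead of a large unique texture."""
    payload = json.dumps(placements).encode("utf-8")
    return zlib.compress(payload)

def replay_placements(blob):
    # Decompress and hand each placement back to whatever would
    # paint the decal into the scene at run-time.
    return [tuple(p) for p in json.loads(zlib.decompress(blob))]

recorded = record_placements([(1.0, 2.0, 0.0, 7), (1.5, 2.0, 0.0, 7)])
restored = replay_placements(recorded)
```

Repetitive placements compress well with a general-purpose deflater, which is the whole appeal: the cost moves from storage to a one-time decode at load.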
-
The only reliable thing I've seen with regard to 3rd-party calibration is "Adobe Gamma". It seems ViewSonic is the only CRT maker that bothers to put detailed phosphor-persistence specs on their page. But after reviewing the topic, lowering the brightness is the only thing I could think of. (It would be cool if you could increase the blanking period to double the ratio of the draw period and then crank the refresh way up... but I don't think the DDC spec allows for such ratios.)