The Dark Mod Forums


Showing results for '/tags/forums/work thread/'.


  1. Of course, identifying structs via HN (Hungarian notation) is useless. At work, we use HN when there are reasons for caution:
     • u for unsigned, to prevent unintentional implicit unsigned promotion (and overflow). Yes, I know, the compiler will issue a warning, but in some legacy projects there are tons of warnings, so devs are going to miss that one new one. We are slowly working toward warning-free builds, but as long as we are not there yet, the u-prefix helps a lot.
     • p for pointers, to make the reader/dev aware that the variable could be NULL.
     • m_ for members, obviously useful. Similarly, g_ and s_ for globals and statics respectively, although using those is obviously discouraged.
     • Various prefixes for the different coordinate systems we use.
     • I for interfaces, just for convenience.
     • b for booleans; this is basically just for convenience, so you can directly see that this is the most basic type. This one, one could definitely argue against, but I personally like it.
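     A minimal C++ sketch of how these prefixes might look in practice (the class and member names are invented for illustration, not taken from any real codebase):

         // Illustrative only: hypothetical class using the prefixes described above.
         class ILightController {                  // I  - interface
         public:
             virtual ~ILightController() = default;
             virtual void SetIntensity(float intensity) = 0;
         };

         class LampPost : public ILightController {
         public:
             void SetIntensity(float intensity) override { m_intensity = intensity; }

         private:
             float        m_intensity     = 0.0f;    // m_ - member
             bool         m_bLit          = false;   // b  - boolean
             unsigned int m_uFlickerCount = 0;       // u  - unsigned, watch implicit promotion
             LampPost*    m_pNeighbour    = nullptr; // p  - pointer, could be NULL
         };

         static int s_instanceCount = 0;             // s_ - static (discouraged, but marked)
         int        g_debugLevel    = 0;             // g_ - global (discouraged, but marked)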
  2. In the light of a post by Obsttorte, I am opening a dedicated topic for discussion. Some thoughts of mine, trying to keep existing mechanics in mind:
     • The feature should work with regular, standard doors only: avoid sliding / custom / fancy-opening doors...
     • Locked doors will bring you to a halt, obviously.
     • The door-opening animation must be way faster, obviously.
     • Because doors can open towards you, we will probably have to make doors frobable before their time (when running).
     • Forget, at least for now, about "slamming doors closed" when running. Not only is it difficult to achieve with mouse and keyboard, but it would make AI following you look clumsy (unless they learn to slam doors as well).
     • Let's not think about "slam door sounds" for now. The current sounds perhaps work just fine.
     And now the fun stuff: To keep things simple, slamming a door open would make AI react just as if a heavy object was thrown onto a non-carpeted floor at that point. AI that is right behind that door gets pushed away (if the door opens towards them, that is). What's important here is that nearby AI get out of your way. We can further elaborate the idea:
     • On duty: AI get pushed and fall to the floor but stand up right after, OR AI get pushed and go into "flash-bomb" mode.
     • Civilians: Random possibility for civilians to remain permanently knocked out on the floor?
     • Undead: I don't know. Zombies get torn to pieces?
     Discuss!
  3. I get that and, full disclosure, I did not read the full thread, it's just too much! I was absent in recent months, so I missed all of this. That makes a lot of sense. Although, to solve this, one could communicate via audio cue ("uh uh") that no use-type interaction is available for that entity. Anyway, there is an incredibly easy way to solve the inconsistency while staying true to the original control scheme, and that is to simply swap shouldering and grabber.
     Entity type | Short Press | Long Press | ...Release Button
     Junk        | Grabber     | Nothing    | Nothing
     Food        | Grabber     | Eat        | Nothing
     Loot        | Pick-up     | Nothing    | Nothing
     Bodies      | Grabber     | Shoulder   | Nothing
     Lights      | Grabber     | Extinguish | Nothing
     Tools       | Inventory   | Nothing    | Nothing
     This way, every interaction that was originally "frob + use" (shouldering, eat, extinguish) will then become long-press-frob. So, the long-press simply becomes a shorthand for a special action, nice and clean. While I do love the hold-type-grabber to death, I'd be willing to sacrifice it for a consistent control scheme. (@stgatilov @Daft Mugi) Right now, we have this weird mixture of hold-type- and toggle-type-grabber that, I guarantee you, will confuse new players (and streamers). I say, either fully embrace the lovely new hold-type-grabber (which I am still all for) or drop it completely.
  4. There was an idea to add two features to GUI scripts (6164).
     The first one is the runScript command, which allows a GUI script to call a function from the game script. Interestingly, this feature is already supported in the GUI engine, but the game code only processes this command when the player clicks the left mouse button on the GUI (i.e. usually it works in an onAction handler, but not in namedEvent or onTime handlers). Obviously, id initially did not envision runScript as a global feature which works the same way everywhere; their idea was that it is context-sensitive, and whoever calls the GUI code can then pull the commands generated by the call and do whatever they want with them. I'm not sure I really want to change this architecture... Anyway: what are the possible use cases for the runScript command?
     The second feature is the namedEvent command, which simply generates/calls a named event with the specified name on the whole UI, which can then be handled by matching onNamedEvent handlers. However, this command can be implemented in several ways:
     1. Whenever the namedEvent command is executed, the named event is processed immediately. The rest of the script (after the namedEvent command) continues only after the generated named event is fully processed.
     2. Whenever the namedEvent command is executed, the named event is put into some kind of queue, then the current script continues to execute. The generated named event is executed at some moment later, but surely on the current frame.
     Point 2 can be further differentiated by the exact order in which generated named events are processed. So the first approach is how functions normally behave in ordinary imperative languages, with a real call stack. The second approach is delayed execution, like what we currently do with the "resetTime X; -> X::onTime 0 {...}" combo (at least everywhere in the main menu GUI). My worry with the first approach is that it is a major change to the GUI engine with no past experience, and it will probably not match well with the long-established GUI weirdness (I mean e.g. the weirdness that all expressions in a GUI script are executed before the script commands start executing). And it would work differently from the "resetTime + onTime 0" combo. On the other hand, callGui in game scripts does execute the named event immediately. And I must admit nested GUI calls could be used to reduce the issues from the GUI weirdness mentioned above. Also, this command exists in Quake 4, but I'm not sure how exactly it works there. And it's probably a good idea to make TDM work the same way.
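     A rough C++ sketch (not TDM code; the event and function names are made up) contrasting the two execution models described above:

         // Minimal sketch of "immediate" vs. "deferred" named event dispatch.
         #include <functional>
         #include <iostream>
         #include <map>
         #include <queue>
         #include <string>

         std::map<std::string, std::function<void()>> handlers;  // stand-ins for onNamedEvent handlers
         std::queue<std::string> pending;                         // queue used by the deferred model

         // Approach 1: immediate - the handler runs before the caller's next statement (real call stack).
         void fireImmediate(const std::string& name) {
             if (auto it = handlers.find(name); it != handlers.end()) it->second();
         }

         // Approach 2: deferred - the event is queued and only runs when the frame flushes the queue.
         void fireDeferred(const std::string& name) { pending.push(name); }

         void flushPendingEvents() {
             while (!pending.empty()) {
                 if (auto it = handlers.find(pending.front()); it != handlers.end()) it->second();
                 pending.pop();
             }
         }

         int main() {
             handlers["updateObjectives"] = [] { std::cout << "objectives updated\n"; };

             fireImmediate("updateObjectives");   // prints immediately
             fireDeferred("updateObjectives");    // nothing yet...
             std::cout << "calling script continues first\n";
             flushPendingEvents();                // ...prints now, later in the same frame
         }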
  5. Personally, for this kind of game, double-clicking and double-tapping are the kind of things that instantly leave a sour taste in my mouth as soon as I hear about them, because I already know from experience that they're not reliable enough and are more prone to human error. You said it yourself: it may also happen accidentally if your fingers twitch, or if your mouse misregisters an extra click, or when you think you didn't hit the target and immediately click again, but it so happens that you did hit the target the first time; or when you're rushing to perform the same task multiple times in quick succession (not very relevant here, but I couldn't count how many times I died in Minecraft tapping keys to adjust my position and accidentally double-tapping forward and sprinting off a cliff). Also, given that we're talking about the right mouse button, I'm feeling even less inclined. I can double-right-click, but it doesn't feel very comfortable. It's really not something I'm used to, so it requires a degree of effort (and I suppose it will get my hand tired, as @Wellingtoncrab mentioned).
     Yup. I like ghosting. To be fair, I play with my own lax rules, but I still feel bothered if I have to tamper with the natural order of things, if I think it's unnecessary or that I shouldn't be forced to. It might seem petty, but it's part of what makes the challenge interesting. You never know, the candle owner might have noticed before bed that the candle was about an inch away from a speck of dust, and in the morning he might realize that that distance has changed. So, from a ghosting perspective, when you tamper with things without needing to, you've introduced a point of failure. If the game requires it, that's another story -- it turns into "I can't work around it, so I have to bend the rules because of it". But as far as I'm concerned it still bothers me and kind of ruins the challenge. For example, TDM has been leaving a really sour taste in my mouth when it comes to dropping keys, because lots of missions don't allow it (because unfortunately they're not droppable by default in DR, and for no good reason, afaik).
     After Snatcher's post about the delay interfering with manipulating bodies, I was growing tempted to agree with reversing it, but then I realized both ways have issues of the same kind. If you can click to shoulder, then you can't click to drag a body -- you always have to hold the button, which is not great if you're dragging a body a long distance. If you can click to drag a body, then you can't hold to drag a body. You always have to click to drag and then click to release, which is also not great when you're dragging limbs for a pose or something. (I suppose this might be even more annoying than the other.) Maybe I should post this in that thread.
  6. So for a long time now I've been working on a map, and lately I got really enthused lol. So it's almost done now. Here's a sneak peek. I turned the gamma way up so it can be seen in the video. https://www.youtube.com/watch?v=zjemP2LfzXw&feature=youtu.be
  7. Hi MirceaKitsune, this whole thread is mainly about the TDM Modpack (I acknowledge the title can be misleading). This probably isn't the place for the Keypad, but you and your mod are welcome here. The TDM Modpack is limited to mods that can be used right away and with ease by players in existing missions. Your keypad must be thoughtfully and carefully integrated into a map, therefore such a mod isn't suited for the Modpack. @wesp5's Unofficial Patch includes, among other features, tools for mappers such as a flint, an invisibility potion... and even prefabs, although I am not sure if these are new or updated versions. I can support your work and I am happy to try and collaborate where I can, but having a mod formally integrated into TDM is out of my reach. Either you have the contacts, or you get lucky and your mod catches the interest of someone with the knowledge and/or the power. I would definitely like to see your Keypad featured in future missions, and while I can hardly put two brushes together in DarkRadiant I will have a look at it. My suggestion to increase your chances is to provide a small test map along with the mod to showcase your work. Why do you talented people aim so high? Build something small that is complete on its own. Release. Expand your something and make the current whole complete on its own. Release. Expand...
  8. You know what I find most significant? That he's still working on missions after all this. That said, I can't and won't comment on the decisions made. I'm sure there were good reasons for them, and I really don't know the people involved in this mod as people who would recklessly exclude someone. I really like bikerdude's work, and his missions are welcome additions to the mod. I think I've played them all by now. Looks like he's in touch with some people here anyway.
  9. The spambots are getting better. Even a more or less realistic thread title.
  10. I think improvements to the training mission should go to a separate thread. > Some games have tutorials inside the in-game main menu. Just some text with images would work fine. I agree that the way this currently works in the patched version, with a long press and then keeping it pressed, is not satisfactory. Can it not be set up so that the grabbed body stays grabbed when you release the mouse key? So like a toggle. It's still not great, because it means body control is hidden behind a long press.
  11. Hmm, well, turns out there is an FSR 3 mod out now that actually helps quite a bit with the 1080 Ti. Still not getting stellar performance, but I can play it at medium settings now with around 45 fps on the old card. The 2080 Ti can manage it at medium with raytracing at low and still get 60 fps, or at high with raytracing off to keep it at around 60 fps; it drops to around 45 fps with raytracing at low, but it is still playable due to it being very stable performance-wise. The fps only fluctuates by 2 to 3 frames. Have to say the game looks really good, though the story is a bit weird. There are actually elements from Control in it; you even get to work with the FBC.
  12. Do subtitles for voice audio in non-video briefings actually work? I have an instance where I can't get them to show up. Or am I missing something?
  13. This is another point which has been gone over again and again in the thread and doesn't need more elaboration. Two additional users just pointed out that frobbing is a context-sensitive action that is already not "consistent". There are lots of moveable objects, for example, that already don't work like you are describing. Inventory items like keys are also moveables. The primary action is to acquire them. If you wish to interact with them as you would another moveable, you must drop them first, at which point they appear in front of the player as other physics objects do. No one complains about this not being consistent, as it would be tedious and potentially confusing to need two key presses to acquire most moveable inventory items, many of which are required to progress in the game, just like it is tedious and potentially confusing to do this with bodies. It is TDM which is not consistent with games like Thief TDP/2/3, System Shock 2, Deus Ex, Dishonored, etc. Maybe this is in part why it is confusing, as there is a good chance people coming to TDM will have some familiarity with those games. Thanks for your understanding.
  14. I guess I want to be sure I understand the context of entity "death" - both purchasing the glasses and picking them up add an atdm:xray_glasses item to the player's inventory. You mentioned, basically, that the inventory slot which should be pointing at atdm:xray_glasses is pointing at nothing. So it probably got "killed":
     • in the process of being added to inventory (in this case it was a purchase),
     • upon the first attempt at using it (it sounds like it immediately did not work),
     • or in the very small window in between.
     Unless the entity in an inventory slot can be "dead" prior to it ever being added?
  15. That bug has been around for over a decade though. In some missions, changing FOV in console would only work when frobbing an item. I always assumed it was a bug in the game, but it's possible it's a bug in certain missions due to scripting. But I've definitely encountered it a LONNNNNNNG time ago.
  16. Hello, everyone! In this multi-part, comprehensive tutorial I will introduce you to a new light type that has been available in The Dark Mod since version 2.06: what it does, why you would want to use it and how to implement it in your Fan Missions. This tutorial is aimed at the intermediate mapper. Explanations of how to use DarkRadiant, write material files, etc. are outside of its scope. I will, however, aim to be thorough and explain the relevant concepts comprehensively.
     Let us begin by delineating the sections of the tutorial:
     • Part 1 will walk you through four distinct ways to add ambient light to a scene, the last way using irradiance environment maps (or IEMs). Lighting a scene with an IEM is considered image-based lighting. Explaining this concept is not in the scope of this tutorial; rather, we will compare and contrast our currently available methods with this new one. If you already understand the benefits IBL confers, you may consider this introductory section superfluous.
     • Part 2 will review the current state of cubemap lights in TDM, brief you on capturing an environment cubemap inside TDM and note limitations you may run into. Three cubemap filtering applications will be introduced and reviewed.
     • Part 3 will go into further detail on the types of inputs and outputs required by each program and give a walkthrough of the simplest way to get an irradiance map working in-game.
     • Part 4 will guide you through two additional, different workflows for converting your cubemap to an irradiance map and unstitching it back to the six separate image files that the engine needs.
     • Part 5 will conclude the tutorial with some considerations as to the scalability of the methods hitherto explained and will enumerate some good practices in creating IEMs. Typical scenes will be considered. Essential links and resources will be posted here, and a succinct list of the steps and tools needed for each workflow will be summarized, for quick reference.
     Without further ado, let us begin.

     Part 1

     Imagine the scene. You've just made a great environment for your map, you've got your geometry exactly how you want it… but there's a problem. Nobody can appreciate your efforts if they can't see anything!
     [Fig. 1] This will be the test scene for the rest of our tutorial — I would tell you to "get acquainted with it" but it's rather hard to, at the moment.
     The Dark Mod is a game where the interplay between light and shadow is of great importance. Placing lights is designing gameplay. In this example scene, a corridor with two windows, I have decided to place 3 lights for the player to stealth his way around. Two lights from the windows streak down across the floor and a third, placeholder light for a fixture later to be added, is shining behind me, at one end of the corridor. Strictly speaking, this is sufficient for gameplay in my case. It is plainly obvious, however, that the scene looks bad, incomplete. "Gameplay" lights aside, the rest of the environment is pitch black. This is undesirable for two reasons.
     It looks wrong. In real life, lights bounce off surfaces and diffuse in all directions. This diffused, omni-directional lighting is called ambient lighting, and its emission can be termed irradiance. You may contrast this with directional lighting radiating from a point, which is called point lighting and its emission — radiance. One can argue that ambient lighting sells the realism of a scene.
     Be that as it may, suppose we disregard scary, real-life optics and set concerns of "realism" aside… It's bad gameplay. Being in darkness is a positive for the player avatar, but looking at darkness is a negative for the player themselves. They need to differentiate obstacles and objects in the environment to move their avatar. Our current light level makes the scene illegible. The eye strain involved in reading the environment in these light conditions may well give your player a headache, figurative and literal, and greatly distract them from enjoying your level.
     This tutorial assumes you use DarkRadiant or are at least aware of idtech4's light types. From my earlier explanation, you can see the parallels between the real-life point/ambient light dichotomy and the aptly named "point" and "ambient" light types that you can use in the editor. For further review, you can consult our wiki. Seeing as there is a danger of confusing the terms here, I will hereafter refer to real-life ambient light as "irradiant light", to differentiate it from the TDM ambient lights, which are our engine's practical implementation of the optical phenomenon. A similar distinction between "radiant light" and point lights will be made for the same reason.
     Back to our problem. Knowing, now, that almost all your scenes should have irradiant light in addition to radiant light, let's try (and fail, instructionally) to fix up our gloomy corridor.
     [Fig. 2] The easiest and ugliest solution: ambient lights.
     Atdm:ambient_world is a game entity that is basically an ambient light with no falloff, modifiable by the location system. One of the first things we all do when starting a new map is putting an ambient_world in it. In the above image, the darkness problem is solved by raising the ambient light level using ambient_world (or via an info_location entity). Practically every Dark Mod mission solves its darkness problem [1] like this. Entirely relying on the global ambient light, however, is far from ideal, and I argue that it solves neither of our two aforementioned problems. Ambient_world provides irradiant light and you may further modulate its color and brightness per location. However, said color and brightness are constant across the entire scene. This is neither realistic, nor does it reduce eye strain. It only makes the scene marginally more legible. Let's abandon this uniform lighting approach and try a different solution that's more scene-specific.
     [Fig. 3] Non-uniform, but has unintended consequences.
     Our global ambient now down to a negligible level, the next logical approach would be hand-placed ambient lights with falloff, like ambient_biground. Two are placed here, supplementing our window point lights. Combining ambient and point lights may not be standard TDM practice, but multiple idtech4 tutorials extol the virtues of this method. I, myself, have used it in King of Diamonds. For instance, in the Parkins residence, the red room with the fireplace has ambient lights coupled to both the electric light and the fire flame. They color the shadows and enrich the scene, and they get toggled alongside their parent (point) lights whenever they change state (extinguished/relit). This is markedly better than before, but to be honest anything is, and you may notice some unintended side-effects. The AI I've placed in the middle of the ambient light's volume gets omnidirectionally illuminated far more than any of the walls, by virtue of how light projection in the engine works.
     Moving the ambient lights' centers closer to the windows would alleviate this, but would introduce another issue — the wall would get lit on the other side as well. Ambient lights don't cast shadows, meaning they go through walls. You could solve this by creating custom ambient light projection textures, but at this point we are three ad hocs in and this is getting needlessly complicated. I concede that this method has limited use cases, but illuminating big spaces that AI can move through, like our corridor, isn't one of them. Let's move on.
     [Fig. 4] More directional, but looks off.
     I have personally been using this method in my WIP maps a lot. For development (vs. release), I even recommend it. A point light instead of an ambient light is used here. The texture is either "biground1" or "defaultpointlight" (the latter here). The light does not cast shadows, and its light origin is set at one side of the corridor, illuminating it at an angle. This solves the problem of omnidirectional illumination for props or AI in the middle of the light volume; you can now see that the AI is lit from the back rather than from all sides. In addition, the point light provides that which the ambient one cannot, namely specular and normal interaction, two very important features that help our players read the environment better. This is about as good as you can get, but there are still some niggling problems. The scene still looks too monochromatic and dark. From experience, I can tell you that this method looks good in certain scenes, but this is clearly not one of them. Sure, we can use two non-shadowcasting point lights instead of one, aligned to our windows like in the previous example; we can even artfully combine local and global ambient lights to furnish the scene further, but by this point we will have multiple light entities placed, which is unwieldy to work with and possibly detrimental to performance. Another problem is that a point light's movable light origin helps combat ambient omnidirectionality, but its projection texture still illuminates things the strongest in the middle of its volume. I have made multiple experiments with editing the Z-projection falloff texture of these lights and the results have all left me unsatisfied. It just does not look right. A final, more intellectual criticism of this method is that it does not, in a technical sense, supply irradiant light. Nothing here is diffuse; this is just radiant light pretending the best it can.
     [Fig. 5] The irradiance map method provides the best-looking solution for imbuing your scene with an ambient glow.
     This is the corridor lit with irradiance map lights, a new lighting method introduced in The Dark Mod 2.06. Note the subtle gradients on the left wall and the bounced, orange light on the right column. Note the agreeable light on the AI. Comparing the previous methods and this one, it is plainly obvious that an irradiance environment map looks the most realistic and defines the environment far better than any of the other solutions. Why exactly does this image look better than the others? You can inform yourself on image-based lighting and the nature of diffuse irradiance, but images speak louder than words. As you can see, the fact of the matter is that the effect, subtle as it may be, substantially improves the realism of the scene, at least compared to the methods previously available to us. Procuring irradiance environment maps for use in lighting your level will hereafter be the chief subject of this tutorial.
     The next part will review environment cubemap capture in TDM, the makeIrradiance keyword and three external applications that you can use to convert a TDM cubemap into an irradiance map.
     [1] "Note that the color buffer is cleared to black: Doom3 world is naturally pitch black since there is no 'ambient' light: In order to be visible a surface/polygon must interact with a light. This explains why Doom3 was so dark!" [source]

     Part 2

     Cubemaps are not new to The Dark Mod. The skybox materials in some of our prefabs are cubemaps, and some glass and polished tile materials use cubemaps to fake reflections for cheap. Cubemap lights, however, are comparatively new. The wiki page linked earlier describes these two new light types that were added in TDM 2.05. cubicLight is a shadow-casting light with true spherical falloff. An example of such a light can be found in the core files, "lights/cubic/tdm_lampshade_cubic". ambientCubicLight is the light type we will be focusing on. Prior to TDM 2.06, it acted as a movable, on-demand reflection dispenser, making surfaces in its radius reflect a pre-set cubemap, much like glass. After 2.06, the old behavior was discarded and ambientCubicLight was converted to accept industry-standard irradiance environment maps.
     Irradiance environment maps (IEMs) are what we want to make, so perhaps the first thing to make clear is that they aren't really "handmade". An IEM is the output of a filtering process (convolution) which requires an input in the form of a regular environment cubemap. In other words, if we want to make an IEM, we need a regular cubemap, ideally one depicting our environment — in this case, the corridor. I say a snapshot of the environment is ideal for lighting it because this emulates how irradiant light in the real world works. All radiating surfaces are recorded in our cubemap, our ambient optic array as it were, then blurred, or convoluted, to approximate light scatter and diffusion; then the in-game light "shines" this approximation of irradiant light back onto the surfaces. There is a bit of a "chicken and the egg" situation here: if your scene is dark to begin with, wouldn't you just get a dark irradiance map and accomplish nothing? In the captured cubemap faces in Fig. 6, you may notice that the environment looks different from what I've shown so far. I used two ambient lights to brighten up the windows for a better final irradiance result. I've "primed the pump", so to speak. You can ignore this conundrum for the moment; ways to set up your scenes for better results, or priming the pump correctly, will be discussed at the end of the tutorial.

     Capturing the Environment

     The wiki has a tutorial on capturing cubemaps by angua, but it is woefully out of date. Let me run you through the process for 2.07 really briefly. To start with, I fly to approx. the center of the corridor with noclip. I then type "envshot t 256" in the console. This outputs six .tga images in the <root>/env folder, simply named "t", sized 256x256 px, constituting the six sides of a cube and depicting the entire environment. This is how they look in the folder:
     [Fig. 6] The six cube faces in the folder.
     Of note here is that I do not need to switch to a 640x480 resolution, nor do I need to rename these files; they can already be used in an ambientCubicLight.

     Setting Up the Lights

     For brevity's sake, I'll skip explaining material definitions; if you've ever added a custom texture to your map, you know how to do this. Suffice it to say, it is much the same with custom lights.
     In your <root>/materials/my_cool_cubemaps.mtr file, you should have something like this:

         lights/ambientcube/my_test_IEM_light
         {
             ambientCubicLight
             {
                 forceHighQuality
                 //cameraCubeMap makeIrradiance(env/t)
                 cameraCubeMap env/t
                 colored
                 zeroClamp
             }
         }

     We'll play with the commented-out line in just a bit. Firstly, let's place the actual light in DarkRadiant. It's as simple as creating a new light or two and setting them up in much the same way you would a regular ambient light. I select the appropriate light texture from the list, "my_test_IEM_light" in the "ambientcube" subfolder, and I leave the light colored pure white.
     [Fig. 7] The corridor in DR, top view, with the ambient cubic lights highlighted.
     I can place one that fills the volume or two that stagger the effect somewhat. Remember that these lights still have a spherical falloff. Preference and experimentation will prove what looks best to you. Please note that what the material we defined does is load a cubemap, while we established that ambientCubicLights only work with irradiance maps. Let's see if this causes any problems in-game. I save the map and run it in-game to see the results. If I already have TDM running, I type "reloadDecls" in the console to reload my material files and "reloadImages" to reload the .tga images in the /env folder.
     [Fig. 8] Well, this looks completely wrong, big surprise.
     Wouldn't you know it, putting a cubemap in the place of an irradiance map doesn't quite work. Everything in the scene, especially the AI, looks to be bathed in slick oil. Even if a material doesn't have a specular map, it won't matter; the ambientCubicLight will produce specular reflections like this. Let's compare how our cubemap .tga files compare with the IEM .tgas we'll have by the end of the tutorial:
     [Fig. 9] t_back.tga is the back face of the environment cubemap; tIEM_back.tga is the back face of the irradiance map derived from it.
     As you can see, the IEM image looks very different. If I were to use "env/tIEM" instead of "env/t" in the material definition above, I would get the proper result, as seen in the last screenshot of part 1. So it is that we need a properly filtered IEM for our lights to work correctly. Speaking of that mtr def though, let's not invoke an irradiance map we haven't learned to convert yet. Let's try an automatic, in-engine way to convert cubemaps to IEMs, namely the makeIrradiance material keyword.

     makeIrradiance and Its Limitations

     Let's uncomment the sixth line in that definition and comment out the seventh:

         cameraCubeMap makeIrradiance(env/t)
         //cameraCubeMap env/t

     Here is a picture of how a cubemap run through the makeIrradiance keyword looks:
     [Fig. 10] Say 'Hi' to our friend in the back, the normalmap test cylinder. It's a custom texture I've made to demonstrate cubemap interactions in a clean way.
     Hey now, this looks pretty nice! The scene is a bit greener than before, but you may even argue it looks more pleasing to the eyes. Unfortunately, the devil is in the details. Let's compare the makeIrradiance keyword's output with the custom-made irradiance map setup seen at the end of part 1.
     [Fig. 11, 12] A closer look at the brick texture reveals that the undesired specular highlighting is still present. The normal map test cylinder confirms that the reason for this is the noisy output of the makeIrradiance keyword.
     The in-engine conversion is algorithmic; more specifically, it doesn't allow us to directly compare .tga files like we did above.
     Were we able to, however, I'm sure the makeIrradiance IEM would look grainy and rough compared to the smooth gradient of the IEM you'll have by the end of this tutorial. The makeIrradiance keyword is good for quick testing, but it won't allow you fine control over your irradiance map. If we want the light to look proper, we need dedicated cubemap filtering software.

     A Review of Cubemap Filtering Software

     Here I'll introduce three programs you can produce an irradiance map with. In the coming parts, I will present you with a guide for working with each one of them. I should also note that installing all of these is trivial, so I'll skip that instructional step when describing their workflows. I will not relay any ad copy, as you can already read it on these programs' websites. I'll just list the advantages and disadvantages that concern us.
     Lys — https://www.knaldtech.com/lys/
     Advantages: Good UI, rich image manipulation options, working radiance/specular map filtering with multiple convolution algorithms.
     Disadvantages: $50 price tag, limited import/export options, only available on Windows 64-bit systems.
     cmftStudio — https://github.com/dariomanesku/cmftStudio
     Advantages: Available on Windows, OSX and Linux, free, open source software, command line interface available.
     Disadvantages: Somewhat confusing UI, limited import options, missing features (radiance/specular map filtering is broken, fullscreen doesn't work), 32-bit binaries need to be built from source (I will provide a 32-bit Windows executable at the end of the tutorial).
     Modified CubeMapGen — https://seblagarde.wordpress.com/2012/06/10/amd-cubemapgen-for-physically-based-rendering/
     Advantages: Free software, quickest to work with (clarified later).
     Disadvantages: Bad UI, only Windows binaries available, subpar IEM export due to bad image adjustment options.
     Let's take a break at this point and come back to these programs in part 3. A lot of caveats need to be expounded on as to which of these three is the "best" software for making an irradiance map for our purposes. None of these programs has a discrete workflow; rather, the workflow will include or exclude certain additional programs and steps depending on which app you choose to work with. It will dovetail and be similar in all cases.

     Part 3

     The aim of this tutorial is twofold. First, it aims to provide the most hands-free and time-efficient method of converting an envshot environment cubemap to an IEM and getting it working in-game. The second is using as few applications as possible and keeping them all free software that is available for download, much like TDM itself. The tutorial was originally going to cover IEM production only through Lys, as that was the app I used to test the whole process with. I soon realized that it would be inconsiderate of me to suggest you buy a fifty-dollar product for a single step in a process that adds comparatively little to the value of an FM, if we're being honest (if you ask me, the community would benefit far more from a level design tutorial than a technical one like this, but hey, maybe later; I'm filling a niche right now that nobody else has filled). This led me to seek out open-source alternatives to Lys, such as Cubemapgen, which I knew of, and cmftStudio, which I did not. I will preempt my own explanations and tell you right away that, in my opinion, cmftStudio is the program you should use for IEM creation. This comes with one big caveat, however, which I'm about to get into.
     Six Faces on a Cross and The Photoshop Problem

     Let's review. Taking an envshot in-game gives you six separate images that are game-ready. Meaning, you get six split cubemap faces as output, and you need six split irradiance map faces as input. This is a problem, because neither Lys nor cmftStudio accepts a sequence of images as such. They need to be stitched together into a cube cross, a single image of the unwrapped cube, like this:
     [Fig. 13] From Lys. Our cubemap has been stitched into a cross and the "Debug Cube Map Face Position" option has been checked, showing the orientation of each face.
     In Lys, only panoramas, sphere maps and cube maps can be loaded into the program. The first two do not concern us; the third specifically refers to a single image file. Therefore, to import a TDM envshot into Lys you need to stitch your cubemap into a cross. Furthermore, Lys' export also outputs a cubemap cross, so you also need to unstitch the cubemap into its faces afterwards if you want to use it in TDM.
     In cmftStudio you can import single map faces! Well… no, you can't. The readme on GitHub boasts "Input and output types: cubemap, cube cross, latlong, face list, horizontal and vertical strip." but this is false. The UI will not allow you to select multiple files on import, rendering the "face list" input type impossible. [2] Therefore, to import a TDM envshot into cmftStudio you need to stitch your cubemap into a cross. Fortunately, the "face list" export type does work! Therefore, you don't need to unstitch the cubemap manually; cmftStudio will export individual faces for you.
     In both of these cases, then, you need a cubemap cross. For this tutorial I will use Adobe Photoshop, a commercial piece of software, to stitch our faces into a cubemap in an automated fashion (using Photoshop's Actions). This is the big caveat to using cmftStudio: even if you do not want to buy Lys, PS is still a prerequisite for working with both programs. There are, of course, open source alternatives to Photoshop, such as GIMP, but it is specifically Photoshop's Action functionality that will power these workflows. GIMP has its own Actions in the form of Macros, but they are written in Python. GIMP is not a software suite that I use, nor is Python a language I am proficient in. Out of deference to those who don't have, or don't like working with, Photoshop, I will later go through the steps I take inside the image editor in some detail, in order for the studious reader to reconstruct them, if they so desire, in their image editing software of choice. At any rate, and at the risk of sounding a little presumptuous, I take it that, as creative types, most of you already have Photoshop on your computers.
     [2] An asterisk regarding the "impossibility" of this. cmftStudio is a GUI for cmft, a command line interface that does the same stuff but inside a command prompt. I need to stress that I am certain multiple faces can be inputted on the command line, but messing with unwieldy prompts or writing batch files is neither time-saving nor user-friendly. This tutorial is aimed at the average mapper, but a coder might find the versatility offered in cmft interesting.

     The Cubemapgen Workflow

     You will have noticed that I purposefully omitted Cubemapgen from the previous discussion. This is because working with Cubemapgen, wonderfully, does not need Photoshop to be involved! Cubemapgen both accepts individual cubemap faces as input and exports individual irradiance map faces as output.
     Why, then, did I even waste your time with all the talk of Lys, cmftStudio and Photoshop? Well, woefully, Cubemapgen's irradiance maps look poor at worst and inconsistent at best. Comparing IEMs exported from Lys and cmftStudio, you will see that both look practically the same, which is good! An IEM exported from Cubemapgen, by default, is far too desaturated, and the confusing UI does not help in bringing it to parity with the other two programs. If you work solely with Cubemapgen, you won't even know what 'parity' is, since you won't have a standard to compare to.
     [Fig. 14] A comparison between the same irradiance map face, exported with the different apps at their respective default settings. Brightened and enlarged for legibility.
     This may not bother you, and I concede that it is a small price to pay for those not interested in working with Photoshop. The Cubemapgen workflow is so easy to describe that I will in fact do just that, now. After I do so, however, I will argue that it flies in the face of one of the aims of this tutorial, namely efficiency.

     Step 1: Load the cubemap faces into Cubemapgen

     Returning to specifics, you will remember that we have, at the moment, six .tga cubemap faces in a folder that we want to convert to six irradiance map faces. With Cubemapgen open, direct your attention to these buttons:
     [Fig. 15] You can load a cubemap face by pressing the corresponding button or using the hotkey 'F'.
     To ensure the image faces the correct way, you must load it into the corresponding "slot", from the Select Cubemap Face dropdown menu above, or by pressing the 1-6 number keys on your keyboard. Here is a helpful list:
     X+ Face <1> corresponds to *_right
     X- Face <2> corresponds to *_left
     Y+ Face <3> corresponds to *_up
     Y- Face <4> corresponds to *_down
     Z+ Face <5> corresponds to *_forward
     Z- Face <6> corresponds to *_back
     ...with the asterisk representing the name of your cubemap. With enough practice, you can get quite proficient at loading cubemap faces using keyboard shortcuts. Note that the 'Skybox' option in the blue panel is checked; I recommend you use it.

     Step 2: Generate the Irradiance Map

     [Fig. 16] The corridor environment cubemap loaded in and filtered to an irradiance map. The options on the right are my attempt to get the IEM to look right, though they are by no means prescriptive.
     Generating an IEM with Modified CubeMapGen 1.66 is as easy as checking the 'Irradiance cubemap' checkbox and hitting 'Filter Cubemap' in the red panel. There are numerous other options there, but most will have no effect with the checkbox on. For more information, consult the Sébastien Lagarde blog post that you got the app from. I leave it to you to experiment with the input and output gamma sliders; you really have no set standard for how your irradiance map is supposed to look, so unfortunately you'll have to eyeball it and rely on trial and error. Two things are important to note. The 'Output Cube Size' box in the red panel is the resolution that you want your IEM to export at. In the yellow panel, make sure you set the output as RGB rather than RGBA! We don't need alpha channels in our images.

     Step 3: Export Irradiance Map Faces

     Back in the green panel, click the 'Save CubeMap to Images' button. Save the images as .tga with a descriptive name.
     [Fig. 17] The exported irradiance map faces in the folder.
     These files still need to be renamed with appropriate suffixes in order to constitute a readable cubemap for the engine.
     The nomenclature is the same as the table above: "c00" is the X+ Face, to be renamed "right", "c01" is the X- Face, and so on. Right, left, up, down, forward and back. That's the order! (If you'd rather not rename the faces by hand every time, see the small renaming sketch at the end of this post.) This is all there is to this workflow. A "cameraCubeMap env/testshot" in the light material will give us a result that will look, at the very least, better than the inbuilt makeIrradiance material keyword.
     [Fig. 17] The map ended up being a little bright. Feel free to open Fig. 4 and this in separate tabs and compare the Lys/cmft export with the Cubemapgen one.

     A Review of the Workflow

     Time for the promised criticism of this workflow. I already stated my distaste for the lack of a standardised set of filtering values with this method. The lack of any kind of preset system for saving the values you like makes working with Cubemapgen even more slipshod. Additionally, in part 2, I said that Cubemapgen is the fastest to work with, but this needs to be qualified. What we just did was convert one cubemap to an irradiance map, but a typical game level ought to use more than a single IEM. Premeditation and capturing fake, "generic" environment cubemaps (e.g. setting up a "blue light on the right, orange on the left" room or a "bright skylight above, brown floor" room, then capturing them with envshot) might allow for some judicious reuse and keep your distinct IEM light definition count down to single digits, but you can only go so far with that. I am not arguing here for an ambient cubic light in every scene either; certainly only those that you deem need the extra attention, or those for which the regular lighting methods enumerated in Part 1 do not quite work. I do tentatively assume, though, that for an average level you would use between one and two dozen distinct IEMs. Keep in mind that commercial games, with their automated probe systems for capturing environment shots, use many, many more than that. With about 20 cubemaps to be converted and 6 faces each to load into Cubemapgen, you'll be going through the same motions 120 whole times (saving and renaming not included). If you decide to do this in one sitting (and you should, as Cubemapgen, to reiterate, does not keep settings between sessions), you are in for a very tedious process that, while effective, is not very efficient.
     The simple fact is that loading six things one by one is just slower than loading a single thing once! The "single thing" I'm referring to is, of course, the single, stitched cubemap cross texture. In the next part, I will go into detail regarding how to make a cubemap cross in Photoshop in preparation for cmftStudio and Lys. It will initially seem a far more time-consuming process to you than the Cubemapgen workflow, but through the magic of automation and the Actions feature, you will be able to accomplish the cubemap stitch process in as little as a drag-and-drop into PS and a single click. The best thing is that after we go through the steps, you won't have to recreate them yourself, as I will provide you with a custom Actions .atn file and save you the effort. I advise you not to skip the explanations, however. The keen-eyed among you may have noticed that you can also load a cube cross in Cubemapgen. If you want to use both Cubemapgen and Photoshop together to automate your Cubemapgen workflow, be aware that Cubemapgen takes crosses that have a different orientation than the ones Lys and cmftStudio use.
My macros (actions) are designed for the latter, so if you want to adjust them for Cubemapgen you would do well to study my steps and modify them appropriately. For the moment, you’ve been given the barebones essentials needed to capture an envshot, convert it to an irradiance map and put it in your level at an appropriate location, all without needing a single piece of proprietary software. You can stop here and start cranking out irradiance maps to your heart’s content, but if you’re in the mood for some more serious automation, consider the next section.
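     As promised above, here is a small, optional sketch for the renaming step of the Cubemapgen workflow: a C++17 command-line program that renames the exported faces ("c00" through "c05") to the suffixes the engine expects. The exact file names Cubemapgen writes may differ from the "tIEM_c00.tga" pattern assumed here, so treat the names as placeholders and adjust them to whatever you see in your folder.

         // Hedged sketch: rename CubeMapGen's exported faces to TDM cubemap suffixes.
         // Assumes files like tIEM_c00.tga .. tIEM_c05.tga in the current directory.
         #include <array>
         #include <filesystem>
         #include <iostream>
         #include <string>

         int main() {
             namespace fs = std::filesystem;
             // Order follows the face table above: X+ X- Y+ Y- Z+ Z-
             const std::array<std::string, 6> suffixes = {
                 "right", "left", "up", "down", "forward", "back"
             };
             const std::string base = "tIEM";   // placeholder cubemap name

             for (int i = 0; i < 6; ++i) {
                 fs::path from = base + "_c0" + std::to_string(i) + ".tga";
                 fs::path to   = base + "_" + suffixes[i] + ".tga";
                 if (fs::exists(from)) {
                     fs::rename(from, to);
                     std::cout << from.string() << " -> " << to.string() << "\n";
                 } else {
                     std::cout << "missing: " << from.string() << "\n";
                 }
             }
         }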
  17. AMD at Computex:

    (discuss in CPU/GPU news thread)
  18. Hmm, with FSR in performance mode I could boost it to 25/30 fps at the lowest settings on the 1080 Ti, so from slideshow to barely playable at 1080p. Strangely, FSR does not seem to gain this old card a whole lot in this game, compared to The Callisto Protocol, where I can actually keep it above 60 fps with raytracing on at 1080p. It is a whole lot more detailed than the latter, though, so that might explain it to some degree. Also, the 1080 Ti does not fully support DX12 Ultimate, so the mesh shading might not work all too well with it.
  19. "The Threepenny Revue" https://www.thedarkmod.com/missiondetails/?internalName=threepenny "I've been in the business of other peoples' valuables for as long as I can remember, so I'm no stranger to breaking and entering. But until today, I've never done a robbery on commission. I guess there's a first time for everything..." Randal Cartier, a local theater owner, thinks himself above paying protection money to the local gangs. You've been hired to prove him wrong. "The Threepenny Revue" is a first attempt at a Dark Mod Fan Mission. After playing TDM for ages and loving it, I wanted to try my hand at creating one of my own. As such, this is a short, simple, and relatively straightforward mission made to learn the ropes. The experience was very enjoyable, and I'm planning to work on another one in the future. In the meantime, I hope you enjoy this one. It's available now in the mission list, but in the event anybody wants or needs a backup source I'm hosting the files on my own site here. Special thanks to @Cambridge Spy, @thebigh, @Shadow, @wesp5, and @boissiere for Beta Testing and giving feedback, which helped enormously in ironing out problems in DarkRadiant
  20. There's a problem with the Lady02 sound clips. A number of them are untrimmed, having dead air at the end that pads them out to a multiple of 5 seconds. Out of 310 sound clips, the counts of suspect clips are:
      • 69 at exactly 5 seconds
      • 15 at exactly 10 seconds
      • 4 at higher values that are multiples of 5 seconds
      If a clip has, say, 3 seconds of dead air, I don't want to keep showing the subtitle during those 3 seconds. Possible solutions are:
      1) Trim the clips. The best solution, but a good amount of work (that I hate doing. Any volunteers?)
      2) Use SRT in each case to end the subtitle early. This is a fair amount of extra work for me, for those clips that otherwise would not need SRT (namely, the 5-second and some of the 10-second ones). But I'll do this if (1) and (3) seem out of reach.
      3) Change the -dx command so it can take a negative value, and end the subtitle early. It would still take some effort to determine the -dx value, but less than (2), and with less file proliferation. @stgatilov, is this viable/easy?
      EDIT: Reported to bugtracker as https://bugs.thedarkmod.com/view.php?id=6352
  21. IMO it's one of these cases again where you try to solve an asset-level problem by making changes at the engine level. What I'm most interested in, though: let's say I have custom candle assets / entities in my WIP that work like in Thief 3 (first frob extinguishes the candle, second one picks it up). Will these changes break it?
  22. They are there for a reason though, not just because someone said so. If you want to communicate with people and you work on code with someone, you need to share some common principles. And while I agree that things like Clean Code can be a bit extreme at times, I've never seen anyone question it super hard; nor stuff like the SOLID principles, for example. Obviously, you can be a rebel if you want to, but you'll probably end up working alone.
  23. Haven't read the whole thread, but my theory is that this happens when you have already picked up the key the other guard (pool area) is wearing.
  24. I've very rarely seen setter methods which return values (unlike getters which obviously need to return the value they are "getting"). What value should a setter return? The same value it was given as a parameter? That's entirely pointless because the calling code already has that value. It could return the previous value, but such a value isn't necessarily defined (and doesn't appear to be relevant in the case of writing something to a file). Sometimes setters return the object itself, so you can call them in a chain, e.g. myObject.setWidth(60).setHeight(20).setColour(RED); but it's not clear how that would work with writeFloat which isn't an object method to begin with. That's certainly common (and is a convention I use), but not universal. The C++ standard library doesn't use it, for example — to check if a vector is empty you call std::vector<T>::empty(), not isEmpty().
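      A minimal C++ sketch of the chaining pattern mentioned above (the Box class and its members are made up purely for illustration):

          #include <iostream>

          class Box {
          public:
              // Returning *this is what makes the fluent chain possible.
              Box& setWidth(int w)  { width_ = w;  return *this; }
              Box& setHeight(int h) { height_ = h; return *this; }

              int area() const { return width_ * height_; }

          private:
              int width_  = 0;
              int height_ = 0;
          };

          int main() {
              Box b;
              b.setWidth(60).setHeight(20);      // chained calls, as in the example above
              std::cout << b.area() << "\n";     // prints 1200
          }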
  25. If any mappers have encountered weirdness with kill objectives not working with drowning AI, I think I've found out why. I don't think it would be a particularly difficult one to fix either. I've raised this bug report: https://bugs.thedarkmod.com/view.php?id=6323 Some context here: https://forums.thedarkmod.com/index.php?/topic/21837-fan-mission-the-lieutenant-2-high-expectations-by-frost_salamander-20230424/&do=findComment&comment=487316 I think this is a bug, but I'm just raising it here in case some people think otherwise.