The Dark Mod Forums


Everything posted by Frost_Salamander

  1. That has now been raised: https://bugs.thedarkmod.com/view.php?id=5616
  2. Hi @datiswous, thanks for this. I did end up sorting it out, but by redoing the lid. I missed the underlying cause, so thanks for pointing that out. Is this the sort of thing that should go in the bug tracker? Happy to add it if so...
  3. that's what I thought as well, but it all seems to be set up correctly according to this, and also compared to one of the footlockers that works. I'll play around with it some more, but either way it seems to be a bug in the assets. I tried to use the same prefab for Hare in the Snare, part 1 but gave up on it back then.
  4. I'd like to use the containers/openable/MerryChest1.pfb prefab in TDM prefabs. However, for some reason the lid doesn't open. I've tried various things to fix it, but I can't figure out what's wrong. I even tried re-creating the lid (revert to worldspawn, then convert back to a mover_door and re-add the properties), but same issue. Has anyone else tried using it? I realize the other 'footlocker' prefabs work fine, but I like the look of this one, as it looks a bit more like a travel trunk, which is what it's supposed to be in the map.
  5. Would just scaling them with the model scaler in DR help with the long/tall versions? https://wiki.thedarkmod.com/index.php?title=Model_Scaling
  6. Do we have any hard numbers on bandwidth usage? You're right that the video files are an issue - you do need LFS for files larger than 100 MB, and then bandwidth becomes a problem as you say. I don't know how popular this suggestion will be, but what about a limit on video size? If they're under 100 MB (very doable) then this potentially goes away. This post kind of suggests a lot is possible here (although it won't solve the problem with older FMs with large videos, mind you).

Some of this wording I think can be worked around. TDM and FMs ARE 'software projects'; they just happen to be games. I'm not sure what the issue is here. A lot of the assets are binary, yes, but a lot aren't: skins, material files, some models, scripts, map files, etc. are all text. At the end of the day, we could always just ask Github themselves about it if there is concern.

Also, calling LFS a hack and dismissing Git as over-complicated is unfair, I think. It's very easy to use. I've been using it for years in a professional capacity and I rarely need to do anything except clone, push, pull and create/merge pull requests, and this is usually through either a code editor or a UI (Github or Azure DevOps, for example). I pretty much never use the Git command line except for the initial clone, and that's a copy/paste one-liner.

Also, the bandwidth limitation I think only applies if you are using LFS, so the FM download side I think would be fine - they wouldn't allow you to attach unlimited 2 GB binary artifacts as releases if they had a problem with this.

Regarding Github Actions - that was just a suggestion in case anything needed to be done for the packaging/release process. I don't think packaging and uploading a file on release would burden their servers once in a while, and I wasn't suggesting using Actions as a CDN.
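For reference, keeping big binaries like briefing videos in LFS is configured through .gitattributes; a minimal sketch, assuming the videos live under a video/ folder (the patterns here are illustrative, and `git lfs track` normally writes these lines for you):

```
video/*.mp4 filter=lfs diff=lfs merge=lfs -text
video/*.roq filter=lfs diff=lfs merge=lfs -text
```

Anything matching these patterns is then stored in the repo as an LFS pointer, which is what triggers the separate LFS bandwidth quota discussed above.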
  7. Here is what an FM looks like in Github (Hare in the Snare, part 1):

code: https://github.com/thedarkmodcommunity/hits1
pull requests: https://github.com/thedarkmodcommunity/hits1/pulls
project board: https://github.com/thedarkmodcommunity/hits1/projects/1
releases: https://github.com/thedarkmodcommunity/hits1/releases

The release artifacts are automatically generated when you create a release. Currently this is just a .zip/.tar of the repo source, but a .pk4 can be manually added as well. However, if all assets are in the repo, then you don't need to do anything manually at all - just download the zip and rename it to .pk4.

Another advantage of using Github for the FM is that it makes collaboration much easier. All you need to do is clone the repo into your darkmod/fms folder. When one person pushes an update, the other person just needs to pull and they will have all the latest changes. They just need to run DMAP or whatever and off they go. No more passing .pk4 files around, etc.

EDIT: updated links, as I just moved the repo into the TDM Github org
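The clone/push/pull flow described above can be sketched as follows, using a local bare repository in place of the GitHub remote so it runs self-contained; all paths and file names here are illustrative, not the real repo layout:

```shell
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/hits1.git"

# Mapper A clones "into darkmod/fms", commits an asset, pushes.
git clone -q "$tmp/hits1.git" "$tmp/a/fms/hits1" 2>/dev/null
cd "$tmp/a/fms/hits1"
git config user.email a@example.com
git config user.name "Mapper A"
echo "hits1/hits1.map" > startingmap.txt
git add .
git commit -qm "add starting map"
git push -q origin HEAD:master

# Mapper B clones once, then only ever needs to pull to stay current.
git clone -q "$tmp/hits1.git" "$tmp/b/fms/hits1" 2>/dev/null
cd "$tmp/b/fms/hits1"
git checkout -q master
git pull -q origin master
cat startingmap.txt
```

After the pull, Mapper B has Mapper A's change locally and can just run DMAP against the shared working tree.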
  8. That kind of sucks though, right? Azure blob storage is cheap, but still. What Azure region are you using? You also said Akamai CDN - that won't be free either, right?

Nothing wrong with that at all. There is nothing in the terms of use to say Github can't be used for this, and there is code in it (scripts, etc.). But this is a silly argument and a non-issue.

Can I have the read-only link please? I can't find it published anywhere. I do not want write access, and I understand why you would want to keep it that way. I'm not up to anything sinister, just curious what it looks like. Dealing with hidden, closed-source systems in an environment like this is kind of strange, to be honest.

That could be an end-state goal, yes. What I would suggest for a start: after someone releases an FM, the FM source gets added to the Github org in its own repository, and the FM .pk4 is also added as a release artifact (and yes, a link to that for the in-game downloader). The source tree and release are kept in sync using the usual Git release-tagging mechanisms. A huge number of companies and ecosystems rely on Github to host their releases; I don't think there is much risk involved here.

When you say 'their own Github repo' and 'central catalogue', these would all live in the TDM Github organization so everything is in one place. This is a crucial thing missing from TDM at the moment, IMO. I would also suggest you move all your other tools and things to the TDM organization (e.g. DarkRadiant, the Perl I18N script, the in-game downloader, modelling plugins, etc.). Everything is all over the place right now, and between that and these hidden SVN repositories it's not very community-friendly. If some of this already exists on Github, it can just be moved into the org and the maintainer can carry on as usual.

Because a Github organization gives you per-repository permissions, you could grant write access to the FM author only (and maybe the admin team) so they could maintain it. Others could still contribute by pull request of course, but it would be up to the author (and whoever else, like admins) to ignore or accept such contributions. It's very easy to protect the master branch from pushes and let others write to branches for their PRs.

What if the author knows nothing about Git or Github? Well, that's no different from now, where most of them probably know nothing about SVN, which is where the FMs are maintained today. I suppose an admin would just take the FM source and update the repository themselves.
  9. Would you have to pay for the Azure storage yourself?

Re: the Github 'file server' - why is that abuse? Each FM is an open source project, and it can have release artifacts. This ties in with what I am suggesting with the Github organisation for TDM: each FM can have its own repository in the org with all its source code (i.e. the entire FM codebase), with associated release artifacts. By zero-maintenance, I meant that you shouldn't have to maintain mirrors, worry about hosting, etc. Yes, someone will need to create the release, but that's as simple as clicking a button and uploading the .pk4 (or automating it with Github Actions).

Also, where are the FMs in SVN? All I can seem to find is the TDM source itself. Is there a link floating around somewhere?

EDIT: "Although it most likely means tying to some particular CDN" - well, this is why you would use Github as the source (jsDelivr is just a layer on top). Ultimately that's where the file is stored, and it's not going away any time soon. If Github changed anything in the way they operate, the entire internet would break.
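To illustrate the "automate using Github actions" aside: a hypothetical workflow that zips the repo into a .pk4 and attaches it whenever a release is published. This is a sketch of what such automation could look like, not an existing TDM workflow; the job and file names are assumptions.

```yaml
# .github/workflows/package-pk4.yml (hypothetical)
name: package-pk4
on:
  release:
    types: [published]
jobs:
  package:
    runs-on: ubuntu-latest
    permissions:
      contents: write   # needed to upload assets to the release
    steps:
      - uses: actions/checkout@v4
      - name: Build the .pk4 (just a zip with a different extension)
        run: zip -r "${{ github.event.repository.name }}.pk4" . -x ".git/*"
      - name: Attach it to the release
        run: gh release upload "${{ github.event.release.tag_name }}" "${{ github.event.repository.name }}.pk4"
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

With something like this in place, "creating the release" really is just clicking the button; the .pk4 shows up as an asset a minute later.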
  10. Has anyone considered using a CDN (content delivery network) for this instead? There are free ones for open source. For example, you could publish the FM on Github as a release artifact (yeah, I know, Github...) and then something like jsDelivr can make those files available over its CDN via the client's closest POP (point of presence). All free and zero-maintenance.

EDIT: Just noticed that jsDelivr doesn't actually support release assets, but in general the idea still stands and there might be other, similar solutions available. For example, forget the CDN and just link to the Github release directly. There is no limit on the number of files in a release, and they can be up to 2 GB each: https://docs.github.com/en/github/administering-a-repository/about-releases#storage-and-bandwidth-quotas
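For what it's worth, direct links to Github release assets follow a predictable pattern, so a downloader could construct them; a sketch with hypothetical repo, tag, and file names:

```
https://github.com/thedarkmodcommunity/<fm-name>/releases/download/<tag>/<fm-name>.pk4
```

Github serves these asset downloads itself, so no third-party CDN layer is strictly required for the "link to the release directly" option.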
  11. Thanks @roygato. Yeah, the expert loot goal is probably too high; it might as well just say 'find ALL the loot'. That and... ...are down on the list to fix in a future update. EDIT: Regarding the comment about the building entrances: most of the buildings don't have usable front doors, to encourage vertical exploration. I thought players liked that? Personally I'd rather that than having to pick 10 different front-door locks with guards walking by every couple of minutes.
  12. Hi @Geep, thanks for that. Well, we can say it definitely doesn't work with uncapped FPS on, so it would be cool if someone added a note here: https://wiki.thedarkmod.com/index.php?title=Console_Useful_Controls#Notes Re: the Wiki - I don't know who all the 'old hands' are, or whether they will respond, and the thought of spamming random people with DMs makes me uncomfortable, quite frankly. Perhaps that's by design; if so, it's working.
  13. Hi @Gerberox. Finding the exit is pretty straightforward - would you like a hint? Also, if you didn't finish the mission you'll miss out on the debrief video.
  14. @Tarhiel FYI, you CAN blackjack all these guards. Just because they have a helmet doesn't mean you can't KO them; they are only immune to KO when they are alert, so you have to be careful not to alert them while sneaking up on them. However, I see now that the perception seems to be that 'has helmet = no KO' - you're not the first person to think this - so I'll keep that in mind for future missions.
  15. I've just noticed that the 'timescale' console command only works if you have 'Uncap FPS' set to 'off'. Is this common knowledge? Are there any other settings that affect it? Either way, it should be on the Wiki page where the command is mentioned, as my first thought was that it must be a 2.09 bug, because I'd used it quite a few times before I switched. I'm happy to update the Wiki - how do I go about getting an account?
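For anyone landing here from a search, the usage itself is just the following (values are illustrative, and per the post above it only takes effect with 'Uncap FPS' set to off):

```
timescale 0.2    // slow motion
timescale 5      // fast-forward
timescale 1      // back to normal speed
```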
  16. I've figured this out - I think the arrow had a bind parent, so using getWorldOrigin() instead gave me the proper origin.
  17. Is there something weird about the origins of projectile entities? I'm trying to determine the origin of a rope arrow after it has struck something, but the value I'm getting back is in a completely different part of the map (quite far behind and below where the entity is actually sitting). I'm doing something like this, called from a trigger_touch when the arrow hits it:

void checkArrow(entity ent)
{
    string className = ent.getKey("classname");
    if (className == "atdm:ammo_ropearrow_stuck")
    {
        vector entOrigin = ent.getOrigin(); // this is waaaaay off
    }
}
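For completeness, the resolution posted earlier in this thread was that the stuck arrow had a bind parent, so getOrigin() was not returning the expected position; a sketch of the corrected snippet in TDM's script syntax, assuming the same function and classname as the original post:

```
void checkArrow(entity ent)
{
    string className = ent.getKey("classname");
    if (className == "atdm:ammo_ropearrow_stuck")
    {
        // Per the follow-up post: with a bind parent involved,
        // getWorldOrigin() gives the arrow's actual position in the map.
        vector entOrigin = ent.getWorldOrigin();
    }
}
```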
  18. Thanks @NeonsStyle! Just a word about the AI for everyone: the only modification we made was bumping up the light gem sensitivity by a value of 1 for hard and expert. This applies to all AI in the map. I'll add this to the original post so it's clear what's happening.
  19. Certainly, that would be great if you don't mind! Hope you enjoy the rest of the mission. Oh, and I can't take credit for the cinematics - that was all @Kerry000!
  20. What is the primary reason for this, besides being able to set things like 'noshadows' on the func_static? Is that the only reason? Does it 'hurt' to just leave them as worldspawn somehow? I mean, I've started doing this as well without really understanding why - just that we 'should'.
  21. Ok, thanks. The boards on the stairs were just the skin choice, I think. As for the floorboards, yes, those were conscious choices that I didn't think looked too bad (especially the inn, as it was supposed to be a bit of a dive). But the misaligned ceiling texture - yikes! Not sure how that got past us. That needs fixing, as does the model clipping.
  22. Hi @Bienie and thanks for the kind words and feedback. I'm collecting things for a possible future update - do you remember where you saw the 'scaled up' textures you mentioned? I wouldn't have done that on purpose so it must have slipped by us all.
  23. For reference: it was developed on an old Xeon workstation with a GTX 1060 (3 GB) and ran totally fine. I also tested on a similarly old CPU with a 1050, which was also fine. The settings were medium-ish. I don't tend to bother cranking up the graphics settings because I can barely tell the difference, but that's just me.
  24. Thank you @stgatilov, this is exactly what I was looking for.