The Dark Mod Forums

CMake build


Recommended Posts

As of Git commit 67be9762f446d27abd84b3df49ba8cf1f4271077 we now have a functioning CMake build for DarkRadiant on Linux. The motivations for this are twofold:

  1. The main mod is using CMake already, and I am interested (at some point in the distant future) in the possibility of trying to re-use some code from the mod inside DR, in order to reduce code duplication for certain common functionality like parsing DEF files. Using the same buildsystem will make this much easier.
  2. Autotools is, quite honestly, an outdated and hacky buildsystem: you use M4 macros to generate a shell script, which generates another shell script, which generates Makefiles from Makefile templates using unintuitive and complicated text substitution rules. CMake, by contrast, is a modern and well-designed buildsystem in which almost everything Just Works in an intuitive and robust way; even finding wxWidgets and Python was really easy (see the sketch below) compared to the multiple lines of shell script hackery which were needed to get this working in configure.ac. I imagine Autotools is great for compiling the df utility on a variety of weird BSD flavours, but it really isn't a good choice for a complex modular application like DarkRadiant.
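To give an idea of the difference, here is roughly what the dependency lookup looks like in CMake. This is a simplified sketch with placeholder project and target names, not our actual CMakeLists.txt:

```cmake
# Sketch only: project/target names are illustrative, not DarkRadiant's.
cmake_minimum_required(VERSION 3.12)
project(Example CXX)

# One find_package() call per dependency replaces the configure.ac hackery.
find_package(wxWidgets REQUIRED COMPONENTS core base)
find_package(Python3 REQUIRED COMPONENTS Interpreter Development)

add_executable(example main.cpp)

# Pick up wxWidgets include paths and compile definitions.
include(${wxWidgets_USE_FILE})
target_link_libraries(example PRIVATE ${wxWidgets_LIBRARIES} Python3::Python)
```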

So far I've checked that the CMake-built version of DR runs and looks correct, with the expected plugin features available, although I haven't done exhaustive testing. I also need to update the Debian package scripts; my plan is to use CMake by default for the Linux build, but the Debian scripts are still pointing to configure, make, etc.


Other things I particularly like about CMake in no particular order:

  • It keeps all CPU cores occupied for the entire build, and will happily build different submodules in parallel. Automake would only ever parallelise compilation of CPP files within a single module, with each module needing to be linked before it would move on to the next one.
  • If you adjust a compiler or linker option in a CMakeLists.txt, it automatically rebuilds whatever might be affected by that compiler option. With autoconf the configure script would be regenerated automatically but you would have to manually make clean or touch particular files to get them rebuilt with the new options.
  • Compile and link flags are transitive, and automatically propagate from shared libraries to whatever links against them. There is no need to redundantly specify that everything linking against wxutil also needs the wxWidgets link flags: the wxutil target already has those link flags, and they propagate to downstream targets (see the sketch after this list).
  • make install just copies stuff (fast), rather than relinking all of the shared libraries to bake in an unwanted RPATH (slow).
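Here's a minimal sketch of that transitive propagation (the source file names and the executable target are made up for illustration; wxutil is the real library name from above):

```cmake
find_package(wxWidgets REQUIRED COMPONENTS core base)

# PUBLIC usage requirements propagate to anything that links against wxutil.
add_library(wxutil SHARED wxutil.cpp)
target_link_libraries(wxutil PUBLIC ${wxWidgets_LIBRARIES})
target_include_directories(wxutil PUBLIC ${wxWidgets_INCLUDE_DIRS})

# No wxWidgets flags needed here; they arrive transitively via wxutil.
add_executable(darkradiant main.cpp)
target_link_libraries(darkradiant PRIVATE wxutil)
```

The parallelism in the first point needs no configuration either: cmake --build build --parallel (or plain make -jN in a Makefile build tree) will happily build independent targets at the same time.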

No doubt some or all of these things could be solved with Autotools by reading enough obscure GNU mailing list archives from 2003, but that's the point: CMake just does the right thing without you needing to figure out how to make it work (and the documentation is excellent).


Sounds very nice. I already saw your commits, and the CMake files look so much less complicated than the Makefiles; I'll definitely shed no tears when we get rid of Automake. And I can relate to your comment about the Automake docs: whenever I tried to look up a configure or Makefile problem, I either found nothing or it was not applicable to our setup. There's not a single good tutorial to get you up to speed; one often ends up reading other projects' Makefiles in the hope of finding anything useful there.

And I might add: using mailing lists to get support was probably cool in 1992. I recently subscribed to the GNU autoconf mailing list to get notified about an AC_CHECK_LIB problem, and now I end up deleting 3 to 5 unrelated messages from my inbox every day.


10 hours ago, OrbWeaver said:
  1. The main mod is using CMake already, and I am interested (at some point in the distant future) in the possibility of trying to re-use some code from the mod inside DR, in order to reduce code duplication for certain common functionality like parsing DEF files. Using the same buildsystem will make this much easier.

I don't think the choice of buildsystem makes any difference here.

If you ever manage to reuse the code, it would most likely go like this:

  1. Extract the necessary code into functions which don't use global variables, in a separate .h/.cpp file.
  2. Include those files in the DR build.

You will have more problems from the fact that an SVN repository cannot be a Git submodule (hint: it can be an hg subrepo 😉).

 

But to be honest, the whole parsing code in the game is centralized in global singletons: for simplicity (parsing one type of file can easily recurse into parsing something very different), for debugging (you can see everything through the globals in a debugger), and for reuse (caching loaded resources). You will have a lot of trouble extracting it.


  • 1 month later...

@greebo Thanks for getting the tests working in CMake. I had got as far as compiling and installing the binary, but it always hung after the first few tests, and without being an expert on GTest I had no idea how to solve this. I only noticed yesterday that you had fixed the test resources directory, and now all of the tests succeed.
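For reference, the CTest wiring for a GTest binary ends up being just a few lines. This is a hedged sketch, not the actual DarkRadiant setup: the target name and the resources path are invented, and the working directory is the part that bit me:

```cmake
enable_testing()

find_package(GTest REQUIRED)
add_executable(drtests tests/main.cpp)
target_link_libraries(drtests PRIVATE GTest::Main)

# Register each GTest case with CTest, running from the directory that
# contains the test resources so fixtures can find their data files.
include(GoogleTest)
gtest_discover_tests(drtests
    WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}/test/resources)
```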

I must say I really like this new test mechanism. It is so much easier to use, and more comprehensive, than the old approach, which could only test standalone classes. Being able to create a new test fixture in a couple of lines and immediately start calling GlobalEntityClassManager() and GlobalEntityCreator() as if it were real DarkRadiant code should open the door to covering pretty much everything with unit/integration tests.


Thanks, glad you like it. :)

I'm still not quite there when it comes to grouping those tests logically into some hierarchy, so right now we have a growing flat list of test categories that might get cluttered as we move along. Kind of a luxury problem, I suppose.

The downside of this integrated setup is that a single test case takes a few orders of magnitude longer to complete (~200ms per "RadiantTest"), but that's still a win compared to what happens when a regression slips into a release. GTest is not quite as convenient as NUnit, which we're using at work, but its Visual Studio support and cross-platform nature are enough to win me over.

In the past few months I've forced myself to set up tests for any change made to the algorithms in the core module - everything that happens there is unrelated to the UI and should be testable. We're at around 200 tests now, and counting - I still want more loading and saving tests, since that is my most painful spot when it comes to regressions.


9 hours ago, greebo said:

I'm still not quite there when it comes to grouping those tests logically into some hierarchy, so right now we have a growing flat list of test categories that might get cluttered as we move along. Kind of a luxury problem, I suppose.

Yes, I see that. I guess each different test fixture becomes a top-level node, with its tests underneath it, so it's possible to group things which share a test fixture; but if you need a different fixture for a particular group of tests, you automatically get a new node. Perhaps there are more explicit ways to group the tests, but I haven't looked into it yet.

Quote

The downside of this integrated setup is that a single test case takes a few orders of magnitude longer to complete (~200ms per "RadiantTest"), but that's still a win compared to what happens when a regression slips into a release. GTest is not quite as convenient as NUnit, which we're using at work, but its Visual Studio support and cross-platform nature are enough to win me over.

At my work we use QtTest from the Qt library (and a horrific in-house build system based on QMake, which is possibly one of the worst build tools ever written), and some of our tests take 10 minutes or more to run, so honestly this feels like a Formula One race by comparison.

The Visual Studio integration wasn't even something I'd considered. I'm using Visual Studio Code (which I'm aware isn't the real Visual Studio, and some people say it doesn't count as a "real IDE", but it seems pretty good to me), and there are indeed four highly-rated extensions which give a full overview of the tests and their results. Nice.

[Screenshot from 2021-01-25: test results displayed in Visual Studio Code]

 

Quote

In the past few months I've forced myself to set up tests for any change made to the algorithms in the core module - everything that happens there is unrelated to the UI and should be testable. We're at around 200 tests now, and counting - I still want more loading and saving tests, since that is my most painful spot when it comes to regressions.

I agree: now that the tests are fully operational, there's no reason why every non-UI change shouldn't be covered by tests (new or existing), and I will certainly start following this principle myself. I very much enjoy the feeling of having fixed something and covered it with a test, so it's not only fixed for now but hopefully fixed forever.


7 hours ago, OrbWeaver said:

The Visual Studio integration wasn't even something I'd considered. I'm using Visual Studio Code (which I'm aware isn't the real Visual Studio, and some people say it doesn't count as a "real IDE", but it seems pretty good to me), and there are indeed four highly-rated extensions which give a full overview of the tests and their results. Nice.

Cool, I didn't know there was an extension, but hey, why am I surprised? Visual Studio Code is underestimated, imo: it's fast, has built-in Git support, there's an extension for virtually everything, and its development is moving along very quickly. It was stunning to see how easy it was to launch gdb and step through the C++ code on Linux; it requires almost no setup.
