CMake with MSVC

I have used CMake for a couple of years in my hobby projects, and I love it. It is a cross-platform meta build system. As with Qt, people tend to think first that "cross platform" is the main feature. But as with Qt, it is actually one great feature among many others. It brings so many advantages that I can't even list them all here. Since last week, we also use it for PointLine at work. While the process is straightforward on Linux, some things are worth mentioning when using it on Windows.

Finding external libraries

CMake has lots of finder scripts for commonly used libraries, and they work great in most cases. But we want to keep multiple versions of the same libraries side by side and, depending on the version of PointLine we develop for, use the appropriate versions of the libraries. To be precise, not just the libraries but also the headers and debug symbols need to be present in different versions. And we want to be able to debug different versions of our product, using different versions of the libraries, simultaneously on the same machine. Some CMake finder scripts already support a root directory from which they start searching for the components; others don't. So I had to customize most of the finder scripts we use, with just small changes so far. Our collection of these external libraries lives in a Mercurial repository, and although we have a standard location, every developer is free to choose a different one. I then try to locate it with the following logic:

SET(PL_XERCES_VER        xerces-c-3.1.1)

# If somebody has different paths, PL_ExtLibDir can be set beforehand from a
# local file that must not be under version control.
IF(NOT PL_ExtLibDir)
	IF(EXISTS "Q:/Externals")
		SET(PL_ExtLibDir "Q:/Externals" CACHE PATH "Your working copy of the Externals repository")
	ELSE()
		SET(PL_ExtLibDir "${PointLine_MAIN_DIR}/../Externals" CACHE PATH "Your working copy of the Externals repository")
	ENDIF()
ENDIF()
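With PL_ExtLibDir in place, a customized finder script can restrict its search to the versioned subdirectory of the Externals tree. A minimal sketch of the idea, with the variable name and header path assumed rather than taken from the actual PointLine scripts:

```cmake
# Sketch only: look for the Xerces headers in our Externals tree and
# nowhere else, so each PointLine version picks up its matching copy.
# XERCES_INCLUDE_DIR and the probe header are illustrative assumptions.
FIND_PATH(XERCES_INCLUDE_DIR xercesc/util/XercesVersion.hpp
	HINTS ${PL_ExtLibDir}/${PL_XERCES_VER}/include
	NO_DEFAULT_PATH)
```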


Shipping the proper version of the external libraries

CMake’s install target copies all the binaries to the right place in a target file system, which can also be a temporary subtree when used with a prefix. That usually covers the binaries built by the CMake project itself. On Linux, the package can declare dependencies on all the required libraries and thus make sure everything required is found. But in the absence of proper package management on Windows, most applications ship the DLLs they need themselves. There is enough to read on the internet about the different incarnations of DLL hell, so no need to elaborate further. To make it short, we want to have the DLLs we need in the application’s bin directory. Before switching to CMake, we had to manually ensure that we shipped the right versions of the dependent libraries. What I did was add CMake install commands for the DLLs to all the finder scripts. Uncommon, but effective.
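Such an install command in a customized finder script could look roughly like this; the variable name and the "bin" destination are assumptions for illustration, not the actual PointLine code:

```cmake
# Sketch: after the finder has located the library, also install the
# matching runtime DLL next to the application's binaries on Windows.
# XERCES_RUNTIME_LIBRARY is a hypothetical variable set by the finder.
IF(WIN32 AND XERCES_RUNTIME_LIBRARY)
	INSTALL(FILES ${XERCES_RUNTIME_LIBRARY} DESTINATION bin)
ENDIF()
```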

Runtime directories for debugging

During development, the DLLs of the external libraries are scattered over the many directories of those libraries, so I had to make sure they are available when running the application in the debugger. The first thing that comes to mind is setting the PATH environment variable before starting the development environment; in fact, that is what we did before. But since the CMake finder scripts search for the directories, there is a chicken-and-egg problem here. The second obvious thought was copying all the DLLs to the binary output or installation directory. But having the same files in multiple places was not really what I wanted, and that approach has a couple of additional downsides. Luckily, I found some CMake scripts for adding launcher commands. They prepare the environment, setting among other things the PATH variable, only for the target being debugged. The runtime directories are collected in the enhanced finder scripts.
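For reference, newer CMake versions (3.13 and later) can set the debugger environment for a Visual Studio target directly, which achieves the same effect as the launcher scripts. A sketch, where the target name and the runtime-directory variable are made up for illustration:

```cmake
# Set PATH only for debugging this one target inside Visual Studio.
# MyApp and PL_RUNTIME_DIRS are placeholder names, not PointLine code.
SET_TARGET_PROPERTIES(MyApp PROPERTIES
	VS_DEBUGGER_ENVIRONMENT "PATH=${PL_RUNTIME_DIRS};$ENV{PATH}")
```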


Running unit tests as a post-build step

I wanted immediate feedback from the unit tests. Before, we had one project with all the tests, and it was built last. So if I modified something in a critical low-level part, I essentially did a full build of the whole solution before I could test anything. We had wanted to split the tests into smaller parts for a while, and now was a good opportunity. On Linux, I used to get this immediate feedback by executing the tests right after building, using ADD_CUSTOM_COMMAND() as a post-build action. On Linux, the libraries are all in standard locations, so I didn’t have to care. But on Windows, I faced the same problem as with debugging. The same first thoughts as for debugging are discussed on the internet, as well as executing the install target prior to testing. But the early feedback I was after is not really possible with that approach. I found out that ADD_CUSTOM_COMMAND() can execute multiple commands in succession. And in order to also be able to run the tests manually, I ended up writing a small batch file that sets up the PATH and calls the test executable, and then calling that batch file. I wrapped all that into the following macro:

MACRO(RunUnitTest TestTargetName)
	ADD_CUSTOM_COMMAND(TARGET ${TestTargetName} POST_BUILD
		# Generate a small batch file that sets the PATH and runs the
		# test executable, so the test can also be started manually.
		COMMAND echo set PATH=${TEMP_RUNTIME_DIR};%PATH%> ${TestTargetName}_script.bat
		COMMAND echo ${TestTargetName}.exe --result_code=no --report_level=no >> ${TestTargetName}_script.bat
		COMMAND ${TestTargetName}_script.bat
	)
ENDMACRO()
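Using the macro might then look like the following; the target name and source file are made up for illustration:

```cmake
# Hypothetical test target; RunUnitTest makes it execute after each build.
ADD_EXECUTABLE(GeometryTest test/GeometryTest.cpp)
RunUnitTest(GeometryTest)
```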

As an unexpected but most welcome side effect, our compile times were significantly reduced after the switch to CMake.


