Running hostile software in a container

Remember Skype, the once popular phone software? I used it a lot when we were traveling in South America and international calls were insanely expensive. But I stopped using it when it was acquired by Microsoft and they switched from a P2P model to centralized servers. From what I could observe, it gradually got worse from there, and I really thought I would never have to use it again. That was until somebody decided that we had to use Skype for Business instead of XMPP at work. There is a plethora of better alternatives; the one I use the most these days is Tox.

I use the Windows workstation only for things that I can't do on Linux. Not much falls into this category, besides compiling Visual Studio projects that involve MFC. There is Skype for Linux, but there is no official Skype for Business for Linux. So for a moment it looked like the Windows machine had gained a second task. But running an obfuscated, malicious binary blob from Microsoft with known backdoors, online all the time, on an operating system that cannot be secured, makes me uneasy. So I looked for a way to run it securely on Linux. The first thing I found was an open source implementation of the reverse engineered proprietary protocol as a plugin for Pidgin. That sounded good, but unfortunately it didn't work. The second option was a closed source clone from tel.red. They provide their own apt repository with regular updates. That's quite good actually, if you don't care about closed source software, or the security of your device and data in general.
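For those who want to try their luck with the Pidgin route anyway: the plugin in question should be pidgin-sipe (my best guess at the package name), which implements the Lync / Skype for Business protocol and is packaged in Ubuntu:

# install Pidgin together with the SIPE plugin (didn't work for me, your mileage may vary)
sudo apt-get install pidgin pidgin-sipe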

I learned about docker a while back, but had only used it marginally so far. This was the first real use I had for it, so I started learning more about it. Copying and adapting a Dockerfile is a lot easier than the articles I had read so far made me believe. I found a couple of sites about packing Skype into a docker container, but none for Skype for Business. So I took one of the former and adapted it. To use my container, just follow these easy steps:

git clone https://github.com/ulrichard/docker-skype-business
cd docker-skype-business
sudo docker build -t skype .
sudo docker run -d -p 55555:22 --name skype_container skype
ssh-copy-id -p 55555 docker@localhost
ssh -X -p 55555 docker@localhost sky

The password for ssh-copy-id is “docker”.

Then log in to sky with your credentials. You can do this every time, or you can store a configured copy of the container as follows:

docker commit skype_container skype_business

The next time, you just run it with:

sudo docker run -d -p 55555:22 skype_business
ssh -X -p 55555 docker@localhost sky

I left some of the pulseaudio setup from the original container in the README file. I don't intend to use it for anything but receiving chat messages, but if you want audio, feel free to experiment and report back.
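One approach (untested on my side) would be to let the container talk to the host's pulseaudio daemon over TCP. A rough sketch, assuming the default docker0 bridge where the host is reachable at 172.17.0.1:

# on the host: let pulseaudio accept connections from the docker bridge network
pactl load-module module-native-protocol-tcp auth-ip-acl=172.17.0.0/16
# inside the ssh session: point audio clients at the host before starting sky
ssh -X -p 55555 docker@localhost
export PULSE_SERVER=tcp:172.17.0.1:4713
sky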

Verifying downloads

Last week I stumbled across a post from last year, where somebody described how it was impossible to securely download an important piece of infrastructure software on Windows. My first reaction was of course to pity the poor souls who are still stuck with Windows. How easy it is to just type apt-get install and have all the downloading, validation and installation conveniently handled for you.

Today I was going to set up my new server. First I downloaded the current iso file from ubuntu.com. Before putting it onto a USB stick, I thought about that blog post again. I really should validate this ISO! I knew that before, but I usually didn't bother. So I gave it a try. I had to search a bit for the checksums on the download page. The easiest way is to manually pick a mirror; there you will find the hash sum files in the index page. Some websites along the way were encrypted, others were not. The downloads themselves were not, but that didn't matter since the hashes were GPG signed. I don't have to do this all too often, so I just followed the community howto. My downloaded iso file was valid, so I moved on to installing it.
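For reference, the community howto boils down to a handful of commands. The key IDs below are the ones listed on the Ubuntu help pages at the time of writing; double-check them there before trusting anything, and the file names are the ones found on the mirror:

# fetch the Ubuntu CD image signing keys
gpg --keyserver hkp://keyserver.ubuntu.com --recv-keys 0x46181433FBB75451 0xD94AA3F0EFE21092
# check the signature on the checksum file downloaded from the mirror
gpg --verify SHA256SUMS.gpg SHA256SUMS
# verify the iso itself against the signed checksums
sha256sum -c SHA256SUMS 2>&1 | grep OK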

The hardware is actually from computer-discount.ch. For quite some time I had been searching for ways to buy computer equipment with Bitcoin. The American big-name tech companies that accept Bitcoin either do so only outside of central Europe, or don't deliver here. So I was quite excited to find this company from Ticino. The experience so far has been very good.

The crapware platform

I have complained many times that there is no standard package manager on Windows, and that installing and especially upgrading software on that platform is an unholy mess. On my office computer there are probably close to ten different mechanisms present just to keep different software packages up to date. Some lurk in the system tray, and most of them constantly waste resources. The update mechanism of our own software is a little better than most in that respect: it doesn't waste resources while it's not in use, but it's still a separate proprietary solution. And the worst part is that most of the software on a typical Windows system doesn't get updated at all.

I have looked many times for a solution as simple, elegant and powerful as apt-get. The best I have found so far is Npackd. It's still a decade short of the Debian system, but better than anything else I found. The repository has grown significantly in the years I have used it. But even though Npackd implements dependency management, the packages rarely make use of it. It's just not the way Windows packages are made: rather than managing their dependencies, they keep inventing new versions of DLL hell.

I don't know why upgrades in Npackd frequently fail. Usually the uninstall of the old version fails, and thus the update stops. What I usually did in the past was install the new version in parallel. I think there is not much Npackd can do about Windows Installer packages failing to uninstall. Having crafted Windows Installer packages myself, I know how brittle and error prone this technology can be.
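Npackd also ships a command line client, and if memory serves, installing a specific version in parallel looks roughly like this (the exact switches may differ, check npackdcl help; package name and version are made up for illustration):

# install a given version without touching the one that is already there
npackdcl add --package=com.example.SomeTool --version=1.2.3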

Today I upgraded some packages that Npackd flagged as upgradeable. You select the ones you want to bring up to date and click update. It's not quite "sudo apt-get upgrade" and done, but it still makes Windows a lot more bearable. And for a long time the quality of the packages was good, at least by Windows standards. It started out with mostly open source projects and a few big-name packages. The crapware that is so stereotypical for the Microsoft platform had to stay out.

That impression changed today. One of the packages I upgraded was IZArc, a compression package with nice Windows Explorer integration. Already during the upgrade I had a strange feeling when I saw the ads in the installer window. And when it was done, I was certain something fishy had happened. Dialog windows popped up wanting to install browser toolbars, change the default search engine and scan the computer for possible improvements. Holy shit, I thought, is this some scareware? I would expect this from some random shareware downloaded from a shady page, but not from Npackd.

And that's my main point. When you install software on your computer, you trust the issuer not to hijack your system. And if you install software through a software repository, you trust the repository even more. On Windows, you're pretty much dependent on the many individuals and companies involved in the creation of all the packages you install. There is a Microsoft certification process, but I don't know what it checks and entails. There is also the possibility to sign your packages with a key signed by Microsoft, but that merely protects against tampering between the issuer and you. With open source software, however, you can examine the source code yourself, and rely on the fact that other people have checked it as well. Most distributions then have build hosts that compile and sign the binary packages. To be included in the repository, a maintainer has to take responsibility for the package and upload a signed source package. The source package can be verified by everyone, so the only thing you have to trust is the build host. But even that you could verify by building the package yourself and comparing the result. The whole thing is fully transparent. Hence, if one individual decided he wanted to earn some bucks by bundling advertising and crapware, he wouldn't get very far. As a nice add-on, apt (or synaptic for that matter) can tell you exactly what files get installed to what location for every package in the system.
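Two commands illustrate how transparent this is on Debian or Ubuntu, using openssl as an arbitrary example (the first one needs deb-src lines in your sources.list):

# download and verify the signed source package a binary package was built from
apt-get source openssl
# list every file the installed package put on the system, and where
dpkg -L openssl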

Just as a side note, crapware is the unwanted software that comes pre-installed when you buy a new computer, or that is sneaked onto your computer when you install Oracle's Java. When I bought my netbook, I booted Windows exactly once to see how much crapware they had bundled, before wiping the disk and installing Ubuntu. Needless to say, no such problems exist on the Linux side.

So I checked "Programme und Funktionen" (Programs and Features) in the system settings. That's one of the configuration items that changes its name and appearance with every version of Windows. I found about seven unwanted packages with today's installation date. I removed them immediately, and I can only hope that they didn't install additional malware.

cmake with MSVC

I have used cmake for a couple of years with my hobby projects, and I love it. It is a cross platform meta build system. As with Qt, people tend to think at first that "cross platform" is the main feature. But as with Qt, it's actually one great feature amongst many others. It brings so many advantages that I can't even list them all here. Since last week, we also use it for PointLine at work. While the process is straightforward on Linux, there are some things worth mentioning when using it on Windows.

Finding external libraries

CMake has lots of finder scripts for commonly used libraries, and they work great in most cases. But we want to have multiple versions of the same libraries side by side, and use the appropriate versions depending on the version of PointLine we are developing for. To be precise, not just the libraries, but also the headers and debug symbols need to be present in different versions. And we want to be able to debug different versions of our product, using different versions of the libraries, simultaneously on the same machine.
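To give an idea of the direction: you can point a configure run for MSVC at version specific library locations from the command line. The generator, paths and variable names below are just placeholders, not our actual PointLine setup:

# configure an out-of-source build for Visual Studio, with per-version library roots
mkdir build
cd build
cmake -G "Visual Studio 12 2013" -DBOOST_ROOT=C:/libs/boost_1_57_0 -DCMAKE_PREFIX_PATH=C:/libs/pointline-deps-7.2 ..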

Is Windows 7 based on MS DOS?

Yesterday I copied an InstallAware project on the Jenkins continuous integration server. The copy always failed to build while the original succeeded. They were really identical at this point, so WTF!!!

In the InstallAware forum I found out that the path name was getting too long. Well, yes, the name of the copied Jenkins project was slightly longer, and thus the resulting path had a few more characters.

In the InstallAware forum they say that Windows still has a limit of 256 characters for absolute paths (the actual Win32 MAX_PATH limit is 260). According to Wikipedia, the limitation doesn't come from the filesystem, so it must be somewhere in the OS. Now, Microsoft told us that Windows NT, which Windows 7 is based on, was no longer based on MS DOS. Were they lying? I mean, this is a 64 bit operating system with limitations from its 16 bit pre-pre-pre-pre-predecessor…
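A common workaround, by the way, is to map a drive letter to the deepest common directory so the paths the build tools see stay short, or to configure Jenkins with a shorter workspace root. The path below is a placeholder:

# shorten the path seen by the build tools (run on the build machine)
subst J: "C:\Jenkins\workspace\some-very-long-project-name"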

A co-worker ran into the same limitation recently when trying to copy a folder structure from Linux to Windows.