Almost exactly two years ago, I wrote a post over on the Zettlr blog titled Selling Trust. In it, I spoke about the giant trust machine that is the internet. The post back then was initiated by the fact that I had to buy a code signing certificate and I was not happy with how the process went. In a few weeks, that certificate will expire, so I bought another one.
It is time to revisit the process. What did I learn about trust on the internet? While the actual process of getting the new certificate was quite painless this time, new pain points have emerged that deserve a new article. So here we go!
Three Levels of Trust
As I mentioned, the process of renewing the code signing certificate was quite painless. I stuffed the trust machine with a few hundred additional dollars and within a few days I got the new and shiny code signing certificate. This time it’s valid until 2025, so plenty of time during which I won’t have to care about it again. Right?
Well, not quite. It turns out, there are actually three different trust levels on the internet: Unsigned and untrusted, signed and untrusted, and signed and trusted. The internet turns out to be a quite stratified society.
In the original Selling Trust article, I mentioned that I can fully understand the rationale behind requiring developers to code sign their software. By requiring a valid code signing certificate you can make sure that malware has a much harder time finding its way onto innocent users’ computers.
Code signing certificates are verified by submitting legal information about your person, such as your government-issued ID. This makes sure you’re actually a human being and not your dog trying to scam people. That, however, was only the first step towards ensuring all people could use my app.
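For context, the mechanical part is straightforward once you have the certificate: you sign each release binary and ship it. The following is a minimal sketch using Microsoft’s `signtool` from the Windows SDK, run from a POSIX-style shell as in many CI setups; the file names, the `CERT_PASSWORD` environment variable, and the timestamp server URL are illustrative assumptions, not Zettlr’s actual build setup:

```shell
# Sign the installer with the purchased PFX certificate, using a
# SHA-256 file digest and an RFC 3161 timestamp so the signature
# remains valid even after the certificate itself expires.
signtool sign /f my-cert.pfx /p "$CERT_PASSWORD" /fd SHA256 \
  /tr http://timestamp.digicert.com /td SHA256 MyApp-setup.exe

# Check that the signature chains to a trusted root.
signtool verify /pa MyApp-setup.exe
```

Note that a successful `verify` only means the certificate chain is valid; it says nothing about whether SmartScreen will actually trust the file.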
After I shipped the first updates of Zettlr with the new certificate, users were still complaining that they received a warning that said “This software may harm your computer.” And, if these users were using a company computer with no administrative rights, they were unable to install Zettlr, because they could not dismiss this warning.
What happened? It turns out that having a code signing certificate is not enough for Windows to trust you. Remember: Since the application is signed with a code signing certificate bound to me as a person, Microsoft could look me up and even contact me if they wanted to. If I were to write malware, they would have an easy way of dragging me to court, because the software can be traced back to me. So why don’t they trust me?
It turns out the certificate I bought was a second class citizen.
IV, OV, EV, and the Two-Class Certificate System
There are three types of code signing certificates out there: IV, OV, and EV. IV stands for “Individual Validation” and means that I as a person receive a code signing certificate after the issuer verifies my existence as a person. OV stands for “Organizational Validation” and is the same, except that instead of binding the certificate to me as a person, it is issued for an organization. Technically, though, both certificates are the same.
Then there is EV, which stands for “Extended Validation”. That type of certificate is much more expensive than the cheaper IV/OV type certificates, but it also allows you to do more. For example, an EV certificate allows you to create kernel drivers for Windows.1 You can only get an EV certificate for an organization, so as an individual you can’t really develop kernel drivers.
However, these EV certificates have another benefit: Microsoft’s “SmartScreen” system will immediately trust them. So as soon as you purchase such a certificate and begin shipping software signed with it, users will immediately be able to install the software without any warning.
However, if you “only” possess an IV/OV certificate, SmartScreen will not. It turns out that Microsoft will not trust your certificate even though you had to completely lay bare your entire existence to some U.S. corporation. Let that sink in: Even if you jump through all these hoops and spend hundreds of dollars just for a small text file, Microsoft will still not trust you.
Even though Windows will display my name and acknowledge that the app has been signed, it will still recommend that users not install my software.
Instead, what you additionally need to do is accumulate installations. In other words, users have to trust you and dismiss this warning until, at some undefined point in the future, Windows stops nagging users with it.
And this is simply not how this should work. It is important that operating systems make sure malware has a harder time getting onto innocent users’ computers, but they should not do this at the expense of software developers who create things in their free time.
How Microsoft’s Actions Make the Internet Less Safe
It turns out it was sensible of me to buy the certificate ahead of time: since it is a new certificate, it has to build up trust from scratch. However, Zettlr is already out there and non-technical users are using it.
If I were now to release a patch with the new and untrusted certificate, these users would probably be scared by a warning that an app “may harm your computer.” They might ask themselves: “Did I accidentally download the update from some shady corner of the internet?”
Then they will probably go directly to the website, try again, and see the warning once more. This will leave them unsure about what to do, and most of them will probably just not update their apps. And that leaves their computers more vulnerable than before, because each update also contains bug fixes.
By requiring me to (a) pay several hundred dollars, (b) collect an unspecified number of installs, and (c) wait for an undefined amount of time before my certificate is trusted and non-technical users are able to install my app, Microsoft is creating a window of insecurity in which I cannot push updates, because users cannot be expected to differentiate between a warning caused by an untrusted certificate and a warning caused by actual malware.
Microsoft is putting non-organizational Open Source software into a rat race without any technical or social need for it. It does not just feel like it: this is a sick joke by a corporation that has completely lost touch with reality.
So what I have to make sure is that the new certificate is already trusted by Microsoft before I release an official patch. And the method I’m using for this feels so unprofessional I’m almost ashamed of telling you: I literally just told those users of Zettlr who do trust me to simply install every new nightly2 release of Zettlr until that warning goes away. Because we do know that a mixture of some time and many installations without any reports of malware do help to build up trust.
So what I’m effectively hoping is that the mass of Zettlr users who followed my call to action and install the nightlies, dismissing that warning over and over again, will build up so much trust that whenever I release the next update to Zettlr, users will not be scared.
Think about the insanity of that: I basically have to abuse a system together with a bunch of people to make sure new users are not greeted with a scary warning and I can continue to provide safe software that does not put users in harm’s way. All of this in the name of said “security.”
If users are greeted by this simply wrong warning, it will hurt their trust in me, in the application, and, by extension, in Open Source at large.
Apple and Microsoft are not the Same
This leads me to another insight: Apple and Microsoft are, in this regard, not the same. I do pay approximately the same amount of money for both certificates (I need separate code signing certificates for the macOS and the Windows versions), i.e. ~$20 per month. However, my Apple certificate is (a) immediately trusted and (b) valid for five years without me having to pay for those five years in advance.
Furthermore, with the Apple developer program, I get much more than just those certificates. I also get access to some developer tools, and I can even download beta versions of upcoming operating systems to check if the app might break on these.
For Windows, on the other hand, I get only the certificate and nothing more, and that certificate is even useless until it becomes trusted by the operating system.
Apple is doing it right here: They raise the bar for malicious software, but they do so in a reasonable way. They have my address, they can sue me if I ever distribute malware, but they don’t require me to give my users weird instructions that feel more like wishful thinking than a technically sound strategy.
Two Classes of Programs
This leads me to a last insight. If you have followed this blog for the past two years, you may have noticed that I frequently wonder why there are so few good graphical Open Source programs. I now believe that code signing certificates are one additional reason for this.
If you are a single programmer writing a graphical program, you have to deal with so many more obstacles than if you were writing a command line program. A command line program does not need to be code signed. You can just install it from GitHub, call it, and it will work.
There are many more things to consider when writing a graphical program as opposed to writing a console program. I don’t think that this is the main reason for why there are so few (good) graphical Open Source applications, but I think it is at least one additional factor to consider.
Where does this leave us? I think that requiring code signing is a good thing to ensure trust online. But I also think that if people do comply with all of that, you shouldn’t put an additional burden on them and basically tell them “Yeah, it’s all fine that you paid hundreds of dollars so your users won’t see this warning, but we’re going to show it to them anyway.”
I think in this regard the approach by Apple is quite sensible: You do have to pay, which raises the bar for malicious actors, but after you sign up for their developer program, everything works absolutely fine and without any issues. And that is how it should be.
Raising the bar by requiring people to pay money will deter a lot of potential troublemakers. But those who are determined to cause harm will not even use a code signing certificate. Nobody would be dumb enough to buy a certificate that can be traced back to them just to distribute malware. These people will find other ways.
And therefore, I think Microsoft’s approach of not trusting certificates from the beginning is just mean and without any justification. It gives corporations with enough money a head start, thereby solidifying the social inequalities that are already ingrained in the digital world.
Two years ago I was just annoyed by the amount of work. Today, I am appalled by the fact that the internet more and more resembles a class society. We have three levels of trust, two classes of certificates, two classes of applications, and thereby two classes of users.
Microsoft is an advocate for corporations who do all of the things I once criticized in my Open Source video and they are actively kicking Open Source developers in the teeth.
But here we are, having to somehow get Microsoft’s stupid gatekeepers to not deter people from using Zettlr. And we will have to do that again in three years.
I am not looking forward to that.
A kernel driver is basically a small piece of software that interacts with the hardware of the computer itself. It is a much more critical piece of software, and so it is in and of itself relatively understandable why Microsoft would want to be extra safe with these. ↩
A nightly release is normally intended to ship cutting-edge builds of an application for those users who are experienced enough to deal with potential bugs, but who don't want to build the app themselves. Nightly releases for Zettlr are available at https://nightly.zettlr.com. ↩