Web of Trust, Part 1: Concept

Every day we rely on technologies that nobody can fully understand. Since well before the industrial revolution, complex and challenging tasks have required an approach that breaks them into smaller-scale tasks. Each of us ends up with specialized knowledge used in some parts of our lives, while trusting the skills that others have learned for the rest. This shared-knowledge approach also applies to software: even the most avid readers of this magazine will likely not compile and validate every piece of code they run, simply because the world of computers is itself too big for one person to grasp.

Still, even though it is nearly impossible to understand everything that happens within your PC while you are using it, that does not leave you blind and unprotected. FLOSS software shares trust, giving protection to all users, even if individual users can’t grasp every part of the system. This multi-part article will discuss how this ‘Web of Trust’ works and how you can get involved.

But first we’ll have to take a step back and discuss the basic concepts before we can delve into the details and the web itself. Also, a note before we start: security is not just about viruses and malware. Security also includes your privacy, your economic stability, and your technological independence.

One-Way System

By their design, computers can only work and function with the most rudimentary logic: true or false, AND or OR. This Boolean logic is not readily accessible to humans, so we must do something special: we write applications in a code that we can reasonably comprehend (human-readable source code). Once completed, we turn this human-readable code into a code that the computer can comprehend (machine code).

This step of conversion is called compilation or building, and it’s a one-way process. Compiled code (machine code) is not really understandable by humans, and it takes special tools to study it in detail. You can understand small chunks, but on the whole, an entire application becomes a black box.
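This one-way nature can be illustrated with a small sketch, using Python bytecode as a stand-in for machine code (real machine code is even further removed from the source; the function here is a made-up example):

```python
import dis

# Human-readable source: anyone can read what it intends to do.
source = "def double(x):\n    return x * 2\n"

# Compiling turns it into a code object, the machine-oriented form.
namespace = {}
exec(compile(source, "<example>", "exec"), namespace)

# The compiled form runs just fine...
print(namespace["double"](21))  # 42

# ...but reading it back requires special tooling, and even then you
# only see low-level instructions, not the author's original intent.
dis.dis(namespace["double"])
```

The disassembly shows opcodes like `LOAD_FAST` and `BINARY_OP`; reconstructing the author’s purpose from such output is exactly the kind of specialist work most users cannot do.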

This subtle difference shifts power. Power, in this case, being the influence of one person over another. The person who has written the human-readable version of the application and then releases it as compiled code for others to use knows all about what the code does, while the end user knows very little. When using software in compiled form, it is impossible to know for certain what an application is intended to do, unless the original human-readable code can be viewed.

The Nature of Power

This shift of power became a point of concern, with Richard Stallman spearheading the response. The discussion started in the 1980s, for this was the time that computers left the world of academia and research and entered the world of commerce and consumers. Suddenly, that power became a source of control and exploitation.

One way to combat this imbalance of power was the concept of FLOSS software. FLOSS software is built on the Four Freedoms, which give you a wide array of other ‘affiliated’ rights and guarantees. In essence, FLOSS software uses copyright licensing as a form of moral contract that forbids software developers from leveraging this one-way power against their users. The principal way of doing this is with the GNU General Public Licenses, which Richard Stallman created and has promoted ever since.

One of those guarantees is that you can see the code that should be running on your device. When you get a device using FLOSS software, the manufacturer should provide you with the code that the device is using, as well as all the instructions you need to compile that code yourself. Then you can replace the code on the device with the version you compiled yourself. Even better, if you compare the version you built with the version on the device, you can see whether the device manufacturer tried to cheat you or other customers.
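In practice, such a comparison usually boils down to hashing both builds; a minimal sketch in Python (the byte strings below are hypothetical stand-ins for real firmware images, and matching digests are only meaningful when the build process is reproducible, i.e. produces bit-for-bit identical output):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of a blob of bytes."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for the image you compiled and the image from the device.
my_build = b"\x7fELF...example build contents..."
vendor_build = b"\x7fELF...example build contents..."

# Matching digests mean the device ships exactly the published code.
if sha256_digest(my_build) == sha256_digest(vendor_build):
    print("Builds match: the device runs the published source.")
else:
    print("Builds differ: investigate before trusting the device.")
```

Many projects are still working toward fully reproducible builds, which is why independent verification of this kind remains an active effort rather than a given.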

This is where the Web of Trust comes back into the picture. The Web of Trust implies that even if the vast majority of people can’t validate the workings of a device, others can do so on their behalf. Journalists, security analysts, and hobbyists can do the work that others might be unable to do. And if they find something, they have the power to share their findings.

Security by Blind Trust

This is, of course, only if the application and all the components underneath it are FLOSS. Proprietary software, or even software which is merely open source, has compiled versions that nobody can recreate and validate. Thus, you can never truly know whether that software is secure. It might have a backdoor, it might sell your personal data, or it might push a closed ecosystem to create vendor lock-in. With closed-source software, your security is only as good as the trustworthiness of the company making the software.

For companies and developers, this actually creates another snare. While you might still care about your users and their security, you yourself become a liability: if a criminal can get to your official builds or supply chain, there is no way for anybody to discover that afterwards. An increasing number of attacks do not target users directly, but instead try to get in by exploiting the trust that companies and developers have carefully grown.

You should also not underestimate pressure from outside: governments can ask you to ignore a vulnerability, or they might even demand cooperation. Investment firms or shareholders may also insist that you create vendor lock-in for future use. The blind trust that you demand of your users can be used against you.

Security by a Web of Trust

If you are a user, FLOSS software is good because others can warn you when they find suspicious elements. You can use any FLOSS device with minimal economic risk, and there are many FLOSS developers who care for your privacy. Even if the details are beyond you, there are rules in place to facilitate trust.

If you are a tinkerer, FLOSS is good because, with a little extra work, you can check the promises of others. You can warn people when something goes wrong, and you can validate the warnings of others. You’re also able to check individual parts of the larger picture. The libraries used by FLOSS applications are also open for review: it’s “Trust all the way down”.

For companies and developers, FLOSS is also a great reassurance that the trust you have built can’t easily be subverted. If malicious actors wish to attack your users, any irregularity can quickly be spotted. Last but not least, since you also stand to defend your customers’ economic well-being and privacy, you can use that as an important selling point to customers who care about their own security.

Fedora’s case

Fedora embraces the concept of FLOSS and stands strong to defend it. There are comprehensive legal guidelines, and Fedora’s own Four Foundations put freedom front and center: Freedom, Friends, Features, First.

Fedora's Foundation logo, with Freedom highlighted. Illustrative.

To this end, entire systems have been set up to facilitate this kind of security. Fedora works completely in the open, and any user can check the official servers. Koji is the name of the Fedora build system, and you can see every application and its build logs there. For added security, there is also Bodhi, which orchestrates the deployment of applications: multiple people must approve an update before it can become available.
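The gating idea behind this kind of deployment can be sketched as a toy model (this is not Bodhi’s actual code or API; the class, names, and approval threshold are all hypothetical):

```python
REQUIRED_APPROVALS = 2  # hypothetical threshold; real policies are configurable

class UpdateRequest:
    """Toy model of an update that needs multiple sign-offs before release."""

    def __init__(self, package: str):
        self.package = package
        self.approvals: set[str] = set()

    def approve(self, reviewer: str) -> None:
        self.approvals.add(reviewer)

    def is_released(self) -> bool:
        # No single person can push the update out on their own.
        return len(self.approvals) >= REQUIRED_APPROVALS

update = UpdateRequest("example-app-1.0-1.fc33")
update.approve("reviewer-a")
print(update.is_released())  # False: one approval is not enough
update.approve("reviewer-b")
print(update.is_released())  # True: enough reviewers have signed off
```

The point of the design is that trust is distributed: compromising a single account is not enough to slip a malicious update past the gate.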

This creates the Web of Trust on which you can rely. Every package in the repository goes through the same process, and at every point somebody can intervene. There are also escalation systems in place to report issues, so that issues can be tackled quickly when they occur. Individual contributors also know that they can be reviewed at any time, which in itself is already enough of a precaution to dissuade mischievous thoughts.

You don’t have to trust Fedora implicitly; you can get something better: trust in users like you.

Comments


  1. Derek

    A very interesting article opening up some of the many reasons why people should use and trust Fedora. Many thanks.

    • Fe Dora

      True! So many articles out there saying to use mint or some other hack os fork of a fork of a fork that holds back security updates and has other horribly negligent practices. Fedora/RHEL/Centos all use SELinux wisely and have lots of other extremely trustworthy policies. For desktop os, Fedora is as solid as it gets…I defy someone with experience to name another distro that is even close in comparison!

  2. heliosstyx

    Well written article, but it’s only one side of the reality. First, only a few users of a FLOSS system are able to inspect and understand such systems and warn others against threats. It’s only voluntary. The FLOSS community does it well, but there is no obligation.

    The commercial solutions must be designed with high quality and reliability and must meet customer demands, and here comes the other side of the reality: the customers have to pay a price for the industrial solutions, and so there is an obligation on the industry to keep its promises, otherwise they will be charged, and so on. The IT industry standards are high and every solution provider will give their best to keep their customers satisfied. FLOSS is fine, but if security and trust are critical, I recommend commercial solutions.

    • Ben Cotton

      I think that presents a rather optimistic view. There are plenty of examples where commercial solutions have fallen well short of obligations (same as for FLOSS). And “commercial” and FLOSS are not mutually exclusive. Red Hat has done rather well commercially with FLOSS. I think your position is better stated as “if you need guarantees, you need a contract”. Whether or not the software itself is FLOSS or proprietary is immaterial.

      • Dave Cooper

        Yeah, I totally agree with Ben Cotton. “IT standards are high…” it would be nice to think, but the pressures of time and commerce lead to some pretty eye-watering fudges on occasion. Let’s not kid ourselves!

      • Dr. W

        As someone who started out with Red Hat in 1999, and who uses both open and closed source software, I would dare say it is equally optimistic to still stick to the many-eyes theory: believing that just because anyone can inspect the code, it is therefore the safest, because a great number of eyes have allegedly vetted the code.

        As someone that is not a coder, unless you include writing bash scripts, I cannot vet any code. So regardless of open or not, I have to trust that the code has not just been inspected, but also that it does not contain any back doors or nasty bugs.

        We all know how well this theory worked for the following bugs:
        Shellshock – Introduced in the code 1989, fixed 2014.
        Heartbleed – Introduced 2011, fixed 2014.
        Dirty COW – Introduced 2007, fixed 2017
        SigSpoof – Introduced 1998, fixed 2018.

        To name a few that affect crucial open source software. So just having transparency with open source does not mean that it is more secure; assuming it does is akin to security through obscurity.

        Of course, by being transparent, a bug is more likely to be found and become publicly known. The other side, with closed source, is that a bug can be known within a closed group, never to be disclosed, just so they can keep exploiting it. That can be harder with open source, but I dare argue only marginally, as we see above that some bugs existed for over a decade.

        • laolux

          In addition to the bugs mentioned by Dr. W I would like to point out that even a rather careful analysis of the source code does not necessarily uncover malicious bugs. Have a look at the underhanded C contest to get an idea of how creatively malicious behavior can be disguised, even to a more or less trained eye.

  3. jrDantin

    Great idea, but why not tell us who the Author: Kevin “Eonfge” Degeling is, in order to begin building a Web of Trust?

  4. Linus T

    Fedora is the best. Of course, two high school kids in a basement do not have the resources to maintain a true FLOSS structure. Fedora absolutely has the resources and makes FLOSS a very valid concept. So with the right people involved, FLOSS is legit.

  5. Adrien

    Interesting piece, thanks Kevin. I think you’ve made a strong argument in favour of FOSS as key to building trust between end-users and developers/editors. What I’d like to read from you is about the opposite direction: trust between developers and end-users, especially in cases where the end-user is also a developer. This direction of trust is interesting, and probably less discussed among FOSS aficionados, because sooner or later it runs into this problem:
    – developers X release the source code of a project P as per some FOSS-compliant license;
    – P builds into a software whose value-proposition heavily depends on usage and/ or on the quality of its deployment infrastructure (i.e. a social network, a messaging app, a financial service);
    – developers X deploy P and provide it as a service on their own infrastructure for a very, very cheap price; a community starts to grow
    – developers Y come across P’s repository, fork it and without changing a comma, run it on their private infrastructure, providing it as a service to their own community, so that eventually:
    – the service provided by developers Y eclipses the service provided by developers X, because Y are a little bit more lucky in their choice of infrastructure, spend a tad more money on advertising their service, or provide it completely for free (as opposed to a very, very cheap price).

    So devs Y have hijacked devs X not by changing a single line of code, just through external factors and circumstances. As a result they might get more funding, more end-users and more “soft power” in the sense of influence on what people use, do, and value. And devs X might have their trust toward peers and end-users irremediably undermined.

    Interestingly, they’ve achieved to do so without:
    – abusing “hard power” in the sense of controlling the end-users — the end-users are just as autonomous as before, because they just switched from the original app to its hijacked clone;
    – abusing licensing norms.

    So, is that a blind-spot in the FOSS philosophy? How to make sure that devs can trust their users and peers? Is there a solution internal to the FOSS philosophy, or is the FOSS philosophy falling short because it is wedded to the assumption that users and peers have virtuous motivations?

  6. Librecat

    Don’t get me wrong, Fedora is an awesome distro, but compared to FSF-approved distros it has a lot of non-free software. But you can easily make it more free: my suggestion is to merge the freed-ora repository from https://linux-libre.fsfla.org/pub/linux-libre/freed-ora/ which contains a non-free blocklist and the linux-libre kernel. You don’t have to install them by default; they can be optional.


The opinions expressed on this website are those of each author, not of the author's employer or of Red Hat. Fedora Magazine aspires to publish all content under a Creative Commons license but may not be able to do so in all cases. You are responsible for ensuring that you have the necessary permission to reuse any work on this site. The Fedora logo is a trademark of Red Hat, Inc. Terms and Conditions