
An Open Ecosystem

Linux is an open-source community that shares power and responsibility among its members instead of centralizing it within a select group.  The Linux kernel – which acts as the foundation for many Linux-based distributions – is built on an even older tradition that matured alongside computing itself.

The PDP-7 ran the first Unix code – and was used to create the demo video game Space Travel.


An Accidental Movement

Unix grew out of a general-purpose operating system effort that began to take shape in the mid-1960s – a collaborative project between the Massachusetts Institute of Technology, Bell Labs, and General Electric.  Academic researchers within the burgeoning computer science field experimented with time-sharing to expand what was possible with these new digital machines.


Unix itself was based on an even older exploration in computing – an operating system called Multics.  Pronounced like "eunuchs", the name Unix was intended as a pun on its predecessor.  Multics had produced truly innovative ideas, but its exploratory nature offered little immediate profit potential.


What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.

Dennis Ritchie, Unix pioneer

The original AT&T Unix – created in 1969 – was a proprietary, closed-source operating system developed at Bell Labs.  As a result of a 1956 consent decree with the US Department of Justice, AT&T was forbidden from entering the computer business.  This was part of the larger breakup of the Bell System that continued through the 1980s.


This meant that AT&T was required to license its non-telephone technology to anyone who asked.  While Unix was intended for use within their labs, they began licensing it to colleges and corporations for a modest fee.  This lenient licensing scheme played an important part in the widespread adoption of Unix and the eventual open-source movement.

The Cathedral and the Bazaar

During the early days of computing, programmers and researchers were one and the same.  While developing programming languages like C – the backbone of Unix – they were also exploring what computers could accomplish for the first time.  To that end, it was common to share software and learn from one another while studying computing.

The Cathedral and the Bazaar is a foundational book by Eric S. Raymond about opposing software project management styles.

Unix was revolutionary not only as an operating system, but because it came bundled with a complete copy of the source code used to build it.  This allowed researchers to modify the code to fulfill their needs while also enabling corporations to create their own custom Unix distributions – for use in-house or as a marketable product.  This led to a proliferation of Unix operating systems, each with exciting new features.  

((Windows vs Mac vs Linux vs Unix timeline graphic))


Software – like hardware – became increasingly commercialized throughout the 1970s.  Corporations sought to mold hardware into compact personal devices while simultaneously fashioning software into the killer application that would draw consumers to their products.  The Unix Wars throughout the 1980s exacerbated the friction between vendors as the operating system became fragmented between multiple competing standards.

As corporations navigated this space, many preferred to follow the proprietary development model.  These release cycles are often measured in years – meaning that software was released as a polished product, with meticulous planning put into the final 'gold' release.  On the flip side, bug fixes and feature requests could take years to manifest in the publicly available product.  Important software updates might never emerge – or might even be released as part of the product's successor.

This 'release late—release rarely' philosophy arises when software developers regard their project as a consumer product.  While the product is marketed towards consumers, the consumers' role in the creative process is rather limited.  Their feedback is often collected reactively during formative beta testing – or even after the product is released to the public.

Proprietary software is often "closed-source", meaning that the code used to create it is private and legally protected – or even a trade secret.  The code is compiled into a binary file containing the raw binary data – ones and zeros – used to control a computer system.  This data is not human-readable and only works on a specific platform – such as Windows, macOS or Debian Linux.

This makes it relatively difficult to reverse engineer, but it also means that the code wasn't compiled to run efficiently on your specific computer system. Instead, it is compiled to meet 'minimum system requirements', and more advanced hardware is rarely leveraged to your advantage.

Software Freedoms

During the 1970s, the original computer hacker culture – built around the creative challenge of overcoming hardware and software limitations – formed within academic institutions.

It was around this time that the Free Software Movement began to take shape. Researchers continued to develop software collaboratively by sharing their discoveries and the source code that powered them.  This was foundational to the continued growth of the Unix experiment.


In 1984, Richard Stallman resigned from his position at MIT, citing that proprietary software stifled collaboration by limiting his lab's ability to share source code.  He began work on the GNU Project – a recursive acronym for GNU's Not Unix – which represented an idealized, "free" operating system.  It behaved almost exactly like Unix to attract developers, but the source code would be available for anyone to modify.


The word "free" in our name does not refer to price; it refers to freedom. First, the freedom to copy a program and redistribute it to your neighbors, so that they can use it as well as you. Second, the freedom to change a program, so that you can control it instead of it controlling you; for this, the source code must be made available to you.

GNU's Bulletin, Volume 1

The Free Software Movement he sparked – through his call-to-action known as the GNU Manifesto – initially caused some confusion.  He often had to explain that he meant "free" as in "freedom", not as in "free beer".  This led to the foundation of the movement: the four software freedoms.

Freedom 0

The freedom to run the program as you wish, for any purpose.

Freedom 1

The freedom to study how the program works, and change it so it does your computing as you wish.

Freedom 2

The freedom to redistribute copies so you can help your neighbor.

Freedom 3

The freedom to distribute copies of your modified versions to others. By doing this you can give the whole community a chance to benefit from your changes.

Fulfilling these freedoms required unrestricted access to the underlying source code.  Through GNU, a new decentralized model of development emerged that enabled everyone to contribute bug fixes, code suggestions and feature requests.  Communication took place primarily on Usenet newsgroups – one of the first examples of a public-facing digital bulletin board.


GNU developed in sharp contrast to proprietary software, with many open-source projects following the 'release early—release often' development philosophy.  These programs are generally viewed not as consumer products, but as tools to reach an end.

While these projects may feel less polished, users have the power to add their voice throughout the entire development process.  This means the potential for bugs to be fixed promptly and – depending on community feedback – features can be quickly integrated into the ongoing evolution.

The GNU Project is an umbrella for the hundreds of smaller projects that are necessary to build an operating system from the ground up. While developed through collaboration, each constituent project is produced independently of the others.

Modular by Design

While laying the foundations for Unix, computer scientists were careful to consider its design philosophy.  They decided that Unix should provide a simple set of tools – each able to perform a limited function with well-defined parameters.  This facilitated a modular and decentralized approach to developing the new operating system.

This philosophy disconnected the lifecycles of applications from one another – as well as from the operating system. This led to a rolling release model where individual software packages were updated frequently and released quickly.  These modular programs could be maintained independently; development teams only needed to coordinate how their modules would communicate – such as through a mutually defined API or the system call interface.

Software was specifically crafted to fill an empty technological niche and fulfill a desired goal.  Communities formed to compare and contrast methods for reaching the same end.  Developers held the autonomy to make their own decisions about how the project would progress.

After a program is written in a high-level programming language, it needs to be translated into machine code.  This creates the executable program that can communicate with your underlying hardware through your operating system.  The process happens in multiple stages – some well in advance, while others happen just-in-time.

((Diagram showing compilation happening in stages, including object code, machine code and just in time))

The Windows operating system revolves around compiled programs that are easy to install and use.  Similarly, Linux provides access to compiled software out of the box, but also provides the source code for anyone to compile software from the ground up for their specific hardware.  In practice, this can breathe new life into even the oldest computer systems.

Licensing

Starting in 1985, GNU software was released under the GNU General Public License, which enabled full use by anyone – with specific restrictions.  This was the first copyleft license, mandating that all derivative works maintain reciprocity.

Copyleft Licenses

There is a spectrum of requirements for fulfilling the developer's intended spirit of reciprocity – from maintaining a similar software license to simply providing attribution in a notice file.  While modern "weak" copyleft licenses allow embedding within proprietary projects, the original "strong" GPL required that the derivative project retain the same license.


GPL (1989)

A strong copyleft license that comes with many conditions for usage within derivative software while providing express consent to use related patents.

The Unlicense (2010)

This license foregoes intellectual copyright and attributes all work to the public domain.  While not technically copyleft, this anti-copyright license is compatible with the GNU GPL.

Mozilla Public License 2.0 (2012)

This license balances the concerns of free software proponents and proprietary software developers.

By 1989, the University of California, Berkeley had introduced BSD – the Berkeley Software Distribution – creating the first publicly accessible Unix operating system.  By rewriting proprietary AT&T Unix code from the ground up, Berkeley was able to release BSD for open collaboration.

They created their own permissive software license that placed barely any restrictions on how the software could be used.  It provided no warranty about the continued maintenance of the project and released the developers from all liability.  It even explicitly allowed proprietization – meaning the software could be used within private, "closed-source" programs.

Permissive Licenses

One of the primary differences between a copyleft license and a permissive license is the concept of "share-alike" – a concept also found in Creative Commons licensing.  Copyleft licenses require that information about the original work remains available, often creating a web of interconnected projects.  Permissive licenses, on the other hand, place few stipulations on how the software can be used.


Apache (2004)

A permissive license that allows the software to be incorporated into larger projects that are themselves released under a different license.

MIT (1987)

This straightforward license only requires that the licensing information is shown; otherwise the software can be used freely for any reason.

BSD 4-Clause (1990)

The first in a family of permissive licenses, with the original requiring acknowledgement in the advertising of all derivative works.

https://www.mend.io/resources/blog/open-source-licenses-trends-and-predictions/


Free – or Open?

By the early 1990s, the GNU Project was nearing completion – the only thing missing was a kernel.  This integral system handles all interactions between software and hardware within a computer system.  Without it, the operating system cannot operate at all.  GNU's free kernel – known as GNU Hurd – was still incomplete.

Linus Torvalds, operating independently of the GNU Project, created the first version of the Linux kernel during his time as a computer science student.  He released the software under the General Public License. This meant the software could be shared freely and modified by anyone.

The GNU Project adopted the new Linux kernel as its own, and the combined effort rapidly evolved into a global community. The resulting operating system is now known most commonly as Linux – though there is a movement to rename it GNU/Linux.

At a conference, a collective of software developers concluded that the Free Software Movement's social activism was not appealing to companies.  The group – later known as the Open Source Initiative – felt that more emphasis was needed on the business potential of open-source software. Linux was quickly adopted as the flagship project of the newly forming Open-Source Movement.

Compared to the earlier Free Software Movement, this new initiative placed less focus on social, moral and ethical issues – like software freedom and social responsibility.  Instead, the open-source movement chose to highlight the quality, flexibility and innovation that could be accomplished by sharing access to source code.

Social Contract

By creating mechanisms to decentralize software development and focusing on discrete modules, Linux began to produce viable operating systems.  The process was not always easy or direct; there was often difficulty navigating divides – both geographical and interpersonal.  The Linux – and larger computing – experiments have produced some notably strong personalities.

 

Decentralized development let contributors work together across geographic distances, but it wasn't all easy.

1993: Over 100 developers work on the Linux kernel. With their assistance the kernel is adapted to the GNU environment, which creates a large spectrum of application types for Linux. The oldest currently existing Linux distribution, Slackware, is released for the first time. Later in the same year, the Debian project is established. Today it is the largest community distribution.


Softlanding Linux System (SLS) was one of the first Linux distributions. The first release was by Peter MacDonald[4] in May 1992.[5][6] Their slogan at the time was "Gentle Touchdowns for DOS Bailouts".[7] SLS was the first release to offer a comprehensive Linux distribution containing more than the Linux kernel, GNU, and other basic utilities, including an implementation of the X Window System.[4][8]

SLS was the most popular Linux distribution at the time, but its users considered it rather buggy, and it was difficult to use on several fronts.


Defining Simplicity

SLS was soon superseded by Slackware – which started as a cleanup of SLS by Patrick Volkerding – though Slackware's development would remain an opaque process led by a benevolent dictator for life.

Slackware is a Linux distribution created by Patrick Volkerding in 1993. Originally based on Softlanding Linux System (SLS),[5] Slackware has been the basis for many other Linux distributions, most notably the first versions of SUSE Linux distributions, and is the oldest distribution that is still maintained.[6]


The name "Slackware" stems from the fact that the distribution started as a private side project with no intended commitment.

Volkerding had no intentions to provide his modified SLS version for the public. His friends at MSU urged him to put his SLS modifications onto an FTP server, but Volkerding assumed that "SLS would be putting out a new version that included these things soon enough", so he held off for a few weeks.

Slackware aims for design stability and simplicity and to be the most "Unix-like" Linux distribution.[7] It makes as few modifications as possible to software packages from upstream and tries not to anticipate use cases or preclude user decisions. In contrast to most modern Linux distributions, Slackware provides no graphical installation procedure and no automatic dependency resolution of software packages.

In this context, "simple" refers to the simplicity in system design, rather than system usage. Thus, ease of use may vary between users: those lacking knowledge of command line interfaces and classic Unix tools may experience a steep learning curve using Slackware, whereas users with a Unix background may benefit from a less abstract system environment.

There is no formal issue tracking system and no official procedure to become a code contributor or developer. The project does not maintain a public code repository. Bug reports and contributions, while being essential to the project, are managed in an informal way. All the final decisions about what is going to be included in a Slackware release strictly remain with Slackware's benevolent dictator for life, Patrick Volkerding.[37][38][39]


Similarly, Ian Murdock's frustration with SLS led him to create the Debian project.[9]

Debian (/ˈdɛbiən/)[4] is a free and open source[b] Linux distribution, developed by the Debian Project, which was established by Ian Murdock in August 1993. Debian is one of the oldest operating systems based on the Linux kernel, and is the basis of many other Linux distributions.

Debian 1.3


Debian distribution codenames are based on the names of characters from the Toy Story films.

The Debian project is a volunteer organization with three foundational documents:

  • The Debian Social Contract defines a set of basic principles by which the project and its developers conduct affairs.[127]
  • The Debian Free Software Guidelines define the criteria for "free software" and thus what software is permissible in the distribution. These guidelines have been adopted as the basis of the Open Source Definition. Although this document can be considered separate, it formally is part of the Social Contract.[127]
  • The Debian Constitution describes the organizational structure for formal decision-making within the project, and enumerates the powers and responsibilities of the Project Leader, the Secretary and other roles.[29]

As of September 2023, Debian is the second-oldest Linux distribution still in active development: only Slackware is older.

Debian believed that unity between free and open software alongside proprietary software was possible – even preferable.

In general, Debian has been developed openly and distributed freely according to some of the principles of the GNU Project and Free Software.[5][7] Because of this, the Free Software Foundation sponsored the project from November 1994 to November 1995.[8] However, Debian is no longer endorsed by GNU and the FSF because of the distribution's long-term practice of hosting non-free software repositories and, since 2022, its inclusion of non-free firmware in its installation media by default.[5][6] On June 16, 1997, the Debian Project founded the nonprofit organization Software in the Public Interest to continue financing its development.

Since 1999, the project leader has been elected yearly.

Debian is used by several institutions, such as many universities, NGOs and other non-profit organizations (including Wikimedia Foundation),[247] and commercial companies.[248] It has even been used in space, in laptops on board the International Space Station.[249]

Debian has been very helpful to numerous government agencies in the public sector, such as in the city of Munich, which used a Debian-based distribution in its LiMux initiative for the government computer migration to Linux.[250] Schools in Extremadura and Andalusia (Spain) also utilized Debian-based systems (gnuLinEx and Guadalinex, respectively) to develop digital skills and open-source computing in schools.[251][252] There are many other cases of usage of Debian-based distributions in education, such as the deployment of Skolelinux/Debian Edu in Norwegian schools.[253] In addition, other public administrations use Linux systems indirectly based on Debian, such as French Gendarmerie, which uses Ubuntu-derived GendBuntu distribution.[254]

Debian is one of the most popular Linux distributions, and many other distributions have been created from the Debian codebase.[258] As of 2021, DistroWatch lists 121 active Debian derivatives.[259] The Debian project provides its derivatives with guidelines for best practices and encourages derivatives to merge their work back into Debian.[260][261]

Debian also wanted to focus on a more humanist approach.


https://lists.debian.org/debian-announce/1997/msg00017.html

The Debian Social Contract (DSC) is a document that frames the moral agenda of the Debian project. The values outlined in the Social Contract provide the basic principles for the Debian Free Software Guidelines that serve as the basis of the Open Source Definition.

Debian believes the makers of a free software operating system should provide guarantees when a user entrusts them with control of a computer. These guarantees include:

  • Ensuring that the operating system remains open and free.
  • Giving improvements back to the community that made the operating system possible.
  • Not hiding problems with the software or organization.
  • Staying focused on the users and the software that started the phenomenon.
  • Making it possible for the software to be used with non-free software.

The idea of the DSC was first proposed by Ean Schuessler after a conversation with Bob Young, co-founder of Red Hat. Schuessler said Red Hat should issue a set of guidelines that would guarantee to the community that, as the company expanded, it would always be committed to the ideals of Free Software. Young said this would be a "kiss of death" for Red Hat, implying it would constrain the company's ability to generate profit. Concerned about Young's response, Schuessler and other Debian developers decided to broach the idea of a "social contract" that would supplement Debian's initial manifesto written by Ian Murdock. Bruce Perens later led the effort from June 1997[1] to coordinate the creation of the DSC using the Free Software Definition as its basis.[2][3]

The Debian project ratified its social contract 1.0 on July 5, 1997.[4]

This modular nature often allowed projects to work together regardless of differing philosophies.  Debian began to provide criteria for compatible licenses.

The Debian Free Software Guidelines (DFSG) was first published together with the first version of the Debian Social Contract in July 1997.[2] The primary author was Bruce Perens, with input from the Debian developers during a month-long discussion on a private mailing list, as part of the larger Debian Social Contract. Perens was copied on an email discussion between Ean Schuessler (then of Debian) and Donnie Barnes of Red Hat, in which Schuessler accused Red Hat of never elucidating its social contract with the Linux community. Perens realized that Debian did not have any formal social contract either, and immediately started creating one. The (then) Three Freedoms, which preceded the drafting and promulgation of the DFSG, were unknown to its authors.[3]

The guidelines were:

  1. Free redistribution.
  2. Inclusion of source code.
  3. Allowing for modifications and derived works.
  4. Integrity of the author's source code (as a compromise).
  5. No discrimination against persons or groups.
  6. No discrimination against fields of endeavor, like commercial use.
  7. The license needs to apply to all to whom the program is redistributed.
  8. License must not be specific to a product.
  9. License must not restrict other software.
  10. Example licenses: The GNU GPL, BSD, and Artistic licenses are examples of licenses considered free.[2][4]

Crowdsourcing Security 

Security is a process, not a checkbox.

Crowdsourcing is the act of sourcing labor or ideas from a large group of people, typically through the use of the internet. The concept has been used for a variety of purposes, such as product development, marketing, and now security.

In software development, Linus's law is the assertion that "given enough eyeballs, all bugs are shallow". The law was formulated by Eric S. Raymond in his essay and book The Cathedral and the Bazaar (1999), and was named in honor of Linus Torvalds.[1][2]

A more formal statement is: "Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone." Presenting the code to multiple developers with the purpose of reaching consensus about its acceptance is a simple form of software reviewing. Researchers and practitioners have repeatedly shown the effectiveness of reviewing processes in finding bugs and security issues.[3]

The persistence of the Heartbleed security bug in a critical piece of code for two years has been considered as a refutation of Raymond's dictum.[6][7][8][9] Larry Seltzer suspects that the availability of source code may cause some developers and researchers to perform less extensive tests than they would with closed source software, making it easier for bugs to remain.[9] In 2015, the Linux Foundation's executive director Jim Zemlin argued that the complexity of modern software has increased to such levels that specific resource allocation is desirable to improve its security. Regarding some of 2014's largest global open source software vulnerabilities, he says, "In these cases, the eyeballs weren't really looking".[8] Large scale experiments or peer-reviewed surveys to test how well the mantra holds in practice have not been performed.[10]

Empirical support of the validity of Linus's law[11] was obtained by comparing popular and unpopular projects of the same organization. Popular projects are projects with the top 5% of GitHub stars (7,481 stars or more). Bug identification was measured using the corrective commit probability, the ratio of commits determined to be related to fixing bugs. The analysis showed that popular projects had a higher ratio of bug fixes (e.g., Google's popular projects had a 27% higher bug fix rate than Google's less popular projects). Since it is unlikely that Google lowered its code quality standards in more popular projects, this is an indication of increased bug detection efficiency in popular projects.

GNU programs have been shown to be more reliable than their proprietary Unix counterparts.[32][33]

Benefits

  • Proprietary software forces the user to accept the level of security that the software vendor is willing to deliver and to accept the rate that patches and updates are released.[1]

Drawbacks

  • Simply making source code available does not guarantee review. An example of this occurring is when Marcus Ranum, an expert on security system design and implementation, released his first public firewall toolkit. At one time, there were over 2,000 sites using his toolkit, but only 10 people gave him any feedback or patches.[4]
  • Having a large number of eyes reviewing code can "lull a user into a false sense of security".[5] Having many users look at source code does not guarantee that security flaws will be found and fixed.

Secure by design

https://en.m.wikipedia.org/wiki/Secure_by_design

While not mandatory, proper security usually means that everyone is allowed to know and understand the design because it is secure. This has the advantage that many people are looking at the source code, which improves the odds that any flaws will be found sooner (see Linus's law). The disadvantage is that attackers can also obtain the code, which makes it easier for them to find vulnerabilities to exploit. It is generally believed, though, that the advantage of the open source code outweighs the disadvantage.

https://dev.to/salamilinux/why-linux-is-secure-by-design-but-still-needs-you-3ic4


  • Code is publicly available for scrutiny.
  • Vulnerabilities are often found and fixed quickly.
  • The community and vendors (like Red Hat, Ubuntu, Debian) actively patch security holes.
  • Transparency leads to faster response and higher trust.

Thanks to this open-source nature, most distributions benefit from the security advances of the others.  For example, a fix Ubuntu contributes to an important shared library will also help Arch Linux if it uses that library.

Some open-source distributions – like Red Hat and Ubuntu – are backed by for-profit or support-based business models.

https://www.forbes.com/sites/stevendickens/2023/08/16/the-future-of-open-source-enterprise-linux-and-community-collaboration/?sh=12802a187a52

Furthermore, open-source software's security and reliability aspects have played a significant role in its rise. The availability of source code to a large community of developers allows for thorough code review, which helps promptly identify and address potential security vulnerabilities. With a collective effort to maintain and enhance the software, the open-source approach ensures higher reliability and stability.

Open source promotes a vendor-neutral environment, encouraging diverse contributions and giving users more choices. The collaborative nature of open-source projects facilitates contributions from diverse sources, leading to increased competition and choice for end-users. This vendor independence empowers users to make informed decisions based on their specific needs and preferences.

Simply put, vibrant communities support open-source software, facilitating knowledge exchange and collaboration, ultimately fostering adoption, growth, and better code for all.

 

The Debian project handles security through public disclosure. Debian security advisories are compatible with the Common Vulnerabilities and Exposures dictionary, are usually coordinated with other free software vendors and are published the same day a vulnerability is made public.[228][229] There used to be a security audit project that focused on packages in the stable release looking for security bugs;[230] Steve Kemp, who started the project, retired in 2011 but resumed his activities and applied to rejoin in 2014.[231][232]

The stable branch is supported by the Debian security team; oldstable is supported for one year.[142] Although Squeeze is not officially supported, Debian is coordinating an effort to provide long-term support (LTS) until February 2016, five years after the initial release, but only for the IA-32 and x86-64 platforms.[233] Testing is supported by the testing security team, but does not receive updates in as timely a manner as stable.[234] Unstable's security is left for the package maintainers.[142]

The Debian project offers documentation and tools to harden a Debian installation both manually and automatically.[235] AppArmor support is available and enabled by default since Buster.[236] Debian provides an optional hardening wrapper, and does not harden all of its software by default using gcc features such as PIE and buffer overflow protection, unlike operating systems such as OpenBSD,[237] but tries to build as many packages as possible with hardening flags.[238]