An Open Ecosystem
Linux is an open-source community that shares power and responsibility among its members rather than centralizing them within a select group. The Linux kernel – which acts as the foundation for many Linux-based distributions – is built on an even older tradition that matured alongside computers themselves.
The PDP-7 ran the first Unix code – originally written to run the demo video game Space Travel.
An Accidental Movement
Unix grew out of a general-purpose operating system that began to take shape in the mid-1960s – a collaborative project between the Massachusetts Institute of Technology, Bell Labs, and General Electric. Academic researchers within the burgeoning field of computer science experimented with the potential of time-sharing to expand what was possible with these new digital machines.
Unix itself was based on that even older exploration in computing – an operating system called Multics. Pronounced like "eunuchs", the name Unix was intended as a pun on its predecessor. Multics yielded truly innovative ideas, but its exploratory nature offered little immediate profit potential.
What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.
— Dennis Ritchie, UNIX pioneer
The original AT&T Unix – created in 1969 – was a proprietary, closed-source operating system developed at Bell Labs. As the result of a 1956 consent decree settling a US Department of Justice antitrust suit, AT&T was forbidden from entering the computer business. This was part of the broader antitrust pressure on the Bell System that culminated in its breakup during the 1980s.
This meant that AT&T was required to license its non-telephone technology to anyone who asked. Unix was originally intended for use within Bell's own labs, but AT&T began licensing it to colleges and corporations for a modest fee. This open licensing played an important part in the widespread adoption of Unix and the eventual open-source movement.
Windows vs Mac vs Linux vs Unix timeline graphic
The Cathedral and the Bazaar
During the early days of computers, programmers and researchers were one and the same. While developing programming languages like C – the backbone of Unix – they were also exploring what computers could accomplish for the first time. To that end, it was common to share software and learn from each other while studying computing.
The Cathedral and the Bazaar is a foundational book by Eric S. Raymond about two opposing styles of software project management.
Unix was revolutionary not only as an operating system, but because it came bundled with a complete copy of the source code used to build it. This allowed researchers to modify the code to fulfill their needs while also enabling corporations to create their own custom Unix distributions – for use in-house or as a marketable product. This led to a proliferation of Unix operating systems, each with exciting new features.
Software became increasingly commercialized throughout the 1970s. Corporations sought to mold hardware into compact personal computers while simultaneously fashioning software into the killer application that would draw consumers to their products. The Unix Wars during the 1980s created friction between vendors as the operating system became fragmented with competing standards.
This 'release late, release rarely' philosophy arises when developers regard their project as a marketable consumer product. Although the product is marketed towards end users, the consumer's role in the development process is limited; they can often only voice their feedback after the product is released.
Proprietary code is often more meticulous than open-source code. Tech companies have rigid internal standards and code-review processes to catch bugs, inefficiencies, and other errors. However, as they add more features, these teams can become segmented. When developing macOS, for example, separate teams work on the productivity apps, the system apps, and the operating system itself.
With this binary-only deployment model, software vendors choose a set of 'minimum system requirements' and compile for that target; more capable systems gain little advantage.
In addition to the flexibility considerations mentioned in Section 1: Linux Is Open Source, this means that software is made 'compatible' with the lowest common denominator of its target audience. That lowest common denominator could be the Windows version, CPU type, memory configuration, video/graphics capability, and so on.
In summary, with software for Windows, you get what you get. Software vendors cannot assume that you have a compiler (if they even want to supply source code, which is often not the case anyway), so they must supply an executable program. So that they do not have to maintain many copies of the same program, they compile for the lowest-end system in their target market.
Software Freedoms
It was around this time that the Free Software Movement began to take shape. Researchers continued to develop software collaboratively by sharing their discoveries and the source code that powered them. This was foundational to the continued growth of the Unix experiment.
In 1984, Richard Stallman resigned from his position at MIT, arguing that proprietary software stifled collaboration by limiting his lab's ability to share source code. He began work on the GNU Project – GNU stands for "GNU's Not Unix" – with the goal of building a fully free operating system. It would behave almost exactly like Unix to attract developers, but the source code would be free for anyone to modify.
The word "free" in our name does not refer to price; it refers to freedom. First, the freedom to copy a program and redistribute it to your neighbors, so that they can use it as well as you. Second, the freedom to change a program, so that you can control it instead of it controlling you; for this, the source code must be made available to you.
The Free Software Movement he sparked – through his call to action known as the GNU Manifesto – initially caused some confusion. He often had to explain that he meant "free" as in "freedom", not "free" as in "free beer". This distinction was codified in the foundation of the movement: the four software freedoms.
Freedom 1: The freedom to run the program as you wish, for any purpose.
Freedom 2: The freedom to study how the program works, and change it so it does your computing as you wish.
Freedom 3: The freedom to redistribute copies so you can help your neighbor.
Freedom 4: The freedom to distribute copies of your modified versions to others. By doing this you can give the whole community a chance to benefit from your changes.
Fulfilling these freedoms required unrestricted access to the underlying source code. Through GNU, a new decentralized model of development emerged that enabled everyone to contribute bug fixes, code suggestions and feature requests. Communication took place on internet newsgroups.
In sharp contrast to proprietary software, many open-source projects follow the 'release early, release often' philosophy of software development. These programs are generally viewed not as consumer products, but as tools to reach an end.
Free software projects, although developed through collaboration, are often produced independently of each other. The fact that the software licenses explicitly permit redistribution, however, provides a basis for larger-scale projects that collect the software produced by stand-alone projects and make it available all at once in the form of a Linux distribution.
While these projects may feel less polished, users can make their voices heard throughout the entire development process. This often means that bugs are fixed promptly and – depending on community feedback – new features can be integrated quickly into the ongoing evolution of the software.
Linux, on the other hand, provides the user with the option to use pre-compiled binaries or to compile from source. This is much more powerful and fits the needs of a far broader spectrum of users.
But even more importantly, users can modify the programs themselves to suit their needs if the development is not to their liking. This is key, and underlies the term 'Free' in Free Software as it is used in the Open Source Movement.
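To make the contrast concrete, here is a minimal sketch – a hypothetical program, not drawn from any real project – of what building from source makes possible. A vendor shipping only a binary must compile once for a conservative baseline CPU, while a user compiling from source can let the compiler target the exact machine in front of them.

```c
/* sum.c - a trivial compute loop used to illustrate build targets.
 * Hypothetical example, not taken from any real project. */
#include <stdio.h>

#define N 1000000

int main(void) {
    static double values[N];
    double total = 0.0;

    /* Fill the array and sum it; with suitable flags the compiler can
     * vectorize these loops for the CPU it is targeting. */
    for (int i = 0; i < N; i++)
        values[i] = (double)i * 0.5;
    for (int i = 0; i < N; i++)
        total += values[i];

    printf("total = %f\n", total);
    return 0;
}
```

A vendor shipping only the binary would typically build it once for a baseline target (for example `cc -O2 sum.c -o sum`), whereas a user with the source can rebuild it for their own machine (for example `cc -O2 -march=native sum.c -o sum` with GCC or Clang). The source is identical; only the chosen compilation target differs.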
Modular by Design
Modularity is the practice of decoupling the lifecycle of applications from each other and from the lifecycle of the operating system.
Unix systems are characterized by a modular design that is sometimes called the "Unix philosophy". According to this philosophy, the operating system should provide a set of simple tools, each of which performs a limited, well-defined function.[7]
https://en.m.wikipedia.org/wiki/Unix_philosophy
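A minimal sketch of that philosophy in practice, assuming nothing beyond the standard C library: the tool below does exactly one small job – numbering the lines it reads – and leaves everything else to the pipeline around it. The program name and the pipeline in the closing comment are hypothetical examples.

```c
/* numlines.c - a tiny "do one thing" tool in the Unix tradition.
 * It reads text on standard input and writes each line back out with a
 * line number. Hypothetical example for illustration only. */
#include <stdio.h>

int main(void) {
    char line[4096];
    long n = 0;

    while (fgets(line, sizeof line, stdin) != NULL)
        printf("%6ld  %s", ++n, line);

    return 0;
}

/* Because it only speaks plain text on stdin/stdout, it composes with other
 * small tools through pipes, e.g.:
 *     grep error system.log | ./numlines | less */
```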
There are several core pieces that form the foundation of Linux: the bootloader, the kernel, the init system, the desktop environment, daemons, libraries, the graphics server, and applications.
Because the Linux ecosystem is open source, there are myriad combinations of components that can fill each of these roles. These combinations are called distributions, or distros, and they cover a wide gamut of possible uses. DistroWatch provides an overview of the vast number of distros and how they are all related.
For example, Debian bills itself as "the universal operating system" and focuses on ensuring support for legacy hardware and systems. Ubuntu, an extremely popular Linux distribution geared towards ease of use, is built upon this Debian foundation. Even further down the family tree, you'll find operating systems like Zorin OS that aim to closely emulate Windows or macOS to ease the transition for new users.
This customizability is a key feature of Linux and an important principle within the open ethos. Distros provide support for different hardware or offer different feature sets tailored to specific tasks. Many of these distributions are built upon the work of others, giving each new project a solid foundation and an established community to build on.
Contrast this with Windows, where the kernel is heavily integrated with the graphical user interface; there is no Windows without windows.
Linux, on the other hand, is very modular by design. Different components of a Linux system originate from different developers; each has its own specific design goals and focuses on those goals. Further, each component is configured separately, generally through text-based configuration files. This modular design means that the Linux kernel is independent of the graphical interface, for example. The net result is that crashes and security vulnerabilities in applications tend to remain localized, rather than affecting the system as a whole.
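As a small illustration of that text-based configuration style, the sketch below reads a hypothetical key=value file; the file name and keys are invented for the example and are not taken from any real component.

```c
/* readconf.c - minimal sketch of reading a text-based key=value
 * configuration file, the style many Linux components rely on.
 * The file name and keys are hypothetical examples. */
#include <stdio.h>
#include <string.h>

int main(void) {
    FILE *f = fopen("example.conf", "r");  /* e.g. contains "port=8080" lines */
    char line[256];

    if (f == NULL) {
        perror("example.conf");
        return 1;
    }
    while (fgets(line, sizeof line, f) != NULL) {
        char *eq, *value;
        if (line[0] == '#')                 /* skip comment lines */
            continue;
        eq = strchr(line, '=');
        if (eq == NULL)                     /* skip malformed lines */
            continue;
        *eq = '\0';                         /* split "key=value" in place */
        value = eq + 1;
        value[strcspn(value, "\n")] = '\0'; /* strip the trailing newline */
        printf("key: %-12s value: %s\n", line, value);
    }
    fclose(f);
    return 0;
}
```

Because the format is plain text, the same file can be edited by hand, inspected with standard tools, or parsed by any program, without depending on the rest of the system.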
Licensing
GNU software was released under the GNU General Public License, which allowed full use by anyone – with specific conditions. This was an early example of a copyleft license, which mandates that all derivative works carry a similar license, ensuring reciprocity.
By 1989, the University of California, Berkeley had introduced BSD – the Berkeley Software Distribution – creating the first publicly accessible Unix operating system. By rewriting the proprietary AT&T Unix code from the ground up, Berkeley was able to release BSD openly to facilitate open collaboration.
They created their own permissive software license that placed barely any restrictions on how you could use the software, while also providing no warranty. This even allowed proprietization – meaning it could be used within private, "closed-source" programs.
Creative Commons vs. Copyleft/Permissive
Creative Commons (CC) and open source licenses are both about sharing and usage rights, but they differ in their primary focus and application. Creative Commons licenses are designed for creative works like music, art, and writing, while open source licenses are primarily for software code. Open source licenses allow for free use, modification, and redistribution of the software, while CC licenses offer a broader range of options for sharing and using creative content.
https://en.m.wikipedia.org/wiki/Creative_Commons
https://creativecommons.org/share-your-work/
A strict copyleft license that comes with many conditions for use within derivative software, while providing an express patent grant.
Perhaps the strictest copyleft license, this license is a derivative of the GPL but adds further sharing requirements.
This straightforward license only requires that the licensing information be displayed; otherwise the software can be used freely for commercial or personal purposes.
A permissive license that allows the software to be incorporated into larger projects that can be released under a different license.
Free – or Open?
During the early 1990s, the GNU Project progressed until it neared completion – the only thing missing was a kernel. This integral component handles all interactions between software and hardware within a computer system. Without it, the operating system would not be able to operate at all. GNU's free kernel – known as GNU Hurd – was still incomplete.
Linus Torvalds, working independently of the GNU Project, created the first version of the Linux kernel during his time as a computer science student. It, too, was released under the copyleft General Public License. GNU adopted Linux as its kernel, and the combined system rapidly grew into a community.
The resulting operating system is now generally referred to as Linux – even though there has been a movement to rename it GNU/Linux. Linux was quickly adopted as the flagship project of the newly forming Open-Source Movement.
A collective of developers concluded that the Free Software Movement's social activism was not appealing to companies. The group – later known as the Open Source Initiative – felt that more emphasis needed to be placed on the business potential for openly sharing and collaborating on source code.
Free Software Movement vs Open Source Movement
https://e.foundation/what-is-the-difference-between-free-software-and-open-source-software/
Philosophy and objectives:
– Free Software: Focuses on the ethical and moral freedoms of users. The Free Software Foundation (FSF) emphasises the user's freedom to control the software and co-operate with the community.
– Open Source: Emphasises the practical benefits of sharing source code, such as quality, flexibility and innovation. The Open Source Initiative (OSI) focuses less on ethical aspects and more on the development model.
1993: Over 100 developers work on the Linux kernel. With their assistance the kernel is adapted to the GNU environment, which creates a large spectrum of application types for Linux. The oldest currently existing Linux distribution, Slackware, is released for the first time. Later in the same year, the Debian project is established. Today it is the largest community distribution.
https://lists.debian.org/debian-announce/1997/msg00017.html
Debian believes the makers of a free software operating system should provide guarantees when a user entrusts them with control of a computer. These guarantees include:
Ensuring that the operating system remains open and free.
Giving improvements back to the community that made the operating system possible.
Not hiding problems with the software or organization.
Staying focused on the users and the software that started the phenomenon.
Making it possible for the software to be used with non-free software.
Debian Free Software Guidelines (DFSG)
https://en.m.wikipedia.org/wiki/The_Open_Source_Definition#Debian_Free_Software_Guidelines
1. Access to the source code
2. Freedom of modification
3. Free redistribution
4. Permissive licence
5. Community and collaboration
6. Transparency and security
Community Security
In software development, Linus's law is the assertion that "given enough eyeballs, all bugs are shallow". The law was formulated by Eric S. Raymond in his essay and book The Cathedral and the Bazaar (1999), and was named in honor of Linus Torvalds.[1][2]
A more formal statement is: "Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone." Presenting the code to multiple developers with the purpose of reaching consensus about its acceptance is a simple form of software reviewing. Researchers and practitioners have repeatedly shown the effectiveness of reviewing processes in finding bugs and security issues.[3]
The persistence of the Heartbleed security bug in a critical piece of code for two years has been considered as a refutation of Raymond's dictum.[6][7][8][9] Larry Seltzer suspects that the availability of source code may cause some developers and researchers to perform less extensive tests than they would with closed source software, making it easier for bugs to remain.[9] In 2015, the Linux Foundation's executive director Jim Zemlin argued that the complexity of modern software has increased to such levels that specific resource allocation is desirable to improve its security. Regarding some of 2014's largest global open source software vulnerabilities, he says, "In these cases, the eyeballs weren't really looking".[8] Large scale experiments or peer-reviewed surveys to test how well the mantra holds in practice have not been performed.[10]
Empirical support of the validity of Linus's law[11] was obtained by comparing popular and unpopular projects of the same organization. Popular projects are projects with the top 5% of GitHub stars (7,481 stars or more). Bug identification was measured using the corrective commit probability, the ratio of commits determined to be related to fixing bugs. The analysis showed that popular projects had a higher ratio of bug fixes (e.g., Google's popular projects had a 27% higher bug fix rate than Google's less popular projects). Since it is unlikely that Google lowered its code quality standards in more popular projects, this is an indication of increased bug detection efficiency in popular projects.
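As a rough illustration of that metric – and not the methodology of the cited study, which used a more careful classification – the sketch below flags commit messages that look like bug fixes by simple keyword matching and reports the resulting ratio. The commit messages and keywords are invented for the example.

```c
/* ccp.c - a toy estimate of the "corrective commit probability":
 * the fraction of commits whose messages look like bug fixes.
 * Hypothetical sketch only; real studies classify commits far more
 * carefully than simple keyword matching. */
#include <stdio.h>
#include <string.h>
#include <ctype.h>

/* Case-insensitive check for a lowercase keyword inside a message. */
static int contains(const char *msg, const char *word) {
    size_t wlen = strlen(word);
    for (const char *p = msg; *p != '\0'; p++) {
        size_t i = 0;
        while (i < wlen && p[i] != '\0' &&
               tolower((unsigned char)p[i]) == word[i])
            i++;
        if (i == wlen)
            return 1;
    }
    return 0;
}

int main(void) {
    /* Stand-in commit messages; in practice these would come from the
     * project's version-control history. */
    const char *commits[] = {
        "Add configuration file parser",
        "Fix crash when the config file is missing",
        "Update documentation",
        "Repair memory leak in line reader",
        "Bugfix: handle empty input correctly",
    };
    const char *keywords[] = { "fix", "bug", "repair", "error" };
    int total = (int)(sizeof commits / sizeof commits[0]);
    int corrective = 0;

    for (int c = 0; c < total; c++) {
        for (size_t k = 0; k < sizeof keywords / sizeof keywords[0]; k++) {
            if (contains(commits[c], keywords[k])) {
                corrective++;
                break;
            }
        }
    }

    printf("corrective commit probability ~ %d/%d = %.2f\n",
           corrective, total, (double)corrective / total);
    return 0;
}
```

A higher ratio in popular projects is what the study interpreted as evidence of more efficient bug detection, since more users and contributors means more problems are noticed and fixed.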