What is the Cloud?
Computers are machines that can automatically process data based on defined rules – like logic and arithmetic. Humans have long explored purpose-built "analog" machines that perform specific tasks, such as the Antikythera mechanism from 200 BCE for calculating the position of stars and planets.
By the mid-1900s, the programmable computer theorized by Charles Babbage and Ada Lovelace a century earlier was finally realized. While analog computers could reliably perform a pre-defined task, their Analytical Engine was the first design to be "Turing-complete".
This new class of computers could be programmed with sequences of instructions – like multiplication or conditional statements. This not only made computers more versatile, but meant that similar systems could share their programming.
Digital Systems
These first computers bore little resemblance to the discrete personal computers we use today. Analog systems used physical properties to perform their computations – such as connecting a circuit to turn on a warning light when the temperature in a room exceeded a certain threshold.
The first "moderndigital" general-purpose computers, ENIAC and Colossus,computers were created in the early–mid 20th century. These Bysystems transmit and process electrical signals to achieve the 1960s,desired computersoutcome. Colossus and ENIAC were becomingboth moredigital commonplaceand atprogrammable large– institutions,meaning butthat werethe stillhardware largelycould inaccessiblebe instructed to manyperform professionalsnew andtasks researchers,even letafter alonethe everydaycomputer consumers.had been built.
Women were the first "computers" – tasked with maintaining the programming necessary to operate the first general-purpose digital computer systems.
By the 1960s, digital computers were becoming more commonplace at large institutions, but were still largely inaccessible to many professionals and researchers, let alone everyday consumers. While hardware advances allowed computers to perform more complex tasks, software improved how many tasks could be performed at once.
Local Sharing
Time sharing emerged as a technique for allocating computer resources to multiple users, enabling true multi-tasking for the first time. In the past, programs ran in sequence and finished a single task before moving onto the next. Now, each user could run their own program and the computer would quickly cycle through each user's task, performing a little bit of each at a time.
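The mechanism can be sketched in a few lines of code. The snippet below is only a toy illustration (real operating systems are far more sophisticated): each hypothetical "user program" is a queue of small work steps, and the scheduler cycles through them one slice at a time.

```python
from collections import deque

# Toy time sharing: each "program" is a queue of small work steps, and the
# scheduler round-robins between them so every user appears to make
# progress simultaneously.
programs = {
    "alice": deque(["load data", "sum column", "print report"]),
    "bob": deque(["compile", "run tests"]),
    "carol": deque(["fetch mail", "sort inbox", "archive"]),
}

while programs:
    for user in list(programs):
        step = programs[user].popleft()   # run one small slice of this user's work
        print(f"{user}: {step}")
        if not programs[user]:            # this user's program has finished
            del programs[user]
```

Each pass through the loop gives every remaining program one turn, which is exactly the illusion of simultaneous use that time sharing provided.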

People could connect to this shared computer system through a "terminal" – or a keyboard and monitor that transmitted data back and forth over a physical cable. This maintained the illusion of a personal computer running only their task when, in reality, it was only a personal portal to a shared local computer. Before the internet was even invented, we had begun our exploration into the concept of the modern "cloud".
Soon after, IBM's CP-40 expanded upon time sharing to create virtualization – or dividing a single computer into multiple virtual computers. This allowed the creation of multiple secure and isolated enclaves within a single physical hardware system, where each user – whether a person or a corporation – had their own environment.
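As a rough mental model – purely illustrative, not an actual hypervisor – virtualization can be pictured as carving one physical machine's resources into several isolated allocations (the host sizes and tenant names below are invented):

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalHost:
    """One real machine whose resources are divided among virtual machines."""
    cpu_cores: int
    memory_gb: int
    vms: list = field(default_factory=list)

    def create_vm(self, name: str, cores: int, memory_gb: int) -> dict:
        used_cores = sum(vm["cores"] for vm in self.vms)
        used_mem = sum(vm["memory_gb"] for vm in self.vms)
        if used_cores + cores > self.cpu_cores or used_mem + memory_gb > self.memory_gb:
            raise RuntimeError("not enough physical resources left")
        vm = {"name": name, "cores": cores, "memory_gb": memory_gb}
        self.vms.append(vm)   # each allocation behaves like its own isolated computer
        return vm

host = PhysicalHost(cpu_cores=16, memory_gb=64)
host.create_vm("tenant-a", cores=4, memory_gb=16)
host.create_vm("tenant-b", cores=8, memory_gb=32)
print([vm["name"] for vm in host.vms])   # two virtual computers on one physical machine
```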

Interconnection
In 1969, ARPANET (Advanced Research Projects Agency Network) marked the birth of the long-distance Internet. During this time, the foundations were laid for the same "TCP/IP" standards we still use to this day. Using these new tools, researchers (and their computers) could share their resources across vast distances.
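Those same standards are still what programs use to talk to one another. As a small, self-contained sketch (a loopback connection on one machine rather than a long-distance link), Python's standard socket library can open a TCP connection and exchange a few bytes:

```python
import socket
import threading

# Minimal TCP exchange over the loopback interface, using the same TCP/IP
# stack that carries traffic across the wider internet.
srv = socket.create_server(("127.0.0.1", 0))   # port 0 lets the OS pick a free port
host, port = srv.getsockname()

def handle_one_connection():
    conn, _addr = srv.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"echo: " + data)

t = threading.Thread(target=handle_one_connection)
t.start()

with socket.create_connection((host, port)) as client:
    client.sendall(b"hello over TCP/IP")
    print(client.recv(1024).decode())          # -> echo: hello over TCP/IP

t.join()
srv.close()
```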
By 1991, the World Wide Web was released to a world-wide consumer base. With a subscription to an internet service provider, anyone could connect to the World Wide Web. Before the turn of the century, the first theory of cloud computing was defined, creating a foundation for easily scalable servers.
This technology enabled independent computers to be linked together over a network and share their resources towards supporting a unified service. Distributed computing enabled websites to support a hundred thousand users by balancing the workload across multiple computers instead of trying to juggle everyone using one system.
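A load balancer is one simple way to spread that work. The sketch below is a toy with made-up server names: each incoming request goes to whichever server currently has the fewest active requests.

```python
# Toy load balancing: route each request to the least-busy server so no
# single machine has to juggle everyone at once.
servers = {"server-1": 0, "server-2": 0, "server-3": 0}   # active request counts

def route(request_id: str) -> str:
    target = min(servers, key=servers.get)   # pick the least-loaded server
    servers[target] += 1
    print(f"{request_id} -> {target}")
    return target

for i in range(7):
    route(f"request-{i}")
print(servers)   # the work ends up spread roughly evenly
```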

Whenever a service was accessed over the internet, the user could be geo-located and directed to the server closest to them. This provided a better connection, split up the workload and created more space for physical data storage. As a result, individual data centers quickly proliferated across the globe.
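Conceptually, picking the nearest data center is just a distance comparison. The snippet below uses invented region names and coordinates, and ignores real-world routing details, but captures the idea:

```python
import math

# Hypothetical data center regions and their approximate coordinates.
data_centers = {
    "us-east": (39.0, -77.5),
    "eu-west": (53.3, -6.3),
    "ap-south": (19.1, 72.9),
}

def nearest_data_center(user_lat: float, user_lon: float) -> str:
    """Return the region closest to the user (straight-line distance, for illustration)."""
    def distance(region: str) -> float:
        lat, lon = data_centers[region]
        return math.hypot(lat - user_lat, lon - user_lon)
    return min(data_centers, key=distance)

print(nearest_data_center(48.9, 2.4))   # a user near Paris is routed to "eu-west"
```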
Data Centers
By the early 2000s, technology companies had invested heavily into data center infrastructure. Through this model, businesses could forego hosting their own server computers and instead rely on the infrastructure maintained by someone else.
These distributed server farms became increasingly integrated into our everyday lives – personally and professionally. Google and Apple both released services targeting corporations and consumers, offering access to storage space and cloud-based tools. Soon after, Amazon Web Services (AWS) introduced their "cloud" computing and storage services, enabling businesses to rent remotely managed computer resources – like processing power or storage space.
Contemporary cloud computing combines virtualization, distributed computing, and secure internet connections to build vast server farm infrastructures around the globe. By dividing up server clusters into numerous isolated software environments, data centers create the digital equivalent of an apartment building housing multiple tenants.
When contracting out digital infrastructure services, corporations may lose physical ownership of their data because it can be nearly impossible to keep track of. Under this paradigm, many have the potential to shirk their responsibilities for digital stewardship – or how they keep data safe, secure and up-to-date.
Data centers have fomented a great deal of uncertainty about data privacy, unauthorized access and legal compliance: with digital data outside the physical control of a company, how could they ensure their digital security? For better or worse, data centers power much of the internet and have become a driving force behind technological expansion.

Everything as a Service
This vast infrastructure has led to a standardized "cloud computing" service model, offering a spectrum of granularity in what is managed by the client versus the cloud service provider.
Infrastructure as a Service – The practice of renting maintained computer systems, allowing for expansive usage scenarios.
Platform as a Service – Subscription access to software deployment platforms like Docker, allowing rapid deployment.
Software as a Service – Connects users – occasionally on behalf of a corporation – to web-based software, such as office and file-hosting services.
Function as a Service – Singular functions to be completed, typically paid per usage, such as image or data processing.
On one end of the spectrum, the client has complete control over how they utilize their rented hardware; on the other, everything happens behind the scenes.
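At the "function" end of that spectrum, the client supplies nothing but a small piece of code and the provider runs it on demand. A function written for such a service might look roughly like the sketch below – the handler signature is generic, not any particular provider's API:

```python
# A minimal "function as a service" style handler: the provider invokes it
# per request and bills per execution; everything else is managed for you.
def handler(event: dict) -> dict:
    """Image-resize request: receives dimensions, returns scaled ones."""
    width = event["width"]
    height = event["height"]
    scale = event.get("scale", 0.5)
    return {"width": int(width * scale), "height": int(height * scale)}

# Invoked locally it is just a function call; in the cloud, the provider
# would call it in response to an upload or an HTTP request.
print(handler({"width": 1920, "height": 1080, "scale": 0.25}))   # {'width': 480, 'height': 270}
```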

While these technologies were groundbreaking for creating global internet platforms, they have had many impacts on privacy and technology development. You may lose access to your data at any point because a service provider decides they no longer want to host your content. With its current trajectory, cloud computing threatens to completely monopolize and privatize the very infrastructure that powers the Internet. More than 85% of global businesses are expected to adopt a "cloud-first" approach by 2025.
Self-Hosting
Federation is a nascent computing concept that enables services to operate across multiple autonomous servers. By decentralizing the ownership of software, we can ensure that no one entity can unilaterally control the service. The Fediverse – a collection of social networking services – operates using a standardized protocol that allows each service to communicate with the others while maintaining its individuality.
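A bare-bones sketch of the idea follows – far simpler than real protocols such as ActivityPub, with invented domain names – showing how independently-run servers can agree on one message format and relay posts to each other:

```python
# Toy federation: autonomous servers exchange posts using a shared message
# format, so no single server owns the whole network.
class Server:
    def __init__(self, domain: str):
        self.domain = domain
        self.peers = []      # other independently-operated servers
        self.timeline = []   # posts created locally or received from peers

    def federate_with(self, other: "Server"):
        self.peers.append(other)
        other.peers.append(self)

    def publish(self, user: str, text: str):
        message = {"author": f"{user}@{self.domain}", "text": text}
        self.timeline.append(message)
        for peer in self.peers:    # deliver the post to every federated peer
            peer.receive(message)

    def receive(self, message: dict):
        self.timeline.append(message)

a = Server("social.example")
b = Server("photos.example")
a.federate_with(b)
a.publish("alice", "Hello from my own server!")
print(b.timeline)   # the post shows up on the other, independently-run server
```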

By utilizing these same cloud computing technologies, anyone can host their own services – even from home. Linux, Docker and all of the necessary software are open-source and available for free to anyone. When self-hosting your own cloud server, you have the power – and responsibility – to act as an infrastructure, platform and software provider.
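As a tiny first taste of self-hosting – nowhere near a production setup – Python's standard library alone can serve files from a directory on your own machine to anyone on your local network:

```python
# Minimal self-hosted file service: it runs entirely on hardware you own,
# using only open-source software from Python's standard library.
from http.server import HTTPServer, SimpleHTTPRequestHandler

address = ("0.0.0.0", 8080)          # listen on all interfaces, port 8080
server = HTTPServer(address, SimpleHTTPRequestHandler)
print(f"Serving the current directory at http://{address[0]}:{address[1]}/")
server.serve_forever()               # stop with Ctrl+C
```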








