
Common Language

Computers are complex machines that require near-constant communication and synchronization to keep functioning.  These systems consist of numerous modular parts, each containing its own components and core functions.  From the procedure that powers on your computer to the process that controls electron flow within an integrated chip, standards define our global infrastructure and facilitate communication.

Building a global computer network, more commonly known as the Internet, requires a great deal of planning. We need to ensure that computers are built in a way that promotes synergy and harmony – a method for ensuring parts can work together even when created by competitors.  Building upon this foundation, we need to reach an accord about how each of these parts can communicate with the others to create larger systems.

As a collective, we convene to curate standards – mutually agreed-upon definitions of what something is and how it should operate. They are a lingua franca, bridging systems through a common language each party can understand.  Standards set the stage for creation and collaboration on an unrivaled scale by ensuring important information can be shared across a divide.

Coming to an agreement on standards can be a contentious process that is never quite finished.  As computers gained prominence, many corporations vied to create their own proprietary standards in an attempt to corner the market.  This meant that computers and parts created by one manufacturer would not always be compatible with another's.

Everyone from engineers to corporations explored their own ideas about how the Internet should work. Starting in the 1960s, the Protocol Wars polarized researchers and nations alike as we debated how a worldwide web of computers would take shape.  This required finding robust and reliable means for computers to connect across vast distances. It wasn't until the early 1990s that we firmly settled on a standard – by which point the Unix Wars were already underway.

Researchers were learning how computers worked about as quickly as corporations sought to generate a profit from them.  By creating divergent standards, manufacturers often locked consumers into their own hardware ecosystems.

The prevailing Internet standard was not created by any one corporation, but through an organization focused on the creation of open standards.  This means that, instead of being owned by any one party, these standards are freely available for anyone to use so that no one can own the Internet.

Standards

During the first boom in digital computing, we quickly faced the reality that there were many ways to achieve the same end.  Finding a path from one point to another can involve many complexities along the way.  Standards ensure consistency and play an important role in defining expectations.


Interoperability

Interoperability allows individual components or entire systems to work seamlessly with other discrete parts.  Originally a concern of technical information systems, the term has expanded to consider external factors – such as social, political, and power structures.  Interoperability is crucial within technical fields – like computers and networks – for widespread adoption.


Security

Standards often contain best practices that help guide security decisions and provide safety.  This allows us to take steps to protect sensitive data through add-on mechanisms – such as encryption, authentication, and access controls.  By defining expectations, standards can help us avoid the risks caused by unsupported or incompatible systems.

 


Reliability

Standards will often formalize the minimum requirements to ensure safety and continued operation.  This can prevent accidents and injuries caused by product failures, as well as provide assurance that the system will keep functioning.

These are an important part of global systems like agriculture, communication, science, and technology. By defining measurements, symbols, and terms, we can more easily have conversations across languages and geographical boundaries.

The metric system is an example of a standardized system – what you consider a kilogram is the same as what everyone else does.  Such systems form a foundation on which other standards can be built, enabling larger systems with more complexity.

Technical standards set the rules for building computer components that work together and networks that can seamlessly share information over great distances.  They cover a broad area from dimensions and processes all the way to safety and performance requirements.  Standards are used for creating both hardware and software.

For standards to be used and become widely accepted, they need to be formalized, with the technical aspects thoroughly documented.  They are often maintained by organizations that focus on their continued development.  How complex a standard is will vary based on the specific problem it is addressing.

The prominence of a standard can be greatly affected by how it is implemented:


By Regulation

These de jure standards have been created by officially recognized standards organizations – such as IEEE, ISO, and ANSI.  These standards help ensure global collaboration by defining important terms.

 

For example: the Internet, websites, email, mobile broadband, and cellphone service

 


By Convention

These de facto standards are not created by any official organization, so they are not strictly required.  They are often created by private manufacturers and have gained widespread adoption among consumers.

 

For example: PDFs, MP3s, HDMI cables, USB connectors, PCIe expansion slots, Bluetooth, and 3.5mm audio cables.

Not all standards are created equal, and they come from many different entities. They can be proprietary "closed" standards or community-focused open standards.  Each fills its own niche, coming with its own intentions and complexities.

Some standards are maintained by a private corporation and are not available to the general public.  Generally, these cost a fee or require some other contribution before they can be used.  While this may increase profit, adoption may be limited because of the additional steps and increased cost. 

This has the potential to create perverse incentives for the standard's owner to focus solely on generating profit. Membership can give corporations a voice in the standard's development and a means to push back against this.

HDMI, Bluetooth and USB are examples of "closed" standards that require licensing to claim compliance.

This process may drive manufacturers to alternative open standards that are community-developed. All necessary documentation and specifications are openly available to the public.  These also generally come with an open license that leaves the door open to extensibility and new features.  Depending on how they are implemented, there may be some variation from the standard as written.

The Internet and mobile broadband are examples of open standards.

 


Specifications

In essence, a standard is a guideline for achieving uniformity, efficiency, and quality in products, processes, or services. A specification details the requirements, dimensions, and materials for a specific product or process. 

Standards often include specifications. For example, a standard for a brick might include specifications for its dimensions and materials.

Specification:

  • Purpose: Specifications detail the requirements and characteristics of a product, service, or process.
  • Scope: Specifications can be very detailed, outlining everything from dimensions and materials to performance criteria and testing procedures.
  • Example: A technical specification for a computer component might include its dimensions, power requirements, and operating temperature range (see the sketch just after this list).
  • Key Characteristics: Specifications are often used in contracts and are crucial for ensuring that products and services meet the required standards.
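To make the example above concrete, here is a minimal sketch in Python of how a component specification's requirements might be written down as a checkable data structure. The field names and limits are invented for illustration and are not drawn from any real specification.

```python
from dataclasses import dataclass

# Hypothetical limits from an imaginary component specification.
MAX_LENGTH_MM = 110.0
MAX_POWER_W = 25.0
TEMP_RANGE_C = (0.0, 70.0)

@dataclass
class ComponentSample:
    """Measured properties of one manufactured unit."""
    length_mm: float
    power_draw_w: float
    operating_temp_c: float

def conforms(sample: ComponentSample) -> bool:
    """Return True if the unit satisfies every requirement in the spec."""
    low, high = TEMP_RANGE_C
    return (
        sample.length_mm <= MAX_LENGTH_MM
        and sample.power_draw_w <= MAX_POWER_W
        and low <= sample.operating_temp_c <= high
    )

print(conforms(ComponentSample(100.0, 18.5, 45.0)))  # True: within all limits
print(conforms(ComponentSample(120.0, 18.5, 45.0)))  # False: too long
```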

https://en.m.wikipedia.org/wiki/Open_specifications

An open specification is a specification created and controlled, in an open process, by an association or a standardization body intending to achieve interoperability and interchangeability. An open specification is not controlled by a single company or individual or by a group with discriminatory membership criteria. Copies of Open Specifications are available free of charge or for a moderate fee and can be implemented under reasonable and non-discriminatory licensing (RAND) terms by all interested parties.

A specification often refers to a set of documented requirements to be satisfied by a material, design, product, or service.[1] A specification is often a type of technical standard.

There are different types of technical or engineering specifications (specs), and the term is used differently in different technical contexts. They often refer to particular documents, and/or particular information within them. The word specification is broadly defined as "to state explicitly or in detail" or "to be specific".

A requirement specification is a documented requirement, or set of documented requirements, to be satisfied by a given material, design, product, service, etc.[1] It is a common early part of engineering design and product development processes in many fields.

A design or product specification describes the features of the solutions for the Requirement Specification, referring to either a designed solution or final produced solution. It is often used to guide fabrication/production. Sometimes the term specification is here used in connection with a data sheet (or spec sheet), which may be confusing. A data sheet describes the technical characteristics of an item or product, often published by a manufacturer to help people choose or use the products. A data sheet is not a technical specification in the sense of informing how to produce.

Specifications are a type of technical standard that may be developed by any of various kinds of organizations, in both the public and private sectors. Example organization types include a corporation, a consortium (a small group of corporations), a trade association (an industry-wide group of corporations), a national government (including its different public entities, regulatory agencies, and national laboratories and institutes), a professional association (society), a purpose-made standards organization such as ISO, or vendor-neutral developed generic requirements. It is common for one organization to refer to (reference, call out, cite) the standards of another. Voluntary standards may become mandatory if adopted by a government or business contract.

https://en.m.wikipedia.org/wiki/Single_UNIX_Specification

The Single UNIX Specification (SUS) is a standard for computer operating systems,[1][2] compliance with which is required to qualify for using the "UNIX" trademark. The standard specifies programming interfaces for the C language, a command-line shell, and user commands. The core specifications of the SUS known as Base Specifications are developed and maintained by the Austin Group, which is a joint working group of IEEE, ISO/IEC JTC 1/SC 22/WG 15 and The Open Group. If an operating system is submitted to The Open Group for certification and passes conformance tests, then it is deemed to be compliant with a UNIX standard such as UNIX 98 or UNIX 03.

Very few BSD and Linux-based operating systems are submitted for compliance with the Single UNIX Specification, although system developers generally aim for compliance with POSIX standards, which form the core of the Single UNIX Specification.
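As a small illustration of what "programming interfaces" means here, the sketch below uses Python's standard os module, which wraps many of the POSIX file-descriptor calls (open, write, close) that the Base Specifications describe. The file name is arbitrary, and os.uname() is only available on POSIX-style systems.

```python
import os

# os.open/os.write/os.close wrap the POSIX file-descriptor interfaces
# standardized by POSIX and the Single UNIX Specification's Base Specifications.
fd = os.open("example.txt", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
try:
    os.write(fd, b"written through POSIX-style interfaces\n")
finally:
    os.close(fd)

# os.uname() mirrors the POSIX uname() call (POSIX systems only).
print(os.uname().sysname)
```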

Specification need

In many contexts, particularly software, specifications are needed to avoid errors due to lack of compatibility, for instance, in interoperability issues.

For instance, when two applications share Unicode data, but use different normal forms or use them incorrectly, in an incompatible way or without sharing a minimum set of interoperability specification, errors and data loss can result. For example, Mac OS X has many components that prefer or require only decomposed characters (thus decomposed-only Unicode encoded with UTF-8 is also known as "UTF8-MAC"). In one specific instance, the combination of OS X errors handling composed characters, and the samba file- and printer-sharing software (which replaces decomposed letters with composed ones when copying file names), has led to confusing and data-destroying interoperability problems.[33][34]
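A minimal sketch of the underlying issue, using Python's standard unicodedata module: a precomposed "é" and its decomposed equivalent are different code point sequences, so they compare unequal until both sides agree on a normalization form.

```python
import unicodedata

composed = "café"                                     # 'é' as one precomposed code point (U+00E9)
decomposed = unicodedata.normalize("NFD", composed)   # 'e' followed by combining acute (U+0301)

print(composed == decomposed)        # False: different code point sequences
print(composed.encode("utf-8"))      # b'caf\xc3\xa9'
print(decomposed.encode("utf-8"))    # b'cafe\xcc\x81'

# Agreeing on a normalization form (here NFC) restores interoperability.
print(unicodedata.normalize("NFC", decomposed) == composed)  # True
```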

Protocols

Protocols are often defined by specifications. For instance, a communication protocol standard might have a specification document outlining the rules and formats for message exchange.
Standards can also influence the development of protocols and specifications.

Protocol:

  • Purpose: Protocols are the rules that govern how information is exchanged, ensuring that different systems can communicate effectively.
  • Scope: Protocols primarily deal with communication, covering aspects like data formatting, error checking, and message sequencing.
  • Example: TCP/IP (Transmission Control Protocol/Internet Protocol) is a fundamental suite of protocols that governs how data is transmitted over the Internet (see the sketch just after this list).
  • Key Characteristics: Protocols are often implemented in software and hardware and are essential for network communication.
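Since the list above points to TCP/IP as its example, here is a minimal sketch of two endpoints exchanging a message over TCP on the loopback interface, using Python's standard socket and threading modules. The port number and message contents are arbitrary.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007      # loopback address; port chosen arbitrarily
ready = threading.Event()

def echo_server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                   # signal that the server is accepting connections
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)    # TCP delivers the bytes reliably and in order
            conn.sendall(data.upper())

threading.Thread(target=echo_server, daemon=True).start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello over tcp/ip")
    print(cli.recv(1024))             # b'HELLO OVER TCP/IP'
```

Because both endpoints rely on the same protocol suite, the sender and receiver here could be written in different languages or run on different operating systems and still interoperate.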

Protocols define how data is sent, received, and processed, while standards ensure that various technologies are compatible with each other. This coordination is critical for the Internet and other networks to function consistently and efficiently.

In essence, a protocol is a set of rules that govern how data is transmitted and received, facilitating communication between systems – sitting alongside the standards and specifications described above.

https://en.m.wikipedia.org/wiki/Communication_protocol

A communication protocol is a system of rules that allows two or more entities of a communications system to transmit information via any variation of a physical quantity. The protocol defines the rules, syntax, semantics, and synchronization of communication and possible error recovery methods. Protocols may be implemented by hardware, software, or a combination of both.[1]

Communicating systems use well-defined formats for exchanging various messages. Each message has an exact meaning intended to elicit a response from a range of possible responses predetermined for that particular situation. The specified behavior is typically independent of how it is to be implemented. Communication protocols have to be agreed upon by the parties involved.[2] To reach an agreement, a protocol may be developed into a technical standard. A programming language describes the same for computations, so there is a close analogy between protocols and programming languages: protocols are to communication what programming languages are to computations.[3] An alternate formulation states that protocols are to communication what algorithms are to computation.[4]

Multiple protocols often describe different aspects of a single communication. A group of protocols designed to work together is known as a protocol suite; when implemented in software they are a protocol stack.

Internet communication protocols are published by the Internet Engineering Task Force (IETF). The IEEE (Institute of Electrical and Electronics Engineers) handles wired and wireless networking and the International Organization for Standardization (ISO) handles other types. The ITU-T handles telecommunications protocols and formats for the public switched telephone network (PSTN). As the PSTN and Internet converge, the standards are also being driven towards convergence.

A protocol is a set of rules that determines how data is sent and received over a network. A protocol is like a language that computers use to talk to each other, ensuring they understand and can respond to each other's messages correctly. Protocols help make sure that data moves smoothly and securely between devices on a network.

To make communication between devices successful, some rules and procedures must be agreed upon at the sending and receiving ends of the system. Such rules and procedures are called protocols. Different types of protocols are used for different types of communication.

  • Syntax: Syntax refers to the structure or format of the data exchanged between devices, including the type of data, the composition of the message, and its sequencing. For example, a simple protocol might treat the first 8 bits as the sender's address, the next 8 bits as the receiver's address, and the remaining bits as the message itself (a toy framing sketch follows this list).
  • Semantics: Semantics defines the meaning of the data transmitted between devices. It provides rules and norms for interpreting message or data element values and the actions to take.
  • Timing: Timing refers to the synchronization and coordination between devices while transferring data – when data should be sent and how fast it can be sent. For example, if a sender transmits at 100 Mbps but the receiver can only handle 1 Mbps, the receiver will be overwhelmed and lose data. Timing helps prevent data loss, collisions, and other timing-related issues.
  • Sequence Control: Sequence control ensures the proper ordering of data packets. Its main responsibilities are acknowledging data as it is received and retransmitting data that is lost, so that data is delivered in the correct order.
  • Flow Control: Flow control regulates how quickly data is delivered. It limits the sender's rate, or asks the receiver whether it is ready for more, to prevent congestion and loss.
  • Error Control: Error control mechanisms detect and correct data transmission faults. They include error-detection codes, retransmission, and error recovery, maintaining data integrity in the face of noise, interference, and other problems.
  • Security: Security mechanisms protect the confidentiality, integrity, and authenticity of data. They include encryption, authentication, access control, and other procedures that preserve the privacy and trustworthiness of network communication.
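To illustrate the syntax element above, here is a toy framing sketch in Python: one byte of sender address, one byte of receiver address, then the payload. The layout is invented for illustration – the point is that both ends must agree on it for the bytes to be interpreted correctly.

```python
import struct

# Toy frame layout: 1 byte sender address, 1 byte receiver address, then payload.
def pack_frame(sender: int, receiver: int, payload: bytes) -> bytes:
    return struct.pack("!BB", sender, receiver) + payload

def unpack_frame(frame: bytes) -> tuple[int, int, bytes]:
    sender, receiver = struct.unpack("!BB", frame[:2])
    return sender, receiver, frame[2:]

frame = pack_frame(0x0A, 0x1F, b"hello")
print(frame.hex())          # 0a1f68656c6c6f
print(unpack_frame(frame))  # (10, 31, b'hello')
```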


In essence, standards provide the overall framework, specifications detail the requirements within that framework, and protocols define the rules for communication within that framework.

Interface

In computing, an interface is a point of interaction or connection between different systems, components, or software, while a standard is a rule or specification that defines how these interactions should occur. Interfaces define what can be done, while standards define how it should be done.

Relationship:

  • Standards often define the specifications for interfaces. For example, the USB standard specifies the physical and electrical characteristics of the USB interface.
  • Interfaces can be designed to conform to specific standards, ensuring interoperability and compatibility.

In essence, an interface is the "what" (the connection point), and a standard is the "how" (the rules for that connection). 
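A minimal software-level sketch of that distinction, in Python: the abstract class plays the role of the interface (the "what"), and each concrete class supplies one way of meeting it (the "how"). The class and method names are invented for illustration.

```python
from abc import ABC, abstractmethod

class Storage(ABC):
    """The interface: the operations callers may rely on."""

    @abstractmethod
    def save(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def load(self, key: str) -> bytes: ...

class InMemoryStorage(Storage):
    """One implementation: how those operations are actually carried out."""

    def __init__(self) -> None:
        self._items: dict[str, bytes] = {}

    def save(self, key: str, data: bytes) -> None:
        self._items[key] = data

    def load(self, key: str) -> bytes:
        return self._items[key]

def backup(store: Storage) -> None:
    # Callers are written against the interface, not any particular implementation.
    store.save("config", b"example contents")
    print(store.load("config"))

backup(InMemoryStorage())
```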

https://en.m.wikipedia.org/wiki/Interface_(computing)

In computing, an interface (American English) or interphase (British English, archaic) is a shared boundary across which two or more separate components of a computer system exchange information. The exchange can be between software, computer hardware, peripheral devices, humans, and combinations of these.[1] Some computer hardware devices, such as a touchscreen, can both send and receive data through the interface, while others such as a mouse or microphone may only provide an interface to send data to a given system.[2]

https://en.m.wikipedia.org/wiki/Hardware_interface

Hardware interfaces exist in many components, such as the various buses, storage devices, other I/O devices, etc. A hardware interface is described by the mechanical, electrical, and logical signals at the interface and the protocol for sequencing them (sometimes called signaling).[3] A standard interface, such as SCSI, decouples the design and introduction of computing hardware, such as I/O devices, from the design and introduction of other components of a computing system, thereby allowing users and manufacturers great flexibility in the implementation of computing systems.[3] Hardware interfaces can be parallel with several electrical connections carrying parts of the data simultaneously or serial where data are sent one bit at a time.[4]

https://en.m.wikipedia.org/wiki/ACPI

Advanced Configuration and Power Interface (ACPI) is an open standard that operating systems can use to discover and configure computer hardware components, to perform power management (e.g. putting unused hardware components to sleep), auto configuration (e.g. Plug and Play and hot swapping), and status monitoring. It was first released in December 1996. ACPI aims to replace Advanced Power Management (APM), the MultiProcessor Specification, and the Plug and Play BIOS (PnP) Specification.[1] ACPI brings power management under the control of the operating system, as opposed to the previous BIOS-centric system that relied on platform-specific firmware to determine power management and configuration policies.[2] The specification is central to the Operating System-directed configuration and Power Management (OSPM) system. ACPI defines hardware abstraction interfaces between the device's firmware (e.g. BIOS, UEFI), the computer hardware components, and the operating systems.[3][4]

https://en.m.wikipedia.org/wiki/Application_binary_interface

https://en.m.wikipedia.org/wiki/Application_programming_interface

A software interface may refer to a wide range of different types of interfaces at different "levels". For example, an operating system may interface with pieces of hardware. Applications or programs running on the operating system may need to interact via data streams, filters, and pipelines.[5] In object oriented programs, objects within an application may need to interact via methods.[6]

In practice

A key principle of design is to prohibit access to all resources by default, allowing access only through well-defined entry points, i.e., interfaces.[7] Software interfaces provide access to computer resources (such as memory, CPU, storage, etc.) of the underlying computer system; direct access (i.e., not through well-designed interfaces) to such resources by software can have major ramifications—sometimes disastrous ones—for functionality and stability.[citation needed]
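A minimal sketch of that principle in Python: the object's internal state is reachable only through its defined entry points. The class and method names are invented for illustration, and the leading underscore is a Python convention rather than enforced access control.

```python
class Account:
    """Internal state is exposed only through well-defined entry points."""

    def __init__(self, balance: int = 0) -> None:
        self._balance = balance          # underscore marks the field as internal

    def deposit(self, amount: int) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    @property
    def balance(self) -> int:
        # Read-only access: callers cannot assign an arbitrary balance directly.
        return self._balance

acct = Account()
acct.deposit(100)
print(acct.balance)   # 100
# acct.balance = -1   # would raise AttributeError: the property has no setter
```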