Common Language
Computers are complex machines that require near-constant communication and synchronization to continue functioning. These systems consist of numerous modular parts, each containing its own components and core functions. From the procedure that powers on your computer to the process that controls electron flow within an integrated circuit, standards are what define our global infrastructure and facilitate communication.
Building a global computer network, more commonly known as the Internet, requires a great deal of planning. We need to ensure that computers are created in a way that promotes synergy and harmony – a method for ensuring parts can work together even when created by competitors. Building upon this foundation, we need to reach an accord about how each of these parts can communicate with the others to create larger systems.
As a collective, we convene to curate standards – or a mutually agreed upon definition about something and how it should operate. They are a lingua franca, allowing us to bridge systems through common language each party can understand. Standards help set the stage for creation and collaboration on an unrivaled scale by ensuring the ability to share important information across a divide.
Coming to an agreement on standards can be a contentious process that is never quite finished. As computers gained prominence, many corporations vied to create their own proprietary standards in an attempt to corner the market. This meant that computers and parts created by one manufacturer would not always be compatible with another's.
Everyone from engineers to corporations explored their own ideas about how the Internet should work. Starting in the 1960s, the Protocol Wars polarized researchers and nations alike as we debated how a worldwide web of computers would take shape. This required finding robust and reliable means for computers to connect across vast distances. It wasn't until the early 1990s that we firmly agreed on a standard – even as the Unix Wars raged on.
Researchers were learning how computers worked about as quickly as corporations sought to generate a profit from them. By creating divergent standards, manufacturers often locked consumers into their hardware ecosystems.
The prevailing Internet standard was not created by any one corporation, but through an organization focused on the creation of open standards. This means that, instead of being owned by any one party, these standards are freely available for anyone to use so that no one can own the Internet.
Standards
During the first boom in digital computing, we quickly faced the reality that there were many ways to achieve the same end. Finding a path from one point to another can involve many complexities along the way. Standards ensure consistency and play an important role in defining expectations.
- Interoperability: Allows individual components or entire systems to work seamlessly with other discrete parts. Originally intended for technical information systems, the concept has expanded to consider external factors – such as social, political, and power structures. Interoperability is crucial to the adoption and overall experience of fields like computers and networks.
- Security: Standards often contain best practices that help guide security decisions and provide safety. They allow us to take steps to protect sensitive data through add-on mechanisms – such as encryption, authentication, and access controls. By defining expectations, standards can help us avoid the risks caused by unsupported or incompatible systems.
- Reliability: Technical standards establish minimum performance and safety requirements for products and services, helping to prevent accidents, injuries, and product failures. For example, standards for electrical wiring, fire safety, and medical devices are essential for protecting public health and safety.
Standards are an important part of global systems like agriculture, communication, science, and technology. By defining measurements, symbols, and terms, we can more easily hold conversations across languages and geographical boundaries.
The metric system is an example of a standardized system – what you consider a kilogram is the same as what everyone else does. Systems like these form a foundation on which other standards can be built, enabling larger systems with more complexity.
Technical standards set the rules for building computer components that work together and networks that can seamlessly share information over great distances. They cover a broad area from dimensions and processes all the way to safety and performance requirements.
For standards to be used and become widely accepted, they need to be formalized, with their technical aspects thoroughly documented. They are often maintained by organizations that focus on their continued development. The complexity of a standard varies with the specific problem it is trying to address.
The prominence of a standard can be greatly affected by how it is implemented:
- De jure standards are created by officially recognized standards organizations – such as IEEE, ISO, and ANSI. These standards help ensure global collaboration by defining important terms. For example: the Internet, websites, email, mobile broadband, and cellphone service.
- De facto standards are not created by any official organization, so they are not strictly required. They are often created by private manufacturers and have gained widespread adoption among consumers. For example: PDFs, MP3s, HDMI cables, USB connectors, PCIe expansion ports, Bluetooth, and 3.5mm audio cables.
A standard can be a closed standard or an open standard. The documentation for an open standard is open to the public, and anyone can create software that implements and uses the standard. The documentation and specification for closed standards are not available to the public, enabling their developer to sell and license the code to manage their data format to other interested software developers. While this process increases the revenue potential for a useful file format, it may limit acceptance and drive the adoption of a similar, open standard instead.[6]
Private standards typically require a financial contribution in the form of an annual fee from the organizations that adopt the standard. Corporations are encouraged to join the board of governance of the standard owner,[24] which enables reciprocity: corporations gain permission to exert influence over the requirements in the standard, and in return they promote the standard in their supply chains, which generates revenue and profit for the standard owner. Financial incentives with private standards can result in a perverse incentive, where some private standards are created solely with the intent of generating money. BRCGS, as scheme owner of private standards, was acquired in 2016 by LGC Ltd, which was owned by private equity company Kohlberg Kravis Roberts.[25] This acquisition triggered substantial increases in BRCGS annual fees.[26] In 2019, LGC Ltd was sold to private equity companies Cinven and Astorg.[27]
https://en.m.wikipedia.org/wiki/Open_standard
An open standard is a standard that is openly accessible and usable by anyone. It is also a common prerequisite that open standards use an open license that provides for extensibility. Typically, anybody can participate in their development due to their inherently open nature. There is no single definition, and interpretations vary with usage. Examples of open standards include the GSM, 4G, and 5G standards that allow most modern mobile phones to work world-wide.
In computer network engineering, an Internet Standard is a normative specification of a technology or methodology applicable to the Internet. Internet Standards are created and published by the Internet Engineering Task Force (IETF). They allow interoperation of hardware and software from different sources which allows internets to function.[1] As the Internet became global, Internet Standards became the lingua franca of worldwide communications.[2]
https://en.m.wikipedia.org/wiki/Software_standard
A software standard is a standard, protocol, or other common format of a document, file, or data transfer accepted and used by one or more software developers while working on one or more computer programs. Software standards enable interoperability between different programs created by different developers.
Representatives from standards organizations, like W3C[4] and ISOC,[5] collaborate on how to make a unified software standard to ensure seamless communication between software applications. These organisations consist of groups of larger software companies like Microsoft and Apple Inc.
Hardware
- Peripheral Component Interconnect (PCI) (a specification by Intel Corporation for plug-in boards to IBM-architecture PCs)
- Accelerated Graphics Port (AGP) (a specification by Intel Corporation for plug-in boards to IBM-architecture PCs)
- PCI Industrial Computer Manufacturers Group (PICMG) (an industry consortium developing Open Standards specifications for computer architectures)
- Synchronous dynamic random-access memory (SDRAM) and its DDR SDRAM variants (by JEDEC Solid State Technology Association)
- Universal Serial Bus (USB) (by USB Implementers Forum)
- Internet Protocol (IP) (a specification of the IETF for transmitting packets of data on a network – specifically, IETF RFC 791)
- MQTT (Message Queuing Telemetry Transport) (a lightweight, publish-subscribe network protocol that transports messages between devices)
- Transmission Control Protocol (TCP) (a specification of the IETF for implementing streams of data on top of IP – specifically, IETF RFC 793; see the sketch after this list)
- PCI Express (PCIe) (an electrical and mechanical interface and interconnect protocol used in computers, servers, and industrial applications)
- HDMI, DisplayPort, and VGA for video; RS-232 for low-bandwidth serial communication
- USB for high-speed serial interfaces in computers and for powering or charging low-power external devices (like mobile phones, headphones, and portable hard drives), usually using a micro-USB plug and socket
- 3.5-inch and 2.5-inch hard drive form factors
- ATX motherboard, backplane, and power standards
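Because IP and TCP are open IETF standards, independently developed implementations can interoperate. As a minimal sketch, the snippet below uses Python's standard socket module to pass a few bytes over a TCP connection on the loopback interface; the port number is an arbitrary choice for illustration.

```python
import socket

HOST, PORT = "127.0.0.1", 50007   # loopback address; arbitrary example port

# A listening socket and a client socket, both speaking standard TCP.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind((HOST, PORT))
    server.listen(1)
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect((HOST, PORT))      # TCP three-way handshake
        conn, _addr = server.accept()
        with conn:
            client.sendall(b"hello, standards")
            data = conn.recv(1024)        # bytes arrive intact and in order

print(data)  # b'hello, standards'
```

Any conforming TCP implementation – on any operating system or hardware – could sit on either end of this exchange, which is precisely what the standard guarantees.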
Specifications
In essence, a standard is a guideline for achieving uniformity, efficiency, and quality in products, processes, or services. A specification details the requirements, dimensions, and materials for a specific product or process.
Standards often include specifications. For example, a standard for a brick might include specifications for its dimensions and materials.
Specification:
- Purpose: Specifications detail the requirements and characteristics of a product, service, or process.
- Scope: Specifications can be very detailed, outlining everything from dimensions and materials to performance criteria and testing procedures.
- Example: A technical specification for a computer component might include its dimensions, power requirements, and operating temperature range.
- Key Characteristics: Specifications are often used in contracts and are crucial for ensuring that products and services meet the required standards.
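To make the example above concrete, here is a minimal sketch of such a component specification expressed as code; every field name and number is hypothetical, chosen only to illustrate dimensions, power requirements, and an operating temperature range.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ComponentSpec:
    """Hypothetical specification for a computer component."""
    width_mm: float
    height_mm: float
    max_power_w: float
    temp_range_c: tuple[float, float]   # (min, max) operating temperature

def conforms(spec: ComponentSpec, measured_power_w: float, ambient_c: float) -> bool:
    """Check a measured unit against the documented requirements."""
    low, high = spec.temp_range_c
    return measured_power_w <= spec.max_power_w and low <= ambient_c <= high

spec = ComponentSpec(width_mm=312.0, height_mm=111.0,
                     max_power_w=75.0, temp_range_c=(0.0, 85.0))
print(conforms(spec, measured_power_w=60.0, ambient_c=25.0))  # True
```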
https://en.m.wikipedia.org/wiki/Open_specifications
An open specification is a specification created and controlled, in an open process, by an association or a standardization body intending to achieve interoperability and interchangeability. An open specification is not controlled by a single company or individual or by a group with discriminatory membership criteria. Copies of Open Specifications are available free of charge or for a moderate fee and can be implemented under reasonable and non-discriminatory licensing (RAND) terms by all interested parties.
A specification often refers to a set of documented requirements to be satisfied by a material, design, product, or service.[1] A specification is often a type of technical standard.
There are different types of technical or engineering specifications (specs), and the term is used differently in different technical contexts. They often refer to particular documents, and/or particular information within them. The word specification is broadly defined as "to state explicitly or in detail" or "to be specific".
A requirement specification is a documented requirement, or set of documented requirements, to be satisfied by a given material, design, product, service, etc.[1] It is a common early part of engineering design and product development processes in many fields.
A design or product specification describes the features of the solutions for the Requirement Specification, referring to either a designed solution or final produced solution. It is often used to guide fabrication/production. Sometimes the term specification is here used in connection with a data sheet (or spec sheet), which may be confusing. A data sheet describes the technical characteristics of an item or product, often published by a manufacturer to help people choose or use the products. A data sheet is not a technical specification in the sense of informing how to produce.
Specifications are a type of technical standard that may be developed by any of various kinds of organizations, in both the public and private sectors. Example organization types include a corporation, a consortium (a small group of corporations), a trade association (an industry-wide group of corporations), a national government (including its different public entities, regulatory agencies, and national laboratories and institutes), a professional association (society), a purpose-made standards organization such as ISO, or vendor-neutral developed generic requirements. It is common for one organization to refer to (reference, call out, cite) the standards of another. Voluntary standards may become mandatory if adopted by a government or business contract.
https://en.m.wikipedia.org/wiki/Single_UNIX_Specification
The Single UNIX Specification (SUS) is a standard for computer operating systems,[1][2] compliance with which is required to qualify for using the "UNIX" trademark. The standard specifies programming interfaces for the C language, a command-line shell, and user commands. The core specifications of the SUS known as Base Specifications are developed and maintained by the Austin Group, which is a joint working group of IEEE, ISO/IEC JTC 1/SC 22/WG 15 and The Open Group. If an operating system is submitted to The Open Group for certification and passes conformance tests, then it is deemed to be compliant with a UNIX standard such as UNIX 98 or UNIX 03.
Very few BSD and Linux-based operating systems are submitted for compliance with the Single UNIX Specification, although system developers generally aim for compliance with POSIX standards, which form the core of the Single UNIX Specification.
The need for specifications
In many contexts, particularly software, specifications are needed to avoid errors due to lack of compatibility, for instance, in interoperability issues.
For instance, when two applications share Unicode data but use different normal forms, use them incorrectly, or fail to share a minimum set of interoperability specifications, errors and data loss can result. For example, Mac OS X has many components that prefer or require only decomposed characters (thus decomposed-only Unicode encoded with UTF-8 is also known as "UTF8-MAC"). In one specific instance, the combination of OS X's error-prone handling of composed characters and the samba file- and printer-sharing software (which replaces decomposed letters with composed ones when copying file names) has led to confusing and data-destroying interoperability problems.[33][34]
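Python's standard unicodedata module can reproduce the mismatch described above: the same word in composed (NFC) and decomposed (NFD) form compares unequal until both sides agree on a normalization form.

```python
import unicodedata

# "é" can be encoded two ways: one composed code point (NFC),
# or "e" plus a combining accent (NFD).
composed = "caf\u00e9"              # NFC: é as U+00E9
decomposed = "cafe\u0301"           # NFD: e + U+0301 combining acute

print(composed == decomposed)       # False: the code points differ
print(composed.encode("utf-8"))     # b'caf\xc3\xa9'
print(decomposed.encode("utf-8"))   # b'cafe\xcc\x81'

# Normalizing both sides to the same form restores equality.
nfc = unicodedata.normalize("NFC", decomposed)
print(composed == nfc)              # True
```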
Protocols
Protocols are often defined by specifications. For instance, a communication protocol standard might have a specification document outlining the rules and formats for message exchange.
Standards can also influence the development of protocols and specifications.
Protocol:
- Purpose: Protocols are the rules that govern how information is exchanged, ensuring that different systems can communicate effectively.
- Scope: Protocols primarily deal with communication, covering aspects like data formatting, error checking, and message sequencing.
- Example: TCP/IP (Transmission Control Protocol/Internet Protocol) is a fundamental suite of protocols that governs how data is transmitted over the internet.
- Key Characteristics: Protocols are often implemented in software and hardware and are essential for network communication.
Protocols define how data is sent, received, and processed, while standards ensure that various technologies are compatible with each other. This coordination is critical for the Internet and other networks to function consistently and efficiently.
In essence, a standard is a guideline for achieving uniformity, efficiency, and quality in products, processes, or services. A protocol is a set of rules that govern how data is transmitted and received, facilitating communication between systems. A specification details the requirements, dimensions, and materials for a specific product or process.
https://en.m.wikipedia.org/wiki/Communication_protocol
A communication protocol is a system of rules that allows two or more entities of a communications system to transmit information via any variation of a physical quantity. The protocol defines the rules, syntax, semantics, and synchronization of communication and possible error recovery methods. Protocols may be implemented by hardware, software, or a combination of both.[1]
Communicating systems use well-defined formats for exchanging various messages. Each message has an exact meaning intended to elicit a response from a range of possible responses predetermined for that particular situation. The specified behavior is typically independent of how it is to be implemented. Communication protocols have to be agreed upon by the parties involved.[2] To reach an agreement, a protocol may be developed into a technical standard. A programming language describes the same for computations, so there is a close analogy between protocols and programming languages: protocols are to communication what programming languages are to computations.[3] An alternate formulation states that protocols are to communication what algorithms are to computation.[4]
Multiple protocols often describe different aspects of a single communication. A group of protocols designed to work together is known as a protocol suite; when implemented in software they are a protocol stack.
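A toy illustration of that layering, with each layer prepending its own header to the payload from the layer above – the header fields here are invented for the sketch and are not real TCP/IP formats:

```python
# Each layer wraps the payload from the layer above, the way a
# protocol stack encapsulates messages on the way down the stack.
def app_layer(message: str) -> bytes:
    return message.encode("utf-8")

def transport_layer(payload: bytes, port: int) -> bytes:
    return port.to_bytes(2, "big") + payload        # prepend a port "header"

def network_layer(segment: bytes, address: int) -> bytes:
    return address.to_bytes(4, "big") + segment     # prepend an address "header"

packet = network_layer(transport_layer(app_layer("hi"), port=80),
                       address=0x7F000001)
print(packet.hex())   # headers nested outside-in: address, port, payload
```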
Internet communication protocols are published by the Internet Engineering Task Force (IETF). The IEEE (Institute of Electrical and Electronics Engineers) handles wired and wireless networking and the International Organization for Standardization (ISO) handles other types. The ITU-T handles telecommunications protocols and formats for the public switched telephone network (PSTN). As the PSTN and Internet converge, the standards are also being driven towards convergence.
A protocol is a set of rules that determines how data is sent and received over a network. A protocol is like a language that computers use to talk to each other, ensuring they understand and can respond to each other's messages correctly. Protocols help make sure that data moves smoothly and securely between devices on a network.
To make communication successful between devices, certain rules and procedures must be agreed upon at the sending and receiving ends of the system. Such rules and procedures are called protocols. Different types of protocols are used for different types of communication, and each addresses several concerns:
- Syntax: Syntax refers to the structure or format of the data exchanged between devices, including the type of data, the composition of the message, and its sequencing. In a simple illustrative format, the first 8 bits might be treated as the sender's address, the next 8 bits as the receiver's address, and the remaining bits as the message itself.
- Semantics: Semantics defines the meaning of the data transmitted between devices. It provides the rules and norms for interpreting message or data element values and for acting on them.
- Timing: Timing refers to the synchronization and coordination between devices while transferring data – when data should be sent and how fast it can be sent. For example, if a sender transmits at 100 Mbps but the receiver can only handle 1 Mbps, the receiver will overflow and lose data. Timing helps prevent data loss, collisions, and other timing-related issues.
- Sequence Control: Sequence control ensures the proper ordering of data packets by acknowledging data as it is received and retransmitting data that is lost. Through this mechanism, data is delivered in the correct order.
- Flow Control: Flow control regulates the rate of data delivery, limiting how much the sender transmits or asking the receiver whether it is ready for more. Flow control prevents data congestion and loss.
- Error Control: Error control mechanisms detect and fix data transmission faults using error-detection codes, retransmission, and error recovery. Error control detects and corrects problems caused by noise, interference, and other faults to maintain data integrity.
- Security: Network security protects data confidentiality, integrity, and authenticity through encryption, authentication, access control, and other security procedures. Security standards protect the privacy and trustworthiness of network communication.
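A toy wire format can make several of these concerns concrete. The sketch below invents a frame layout – a sequence number and length field for syntax and sequence control, plus a CRC-32 checksum for error control; every field size is an arbitrary choice for illustration.

```python
import struct
import zlib

# Frame: 4-byte sequence number, 2-byte payload length, payload, CRC-32.
HEADER = struct.Struct("!I H")      # network byte order, 6 bytes total

def encode(seq: int, payload: bytes) -> bytes:
    body = HEADER.pack(seq, len(payload)) + payload
    return body + struct.pack("!I", zlib.crc32(body))

def decode(frame: bytes) -> tuple[int, bytes]:
    body, (crc,) = frame[:-4], struct.unpack("!I", frame[-4:])
    if zlib.crc32(body) != crc:
        raise ValueError("checksum mismatch: frame corrupted in transit")
    seq, length = HEADER.unpack(body[:HEADER.size])
    return seq, body[HEADER.size:HEADER.size + length]

frame = encode(7, b"hello")
print(decode(frame))                # (7, b'hello')
```

A receiver can use the sequence numbers to reorder or acknowledge frames, and the checksum to request retransmission of damaged ones – the same ideas real protocols like TCP standardize in much greater detail.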
In essence, standards provide the overall framework, specifications detail the requirements within that framework, and protocols define the rules for communication within that framework.
Interface
In computing, an interface is a point of interaction or connection between different systems, components, or software, while a standard is a rule or specification that defines how these interactions should occur. Interfaces define what can be done, while standards define how it should be done.
Relationship:
Standards often define the specifications for interfaces. For example, the USB standard specifies the physical and electrical characteristics of the USB interface.
Interfaces can be designed to conform to specific standards, ensuring interoperability and compatibility.
In essence, an interface is the "what" (the connection point), and a standard is the "how" (the rules for that connection).
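A minimal sketch of that distinction in code, using Python's abc module: the abstract class declares "what" operations exist, while each implementation decides "how" they work. The class and method names are invented for illustration.

```python
from abc import ABC, abstractmethod

class Storage(ABC):
    """The interface: what can be done (save and load bytes)."""

    @abstractmethod
    def save(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def load(self, key: str) -> bytes: ...

class InMemoryStorage(Storage):
    """One implementation: how it is done stays hidden behind the interface."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def save(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def load(self, key: str) -> bytes:
        return self._blobs[key]

store: Storage = InMemoryStorage()
store.save("greeting", b"hi")
print(store.load("greeting"))       # b'hi'
```

Code written against Storage keeps working if a disk-backed or network-backed implementation is swapped in, just as hardware built to a standard interface can be exchanged between systems.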
https://en.m.wikipedia.org/wiki/Interface_(computing)
In computing, an interface is a shared boundary across which two or more separate components of a computer system exchange information. The exchange can be between software, computer hardware, peripheral devices, humans, and combinations of these.[1] Some computer hardware devices, such as a touchscreen, can both send and receive data through the interface, while others such as a mouse or microphone may only provide an interface to send data to a given system.[2]
https://en.m.wikipedia.org/wiki/Hardware_interface
Hardware interfaces exist in many components, such as the various buses, storage devices, other I/O devices, etc. A hardware interface is described by the mechanical, electrical, and logical signals at the interface and the protocol for sequencing them (sometimes called signaling).[3] A standard interface, such as SCSI, decouples the design and introduction of computing hardware, such as I/O devices, from the design and introduction of other components of a computing system, thereby allowing users and manufacturers great flexibility in the implementation of computing systems.[3] Hardware interfaces can be parallel with several electrical connections carrying parts of the data simultaneously or serial where data are sent one bit at a time.[4]
https://en.m.wikipedia.org/wiki/ACPI
Advanced Configuration and Power Interface (ACPI) is an open standard that operating systems can use to discover and configure computer hardware components, to perform power management (e.g. putting unused hardware components to sleep), auto configuration (e.g. Plug and Play and hot swapping), and status monitoring. It was first released in December 1996. ACPI aims to replace Advanced Power Management (APM), the MultiProcessor Specification, and the Plug and Play BIOS (PnP) Specification.[1] ACPI brings power management under the control of the operating system, as opposed to the previous BIOS-centric system that relied on platform-specific firmware to determine power management and configuration policies.[2] The specification is central to the Operating System-directed configuration and Power Management (OSPM) system. ACPI defines hardware abstraction interfaces between the device's firmware (e.g. BIOS, UEFI), the computer hardware components, and the operating systems.[3][4]
https://en.m.wikipedia.org/wiki/Application_binary_interface
https://en.m.wikipedia.org/wiki/Application_programming_interface
A software interface may refer to a wide range of different types of interfaces at different "levels". For example, an operating system may interface with pieces of hardware. Applications or programs running on the operating system may need to interact via data streams, filters, and pipelines.[5] In object-oriented programs, objects within an application may need to interact via methods.[6]
In practice
A key principle of design is to prohibit access to all resources by default, allowing access only through well-defined entry points, i.e., interfaces.[7] Software interfaces provide access to computer resources (such as memory, CPU, and storage) of the underlying computer system; direct access (i.e., not through well-designed interfaces) to such resources by software can have major ramifications – sometimes disastrous ones – for functionality and stability.
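A small sketch of that principle, with invented names: the only entry points to the balance are a deposit method and a read-only property, so the invariant (a balance that only changes through valid deposits) cannot be broken from outside.

```python
class Account:
    """Access to the balance only through well-defined entry points."""

    def __init__(self) -> None:
        self._balance = 0           # conventionally private: not part of the interface

    def deposit(self, amount: int) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    @property
    def balance(self) -> int:
        return self._balance        # read-only view; no direct mutation

acct = Account()
acct.deposit(50)
print(acct.balance)                 # 50
```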