Robustness Principle

The Internet Engineering Task Force maintains a numbered series of Request for Comments (RFC) documents that define the protocols governing the Internet. In RFC 793, the 1981 specification of the Transmission Control Protocol, American computer scientist Jon Postel stated:

:"TCP implementations will follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others.

The second half of that sentence has been generalized into the robustness principle (also known as Postel's law):

Be conservative in what you do; be liberal in what you accept from others. [1]

The principle suggests that Internet software developers should write software that adheres closely to the extant RFCs in what it sends, but accepts and parses input from clients even when that input is not strictly consistent with those RFCs.
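
As a concrete illustration, consider a simple "Name: value" header line. The sketch below (in Python; the header format is hypothetical, not drawn from any particular RFC) sends only one canonical form but accepts whitespace and case variations that a conforming sender would never produce:

    def send_header(name: str, value: str) -> str:
        """Be conservative in what you send: emit exactly one canonical form."""
        return f"{name.strip()}: {value.strip()}\r\n"

    def parse_header(line: str) -> tuple[str, str]:
        """Be liberal in what you accept: tolerate sloppy but unambiguous input."""
        name, sep, value = line.partition(":")
        if not sep:
            # Not recognizably a header at all; lenience has limits.
            raise ValueError(f"not a header line: {line!r}")
        # Normalize whitespace and case that a strict sender would never emit.
        return name.strip().lower(), value.strip()

    # A liberal parser copes with a peer that got spacing and casing wrong.
    assert parse_header("content-LENGTH :  42 \r\n") == ("content-length", "42")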

Postel's principle is often misinterpreted as discouraging the validation of incoming messages. For example, the later RFC 3117 suggests that Postel's principle be followed only loosely, lest errors and substandard implementations propagate widely:

Counter-intuitively, Postel's robustness principle ("be conservative in what you send, liberal in what you accept") often leads to deployment problems. Why? When a new implementation is initially fielded, it is likely that it will encounter only a subset of existing implementations. If those implementations follow the robustness principle, then errors in the new implementation will likely go undetected. The new implementation then sees some, but not widespread deployment. This process repeats for several new implementations. Eventually, the not-quite-correct implementations run into other implementations that are less liberal than the initial set of implementations. The reader should be able to figure out what happens next. Accordingly, "explicit consistency checks in a protocol are very useful", even if they impose implementation overhead. [emphasis added]
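
The "explicit consistency checks" that RFC 3117 recommends can be quite simple. As a sketch (in Python, for a hypothetical frame format with a two-byte length prefix, not any real protocol), a receiver can verify that a declared length matches the payload actually received:

    import struct

    def decode_frame(data: bytes) -> bytes:
        # Hypothetical frame layout: 2-byte big-endian length, then payload.
        if len(data) < 2:
            raise ValueError("truncated frame header")
        (declared,) = struct.unpack("!H", data[:2])
        payload = data[2:]
        # Explicit consistency check: the declared length must match reality,
        # so a buggy sender is detected immediately rather than tolerated.
        if declared != len(payload):
            raise ValueError(f"length mismatch: declared {declared}, received {len(payload)}")
        return payload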

However, a deeper understanding of Postel's principle actually encourages such consistency and validity checks. Errors detected by such checks should indeed be logged (and perhaps even displayed to the user), but they should not cause an invalid message to be rejected unless rejection is truly necessary. See RFC 1122:

At every layer of the protocols, there is a general rule whose application can lead to enormous benefits in robustness and interoperability [IP:1]:
Be liberal in what you accept, and conservative in what you send

Software should be written to deal with every conceivable error, no matter how unlikely; sooner or later a packet will come in with that particular combination of errors and attributes, and unless the software is prepared, chaos can ensue. In general, it is best to assume that the network is filled with malevolent entities that will send in packets designed to have the worst possible effect. This assumption will lead to suitable protective design, although the most serious problems in the Internet have been caused by unenvisaged mechanisms triggered by low-probability events; mere human malice would never have taken so devious a course!

Adaptability to change must be designed into all levels of Internet host software. As a simple example, consider a protocol specification that contains an enumeration of values for a particular header field -- e.g., a type field, a port number, or an error code; this enumeration must be assumed to be incomplete. Thus, if a protocol specification defines four possible error codes, the software must not break when a fifth code shows up. "An undefined code might be logged (see below), but it must not cause a failure."

The second part of the principle is almost as important: software on other hosts may contain deficiencies that make it unwise to exploit legal but obscure protocol features. It is unwise to stray far from the obvious and simple, lest untoward effects result elsewhere. A corollary of this is "watch out for misbehaving hosts"; host software should be prepared, not just to survive other misbehaving hosts, but also to cooperate to limit the amount of disruption such hosts can cause to the shared communication facility. [emphasis added]
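
RFC 1122's advice on enumerations translates directly into code. The sketch below (in Python; the error-code table is hypothetical) logs an undefined code, as the RFC suggests, but does not let it cause a failure:

    import logging

    # Hypothetical protocol error codes; per RFC 1122, the enumeration
    # must be assumed to be incomplete even if the spec lists only these.
    KNOWN_ERROR_CODES = {0: "ok", 1: "timeout", 2: "refused", 3: "unreachable"}

    def describe_error(code: int) -> str:
        try:
            return KNOWN_ERROR_CODES[code]
        except KeyError:
            # A fifth code showed up: log it, but do not break.
            logging.warning("unknown error code %d; treating as generic error", code)
            return "unknown error"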

References

[1] Schwartz, Mattathias. "Malwebolence", The New York Times, August 3, 2008. http://www.nytimes.com/2008/08/03/magazine/03trolls-t.html?pagewanted=5&_r=2&hp

External links

* History of the principle: http://ironick.typepad.com/ironick/2005/05/my_history_of_t.html

