D. Crocker <firstname.lastname@example.org>
(c) 1993, Association for Computing Machinery
[Reprinted from StandardView, Vol. 1, No. 1, 1993]
The Internet began as a research activity by the U.S. Defense Advanced Research Projects Agency (DARPA) and has developed into a global data communications service, operated as a loose confederation of many different organizations. At the core of this service is a collection of networking technologies that were originated by the DARPA-funded researchers but which now benefit from improvements and additions by an equally loose international confederation drawn from research, academia, and industry. The Internet currently is estimated to include about 50,000 networks and 30 million users. It is doubling approximately every year, so its technology is reaching further into the general population.
Given the variety of other activities and groups pursuing development of communication standards, the success of the Internet and its technology is remarkable. This paper discusses the style of technical development that is used within the Internet and suggests the reasons for its success. Some comparisons with other standards efforts are offered, as well as an attempt to gaze into the future for the Internet's technology development. An extensive discussion of this topic can also be found in Crocker (1993). The formal description of the Internet standards process is documented in Chapin (1992).
However, it is useful first to discuss the realm of standardization in which the Internet developers play: "open systems," a term that has come to have very different meanings.
What's in a Standard?
In data communications, a standard specifies a set of procedures. A specification typically pertains to computer-to-computer interaction but might be more limited, such as describing only the format of data, rather than all of the rules for passing that data back and forth. While mildly controversial, it also is legitimate to specify characteristics of information that is exchanged among humans, such as for electronic mail address strings that should be placed on business cards. Standardizing such strings greatly facilitates the "out of band" passing of information which eventually winds up as input data to a computer.
A standard might also specify the procedures to use when operating a system. Typically, Internet standards shy away from such dictates, since there is a strong desire to leave network operators free to conduct business as they see fit. However, guidelines occasionally are published, when conformance to them will be highly beneficial for the overall health of the Internet. Still, such guidelines are not formal standards.
Discussions often distinguish de jure from de facto standards. As the name denotes, the former is made legitimate by force of law, whereas the latter is legitimate by virtue of popularity. Since the Internet's researchers had no intention of developing a global service, its technology definitely falls into the camp of de facto. Unfortunately, this is sometimes used against it, to suggest that it is less legitimate than the formally-commissioned products of other groups. Instead, one should note that its adoption has been possible only because of its very strong virtues.
By and large, all successful specifications are de facto standards, since there is little leverage that their developers have on those doing the adopting. De jure standards can, and do, fail to gain popular use. So, it's much more helpful to consider these technologies on their merits than on their pedigrees.
The commercial pressure for open systems has been specifically intended to let customers obtain products from a variety of vendors, potentially buying each component with a competitive bid. But there are different ways to create multiple sources of a product, so the remainder of this section considers the options and particularly the types of organizations that produce these various kinds of open systems.
A vendor may publish the specifications of their proprietary technology. This allows a third-party "aftermarket" to exist, usually selling products at a lower price than the vendor who owns the specification. At any time, however, the vendor may choose to change the specification and delay publication of the changes until after the vendor has released its own new products.
Another concern is that specifications are not universally available. For example, requiring consortium membership, with high membership fees, effectively restricts the free flow of information to the community at large. Certainly consortia have special advantage by controlling the content of a specification, while preventing community-wide review of its choices.
Traditional, "accredited" standards bodies have relatively liberal rules of membership and conduct open meetings. They publish their specifications, though usually for a significant price, making them available to any customer or vendor. No single company and no market-driven consortium control the specifications, allowing vendors to work from a reasonably level playing field. Work is done only at meetings which are held at venues around the world. This requires major investment by anyone wishing to attend, constituting an implicit barrier to regular, broad-based participation.
The most extreme approach develops specifications in a forum open to anyone who is interested in participating, allowing on-line contribution so that travel is not required. The results then are also available to all, at little or no charge and in a highly convenient on-line format to anyone interested in reading them. This is the approach used by the Internet.
Selection of technical topics also can be by open process. If a topic lacks an adequate constituency, it's not pursued. If a topic has diverse constituencies they are free to go their own ways and the market chooses among them. Continuing on-line discussions, away from the meetings, allows progress to be made quickly.
A standards development process must perform a difficult juggling act. It must select among a range of technical alternatives, and it must do so in a manner that attends to the political concerns of its members. A process which attends only to technical excellence may produce a solution which is applicable only in a very narrow context. For example, it might not provide an adequate transition path for a large installed base of users of older technology. However, if the process places too much emphasis upon polite accommodation of the desires of each and all its members, the well-known problems of "design by committee" are guaranteed to sabotage the results.
A communication standard always is responding to the needs of several constituencies. At the least, there are product developers, service providers, and end users. Determining their needs is difficult. Accommodating all of those needs usually is impossible.
The IETF Standards Process
From a small effort by a few researchers, the Internet's technical development effort has grown considerably. Today's work is performed by a group known as the Internet Engineering Task Force (IETF). In 1987, its attendance numbered 40 souls. Today, approximately 700 people attend its thrice-annual week-long working meetings.
The original Arpanet effort involved very focused research on basic issues of packet switching. However, much of the use of the technology was subject to development by happenstance. The informality of the process had the detriment of relying entirely upon the energies of one or a few "champions" rather than the more deliberated outcome of an organizational commitment. Documentation tended to be incomplete at the start and was not revised in a timely fashion. On the other hand, it had the great advantage of being produced quickly while being only part of the shared knowledge needed to produce interoperable systems. The rest came from attending the working group meetings. Another feature of the informality was that a scribe could make "enhancements" to the specification and have them implicitly accepted -- if no one objected too loudly. The original Arpanet mail facility was the result of just such a casual, private decision.
Since the community was geographically distributed, but specifications and ideas needed quick dissemination, an on-line publication series called Request for Comments (RFC) was initiated in 1969. The name very accurately reflected the desires of authors. RFCs were explicitly viewed as working documents to be used within a relatively small community. They ranged from casual ideas to detailed specifications and from expressions of operations concerns to whimsical fantasy. If an idea seemed attractive, an individual might spontaneously specify a protocol or a group might meet to discuss it further. If a protocol seemed interesting, someone implemented it and if the implementation was useful, it was copied to similar systems on the net.
By 1981 the Internet effort, which followed the Arpanet effort, had matured and grown to the point that the DARPA Program Manager decided to form an advisory group, called the Internet Configuration Control Board (ICCB), with the task of giving DARPA technical advice. Initially consisting of eight members, this is essentially the management structure that is in place today. In 1984, it was renamed the Internet Activities Board (IAB).
In 1989, the IAB created the Internet Engineering Task Force (IETF) and the Internet Research Task Force (IRTF). The former was chartered to provide near-term solutions to technical difficulties in Internet operations and to develop near-term enhancements for the Internet. The latter group was asked to pursue those topics of long-term interest that carry some technical risk.
Because the bulk of the funding for TCP/IP research and development initially came from the U.S. military establishment, there is a natural tendency to assume that the work was fundamentally biased towards the needs of the United States. However, one of the three original research groups to work on TCP/IP was University College London, in England. As the ICCB formed, a body called the International Collaboration Board (ICB) was formed at the same time and usually met in parallel with it, often in Europe or Canada. The ICB had a European focus, with the goal of coordinating requirements of transatlantic and NATO use of TCP/IP, particularly in the context of the multi-site Atlantic Packet Satellite Network (SATNET), which included Norway, the United Kingdom, Italy, and Germany.
The success of the Internet and its technology, in particular with its expanding commercial market and international scope, has created pressure for a more formal affiliation. There was some exploration of an association with an existing standards body, but without productive outcome. The result was the January 1992 formation of a professional organization, called the Internet Society (ISOC). In June 1992, the IAB was placed under the ISOC with responsibility for "...oversight of the architecture of the worldwide multi-protocol Internet" including continued standards and publication efforts. As part of the move, the IAB changed its name to be Internet Architecture Board, since the IAB does not, in fact, participate directly in the operational activities of any Internet component.
The ISOC and IAB bodies serve to provide ultimate oversight to the IETF standards efforts. Direct, line-management of the process comes from the Internet Engineering Steering Group (IESG), which charters efforts and approves their results. The IESG comprises the IETF Chair and a number of Area Directors who oversee efforts within various technical areas. The list of areas changes from time to time, but the current list is:
Earlier OSI-related activities were managed by a separate OSI Integration area. The IESG recently decided that such protocol work is now broadly and fully incorporated into the full range of IETF activities (Huizer, 1993).
The productive efforts of the IETF are performed in working groups. Each working group has a chair. Anyone may participate in working group activities, on-line or at meetings. The real challenge to working group management is balancing the requirement to give full and fair hearing to all sides in a debate, while still ensuring that forward progress is made in reaching the working group's goals. As participation has become larger and more diverse, many working groups find it difficult to develop specifications entirely within working group plenary sessions (face-to-face or in on-line discussions). As a result, homogeneous, self-selecting groups, called design teams, have formed. They conduct the core of the specification work, responding to requirements and suggestions made by the working group. While there is occasional concern about the leverage that a design team can have over the contents of a specification, the team always is subject to the "rough" consensus of the working group.
Working groups are commissioned with a charter that details goals and schedule, either of which may be renegotiated as the working group progresses. A typical working group operates for 9-18 months. When it produces a specification that formally enters the standards track, the working group goes quiescent, although its mailing list remains operational and often is quite active.
A small IETF Secretariat provides the substantial support effort needed to mount a major, week-long meeting three times a year, run many IESG and IAB teleconferences, and otherwise perform the administrative legwork of this volunteer organization. The Secretariat is administered by the Corporation for National Research Initiatives, with funding from several U.S. government agencies and the Internet Society. Over time, support must come from a broader base of international private and public organizations.
Working documents of the IETF are maintained as part of a replicated, on-line store, called the Internet Repository. Documents under development are called Internet-Drafts (ID). Documents which have reached a level of stability, possibly by attaining a standards track status, are published in the continuing Request for Comments (RFC) series. Those that have a standards status also are assigned an STD number. Protocol specifications permit a wide range of enhancements to be registered, and the issuance of registered values for these is provided by the Internet Assigned Number Authority (IANA). RFC publication and IANA operation are performed by USC's Information Sciences Institute.
A document intended to be an Internet standard goes through four stages. The first is basic development, during which time the specification has no formal status and might not result in a submission to the standards process. When the specification is stable, has a sufficient constituency, and has no known omissions or problems, it may formally enter the standards track as a Proposed Standard. In general, testing before standardization is an important principle of the Internet process. Implementation and testing are encouraged before a specification enters the standards track, although they are not required in most cases.
A specification may be submitted for elevation to Draft Standard when there exist at least two independent implementations which have interoperated to test all functions, and the specification has been a Proposed Standard for at least six months. When a Draft Standard has gained significant field experience, providing a clear demonstration of community interest in using the specification, and has held its status for at least four additional months, it may be elevated to the status of a full Internet Standard.
Documents which are produced by other standards bodies, other organizations, or individuals simply wishing to make their work available to the Internet may publish a version as an RFC, with a status of Informational. These are not Internet standards and are not intended to be the subject of direct Internet effort. Specifications which are not on the standards track, but for which the author seeks to gain Internet experience, may be published as Experimental. The specifications may change, may be incomplete in some respects, or may contain significant errors. However, the specification's author wishes to encourage technical review and experience, possibly for later consideration in the standards process.
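The maturity rules described above (at least two independent, interoperating implementations and six months before Draft Standard; field experience and four further months before full Internet Standard) can be sketched as a simple eligibility check. This is an illustrative model only; the class, field, and function names are invented for the sketch and are not part of any IETF specification.

```python
from dataclasses import dataclass

# Maturity levels on the standards track, per the process described above.
PROPOSED = "Proposed Standard"
DRAFT = "Draft Standard"
STANDARD = "Internet Standard"


@dataclass
class Spec:
    """Illustrative record of a specification's standards-track state."""
    status: str
    months_at_status: int
    interoperating_impls: int = 0   # independent implementations tested together
    field_experience: bool = False  # significant operational use demonstrated


def next_status(spec):
    """Return the status the spec is eligible for, or None if it must wait."""
    if spec.status == PROPOSED:
        # Draft Standard: >= 2 independent, interoperating implementations
        # and at least six months as a Proposed Standard.
        if spec.interoperating_impls >= 2 and spec.months_at_status >= 6:
            return DRAFT
    elif spec.status == DRAFT:
        # Full Internet Standard: significant field experience and at least
        # four additional months as a Draft Standard.
        if spec.field_experience and spec.months_at_status >= 4:
            return STANDARD
    return None
```

For example, `next_status(Spec(PROPOSED, 7, interoperating_impls=2))` yields eligibility for Draft Standard, while a specification with only three months at Proposed must wait. Informational and Experimental documents never enter this progression at all.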
Most work is in the production of Technical Specifications (TS). These are the familiar descriptions of formats and procedures. However, there may be separate Applicability Statements (AS) which describe the circumstances under which one, or more, TS's are to be used.
When considering a specification for adoption as an IETF standard, the general criteria for accepting it are:
The process for creating Internet standards is relatively simple, although it has become more formal over time. A working group is chartered and a working group chair is assigned by the IESG. The working group then conducts its business on-line and at IETF meetings. (Additional meetings are allowed, but are relatively rare.) Each working group establishes its own details for operations, ranging from a loose, conversational style, to much more formal and structured attacks on well-defined problems.
At base, the keys to a working group's operation are that it reach a "rough" consensus about decisions and that it make those decisions in a manner which maintains forward progress towards the goals stated in the charter. When the working group agrees that it has a stable specification which satisfies appropriate technical requirements, it submits it to the IESG for approval. A brief, public review permits final expression and evaluation of concerns about technical content or working group process.
There is no voting in working groups, since there is no formal membership. This guarantees moments of divisiveness, since parties that lose various debates will occasionally feel that they were not given a fair opportunity to express their views or that the consensus of the working group was not accurately read. All such expressions of concern are taken very seriously by the IETF management. More than most, this is a system that operates on an underlying sense of the good will and integrity of its participants. Often, claims of "undue" process will cause a brief delay in the standard-track progression of a specification, while a review is conducted. While frustrating to those who did the work of technical development, these delays usually measure a small number of weeks and are vital to ensuring that the process which developed the specification was fair.
The Salient Points in IETF Success
The Internet standards process did not set out to achieve its current role. It was only the side-effect of a small research community. While that community was reasonably clear about the basis for its good work, the global perception of that success is quite recent. Hence, it is worth considering the constituents of this remarkable process.
For all of the increasingly formal procedure in the IETF standards process, the real work of the IETF relies on individual judgment, as well as individual effort. The formal rules provide beacons for guidance and for synchronization. The real test that is applied to difficult choices is whether the people involved conducted themselves fairly and made the best choices under the circumstances. Reliance on general "rough" working group consensus is the constant check-and-balance to potentially misguided behavior of individuals.
Usually, an IETF specification is the clear and direct result of a specific technical vision. One, or a few, individuals see a solution and recruit others to that vision. While the working group's participants can, and do, dictate changes, successful working groups are careful to maintain the integrity of the original vision.
IETF specifications usually attempt to solve specific, immediate problems, rather than to encompass a wide-range of long-term goals. This permits work to be directly responsive to immediate requirements. Keeping goals simple tends to make the resulting designs also simple. While some might call the results "limited", others call them "elegant". Typically they do prove to be quite extensible.
Because the Internet can field solutions quickly, Internet standards can benefit from considerable operational feedback. This, in turn, permits another round of specification, if needed. While too much iteration would certainly result in unstable specifications, this problem happens rarely.
Newcomers to the IETF never quite believe that the process is as open as it is. Anyone with a fresh perspective, clear insight, or good jokes is always welcome. As the Internet, itself, increases its global reach, many IETF contributors participate exclusively by email. While attendance at IETF meetings is extremely helpful, it is not required to be an effective working group participant. Since it is easy to join a working group mailing list, many members remain silent until some aspect of debate triggers their interest or calls on their special expertise.
As described above, the ability to have on-line working group participation is paramount. It fundamentally eliminates the barriers of time and cost for participation and contribution. This enormously increases the number and diversity of people who can contribute. Further, it means that progress does not have to wait for the next meeting.
The IETF has very loose requirements for the style in which its standards are written. In general, this results in documents that are easily read by the average implementer. Although formal analysis often uncovers ambiguities and errors in such documents, the informal network of implementers conveys whatever additional information is necessary. This is certainly not an ideal system, but it manages to balance flexibility and immediacy well enough to be highly productive.
The existence of the Internet Repository means that anyone with Internet access can obtain standards and working documents of the IETF, for no additional cost. This is in marked contrast with many other standards organizations. It is another example of the ways in which the IETF work is highly accessible to the broadest audience, permitting better analysis and broader use.
IETF participants usually are directly involved in producing or using the technology. In particular, they are not professionals in standards development. Even more important, IETF members build what they specify and then use it. The Internet, itself, provides a very large scale live test environment and as is often true with software, once it passes the test it is instantly used in production. If a working group's efforts are not useful, this is quickly evident before the work is made into a standard.
Comparison with Typical Standardization Efforts
It can be quite telling to look at the real membership requirements for organizations which declare themselves in the business of developing "open" specifications. They usually have very severe membership filters, in terms of membership cost or travel expenses needed to go to the meetings. These expenses usually seem small, for most businesses. But the costs often serve to exclude smaller businesses, various research and education organizations, and personal participation by those without an appropriate organizational affiliation. This necessarily restricts the range of views that can be offered to the development process.
Most standards efforts seek to solve a problem in the most general manner and for the longest term possible. Such intentions cannot be criticized. They are well-meant. Unfortunately, the goal of extreme generality requires very long and careful analysis and requires attending to a very broad range of requirements, which further adds to the design and analysis burden. Hence, it usually takes a very long time to produce these general solutions, and they often miss their window of opportunity. Worse, they often become cumbersome and difficult to implement, resulting in very large software modules.
IETF work occasionally suffers from these problems, too. In fact, the IETF is not very successful at fixing working groups that make the mistake of walking down the seductive path of long-term, general design. Fortunately, most IETF working groups operate within a narrow scope, trying to solve immediate problems. The wide range of views that contribute to the work usually makes painfully clear what features a specification is lacking. As a result, designs often include hooks for later extensions, so that those who did not get their favorite feature into the current draft can separately specify enhancements. If the community decides that one or another enhancement is valuable, it gets adopted. But the evaluation process for the additional features does not impede development and adoption of the functional core.
By rights, the narrow focus and near-term goals of the IETF work should make its specifications rigid and short-lived. Real-world experience shows a different performance record. The specifications are comprehensible to a broad range of implementers. The software operates on the complete range of platforms and is useful in most data communication contexts. Better still, its utility continues after more than ten years of production use.
As the Internet technology is applied to a wider range of environments, various deficiencies are identified. Security and accounting are the ones most commonly cited, though support for guaranteed levels of service, such as for real-time traffic, also are noted. To date, the IETF has shown an astonishing ability to add capabilities to the core technology, and there is little indication that it has reached a limit in that ability.
Over the last five years, work from the Internet community has shown vastly greater market acceptance and use than the work of the OSI community. It's puzzling to try to determine the engineering rule of thumb that explains this. One possibility is that the OSI community's desire for functional completeness and accommodation of all interests leads to the philosophy of including as much as possible in a design. In contrast, successful IETF working groups are driven by near-term needs and consequently try to produce designs that remove as much as possible. At first blush, this should produce highly limited designs. The trick in the process appears to be the group consensus requirement. As one would expect, each participant contributes their list of desired features, but the short time-fuse on the work requires that the group reach consensus quickly. This can only be done by removing features, since only a small core of features will be clearly acceptable to most participants. (The alternative approach of including all of everyone's preferences requires too much group debate and results in a design that is too obviously unacceptable.) However, the process of removing features also requires some assurance that some of those features can be added later. Hence, the design usually permits extensibility, which is itself designed with an approximate sense of the sorts of extensions that are likely to be made.
The Future for IETF Standardization
Surely there is a down-side to the good-will and good results of the IETF? And indeed there is.
The IETF's growth is proving a fundamental challenge to its style of operation. More people means less familiarity on a personal and professional level. Internet technology now represents a multi-billion dollar business. Hence, IETF decisions have significant financial impact and that can raise the heat of a debate quite a bit.
While more people are participating, the number of senior, experienced contributors has not risen proportionately. Such folk are essential for providing working groups with guidance about successful practice. Without such guidance, working groups run the serious risk of having good consensus about a bad design.
In general, the IETF is applying its own technical design philosophy to its own operation. So far, the technique seems to be working. With luck, it will demonstrate the same analytic likelihood of failure, with the same experiential fact of continued success.
References
Chapin, A.L. 1992. The Internet Standards Process. RFC 1310, Network Information Center, Mar.
Crocker, D. 1993. Evolving the System. In Internet System Handbook, D. Lynch and M. Rose (eds.). Addison-Wesley, Reading, Mass.
Huizer, E. 1993. The IETF Integrates OSI Related Work. ConneXions, Vol. 7, No. 6, Jun.