Introduction
Overview
Aims
Standards in practice
What is a standard?
ISO
Case study: ISO 8859
The evolution of character encoding
ASCII and ANSI
Unicode
Key points regarding standards
Classifying standards organisations
Types of organisation
Recommendations and RFCs
Classifying standards
Open and closed standards
Sponsored and unsponsored standards
A classification matrix
De facto and de jure standards
The importance of standards
Producers and consumers
Network effects
Standards as innovations
Summary
Glossary
Answers to self-assessment activities
Acknowledgement
E-business technologies:
foundations and practice
Block 1 Part 2: Setting the standards
Introduction
Overview
The relationship between business and technology is one that is often mediated by
standards. Increasingly, we choose a technology based upon what standards it
supports. But just what are standards and who makes them? Is it possible for us to
play a part in this construction process or, if not, do we have to follow the standards
that others have set? These are the kinds of questions to which this part of Block 1
helps you find answers.
Standards have a fundamental role in e-business. In this text I shall examine some of
the standards and standard-setting organisations responsible for the development of
the Internet and the Web. To do this I'll classify different organisations with respect to
their role in the standard-making process and use a simple three-dimensional
framework to classify standards according to their openness, sponsorship status and
ratification by standardisation bodies.
I shall end with a look at the importance of standards as both a choice and a factor in
the technology adoption decision.
Aims
The aim of this text is to introduce standards as a mediator in the technology adoption
decision.
When you have finished this part of Block 1 you should be able to:
• Describe the roles and responsibilities of different organisations in the standard-
making process.
• Explain the importance of open standards.
• Classify a standard as somewhere between open and closed.
• Classify a standard as either sponsored or unsponsored.
• Understand the difference between de facto and de jure standards.
• Consider standards as innovations that precede or succeed ICT solutions.
Standards in practice
What is a standard?
There's a humorous quote, often attributed to pioneering computer programmer Grace
Hopper, that has something valuable to say to us about standards:
The wonderful thing about standards is that there are so many of them to
choose from.
This neatly sums up what we could refer to as the standards paradox. Standards are
supposed to define with authority the one way that things should be done, and yet there
are so many of them it's sometimes difficult to know which one we should be following.
I'm not going to say any more about this issue for now; instead, let's follow that
tongue-in-cheek quote with a serious definition to provide some balance. On its web
site, the International Organization for Standardization (ISO), which is a long-
established and influential standard-setting organisation, defines a standard as follows:
a documented agreement containing technical specifications or other
precise criteria to be used consistently as rules, guidelines, or definitions of
characteristics to ensure that materials, products, processes and services
are fit for their purpose.
Quotes and definitions can be useful (I'll leave you to decide which of the two given
above is the most useful here), but what we really need, in order to get a feel for what
a standard is all about, is some examples.
As an example of the standards process I am going to look in detail at the evolution of
the standards that deal with character encoding for computers. This example, in
addition to providing detail that is a fundamental building block for some of the
technologies later in the course, will allow you plenty of opportunity to explore the
function of standards and the role of standards bodies in the standard-making
process.
ISO
ISO, which provided us with our definition of a standard, is one of a large number of
organisations involved in the standardisation process. It is a formal standard-setting
organisation; that is, it is recognised by governments internationally. There are other
formal standard-setting organisations, as well as lots of informal ones. You will
encounter more of these later on.
ISO plays an important role in setting a diverse range of standards, not
just those relating to ICT. For now, let's view ISO as sitting at the top of a hierarchy of
organisations involved in the standard-setting process, in accordance with the status
and recognition it has long been afforded. Here is a small sample of the many
standards that ISO is responsible for maintaining:
• ISO 8859
• ISO 9660
• ISO 9899
• ISO 216
• ISO 9000.
These numbers may not mean much to you in isolation. I'm going to help by telling
you that these are the standards for, respectively, 8-bit character encoding used in
computers; a file system on CD-ROM; the C programming language; the paper size
system that encompasses A4; and the quality management of key business
processes. Maybe now you'll recognise the importance of them, or maybe not!
Let's use ISO 8859, a family of standards for 8-bit character encoding in computers,
as our example so that we can look at the principles behind standards in more detail.
We can learn a lot from this standard.
Case study: ISO 8859
Everything stored in a computer can be viewed as a sequence of ones and zeros, also
referred to as bits (binary digits). When I type this sentence into my word processor it
needs to be converted into the ones and zeros that the computer can handle.
In 8-bit character encoding, each character is represented by a unique
sequence of eight bits, each of which may be either a one or a zero. Those eight bits
provide enough flexibility to store up to 256 (2⁸) letters and symbols. Take a look at
your keyboard. It looks like more than enough, doesn't it?
The character set ISO 8859-1 specifies which 8-bit sequence should represent each
character. Figure 1 shows a portion of this character set that covers all the lower-case
letters in the English alphabet.
ISO 8859-1 is big enough that it can cover many of the languages of Western Europe
and others that are derived from the Latin, or Roman, alphabet. Sometimes it is
referred to as Latin 1. Figure 2 shows some of the non-English characters included in
the character set.
Figure 1 Part of the ISO 8859-1 character set
Source: ISO/IEC (1998)
Figure 2 More of the ISO 8859-1 character set
Source: ISO/IEC (1998)
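If you have Python to hand, you can inspect this mapping for yourself. The short sketch below is purely illustrative (it relies on Python's built-in 'latin-1' codec, which implements ISO 8859-1): it prints the eight-bit pattern that the standard assigns to a few characters, including an accented letter of the kind shown in Figure 2.

    # Illustrative sketch: the 8-bit patterns ISO 8859-1 assigns to some characters.
    for char in ['a', 'z', 'é']:
        code = char.encode('latin-1')[0]        # one character -> one byte value (0-255)
        print(char, code, format(code, '08b'))  # e.g. a 97 01100001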
So it looks like all possible characters are covered, at least for the majority of Western
Europeans. This is indeed the case, and not just for most Western Europeans either;
ISO has standards for other languages of European origin and beyond too. I referred
initially to ISO 8859 being a family of standards; in fact ISO 8859-1 is just one of a
family of, at present, sixteen. This family covers many popular (from a Western
perspective) living languages and a dead one too (Latin).
Activity 1 (Self-assessment)
Table 1 gives a breakdown of a part of the ISO 8859 family of standards and its
coverage. I'd now like you to fill in the gaps in this table. In order to do so, you will need
to take a look at the ISO 8859 standard itself. I've provided a link to British Standards
Online (BSOL) in the 'Library resources' area of the course web site; you will also find
some guidance there on how to locate specific standards. Retrieve ISO 8859 and
skim through it until you find the information you need. I just want you to get a feel for
the standard; you certainly don't need to worry about all the detail. Remember that we
are using character encoding as an example that serves to demonstrate some
principles regarding standards; it's these principles that I want you to bear in mind.
Comment
You'll find my version of the completed table in the 'Answers to self-assessment
activities' section. Bear in mind that some languages are covered by multiple
members of the ISO 8859 family, and that there are of course many more
languages than the examples I've started you off with. I'm just providing you with a
flavour of the standard.
Table 1 ISO 8859 standard for character encoding
Standard Name Examples of language coverage
ISO 8859-1 Latin Alphabet No. 1 English, Portuguese, German, Italian, Latin
ISO 8859-2 Latin Alphabet No. 2 Croatian, Czech, Hungarian
ISO 8859-3 Latin Alphabet No. 3 Esperanto, Maltese
ISO 8859-4 Latin Alphabet No. 4
ISO 8859-5
ISO 8859-6 Arabic
ISO 8859-7 Greek
ISO 8859-8 Hebrew
ISO 8859-9 Turkish
ISO 8859-10
In Activity 1 I directed you to look at the standard via British Standards Online (BSOL).
BSOL is owned and maintained by the British Standards Institution (BSI), which is the
UK's national standards body. I'll look at national standards bodies (NSBs) shortly
when I examine the evolution of character encoding. NSBs have a two-way
relationship with international standards organisations. Sometimes NSBs develop a
standard nationally that is then adopted as an international standard; other times, an
international standard is developed first and then adopted by national standards bodies.
With ISO 8859 we have a standard, albeit an umbrella standard that in fact
encompasses a number of standards, that can be used by anybody producing
technology that needs to store and retrieve characters in a computer. Great … unless
you're producing a technology that handles Eastern Asian characters used in
languages such as Japanese, Korean and the world's most spoken language,
Mandarin Chinese. The standards that make up ISO 8859 won't work for you in this
instance and you will need yet another standard. In fact, ISO does have another
standard for Eastern Asian languages: the ISO 2022 family.
Remember the words of Grace Hopper? It's all getting a bit messy, isn't it? Let's
ignore these complications for now and see how ISO 8859 became an international
standard.
The evolution of character encoding
ASCII and ANSI
You might think that the importance of an agreed standard for character encoding is
such that ISO 8859 was probably formulated around the time of the first computer
communications. In fact, it wasn't. ISO 8859 originated in the mid-1980s and was
adopted by ISO from another standards body, the European Computer Manufacturers
Association (ECMA).
Prior to this the UK and the USA, at least, were using a 7-bit standard for character
encoding called ASCII, the American Standard Code for Information Interchange.
ASCII was developed in the early 1960s by another standards body, the American
Standards Association, the forerunner of today's American National Standards Institute (ANSI).

ANSI is a national standards body (NSB) and so its focus, quite rightly at the time,
was on setting a standard for encoding its own national language. Standards bodies in
other, non-English-speaking countries whose alphabets are based on Latin modified
US ASCII to meet their own needs. The ANSI standard was embraced and extended,
and the resulting variants are commonly referred to as extended ASCII.
However, back in the 1960s and 1970s ASCII was not the only standard in use. ASCII
may have been the de facto standard for minicomputers, but IBM's Extended Binary
Coded Decimal Interchange Code (EBCDIC) was, and still is, the predominant
standard used by mainframe computers. I won't complicate matters any further by
looking at EBCDIC.
The ISO 8859-1 standard is the international standard extension of ASCII that
represents many of the languages based on the Latin alphabet. It does this by
keeping the first 128 characters of ASCII intact and using the other 128 8-bit codes to
represent non-English Western European characters. Because ISO is an international
standards body, it has global interests and so aims for all-inclusive representation. But
as we've seen, even the ISO 8859 family of character sets has serious limitations when
we move away from a European perspective. One of its major shortcomings, as I said
earlier, is that its character sets do not cater for Eastern Asian languages such as
Chinese, Japanese and Korean. There are also dead languages to consider: languages
that are no longer spoken but that survive in written form. Do you think these
other languages matter?
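Before moving on, you can check the backwards compatibility with ASCII described above. This is another illustrative Python sketch (again using the built-in 'ascii' and 'latin-1' codecs): the same English text produces identical bytes under both encodings, while an accented character can only be encoded by the 8-bit standard.

    # ASCII and ISO 8859-1 agree on the first 128 code values.
    text = 'standards'
    assert text.encode('ascii') == text.encode('latin-1')   # identical byte sequences

    print('café'.encode('latin-1'))    # b'caf\xe9' - the é occupies code 0xE9
    # 'café'.encode('ascii') would raise UnicodeEncodeError: é is outside the 7-bit range.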
Unicode
Another industry-focused standard-development organisation, the Unicode
Consortium, thought it did matter that there existed several character sets that were
incompatible with each other and did not cater for all known languages of the world,
living or dead. Although certain languages may not be spoken any more, that doesn't
preclude their characters being displayed on a computer screen. The Unicode
standard is an attempt to move beyond the 8 bits used in the ISO 8859-1 character set
(it began with 16 bits and has since grown further) and to encode all of the world's
languages in one single character set. ISO has worked with the Unicode Consortium,
and the resulting standard is known as ISO 10646.
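To give a feel for what this means in practice, here is another small, purely illustrative sketch (Python again, which exposes Unicode code points through its built-in ord function). Every character, whatever its alphabet, has a single code point, so different scripts can be mixed freely in one piece of text.

    # Unicode assigns each character a code point, far beyond the 256 slots of an 8-bit set.
    for char in ['a', 'é', 'Ω', 'я', '中']:
        print(char, 'U+{:04X}'.format(ord(char)))   # e.g. 中 U+4E2D

    greeting = 'Hello, κόσμε, мир, 世界'             # mixing alphabets is unremarkable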
The Unicode Consortium consists of most of the major players in today's computing
industry (including IBM, HP, Microsoft and Apple), and in this respect it is typical of
many industry-led standards consortia. It is in the interests of those in the industry that they play a role in setting the
standards that they will use. There are two good reasons for this.
Firstly, these organisations have to keep a close eye on developing standards to
ensure that they are developing products with standards compatibility in mind.
Secondly, technology innovators often have to work ahead of the standards in order to
steal a march on the competition. It's important for them that any extensions they may
make are adopted as industry standard if products built on this standard are to see
widespread adoption. One way of doing this is to ensure that these extensions form
part of an official standard. There are other mechanisms too, however, and I shall
discuss these later.
Nevertheless, it would be a mistake to think that computer manufacturers have to
adhere to these standards by law. They don't. Often, a standard that might currently
seem in the best interests of the majority has not previously been viewed as making
the most business sense. Apple's Mac OS X operating system may now use Unicode
as its standard default for character encoding, but prior to this Apple's Mac OS used a
customised implementation of ASCII called Mac OS Roman. Microsoft also continues
to use its own modified version of ISO 8859-1, called Windows-1252, for the legacy
components of its Windows operating system.
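If you want to see how close, and yet how incompatible, two such variants can be, the following illustrative Python fragment (using the built-in 'cp1252' and 'latin-1' codecs) shows one of the points where Windows-1252 and ISO 8859-1 diverge.

    # Windows-1252 reuses the 0x80-0x9F range (control codes in ISO 8859-1)
    # for printable characters such as the euro sign.
    print(b'\x80'.decode('cp1252'))    # €
    print('€'.encode('cp1252'))        # b'\x80'
    # '€'.encode('latin-1') raises UnicodeEncodeError: ISO 8859-1 has no slot for the euro sign.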
Standards are of course of vital importance when it comes to information exchange.
The global connectivity of the Internet means that local solutions based on national
standards no longer have sufficient scope. Unicode as a standard for character
encoding looks like the most sensible choice if we are to exchange information across
language boundaries, cover both living and dead languages, and mix multiple
languages in the same document. It may have taken forty years to get there, but it looks like
this is the one standard that may replace all the others for character encoding. I won't
spoil the happy ending by telling you that even the Unicode standard is split into
UTF-7, UTF-8, UTF-16 and UTF-32!
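If you are curious about what that split means in practice, the illustrative sketch below encodes one short piece of Unicode text under three of these encoding forms and simply reports how many bytes each produces (the exact counts will depend on the text you choose).

    # The same Unicode text serialised under three UTF encoding forms.
    text = 'naïve 中文'
    for encoding in ['utf-8', 'utf-16', 'utf-32']:
        print(encoding, len(text.encode(encoding)), 'bytes')
    # UTF-8 is the most compact for mostly-Latin text; UTF-16 and UTF-32 trade space
    # for simpler, (near) fixed-width handling of each character.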
Key points regarding standards
The ISO 8859 story is typical of the evolution of many standards, so we can draw a
number of lessons from it.
• Standards evolve alongside technology and are sometimes replaced when their
limitations become too great.
• Standards are important for data exchange, communication and interoperability.
• Numerous standards organisations, often competing amongst themselves, can be
involved in setting standards for the same technology applications.
• International standard-setting organisations have a wider scope than national
ones, but this scope does not mean that an international standard is always all-
encompassing.
Classifying standards organisations
Types of organisation
Earlier I introduced you to the ISO 8859 standard for character encoding that is
published and maintained by the International Organization for Standardization. ISO is
a long-established standard-setting organisation founded in 1947. Its membership is
made up of the national standards bodies of 157 countries. You have already met a couple of examples of national
standards bodies: the American National Standards Institute (ANSI) and the British
Standards Institution (BSI). Owing to ISO's inherently international remit, and the
national scope of the member organisations that are entitled to vote as part of the
standard-ratification process, ISO standards generally carry much authority. However,
there is a serious downside that comes with this authority: because the ISO standard-
development process involves so many member countries, it is generally a slow
process and can be ill-suited to fast-moving technological innovations.
The slow-moving nature of the ISO standardisation process made it a poor fit for the
standardisation efforts required for the Web. In addition, there was another mismatch
between ISO and Web standardisation: ISO was firmly behind OSI as a networking
standard rather than TCP/IP, the suite of protocols upon which the Internet and
the Web were built. (I'll touch upon OSI as a standard for networking a little later.) So
when Tim Berners-Lee set about standardising the Web he used the Internet
Engineering Task Force (IETF) rather than ISO to oversee the standardisation of the
low-level components such as HTTP, and formed the World Wide Web Consortium
(W3C) for the higher-level parts such as HTML. The W3C and the IETF are
community-driven standardisation organisations that are more agile than the likes of
ISO and the IEEE. The W3C and the IETF take a more 'bottom-up' rather than 'top-
down' approach, and publish recommendations and requests for comments (RFCs)
rather than standards. However, this makes them no less important as standardisation
organisations.
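The practical payoff of these published protocols is that any conforming client can talk to any conforming server. The sketch below is only an illustration (it uses Python's standard http.client module and the reserved documentation host example.org as a placeholder): it sends a standard HTTP GET request and prints the standard status line that comes back.

    # Any HTTP client can talk to any HTTP server because both follow the same open standard.
    import http.client

    conn = http.client.HTTPConnection('example.org')   # placeholder host
    conn.request('GET', '/')
    response = conn.getresponse()
    print(response.status, response.reason)             # e.g. 200 OK
    conn.close()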
I've listed these standards bodies, and some of the standards they are responsible for
maintaining, in Table 2. I have included the WS-* standards for web services, as well
as the XML, XPath and XQuery standards. You'll be meeting these standards in much
more detail later in the course, and I'm not assuming that you have any knowledge of
them now. The others should mostly be familiar to you.
Table 2 Sample standards bodies and sample standards
Standards body Sample standard Purpose
ISO ISO 8859
ISO 9660
ISO 9899
ANSI ASCII
SQL
W3C HTTP
HTML
CSS
WAI
XHTML A version of the HTML standard that conforms to the XML standard
XML A simplified version of the SGML standard designed as a
metalanguage for data exchange over the Internet
XPath A language for referencing parts of an XML document
XSLT An XML-based language for converting XML documents
into other document formats
IETF Requests for
comments (RFCs) for:
TCP, IP
SMTP
FTP
Telnet
IEEE 802.3
802.11a,b,g, etc.
802.15.1
1394
OASIS WS-* Web services standards and interoperability profiles,
dominated by vendors and large business interests
Activity 2 (Online, discussion)
I'd like you to complete Table 2 by adding entries to the 'Purpose' column. Just a brief
sentence for each standard will suffice; the more concise you are here, the better. You
will probably need to carry out a quick search online for information about those
standards with which you are less familiar.
Once you have completed your version of the table, share your answers with other
students in the Course Discussion forum.
Comment
The aim of this activity is to give you a flavour of some of the standards upon which
the Internet is built and of the standardisation organisations responsible for those
standards.
Recommendations and RFCs
I have already mentioned that community-driven standardisation organisations, such
as the W3C and IETF, publish recommendations and requests for comments rather
than standards. But what does this mean?
The W3C, which as you have seen is an important organisation when it comes to
standard-setting for the Web, has this to say about the recommendations that it
publishes (W3C Advisory Board, 2003):
A W3C Recommendation is a specification or set of guidelines that, after
extensive consensus-building, has received the endorsement of W3C
Members and the Director. W3C recommends the wide deployment of its
Recommendations. Note: W3C Recommendations are similar to the
standards published by other organizations.
The IETF, which is responsible for publishing Internet standards, uses the term
'request for comments' (RFC) to describe any of its documents that are published for
discussion and may or may not eventually become Internet standards. This is what
the IETF has to say about RFCs in RFC 2026, which itself is an RFC that describes
the IETF's Internet standards process (Bradner, 1996):
Some RFCs document Internet Standards. These RFCs form the 'STD'
[standards track documents] subseries of the RFC series … When a
specification has been adopted as an Internet Standard, it is given the
additional label "STDxxx", but it keeps its RFC number and its place in the
RFC series. … It is important to remember that not all RFCs are standards
track documents, and that not all standards track documents reach the
level of Internet Standard.
So some RFCs are standards, but not all. Some RFCs are experimental, while some
reflect best current practice. Experimental RFCs may develop into full-blown Internet
standards over time through working-group consensus. The RFCs for TCP, IP, SMTP,
FTP and Telnet are all Internet standards.
There are also some RFCs that are jokes. See, for example, RFC 2549, which
describes IP over Avian Carriers with Quality of Service (published by D. Waitzman on
1 April 1999).
Classifying standards
We can classify standards in a number of ways. I'm going to use three complementary
ways of classifying those standards that pertain to ICT and e-business. First, we can
classify a standard in terms of whether it's open or closed. Second, we can classify a
standard as either sponsored or unsponsored. And finally, we can talk of standards
that are either de facto or de jure.
Although I've presented each of these classifications in terms of binary opposites, in
practice each standard occupies a position somewhere along a continuum that runs
from open to closed, sponsored to unsponsored, de facto to de jure.
I shall now look in turn at each of these three ways of classifying standards.
Open and closed standards
In our previous case study of an evolving standard, ASCII, ISO 8859-1 and Unicode
could all be said to be examples of open standards; that is, they are published for
public use and available for third parties to read and implement without royalties. In
contrast, a closed standard is one that is not published for public use and is not
available for third parties to implement without the payment of royalties.
These are my simple definitions of open and closed standards. They serve a purpose
in getting us started here, but you should be aware that there is considerable debate
about just what constitutes an open standard.
Some, for example the European Union, say that in order to be classed as open, a
standard must adhere to the following principles (European Communities, 2004, p. 9):
– The standard is adopted and will be maintained by a not-for-profit
organisation, and its ongoing development occurs on the basis of an open
decision-making procedure available to all interested parties (consensus or
majority decision etc.).
– The standard has been published and the standard specification
document is available either freely or at a nominal charge. It must be
permissible to all to copy, distribute and use it for no fee or at a nominal fee.
– The intellectual property – i.e. patents possibly present – of (parts of) the
standard is made irrevocably available on a royalty-free basis.
– There are no constraints on the re-use of the standard.
Others, such as open source evangelist Bruce Perens (2006), go further:
1. Availability
Open Standards are available for all to read and implement.
2. Maximize End-User Choice
Open Standards create a fair, competitive market for implementations of
the standard. They do not lock the customer in to a particular vendor or
group.
3. No Royalty
Open Standards are free for all to implement, with no royalty or fee.
Certification of compliance by the standards organization may involve a fee.
4. No Discrimination
Open Standards and the organizations that administer them do not favour
one implementor over another for any reason other than the technical
standards compliance of a vendor's implementation. Certification
organizations must provide a path for low and zero-cost implementations to
be validated, but may also provide enhanced certification services.
5. Extension or Subset
Implementations of Open Standards may be extended, or offered in subset
form. However, certification organizations may decline to certify subset
implementations, and may place requirements upon extensions (see
Predatory Practices).
6. Predatory Practices
Open Standards may employ license terms that protect against subversion
of the standard by embrace-and-extend tactics. The licenses attached to
the standard may require the publication of reference information for
extensions, and a license for all others to create, distribute, and sell
software that is compatible with the extensions. An Open Standard may not
otherwise prohibit extensions.
Don't confuse open source with open standards, by the way. They refer to two
different, but often related, ideas. The term 'open source' refers to conditions for the
software itself rather than any standards upon which it is built.
There are lots of other definitions available of what it means for a standard to be open.
I kept my working definition simple, but you should be aware that the definition of open
and closed when it comes to standards is under debate. Bear in mind, too, that
regardless of which definition we take as a starting point, most standards will in fact
occupy the space between open and closed.
There are two more points that I'd like to make before we go on. First, you may be
wondering why anyone would want to create a closed standard. Closed standards
have proved successful in the past in creating vendor lock-in, so that one
organisation has a monopoly over the provision of a product or service that uses that
standard. For instance, Microsoft Word is the de facto standard for word processing,
with reports suggesting that its market share is somewhere between 80% and 95%.
This means that the switching costs involved in moving from Microsoft Word to
another word processor now appear far too high. If Microsoft had made the DOC file
format an open standard then its competitors would have been free to produce fully
compatible alternatives to Microsoft Word, to which users could easily switch.
There are of course alternatives to Microsoft Word, such as the open source
OpenOffice.org office suite, that read and write the DOC standard to a certain degree.
However, this has been achieved not by having access to the standard (it is closed),
but by reverse engineering the standard instead. This results in less than 100%
compatibility, which may not be sufficient in all cases.
Despite the advantages to the vendor, closed standards have to some extent fallen
out of fashion, mainly as a result of the proliferation of the open standards that have
been behind the growth of the Internet and the Web. Even Microsoft's Word DOC
format, along with the rest of Microsoft Office, is in the process of being opened up as
Office Open XML (OOXML) and is now seeking to become an open standard ratified
by the ECMA and ISO standards organisations.
My second point is that open standards do not necessarily preclude vendor lock-in. In
his open standard definition, Bruce Perens draws attention to protection against
predatory practices such as 'embrace-and-extend tactics'. Software and hardware
vendors can still attempt to subvert open standards for competitive advantage.
One good example of such a strategy, which has since become known as embrace,
extend and extinguish, took place during the browser wars of the mid-1990s. In this
case two organisations, Netscape on the one hand and Microsoft on the other, initially
embraced HTML in their respective browsers, Navigator and Internet Explorer. They
then raced each other to extend the HTML standard with a set of proprietary tags that
were not standards-compliant and yet offered greater functionality. This could be seen
as a sign of frustration at the slow pace of standards ratification in an area of rapid
technological development, or as an attempt to lock those writing HTML, and those
reading it, into their own proprietary standard. The end result was to promote
adoption of one particular browser over the market alternatives, thus extinguishing the
competition.
Sponsored and unsponsored standards
Do not confuse closed standards with proprietary standards (also referred to as
sponsored standards). A sponsored standard is one that is owned and maintained
by a single business organisation rather than a standardisation body. Its opposite is
the public standard or unsponsored standard. All the standards I have looked at so
far have been unsponsored standards; that is, they have been formulated and
published by standardisation bodies. Although a standardisation body may include
representatives from business organisations, it is the body as a whole that has control
of the standard-setting and publication process for unsponsored standards.
Sponsored standards, however, are firmly under the control of a single business
organisation.
A sponsored standard can tend towards the open (e.g. Adobe's PDF document format)
or the closed (Microsoft's Word DOC document format, at least in its current form).
A classification matrix
You can use the matrix shown in Figure 3 to review what you have learnt so far about
classifying standards according to their position in two dimensions.
Figure 3 Standards classification matrix
Using this matrix, standards can be classified as being positioned along one
continuum between open and closed, and along another continuum between
sponsored (proprietary) and unsponsored (public). We can use the matrix to plot the
relative positions of some of the standards that have already been discussed.
Activity 3 (Self-assessment, discussion)
Try using the matrix given in Figure 3 to position the following standards:
• PDF file format
• DOC file format
• HTML
• XML
• Flash SWF file format.
I've talked about the first two in the text; the others will probably require you to do
some searching online. Take either my definition of an open standard from this text or
one of the alternatives I've discussed. The idea is to situate standards relative to each
other in the open/closed and sponsored/unsponsored dimensions, so don't worry
about absolute positions.
Are there any standards that might fall into the bottom right section of the matrix? That
is, are there any standards that are both unsponsored and closed by anybody's
definition? Discuss this in the Course Discussion forum.
Comment
You will find my thoughts and a version of the completed matrix in the 'Answers to
self-assessment activities' section.
De facto and de jure standards
So far, I've looked at two dimensions that we can use to classify standards: open or
closed, and sponsored or unsponsored. Now let's look at a third dimension that
embraces two terms often used when describing a standard. These are de facto and
de jure.
A de facto (from the Latin meaning 'in fact') standard is one that dominates because it
is the most popular in practice rather than because it has been set out by a standards
body. In the world of operating systems for personal computers, Microsoft Windows is
the de facto standard. It dominates this landscape with an oft-reported 95% market
share.
De facto standards are often contrasted with de jure (from the Latin meaning 'in law')
standards, which are those set down in law or, more loosely, prescribed by
standardisation bodies. For my purposes I am going to use the looser of the two
definitions: a de jure standard is one that has been ratified and published by a
standardisation body. Microsoft Word DOC format is currently a de facto standard
rather than a de jure standard, because a standardisation body hasn't ratified and
published it.
In the 1980s the de jure standard for networking was OSI. It was planned and
prescribed by ANSI and then ISO. At the time, each different computer architecture
had its own standard for networking, and these were mutually incompatible. It was
believed that OSI was the networking standard that would bring together the
fragmented computer networking market. Thus the OSI standard appeared to be of
vital importance; so much so, in fact, that the USA and some European governments
insisted that they would do business only with organisations using this standard.
You don't need to worry about the details of the standard. What you do need to know
is that although the seven-layer OSI reference model (shown in Figure 4) was the de
jure standard published by ISO and was theoretically a good idea, it was too complex
in practice.
Figure 4 Seven-layer OSI reference model: application, presentation, session, transport, network, data link and physical layers, with user data passed down the stack to transmit and back up the stack on receipt, the two stacks being connected by a physical link
The Internet, which was largely built on Unix machines that used the TCP/IP protocol suite,
grew quickly at the beginning of the 1990s. TCP/IP was very open because it had
been released into the public domain by the US Department of Defense in the
1980s. This meant that anyone was free to implement the TCP/IP networking model.
TCP/IP became the de facto standard for computer communications because it was
the one that was most widely used in practice. The de jure OSI model was never
implemented in its entirety.
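That ubiquity is easy to demonstrate today: every mainstream operating system ships with a TCP/IP implementation and a socket interface to it. The following fragment is purely illustrative (it uses Python's standard socket module, with example.org again as a placeholder host).

    # Opening a TCP connection needs nothing beyond the standard library.
    import socket

    with socket.create_connection(('example.org', 80), timeout=5) as sock:
        print('Connected over TCP/IP to', sock.getpeername())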
This example of TCP/IP and OSI standards largely focuses on the field of personal
computing: the world of mainframe networking was, and still is, dominated by IBM's
Systems Network Architecture (SNA). However, it's not the details of this OSI versus
TCP/IP story I want you to remember; it's the way that we can classify standards in
terms of de jure or de facto, and how deciding to use one standard over another can
sometimes prove a costly mistake.
You can see, therefore, that in terms of the technology adoption lifecycle model
(Part 1), it sometimes pays to be a laggard. I'll return to this model, and the influence
that standards can have on technology diffusion and adoption, later on.
For now, I shall finish my discussion of de jure and de facto with a brief look at those
standards that are sponsored de facto standards whose sponsors then seek out de
jure status. Adobe's PDF document format is one such standard. At the start of 2007,
Adobe began the process of seeking ratification for PDF by a standards body. Here's
an excerpt from the press release made by Adobe (2007, p. 1) at the time:
"Today's announcement is the next logical step in the evolution of PDF
from de facto standard to a formal, de jure standard," said Kevin Lynch,
senior vice president and chief software architect at Adobe. "By releasing
the full PDF specification for ISO standardization, we are reinforcing our
commitment to openness. As governments and organizations increasingly
request open formats, maintenance of the PDF specification by an external
and participatory organization will help continue to drive innovation and
expand the rich PDF ecosystem that has evolved over the past 15 years."
This quote sums up the current preference for open, unsponsored, de jure standards.
As we saw earlier, the European Union provides a definition of an open standard as
part of the European Interoperability Framework. The EU, like many other
organisations both large and small, sees open, unsponsored, de jure standards as the
key to interoperability, low switching costs and avoiding vendor lock-in.
The importance of standards
Producers and consumers
So why are standards important? Well, first of all we should consider for whom
standards are important. Technology consumers like standards when they are open
because it gives them the freedom to choose between different technology suppliers,
safe in the knowledge that different products using the same standard will be
interoperable. This means that should they wish to switch to another supplier, it is
relatively easy to do so and they have a choice of which supplier to go to. They are
not locked in to one supplier.
Technology producers have for a long time favoured closed standards. The most
successful technology producers of ICT in the past have been those that have
capitalised on closed standards. IBM in the 1960s and 70s and Microsoft in the 80s
and 90s both produced technologies based on closed standards, which gave
consumers little freedom to choose alternatives once these standards had been
established as de facto.
Accompanying the successful diffusion of the Internet, which as you have seen is built
on open standards, has been a shift away from closed towards open standards in ICT.
The shift in power in the producer–consumer relationship has consequently been
away from the producer and towards the consumer.
Network effects
A network effect occurs when individuals receive value from other people using the
same technology. The classic example is a fax machine. A fax machine was a very
popular early way of conducting business electronically, providing the permanence of
paper, but with speed and convenience approximating that of a phone call. Its value to
a business would be pretty limited, however, if that business was the only one with a
fax machine. The value would increase in relation to the number of other people who
also had a fax machine.
Bob Metcalfe, one of the inventors of Ethernet, formulated a 'law' that expresses the
relationship between value and the number of users using the same technology.
Metcalfe's law, which is really a rule of thumb, can be paraphrased as follows:
The value of a telecommunications network (V) is proportional to the
square of the number of users of the system (N²).
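As a rough, back-of-the-envelope illustration (not part of Metcalfe's own formulation), the sketch below counts the possible pairwise connections between users, which is what grows approximately as the square of N.

    # Metcalfe's rule of thumb: value grows roughly with the square of the number of users.
    def potential_connections(n):
        return n * (n - 1) // 2     # distinct pairs of users who could exchange faxes

    for users in (10, 100, 1000):
        print(users, 'users ->', potential_connections(users), 'possible connections')
    # A tenfold increase in users gives roughly a hundredfold increase in connections.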
The fax machine example is a little simplistic, however. In reality it's not just about
having a fax machine, but about having a fax machine that communicates using the
same standard as existing fax machines. There is a series of open, unsponsored, de
jure standards for fax transmissions that is published by the International
Telecommunication Union (ITU), another long-standing standardisation organisation.
These open standards provide the freedom to choose between any manufacturers
who are producing a fax machine that adheres to the ITU standard. So it is really the
standards that exhibit network effects.
The fax machine is an example of a direct network effect; that is, the value is
derived from the number of users using the same product (utilising the same
standards). However, there are also indirect network effects that are a result of the
availability of complementary products. For our fax machine example it might be the
availability of replacement paper rolls. The greater the availability of replacement
paper rolls (made to the same standard), the greater the network effect and the more
value the fax machine has.
We can see lots of examples of indirect network effects in ICT too. One of the
valuable things about Microsoft Windows is that there are lots of complementary
products available for that operating system platform. I'm thinking of software
packages, peripherals with Windows drivers, and books and other documentation.
Because there are so many complementary products, the indirect network effect is
strong.
Indirect network effects are not just about the availability of complementary products.
They can also be in the form of knowledge and skills gained from learning about a
technology. Standards can mean that individuals only invest once in learning about a
technology. The archetypal example here is the QWERTY keyboard layout for UK and
US computers.
The QWERTY layout was designed in the days of mechanical typewriters to stop the
hammers that strike the paper from jamming, by separating commonly paired letters.
With the advent of computers as word processors, we no longer have any
need for this standard, yet the indirect network effects of individuals having learnt and
become accustomed to the QWERTY standard are so strong that adoption of another
standard such as the Dvorak layout (often touted, unsurprisingly, by its inventor as
more efficient) seems unthinkable.
The indirect network effects of learning also ensure that organisations and industries
are keen to standardise technology adoption around key standards so that there is
always a large, skilled pool of labour to draw upon.
Standards as innovations
Standards can be seen to mediate the relationship between technology and business.
Increasingly, the choice for organisations has become not what technology to adopt,
but what standard to adopt. We have already looked at some Internet standards, and
it seems very difficult to think back to a time before open standards dominated the ICT
landscape. However, as you saw in the case of character encoding, there can be
multiple open standards that are in competition with each other. New standards are
innovations that can displace more established standards.
Once a standard has become established as de facto, technologies that support that
standard are readily accepted. De facto standards such as these speed up the
technology diffusion process by reducing the uncertainty and risk involved in adoption.
Remember, though, that a closed de facto standard may well reduce risk and
uncertainty, but it also locks organisations in to a specific vendor. Open standards
reduce this chance of lock-in and separate the choice of technology from the choice of
standard.
However, developing standards still present organisations with choices. A de facto
standard is only de facto because it is most popular. If we look at a standard as being
an innovation, we can use the technology adoption lifecycle model to think about
those standards that have not crossed the chasm and are yet to see adoption by the
mainstream market. There are often multiple emerging standards that are competing
with each other. The area of web services is a good example. You'll meet some of the
standards for web services later in the course, and I'll pick up on the REST versus
SOAP debate at the end of this block.
Part 3 of this block looks at a framework for the adoption of e-business standards. As I
said previously, the proliferation of open standards now means that the adoption
decision is about which standard to adopt rather than which technology. From the
perspective of the late majority and laggards, making decisions only when the de facto
standard has been firmly established, it's usually pretty clear what the choice should
be. But for the innovators, the early adopters and even the early majority, there are a
number of factors that must be given careful consideration in order to result in an
informed adoption or rejection decision. The framework in Part 3 considers these
factors.
Summary
In this part of Block 1 I have given you a glimpse of the bewildering array of available
standards. Classifying them in terms of open or closed, sponsored or unsponsored, de
facto or de jure can help us to make sense of this complexity.
Standards bodies often compete amongst themselves and can set different standards
for the same technologies. Heavy investment of an organisation's resources into an
immature standard can prove to be a costly mistake if the standard fails to cross the
chasm. Technology producers, consumers and governments all have an interest in
setting, maintaining and sometimes extending standards.
Open standards are important in order to maintain vendor neutrality and reduce
lock-in. They promote interoperability and encourage sharing. For developers and
consumers they provide a much-needed stability. An important factor in the diffusion
of the Internet and the Web has been that their foundations are built on open standards.
Standards can be treated as innovations. As such, organisational adoption decisions
apply increasingly to which standard to adopt rather than which technology to adopt.
Glossary
ASCII (American Standard Code for Information Interchange) A 7-bit standard for
encoding in computers the characters of the English/Latin alphabet, together with
decimal digits and common punctuation symbols.
closed standard A standard that is not published for public use and is not available
for third parties to implement without the payment of royalties. Contrast with open
standard.
de facto Describes a standard that dominates because it is the most popular in
practice rather than because it is prescribed by a standards body. Contrast with de jure.
de jure Describes a standard that has been ratified and published by a
standardisation body or prescribed in law. Contrast with de facto.
direct network effect The increase in value of a product that is directly
related to an increase in the number of users using a similar product and
communicating via compatible standards. Contrast with indirect network effect.
Dvorak An alternative to the QWERTY keyboard layout, named after its inventor
August Dvorak. It aims to improve typing speed and accuracy with an ergonomic
arrangement of characters according to frequency of use, but has not seen
widespread adoption.
EBCDIC (Extended Binary Coded Decimal Interchange Code) A standard for
character encoding developed by IBM for use on its mainframe computers.
embrace, extend and extinguish A three-phase strategy used by businesses in an
attempt to gain a competitive advantage. The strategy involves embracing an open
standard, adding proprietary extensions to that standard, and then using the network
effect of these extensions to extinguish the competition.
Ethernet A widely used local area networking technology.
HTTP (Hypertext Transfer Protocol) The lightweight protocol used to communicate
requests between web clients and web servers.
IEEE (Institute of Electrical and Electronics Engineers) A large technical professional
association, formed in 1963 after the merger of two older institutes, that is responsible
for the formation of a range of international technology standards.
IETF (Internet Engineering Task Force) An open community whose purpose is to
coordinate the evolution of the Internet through standards processes based upon RFCs.
indirect network effect The increase in value of a product as a result of an increase
in the availability of complementary products. Contrast with direct network effect.
interoperability The ability of software and/or hardware produced by different
manufacturers to communicate with each other.
ITU (International Telecommunication Union) An international standardisation
organisation for radio, video and telecommunications, formed in 1865.
open source A term referring to software that has its source code made publicly
available for user modification.
open standard A standard published for public use and available for third parties to
read and implement without royalties. Contrast with closed standard.
OSI (Open Systems Interconnection) A de jure networking standard ratified by ISO
in the 1980s. Although the OSI seven-layer model was embraced by some
governments, it was never implemented in its entirety.
proprietary standard Another term for sponsored standard.
public domain A term referring to something that belongs to the public as a whole
and is not subject to copyright law, i.e. it is 'in the public domain'.
public standard Another term for unsponsored standard.
QWERTY The de facto standard keyboard layout on English-language computer
keyboards, in which q, w, e, r, t and y are the first six keys on the top row of letters.
REST (Representational State Transfer) An architectural style that encompasses the
common web standards such as HTTP, URL, HTML (and XML), etc.
reverse engineering The process of analysing the workings of a product for the
purpose of constructing another when access to the original design or source code is
not possible.
RFC (request for comments) The name given by the IETF to the documents it
publishes, some of which are Internet standards.
SNA (Systems Network Architecture) A proprietary networking protocol used in IBM
mainframe environments.
SOAP A W3C standard that aims to be a lightweight protocol to enable the creation
of distributed applications using heterogeneous software components across the
Internet. The acronym was formerly considered to stand for Simple Object Access
Protocol, but this expansion is now defunct.
sponsored standard A standard that is owned and maintained by a single business
organisation. Contrast with unsponsored standard.
switching costs The costs associated with moving from one provider of a product
or service to another.
unsponsored standard A standard that is ratified and published by a
standardisation organisation. Contrast with sponsored standard.
vendor lock-in A situation where a customer is dependent on the products or
services of one vendor because of a lack of (compatible) alternatives or because of
the prohibitively high switching costs involved in moving vendors.
Answers to self-assessment activities
Activity 1
Table 3 is my version of the completed table. Your example languages may well differ
from mine in some instances. Note in particular the non-standard naming convention
adopted by the standard once it reaches ISO 8859-5.
Table 3 Completed version of Table 1
Standard Name Examples of language coverage
ISO 8859-1 Latin Alphabet No. 1 English, French, German, Italian, Latin
ISO 8859-2 Latin Alphabet No. 2 Croatian, Czech, Hungarian
ISO 8859-3 Latin Alphabet No. 3 Esperanto, Maltese
ISO 8859-4 Latin Alphabet No. 4 Latvian, Lithuanian, Slovene
ISO 8859-5 Latin/Cyrillic Alphabet Russian, Serbian, Ukrainian
ISO 8859-6 Latin/Arabic Alphabet Arabic
ISO 8859-7 Latin/Greek Alphabet Greek
ISO 8859-8 Latin/Hebrew Alphabet Hebrew
ISO 8859-9 Latin Alphabet No. 5 Turkish
ISO 8859-10 Latin Alphabet No. 6 Faroese, Icelandic
Activity 3
For this activity I used the European Union's definition of an open standard,
introduced earlier, to inform my decision regarding a standard's openness or
closedness. My version of the completed matrix is shown in Figure 5. Yours will of
course differ, but you should be able to explain why yours differs from mine using a
reasoned argument, including supporting evidence. In particular, at the time of writing
(2007) the PDF format is still a sponsored standard owned by Adobe; however, Adobe
promises soon to release the format to a standards organisation.
Figure 5 Completed version of the standards classification matrix, with XML, HTML, PDF, SWF and DOC plotted against the open–closed and sponsored–unsponsored axes, and the unsponsored/closed quadrant left empty
Regarding the bottom right section of the matrix (representing standards that are both
unsponsored and closed), I think it would be difficult by anybody's definition to find a
standard that would fit into this box. Unsponsored implies open.
Bear in mind also that a sponsored, open standard is not possible according to the
EU's definition, as the EU stipulates that an open standard must be maintained by a
not-for-profit organisation. Using my more liberal initial definition of an open standard
would have allowed PDF to occupy this section of the matrix.
Acknowledgement
Grateful acknowledgement is made to the following source:
Figure 1 and Figure 2: Permission to reproduce extracts from British Standards is
granted by BSI. British Standards can be obtained in PDF format from the BSI
online shop (http://www.bsi-global.com/en/Shop/) or by contacting BSI Customer
Services for hard copies (tel: +44 (0)20 8996 9001, email: cservices@bsi-
global.com).
