Internet History and Usage

The World Wide Web (WWW) is pervasive in daily life. Since surfing the Web and using email are routine activities for most people, it seems as though these technologies have been around forever. Certainly, the underlying technology of the Internet goes back at least 40 years, but the Web is a recent phenomenon that has experienced major growth within the past decade.

Like most technologies, the Web evolved from predecessors whose creators could not have predicted the final form into which their inventions would morph. Technology has a way of starting with a nascent sense of purpose and then branching into arenas that were not imagined at the start. The historical development of these background technologies provides an interesting canvas upon which to paint what is still an adolescent portrait of the Web.

ARPANET - Internet Beginnings

Figure 1-1. Sputnik

The Advanced Research Projects Agency (ARPA) was created in 1958 in response to the Soviet Union's successful launch of Sputnik ‒ the world's first artificial satellite ‒ the previous October. Funded by the Department of Defense, the Agency brought together the scientific and engineering talent needed to reestablish American leadership in military and space technology. By 1962, however, ARPA's purpose had expanded to encompass the application of computers to military technology, and a significant part of the expansion dealt with computer communications and networking.

A persistent problem in research and development is bringing together the intellectual capital needed to work on problems or exploit opportunities. Since experts are often scattered geographically, it is difficult to maintain interaction among participants and to sustain the continuity of projects. Electronic communications were therefore deemed an important area of investigation to support ARPA work efforts.

Figure 1-2. ARPANET original drawing

In addition, the Cold War raised concerns about the impact that nuclear war could have on the integrity of the computer networks needed to sustain military command and control. It was unacceptable to think that even a minor network outage could disrupt military command, potentially altering the outcome of a major war and widening its devastation. Thus, the need to support research cooperation among scientists and engineers, combined with concerns about network vulnerability, led to the concept of distributed packet switching as the preferred computer communications model.

In this model, network transmissions are split into small packets that may take different routes to their destination through different nodes ‒ through different computers ‒ along the network. Computers hand off packets of data from one to another through various routes, and the destination computer collects all the packets and reassembles them into the original message. Because different pieces of a message travel along different routes, the security of the message is heightened. Also, since a packet can travel various routes to its destination, an alternate route can be used if the first route is malfunctioning. As a result, a distributed network of interconnecting computers is more secure and can better withstand large-scale destruction than can a centralized network connected to one or a few host computers.
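The splitting and reassembly just described can be sketched in a few lines of Python. This is a toy illustration of the idea, not a real network protocol: the message text, packet size, and function names are invented for the example.

```python
# Illustrative sketch of packet switching: a message is split into
# numbered packets that may arrive out of order and are reassembled
# at the destination. (Simplified model, not a real protocol.)
import random

def to_packets(message, size=8):
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Sort packets by sequence number and rebuild the message."""
    return "".join(chunk for _, chunk in sorted(packets))

message = "Packets may travel different routes to their destination."
packets = to_packets(message)
random.shuffle(packets)          # simulate packets arriving out of order
assert reassemble(packets) == message
```

Because each packet carries its own sequence number, the destination can rebuild the message no matter what order the packets arrive in ‒ the property that makes routing around a failed node possible.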

In 1969, the Department of Defense commissioned ARPANET for research into networking. The first node was at UCLA, closely followed by nodes at Stanford Research Institute, the University of California at Santa Barbara, and the University of Utah. By 1972, much of the work of developing hardware, software, and communications protocols had shifted to universities and research labs. By 1973, ARPANET had linked 40 machines and had international connections to England and Norway.

Figure 1-3. Dr. Leonard Kleinrock

Dr. Leonard Kleinrock is known as the inventor of Internet technology, having created the basic principles of packet switching while a graduate student at MIT. This was a decade before the birth of the Internet, which occurred when his host computer at UCLA became the first node of the network in September 1969. He wrote the first paper and published the first book on the subject, and he directed the transmission of the first message ever to pass over the Internet.

One of the issues in computer communications is the reliability of messages sent from one computer to another. It is possible, if not probable, that the computers are of different makes and models and have different methods for sending and receiving packets of electronic information. When information does not reach the intended computer because of transmission problems, the issue of lost packets comes into focus. These concerns led to the development of TCP (Transmission Control Protocol) to ensure reliable connections between diverse governmental, military, and educational networks. The parallel development of IP (Internet Protocol) dealt with the problems of addressing packets of data and routing them to the correct destinations.

By 1982 it had been decided that ARPANET was to be built around the TCP/IP protocol suite. Doing so enabled direct communications between computers using land lines, radio, and satellite links among different networks of computers. At this point, an "internet" became defined as a connected set of networks, specifically those networks interconnected through TCP/IP. That same year, Exterior Gateway Protocol (EGP) specifications were drawn up under which different networks communicated with one another. By 1984, over 1,000 host computers were part of ARPANET. The Domain Name System (DNS) was introduced to permit the use of host names (e.g., "www.cox.net") in place of the numeric IP addresses (e.g., 68.1.17.9) that identify computers on the networks.
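The relationship between a dotted-decimal address like 68.1.17.9 and the underlying 32-bit number it represents can be shown with Python's standard ipaddress module. Translating a host name into such an address is the job of DNS (for example, via socket.gethostbyname); that step is omitted here because it requires a live network connection.

```python
# A numeric IPv4 address is a 32-bit number written as four bytes in
# dotted-decimal form. DNS lets users type host names instead of these
# numbers; a resolver translates the name at connection time.
import ipaddress

addr = ipaddress.ip_address("68.1.17.9")   # address from the text above
as_int = int(addr)                         # the underlying 32-bit value
assert as_int == (68 << 24) | (1 << 16) | (17 << 8) | 9
assert str(ipaddress.ip_address(as_int)) == "68.1.17.9"
```

Each of the four dotted fields is one byte (0-255) of the 32-bit value, which is why IPv4 addresses never contain a number larger than 255.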

NSFNET - Internet Growth

Expansion of what now had become the Internet began in 1986 through funding by the National Science Foundation. The NSFNET was originally designed to link supercomputers at major research institutions, but it quickly grew to encompass most major universities and research and development labs. By 1990, there were over 300,000 host computers. In 1994, a report commissioned by the NSF entitled "Realizing The Information Future: The Internet and Beyond" was released. This report presented a blueprint for the evolution of the "information superhighway" and had a major impact on the way the Internet was to evolve.

In 1995, after a short but successful history, NSFNET was "defunded" and restrictions were lifted on commercial use. At this point, the stage had been set for exponential growth in Internet usage. Funding that previously supported NSFNET was redistributed to regional networks to help purchase Internet connectivity from the now numerous commercial network service providers. During 1995-1997, the number of host sites increased by over 6 million per year, reaching nearly 20 million. By then, government agencies, educational institutions, and private enterprises were all energetic clients of the Internet.

On October 24, 1995, the Federal Networking Council unanimously passed a resolution defining the term Internet:

"Internet" refers to the global information system that -- (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein.

The Internet can be thought of as a technical infrastructure composed of computers, cables, networks, and switching mechanisms by which one computer can communicate with another computer. In the final analysis, the benefits of networked computers are realized by the information being exchanged among the people sitting behind the computers. From the start, email and file transfer programs were integral to the purpose behind the Internet. Such programs allowed people to keep in touch with each other and gather all the information they needed.

WWW - Information Net

Although email and file transfer methods were important to Internet growth, they did not provide the "user friendly" methods needed by novice users to get to the growing repositories of information scattered around the world. It was still very much a technical issue to communicate through the Internet. Realization of the goal of an information superhighway required the development of tools to "hide" Internet technology behind a human interface. This came with the development of the World Wide Web and Internet browser software.

Figure 1-4. Ted Nelson

In the mid-1960s Ted Nelson coined the word "hypertext" to describe a system of non-sequential links between text. The idea was to navigate among textual references without having to read material in a linear sequence. A piece of information here would lead to a related piece of information there, in a chain of links gathering intelligence from sources scattered throughout multiple documents. It was not until fifteen years later that Tim Berners-Lee, a consultant with the European Laboratory for Particle Physics (CERN), wrote a program called "Enquire" ‒ named after the Victorian handbook Enquire Within Upon Everything ‒ which allowed links to be made between arbitrary text nodes in a document. Each node had a title identifier and a list of bidirectional links so readers could jump from one section of a document to another by activating the text links.

Figure 1-5. Tim Berners-Lee

In 1990 Berners-Lee started work on a hypertext "browser". He coined the term "WorldWideWeb" as a name for the program and "World Wide Web" as a name for the project. The WWW project was originally developed to provide a distributed hypermedia system from which any desktop computer could easily access physics research spread across the world. The Web included standard formats for text, graphics, sound, and video which could be indexed easily and searched by all networked machines. Standards were proposed for the Uniform Resource Locator (URL), the addressing scheme for the Web; the HyperText Transfer Protocol (HTTP), the set of network rules for transmitting Web pages; and the HyperText Markup Language (HTML), the subject of this tutorial.
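The structure of a URL ‒ the addressing scheme mentioned above ‒ can be taken apart with Python's standard urllib.parse module. The URL below is a hypothetical example, not an address from this tutorial.

```python
# A URL encodes everything needed to locate a resource on the Web:
# the protocol (scheme), the host computer, and the path to the document.
from urllib.parse import urlparse

url = "http://www.example.com/tutorials/history.html"  # hypothetical URL
parts = urlparse(url)

assert parts.scheme == "http"              # protocol used to fetch the page
assert parts.netloc == "www.example.com"   # host name, resolved via DNS
assert parts.path == "/tutorials/history.html"  # document on that host
```

The three pieces map directly onto the standards listed above: the scheme selects the protocol (HTTP), DNS turns the host name into an address, and the path names the HTML document on that host.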

The prototype browser was written for the NeXT computer, which was not widely used. A simplified version adaptable to any computer platform was built as the "Line-Mode Browser" and was released by CERN as freeware. Berners-Lee later moved to the Massachusetts Institute of Technology (MIT) and helped found the World Wide Web Consortium (W3C), which today maintains standards for Web technologies.

In January 1993, Marc Andreessen, who was working for the National Center for Supercomputing Applications (NCSA) at the University of Illinois, released Mosaic, a point-and-click graphical browser for the Web that he developed with Eric Bina and that was designed to run on Unix machines. In August, Andreessen and his co-workers at NCSA released free versions for Macintosh and Windows. Andreessen later co-founded Netscape Corporation to produce the Navigator browser: the offspring of Mosaic and one of the first and most popular commercial browsers. In August 1994, NCSA assigned all commercial rights to Mosaic to Spyglass, Inc., which subsequently licensed its technology to several other companies, including Microsoft for use in Internet Explorer. It was not until 1996 that Microsoft became a major player in the browser market. Today, however, Google Chrome has become the most popular browser, with approximately 51% of worldwide market share.

Figure 1-6. History of the Internet ‒ diagram developed by the Malone Media Group

As the Internet evolved and became available to the general public in the 1990s, Internet service providers (ISPs) such as AOL, CompuServe, and Prodigy allowed people to gain access to the World Wide Web directly from their home computers. This led to the growth of electronic commerce (e-commerce). In 1995, Amazon.com set the standard for a customer-oriented e-commerce Web site.

The era of social media took off in the early 2000s with the release of Friendster. This was followed by LinkedIn and MySpace in 2003, and the launch of Facebook in 2004. Since then, other social media applications such as Twitter, Instagram, and Pinterest have gained popularity in both mobile and desktop environments.

Today, over three billion people have access to the Internet. The World Wide Web provides access to social media, shopping, business transactions, research and learning. There is no doubt that the World Wide Web will continue to evolve and provide even greater access to resources.

Technical Convergence

The Internet has been a convergence of many technologies brought together for the purpose of sharing information electronically. Today, the Internet is a system of interconnected networks that uses common communications protocols, or rules of exchange, to transmit information among computers. One of these protocols is the HyperText Transfer Protocol (HTTP). HTTP governs the exchange of hypertext documents, or Web pages, between computers. Information exchanges that use this protocol are collectively called the World Wide Web (WWW). Other Internet protocols include those used for transferring files ‒ the File Transfer Protocol (FTP) ‒ and for exchanging email ‒ the Simple Mail Transfer Protocol (SMTP). The Internet, then, is not a single entity. It integrates many different ways of maintaining and exchanging information among many different computers on many different networks scattered around the world.
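HTTP, the protocol singled out above, is a plain-text exchange: a browser sends a short textual request and the server replies with the document. The sketch below builds such a request as a string to show its shape; the host name is hypothetical, and no actual connection is made.

```python
# Sketch of an HTTP/1.1 GET request as a browser would send it.
# (Hypothetical host; the request is built but not transmitted.)
request = (
    "GET /index.html HTTP/1.1\r\n"   # method, document path, protocol version
    "Host: www.example.com\r\n"      # which site is being asked for
    "Connection: close\r\n"          # close the connection after the reply
    "\r\n"                           # blank line ends the request headers
)

lines = request.split("\r\n")
assert lines[0] == "GET /index.html HTTP/1.1"
assert lines[1] == "Host: www.example.com"
```

That the whole exchange is readable text is part of why the Web spread so quickly: any machine that could send and receive lines of text over TCP/IP could participate.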

The World Wide Web is one such collection of information exchange methods. It is based on the use of Web pages as the mechanisms for packaging and transmitting information among computers connected to the Internet. A Web page includes textual information along with links to related textual or graphical content located anywhere else on the Internet. The information is formatted for presentation using the HyperText Markup Language (HTML) to arrange and style presented information and to link to other content on distant computers. This formatting language is the key to unlocking and bringing world-wide repositories of information to the computer desktop. It is also the means by which personal information is shared with the world. HTML, being the subject of these tutorials, will be examined in depth.
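The role of links inside an HTML page can be illustrated with Python's standard html.parser module. The one-line page below is a hypothetical example; the code pulls out the address each link points to, which is exactly the information a browser uses to jump to related content elsewhere on the Internet.

```python
# A Web page is text plus hyperlinks; this extracts the link targets.
from html.parser import HTMLParser

page = ('<html><body><p>See the '
        '<a href="http://www.example.com/next.html">next page</a>.'
        '</p></body></html>')       # minimal hypothetical HTML page

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the destination of every <a href="..."> anchor tag.
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

collector = LinkCollector()
collector.feed(page)
assert collector.links == ["http://www.example.com/next.html"]
```

Early Web crawlers and indexers worked on essentially this principle: fetch a page, collect its link targets, and follow them outward across the Web.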

From these beginnings, the World Wide Web has grown into the primary infrastructure to deliver information around the globe. A single individual can establish a Web presence accessible by anyone else in the world with an Internet connection, and a single company can establish a Web site to take part in the global marketplace of products and services. Although the Web began as a public utility with limited scope, today it has grown, through the entrepreneurship of individuals and organizations, into what its name implies — a worldwide web of interconnected networks to conduct the public and private affairs of the world community.

In 1969, the Internet began with four nodes and four users. In 2015, according to Internet Live Stats, there were over 3 billion Internet users worldwide, approximately 40% of the world population. Unfortunately, Internet expansion has not been equal around the globe. Countries with the intellectual and managerial talent, along with the political and economic systems to promote that talent, typically lead the way.

When countries are classified by region, Internet usage rankings appear as shown in Table 1-1. The Americas and European countries together account for less than 50% of worldwide usage; the largest number of users is found in Asian countries.

Table 1-1. Region Ranking by Internet Users
Source: Internet Live Stats - July 2013

Rank  Region                       Users          Percent
1     Asia                         1,322,491,069  48.4%
2     Americas (North and South)   596,331,291    21.8%
3     Europe                       520,381,481    19.0%
4     Africa                       268,209,162    9.8%
5     Oceania                      25,109,590     0.9%

Internet Technologies

For most of its history, Internet connection was slow. Users were restricted to existing telephone lines with unreliable dial-up connections. Until recently, most users connected to the Internet at speeds topping out at 56,000 bits of information per second. The last few years, however, have witnessed a significant rise in Internet speeds through the availability of Digital Subscriber Lines (DSL) and cable modem connections with speeds of up to 5,000,000 bits per second. These broadband connections to the Internet continue to rise in the U.S. Today, approximately 70% of Americans have some type of broadband access to the Internet.
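The practical difference between those two speeds is easy to work out. The arithmetic below is an idealized sketch ‒ it assumes a 1-megabyte page and ignores protocol overhead ‒ but it shows why broadband changed the experience of browsing.

```python
# Time to transfer a 1-megabyte (8,000,000-bit) Web page over a
# 56 kbps dial-up line versus a 5 Mbps broadband line.
# (Idealized: ignores protocol overhead and line conditions.)
page_bits = 1_000_000 * 8                  # 1 MB expressed in bits

dialup_seconds = page_bits / 56_000        # about 143 seconds
broadband_seconds = page_bits / 5_000_000  # 1.6 seconds

assert round(dialup_seconds) == 143
assert broadband_seconds == 1.6
```

A page that took well over two minutes to arrive by dial-up downloads in under two seconds over a 5 Mbps connection ‒ nearly a ninety-fold difference.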

Most workers in the U.S. also have high-speed lines to the Internet through their company's network connections. As of mid-2005, over 80% of workers had access to high-speed connections.

In developing Web pages, it is important to know the target browsers being used by site visitors. Browsers differ in their underlying technologies and the extent to which they support common standards. There are few guarantees that a Web page will display the same way through two different browsers. Statistics reported in Table 1-2 show the percentages of browsers in use today.

Table 1-2. Browser Usage
Source: W3Counter - July 2015

Browser            Percent
Chrome             46.2%
Safari             16.4%
Internet Explorer  13.5%
Firefox            13.2%
Opera              3.9%

If you are designing Web pages for a known audience with a known browser, then your development efforts become relatively easy. It is necessary only to test your pages through that particular browser. In designing for the general public, you need to make assumptions about your likely audience. Ideally, you should test your pages on all of the most popular browsers ‒ for example, on both Internet Explorer and Firefox. As a general rule, if you follow the W3C standards presented in these tutorials, your pages will have the best chance of being displayed correctly on all browsers that follow the outlined standards.

All modern PC monitors can display at 1366 x 768 (pixel) screen resolution or higher, and many users choose this resolution for displaying Web pages. A safe approach is to design Web pages for display at 1366 x 768 resolution unless you are aware that your audience prefers the larger page sizes possible under higher resolutions. With technology quickly advancing, within a few years 1366 x 768 resolution will become the minimum standard.

It should be noted that screen resolution is not related to screen size. Even small screens (15" or 17" for example) can be set to high resolution display depending on the amount of video memory installed in the system. Still, the window size within which the browser is opened can have a significant effect on Web page display. A full-screen display of a page normally looks different from a page opened in a smaller window because the page adjusts its layout to the window size. This automatic adjustment permits the page to expand or contract to the chosen window width, and makes it less crucial to design for a particular screen resolution or a particular window size.

When displaying color graphics on a page, consideration needs to be given to the color depth (the range of colors) of monitors. There are three common color precision modes. Users with older PCs normally have 8-bit (256-color) displays, although only about 1% of users have this restriction. Other users vary between 16-bit (65,536-color) and 24-bit (16,777,216-color) displays, representing approximately 18% and 72% of users, respectively. When creating your own graphics, you have a choice of the color depth displayed. When using prepared graphics, you may not have this choice. Just be aware that images saved at high color depths may not display colors accurately on monitors with small amounts of video memory and fewer possible colors.
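The color counts quoted above all come from one rule: an n-bit display can show 2 to the power n distinct colors, so each extra bit of depth doubles the palette. The quick check below confirms the three figures.

```python
# Number of distinct colors available at a given color depth:
# each additional bit doubles the count.
def colors(bit_depth):
    return 2 ** bit_depth

assert colors(8) == 256           # older 8-bit displays
assert colors(16) == 65_536       # 16-bit "high color"
assert colors(24) == 16_777_216   # 24-bit "true color"
```

This is also why 24-bit color is effectively the ceiling for display purposes: over sixteen million shades already exceeds what the eye can distinguish on screen.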

Given the trends in Web technologies, the good news for Web developers is that fairly modern computer systems are in preponderance. This means that it is usually safe to use the latest Web technologies in authoring Web pages, with little need to compromise best practices to reach audiences with older technologies. The best bet is to design for the most popular browsers displayed at 1366 x 768 resolution at a color depth of 24 bits in a full-screen window. Adjustments can be made for other browsers, other screen resolutions, and other color precisions if it is reasonable to expect page visits by users with other technologies.

Internet Standards and Coordination

The Internet is a global network that is not run or managed by a single person or group. Each separate network that makes up the Internet is managed individually. There are, however, a number of groups that develop standards and guidelines.

Internet Society

The Internet Society (ISOC) is an independent international nonprofit organization founded in 1992 to provide leadership in Internet related standards, education, and policy around the world. The Internet Society is home to the Internet Engineering Task Force (IETF) and the Internet Architecture Board (IAB) which are responsible for Internet infrastructure standards.


Internet Engineering Task Force

The Internet Engineering Task Force (IETF) is a large open international community of network designers, operators, vendors, and researchers concerned with the evolution of the Internet architecture and the smooth operation of the Internet. It is open to any interested individual.


Internet Architecture Board

The Internet Architecture Board (IAB) is a committee of the IETF that provides guidance and direction to the IETF. A major function of the IAB is the publication of the Request for Comments (RFC) document series. An RFC is a memorandum published by the IETF describing methods, behaviors, research, or innovations applicable to the working of the Internet and Internet-connected systems. Through the Internet Society, engineers and computer scientists may publish discourse in the form of an RFC, either for peer review or simply to convey new concepts, information, or (occasionally) engineering humor. The IETF adopts some of the proposals published as RFCs as Internet standards.

