Tommi Space

A History of the Internet and the digital future

Notes, citations, and thoughts about a deeply insightful work I am using to give historical context to my bachelor’s thesis.

p. 15:

Baran was suggesting combining two previously isolated technologies: computers and communications. Odd as it might appear to readers in the digital age, these were disciplines so mutually distinct that Baran worried his project could fail for lack of staff capable of working in both areas.

Splitting information and communications in small, fragmented packets of data.
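The idea of fragmenting a message into small packets can be sketched in a few lines. This is a toy illustration of the principle, not Baran’s actual design: each packet carries a sequence number, so the receiver can rebuild the message whatever order the packets arrive in.

```python
# Toy sketch of packet fragmentation (illustrative, not Baran's design):
# a message is split into small, independently deliverable packets,
# each tagged with a sequence number for reassembly.

def fragment(message: str, size: int) -> list[tuple[int, str]]:
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i * size:(i + 1) * size])
            for i in range((len(message) + size - 1) // size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rebuild the message regardless of the order packets arrived in."""
    return "".join(payload for _, payload in sorted(packets))

packets = fragment("survivable communications", 8)
packets.reverse()  # simulate out-of-order arrival
assert reassemble(packets) == "survivable communications"
```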

page 16:

Baran’s concept had the same centrifugal character that defines the Internet today. At its most basic, what this book calls the centrifugal approach is to flatten established hierarchies and put power and responsibility at the nodal level so that each node is equal. Baran’s network focused on what he called user-to-user rather than … centre-to-centre operation.
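Why flattening the hierarchy matters can be shown with a toy comparison, my own illustration rather than Baran’s model: in a centralized star network, destroying the hub isolates every surviving node, whereas a distributed mesh with redundant links keeps its survivors connected.

```python
# Illustrative sketch (not Baran's model): centre-to-centre vs
# distributed topology, compared by what survives losing one node.

def reachable(adj: dict[str, set[str]], start: str) -> set[str]:
    """Nodes reachable from `start` (simple graph traversal)."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for neighbour in adj.get(node, set()):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return seen

def destroy(adj: dict[str, set[str]], victim: str) -> dict[str, set[str]]:
    """Return the network with `victim` (and its links) removed."""
    return {n: nbrs - {victim} for n, nbrs in adj.items() if n != victim}

# Star: every node talks only through the hub.
star = {"hub": {"a", "b", "c"}, "a": {"hub"}, "b": {"hub"}, "c": {"hub"}}
# Mesh: each node holds redundant links to several peers.
mesh = {"a": {"b", "c", "d"}, "b": {"a", "c"},
        "c": {"a", "b", "d"}, "d": {"a", "c"}}

# Losing the star's hub isolates every survivor...
assert reachable(destroy(star, "hub"), "a") == {"a"}
# ...while the mesh keeps its survivors connected after losing a node.
assert reachable(destroy(mesh, "c"), "a") == {"a", "b", "d"}
```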

Even though it would have been economically extremely convenient ($60M instead of $2B per year), AT&T refused to invest in digital communication technologies.

Scientific advancement and research were perceived as the new frontier of modern warfare, one at which the Soviets succeeded far more than the US up to the late 50s. RAND comes into play: a degree of intellectual freedom which is … unique.

In February 1958 ARPA was born:

It would be a small operation that would issue contracts for research and development to other organizations.

In 1962, Licklider was hired by ARPA to focus on command and control and on behavioral sciences. Licklider believed that improving the usability of computer systems would lay the foundations for improved command and control.

At the core of Licklider’s thinking was an emphasis on collaboration. Licklider posited a future scenario in which a researcher at one research centre could find a useful computer resource over the network from a research centre elsewhere. This, in a world of incompatible machines and jealously guarded computing resources, was far-sighted talk indeed.

Computing resources, which in the 60s were scarce and jealously guarded, are now abundant; what is jealously guarded today is the software and the algorithms that platforms provide.

Licklider’s influence was felt further afield through his support of large research programmes in universities that stimulated the early computer studies departments and attracted the new generation of students to the new field.

Zott’s beer garden as one of the main locations in the story of the Internet. The recurring partying factor in the development of digital technologies, as with, for example, Facebook (page 34).

The invention of the transistor at Bell Labs in 1947 liberated computing from the large, unreliable vacuum tubes on which the first digital computers were based.

page 53

Tim Berners-Lee developed a piece of software called Enquire in the 1980s to map relationships between the various people, programs and systems he encountered there [CERN]

page 105

He [Berners-Lee] and visionaries such as Ted Nelson and Douglas Engelbart who came before him all sought to develop tools that could allow human beings to better cope with the weight of information bearing down on them.

page 106, hypertext by Ted Nelson

Its [the www’s] universality is essential: the fact that a hyperlink can point to anything, be it personal, local or global, be it draft or highly polished.

In October 1990 came HTML editing and viewing; in December 1990, the first servers. Disinterest, slow implementation and adoption even within CERN. The first thing to be widely written in HTML was CERN’s telephone directory.

In 1995 Netscape went public with the largest IPO ever.

For the first few years of its existence the Web was defined by nothing more than specifications stored on some disk sitting around somewhere at CERN.

When the early developers of www software met in early 1994 for a www wizards’ workshop, Berners-Lee proposed the idea of a consortium to oversee the future direction of the Web. The www Consortium, known as W3C, was formed to promote interoperability and standardization of web technologies.

Stallman and proprietary software

Open-source: the hacker renaissance, from page 111:

The delivery of new computers whose manufacturers demanded that their users sign non-disclosure agreements prohibiting them from swapping code and building on each other’s work. From Stallman’s perspective:

This meant that the first step in using a computer was to promise not to help your neighbor. A cooperating community was forbidden. The rule made by the owners of proprietary software was, If you share with your neighbor, you are a pirate. If you want any changes, beg us to make them.

The community that emerged from the wide collaboration on Linux was a loosely governed structure over which Torvalds watched as a benevolent dictator and moral authority.

This scale of collaboration was possible because of the Internet, much as Licklider and Roberts had envisaged in 1968 when they mooted that the interconnection of separate communities would make available to all the members of all the communities the programs and data resources of the entire super community. Torvalds’ lieutenants have ownership of parts of the project. They are, he claims, selected by the community according to merit:

The programmers are very good at selecting leaders. There’s no process for making somebody a lieutenant. But somebody who gets things done, shows good taste, and has good qualities - people just start sending them suggestions and patches. I didn’t design it this way. This happens because this is the way people work. It’s very natural.

He, however, maintains approval over final decisions on major new modifications to Linux and acts as a coordinator. Alerting readers to the advent of open-source in 2003, Wired published a feature on the phenomenon that delivered the message in no uncertain terms: Open source is doing for mass innovation what the assembly line did for mass production. The involvement of such a large community of developers in proposing, testing and refining contributions to Linux made the system adaptable but solid.

  • What distinguished the capitalist efforts at making Web technologies from communitarian ones such as Linux?
  • Consider the aspect of the technology and software vs. the hardware it runs on

page 118: Google’s initial focus was on backlinks (PageRank). Freedom to seek at page 119. googol: Google’s success and effectiveness are based on a massive amount of data, centralized, processed, inspected, ranked:

The enormity of the googol, according to Google, reflects their mission to organize a seemingly infinite amount of information on the Web.
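The backlink idea behind PageRank can be sketched compactly. The power-iteration loop below follows the published algorithm, but the tiny link graph and the conventional 0.85 damping factor are illustrative choices of mine, not anything from the book.

```python
# Toy PageRank by power iteration: a page's rank is earned from the
# pages linking to it (its backlinks). Graph and damping factor are
# illustrative choices.

def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Rank pages by the weight of the backlinks pointing at them."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:  # dangling page: spread its rank over all pages
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
        rank = new
    return rank

# "c" has the most backlinks, so it earns the highest rank.
links = {"a": ["c"], "b": ["c"], "c": ["a"]}
rank = pagerank(links)
assert rank["c"] > rank["a"] > rank["b"]
```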

p. 120

eBay proved that the Internet was a marketplace so massive, and so diverse, that nothing is beyond commodification.

The Internet became a global marketplace without any explicit plan to make it so. Indeed, until the very beginning of 1995 commercial activity on the Internet was expressly forbidden under the National Science Foundation’s Acceptable Use Policy.

By the 90s, more than half of the Internet was run by the private sector. Preventing commercial activity within the Internet would have inhibited investments in the infrastructure of commercial TCP/IP services.

In 1993 the NSF took steps to privatize the operation of the Internet. NSF published a plan for the change in May 1993: ISPs would operate their own networks, and gateways would interconnect them.

In 2003 Joshua Schachter founded del.icio.us, a directory of bits and pieces of Web content that users had tagged with reference phrases to denote that they were of interest for one reason or another.
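At its simplest, such a tagging directory reduces to a map from tags to sets of URLs. The class below is an invented sketch of the idea, in the style of del.icio.us, not Schachter’s implementation.

```python
# Minimal sketch of a social bookmarking directory (invented for
# illustration): users file URLs under free-form tags, and anyone
# can look bookmarks up by tag.
from collections import defaultdict

class TagDirectory:
    def __init__(self) -> None:
        self.by_tag: dict[str, set[str]] = defaultdict(set)

    def bookmark(self, url: str, tags: list[str]) -> None:
        """File a URL under each of the user's chosen tags."""
        for tag in tags:
            self.by_tag[tag].add(url)

    def lookup(self, tag: str) -> set[str]:
        """All URLs the community has filed under `tag`."""
        return self.by_tag.get(tag, set())

d = TagDirectory()
d.bookmark("http://info.cern.ch", ["history", "www"])
d.bookmark("http://www.w3.org", ["www", "standards"])
assert d.lookup("www") == {"http://info.cern.ch", "http://www.w3.org"}
```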

p. 144