Robert Fulford's column about Tim Berners-Lee, the inventor of the Web

(The National Post, August 1, 2000)

Think of mathematics as the Latin of modern times. Across the world, it plays, as several historians have noted, the role that Latin played for Europeans in the Middle Ages. It's the international language of vital work. It unites those whose thoughts produce big changes, and it helps make those changes occur. We who know nothing of mathematics (like Europeans who knew nothing of Latin in, say, 1350) are fated to be, in a crucial sense, more spectators than participants at the central dramas of our lifetime.

Tim Berners-Lee, who devised the World Wide Web, is no spectator. He received mathematics as a birthright, the way others absorb carpentry or journalism from their parents. His mother and father, both Manchester University mathematicians, helped design the first commercially sold computer. It was born in 1951, several years before Tim. When he was growing up, he recalls, "we discussed imaginary numbers over breakfast."

His father read books about brain structure, trying to guess how a computer might someday make that sudden intuitive leap that solves otherwise insoluble problems. The father's curiosity affected the son, though that seems clear only in retrospect.

Berners-Lee did a physics degree at Oxford and worked as a software engineer. In 1980, he went to Geneva as a consultant at CERN, the European Particle Physics Laboratory. There, a little miracle happened. He wrote his first Web-like program, to keep track of the CERN scientists and their projects. Eventually he developed a series of electronic connections that the scientists could use both at CERN and when they went home to institutions such as the Stanford Linear Accelerator in California or the Rutherford Lab in Britain. Berners-Lee made it possible for them to do what every Web user does now: click on a certain line to make a document appear.

People just starting to use these systems sometimes confuse the Internet and the Web. In fact, the Internet was there long before Berners-Lee, and so was e-mail. So, for that matter, was the hypertext link, a way of connecting one document to another. But these were specialized tools of academics, governments and some businesses. When you used them to get information, you paid money.

As Berners-Lee puts it, "I happened to come along ... after hypertext and the Internet had come of age. The task left to me was to marry them." Ten years ago this summer, he was working through these ideas, testing them privately, showing them to colleagues, refining them. In the autumn of 1990 he wrote the first Web browser (WorldWideWeb) and created the first Web site (info.cern.ch). In 1991 he made his system public, and soon others were extending it.

Berners-Lee and his colleagues wrote code that expressed a new way to organize information. They taught the world a new language. They worked out a markup language, the Hypertext Markup Language (HTML), in which every Web page is written. They standardized the system for moving documents between machines, the Hypertext Transfer Protocol (HTTP). Most important, perhaps, they worked out the system for naming and finding sites and documents, the Uniform Resource Locator (URL).
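For readers who want to see how the three pieces fit together, here is a minimal sketch, mine rather than the column's, using only Python's standard library: the URL names a document, HTTP fetches it, and the HTML markup carries the hypertext links. The address info.cern.ch is the first Web site mentioned above; the rest of the snippet is purely illustrative.

```python
# A sketch of the Web's three keys working together:
# a URL names a document, HTTP transfers it, HTML marks up the links.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect the target of every <a href="..."> hypertext link."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

url = "http://info.cern.ch/"            # the URL: where the document lives
with urlopen(url) as response:          # HTTP: the transfer protocol
    html = response.read().decode("utf-8", errors="replace")

parser = LinkCollector()                # HTML: the markup carrying the links
parser.feed(html)
for link in parser.links:
    print(link)
```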

These three keys unlocked a universe of knowledge. They made it possible for me, sitting at home, to search the Library of Congress catalogue, get this morning's political news from Tokyo, learn my bank balance or carry out tasks I haven't even thought about yet. On the Web, hypertext became a particularly enthralling device. It seems to mimic the restless flickering of the human intelligence, the spontaneous brain activity that makes connections and leads to intuition.

Berners-Lee was 34 years old in 1990, when he did his most important work. It didn't make him rich. In fact, he believes it wouldn't have worked if he had patented it. Companies would then have developed rival, incompatible systems, designed to keep out the non-paying public. It would have been impossible to link them. The Web as we know it might not have developed for a long time.

Something else surprises anyone who considers the history of the Web: It has no central control. You do not apply for a licence (except to the extent of registering a name for your site), and when you leave, you do not bother to resign. You remove your document from the Web without necessarily telling anyone. That leaves the Web with dangling links, a familiar frustration: we think we've found the material we want, we click on the blue line, and the screen shows a "not found" sign. But Berners-Lee realized early on that this was a price we would all have to pay in order to have the Web operate freely across the world.
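That "not found" sign is, in practice, the server's HTTP 404 status code. As a small illustrative sketch, again mine and not the column's, one can ask a server for a document and report whether the link dangles; only the standard library is used, and the example URL is the one named above.

```python
# Check whether a hypertext link dangles: ask the server for the document
# and interpret its HTTP answer. 404 is the familiar "not found" sign.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def link_status(url: str) -> str:
    """Return a short description of what clicking this link would show."""
    try:
        with urlopen(url, timeout=10) as response:
            return f"{response.status} -- document found"
    except HTTPError as err:            # the server answered, but with an error
        if err.code == 404:
            return "404 -- dangling link"
        return f"{err.code} -- server error"
    except URLError as err:             # no server answered at all
        return f"unreachable ({err.reason})"

print(link_status("http://info.cern.ch/"))
```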

He's still at work on his project. Since 1994 he's been at the Massachusetts Institute of Technology, running the World Wide Web Consortium (W3C), an agency representing Web companies, hardware providers, etc. It keeps the Web on an even keel by refining the rules in ways that are acceptable to just about everybody. This means, he says, "as close as possible to no rules at all."

Berners-Lee is a Utopian evangelist, like many people connected with the Web. He says things like "the openness of the Web is a powerful attraction. Everyone can not only read what's on the Web but contribute to it, and everybody is in a sense equal." That's nonsense, of course, but it's the sort of thing Web pioneers say.

Last week, to learn a little about Berners-Lee, I bought a copy of his 1999 autobiography, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by Its Inventor (HarperCollins). I would love to tell you that this superb narrative deserves a place among the masterworks of science writing. Alas, neither Berners-Lee nor his ghost writer, Mark Fischetti, can tell a story, explore an idea or describe human beings in a way that brings them to life.

Never has the foundation myth of a great enterprise been described with so little spirit. I imagine an account of the Rotary Club's charitable activities in Kelowna would be as gripping. But then, we can hardly expect Berners-Lee to be good at everything. My tiny annoyance at his inadequate book doesn't offset for a second my profound gratitude to one of the major inventors of my lifetime, a man who used mathematics to bestow a previously unimagined gift on everyone who lives by knowledge.
