April 1, 2008
by Jack Kessler, firstname.lastname@example.org
|1) Magic!... Experiments, the late 1980s|
|2) Lessons learned, the early 90s|
|3) Education, the late 90s|
|4) Commerce, the 21st c|
|5) Communication: "The Revolution Will Be Handheld" -- access and other myths|
|6) Of Feedback Loops & Falcons -- conclusion|
|Notes and Bibliography|
The first Internet I ever knew was buried deep in a cavern, underground, at Berkeley...
1) Magic!... Experiments, the late 1980s
We users had little preparation for networked digital information's wonders, back in 1989. We knew, and imagined that we understood, radio and television and telephony. Many of us had used telegrams, and a few of us even knew the teletype. And most of us had used "computers", by then, although few among us pretended really to understand the things.
The idea, though, that all of these one day might combine, and miniaturize, and multiply, was unimaginable to any of us, back in 1989. William Gibson knew they would -- his Neuromancer 1 people were plugged into The Matrix as long ago as 1984 -- but too few of us were reading Gibson, back in the 80s.
Most late-1980s users already knew the typewriter, though: speed-typing, that most valuable class in our 1960s high schools... The first "magic" about digital information for me, then, was word-wrap: like many early users I sat at my 1980s office desk, fascinated, watching WordStar shift my text instantly and silently down to the next line, without my having to reach for any clunky carriage-return lever. Digital magic.
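Underneath, that word-wrap "magic" is a simple greedy algorithm: pack words onto a line until the next one would overflow, then break. A minimal sketch, in modern Python rather than anything WordStar actually ran (the function name and width are illustrative):

```python
def word_wrap(text, width=40):
    """Greedy word-wrap: accumulate words on the current line until
    the next word would exceed `width`, then start a new line."""
    lines, current = [], ""
    for word in text.split():
        if not current:
            current = word
        elif len(current) + 1 + len(word) <= width:
            current += " " + word
        else:
            lines.append(current)
            current = word
    if current:
        lines.append(current)
    return lines

for line in word_wrap("WordStar shifted my text instantly and silently "
                      "down to the next line, with no carriage-return "
                      "lever in sight.", width=30):
    print(line)
```

No klunky lever required: the break point is recomputed on every keystroke, which is what made it look instantaneous.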
In 1989 the distance communication I saw in the Evans Hall basement that day was digital magic too. Users think in childhood metaphors: to me getting from Berkeley to a book at Oxford over 5000 miles away, digitally and instantly, was "leaping tall buildings in a single bound". That was the thrill of it; and it still is, now many years later, and even though now I know a little more about how all of it works.
2) Lessons learned, the early 90s
There was so much to learn, particularly for those of us with what the 60s had been pleased to call "humanities" or "social science" educations, which had offered schooling in just about every logic except the mathematical. IBM's warning, Think, quickly was found to be narrowly-arithmetical -- and their equally-famous caveat, Garbage In / Garbage Out, a perceptive diagnosis of further rigidities ahead.
* Human language, for instance
Back when the digital information world still was confined to "computers" -- those small and isolated segments of mainframe system capacities, selected and arranged mostly for their marketability -- the problem of communicating digitally with others still was minimized.
Work was online, in the 80s and into the early 90s, but output and data transmission were offline. So there still was some chance for traditional intermediation -- for translation, editing, printed document revision -- before work was seen by others.
The Internet, though, offered dis-intermediation: WYSIWYG communication, cutting out not just translators but also publishers and distributors and peer and other reviewers, and the whole panoply of traditional intermediaries, nearly all "editors" of very many types 2.
Brave new publishing world: authorial intention let loose upon an unsuspecting and ill-prepared readership... Typos were not the only result: there also was, symptomatic of many other difficulties, both formal and substantive, the problem of character sets...
To those of us users involved in purely local or national activities, in situations comparatively mono-lingual and mono-cultural, character sets were not so much of a problem. Digital information early-on was primarily English-speaking, to a primarily English-speaking audience, a matter of anglophone nations plus small elites elsewhere, highly educated in many subjects including English. So the early computer-world character sets were sufficient: ASCII served, and it served very well.
But the Internet's disintermediated and global communication changed things. Suddenly it wasn't just typos, exchanged with the "foreigners", it was full foreign languages, sent and received, languages requiring punctuation and characters not contained in English-language ASCII or its rapidly-proliferating "extensions".
Among ASCII flavors I remember were IBM ASCII and Microsoft ASCII and l'ASCII américain, and there were others. Then came the alphabet soup of international norms: ISO Latin 1 / 8859-1 and its numerous progeny and extended family. And finally came Unicode, with its 6000-and-counting languages.
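The differences were concrete, right down at the byte level. A small Python illustration (modern codec names stand in here for the period's vendor "flavors"):

```python
# How "ç" (c cedilla) fares under successive character sets:
ch = "ç"

# 7-bit ASCII simply has no code point for it at all:
try:
    ch.encode("ascii")
except UnicodeEncodeError:
    print("ASCII: cannot represent 'ç'")

# ISO 8859-1 / "Latin 1" gives it a single byte, 0xE7:
print("Latin-1:", ch.encode("latin-1"))

# Unicode's UTF-8 encoding spends two bytes on it:
print("UTF-8:  ", ch.encode("utf-8"))

# Decode those UTF-8 bytes with the wrong table, though, and you get
# mojibake -- the digital-era cousin of "diffe'rence":
print("Garbled:", ch.encode("utf-8").decode("latin-1"))
```

The last line is the whole protocol problem in miniature: sender and receiver each behaving correctly by their own table, and the "foreigner's" name arriving mangled anyway.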
Did it matter? Did it all really make a difference, whether a character set contained a "c cedilla" or simply left the cedilla out, or an "œ" was spelled "oe" in text -- or perhaps "différence" was rendered "diffe'rence", as it was stubbornly in my own texts for many years 3?
Well, yes it did, in fact, or at least it mattered to the "foreigners"... As my business background had been in international trade, I knew too well the linguistic capacities and preferences of my customers and suppliers overseas. Even the highly-educated elites among them, who knew my supposedly-lingua-franca English as well or better than I did, simply were offended at the sheer arrogance of typos and character set inadequacies turning up constantly in American business correspondence.
In Asia and Latin America and Africa and Europe they took such pains with their writing, trying always carefully, if not always successfully, at least to spell foreign names and addresses properly. So why couldn't we? Why wouldn't we? If the name was "François", or "Núñez", why wouldn't we in the anglo-american Internet world be bothered to spell it that way? The foreigners certainly struggled valiantly with American nicknames and address conventions in corresponding with us.
I asked a genius-caliber but very young Internet developer, one day back in 1989. I still remember his wad of gum, feet propped-on-desk, iconic American baseball cap perched backwards -- as he peered over at me sceptically, declaring, "But all those foreigners really do speak English, don't they." It wasn't a question.
Since then, though, the "foreigners" have become "customers" -- globalization having reversed the terms of trade somewhat -- so now the Internet knows French, imperfectly still but better than it did, and as Tom Lehrer used to warble, it is "learning Chinese" 4. But there was a time, back in the early 90s, when things-digital were limited-to-monolingual, even on the Ouebbe.
* Multilingual / Multicultural
A small step away from multilingualism, then, was full-blown multiculturalism. By the early 1990s the overseas Internet was blooming. Not only were US and UK users puzzling through the mysteries of accented text characters, French and German and Japanese users were too.
Internet nodes overseas were multiplying. Capacities of Transatlantic and other "pipes" were being exceeded periodically, and were expanding. By the 1992-3 arrival of the "killer app", the World Wide Web and its "browsers", there were solidly-constituted and sophisticated pockets of tech-savvy Internet users scattered around the globe.
What does multicultural mean, though? Is it enough, that a second and third language's characters might be accommodated in the new digital text?
Now, twenty years later, at the end of the first decade of the 2000s -- with the dollar's value falling against the euro and the yen, and the Chinese and Indian economies soaring -- and with Chinese and Indian and Gulf "sovereign funds" and giant multinational corporations all gobbling up distressed US and UK firms -- it may be hard to remember, even to imagine, the insular anglo-american views of the late 1980s.
Back then, though, there still were three geopolitical worlds, only -- Communist, Western, and "Third" -- and the latter group was said, still, to be "developing". Developing into what? Why, developing into one of the first two worlds, becoming either Communist or Western... There were no third alternatives, or fourth, or fifth and more possible worlds, as there are today.
So when we spoke of "multicultural", in the early 1980s as for a long time before, in telecommunications content as in so many other fields, we thought of something temporary: something which, eventually, would become like Us, or perhaps would mistakenly become like Them, but with no other options than that Manichean pair really possible.
There was no room in our conceptual closets, in other words, for a world permanently multicultural 5. Most of us, Communist and Western and Third, were convinced that our world was temporary and "developing", and that we or our single enemy would be its end-product. The daring among us even proclaimed "the end of history", back then 6.
But it didn't -- didn't "end" -- those were just simpler times, perhaps... or times more simply viewed... So maybe it just was far easier, in the late 1980s and even into the early 1990s, to think of a new telecommunications world, the Internet, in which only a single lingua franca would be used: a new Pax Romana united by among other things a seamless communications web, and a single language, English 7.
And then It All Changed: the Berlin Wall fell in November 1989, marking the end of the Tripartite World and of other Cold War simplifications. And there were other rumblings, not least among them some online.
* Protocol wars: ASCII(s), OSI
The Europeans began it... They had the strange idea, fought out in the pages of this journal's predecessor in fact 8 among other places, of designing cooperatively and imposing a common set of technical standards to be shared by all, globally, using the new digital telecommunications techniques.
Open Systems Interconnection, OSI, was developed in the early 1980s and then hotly debated for years, in conferences, papers, and on the new networked digital media, as a networking standardization and a supposed improvement upon the TCP/IP suite of protocols originally fashioned for the Internet.
"The Protocol Wars", these discussions came to be called 9. OSI won several battles, but TCP/IP won the war. Interestingly to outsiders, as we users were, it was a process which mirrored both "protocol wars" being fought elsewhere in the new digital industries, and broader geopolitical developments as well.
TCP/IP was, now famously, a technique developed via grant-funding from the US national government, specifically from its military establishment 10. But it drew its true strength from its very practical application, in its policy struggles against OSI and other competitors. And, unlike OSI, TCP/IP already was in place, it worked well, it had lots of users.
Particularly that last: users already had a "stake", in TCP/IP: they were trained in its approach, they already had built systems based upon and operating within its rules, they were busily attending endless international meetings devoted to its development.
Whatever the design advantages involved, then, all of those much-debated by specialists, one aspect of the OSI vs. TCP/IP competition was fascinating to users: "first-to-market" provided a great edge to TCP/IP, in preserving its role as the fundamental architecture of the Internet -- that was a valuable if painful lesson which the hard-knocks and relentlessly-commercial Internet users' world taught developers.
The closest parallel was the debate which raged for many years over the dominant architecture of the personal computer, Microsoft's "Disk Operating System / DOS". Like TCP/IP, DOS too was attacked by its critics -- commercial competitors, proponents of "open systems" -- as being poorly designed or even faulty, a proprietary system greatly enriching a single corporation, an "American" product.
But like TCP/IP, DOS also was first-to-market: by the time other competing and some supposedly-"better" systems had emerged -- elsewhere in US industry, over in Europe -- DOS had secured its commercial toehold, as the system used in the IBM PC, and had built the critical mass of real users who would ensure its success and dominance. The one thing more difficult than learning information systems is un-learning them, in order to learn others: users will go to great lengths, and pay high prices, to avoid re-training.
Even more than its commercial competition, the geopolitics of TCP/IP dominance became interesting as well. American pragmatism versus European idealism, unilateralism versus multilateralism, and ultimately capitalism itself, were a few of the debate themes. With the domineering entry of the great commercial marketing firms onto the TCP/IP-based Internet, in the 1992-3 year of Web introduction, such themes came to dominate.
The Internet's protocols were subject to the passions and fashions of broader geopolitical approaches in their development, too: international partisans took part on both sides -- there were US fans of OSI, and many Europeans made valuable contributions to the work of the Internet Engineering Task Force 11.
Nevertheless, the two fundamentally-different approaches remained: in caricature, the Europeans interested in consensus, in top-down authoritative solutions, and in muddling-through -- the US more in relentless and belligerent pragmatism, changeability, flexibility, in commercially-applicable and financially-profitable solutions. It was a healthy competition, then: one amenable to balance -- yin and yang -- what was needed was a window of opportunity in the protocol war.
* Productivity's paradox resolved: the double in-box
But one of the greatest digital lessons-learned, in the early 1990s -- perhaps the greatest, from the user's point of view -- grew from the resolution of the productivity paradox, the economics parable of the "double in-box". And for this the Internet was key -- and it may have been a key to the Internet's success, in return.
All through the 70s and 80s I had two -- in-boxes, on my desk -- in whatever office I worked, and so had all my fellow office workers. Into the first box went the day's traditional tasks: memos, letters, reports, invoices -- all printed on paper.
You worked at the desk: using pens and ink stamps and blotters, occasionally carrying a paper to the desk of another to work on it there, or down the hall to a meeting, or out to a jobsite, or even home in the briefcase to work on at night or over the weekend -- dangerous practice, that last, it's where the coffee stains and chewed edges came from. Research was done on the telephone: those old heavy black sets -- "Mazie, may I please have an outside line?" / "I'll call you when one is available" -- nostalgia...
But the piece of paper was central, the "document", the primary office tool -- still, in the 1970s and 1980s, as it had been already for several centuries -- and it arrived in its own in-box. Nothing new…
The 70s and 80s innovation, then, was to add a second in-box, initially for that same piece of paper. This was because all the numbers, so crucial to any business operation, had been put into a "computer" located down-the-hall.
So, when the desktop PC first arrived, and then got linked to others, via disks and finally cables, all that changed was that the data-entry job moved from the data-entry clerk's to my own desk: just as the ATM did away with bank tellers, the PC did in the data-entry clerks -- we the users were given that work, not instead of but in addition to the work we already had -- first you did the work, then you "entered" the data.
It meant double-work, for me and for most office workers. I still had the old in-box, but now I had the new in-box too, and both were for handling the same work, twice. Not exactly double, no... At first, the office mainframe's primitive capacities accommodated, literally, only the "numbers": statistics, inventory, finance, the sort of data traditionally handled by the bookkeeping department.
It was ironic, then, that as "the computers got smarter" the workload got worse. By the time the "personal" computer reached the office desk, it did far more than simply store and crunch numbers: customer lists and history were there, now, and strategic plans, correspondence -- its new software offered "word processing", "spreadsheets" and their "what-if scenarios", "databases", and the extra work that went with these.
And old-fashioned forms still had to be "typed". Paper documents had to be read and analyzed, prior to their contents being hand-keyed into the computer for consideration by others.
And data: digital information systems are data gluttons -- facts and numbers and documents formerly-ignored, and left sitting on the piece of paper on which they had arrived, at most stored in "pending" and more often consigned to "filing" oblivion or to that final arbiter, the wastebasket, suddenly became incredibly important -- so now all of it, every fact and number no matter how marginal its importance to the enterprise, had to be recorded, by each of us, on our "personal computer", for endless analysis by us and by others.
So there was duplication, in 1970s-80s office automation, if not exactly double then significantly so. And the double workload grew worse, as the new systems grew smarter: the old paperwork in-box did not disappear as predicted -- it stayed, and stayed full, and stared at us, as we entered our offices in the early morning, groaning at the prospect of yet another long day of doing double-duty on the same data.
The economists finally saw this... Economists finally see most things, even if, as the saying goes, "laid end-to-end they'll reach no conclusion"... In the productivity paradox, leading US thinkers in the subject such as Robert Solow and Paul Krugman defined and debated the apparent contradiction between continuing low productivity and the introduction of first office automation and then office information systems, both of which were supposed to have made US worker productivity soar 12. And at last, in the waning years of his long and distinguished tenure, Alan Greenspan -- carefully, tentatively -- saw what he thought might be the first glimmer of a sustained US productivity up-tick... cautiously, finally 13...
Any of us could have told them. Then again, though, maybe all three of them -- Solow and Krugman and Greenspan -- suffered from the same phenomenon on their own desks, we all did, and it's not always easy to "see" a thing when you are up to your gills in it.
Perhaps it simply took a long time, much longer than any of us had anticipated: IBM's System 360, the first small-business-oriented "mainframe" computer, was introduced in 1964 14 -- their DOS-powered Personal Computer, which sat on my desk as I fussed with double in-boxes, came along in 1981 15 -- so it's been twenty years... arguably forty...
But I suggest the crucial factor has been not simply time, not just figuring out how to redistribute office workloads to eliminate duplication -- plenty of office managers broke their heads over that one long before "automation" -- no, I believe the catalyst was communication among machines, specifically the Internet.
It was not until users had their own Internet-enabled computers at home, equipped with the user-friendly interfaces of early Mosaic and Netscape browsers -- 1992-3, the year of the death of acceptable use policies, of the emergence of the killer app, and of the birth of the commercial Internet -- that the idea of the users governing the systems, rather than the reverse, was able to dawn.
The overloaded work desk with its double in-boxes was not the place for abstract thinking, about the possibilities of information or anything else: time-was-money, and there was no time -- continuous crisis-management, as Bill Clinton described the White House 16 -- I remember calling down the hall, "Help!", as my most common strategy for "computer-problems", then if none arrived quickly I'd just unplug the thing and plug it in again... Origin of the "help" desk term: that unplugging / re-plugging drove the sysops nuts.
And the office LAN, when that arrived, was no source of inspiration either. There still was no time available for real thinking... The early LAN crowded a cluttered workspace with extra cables, and never worked well, necessitating the dreaded visit from the overly-knowledgeable IBM sysop geek, who would occupy an entire precious work hour with tinkering and tapping and mumbling... all of which would have been OK if only he hadn't tried explaining things, which geeks always love to do but rarely do well or rapidly... Thus was born a full generation of "I-don't-want-to-understand-the-thing-I-just-want-to-drive-it" computer-users.
But the Internet saved us, once it had its killer apps and once we had it on our desks at home, 1992-3 and after: there we were able to play with it -- "homo ludens", the way we learn best 17 -- trying out our own personal "what-if scenarios" in exploring the possibilities of the new toy.
In the early 1990s I spent endless hours, day and night, happily exploring the online catalogs then connecting-up from libraries all over the world -- each to his own... a child of mine later spent his time "blowing away bad guys" in video games he would play online with friends in Italy, Japan, Ireland, interactively and instantaneously.
Thereafter, though, when either of us got to an office offering two in-boxes and double-duty workload, we knew what to do: we knew what the system could do and how to get it to do it -- most of all we knew that via networked communications, both in our own system and outside, we could share information, transfer responsibilities, perform all the resource re-allocation tricks which distribute tasks efficiently in an organization, ultimately even eliminate one in-box entirely by doing all of our work online. Before the Net it had been a matter of "waiting until the meeting" -- now we could, and we did, do it ourselves.
So the Internet did it, for the users: resolved the competitions between TCP/IP and competitors, solved the productivity paradox with its double in-boxes, launched digital information into the general public arena -- and once the Web took off there was no turning back, it had become a nearly-uncontrollable victory, one hurtling forward and with no time now for negotiation, a TCP/IP avalanche.
3) Education, the late 90s
The professions -- doctors, lawyers, accountants -- and government's great bureaucracies, mostly were mired still in their traditional paper-chases.
But on-campus, at US and UK universities, also in Scandinavia, Japan, and increasingly Continental Europe, not only were Internet nodes appearing rapidly and then multiplying, but formal instruction in the techniques and affiliated subjects was appearing as well.
Non-technical users already were not just studying and playing with the new techniques, they actually were using them: on-campus, to research book citations at the library, to reserve classrooms and library seats and study carrels, to publicize course requirements and post grades, and to ask and answer questions...
The birth of the now-omnipresent "Frequently Asked Questions / FAQ" file took place in the late-90s: before then it had been endless repetition -- the same reply to each student, delivered many times, year after year -- "Yes your paper must be typed", "No you may not cite encyclopedias" -- now the online FAQ knew all, and began its guidance of student life.
But students, and professors who after all are former-students, have inquiring minds... So, on-campus at least, if not yet elsewhere, classroom questions began popping up about the new techniques. Were they mere add-ons to those we already possessed, some wondered? If not, were they satisfactory replacements for techniques we had before? Others went further, suggesting that something "entirely different" was being created: there was talk of "virtual communities" 18 and "global cities" 19 and "digital democracy" 20, all somehow enabled, if not entirely created, by the new networked digital media.
Optimists abounded, predicting new eras in collaborative scientific research, remote healthcare delivery via digital "flying doctors", "minority languages" 21, and climate and earthquake prediction.
Pessimists predicted darkly, too, that globalized data crime, digital "identity theft", "The Year 2000 glitch", and other maladies of the new virtual world would bring both it and the real world to a halt. Interest in such subjects, peripheral to the original central concerns of electrical engineering and information science, increased in 1990s universities. Initially an occasional "guest" topic in traditional classes, the Internet gradually became a full course section, then a full course itself, eventually an approved topic even for elite seminars and doctoral theses, and ultimately a "basic skill" requirement for all classes, as online FAQs and research techniques proliferated.
As usual, in academia, each discipline saw the new phenomenon from its own particular perspective. Business schools excitedly investigated online advertising and viral marketing, and new tools for leveraging global finance. Education schools took up online distance learning as a serious cause. Cognitive scientists asked whether the systems are patterned after the brain, or whether the brain -- or our understanding of it, anyway -- is patterned after the systems. And philosophers asked where the "mind" is, either way. Mathematicians became fascinated simply with the Internet's growth, as it appeared to be growing faster and in more interesting ways than most human-created phenomena on record 22.
And that was just the universities, the domain of ".edu". These ancient institutions -- medieval origins, elderly models -- "Being president of a university is no way to make a living," Yale's Bart Giamatti wrote 23, "It is to hold a mid-19th century ecclesiastical position on top of a late 20th century corporation" -- have been going through their own painful structural changes recently, at the same time trying to adapt to the information revolution presented by the Internet.
By 1960 the older ivory tower education model -- independent but culture-bound, reflection of an imagined pre-war European cultural consensus 24 -- was under great strain, financial but not only, and in some places it was already dead. Late death throes in 1968 Paris followed earlier attempts to grasp the problems more directly.
During the 1960s and 1970s, then, national government replaced the old liberal arts consensus, in funding universities 25. In the 1980s and 1990s, large corporations took over government's education role, at least in the US 26.
And by the early 2000s these corporations all had become globalized -- with murmurings of transnational global government on the horizon, although as yet only murmurings. The many smaller changes wrought upon higher education, amid all this fundamental change in such a short time, have been enormous. And disruptive: from "Berkeley '64" and "Paris '68", to calls in 2007 for restructuring the entire approach from bottom to top once again 27, formal higher education has been busy, trying to adjust.
* Online Communities -- The WELL
The Internet became a tool, in those adjustments: a new device for aiding both education's mission, and its operations. In the meantime, though, other institutional candidates appeared: competition… Formal institutions such as "universities" were not the only locations for late-1990s education about the Internet itself. And the Internet even began to promote alternatives to the universities: "education" would not wait.
Online communities arose early: The WELL among the earliest, in 1986 as the Whole Earth 'Lectronic Link -- then Apple's eWorld, and many others. These were intended to serve purposes -- of information gathering and storage and retrieval, dissemination, discussion, debate, validation and accreditation -- similar to the offerings of traditional academies.
But in the Internet's earliest years, user-education about the Internet itself was internal: the only place to find out about it was on it, in it, online, where it was being invented. By the early-1990s, the originally-chaotic education processes online had been structured, by a variety of services.
And a few such services even developed into "online plus", offering elaborate and extensive online education resources -- websites, downloadable materials, links -- but also email exchanges, real-time discussion groups, chat sessions, even physical classes.
"Continuing education classes", such as those offered by university extensions, went in this "online plus" direction -- as did the University of Phoenix, the AllLearn project of Oxford and Yale and Stanford, the MIT OpenCourseWare project, and the more recent Open Yale Courses project -- and Wikipedia was to come later... 29
The best general characterization, of the online education trends of the late 90s and of what those led to, was offered by Mitchell Kapor in 2005: "Massively Distributed Collaboration," he called it, "a radically new modality of content creation" 30 -- but the trend Kapor identified was even more broad, in the 1990s, than just the very new ecommunities and econferences and the rest of the online "education and community" effort.
4) Commerce, the 21st c
All this took a while to sink in: commercial firms which "got it" took off into the financial stratosphere, in the 2000s, and firms which didn't, folded. The origins lay in the "massively distributed collaboration" education experiments of the late 1990s.
But by the early 2000s the productivity paradox was over: office workers finally knew how to combine the double "in-box" burdens of previous digital decades. To some extent this was the result of improvements in the technologies. "Computers" had become faster, and cheaper, and more talented. Also, per Moore's Law 31, they offered far greater storage and operating capacities. The systems were far more user-friendly, than they had been before.
And the Internet only really became available in 1992-3: that was a magical transition process -- one poorly-remembered, in accounts of it I have seen. I distinctly remember, though: one moment there we were, fiddling with an awkward beast, one saddled with command-line data entry, still, and screens glowing green, disks 5.25 inch and truly "floppy", and onerous NSFNet usage restrictions converting to paranoia all who signed their documentation, as all users back then had to do.
A year or so later, though, somehow, magically, all that had slipped away. Suddenly the beast had GUIs, and bright colors and fancy graphics, and Ted Nelson's 32 -- and Apple Computer's -- hypertext links, and Tim Berners-Lee's World Wide Web, and NCSA's Marc Andreessen's Mosaic, and nearly all the other elements of William Gibson's Matrix, including even the weird clothes and hairdos.
As a user, one briefly wondered where all the restrictions "went". Doubtless there was some sort of formal signoff, somewhere -- some surrender of NSF swords -- but the event was not well publicized, and general public users were not being invited to Internet parties, yet.
Most of us just went to renew email accounts, one day, and discovered that the restrictions simply had vanished. I suppose had we asked, plausible explanations might have been forthcoming -- but largely incomprehensible -- and, besides, we were busy...
* Acceptable uses
So the previously-forbidden blossomed, in the late 1990s. NSFNet "acceptable use" policies specifically had prohibited commercial uses and general public access -- and pre-W3 there was very little in graphical imagery available on an Internet-enabled screen, only the funny little alphanumerics which created "smileys" and "videotex" images on French Minitels. In the 1990s, then, Internet images bloomed, the general public began crowding in, and bonanza-sensitive commercial usage exploded.
* Scaling -- Movin' on up, Goin' global
And then it all grew, exponentially. From the first little lists of Internet-enabled users 33, by the late 1990s it was nearly the case that if you weren't online you weren't in operation. From university applications to library catalogs to thousands of other uses, by 1999 any service not online in some form already was "missing its market".
The Network Wizards' Internet Domain Survey paints one of the simplest and most graphically-appealing pictures of the Internet's phenomenal growth that I have ever found: a few million hosts in 1994 -- over 100 million by 2001 -- some sort of interesting "down-blip" during 2006-7, but with the totals by then in the 450 million range -- now in 2008 approaching 600 million 34.
* Search & Retrieval revolutions
The quintessential late-1990s Internet term became "overload" -- not in the physical-systems sense: outages due to lack of capacity were rare, and Moore's Law plus the commercial markets ensured that memory and "fiber" were plentiful, so much so that dark fiber even became a problem 35, and systems often were overbuilt and had to wait for usage to catch up.
But information overload became the 1990s watchword among the users, because whatever the system capacities were, for us the users there simply was too much data "out there". "Sipping from a fire hose", was one famous metaphor.
The response of the information sciences was vast improvements in search-and-retrieval: new techniques and criteria, by which "relevance" might be determined -- of term-retrieved to term-sought -- so that the mountains of data being retrieved might be winnowed down, or at least ranked by their degree of relevance.
Keywords, for example, were assigned to Internet sites and other digital data, in embedded metadata or by knowbots and spiders combing the sites -- when matched with keywords gleaned from user queries, retrieved sites then could be ranked by degree of match 36.
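The keyword-matching step just described can be sketched in a few lines (a hypothetical toy, not any real engine's method): score each retrieved site by the fraction of the query's terms found among its assigned keywords, then rank by that score.

```python
# Toy keyword-relevance ranking: sites are ranked by the fraction of
# query terms that match their assigned keywords. Illustrative only --
# real 1990s engines used far richer term weighting (e.g. tf-idf).

def relevance(site_keywords, query_terms):
    """Degree of match: shared terms divided by query terms."""
    site = {k.lower() for k in site_keywords}
    query = {q.lower() for q in query_terms}
    return len(site & query) / len(query) if query else 0.0

# Hypothetical sites with assigned keywords, as in the metadata example
sites = {
    "site-a": ["libraries", "catalogs", "internet"],
    "site-b": ["internet", "history", "growth"],
    "site-c": ["recipes", "cooking"],
}

query = ["internet", "libraries"]
ranked = sorted(sites, key=lambda s: relevance(sites[s], query), reverse=True)
print(ranked)  # site-a matches both terms, site-b one, site-c none
```

Even this crude "degree of match" is enough to push the mountains of retrieved data into a ranked order, which was the 1990s breakthrough.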
Popularity, too, was folded into the search strategies, as researchers figured out how to trace links and logins, to establish whether one site enjoyed more usage than the others in particular categories and searches: combining the approaches in their famous algorithm, two such dataminers, Sergey Brin and Larry Page, founded their Google firm in 1998.
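Brin and Page's insight, folding link-popularity into the ranking, can be suggested by a minimal power-iteration sketch in the spirit of PageRank (a simplified illustration with a hypothetical three-page web, not the algorithm as Google deploys it): a page's score depends on how many pages link to it, weighted by those linking pages' own scores.

```python
# Minimal power-iteration sketch of link-popularity ranking, in the
# spirit of PageRank. Simplified: real implementations must handle
# dangling nodes, sparse matrices, and web scale very differently.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:                             # share rank among out-links
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# Hypothetical three-page web
links = {
    "a": ["b", "c"],   # a links to b and c
    "b": ["c"],        # b links to c
    "c": ["a"],        # c links back to a
}
rank = pagerank(links)
print(max(rank, key=rank.get))  # "c" collects links from both a and b
```

The scores sum to 1 and converge after a few dozen iterations; the page with the most (and best-endorsed) in-links floats to the top, which is the "popularity" half of the combined strategy the text describes.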
And the Web has not stood still since: led by one of its pioneers, Tim Berners-Lee, now in the late 2000s it is headed off in new directions, in the ongoing effort to winnow down or even just keep up with "information overload" -- the context of the terms, the semantics in which they occur, now is to be folded in as well 37.
Content analysis, though, is an old problem, and "relevance" even older: counting word-term occurrences in political speeches, and drawing policy conclusions from such statistics, is an idea which first came to the social scientist Harold Lasswell in the 1920s 38, and it was an idea which had rough edges around it even then -- so Dr. Goebbels said "democracy" twelve times in his speech today while his total usually is only ten, so what?... And since at least the Greeks, epistemology has puzzled over "relevance"...
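Lasswell's counting of word-term occurrences is trivial to reproduce today (a toy sketch with invented speech text; as the paragraph notes, the counting is the easy part, the policy inference the hard one):

```python
# Content analysis in Lasswell's simplest sense: count how often each
# term occurs in a speech. The counting is mechanical; drawing
# conclusions from the counts, as noted above, is not.
import re
from collections import Counter

def term_counts(text):
    """Lowercase the text and tally every word-term occurrence."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

# Hypothetical speech text, for illustration only
speech = ("Democracy demands vigilance. Democracy rewards patience. "
          "And democracy, friends, belongs to you.")
counts = term_counts(speech)
print(counts["democracy"])  # 3
```

Whether "democracy" twelve times rather than the usual ten means anything at all is, of course, exactly the rough edge Lasswell's critics pointed out.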
So while datamining on specific key terms certainly helps, in information search and retrieval, and popularity and semantics buy us time against information overload, these techniques have diverted but not fully stemmed the flood.
Digitization both new and retrospective goes on; Internet growth may have slowed -- some say 39 -- but still it is expanding; "information overload" is a problem for every user who receives "about 15,100,000 results" and wonders why a particular 10 happened to float to the top 40.
But if the technology's innovations have helped, so have the New Users...
By the early 2000s, the office workers entering the marketplace had been born in the 1980s, the very period when much of this technology was being developed but was still struggling for acceptance among workers in possession of 1960s educations.
The trouble with a 1960s education, in the 1980s, was that it equipped you to be excited by the new technologies but taught you nothing about them, or about how to "do" them, really -- they were "new" -- so learning was slow and skills acquisition arduous: you were adding something new on top of, and very much in addition to, all of that "stuff you learned in high school", exactly parallel to those two in-boxes on your office desk.
Picture Bill Clinton, born 1946, poking -- enthusiastically and open-mindedly, but still poking -- at a computer keyboard, under the patient tuition of his daughter Chelsea, born 1980, who is trying to show him YouTube, and you will see the distinction. Chelsea and her generation, entering the workplace in the 2000s, took computers and the Internet "for granted".
As XeroxPARC warned, now long ago 41, invisibility is the key to success for any technology: only once it has "faded back into the woodwork" -- become ubiquitous, and a necessary part of everyday life -- can it be said to be truly established and successful.
By the 2000s then, but not until, the Internet and digital techniques generally were taken for granted by the younger generation then entering the workforce: by Chelsea, nimbly texting her friends on her cellphone, although not by her father, still and perhaps forever a member of the thumb-impaired Baby Boomer generation, with its double in-boxes.
So nowadays, in the 2000s, workers and firms and other organizations successfully have incorporated digital technique, including the Internet -- so much so that workers now all take it for granted -- at last, after about a generation of trying.
And nowadays, in the sheer "$ produced per hours worked" productivity terms of economics, Chelsea and her digital and Internet-savvy generation can accomplish far more, in their workweek, than her dad and his ever could have back in their 1970s and 80s days. Even if it's only because they're all doing more of the work, than we did back then -- in what Time now calls "The End of Customer Service" 42, where we-the-Users now do the work, at the self-service ATM, telephone, gas pump, hotel or airline or other service. At least from the general economy's point of view it's all good news.
5) Communication: "The Revolution Will Be Handheld" -- access and other myths
By the 2000s, the Internet had all three. Internet usage statistics by the 2000s had soared. Entire industries -- from catalog sales to general retail to inter-modal transport -- had been revolutionized, firms which "didn't get" the Internet having been supplanted by firms which "did".
Other industries and activities were under great stress, facing replacement in toto or even collapse, as "weak links" in chains of delivery between suppliers and consumers appeared, and were aggravated, the non-Internet portions having become drags on the rest.
In the movie and journalism and publishing industries, for instance, such imbalances appeared early: the cherished studio system came under threat, unwilling to deliver its old filmed content to the new digital media; traditional journalism became a bottleneck for "news", which burst out into blogs and 500-channel access and ejournalism online; print publishing, too, was printing more, but for a rapidly shrinking share of a parent corporation's total readership, which was shifting online fast, as the shifting ad revenues and balance-sheet wars and merger frenzies showed.
Health and higher education had become major bottlenecks too: in each case beset on both ends by suppliers and consumers now highly-digitized and fully-networked.
In health, giant pharmaceutical firms and insurance companies and other suppliers overtaxed the abilities of labor-intensive hospitals even to understand the offers that poured in -- while at the other end of the delivery chain consumers raced to Internet-based resources offering them vast arrays of choices, plus epidemics of auto-diagnosis unmediated now by trained health professionals.
Higher education institutions were hit even harder: increasingly, in the 2000s, colleges and universities faced student consumers far more experienced and sophisticated at new information-gathering techniques, thanks to the Internet, than their own instructors. The schools also dealt awkwardly with the flood of new information sources themselves, libraries struggling to keep up with new database offerings, faculty fighting to preserve old standards and outdated "publish or perish" procedures, certain professors woodenly forbidding the use of blogs or Wikipedia only to find their own graduate student instructors using them.
The youngest students meanwhile were gathered down at the library, not reading books but "texting" and "skype-ing" their friends in Shanghai and Paris and Berkeley, simultaneously, in chat sessions.
By the 2000s the digital information overload "flood" had been physically contained somewhat, in huge pipes and sophisticated channels constructed by many institutions. But elderly labor-intensive arenas, bottlenecks, saw their costs soaring and policy inequities rising and were threatening to capsize entirely 44. Expensive "computers" were becoming freshman class necessities, on-campus, although cries elsewhere of "What about the information poor?" increasingly were being swamped by floods of cheap cellphones.
The 2000s effects of the Internet, however, were not entirely institutional, not just a matter of upgrading 20th century organizations -- and 19th century, and some 18th, and even a few 15th...
More fundamental, and even more widespread, was the impact upon personal communication. The Internet's history so far can be viewed as a story of progressive democratization, at least in terms of its use: from its 1980s "closely-supervised and US-government-restricted" era -- involving a tiny group of highly-specialized scientists, certainly by comparison to what came later -- to its 1990s broad dissemination among the scientific community and initial public use "explosion", to its global usage status today.
The key to that global usage is the individual user: nowadays everybody feels that they can use the Internet, and a great many of us do. Qua consumer, then, that "everybody" notion created the global commercial market now delighting both economists and the great majority of large firms which "get" the Internet's commercial significance.
More than just as individual consumers, though, 2000s Internet users represent a global mass phenomenon with broader political, social, and cultural characteristics. Both as individuals and in groups, great masses of people -- none necessarily professional or specialized, in their training and talents and experience, although many are -- are better able now to communicate, one-to-one and one-to-many and many-to-one and many-to-many, than they ever have been before, thanks to their new inter-networked digital media. And they can do research, on their own and together with others...
Wikipedia, then, offers only one notable example: nowadays any reputable science laboratory, too, is linked via the Internet to others in its field around the globe, sharing results, building team efforts, pursuing collaborative research. And Amazon and Google and now many other commercial firms "connect", in very similar ways, with their own users too.
One result has been a new level for the already-burgeoning information flood of the 1990s. As sales-and-marketing but also political campaigns around the globe, and art and religious movements, and social organizing, "distance" education, music composition and other trends, all can attest, networked individuals operating on their own -- not necessarily through existing institutions -- are powerful forces, now 45.
Not yet entirely legal... Still, now in the late 2000s, the laws have not caught up to where the politics and general society and leading edges of culture have been for some time: where pop culture and social relations and political campaigns all have "gone digital", already and increasingly globally, legal systems still struggle with outmoded conceptual tools such as "antitrust" (19th c.), "long-arm" jurisdiction (from 16th c. notions of "nation-states"), and "copyright" (derived from 15th c. ideas of royal privilege).
But then it is the job of some sectors of society to be conservative... Much of modern Internet development results from the forbearance of legal systems, perhaps: wise courts and legislators know when not to take a case on, as well as when to take one.
There has been a defining mythology, in the 2000s, just as there was at the Internet's now-fabled beginnings of two decades ago. The most recent "urban myths" have been the results of older attempts at prediction.
* Users need myths...
|These are the days of miracle and wonder|
|-- Paul Simon|
Not myth in the sense of something we know to be untrue, more myth in the sense of something that inspires us -- what used to be called "urban myth", something urbane, worldly-wise, to which we react not sceptically but rather with awe and amazement, as it helps crystallize our thinking about the complexities of modern life -- not that we know it to be true, then, or false, only that it provides us with clarity.
So, we currently have, regarding the Internet anyway, several ideas emerging, in the 2000s: not substitutes for the certainties which engineers and other scientists can provide -- in a complex and fast-changing world these so often disappoint, anyway, or confuse rather than clarify -- but simply shafts of light which give a busy and distracted user some insight into surrounding complexities.
* Internet myths: Invisibility
One of the most interesting Internet myths has been invisibility. XeroxPARC some time ago identified ubiquity and several other characteristics of a technology's success 46. Invisibility was one of those: once technology had become so common, so mundane, as to have faded back into the woodwork...
By mid-decade, digital technology certainly did seem to be everywhere -- chips in telephony and automobiles and on road signs and in implants in pets, and the global Internet seemed to have become truly a seamless web -- and at least the younger generation seemed to be taking it all very much for granted.
At the same time, though, there were counter-trends. The ubiquitous handheld, for instance, was becoming more visible rather than less. Entire cities, in Scandinavia and Asia and elsewhere, seemed deluged with the little things. Rice paddy farmers in Cambodia had them -- "dumped" there, in the commercial market-saturation efforts of large Japanese and US and European manufacturers -- villages in Bangladesh had them 47, so did teenagers in Urumchi 48.
Yet rather than fade into the woodwork and become invisible, as electrical devices of previous technology revolutions had, the little handhelds -- cell phones, personal digital assistants / PDAs, mobiles -- were becoming centerpieces, more visible, more popular. In conversation, entertainment, advertising, finance, photography, general communication, the already-ubiquitous handheld was, impossibly it seemed, becoming even more exciting.
One development underestimated by XeroxPARC, perhaps -- and by most previous Internet prognostication -- was the effect of marketing, and commoditization, upon digital information. As hardware and software and systems, merely, yes the Internet might have gone the way of previous underlying industrial structures, and become infrastructure support -- ubiquitous but invisible, and taken for granted -- like road networks, air travel, electrical grids, telephony, sewer systems. Marketing, though, took up the Internet in a very big way, in 1992-3, with the arrival online of a killer app which customers -- everyone's customers -- could and would and even were excited to use.
Commoditization is the standard marketing strategy for packaging a good or service for sale to customers: make it and everything about it into saleable "things" -- and in the late 1990s this was done quickly and efficiently with the Internet.
The byword was "monetize": "How will you monetize that?", venture capitalists demanded, poking into every aspect of a startup's operation -- or boards demanded, of every corporate information plan. So, online, every webpage, image, header, link -- offline each business unit, office, function, "computer", "modem", any mention of the corporate name...
The result is that now, twenty years later, inter-operability is better but as an "ideal" still an unachieved dream: platforms still make a difference -- my iPhone still struggles with French accents, as my PC did back in the days of supposed Wintel conspiracies to dominate techniques -- competition is alive and well, and so the Internet definitely has not faded back into the woodwork to be taken for granted, at least not yet.
* Internet myths: Decentralization
Decentralization myths, too, have prospered throughout the Internet's twenty-year reign. The earliest 1980s predictions, of founders and early enthusiasts, were all about telework and telecommuting, about distance education and communication generally, tied to neither local time nor local space. In 1992-3 I tested these notions personally, finding that indeed our services-oriented economies no longer cared so much, in many respects, whether a family was physically located on a hilltop in San Francisco, or on one in Lyon.
Arranging luncheons was difficult, after having been "décentralisé"... But much else, including business correspondence and banking and investing and research, and increasingly consumer functions such as buying goods and services and paying bills -- already was location-independent through being largely online. The world also seemed to care less and less about "local time", so much of it already having joined the new globalized work-world of "24/7".
But there were unanticipated wrinkles in the myth, we discovered... The "distance" part certainly was true: globalized and networked communication developments of the 1990s indeed made us more time-and-location-independent in the 2000s. But it may have enabled us to centralize, rather than de-centralize as many originally had predicted the Internet might.
An enormous and growing "global cities" literature describes the increased concentration, over the past two decades, of populations into very large urban centers -- connected more to each other, by communications and transportation networks and "global city culture", than they are to their local nation-states' rural areas.
Places such as "Greater San Jose" in California, with 7+ million people now, in formerly-independent cities scattered over a now-urbanized area 150 miles by 200 miles in extent -- containing "San Francisco" now as merely a small wealthy enclave -- were difficult to imagine, or admit, until we had networked digital GIS such as GoogleEarth to help us. New studies confirm the trend: Paris Région (12+m), BosWash (55m), Greater London (7m), the Randstad (10m), Tokyo Region (43m), Greater Shanghai (well over 20m already and growing fast), and others 49.
Sociologists such as Saskia Sassen and anthropologists such as Laura Nader ask why, with location increasingly "independent" now, we necessarily would opt for one location over another -- why stay stuck in a traditional "city" if now one may flee to its suburbs, or further to a mountaintop eyrie in Telluride, or to Davos, the Dordogne, Tasmania, taking education and work and entertainment along via the Internet? Or the reverse -- why stay stuck in the suburbs, or in the small towns, if the more lively city center beckons?
Nader offers an explanation, a choice being made: "face-to-faceless" is her anthropologist's characterization of the interactions users generally have with technology's information systems, and users infinitely prefer "face-to-face" with other users, she suggests -- not "online/virtual" but "human/real", as found most conveniently in a big city park or pub or across a table. Hence the development of mega-cities, perhaps, now that the location-independent workforces of service-industry economies no longer need to be "on the farm" or in "industrial towns" -- so they flock to the cities, where they can do their telework from home.
To which sociologist Sassen's ideas add that certain cities now are better-equipped for such location-independent populations and their service-industry work, because they are better-connected via telecommunications with one another. These are her Global Cities 50 -- a concept different from the earlier Megalopolis and World Cities ideas of Jean Gottmann 51 and Peter Hall 52 and other geographers, who focused upon urban form. Sassen sees cities' function -- the inter-connectedness of their telecommunications, their travel routes, their financial structures and activities, their networking 53 -- Sassen sees the Internet.
* Internet myths: Massively Distributed Collaboration
Most powerful, among the 21st century Internet urban myths to have emerged so far, has been not what individuals now are able to accomplish, using the new tools -- nor what institutions, commercial or academic or governmental or other, now can do with them -- but what groups of individuals now can do. Including some very large groups... scattered across some very wide geographic areas, including the entire globe... and 24/7, like so much else on the Internet...
The first clues came from networking 1980s video games: MUDs and MOOs and Dungeons & Dragons showed users the tools -- just as Internet Relay Chat (IRC) in 1988 showed them the possibilities of "texting", 'way back then 54. The first really dramatic demonstration of such bottom-up mass use, though, came in the Philippines, where the 1986 PeoplePower movement had brought Cory Aquino to the presidency: in 2001 its "People Power II" sequel, organized largely by text messages on fliphones, brought down President Joseph Estrada's national government 55.
Then came the US political campaigns -- Dean's of 2004, and Obama's of 2008 -- both organized to make maximum use of Internet technologies. In both, too, online fund-raising was the crucial contribution: the use of the nets to raise enormous amounts of money greatly impressed political pundits -- particularly Obama's campaign, which has been able to reach small-amount contributors multiple times, and so prolong its effort -- it is easier, and more cost-effective, to re-solicit $50 per person from a million people than $50,000 from one.
The business community noticed all this too. The ability to reach mass-market customers has been a key to marketing since Louis Hachette, W.H. Smith, and Henry Ford... Amazon's retail empire, revenues now $15b and growing fast, is based fundamentally on the idea that the Internet can reach everyone 56. The firm embodies "economies of scale" sufficient for any economist's dream. And Amazon built in "feedback", to please the systems analysts and computer scientists: self-correcting marketing -- using customer input and buying patterns to improve customer information retrievals, even to inspire "push" marketing. These are old and cherished salesmen's dreams: firms like Amazon have revolutionized commercial retail and wholesale.
Mitchell Kapor says it best: "massively distributed collaboration", he calls it 57. Whether for political organizing, scientific research, or commercial empire-building, "massively distributed collaboration" has broken older molds and set new standards, and is a fundamental Internet urban myth of the digital 2000s.
* Internet myths: "The revolution will be handheld" 58 -- access and other myths
The most characteristic 21st c. Internet urban myth, though, is "access". Access, more than invisibility or decentralization or even massively distributed collaboration, is a trend which not only has lasted but has increased, wildly, scaling-up with the Internet's growth faster than the technologies have.
"Access" is the expectation, by us users, that we will have the Internet, be able to use it, and be able to use it to do very many things -- and, as-defined, nowadays there seem to be no limits upon Internet "access". For users, anyway... Now that the Internet is on our cell phones, and Nokia et al. are dumping those, for pennies apiece, into every mountain village and rice paddy on the planet 59, the expectation among all people everywhere is that they will be doing great things with the Internet themselves, very soon.
Dreams fostered by Grameen Bank's famous example, of local self-sufficiency and development, fuel the excitement 60. They may founder still, upon all sorts of political and economic and social and environmental realities. But if such dreams only come true by half, "information overload" until now will seem to have been a trickle.
So what can Internet developers do, to prepare for yet more explosions in user expectations, and much more information overload? Well, prepare for user disappointments, of course... Few forces in the universe have the energy of the human imagination -- not even Internet growth: so, inevitably, there will be some over-selling, and some folks will be disappointed when their cell phone wi-fi DSL connection goes down in Benin.
But human capacities for "hope" and for "change" have been getting much praise, recently: hope for the Internet too, then -- also there is a track record of success, by now, and most people can be very patient with success. So, "keep up the good work", developers: the Internet already is a proven tool for much good in the world, and even the most ignorant user understands that it can do even better, and will, if we let it.
6) Of Feedback Loops & Falcons -- conclusion
|Turning and turning,|
|In the widening gyre,|
|The falcon cannot hear the falconer|
|-- Wm. Butler Yeats, 1919|
So all of this has been some, at least, of what the Internet's users think of the Internet, now 20 years later: many truly-useful functions, some improvements on the old plus a few truly new -- but also heavy components, still, of myths, paradox, mistakes, and magic...
Much as we did before, and in spite of many intervening innovations and much education, we users still think of the mythologies: the for-us-concrete results -- what can we do with this thing -- what can we buy or sell or study or play with, using it -- as-advertised, or fancied as urban myths, universal-this or global-that or world peace or just finding friends, whether we really can achieve them "online" or not.
And, as we also have done before, we users still love to "play"... Johan Huizinga studied all of society through its play: "homo ludens" he called us 61. And some of any new technique forever will seem foreign, will be foreign, never user-friendly to all of us -- not because of technical flaws but because we are human, and humans come in varieties. Wetware problems...
The Internet has brought us so many new things. Now we have cloud computing 62, and collaborative biochip development online 63, and microchip tracking of hopefully only our pets 64, and 24/7 globalized and networked research leading us toward "synthetic genomics" 65.
But the Internet's eager advocates need to remember that their entire technical world, with all its potentials and excitements, still occupies only a very small part of a normal user's life and concerns.
As politicians discover about "world events", every time they run for office, it's more the pocketbook issues which interest users -- income, outgo, taxes -- also family health and safety and walks in the park.
In every user's 24-hour day, at least 8 such hours must be reserved for sleep -- a more rare commodity in hitech -- also for meals-with-the-family, relaxed or hectic, say 3 hours more, plus errands, downtime, diaper-changing, picnics, playtime, quality-time, love, life, vacations...
For users it is not simply a matter of reducing the journey-to-work through technology, although that is welcome. It is also, though, a matter of priorities: users faced with an exciting new Internet feature may simply prefer spending the time with the children -- that is the typical user.
Much of the technology, too, has precursors -- as Yeats' image points out -- and this makes users suspicious. Before we had "cloud computing", for example, we also had "parallel processing", and before that "distributed processing" -- and each of those, in its time, was advertised as exciting and revolutionary and new.
But there is only so much excitement and revolution and novelty that an essentially-uninterested user can take, if one's personal priorities are diaper-changing and family picnics.
Again the politician's example illustrates: those folks learn to ration their "revolution" carefully -- call the people "to arms" too many times, and one of these times they may not show up. Just so with technology... So the precursors, real or imagined, tend to moderate excesses. It is the "old wine in new bottles" problem: among users -- not in love with the technology per se, remember, and not particularly knowledgeable about it, un-interested -- there always will be some who shrug that it's "the same old wine".
Whatever the reality... Yeats' genius was to point all this out with a single image: that change always involves both the new and the old -- the falcon's spiraling gyre, even as she rises, doubles back on itself. At least that is how users view, and use, the Internet.
We greatly admire its innovations, and we like the new capacities and ease it has brought us, and we love to play with it. But we see it in terms of ourselves, as we do all things: of our daily lives and diaper-changing and picnics -- of higher priorities we have treasured for a long time and will continue to treasure.
It's an old idea, really, it's just that once we thought progress was linear, and it isn't -- it's progressive but cyclical, a widening gyre. And some of all of this simply is, still, magic: is and ought to be, at least to some of us, because the world needs a little magic.
But now to get back to my Mongolian online digital libraries, and to watching word-wrap...
(Wikidisclaimer: I consider Wikipedia a wonderfully-useful research starting-point, on most topics, although a starting-point only. Some non-Wikipedia resources are listed here, also some Wikipedia articles I consider particularly good, or at least interesting... that flame about "OSI", for instance... One cannot do badly, I believe, in beginning "further research" on any topic mentioned here by looking it up in Wikipedia: just don't end the research there -- any more than it might be ended by any encyclopedia's article.)
2^ Temptations, financial and other, of eliminating publishing's intermediaries -- "Editors are failed writers, although so are most writers", T.S. Eliot -- fascinated Internet users, at least until the 2000s' blogging craze revealed just how chaotic unedited text can be.
5^ Or "permanently" anything, as John Mellencamp put it in 1987, remembering, "We were young, and we were improving..." -- although many 1960s book titles already were bemoaning the decline of The Idea of Progress.
7^ Jack Kessler. "Accès multilangue et langue universelle", in Bulletin des bibliothèques de France (2007) tome 52, numéro 3, pages 5-15, ISSN 0006-2006, http://bbf.enssib.fr -- online in English as, "Multilingual Access and Universal Language", http://www.fyifrance.com/f102007a.htm -- considering David Crystal's recent work.
8^ e.g. inter much alia, Peter H. Salus, "Protocol Wars: Is OSI Finally Dead?", in Connexions: the interoperability report (August, 1995) volume 9, number 8, page 16, ISSN
9^ Lorcan Dempsey. Libraries, networks and OSI : a review, with a report on North American developments (Bath : UK Office for Library Networking, c1991). See also the current Wikipedia article, for a sample of the emotive language and passions often involved in the OSI vs. TCP/IP debates: the editors there even warn the current article's authors that, "The tone or style of this article or section may not be appropriate for Wikipedia" -- http://en.wikipedia.org/wiki/Open_Systems_Interconnection
10^ http://en.wikipedia.org/wiki/Internet http://en.wikipedia.org/wiki/History_of_the_Internet See also, Nerds 2.0.1 : a brief history of the Internet 1, Networking the nerds. (Burbank, CA : PBS Video : Warner Home Video, [distributor], 1998) a production of Oregon Public Broadcasting.
13^ Benjamin Friedman observes, of Greenspan's early 2000's Fed policies, "[Greenspan] had, 'become persuaded that we were on the verge of a historic shift... As the world absorbed information technology and learned to put it to work, we had entered what would prove to be a protracted period of lower inflation, lower interest rates, increased productivity, and full employment...'", in The New York Review of Books (March 20, 2008) page 26, http://www.nybooks.com/articles/21153
14^ "System/360 : The following is the text of an IBM Data Processing Division press release, distributed on April 7, 1964..." http://www-03.ibm.com/ibm/history/exhibits/mainframe/mainframe_PR360.html
20^ Various terms describe the phenomenon, including e-democracy, Internet democracy, etc. See for a general overview of the Dean and Obama campaigns in this respect, comparing the two: Ari Berman. "The Dean Legacy", in The Nation (March 17, 2008) http://www.thenation.com/doc/20080317/berman
22^ i.e. Andrew Odlyzko. "Internet growth: Myth and reality, use and abuse" AT&T Labs Research http://www.research.att.com/~amo, http://www.dtc.umn.edu/~odlyzko/doc/internet.growth.myth.pdf; also, Walter Willinger, Ramesh Govindan, Sugih Jamin, Vern Paxson, Scott Shenker. "Scaling phenomena in the Internet: Critically examining criticality", in Proceedings of the National Academy of Sciences (February 19, 2002) vol. 99, supplement 1, pages 2573-2580, http://www.pnas.org/cgi/content/full/99/suppl_1/2573; also, G. Caldarelli, R. Marchetti and L. Pietronero. "The fractal properties of Internet", in Europhysics Letters (2000) volume 52, pages 386-391, doi: 10.1209/epl/i2000-00450-8, "In this paper we show that the Internet web, from a user's perspective, manifests robust scaling properties of the type P(n) ∼ n^-τ, where n is the size of the basin connected to a given point, P represents the density of probability of finding n points downhill and τ = 1.9 ± 0.1 is a characteristic universal exponent. This scale-free structure is a result of the spontaneous growth of the web, but is not necessarily the optimal one for efficient transport. We introduce an appropriate figure of merit and suggest that a planning of few big links, acting as information highways, may noticeably increase the efficiency of the net without affecting its robustness"...
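[A reader curious about the P(n) ∼ n^-τ claim quoted above can see how such a scaling exponent is recovered from data. The following is a minimal, hypothetical sketch -- not the cited authors' method, and using synthetic data rather than real Internet measurements: it draws samples from a power law with τ = 1.9, histograms them with logarithmic bins, and re-estimates τ by least squares on log-log axes.]

```python
import math
import random

# Hypothetical illustration (not from the cited paper): draw basin sizes n
# from a power-law density P(n) ~ n^(-tau) with tau = 1.9, then recover the
# exponent from the empirical histogram on log-log axes.
random.seed(42)
tau = 1.9

# Inverse-transform sampling for a continuous power law on [1, infinity):
# if u ~ Uniform(0,1), then n = (1 - u)^(-1/(tau - 1)) has density ~ n^(-tau).
samples = [(1.0 - random.random()) ** (-1.0 / (tau - 1.0)) for _ in range(200_000)]

# Histogram with logarithmic (doubling) bins -- needed for heavy-tailed data.
bins = [2.0 ** k for k in range(0, 12)]
counts = [0] * (len(bins) - 1)
for n in samples:
    for i in range(len(bins) - 1):
        if bins[i] <= n < bins[i + 1]:
            counts[i] += 1
            break

# Convert counts to densities and fit log(density) = a - tau_hat * log(n).
xs, ys = [], []
for i, c in enumerate(counts):
    if c > 0:
        width = bins[i + 1] - bins[i]
        center = math.sqrt(bins[i] * bins[i + 1])  # geometric bin center
        xs.append(math.log(center))
        ys.append(math.log(c / width))

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
tau_hat = -slope
print(f"estimated exponent: {tau_hat:.2f}")  # should land near the true tau = 1.9
```

[The fitted slope comes out close to 1.9, which is the sense in which the quoted "τ = 1.9 ± 0.1" is a measurable, universal exponent.]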
http://www.well.com/aboutwell.html ; see also Katie Hafner. The Well : a story of love, death, and real life in the seminal online community (New York : Carroll & Graf, 2001).
http://fr.wikipedia.org/wiki/Bruno_Oudet (re. Frognet)
http://www2.universitybusiness.com/viewarticle.aspx?articleid=57 ("What went wrong with AllLearn?")
30^ Mitchell Kapor: http://www.ischool.berkeley.edu/about/events/dls11092005
32^ Theodor Holm Nelson. Literary machines : the report on, and of, Project Xanadu concerning word processing, electronic publishing, hypertext, thinkertoys, tomorrow’s intellectual revolution, and certain other topics including knowledge, education and freedom (Swarthmore, Pennsylvania : Theodor Holm Nelson, 1987).
33^ See just for an example my own little list, of all the Internet-connected libraries in France which I then could find -- now they're all online, or nearly so -- at, http://www.fyifrance.com/Fyarch/fy940115.htm
36^ Gerard Salton. Proposals for a dynamic library (Ithaca, New York : Cornell University, Computer Science Department, n.d.) (Springfield, Virginia : National Technical Information Service, 1972) ; Automatic information organization and retrieval (New York : McGraw-Hill, 1968) ; Introduction to modern information retrieval (New York : McGraw-Hill, c1983) ; Automatic text processing : the transformation, analysis, and retrieval of information by computer (Reading, Massachusetts : Addison-Wesley, 1988).
37^ Tim Berners-Lee, James Hendler, Ora Lassila. "The Semantic Web", in Scientific American (May 17, 2001) http://www.sciam.com/article.cfm?id=00048144-10D2-1C70-84A9809EC588EF21&print=true
39^ Tessaleno C. Devezas, Harold A. Linstone, Humberto J.S. Santos. "The growth dynamics of the Internet and the long wave theory", in Technological Forecasting and Social Change, doi:10.1016/j.techfore.2005.06.001, Elsevier, http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V71-4GY88V9-1&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&vi=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=f4569fd7b1390a835defb526f1834776 -- "The phenomenal growth of Internet users is slowing down and we expect to approach a world limit in the next decade of about 14% of the world population..."
40^ Jean Noël Jeanneney. Google and the myth of universal knowledge : a view from Europe (Chicago : University of Chicago Press, 2007), translation of "Quand Google défie l'Europe" -- The president of the Bibliothèque nationale de France, vs. Google's digitization of the world's libraries, also vs. "information retrieval" which might (?) pull up the Baroness Orczy's version of La Révolution Française ahead of Victor Hugo's(!)... See also, by Jack Kessler: Google digital library vs. France?... & vs. others?, http://www.fyifrance.com/Fyarch/fy050215.htm; Google Inc., libraries digital & other, http://www.fyifrance.com/Fyarch/fy050515.htm
41^ Xerox PARC: http://www.parc.com/research/publications/results.php?author=944 -- see inter alia, M. D. Weiser, "Some computer science issues in ubiquitous computing", in D. Milojicic, F. Douglis, and R. Wheeler, eds. Mobility: Processes, Computers and Agents (New York : Association for Computing Machinery, 1999) pages 421-430, also (Reading, Massachusetts : Addison-Wesley, c1999) ISBN 0201379287. Excerpt: "Ubiquitous computing enhances computer use by making many computers available throughout the physical environment, while making them effectively invisible to the user…" See also, M. Weiser, R. Gold, J. S. Brown. "Origins of ubiquitous computing research at PARC in the late 1980's", in IBM Systems Journal (Armonk, New York : International Business Machines Corporation, 1999) volume 38, number 4, ISSN 0018-8670, pages 693-696 ; and, http://en.wikipedia.org/wiki/PARC_(company)
42^ Barbara Kiviat. "The End of Customer Service", in Time Magazine, March 24, 2008, http://www.time.com/time/specials/2007/article/0,28804,1720049_1720050_1721684,00.html
43^ Jack Kessler: "Baby Bell Minitel? Internet Competition from the French Connection", in Connexions: the Interoperability Report (April 1994) volume 8, number 4, pages 2-11;
"The French Case: Networked Libraries, the Internet, and the Minitel", in Andy Ewars, Lorcan Dempsey, Jack Kessler. Libraries, Networks and Europe: A European Networking Study (London : The British Library, Research and Development Department, 1994) Library and Information Research Report 101, page 69;
"The French Minitel: Is There Digital Life Outside of the 'US ASCII' Internet? -- A Challenge, or Convergence?", in D-LIB Magazine, The Magazine of the Digital Library Forum (CNRI / Corporation for National Research Initiatives, December 1995) ISSN 1082-9873, http://www.dlib.org/dlib/december95/12kessler.html
44^ On higher education generally see Derek Bok, cit. supra. On health care generally see, http://en.wikipedia.org/wiki/Health_care_reform_in_the_United_States
53^ "Space of flows, flows of spaces..." Manuel Castells. Information Age (Malden, Massachusetts : Blackwell) : The rise of the network society (1997) v. 1; The power of identity (1997) v. 2; End of millennium (2000) v. 3.
59^ Humphrey Cheung. "Slap down the cash and grab a phone in Thailand", on TG Daily (March 25, 2008) Excerpt: "American mobile phone buyers endure pushy sales people, multi-year contracts and high prices, but it's a completely different story in Thailand and the rest of the civilized world. Here cash is king and I've just snagged a nice Samsung E250 phone, case and SIM card for a mere $138. But even more amazing is the time it took -- or, more accurately, didn't take. Most Thai shopping malls have an area devoted to mobile phone sales. Filled with dozens of small kiosks staffed by one to two people, these mom and pop stores hardly resemble the huge Verizon or Sprint stores in the USA. Beguiling Thai girls beam smiles at you, but don't get the wrong idea here -- you're basically a walking ATM machine that's ready to buy a phone. I braved the oppressive heat and choking pollution to Bangkok's massive MBK mall. Here at least 100 phone vendors are jammed together like sardines and that's a good thing for me. A booth to the left specializes in Nokia phones, while one to the right has all Samsung, oh look there's some unlocked iPhones ahead and very expensive Nokia N95s behind me..." http://www.tgdaily.com/content/view/36608/128/
62^ Stephen Baker. "Google and the Wisdom of Clouds", in Business Week, December 13, 2007, cover story, http://www.businessweek.com/magazine/content/07_52/b4064048925836.htm?chan=magazine+channel_top+stories
FYI France (sm)(tm) e-journal ISSN 1071-5916. FYI France (sm)(tm) is a monthly electronic journal published since 1992 as a small-scale, personal experiment, in the creation of large-scale "information overload", by Jack Kessler. Any material written by me which appears in FYI France may be copied and used by anyone for any good purpose, so long as, a) they give me credit and show my email address, and, b) it isn't going to make them money: if it is going to make them money, they must get my permission in advance, and share some of the money which they get with me. Use of material written by others requires their permission. FYI France archives may be found at http://email@example.com/ (BIBLIO-FR archive), or http://listserv.uh.edu/archives/pacs-l.html (PACS-L archive), or http://www.lib.berkeley.edu/Collections/FYIFrance/ or http://www.fyifrance.com . Suggestions, reactions, criticisms, praise, and poison-pen letters all gratefully received at firstname.lastname@example.org . Copyright 1992- , by Jack Kessler, all rights reserved except as indicated above.