“Mobile 1.0” is the first account of the Finnish electronics company’s expansion from a small business into a global player in the mobile phone industry, beating huge established brands. The first season will focus on the years 1988-1990, when technology for mobile phones was in its infancy.
Those who want to reminisce a little more might be interested in these videos from Michael Fisher, aka Mr Mobile.
When phones were fun – YouTube Playlist In “When Phones Were Fun,” Michael Fisher re-reviews cellphones from the golden age of mobile, the decade-long span of experimentation from the turn of the century to approximately 2009. From one-of-a-kind relics like the Samsung Matrix Phone and Motorola AURA, to mainstream smash hits like the T-Mobile Sidekick, “When Phones Were Fun” is 50% retro review, 50% mobile-tech history lesson … and 100% nostalgia comfort-food goodness!
But perhaps I should be more optimistic about current phone designers. Not all of them make glossy, black rectangles. Some are designing glossy, black rectangles that bend and swivel.
After 25 years, the original Space Jam website has been replaced – Esquire Middle East The original website, launched in 1996, became a viral phenomenon in the early 2010s, as an internet that had evolved far past the 56k dial-up modem found the site completely untouched from what it had once been. In an online world in which it often seems nothing is preserved, visiting the website felt genuinely like discovering the Tomb of Tutankhamun.
Web designer Max Böck compares the resources and loading times of the two versions. Progress?
Here’s something from the Web Design Museum for those in the mood for more movie reminiscences.
Following the war, Ottens obtained an engineering degree, and he started work at the Philips factory in Hasselt, Belgium, in 1952. Eight years later he was promoted to head of the company’s newly established product development department, and within a year he unveiled the EL 3585, Philips’s first portable tape recorder, which would go on to sell more than a million units.
But it was two years later that Ottens made the biggest breakthrough of his life – born out of annoyance with the clumsy and large reel-to-reel tape systems of the time. “The cassette tape was invented out of irritation about the existing tape recorder, it’s that simple,” he would later say.
I like the idea that it’s irritation and not necessity that’s the mother of invention. But as we’ve seen before, time is unstoppable, change is inevitable, people are fickle. As Things Magazine says, “How strange to have seen your invention lauded and adopted worldwide, before slowly and inexorably fading out of view, only to have a strange reemergence right at the end of your life.” At least he got to see 90.
Lou Ottens, inventor of the cassette tape, has died – NPR The resurgence is driven by a mix of nostalgia and an appreciation for tapes’ unique status as a tangible but flexible format. For decades, music fans have used mixtapes to curate and share their favorite songs. Unsigned bands have also relied on them as a way to promote their music. Those who have used cassettes to quickly record music include the Rolling Stones’ Keith Richards, who famously said he captured one of his band’s biggest songs in the middle of the night.
“I wrote ‘Satisfaction’ in my sleep,” Richards wrote in Life, his 2010 autobiography. Adding that he had no memory of writing the song, Richards said he woke up one morning to find that his Philips cassette recorder was at the end of its tape — apparently, he concluded, he had written something during the night. When Richards rewound the tape, he heard the song’s now-iconic guitar riff and his voice saying, “I can’t get no satisfaction.”
Rubbish mixtape: fan reunited with cassette 25 years after losing it – The Guardian Stella Wedell was 12 when she took the tape on a Spanish holiday to listen to songs by the likes of Pet Shop Boys, Shaggy and Bob Marley on her Walkman. Wedell, from Berlin, lost the tape either on the Costa Brava or in Mallorca and was astounded when she spotted it a quarter of a century later in an exhibition by the British artist and photographer Mandy Barker, who specialises in creating pieces out of plastic marine debris.
I got off the iPhone conveyor belt around iPhone 4, I think, so all the recent talk about the new Windows Phone-style widgets within iOS 14 has passed me by.
These iOS 14 apps offer home screen widgets and more – 9to5Mac Apple has officially released iOS 14, iPadOS 14, and watchOS 7 to the public. The updates bring many new features to iPhone, iPad, and Apple Watch users, and third-party developers have been hard at work on updating their apps to take advantage of Apple’s latest tools.
I know I’m biased, but – I don’t know – is it all starting to look a little too busy? I’d much rather look at these home screens.
iOS: A visual history – The Verge Although it may be difficult to imagine now, when the original iPhone was introduced, it was actually well behind the competition when it came to a strict feature-by-feature comparison. Windows Mobile, Palm OS, Symbian, and even BlackBerry were all established systems in 2007, with a wide and deep array of features. Comparatively, the iPhone didn’t support 3G, it didn’t support multitasking, it didn’t support 3rd party apps, you couldn’t copy or paste text, you couldn’t attach arbitrary files to emails, it didn’t support MMS, it didn’t support Exchange push email, it didn’t have a customizable home screen, it didn’t support tethering, it hid the filesystem from users, it didn’t support editing Office documents, it didn’t support voice dialing, and it was almost entirely locked down to hackers and developers.
Yet all of those missing features hardly mattered and nearly everybody knew it.
I first read this on my phone and now here I am, blogging about it on my tablet.
Why laptops could be facing the end of the line – The Conversation Research shows that PC and laptop ownership, usage and importance have declined over the past three years, replaced largely by smartphones. A survey of internet users found just 15% thought their laptop was their most important device for accessing the internet, down from 30% in 2015, while 66% thought their smartphone was most important, up from 32%.
This has led some commentators to predict the slow death of the laptop because of young people’s preference for and greater familiarity with the devices in their pocket. But a survey by UK regulator Ofcom in 2017 also found a record rise in older people using smartphones and tablets.
How your laptop ruined your life – The Atlantic
As laptops have kept improving, and Wi-Fi has continued to reach ever further into the crevices of American life, however, the reality of laptops’ potential stopped looking quite so rosy. Instead of liberating white-collar and “knowledge” workers from their office, laptops turned many people’s whole life into an office. Smartphones might require you to read an after-hours email or check in on the office-communication platform Slack before you started your commute, but portable computers gave workers 24-hour access to the sophisticated, expensive applications—Salesforce CRM, Oracle ERP, Adobe Photoshop—that made their full range of duties possible.
Mobiles, mobiles, mobiles. They’re practically compulsory these days.
When Barack Obama became president in 2008, he famously fought for (and won) the right to keep using his Blackberry (the phone would become the official device given out by large swathes of the federal government, including Congress). And years before reverting to a “dumb phone” became a thing for trendsetters like Anna Wintour, magazine writers tried the same stunt to lessen their Blackberry usage. One Daily Mail headline from 2006 warned of a “Blackberry addiction ‘similar to drugs,’” describing the kind of behavior we now readily associate with social media and phones more generally (“One key sign of a user being addicted is if they focus on their Blackberry ignoring those around them.”).
But for those of us who are/were, here’s a look back at a simpler time.
Old CSS, new CSS – Fuzzy Notepad Damn, I miss those days. There were no big walled gardens, no Twitter or Facebook. If you had anything to say to anyone, you had to put together your own website. It was amazing. No one knew what they were doing; I’d wager that the vast majority of web designers at the time were clueless hobbyist tweens (like me) all copying from other clueless hobbyist tweens. … Everyone who was cool and in the know used Internet Explorer 3, the most advanced browser, but some losers still used Netscape Navigator so you had to put a “Best in IE” animated GIF on your splash page too. […]
Sadly, that’s all gone now — paved over by homogenous timelines where anything that wasn’t made this week is old news and long forgotten. The web was supposed to make information eternal, but instead, so much of it became ephemeral. I miss when virtually everyone I knew had their own website. Having a Twitter and an Instagram as your entire online presence is a poor substitute.
The Czech play that gave us the word ‘Robot’ – The MIT Press Reader Thus, “R.U.R.,” which gave birth to the robot, was a critique of mechanization and the ways it can dehumanize people. The word itself derives from the Czech word “robota,” or forced labor, as done by serfs. Its Slavic linguistic root, “rab,” means “slave.” The original word for robots more accurately defines androids, then, in that they were neither metallic nor mechanical.
The contrast between robots as mechanical slaves and potentially rebellious destroyers of their human makers echoes Mary Shelley’s “Frankenstein” and helps set the tone for later Western characterizations of robots as slaves straining against their lot, ready to burst out of control. The duality echoes throughout the twentieth century: Terminator, HAL 9000, Blade Runner’s replicants.
Machine Morality and Human Responsibility – The New Atlantis This year marks the ninetieth anniversary of the first performance of the play from which we get the term “robot.” The Czech playwright Karel Čapek’s R.U.R. premiered in Prague on January 25, 1921. Physically, Čapek’s robots were not the kind of things to which we now apply the term: they were biological rather than mechanical, and humanlike in appearance. But their behavior should be familiar from its echoes in later science fiction — for Čapek’s robots ultimately bring about the destruction of the human race.
Before R.U.R., artificially created anthropoids, like Frankenstein’s monster or modern versions of the Jewish legend of the golem, might have acted destructively on a small scale; but Čapek seems to have been the first to see robots as an extension of the Industrial Revolution, and hence to grant them a reach capable of global transformation. Though his robots are closer to what we now might call androids, only a pedant would refuse Čapek honors as the father of the robot apocalypse.
I hope someone’s planning a big celebration next year.
I do miss the early web, sometimes. Amateurish, in a good way—spontaneous, care-free, lighthearted.
The early internet, explained by one weird Celine Dion fan site – The Atlantic Celine Dreams was a bit of a sensation. Toroptsov never lacked for dream submissions, and at the turn of the century—before the internet was a corporatized monoculture repeated across only a handful of giant web properties—a scrappy, DIY fan site could easily build an audience by climbing up search rankings and encouraging active participation. For years, Celine Dreams appeared in the first page of Google and Yahoo search results for Celine Dion—a distinction now reserved for Celine Dion’s official website, Celine Dion’s Wikipedia page, Celine Dion’s Twitter page, Celine Dion on Spotify, and Celine Dion on YouTube.
And then it shut down, blinking out at the same time as thousands of other fan sites. The whole ecosystem slid into the digital ocean slowly, but pretty much all at once, like a famous ship.
More of these fan sites disappear all the time, and the Wayback Machine isn’t able to keep even a near-perfect record. Toroptsov’s project, and the work of his “competitors,” are vanishing in what information scientists have long been referring to as the “digital dark age.” “However widely the myth of the automatically archival Internet has spread over the past 70 years, the fact is that the system of networked computing utterly fails as a memory machine,” the UC Berkeley media researcher Abigail De Kosnik writes in her 2016 book, Rogue Archives. “The internet and computers do not constitute the greatest archive in human history, but rather the reverse.”
This applies to iconic software, too.
The last vestige of Internet Explorer dies today – Gizmodo When Microsoft decided to use EdgeHTML, it made sense. Internet Explorer had once been the biggest web browser around and consequently, lots of web page designers focused their energies on making their sites work for IE. But Chrome had a foothold when Edge launched and Microsoft’s new browser just never gained the popularity it needed. Instead, more and more web page designers focused on making the best-looking sites they could—for Chrome.
Chrome uses the Blink engine and the source code originates with the open-source Chromium project. The Edge that launches today will rely on Blink and Chromium too.
Some people are clinging on, though. I’ve been reading Joanne McNeil’s newsletter for a while now, and her website is joyously web 1.0.
joannemcneil.com Hi, my name is Joanne McNeil and this is my Home Page on the World Wide Web. My book Lurking is out on February 25, 2020 with MCD.
And do you remember Noah Everyday from the 2000s? He’s back again, and doesn’t look a day older. Ok, that’s a lie. He looks older, we all do.
Here’s a breakdown of the seemingly inconsequential design decisions that led to very significant changes in how we communicate and relate to each other. Take the ‘typing indicator’, for instance…
The loss of micro-privacy – Medium The typing indicator elegantly solved what the team had set out to solve. But it also did a bit more than that. Apart from increased engagement, it also single-handedly introduced a whole new level of emotional nuance to online communication. This seemingly small detail inadvertently conveyed things no message by itself ever could. Picture this scenario:
Bob: “Hey Anna! It was so great to meet you. You’d like to go out for a drink tonight?”
Anna: “Starts typing…”
Anna: “Stops typing…”
Anna: “Starts typing again…”
How convinced is Anna really? You might have experienced it yourself: the angst of prolonged typing indicators followed by a short response or even worse: nothing! Bob might have been happier if he hadn’t observed Anna’s typing pattern. But he did. And now he wonders how such a tiny animation can have such a profound impact on how he feels…
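The mechanic behind that tiny animation is simpler than the feelings it provokes. Here’s a toy sketch of how a typing indicator decides when to show and hide the “…” bubble (the class, timings and method names are all invented for illustration; real messaging apps send typing events over the network and differ in the details):

```python
# Toy model of a typing indicator (everything here is illustrative).
# The sender's client reports keystrokes; the recipient's client shows
# the "..." bubble while keystrokes are recent, and hides it after a
# few seconds of silence -- which is exactly the start/stop/start
# pattern that tormented Bob above.

class TypingIndicator:
    TIMEOUT = 5.0  # seconds of silence before the bubble disappears

    def __init__(self):
        self.last_keystroke = None  # timestamp of the most recent key press

    def on_keystroke(self, now):
        """Called whenever the other person presses a key."""
        self.last_keystroke = now

    def is_typing(self, now):
        """Should we be showing the '...' bubble right now?"""
        return (self.last_keystroke is not None
                and now - self.last_keystroke < self.TIMEOUT)

indicator = TypingIndicator()
indicator.on_keystroke(now=100.0)
print(indicator.is_typing(now=102.0))  # True  -- Anna is typing...
print(indicator.is_typing(now=106.0))  # False -- Anna stopped. Poor Bob.
```

Note that Bob never learns *why* the bubble vanished: deleted draft, distraction, or doubt all look identical, which is where the micro-privacy goes.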
Ouch. It was so much easier in the old days. Well, perhaps it was just easier through these rose-tinted glasses, but it was certainly different, as these interviews from The Atlantic explain.
How the loss of the landline is changing family life – The Atlantic
“The shared family phone served as an anchor for home,” says Luke Fernandez, a visiting computer-science professor at Weber State University and a co-author of Bored, Lonely, Angry, Stupid: Feelings About Technology, From the Telegraph to Twitter. “Home is where you could be reached, and where you needed to go to pick up your messages.” With smartphones, Fernandez says, “we have gained mobility and privacy. But the value of the home has been diminished, as has its capacity to guide and monitor family behavior and perhaps bind families more closely together.”
(It reminds me of an article I found last year, about when we would have just the one shared family computer. Now everyone has their own computer on them at all times, one that they’re very reluctant to part with.)
What’s more, the calls, texts, and emails that pass through cellphones (and computers and tablets) can now be kept private from family members. “It keeps everybody separate in their own little techno-cocoons,” says Larry Rosen, a retired psychology professor at California State University at Dominguez Hills and a co-author of The Distracted Mind: Ancient Brains in a High-Tech World. Whereas early landlines united family members gathered in a single room, cellphones now silo them.
This part particularly resonated with me, as a parent of two teenagers.
Cheryl Muller, a 59-year-old artist living in Brooklyn, raised her two sons, now 30 and 27, during the transition from landline to cellphone. “I do remember the shift from calling out ‘It’s for you,’ and being aware of their friends calling, and then asking them what the call was about, to pretty much … silence,” she says. Caroline Coleman, 54, a writer in New York City whose children grew up during the same transition, recalls how at age 10 her son got a call from a man with a deep voice. “I was horrified. I asked who it was—and it was his first classmate whose voice had changed,” she said. “When you get cells, you lose that connection.”
This is particularly important for teenagers, who are at an important stage in their development. They need to make close friends and renegotiate relationships with their parents. Making friends allows teenagers to learn how to interact with others, learn more about themselves and find their own place in the world. Mobile tech allows teenagers to stay in touch with others and can help them develop closer, more supportive friendships.
That 10,000-year clock is being built in remote, mountainous west Texas, a location thought to be safe from whatever the future might have in store for us. Here’s news of another long-term preservation project.
Microsoft apocalypse-proofs open source code in an Arctic cave – Bloomberg
This is the Arctic World Archive, the seed vault’s much less sexy cousin. Friedman unlocks the container door with a simple door key and, inside, deposits much of the world’s open source software code. Servers and flash drives aren’t durable enough for this purpose, so the data is encoded on what look like old-school movie reels, each weighing a few pounds and stored in a white plastic container about the size of a pizza box. It’s basically microfilm. With the help of a magnifying glass, you—or, say, a band of End Times survivors—can see the data, be it pictures, text, or lines of code.
What starts out as a quirky vanity project/photo op for the GitHub CEO comes to be seen, at the very end of the article, and for him personally, as a timely precaution against a world “fundamentally weirder than it was 20 years ago”.
Teletext was slow but it paved the way for the super-fast world of the internet
The BBC has announced that 2020 will mark the end of the Red Button text service – the final incarnation of what was originally known as CEEFAX and Oracle. Those old text-based TV services would seem ridiculously clunky and old-fashioned to an internet generation used to instant streaming and apps for everything. But – as slow and frustrating as that old text system was – it paved the way for the World Wide Web and helped prepare us for the world of social media.
A kind of internet but without social media — what could be better? It wasn’t quick though, was it?
When you fetch a web page, your browser sends a request to the server and the server sends the requested data back to you. CEEFAX, on the other hand, sent each page in turn, on a sort of endless loop. So you would put in the page number you wanted to see using your remote control, but it could take some time before that page came around again. It was a bit like waiting for your favourite sushi dish at one of those Japanese restaurants which use a conveyor belt to deliver the food, or your suitcase at an airport baggage claim.
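The conveyor-belt analogy is easy to put into numbers. A toy model of a broadcast page carousel (the page count and per-page timing are invented for illustration; real CEEFAX magazines and cycle times varied): your wait depends entirely on where the loop happens to be when you punch in the page number, averaging out at about half a full cycle.

```python
# Toy model of a teletext-style page carousel (all numbers illustrative).
# Pages are broadcast one after another in a fixed loop; a viewer who
# keys in a page number must wait until the carousel comes around to it.

def carousel_wait(total_pages, seconds_per_page, current_pos, wanted_page):
    """Seconds until `wanted_page` is next broadcast, given the carousel
    is currently transmitting the page at index `current_pos`."""
    pages_until = (wanted_page - current_pos) % total_pages
    return pages_until * seconds_per_page

# Assume a 100-page magazine at 0.25 s per page, purely for the arithmetic.
total, per_page = 100, 0.25

# Worst case: the page you want has just gone past.
worst = carousel_wait(total, per_page, current_pos=11, wanted_page=10)

# Average over all possible carousel positions: about half the loop time.
avg = sum(carousel_wait(total, per_page, p, 10) for p in range(total)) / total

print(worst)  # 24.75  -- nearly a full loop
print(avg)    # 12.375 -- roughly half a loop
```

A web request, by contrast, is served on demand: no waiting for the belt to come back around, which is the whole point of the sushi comparison.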
How about this for an unsettling glimpse into the future?
Hyper-reality Hyper-Reality presents a provocative and kaleidoscopic new vision of the future, where physical and virtual realities have merged, and the city is saturated in media.
It serves as the introduction to this fantastic overview of augmented reality in urban environments.
City Skins: Scenes from an augmented urban reality In one scene, the film’s protagonist-user (“Juliana”) becomes confused, even anxious, by a technical glitch which forces a reboot of her device while shopping for food, showing the viewer a brief glimpse of an un-augmented and totally featureless supermarket, clearly designed for the express purpose of accommodating a digital overlay. Matsuda’s film ultimately suggests that augmented reality may become so commonplace as to be essential to making sense of the world.
However futuristic it may seem, location-based augmented reality (virtual reality’s more successful but less hyped cousin) has been around for a while.
Growing interest in location-based AR projects, beginning in the late 1990s, can be in part attributed to the confluence of art and networking technologies which emerged out of the gradual popularization of the Internet and the influence of “net art.” Net art, according to critic Josephine Bosma, has often concerned itself with “the public domain as a virtual, mediated space consisting of both material and immaterial matter,” indicating a conceptual and ethical foundation for augmented reality’s radical leap from the space of the screen to a “hybrid space” mixing real and virtual elements.
Near the tail end of the 20th century, pseudonymous author and technologist Ben Russell released The Headmap Manifesto — a utopian vision of augmented reality referencing Australian aboriginal songlines and occult tomes, while pulling heavily from cybernetic theory and the Temporary Autonomous Zones of Hakim Bey. At turns both wildly hypothetical and eerily prescient, Headmap explores in-depth the implications of “location-aware” augmented reality as a kind of “parasitic architecture” affording ordinary people the chance to annotate and re-interpret their environment.
That might sound too abstract and theoretical, but here’s an example of a very real-world, poignant use of AR.
Following the release of the first iPhone and advancements in mobile phone cameras and processing power, AR began to move toward the more visually-dominant experiences we are familiar with today — in the process also opening up possibilities for more explicitly political projects. The group 4 Gentlemen, for instance, embraced AR as a tool for criticizing oppressive government policies in China. A collective of exiled Chinese artists and one American artist, 4 Gentlemen (taking their name from a group of intellectual dissidents central to the Tiananmen Student Protest in 1989) developed a series of works that digitally recreated in situ both the famous “Tank Man” image and the “Goddess of Democracy” statue — two symbols of the Tiananmen protest which have defined the struggle for democracy and human rights in China since.
We take so much for granted these days: screens are everywhere, moving images all around us. It’s hard to imagine what it must have been like before this deluge.
Our ideas about what early movies looked like are all wrong
During the first film screenings in the 1890s, viewers marvelled at moving images that had an unprecedented power to transport them to faraway places in an instant. At first, these shorts – which included glimpses of everything from Niagara Falls to elephants in India – had no narrative structure. Audiences flocked to theatres simply for the novel experience of seeing people and places, some familiar and others deeply strange, rendered lifelike and immediate before their eyes. And, as the film curator Dave Kehr explains in this video from New York City’s Museum of Modern Art (MoMA), the images were hardly the grainy and frantically paced footage that has become synonymous with ‘old film’ today. Rather, viewed in their original form on large screens and prior to decades of degradation, these movies were vivid and realistic. In particular, early 68mm film, which was less practical than 35mm film and thus used less frequently, delivered startlingly lifelike impressions of distant realities to early moviegoers.
It’s quite arrogant of us to dismiss those early films as merely a stepping stone to our superior technologies today. You could argue that, given the quality of these new versions and the freshness of those first audiences, these movies made more of an impact than what we see today.
And perhaps the same can apply to television a few decades later.
#OnThisDay 1977: Thanks for the Memory – The Viewer's View asked people what they thought about television.
This woman was likely approached for a snappy vox pop, but ended up delivering something closer to a dramatic monologue. Fantastic stuff. pic.twitter.com/vzQTAe3ujz
Here’s an addition to the god-that-makes-me-feel-old list — the Walkman turns 40 this year. Fancy having to explain to someone what a Walkman was. Or what Napster was…
Walkman turns 40 today: How listening to music changed over the years Though it was first invented 40 years ago, in 1979, the iconic cassette tape player defined the decade when legwarmers weren’t part of costumes and Reaganomics ruled the land. It was the first device that allowed listeners to take music with them on the go (hence, the name).
Since then, we’ve evolved to CDs, iPods, and the current age of streaming services like Spotify and Apple Music. It’s easy to forget how revolutionary the Walkman was for its time, and that it marked a pivotal moment in the nearly 150-year-old history of recorded music.
With that in mind, here’s a look at how we’ve listened to music through the years — from the 1800s to today.
There are some great photos here. We’ve certainly gone through a number of formats over the years. I wonder what’s next.
You’d think they had had their day, but photo booths are still going strong, as this recent Quartz Obsession round-up shows. But first, some highlights from its long history.
Photo booths 1890: The Bosco Automat, an early automated photo machine requiring a human operator, debuts at the First International Exposition of Amateur Photography.
1900: Eastman Kodak debuts the Brownie camera with a price point of just $1.
1925: The first curtain-enclosed booth costs users 25 cents for a strip of 8 photos.
1929: Surrealist René Magritte uses the new technology for his work Je ne vois pas la cachée dans la forêt (“I do not see the woman in the forest”).
1930s: Bluesman Robert Johnson takes a photo booth self-portrait; when he becomes a posthumous legend, it’s made into a US postage stamp.
1953: John and Jackie Kennedy step into a photo booth during their honeymoon.
1963-1966: Andy Warhol manipulates hundreds of photo booth images into silkscreen images.
1993: Photo-Me offers the first digital photo booth.
1995: Video game company Sega introduces a photo machine that prints stickers, launching the purikura selfie trend in Japan.
2015: Photo Booth Expo launches in Las Vegas with 1,200 attendees. In 2019, expo attendance topped 4,000.
It seems smartphones are fully ubiquitous now and Instagram accounts are practically compulsory, but the photo booth is still here, for a new audience.
While the OG photo booth certainly has its charms, we’ve come a long way since the days of black-and-white photography and eight-minute processing times. Purikura is a Japanese photo booth experience that allows users to pick music for their photo shoots, choose lighting, enhance the photo by drawing on it, add digital backgrounds, retouch, and more.
Japanese Photo Sticker Booths: Purikura Adventure Forget the selfie! It’s time to discover the Japanese Purikura Photo Sticker Booths. It’s hard to compete with the almighty smartphone these days, but the purikura “print club” booths have evolved into a US$50M a year industry and a place where one can be beamed into the professional model fantasy world – if only for 10 minutes.
For its anniversary, we’re looking back at some of our favorite websites, from A to Z, as well as some key people and technologies. Of course, there was far too much good stuff to include, so we had to note some additional favorites along the way.
Yes there are the obvious ones like Flickr and Geocities, but what about these blasts from the past?
JENNICAM Jennifer Ringley started broadcasting every moment spent in her college dorm, by way of grainy photos uploaded every 15 minutes, in 1996. She was one of the first people to share her life online without a filter, offering a sense of intimacy and relatability that we now take for granted with digital celebrities. She was also one of the first people to discover the pitfalls of internet fame, including burnout after living years of her life in public, which is why she’s stayed mostly offline since 2003 when Jennicam went dark. […]
LIVEJOURNAL Once upon a time (around the turn of the 21st century), there was a social network called LiveJournal where large numbers of people (some with very confusing pseudonyms) hung out, blogged, argued in long comment threads, posted fiction and poetry and art, and had a generally good time. In 2007, LiveJournal was sold to a Russian media company, and many of its original contributors eventually decamped to Facebook, Twitter, and other foreign climes. LiveJournal is still, well, live; its servers (and its user agreement) are now Russian and so are many of its users.
A review of two books on the histories and possible futures of the internet, which try to position themselves somewhere between the more common approaches that recent studies have taken — either deterministic accounts of the “improbable marriage of countercultural hippie experiments and the military-industrial complex”, or heroic tales from Silicon Valley of “whimsical personalities and talents of digital entrepreneurs and inventors”.
Counter-histories of the Internet Two recent books address similar speculative scenarios in the course of offering alternative histories of the internet: David Clark’s Designing an Internet and Joy Lisi Rankin’s A People’s History of Computing in the United States. Clark’s book introduces its readers to scientists who designed our networks, many of whom still dream of redesigning them. Rankin writes about groups of students and researchers who used early computers with uncommon egalitarianism. Both authors wonder why versions of the internet that they personally favor have not prevailed. They also hope that recalling such forgotten projects could inspire their readers to fight for a better digital future. […]
Rankin explicitly describes herself as “highlight[ing] the centrality of education—at all levels—as a site of creativity, collaboration, and innovation.” More obliquely, but no less forcefully, Clark tries to free his readers from a myopic view of web architecture as a given landscape within which we pursue our goals and interests without considering how that landscape came to be. He shows that knowing more about how the web was built, or could have been built, allows us to think more freely about how we distribute our capacities and resources within it.
It’s an interesting debate, though I worry it may be a little redundant — do the top execs at Facebook, Google and Amazon have these books on their reading list?
It’s hard to believe the web’s thirty years old already. It seems like it’s been around forever in the way it underpins everything we do, from TV watching to banking. But we’re still grappling with the consequences its introduction has had on our societies, and probably will for another thirty years yet.
But let’s step back a little, to how it all began.
CERN 2019 WorldWideWeb rebuild In December 1990, an application called WorldWideWeb was developed on a NeXT machine at The European Organization for Nuclear Research (known as CERN) just outside of Geneva. This program – WorldWideWeb – is the antecedent of most of what we consider or know of as “the web” today.
In February 2019, in celebration of the thirtieth anniversary of the development of WorldWideWeb, a group of developers and designers convened at CERN to rebuild the original browser within a contemporary browser, allowing users around the world to experience the rather humble origins of this transformative technology.
Their timeline is very interesting, too: “thirty years of influences leading up to (and the thirty years of influence leading out from) the publication of the memo that led to the development of the first web browser.”
But all good things come to an end, and another one of the big players from back in the day is no more.
A eulogy for AltaVista, the Google of its time You appeared on the search engine scene in December 1995. You made us go “woah” when you arrived. You did that by indexing around 20 million web pages, at a time when indexing 2 million web pages was considered to be big.
Today, of course, pages get indexed in the billions, the tens of billions or more. But in 1995, 20 million was huge. Existing search engines like Lycos, Excite & InfoSeek (to name only a few) didn’t quite know what hit them. With so many pages, you seemed to find stuff they and others didn’t.