A great gallery by Larry Luckham, who captured daily life at the Bell Labs data center where he worked in the late ’60s.
What’s immediately noticeable is the number of women working there. As a recent New York Times article explains, gender balance in tech used to be much better than it is today:
Employers simply looked for candidates who were logical, good at math and meticulous. And in this respect, gender stereotypes worked in women’s favor: Some executives argued that women’s traditional expertise at painstaking activities like knitting and weaving manifested precisely this mind-set. (The 1968 book “Your Career in Computers” stated that people who like “cooking from a cookbook” make good programmers.)
Things changed for the worse when programming gained prestige (back then, developing software held lower status than building hardware) and when personal computers entered everyday life (parents encouraged boys to play with them, less so girls).
[Kodak] didn’t develop a better film for rendering different gradations of brown until economic pressure came from a very different source: Kodak’s professional accounts. Two of their biggest clients were chocolate confectioners, who were dissatisfied with the film’s ability to render the difference between chocolates of different darknesses. “Also,” Connor says, “furniture manufacturers were complaining that stains and wood grains in their advertisement photos were not true to life.”
The New York Times, in an article full of impressive photos:
A conveyor that staff members call “the Cable Highway” moves the cable directly into Durable, docked in the Piscataqua River. The ship will carry over 4,000 miles of cable weighing about 3,500 metric tons when fully loaded.
Inside the ship, workers spool the cable into cavernous tanks. One person walks the cable swiftly in a circle, as if laying out a massive garden hose, while others lie down to hold it in place to ensure it doesn’t snag or knot. Even with teams working around the clock, it takes about four weeks before the ship is loaded up with enough cable to hit the open sea.
From the same people behind Calling Bullshit.
Nine people came together at CERN for five days and made something amazing. I still can’t quite believe it.
Coming into this, I thought it was hugely ambitious to try to not only recreate the experience of using the first ever web browser (called WorldWideWeb, later Nexus), but to also try to document the historical context of the time.
The documentation itself is well worth a read:
Today it’s hard to imagine that web browsers might also be used to create web pages. It turned out that people were quite happy to write HTML by hand—something that Tim Berners-Lee and colleagues never expected. They thought that some kind of user interface would be needed for making web pages and links. That’s what the WorldWideWeb browser provided. You could open a document in one window and “mark” it. Then, in a document in another window, you could create a link to the marked page.
You’ll notice as you use the WorldWideWeb browser that you need to double-click links to open them. That’s because a single click was used for editing.
Internet writing uses subtle punctuation choices to convey sarcasm and other nuances of tone. It’s not lazy.
You know, the fact that, for example, the full stop in the context of a text isn’t used to mark the ending of said text but rather to convey to the addressee our utter aversion to their existence.
In its first era of popularity, it was all pop and pulp, but now it seems reserved for the task of adding just the slightest bit of a smirk to extremely straight-faced endeavors: elegant magazines, important books, experimental theater, and $80 ceramic pipes.
I didn’t realise how popular this typeface was until I stumbled across this article and started noticing it in bookshops and in books I myself own (Mark Greif’s Against Everything).
It was used on the first cover of James Joyce’s Finnegans Wake and for the credits of Friends. Quite a weird mix.
The menu bar has been, and in my opinion remains, the best mechanism for providing familiarity, discoverability, and progressive disclosure in user interfaces on any platform. Even beyond the Mac, anyone who has clicked on a File menu in one platform has a pretty good shot at guessing where a Save command might be when provided a File menu somewhere else.
Speaking of how the model of reality which a technology adopts can end up influencing and changing reality itself, here’s George Dyson:
Their models are no longer models. The search engine is no longer a model of human knowledge, it /is/ human knowledge. What began as a mapping of human meaning now defines human meaning, and has begun to control, rather than simply catalog or index, human thought. No one is at the controls. If enough drivers subscribe to a real-time map, traffic is controlled, with no central model except the traffic itself. The successful social network is no longer a model of the social graph, it is the social graph. This is why it is a winner-take-all game. Governments, with an allegiance to antiquated models and control systems, are being left behind.
Maps make for a good example here. We’re all aware that the Mercator projection is an inaccurate model of reality, one that distorts the true size of countries and is skewed in favour of Europe. Nonetheless, that’s what we use to describe the world; it’s what we think of when we think of a map: the default, almost natural, choice.
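Just how inaccurate is easy to quantify: the Mercator projection stretches both map axes by the secant of the latitude, so apparent area is inflated by sec² φ relative to the equator. A minimal sketch (the latitudes chosen here are just illustrative):

```python
import math

def mercator_area_inflation(lat_deg):
    """Mercator stretches both axes by sec(latitude), so apparent
    area is inflated by sec(latitude)^2 relative to the equator."""
    return 1 / math.cos(math.radians(lat_deg)) ** 2

# The equator is the reference (1.0x); Greenland sits around 72°N.
for lat in (0, 45, 72):
    print(f"{lat:>2}°N: shown at {mercator_area_inflation(lat):.1f}× true area")
```

At 72°N the map shows land at roughly ten times its true area, which is why Greenland looks comparable to Africa despite being about a fourteenth of its size.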
The risk here is for a model to become so ingrained that we end up forgetting that we had other options — that what we’re using is in fact just a model.