Omar Rizwan

Against nostalgia in computing

We have seen an apparent disconnect between the prototypical child that OLPC imagined and the actual children who received laptops in Paraguay. The source of this disconnect was nostalgic design [...] one of the reasons that OLPC’s laptops were so charismatic for many in the world of computer and software development was because they resonated with the rosily mythologized stories that many told about their own childhoods, which tended to evoke the prototypical yearner child of Papert’s writings [...]

Instead, the children in Paraguay whom I encountered in my fieldwork were using their XO laptops as many around the world use computers today. About half of them were just not that into their machines. They found using the XO unrewarding, whether because of the limitations of the laptop itself or because of the fullness of their lives without it. Another one-third engaged with their laptops as media-rich, internet-connected sources of entertainment. Despite the charisma that a specialized laptop with lots of unique features and educational programs held for its creators and others in the technology world, these children were using their XO laptops as they would use any other computer [...]

In this environment, with these expectations, it is no wonder that children in Paraguay had little interest in exploring the components of the machine that resulted from OLPC developers’ nostalgic design; such components struggle to hold the attention of children everywhere in the face of an internet-connected, media-rich world. For these children, nostalgic technology design held no charisma; it was not their rose-tinted childhood memories being referenced.

(The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child)


A friend was asking me last week what I thought of "digital gardens". I was caught a little off-guard. I didn't even know what exactly he meant (is "Digital Gardens" a specific thing someone made? or is it the vague movement or clump of people online that I'm thinking of?)

But my first, gut answer was: "digital gardens" feels too shallow to me, and it feels backward-looking.

It doesn't engage with current technology and current uses of computers.


The average computer today is a smartphone with a multitouch screen, where you have one app open at a time, and it's usually some type of social media feed, probably mostly videos. It's not a Xerox PARC desktop; it's not windows; it's not files; it's not icons; it's not emails; it's not word processing. That potted history is dead; it's just something people recite now.

The average website today is one of five or ten sites -- Reddit, Twitter, Instagram, and so on. The context of the old web was a resource that doesn't really exist anymore.


We have to reject the idea that things were simply better in the past and that they can be copied uncritically into the present. This is a way of thinking that's just as limited as blindly jumping on new trends, and it's a way of thinking that just won't work -- the context then is not the same as the context now, and the people then are not the same as the people now.


Nostalgia is too powerful -- it's like a drug. It's too tempting to over-rely on it; you need to consciously reject it.




To the extent that we can learn from the past, we should learn from specific techniques and ways of thinking, not just copy whole systems. (What would the point of those systems be now?)

And when we learn those techniques, we should re-apply them in a modern context, to deal with modern problems.

We copy the idea that "everything is a file", but we apply that to the problems we actually encounter today (tab management, media management).
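
For example (a minimal sketch, not a real tool -- the mount point and file layout below are made up): if some FUSE-style filesystem exposed each open browser tab as a little text file, then listing, grepping, and closing tabs would just be ordinary file operations.

```python
from pathlib import Path

# Hypothetical mount point: imagine a FUSE-style filesystem that exposes each
# open browser tab as a small text file (first line: title, second line: URL).
# The path and layout here are invented for illustration.
TABS = Path("/mnt/browser/tabs")

# "ls" your browser: open tabs are just files you can list, grep, and script over.
for tab_file in sorted(TABS.iterdir()):
    title, url = tab_file.read_text().splitlines()[:2]
    print(f"{tab_file.name}\t{title}\t{url}")

# "rm" a tab: unlinking a file asks the filesystem to close the corresponding tab.
for tab_file in list(TABS.iterdir()):
    if "youtube.com" in tab_file.read_text():
        tab_file.unlink()
```

The point isn't this particular filesystem; it's that once tabs are files, every existing tool and habit -- shells, scripts, find, grep -- applies to them for free.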

We copy the integrated design and reprogrammability of a Lisp or Smalltalk machine, but we situate that in the physical world; we use compute and recognition and actuation systems that are only available now; we don't try to make another desktop operating system.

And we should go the other way, too: we should learn from the new, modern stuff that has been successful. We learn from the social media feed -- that's a new interaction model that works -- so why shouldn't we apply it to other problems? Why should we limit computing to plain-text terminals and REPLs and windows and mice?