Slate recently ran a piece on the dangers of technological ignorance. It referenced a quote from the Wall Street Journal about the Apple Watch:
When I wear a mechanical watch, I think of the precision required to make it work on such a micro scale. That’s pretty awesome as opposed to just a chip.
This reminds me a lot of when I talk to my dad about technology. He’ll often point to his computer, his smartphone, or the TV remote and say, “It should just work!” That statement seems as obvious to him as it is absurd to me. Of course, as programmers, we want our users to think that it just works. We want to abstract away our processes for any number of reasons: proprietary codebases, security, customer satisfaction. There are reasons we moved from CLIs to GUIs.
It makes me happy when I see people interacting with something I’ve built and it appears to just work. But it’s worrisome when the basic details of my job cause my friends’ eyes to glaze over. And people can get downright vitriolic when an app crashes or something in the hardware is wrong. I got that angry myself when I updated to iOS 9.1 and suddenly my messages app stopped functioning. There are so many hidden levels of complexity in every device we work with. When something goes wrong it’s impossible for the layperson to diagnose. “It just works” apps make users feel empowered when everything functions as predicted. However, when things go wrong it can lead to feelings of helplessness and rage.
Traditional gen-ed classes don’t require students to build mental models for abstraction. My dad, a medical school professor, is plenty smart enough to understand the complexity baked into tech. But as an end user he’s taught not to expect or understand that complexity. He has probably built his own mental models for the complexity of human biology. The Slate piece asserts that we need to start teaching abstraction for understanding tech. I wholeheartedly agree. But tech is inorganic, intangible, and likes to present itself as simple. How do we teach people who don’t work in tech to understand the abstraction involved in development?
My answer: a nursery rhyme. Yes. Seriously.
This is the farmer sowing his corn,
That kept the cock that crowed in the morn,
That waked the priest all shaven and shorn,
That married the man all tattered and torn,
That kissed the maiden all forlorn,
That milked the cow with the crumpled horn,
That tossed the dog,
That worried the cat,
That killed the rat,
That ate the malt
That lay in the house that Jack built.
Using a nursery rhyme as a mental model is not meant to be condescending. Rather, it takes a familiar piece and uses that space in someone’s brain to provide a structure for real world understanding.
I like this nursery rhyme as a description of the way a system might get built. Jack’s house could be the app that a user is working with. Everything else is part of the ecosystem that lets the house exist. Without any of the preceding pieces, that house wouldn’t come to be. That isn’t to say that Jack couldn’t build a house without those pieces, but it wouldn’t be the same.
The full text of the poem is structured backwards, starting with the house and zooming out. When talking about tech, or teaching someone who wants to break into the field, I like this approach: start with the end result and break things down, explaining environments, under-the-hood procedures, and so on later. The House that Jack Built primes someone to understand the context surrounding complexity before jumping into concrete examples.
Why is mental modeling important?
Mental models provide structure for understanding real world concepts. As a potential educator, you can think of mental models as templates used to learn and retain new information. Having a simpler template leads to quicker understanding.
Everyone already has a mental model around how they view tech. For many users, apps are singular entities and the teams behind them are monolithic and inaccessible. This view disempowers users from understanding failures or learning more about their app ecosystem. Going back to The House that Jack Built, if something goes wrong it’s unclear whether there’s a problem with the house, or whether the farmer wrote a shoddy library 10 years ago that everyone keeps monkey-patching to be good enough. This asymmetry in our mental modeling creates an unbridgeable communication gap. This is confusing and frustrating for users and tech teams alike. It also trivializes and dehumanizes tech teams and the work that they do.
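The farmer’s-shoddy-library failure mode can be sketched as a toy call chain (every name here is hypothetical, invented purely to illustrate the point): the fault lives several layers down the stack, but the user only ever sees the house fail.

```python
# Toy sketch: the rhyme as a call stack. The user interacts with the
# "house" (the app); the actual fault is in the farmer's old helper,
# several layers below, where no end user would think to look.

def farmers_library():
    # A shoddy helper written years ago, deep in the dependency chain.
    raise ValueError("malformed grain data")

def malt():
    # An intermediate layer that just passes the failure along.
    return farmers_library()

def house_that_jack_built():
    try:
        return malt()
    except ValueError as exc:
        # All the user ever sees: the app "just doesn't work".
        return f"App crashed: {exc}"

print(house_that_jack_built())  # → App crashed: malformed grain data
```

To the user, the house crashed; to the team, the bug report says nothing about the farmer. That gap is exactly the asymmetry in mental models described above.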
As the Slate piece observes, the current system of understanding abstraction teaches users to take complexity for granted. It also runs a serious risk of discouraging people from being curious about tech. We need more new people in the industry who are hungry to learn. As the world around us becomes more miraculous while presenting itself as increasingly mundane, we could be missing out on attracting new brilliant minds to our industry. Or it could just discourage end users from reporting bugs and lead them to abandon our products. Because they “just don’t work.”