The Technium

Triumph of the Default


[Translations: French, Georgian, Italian, Japanese, Portuguese, Spanish, Sindhi]

One of the greatest unappreciated inventions of modern life is the default. “Default” is a technical concept first used in computer science in the 1960s to indicate a preset standard. Default, for instance, as in: the default of this program assumes that dates are given in two-digit years, not four. Today the notion of a default has spread beyond computer science to the culture at large. It seems such a small thing, but the idea of the default is fundamental to the technium.

It’s hard to remember a time when defaults were not part of life. But defaults only arose as computing spread; they are an attribute of complex technological systems. There were no defaults in the industrial age. In the early days of computers, when system crashes were frequent and variables were a lot of trouble to input, a default was the value the system would automatically assign itself if a program failed or when it first started up. It was a smart trick. Unless a user, or programmer, took the trouble to alter it, the default ruled, ensuring that its host system would probably work. So electronic goods and software programs were shipped with all options set to defaults. The defaults were preset for the expected norms of the buyers (say, the standard voltage of the US), or expected preferences (subtitles off for movies), or best practices (virus detector on). Most times presets work fine. We now have defaults installed in automobiles, insurance programs, networks, phones, health care plans, credit cards, and anything that is customizable.
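
A minimal sketch in Python makes the idea concrete (the settings and values below are hypothetical, invented only for illustration): every option ships with a factory preset, and the product works whether or not its owner ever changes one.

```python
# A sketch of shipped defaults: each option carries a factory preset,
# so a unit that is never configured still behaves sensibly.
from dataclasses import dataclass

@dataclass
class PlayerSettings:
    # Hypothetical presets, chosen for the expected buyer and best practice
    voltage: int = 120         # expected norm for a US buyer
    subtitles: bool = False    # expected preference: subtitles off
    virus_scan: bool = True    # best practice: protection on

# Fresh out of the box, the device runs entirely on its defaults...
out_of_box = PlayerSettings()

# ...while a user who opens the preferences overrides only what they care
# about; everything else stays preset.
customized = PlayerSettings(subtitles=True)

print(out_of_box)   # PlayerSettings(voltage=120, subtitles=False, virus_scan=True)
print(customized)   # PlayerSettings(voltage=120, subtitles=True, virus_scan=True)
```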

Indeed, anything with the slightest bit of computational intelligence in it (that is, any complex modern artifact) has defaults embedded into it. These presets are explicit biases programmed into the gadget, or system, or institution. But a default is more than the unspoken assumptions that have always been present in anything made. For instance, most hand tools were “defaulted” to right-hand use. In fact, assuming the user was right-handed was so normal, it was never mentioned. Likewise, the shape of hand tools assumed the user was male. Not just tools: early automobiles were designed assuming the driver was male. Anything manufactured must make a guess about its presumed buyer and their motivations; these assumptions are naturally designed into the technology. The larger the scale of the system, the more assumptions it has to make. A careful examination of a particular technological infrastructure will reveal the broad assumptions that are buried in its design. So American optimism, high regard for the individual, and penchant for change are all wrapped up in the specific designs of the American electrical system, railroads, highways, and education.

But while these embedded biases, common to all technology, share many attributes with the concept of a default, they are not a default proper. A default is an assumption that can be changed. The assumption of right-handedness in a hammer, or pliers, or scissors, could not be switched. The assumption of a driver’s gender as manifested in the seat position in an automobile could not be altered easily in the old days. But in much of modern technology it can be. The hallmark of flexible technological systems is the ease with which they can be rewired, modified, reprogrammed, adapted, and changed to suit new uses and new users. Many (not all) of their assumptions can be altered. The upside to endless flexibility and multiple defaults lies in the genuine choice that an individual now has, if they want it. Technologies can be tailored to your preferences, and optimized to fit your own talents.

However, the downside to extremely flexible techniques is that all these nodes of exploding possibility become overwhelming. Too many mind-numbing alternatives, and not enough time (let alone will) to evaluate them all. The specter of 99 varieties of mustard on the supermarket shelf, or 2,356 options in your health plan, or 56,000 possible hairdos for your avatar in a virtual world produces massive indecision and paralysis. The amazing solution to this problem of debilitating over-abundant choice is defaults. Defaults allow us to choose when to choose. For example, your avatar is given a standard default look (kid in jeans) to start out. You can alter every default description later. Think of it as managed choice. Those thousands of variables — real choice — can be managed by adopting smart defaults, which “make” a choice for us, yet reserve our full freedom to choose in the future when we want to. My freedoms are not restricted but staggered. As I become more educated I go back to my preferences and opt in, or opt out, or tweak a parameter up or down, or ditch one thing for another. But until I do, the choices remain veiled, out of sight, and house-trained, obediently waiting. In a properly designed default system, I always have my full freedoms, yet my choices are presented to me in a way that encourages taking those choices in time — in an incremental and educated manner. Defaults are a tool that tames expanding choice.

Contrast that expansion to the classic hammer, or automobile, or 1950s phone system. Users simply had few choices in how the tool was used. World-class engineers spent years honing a fixed universal design to work best for the most people, and there’s still an enduring beauty in those designs. The relative inertness of industrial artifacts and infrastructure was compensated for by elegant and brilliant access for the average everyman. Today you may not actually make a lot more choices about your phone than you did 50 years ago, but you could. And you’ll have more choice in where to make those few choices. These unfolding potential choices are nested within the adaptive nature of mobiles and networks. Choices materialize when summoned. But these abundant choices never appeared in fixed designs.

Defaults first arrived in the complex realms of computation and communication networks, but they aren’t excluded from hammers, or cars, or shoes, or door knobs, for that matter. As we inject adaptability into these artifacts by manufacturing them with traces of computer chips and smart materials, we open them up for defaults as well. Imagine a hammer handle made of some kind of adaptive material that would reform itself to your left hand, or to a woman’s hand. You might very well have the option to designate your gender, or age, or proficiency, or work environment, directly into the small neurons of the hammer. And if so, then the tool would be shipped with defaults.

But defaults are “sticky.” Many psychological studies have shown that the tiny bit of extra effort needed to alter a default is enough to dissuade most people from bothering, so they stick to the default, despite their untapped freedom. Their camera’s clock blinks at the default of 12:00, or their password remains whatever temporary one was issued to them. The hard truth, as any engineer will tell you, is that most defaults are never altered. Pick up any device, and 98 out of 100 options will be the ones preset at the factory. I know from my own experience that I have altered very few of the preferences available to me; I’ve stuck to the defaults. I’ve been using a Macintosh from the day it was introduced 25 years ago and I am still discovering basic defaults and preferences I had never heard of. From an engineering perspective, this default inertia is a measure of success, because it means the defaults work. Without much change, products are used, and their systems happily hum on.

Therefore the privilege of establishing what value the default is set to is an act of power and influence. Defaults are a tool not only for individuals to tame choices, but for systems designers — those who set the presets — to steer the system. The architecture of these choices can profoundly shape the culture of that system’s use. Even the sequence of defaults and choices makes a difference. Retail merchandisers know this well. They stage stores and websites to channel decisions in a particular order to maximize sales. If you let hungry students make their dessert choice first rather than last, this default order has an impact on their nutrition.

Every element of a complex technology, from its programming language, to the user interface design, to the selection of its peripherals, harbors a multitude of defaults: Does the system assume anonymity? Does it assume most people are basically good or basically up to no good? Are its defaults set to maximize sharing or maximize secrecy? Should its rules expire after a set period by default or renew automatically by default? How easy is it to undo a choice? Should the process of control be an opt-in or opt-out process? Recombining four or five different default parameters will spawn systems with hundreds of different characteristics.
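
As a rough illustration of that combinatorics (the parameter names below are invented for this sketch, not drawn from any real system), even a handful of yes/no presets multiplies fast:

```python
# A toy sketch of how a few default parameters spawn many distinct systems.
from itertools import product

# Hypothetical presets, one per question a designer might answer by default.
default_parameters = {
    "anonymous_by_default": (True, False),
    "sharing_by_default":   (True, False),
    "rules_auto_renew":     (True, False),
    "easy_to_undo":         (True, False),
    "control_is_opt_in":    (True, False),
}

# Every combination of presets is, in effect, a different system character.
characters = list(product(*default_parameters.values()))
print(len(characters))  # 2**5 = 32 combinations from just five yes/no defaults
```

Add a few parameters with more than two settings and the count quickly runs into the hundreds, which is the point: a handful of quiet presets defines the character of the whole system.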

Identical technological arrangements — say, two computer networks constructed of the same hardware and software — can yield very different cultural consequences simply by altering the defaults embedded in the system. The influence of a default is so powerful that one single default can act as a very tiny nudge that sways extremely large and complex networks. As an example, most pension investment programs, such as corporate 401(k) plans, have very low participation rates, in part because the plans have an overwhelming number of sub-options to choose from. The behavioral economist Richard Thaler relates experiments in which making enrollment automatic with a default choice (“mandated choice”) dramatically increased savings rates for employees. Anyone could opt out of the program at any time, with full freedom to change the specifics of their plan, but simply shifting the default from “having to sign up” to “automatic enrollment” changed the entire tenor of the system. A similar shift happens if you make the donation of organs upon death an “opt-out” choice by default (it happens unless you refuse beforehand) versus “opt-in” (it does not happen unless you sign up). An opt-out donor system greatly increases the number of organs donated.
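
A back-of-the-envelope sketch, with a purely invented 10% override rate, shows the shape of the effect: when most people never act, flipping the enrollment default flips the outcome.

```python
# Hypothetical illustration: if only a small fraction of people ever override
# the preset, whatever the default says becomes what most people end up with.
def enrollment_rate(default_enrolled: bool, override_rate: float = 0.10) -> float:
    """Fraction enrolled when `override_rate` of people switch away from the default."""
    if default_enrolled:
        return 1.0 - override_rate   # opt-out: enrolled unless they act
    return override_rate             # opt-in: enrolled only if they act

print(enrollment_rate(default_enrolled=False))  # opt-in  -> 0.1 (10% enrolled)
print(enrollment_rate(default_enrolled=True))   # opt-out -> 0.9 (90% enrolled)
```

The people and their freedoms are identical in both runs; only the preset differs.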

The tiny default is one of the ways that we can bend the inevitable unrolling of a technological innovation. For instance, an elaborate continent-wide technical system, such as 110-volt AC electricity, may gather its own momentum as it acquires self-reinforcing support from other technical systems (like diesel generators, or factory assembly lines), and that accelerating momentum may steamroll over prior systems, but at every node in the electrical body a default resides, and with the proper alignment and deft choices, those slim defaults can be used to nudge the gigantic system toward certain states. The system can be bent towards making it easy to add new but less secure innovations, or towards making it difficult to change but more secure. The tiny nudges of defaults can shape how easily the network expands, or not. Or how well it incorporates unusual sources of power. Or whether it tends to centralize or decentralize. The shape of a technological system is set by the technology itself, but the character of the system can be set by us.

Systems are not neutral. They have natural biases. We tame the cascading choices we gain from accelerating technology by introducing small nudges — by deliberately embedding our own biases (also called defaults) into the system here and there. We wield biases within inevitable technologies to aim them towards our common goals — increasing diversity, complexity, specialization, sentience, and beauty.

Defaults also remind us of another truth. By definition a default works when we — the user or consumer or citizen — do nothing. But doing nothing is not neutral, since it triggers a default bias. That means that “no choice” is a choice itself. There is no neutral, even, or especially, in non-action. Despite the claims of many, technology is never neutral. Even when you don’t choose what to do with it, it chooses. A system acquires a definite drift and clear momentum from those inherent biases, whether or not we act upon them. The best we can do is nudge it.



