CounterIntuitive
Jack Gould: Untitled [women lined up in front of counter,
seen from behind shop counter] (c. 1950)
"We're destined to become mere observers of our computations."
A great myth was created at the dawn of the personal computing age. Before then, when computing exclusively resided within large organizations, computer professionals took considerable pride in their ability to work within tenaciously hostile intellectual environments. They were, after all, professionals, so their methods and practices should have, by all rights, remained obscure and mysterious to the general public. These professionals reveled in their status as eggheads and were seen as much more intelligent than the Average "Regular" Person. Many became conversant in what was properly referred to as "machine language" and could think and dream in ways never imagined by Average "Regular" Persons.
The dilemma arose when technology advanced to the point where personal computers became feasible. How might a company market a computer to an Average "Regular" Person, someone with no intention or inclination to ever become an eggheaded computer person? Early models were quickly snapped up by computer enthusiasts, those who, while lacking the necessary educational and technical background, were obsessive enough to acquire sufficient skills. New forms of computer languages were created, among them a particularly unintimidating one labeled simply, BASIC. In practice, though, few Average "Regular" People could master even a language as simple-sounding as BASIC. That's where the Great Myth entered the story.
The Great Myth insisted that a computer could be created that required no specialized knowledge to operate. Technology could render the intimidating user interface, the point where the user interacts with the machine, intuitive. An Intuitive User Interface would operate the way even the most naive and inexperienced Average "Regular" Person might anticipate. It would employ simple point-and-click technology, whereby the user would direct the computer's operation using a handheld "mouse," a device that could direct an arrow on a screen to point at things and, using a button or two, instruct that arrow on what to do. What could be more intuitive?
A complication quickly arose as one manufacturer insisted that one button should be adequate to control a computer, while another insisted emphatically that two should be required. Those who imprinted on the one-button machines would be forever baffled by the two-button ones, and vice versa, since the rules for using the second button were never obvious and therefore failed to provide an intuitive experience. Never mind that the whole concept of Intuitive Interfaces amounted to a myth. Each required some training, perhaps not nearly as deep as traditional eggheads required, but orientation adequate to understand the interface designer's intentions. There were underlying rules for successful operation that users were largely expected to discover and quietly incorporate into their practices. This was never intuition, but the misattribution successfully convinced most Average "Regular" People that their personal computer could intuit their intentions.
This conceit was probably always destined to eventually fail. As with all technology, computers evolved not toward simplicity but toward greater complexity, and increasing complexity inevitably brought the need for ever more convoluted interfaces. Eventually, the whole concept that there might be such a thing as an Intuitive User Interface fell on its face. The Great Myth was exposed as the myth it always was, and naive users like me experienced an existential disappointment. I still believe that user interfaces should be intuitive. Never one to figure out the two-button interface, I understood that some versions of so-called intuitive interfaces never worked, certainly not for me. Throw me on a two-button system and I will be frozen, unable even to point and click, as required for such a system to work. It was as if some systems were designed for specific temperaments. True intuitives were incapable of ever "learning" to operate two-button systems because they were never designed to be intuitive. True intuitives always knew.
But that argument was lost decades ago. Now, even one-button interfaces fail to convincingly appear intuitive. Today, I cannot successfully pick up my mail, for cripes' sake. The mail queue on my phone differs from the one on my laptop, and the instructions for ensuring they're in sync might as well have been written in Greek. Passwords are routinely forgotten, even by special-purpose applications designed solely for remembering passwords. (They remember every password ever created for every application, leaving it up to the user to try to remember which one might be the most current.) The operating systems have become like piles of storm-tossed debris, defiant of logical order. Try to identify where you control even the most innocuous component of these systems. You will fail, rendering most of their much-touted power as inaccessible as if it didn't exist. I feel confident that the designers no longer understand what they're creating and maintaining, either.
I should have figured this out decades ago. There has never been, and never could be, such a thing as an Intuitive User Interface. Even personal computers have always required an egghead's skills, abilities, and experiences. The Muse usually compensates for my lack of knowledge. When I can't intuitively figure out how to print a document, she comes to the rescue to inject some eggheaded knowledge into the effort. I point and click, at times even enthusiastically, but now, as I enter my Following Chapters, I understand that my enthusiasm never was, nor could it have been, sufficient to overcome the Great Intuitive Interface Myth. The interface was never intuitive. It always required specialized knowledge.
We will one day revert to how it was before the advent of so-called personal computers. It seems likely that the primary use of Artificial Intelligence might become the operation of our personal computers. Once The Great Myth becomes more widely acknowledged, we will rely upon AI to do our pointing and clicking for us. It will be tasked with interpreting our naively intuitive actions into a sequence of commands that satisfy our underlying intentions. We're destined to become mere observers of our computations, thank heavens.
©2025 by David A. Schmaltz - all rights reserved