An Abridged History of UI

Photo by Gilles Lambert on Unsplash

This article is for anyone who works in user interface or is simply interested in it, whether as a developer, designer, consumer, or enthusiast, and who has realized how little they actually know about the history of the industry. My intention is to spark conversation and deeper research.

You have to know the past to understand the present.

Phrases to be familiar with:

User Interface is the dimension that lies between humans and machines. It is the plane of interaction between the operator and the device.

User Interface Design is the intentional composition of elements on a device with which a user interacts. Maximizing usability (the measure of how easy an interface is to use and learn) and user experience (the user’s response to their interaction with the interface) are two of the main considerations when designing for UI. These ideas are significant because machines are, in most respects, made by people, for people.

A Graphical User Interface, or GUI, allows users to interact with a computer through visual elements such as icons and input fields, rather than by learning a command-line interface (CLI), which today we would loosely call “coding”. After its introduction, this ingenious technology gave a significantly larger population of people the ability to decipher and experiment with computers.

The study of computer/user interaction, and of the novel technologies that enable it, is called human-computer interaction, or HCI. The term came into use in the mid-Seventies to early Eighties, and it drew a distinction between the computer as a complex tool and simpler tools such as an axe or a pair of pliers. Simple tools, for the most part, have one intended function, whereas a computer carries on a versatile dialogue with the humans who interact with it. Perceptions such as this led to further comparison of human-computer interaction with human-to-human interaction, a discussion I will not entertain in this feature, although it is an interesting topic to consider.

A field of study a bit more human-focused is human factors and ergonomics (HF&E). HF&E focuses on designing products and systems around the people who use them, attempting to optimize human well-being and overall system performance. These ideas are particularly important as computers and devices are increasingly regarded as extensions of the human body in today’s “developed” society.

The Beginning

The evolution of user interface lies in its pneuma: we consider recognizable ideas and employ them to cultivate new concepts and develop more efficient processes.

As with many advancements throughout history, whether in the arts, politics, or social climates, each new step forward typically comes in response to a preceding movement. The natural expansion of computing power, with the help of countless remarkable minds, is no exception.

1760–1869: The First Industrial Revolution

Envision yourself in the latter part of the 19th century: the leading countries of the world have just undergone their first major industrial revolution. Mass-production systems have been developed throughout Britain, western Europe, and the northeastern United States. The introduction of railroads, steamboats, canals, and paved roads opened society to a new breed of interconnectivity and, consequently, to faster, more efficient modes of communication. Railroad construction gave way to the telegraph, which in turn yielded the first telephone. These were the first salient examples of what Stephen Kern described as the “annihilation of distance.”

Workers celebrating the completion of the first transcontinental railroad during the historic Golden Spike Ceremony (1869)

1870–1914: The Second Industrial Revolution

The rise of steel and oil giants in America created a need for increased labor power in factories across the country. Technical skills were suddenly essential to making a living, yet working conditions were poor and pay was meager.

Political cartoon depicting labor unions banding together to end the shady business practices implemented during the second Industrial Revolution

1911: Taylorism

In the early 20th century, during the rise of factory labor, Frederick Taylor developed a set of ideas, later designated “Taylorism”. In his book The Principles of Scientific Management, Taylor focused on four main standards defining his views on Organizational Leadership:

1. They develop a science for each element of a man’s work, which replaces the old rule-of-thumb method.
2. They scientifically select and then train, teach, and develop the workman, whereas in the past he chose his own work and trained himself as best he could.
3. They heartily cooperate with the men so as to insure all of the work being done in accordance with the principles of the science which has been developed.
4. There is an almost equal division of the work and the responsibility between the management and the workmen. The management take over all work for which they are better fitted than the workmen, while in the past almost all of the work and the greater part of the responsibility were thrown upon the men.

Taylor’s intention was to streamline factory operations and increase worker productivity. However, likening man to an unrelenting machine, and thereby dehumanizing the laborer, led to a hunger for sophisticated technology that could elevate the worker.

1939: The First Electronic Speech Synthesizers Are Unveiled

The Voder was a simplified version of the original Vocoder, a device developed by engineer Homer Dudley and used primarily for military communication. The Vocoder was the first electronic speech synthesizer: it analyzed human speech and broke it down into only the frequency components needed to reconstruct it. The Voder, in contrast, used no speech input at all, producing voice solely from electrical impulses shaped by a human operator. These were among the first examples of humans interacting with a machine through speech in this manner.

1945: Vannevar Bush’s Influential Article “As We May Think” and the Visionary Memex

One could argue that Vannevar Bush was a prophetic design thinker who, in 1945, summarized his ideas for the future of technology in the article As We May Think. Up until this point, most scientists had focused on improving technology in relation to warfare. Now that the war was coming to an end, Bush began entertaining ideas for technology that would amplify not merely man’s physical powers, but the strength of the human mind.

In his essay, Bush covers topics such as photography, the organization and storage of information, the dissemination of information, television, and sound synthesizers, but one of the most notable thoughts presented was that of the Memex:

Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and, to coin one at random, “memex” will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.
Memex diagram (right), excerpt from “As We May Think”, originally published in 1945

Portrait of Vannevar Bush

This “enlarged intimate supplement to memory” was the first written account of a prototype for a personal computer. Little did Bush know, the proclamation of his ideas would lead to a rapid progression of computer technology.

1945–1959: The ENIAC, COBOL, and Leading Female Programmers

During WWII, numerous strong female mathematicians offered their knowledge to further the war effort. Although most of them were never recognized or given proper credit for their vital contributions, their expertise was imperative to the development of pivotal computer technology.

In 1945, Jean Jennings Bartik was offered a job by physicist John Mauchly on a new machine called the Electronic Numerical Integrator and Computer (ENIAC), the first fully electronic general-purpose digital computer. Bartik was one of six female mathematicians who programmed the 1,800 sq. ft. machine to predict bullet trajectories (it is rumored that by the time the ENIAC was retired, it had performed more calculations than the whole of mankind up to that point).

Jean Jennings Bartik (left) and Frances Bilas Spence (right) programming the ENIAC

(Left to right) Patsy Simmers holding an ENIAC board, Gail Taylor holding an EDVAC board, Milly Beck holding an ORDVAC board, and Norma Stec holding a BRLESC-I board

These six women subsequently joined forces with Grace Hopper, a math professor at Vassar College who had joined the Navy Reserve. Hopper was instrumental in the development of a programming language called the Common Business Oriented Language, better known as COBOL. Unlike its predecessor FORTRAN, which was used primarily for scientific numerical analysis, COBOL used words instead of numbers to manage large amounts of data and reports, mainly for business purposes.

Grace Hopper in a mainframe room, 1950

COBOL was an important development in computer programming because it mimicked human language and was easy to decipher, making it accessible to a vast range of people.

1955: Henry Dreyfuss and “Designing for People”

Henry Dreyfuss was an American industrial designer and the author of Designing for People, a book in which he laid out his ideas about the relationship between humans and machines, namely his notion of “fitting the machine to the man rather than the man to the machine.” With the help of Alvin R. Tilley, Dreyfuss promoted a new way of looking at a human’s relationship with technology.

Alvin R. Tilley’s drawing of average Americans, later published in The Measure of Man

Dreyfuss’ writing and Tilley’s drawings led to the emergence of the Universal Design Movement, the Humanscale Project, and the rise of new practitioners who focused on ergonomic design. These new ergonomic designers began to describe the space where human and machine meet as “interface”. Dreyfuss’ work has expanded to become what we now refer to as interaction design.

1963: Douglas Engelbart and the Invention of the First Mouse

Douglas Engelbart grew up during the birth of computers, when even the smallest of them occupied an entire room. As a radar technician during WWII, Engelbart was struck by an article he stumbled upon in a library in the Philippines: As We May Think by Vannevar Bush. The article described the Memex, an idea that would propel Engelbart toward his most influential life’s work: the computer mouse.

Engelbart envisioned various ways to move a cursor on a computer display—something no one had mastered yet. He finally developed a prototype, which he presented during the “Mother of All Demos” in 1968. The mouse enabled users and technicians to interact with menus and other interface tools without having to write out all of the code. People could now have a dialogue with a computer screen.

Douglas Engelbart holding the first mouse prototype, first revealed in 1968

First mouse prototype

Engelbart was a visionary by nature, and he offered other forward-thinking ideas in addition to his groundbreaking invention. In 1960, at the Solid-State Circuits Conference in Philadelphia, Engelbart proposed that, through inescapable technological evolution, computer components would continue to shrink until they could physically progress no further. Five years later, this idea would morph into “Moore’s Law”, named not after Engelbart but after Gordon Moore, cofounder of Intel.

1965–1969: The First Touchscreen

First introduced by E.A. Johnson in Touch Display — A Novel Input/Output Device for Computers, touchscreen technology would become highly influential in defining the way humans interact with computer screens.

Johnson described a type of touchscreen, now known as a capacitive screen, that uses the electrical conductivity of human skin to close a circuit and register input. He patented the idea in 1969, and it was later developed, in 1973, into a radar screen used by the Royal Radar Establishment in the UK.

Taken from Johnson’s 1969 Patent Document for Touch Displays

This type of touchscreen technology would be used until the 1990s, when the resistive touchscreen became popular; capacitive screens were revived years later through the popularity of Apple’s products.

1973: Xerox’s Palo Alto Research Center (PARC) and the First GUI

Xerox, a leading print and paper manufacturer, joined the technological revolution and founded the Palo Alto Research Center (PARC) to develop new computer technology. Here, in 1973, leading scientists designed and built the Xerox Alto, the first computer with a graphical user interface (GUI).

The Alto was not intended for commercial use (each unit cost roughly $12,000 to manufacture). It utilized the state-of-the-art mouse and offered straightforward WYSIWYG (What You See Is What You Get) editing, making it easy for users to comprehend.

The Xerox Alto, the first computer with a graphical user interface

The Alto revolutionized our understanding of computers. With its clear processes, users could print exactly what they saw on the screen in front of them. It also offered features like removable data storage and email, connecting people in a way never seen before.

Xerox’s invention catapulted the industry into a race to refine the computer into a faster, smaller, easier-to-use machine, until it would eventually reach its physical limits.

1979: Xerox Makes a Deal

With growing digital technology, Xerox feared their business would become obsolete unless they generated a solution to stay relevant. This propelled the company to hire brilliant minds to develop technology never seen before.

PARC housed many of the nation’s leading scientists and mathematicians, who worked incessantly to predict where the future of technology was heading.

However, the Alto was too unconventional to win executive approval within the company. Xerox’s leadership at the time lacked the vision to see how groundbreaking this technology would become. One man, however, did see the importance of this interface: Steve Jobs. In a now-famous deal, Xerox was allowed to purchase $1 million of pre-IPO Apple stock in exchange for giving Apple access to PARC’s GUI technology.

1979: Bill Moggridge Prototypes the First Laptop Computer

The GRiD Compass, the first laptop computer

In 1979, Bill Moggridge was commissioned by GRiD Systems Corporation to create a new type of portable computer. Two years later, the result was released: the GRiD Compass. It was exceptionally expensive to produce, so, as with many significant technological inventions, it was initially reserved for government, large institutions, and space exploration programs. Moggridge later went on to cofound the design firm IDEO, a household name in the design community.

1981: The IBM PC, the First Industry-wide Accepted Personal Computer

In 1981, IBM was known as the “Big Brother” of the computer industry. When the company released its first personal computer, it enlisted Charlie Chaplin’s Little Tramp character to advertise its more “approachable” side.

IBM promoted their new product, the IBM 5150, with the help of Charlie Chaplin

Although IBM’s marketing tactics were clever, the product they released was even more impressive. Affordable, compact, and accessible, “The IBM PC revolutionized business computing by becoming the first PC to gain widespread adoption by industry.”

The IBM PC utilized Microsoft’s MS-DOS operating system, spawning an enduring relationship between the two towering companies.

1982: The Compaq Portable

Inspired by the IBM PC, Compaq, an unknown competitor at the time, created a portable computer that was essentially an IBM PC in a “luggable” form. This 28-lb, carefully designed computer quickly caught fire with the public, and skyrocketed the company into the spotlight.

The Compaq Portable

1984: The Macintosh (OS System 1.0)

(Top) Macintosh 128k ad—1984, (Bottom) Steve Jobs with the first Mac Computer

Although many other models of personal computers had been unveiled, the release of the Macintosh changed the way people thought about personal computers. Jobs was determined to preserve Apple’s relevance in the computer industry, refusing to let IBM claim all the glory.

As Steve Jobs presented the Macintosh in 1984, the audience exulted with delight. The iconic “hello” screen, coupled with Apple’s product slogan, “For the rest of us,” generated an air of reliability and trust in the consumer. Apple crafted a new connotation for the computer, one that could be welcomed by any type of person, without intimidation or cynicism.

1988: The Gang of Nine and EISA

After much competition from “IBM-compatible” manufacturers, IBM introduced a proprietary standard, Micro Channel Architecture, that was no longer compatible with older versions of its PCs, in an effort to regain control of the market. This, however, backfired: it caused problems for the many businesses running on the previous standard, and the new standard never caught on. In response to IBM’s decision, the nine leading manufacturers of IBM-compatible computers banded together and created their own standard, the Extended Industry Standard Architecture (EISA), which was not only backward-compatible but also readily available for other companies to adopt.

1993: The Apple Newton Flops

In 1992, IBM released the ThinkPad to great success. Naturally, Apple needed to create something better.

The Apple Newton could do everything: send a fax, save notes, store contacts, manage calendars, and fit in your pocket. Despite its already-long list of capabilities, Apple unveiled a brand new function: handwriting recognition. It also included a stylus that the consumer could use to write on the screen.

The concept began with ample promise, but there were too many inconsistencies in the handwriting recognition software, and the project was eventually terminated. A valiant failure, yet one vital for progress, the Newton nonetheless grabbed the public’s attention.

Doonesbury comic strip poking fun at the Newton

1995: Don Norman, the “User Experience Architect”

Don Norman worked at Apple for almost half a decade, during which he created the first “User Experience Architect” position. He emphasized the need for human-centered design, backed by technology capable of delivering it efficiently. Norman explains his own definition of what user experience embodies:

It’s the way you experience the world, it’s the way you experience your life, it’s the way you experience service.

The focus on user experience would continue to push technology and those creating it to align with our human capabilities and needs.

1998: The First iMac and the Evolution Thereafter

Futuristic, colorful, novel: the design of the first iMac was unlike that of any computer released before it. Its translucent, colorful plastic conveyed a “nothing to hide” ethos to the consumer, further deepening the growing relationship of trust and understanding between human and computer.


Released just under two decades ago, the once-futuristic design of the first iMac now seems clunky and cumbersome compared to the slim, sleek design of today’s iMac.

2000: An Early Camera Phone, the J-Phone, and a Shrinking Trend

As producers continued to jam-pack additional functionality into small products, the phone became the next hot commodity. The J-Phone put a camera in everyone’s pocket, an idea that hundreds of producers would further refine to be more compact, with larger screens and ever more functionality (apps, video chat, games, portable internet, and so on).


Although the above photo is a bit comical, it speaks to the “shrinking” trend tech producers have been, and still are, racing toward: the most minimal, lightweight, and fastest products physically possible, which should sound peskily familiar to you by now.

2001: Microsoft Windows XP and Apple Mac OSX

At the turn of the century, Apple and Microsoft released major new operating systems for their products: Apple’s Mac OS X and Microsoft’s Windows XP. These designs would become the basis for the operating system upgrades of today.


Each is distinct in its own right, and the path to essentially the same information differs greatly between the two. These changes, perhaps more than ever, pushed users to ask themselves the dichotomous question, “Am I a Mac or a PC?”

2010: The Introduction of Retina Display

In 2010, Apple introduced its new Retina Display, which doubled the number of physical pixels packed into each inch of screen, creating a crisper, cleaner display of type and image. (The iPhone 4, for instance, rendered 960×640 pixels at 326 ppi on the same 3.5-inch screen where the iPhone 3GS had shown 480×320 at 163 ppi.)


Retina Display has revolutionized the way we communicate with the droves of screens we have the opportunity to interact with—desktops, tablets, phones, watches, glasses—and could be one of the last instrumental updates to a physical device in terms of interface.

Innovations such as this have contributed to the ever-growing arboreal evolution of specialty careers in technology, emphasizing the importance of the trajectory of user interface, not only for the consumer but also for the laborer.

Present-day: The Apex of Moore’s Law

We are traveling past, and will soon be completely beyond, the zenith of Moore’s (or, should I say more correctly, Engelbart’s) prediction of technological evolution. Our screens and processors are approaching their physical limits, and we have begun to explore virtual, screenless display technologies.


The following excerpt from Bush’s As We May Think (1945) speaks to this inevitable evolution, in thoughts that at the time must have seemed unattainable beyond pure abstraction:

All our steps in creating or absorbing materials of the record proceed through one of the senses–the tactile when we touch keys, the oral when we speak or listen, the visual when we read. Is it not possible that some day the path may be established more directly?

Terms like “Zero UI”, “Machine Learning”, and “Touchless Tech” are becoming more commonplace as we enter into the nascent stages of developing the technology to utilize other processes beyond touch screens and controllers to communicate with computers. Devices are presently learning our preferences and offering feedback without constant input from the user, forcing us to reassess our relationship with our technological achievements.

Closing Thoughts

Without digging too deep into the future of user interface—which demands an entire article in itself—I leave with this consideration:

Engelbart’s Law:

The intrinsic rate of human performance is exponential.

We set our own limits and continually surpass them, as this article has reflected. The many potential waypoints for the future of user interface are entirely reliant on our own ambitions and how we act upon them.



References

Bane, Michael. “9 Clonemakers Unite To Take On The Industry Giant.” Chicago Tribune. November 20, 1988. Accessed April 10, 2017.

Bush, Vannevar. “As We May Think.” The Atlantic. July 1945. Accessed November 30, 2016.

Crockett, Zachary. “The Woman Behind Apple’s First Icons.” Priceonomics. April 3, 2014. Accessed December 07, 2016.

Cross, Tim. “After Moore’s Law | Technology Quarterly.” Technology Quarterly. 2016. Accessed December 13, 2016.

Dewib. “Steve Jobs Presenting the First Mac in 1984.” YouTube. October 5, 2011. Accessed November 30, 2016.

“ENIAC.” ENIAC — CHM Revolution. Accessed December 02, 2016.

Engelman, Ryan. “The Second Industrial Revolution, 1870–1914.” US History Scene. Accessed November 30, 2016.

Honan, Matt. “Remembering the Apple Newton’s Prophetic Failure and Lasting Impact.” Wired. August 5, 2013. Accessed December 08, 2016.

Laura Sydell. “The Forgotten Female Programmers Who Created Modern Tech.” NPR. October 06, 2014. Accessed December 01, 2016.

Lupton, Ellen. Beautiful Users: Designing for People. New York, US: Princeton Architectural Press, 2014. ProQuest ebrary. Web. 30 November 2016.

Markoff, John. “Computer Visionary Who Invented the Mouse.” The New York Times. July 03, 2013. Accessed November 30, 2016.

Markoff, John. What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry. New York: Viking, 2005.

Staff, Dispatchist. “The Laws That Govern the Universe.” Dispatchist. January 27, 2016. Accessed February 02, 2017.

Tariq, Ali Rushdan. “A Brief History of User Experience — InVision Blog.” InVision Blog. February 26, 2015. Accessed November 29, 2016.

Taylor, Frederick. “The Principles of Scientific Management, Ch. 2.” National Humanities Center. Accessed November 30, 2016.

“The IBM PC’s debut.” IBM Archives: The IBM Personal Computer. Accessed December 15, 2016.

“Xerox Alto: Computers for ‘Regular Folks’.” Xerox Alto — CHM Revolution. Accessed December 02, 2016.

Ashley Hopkins