IN DECEMBER 1946, Edmund Berkeley found himself at a crossroads. An idealistic young man with a Harvard degree in mathematics, Berkeley had worked for years as an actuary but was growing disenchanted with his profession. Buried among his personal and professional papers is a typewritten note describing “Possible Projects.” Included among the ideas for “new gadgets” that he might make by “combining modern materials with old needs” was an offbeat item: “modern age slipper.” The novelty of this footwear, he noted, would come from adding “fluorescent marks” so they could be found in the dark, washable linings, and some magnets “so they cannot be kicked under the bed.” To the best of my knowledge, Berkeley never followed through with this invention. In fact, he soon became absorbed with founding the Association for Computing Machinery — today it has more than 100,000 members worldwide — and writing the first popular book about modern computers. But in the wake of World War II, Berkeley agreed with Ralph Waldo Emerson’s adage: if a person “can make better chairs or knives” (or, in this case, bedtime slippers), then “you will find a broad hard-beaten road” to their house “though it be in the woods.” Such is the power of invention.

In the United States, the conventional narrative assumes that for much of the 19th century, invention was the province of heroic individuals like Edison, Colt, Singer, Morse, Bell, and so on. Their very names conjure up images of their inventions. But then, in the early 20th century, so the narrative goes, the locus of invention shifted away from these intrepid figures to anonymous teams, with the well-funded corporate research laboratory becoming the wellspring of novelty and innovation. By the 1930s, invention had been replaced by “research and development” (R&D), now the province of companies like DuPont, AT&T, and General Electric. The individual inventor — once heralded as the “genius in the garret” — was unable to compete. Marginalized, he — and, yes, it was almost always a “he” — was invisible, and ultimately doomed in the expanding marketplace for inventions and ideas.

Eric S. Hintz’s new book, American Independent Inventors in an Era of Corporate R&D, offers a persuasive counternarrative. His goal, achieved via case studies based on a wide array of historical sources, is straightforward: to show that independent inventors did not vanish. This is an important claim. The United States has long touted a certain pragmatic and inventive quality as endemic to its national character. So, who is actually doing this inventing? Is it individuals (white men only, or also women and people of color), or is it corporations? This matters for questions of identity, and obviously also for questions of economics. A historian at the Lemelson Center for the Study of Invention and Innovation (part of the Smithsonian Institution’s National Museum of American History), Hintz explains that the road independent creative people traveled was neither smooth nor easily navigated, but it offered a pathway to fortune, if not always fame. Moreover, the community of individual inventors, long depicted as composed only of tenacious, practical, and resourceful white men, was much more diverse in terms of race and gender than is commonly understood.


Who pronounced individual inventors extinct if they weren’t? The public relations and advertising industry was in fact mostly to blame. Corporate R&D labs emerged at the same time as professional PR firms, and, as Hintz explains, the “former eagerly employed the latter.” Through radio shows, newspaper articles, and advertisements, the industry disseminated an image of corporate labs as inventing the future through their research and products. This image, sold to consumers as well as politicians and regulators, served to marginalize the independent inventor, even as some of that figure’s trappings — rolled-up shirt sleeves, late nights in the laboratory, maybe a flask cooking away on a Bunsen burner — were appropriated to humanize a new breed of company worker.

Company-sponsored advertisements and promotional statements of the 20th century also helped redefine who could be an inventor. So did the managers of corporate labs. C. E. Kenneth Mees, the director of research for Kodak, wrote that almost any well-trained researcher could make valuable contributions “even though he be entirely untouched by anything that might be considered as the fire of genius.” Monsanto was even more direct. In a promotional film from the 1950s, the narrator explained that invention was what naturally happened when you put together a bunch of average Americans and had them work together. “No geniuses here,” the chemical giant proclaimed.

In fact, as Hintz explains, managers at companies like Monsanto and Kodak didn’t even want people who were touched by the so-called “fire of genius.” Geniuses were unpredictable and certainly unmanageable. Companies instead sought to make the R&D process rational, incremental, and continuous, devoid of the genius’s disruptive shout of “Eureka!” The light bulb might be a symbol of inventive inspiration, but its realization as a successful and profitable innovation to be widely deployed throughout American homes and factories required interdisciplinary teamwork, calculated promotion, and loads of capital. Undisciplined tinkerers were not invited; they were perhaps even a tad dangerous insofar as they might invent new things that would undermine corporate patent portfolios.

This shift from individual to team-based invention was extensively critiqued by William Whyte in his classic 1956 book, The Organization Man. To be an inventor in private industry was to be fettered, he claimed; it was to have one’s creative wings clipped. He described, for instance, how “The Organization” — corporations, federal laboratories, and even university departments — was trying to “mold the scientist to its own image.” Industrial managers and other administrators wanted to “rationalize curiosity,” he wrote. The “Organization Man” was, in other words, a compliant team player — no genius! — who could be seamlessly oriented by managers toward specific goals. This push for conformity, warned Whyte, implied a broader drift toward Soviet-style organization. It was, of course, no accident that the book was published at the peak of the McCarthy era.


The visibility issues faced by white male inventors paled in comparison to the challenges encountered by women and people of color. Prior to the passage of more progressive laws in the early 20th century, women, for example, were barred in many states from owning or controlling property, a restriction that extended to intellectual property. Nonetheless, women were active inventors: an 1888 government publication stated that 2,297 patents had been issued to women since 1790.

Women inventors were aided by the efforts of activists like Charlotte Smith. A former hatmaker, Smith eventually turned to publishing as well as lobbying Congress for improved consumer protections. She also founded the Women’s National Industrial League in 1882, aiming to secure better pay and working conditions for women. She was particularly incensed to hear stories of women inventors who were denied patent protections or had their intellectual property taken from them. One vehicle Smith used in pursuit of justice was The Woman Inventor, first published in 1891, which Smith gave to political delegates. Despite her persistence, Smith enjoyed little political success and turned her attention to helping abused and homeless women until her death in 1917.

The obstacles that African American inventors faced were even more daunting, and included Jim Crow laws, long-held assumptions of intellectual inferiority, fewer opportunities for technical education, and systemic exclusion from engineering societies. Even when they succeeded in creating marketable inventions, African American inventors often had to conceal their racial identity. Hintz describes, for example, the activities of Garrett A. Morgan, an inventor and entrepreneur from Cleveland who achieved recognition for a new type of gas mask he patented in 1914. When the heads of fire departments as well as engineers and chemists wanted to purchase the device, Morgan had to hire a white associate to pass as “the inventor” when he demonstrated the product at trade shows. (In a bizarre twist, Morgan would masquerade in these trials as a Native American.) Business success followed — the company’s stock went from $10 a share to $250 — and an Ohio newspaper profiled Morgan in 1916 after he helped in a daring rescue mission using his invention. With his racial identity now revealed, some fire departments canceled their orders, but he appears to have been undaunted.


To help enhance their visibility and viability vis-à-vis corporate research labs, independent inventors of all backgrounds launched a variety of mostly short-lived professional organizations and societies. Many of these sought changes in the US patent system that would create more favorable terms for lone inventors. Their argument: The patent system as it existed circa 1920 didn’t enable individuals to compete against the sorts of corporate litigation used to protect patent portfolios. Here, Hintz spends a little too much time on the minutiae of these efforts. The takeaway: They failed, and lobbying attempts for more equitable laws ultimately ended with the start of World War II.

When historians write about research and development during World War II, the focus is almost always on large projects, funded and managed by the government. Such efforts produced the proximity fuse, the mass production of penicillin, new rocket motors, radar systems, and, of course, nuclear weapons. These massive R&D efforts employed hundreds or thousands of people. During the war, for example, some 600,000 Americans — one out of every 250 people in the country and almost all of them anonymous to us today — contributed to the Manhattan Project (often without knowing much about the ultimate inventions they were working on).

But, as Hintz details, hundreds of independent inventors also contributed to national defense. The federal government recognized their potential by forming, under the auspices of the Department of Commerce, the National Inventors Council, which was led by Charles “Boss” Kettering, the head of research at General Motors. Many of their contributions were invisible, however, because, as Hintz notes, the records of the council were not declassified until relatively recently. A wartime poster urged Americans “who have an invention or idea which might be useful to their country” to contact the NIC and “INVENT FOR VICTORY.” The most famous inventor to answer the call was Hollywood actress Hedy Lamarr, who submitted a bold new idea to control torpedoes wirelessly. Her invention of “frequency-hopping,” explains Hintz, was foundational for recent technologies central to the use of Wi-Fi, Bluetooth, and satellite navigation. While few other wartime tinkerers had Lamarr’s celebrity status, the global conflict provided independents a wider range of opportunities to demonstrate their utility.


Hintz’s book is marred by occasional repetitiveness and a habit of reverting to the academic’s defensive crouch. I would have liked a little less focus on legislative reform and more on the place of the independent inventor — long declared dead yet still persistently active — in the American popular imagination. In 1952, John Kenneth Galbraith claimed the lone inventor’s importance was now “pleasant fiction.” In the early 21st century, business and news magazines proclaimed “the return of the lone inventor” as if this individual had indeed disappeared. A deeper explication of this alleged disappearance and resurgence would have been welcome.

Highlighting the word “community” might have helped in this regard. From the mid-19th century onward, lone inventors, despite their seeming solitary nature, sought the company of other like-minded creators, if only as a search for security from predatory corporations. Technological communities — whether they coalesced around building a new scientific instrument or, as I have written about, sought to combine art-making with engineering — offer a flexible yet powerful unit of analysis that Hintz could have pursued. Technological communities are fluid, interdisciplinary groups that come together around particular projects, goals, and artifacts, sometimes existing for years (as in the case of computer hobbyists in the 1960s and 1970s) and, at other times, fading away as members disperse to other efforts. For inventive women and people of color, such communities offered the promise of protection through strength in numbers, as happened with the Negro Development and Exposition Company (founded in 1903 by Giles B. Jackson and described by Hintz in his book). More recently, it’s easy to see the surge of interest in the DIY “maker culture” as an expression of this same desire to seek and form a technological community, one with roots in the earlier tradition of independent inventing.

Today, important inventions are, of course, again seen to originate in the minds of lone inventors, toiling in a basement workshop or dorm room. And companies are again keen to tie their innovations to this work, even to fetishize it. The original garage where William Hewlett and David Packard started their company is now on the National Register of Historic Places. One can take a tour and see the house where Steve Jobs and Steve Wozniak worked together when they were inventing the first Apple computers. It remains to be seen whether Harvard will seek similar status for Mark Zuckerberg’s dorm room. And who knows how the trial of Elizabeth Holmes, the inventor indelibly associated with the technological train wreck known as Theranos, will turn out. The manic, perhaps slightly off-kilter, genius is still being iconized if not exactly rendered heroic.


W. Patrick McCray is a professor of history at UC Santa Barbara where he researches and writes about modern technology and science.