You could forgive George Crow for declining the first time Steve Jobs tried to lure him away from Hewlett-Packard.
That was back in early 1981, when Apple was developing the industry-changing Macintosh. Crow, who would eventually be in charge of the power supply and display for the pathbreaking personal computer, didn’t know what the project was. And Jobs didn’t make a stellar first impression.
“Steve Jobs and Rod Holt, who was their power supply guy, contacted me and wanted me to join them,” Crow recalled. “At that point, they weren’t willing to show me what they were doing. And when I arrived at the interview, they both looked like they had just come out of the forest. They were in hiking boots and kind of tattered pants. And my attitude was kind of, You want me to leave Hewlett-Packard to come work for you?”
But about six months later, Jobs and Holt tried again, this time revealing work on what would become the Mac. “It was just so far superior to what I was doing at HP, I jumped at the opportunity,” Crow said.
2019 will mark the 35th anniversary of the Mac, a breakthrough Wired hailed as the "Birth of the Cool." As the magazine recalled, the Mac "was a product of its time—underpowered and not very easy to use. But it did represent a sea change, a paradigm shift, whichever late-20th century business cliché you care to use. It was the first to feature a graphical user interface that could be called user-friendly and was the first, with the advent of the LaserWriter printer and Aldus PageMaker, to make desktop publishing a reality."
It’s had tremendous staying power.
Even now, if you google “Jan. 24, 1984,” the introduction of the Macintosh is likely to come up first. The Mac was notable for being an affordable personal computer, with an intuitive user interface—one that software developers were required to adhere to, making applications easy to learn. With the GUI, for the first time you could save, move, or delete files by clicking and dragging the icons around the screen with something called a mouse. Features we take for granted now.
“It opened the computer to a much broader market, and made it much easier to learn how to use it and easier to get around,” Crow recalled.
Which was a long way from where Crow started.
Crow was born in Schenectady, N.Y., but when he was 9 months old, his father, who worked for General Electric, moved the family to Oakland. Crow would graduate from Piedmont High School before getting a B.S. in electrical engineering at UC Berkeley in 1966 and later a master’s in computer science at Santa Clara University.
At Berkeley, Crow proofread new textbooks written by Berkeley professors whom he described as being on the cutting edge of computers. One was the late Donald Pederson, an electrical engineering professor who founded the first university-based integrated circuits lab, which was foundational to the design of nearly every integrated circuit over the next 25 years.
“It really was as advanced as it possibly could be,” Crow recalled of his time at Berkeley.
“When I went to Cal,” Crow explained, “we were still using punch cards”—programs and data were punched by hand on a key punch machine and read into a card reader. “I used to have bad dreams that I was on my way to get my punch cards processed and I dropped them, because they had to be in perfect order.”
In 1978, Jobs proposed that Apple develop a next-generation computer. By 1979, a research project for a new low-cost computer began under the late Jef Raskin, who led the team that created the Mac. “It was to be inexpensive,” Raskin once wrote, “have a small footprint, use a built-in, graphics-based screen and—my most heretical point—it would be based on human factors considerations rather than driven by whatever was hottest in electronic technology at the moment.”
Two years later, Crow joined the team. Hewlett-Packard at the time was working on a project for a personal computer that would sell for $10,000. Meanwhile, Jobs was saying the Mac, much smaller than the HP computer, would come in at under $2,000 (it initially listed for $2,500). “Steve was explaining to me about how the stock was going to go straight to the moon,” Crow recalled.
On Jan. 22, 1984, a Super Bowl commercial directed by Ridley Scott (Blade Runner) introduced the Mac. The ad ended with a narrator alluding to George Orwell’s novel, saying, “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984.’”
Two days later, the Mac was officially introduced at Apple’s annual shareholders meeting, where Jobs declared: “Many of us have been working on Macintosh for over two years now, and it has turned out insanely great.”
The simplicity of the new machine was dazzling, even to the pros.
Before the Mac, Crow said, “you just had to memorize all these commands and know when to use them. I was just flabbergasted at how quickly I forgot all the commands after I got my Macintosh.”
There were stumbles, of course. Initially, customers could take home a Mac for a day under a free trial program—which left many of the units unsellable. And there wasn’t much software.
But later, when the Mac shipped with a laser printer, sales really took off. “People could make documents that looked like they were typeset,” Crow said.
“We had a frantic schedule at times, but we had great teamwork,” he continued. “I do take great pride in what we were able to accomplish.”
And yet in 1985, about a year after the Mac’s launch, Jobs and Crow left Apple to found NeXT, a company that aimed to build both hardware and software to create a cohesive computing experience.
Crow described his work at NeXT as being so varied that his title was “VP of elves,” in charge of documentation, product design, and power supply, among other things. “I really was proud of the NeXT computer and I was crushed when it wasn’t successful. It took me quite a while to recover from that, because up until then, I’d had nothing but successes,” Crow said. NeXT’s strategy failed partly because it aimed to sell to the education community. But, as Crow explained, “higher ed doesn’t like to pay for things and our computer was quite expensive.”
Over the years, there were more Macintosh successes: In 1987, two new Macs—the Mac II (color displays) and the Mac SE (internal hard drive)—hit the market. Four years later, the PowerBook laptop was introduced—at just 5 pounds, a significant improvement over the Macintosh Portable, which weighed almost 16. In 1998 came the iMac, which featured USB ports and an emphasis on style, with a brightly colored return to the all-in-one design. And in 2008 came the MacBook Air, the lightest and thinnest Mac notebook ever produced. It was introduced inside a manila envelope.
Crow retired in 2006. Still living in the Bay Area (where he uses a 27-inch iMac and a PC he built himself), he devotes much of his time to supporting the arts.
“I think arts are necessary for a civilized society. That’s why I worry about this country, because with no arts education, I just don’t know what’s going to happen to these kids that are growing up,” he said.
The arts, he continued, are “broadening. I think it expands your senses. When I started going to opera [later in life], I realized how much opera I’d actually heard through Bugs Bunny cartoons. I just fell in love with it, because it allowed me to experience emotions that I normally don’t. As an engineer, I always keep things pretty bottled up. It just was a whole new experience for me.”
Tom Kertscher is a PolitiFact Wisconsin reporter for the Milwaukee Journal Sentinel. His reporting on Steven Avery was featured in Making a Murderer. He’s the author of sports books on Brett Favre and Al McGuire. Follow him at TomKertscher.com and on Twitter: @KertscherNews and @KertscherSports.
Posted on August 29, 2018 - 12:41pm