8008 & 8080 Q&A
A few questions posed to Federico Faggin,
intended to further clarify the early history of the microprocessor

Q: What is the 8008?

F.F.: The 8008 is Intel’s first 8-bit microprocessor. It was originally intended as a custom chip for CTC (Computer Terminal Corporation of San Antonio, TX; the company name was later changed to Datapoint). At about the same time, CTC also commissioned the same chip from Texas Instruments (TI). TI announced it in mid-1971 as the world’s first CPU-on-a-chip, a few months after the 4004 had been successfully completed, was fully functioning, and had been shipped to Busicom, its intended customer. The TI chip, designed by Gary Boone, was never marketed or sold to anyone, not even to CTC: apparently it never functioned properly, as I later learned from a private communication from Vic Poor, VP of Engineering at CTC.

Q: What is the significance of the 8008?

F.F.: The 8008 is the world’s first 8-bit microprocessor and it opened the market for simple table-top computers and other user-programmable machines. The personal computer can arguably be traced back to the 8008, which was the CPU used by the French Micral microcomputer, made by R2E Corp. The Micral was the first commercial, user-programmable, fully-assembled, table-top computer based on a microprocessor.
We approach the personal computer (PC) lineage with the next-generation CPU – the 8080 – which was assembly-language compatible with the 8008. The CPUs for the first true personal computers, however, were the Zilog Z80 (used in Radio Shack’s TRS-80 personal computer) and the MOS Technology 6502 (used in the Apple II personal computer), belonging to the CPU generation that followed the 8080.
The 8008 used P-channel silicon-gate MOS technology and came in an 18-pin package: a very poor choice, imposed by Intel management’s aversion to high pin-count packages. This choice required that addresses, instructions, data and control signals be multiplexed onto a single 8-bit bus, unnecessarily reducing the effective CPU speed to between 1/2 and 1/3 of what was achievable without this limitation. Additionally, the need to de-multiplex and latch the information on the 8008 bus demanded about 30 external TTL chips to interface with memory and I/O devices, negating many of the cost and simplicity advantages of a single-chip CPU. The 8008 architecture was entirely done by CTC, with the exception of some minor modifications suggested by Intel’s Stan Mazor and Ted Hoff. While sharing the simplicity of the 4004, the 8008 architecture was more general. For example, in the 8008 both instructions and data could coexist in RAM, whereas in the 4004 instructions could only be stored in ROM and data in RAM registers, where they could be addressed only in a rather cumbersome manner.
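For readers unfamiliar with multiplexed buses, the minimal C sketch below illustrates the kind of work the external latching logic had to do: capture a low address byte, then the high address bits together with a cycle-type code, and finally a data byte, all from the same eight pins on successive bus states. It is only a conceptual model, not the 8008’s actual signal definitions or timing; the state names, control encoding and function names are invented for the example.

```c
/*
 * Simplified, hypothetical model of de-multiplexing a narrow CPU bus.
 * The state names and the 2-bit cycle encoding are illustrative only,
 * not the exact 8008 signal definitions.
 */
#include <stdint.h>
#include <stdio.h>

typedef enum { T1_ADDR_LOW, T2_ADDR_HIGH, T3_DATA } BusState;

typedef struct {
    uint16_t address;   /* reconstructed 14-bit address              */
    uint8_t  cycle;     /* cycle type taken from the two high bits   */
    uint8_t  data;      /* data byte transferred in the final state  */
} Latches;

/* One step of the external latch logic: capture the 8-bit bus value
 * according to the current bus state, as external TTL latches would. */
static void latch_bus(Latches *l, BusState state, uint8_t bus)
{
    switch (state) {
    case T1_ADDR_LOW:
        l->address = (l->address & 0x3F00) | bus;                    /* low 8 address bits   */
        break;
    case T2_ADDR_HIGH:
        l->address = (l->address & 0x00FF) | ((uint16_t)(bus & 0x3F) << 8);
        l->cycle   = bus >> 6;                                       /* 2 cycle-control bits */
        break;
    case T3_DATA:
        l->data = bus;                                               /* data or instruction  */
        break;
    }
}

int main(void)
{
    Latches l = {0};
    /* A single made-up bus cycle: address 0x1234, cycle type 0, data 0xC3 */
    latch_bus(&l, T1_ADDR_LOW,  0x34);
    latch_bus(&l, T2_ADDR_HIGH, 0x12);
    latch_bus(&l, T3_DATA,      0xC3);
    printf("address=%04X cycle=%u data=%02X\n",
           (unsigned)l.address, (unsigned)l.cycle, (unsigned)l.data);
    return 0;
}
```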

Q: Why wasn’t the 8008 announced before the 4004?

F.F.: The design of the 8008 started in March 1970, under the leadership of Hal Feeney, but it was stopped only a few months after starting and remained on hold until the work on the 4004 was nearly completed. In January 1971 I was put in charge of the 8008 project, with Hal Feeney performing the detailed design under my supervision.
There is an interesting personal twist in the history of the 8008 design: I joined Intel in April 1970, and when I found out that Intel had another microprocessor in development beyond the 4004, I was very disappointed. I was convinced that Hal would become the first microprocessor designer, because I had to design the chip set consisting of the 4001, 4002 and 4003 before starting the 4004 CPU; Hal therefore had a head start of several months over me. However, Hal’s experience was in random-logic design with metal-gate MOS technology, not in silicon-gate technology, and progress was slow because no infrastructure for random-logic design in silicon gate existed at Intel.
The history of the 8008 clearly illustrates that the real obstacle in the creation of the first microprocessor was its implementation in a single chip – designing it within a cost-effective chip size, at the required speed and power dissipation, and making it work reliably. In 1969 many people were able to conceive and specify a simple CPU architecture that could be integrated into a single chip, but not many could effectively design one and make it function. The fact that Texas Instruments, at that time one of the world’s most experienced companies in the design of complex MOS random-logic chips, could not design and make a working 8008 look-alike (despite using twice the chip area of the Intel 8008) indicates that the major stumbling block was not architecture but implementation.
The microprocessor race was won by Intel on account of the flawless execution of the 4004, and later the 8008.

Q: How far along was the 8008 project before you took over?

F.F.: Not much work had been done beyond figuring out the general CPU internal block organization, the overall timing and a good portion of the logic equations for the control block. Hal Feeney hadn’t yet undertaken any circuit design or chip layout work.
When the project was restarted, I had already completed and tested the new silicon gate design methodology for random logic chips, which I used in the design of a working 4004 microprocessor. Furthermore, given that the internal organization of the 8008 and most of its circuits were identical to the ones I used in the 4004, the chip design and the physical layout of the 4004 were used as models for the design and layout of the 8008. The detailed design of the 8008, therefore, was rather straightforward.

Q: How did the 8080 come about?

F.F.: I had the idea and motivation for the 8080 as a result of a trip to Europe I took with Hank Smith, in the summer of 1971, to visit possible customers. Hank had just been appointed marketing manager for microprocessors, in preparation for the imminent announcement of the 4004. We visited a number of potential customers of both the 4004 and the 8008, where we discussed both products under non-disclosure agreement.
I got a lot of feedback and reactions ranging from pleased to hostile.
The most pleased customers were the ones without a specific background in computers, whose problems could be solved with a small, low-cost computer and who saw in the microprocessor an effective way to do so. The hostile customers were the computer manufacturers – the experts, for example Nixdorf in Germany and Plessey in England. Their negativity appeared to be a visceral reaction to the perceived “invasion” of their business by a semiconductor company rather than an attack on the architecture of our CPUs. It felt to me that they resented Intel’s involvement in what they considered their turf.
On the other hand, they were more sophisticated designers and users than the first-time CPU customers who had reacted positively. Therefore I treasured their comments and suggestions, turning a put-down into an opportunity to make a better product.
When I returned to Intel, I came up with an idea for a substantially better CPU than the 8008, incorporating many of their suggestions. After working out the basic specifications of the new chip, I wrote a memo to my boss, Les Vadasz, describing the 8080, its architecture and its expected market impact, and asking him to let me start the project as soon as the 8008 was completed.
The project, however, was put off until the end of 1972, rather than beginning in March-April 1972, when the 8008 was completed. Despite my many urgings to get started, Intel management wanted to test the market reaction to the 4004 and 8008 before investing in the next-generation CPU. We almost lost our leadership to Motorola’s 6800 because of that delay, and it was very frustrating to watch us squander, through unnecessary inaction, the lead we had accumulated with so much hard work. Toward the end of 1972 I hired Masatoshi Shima to do the detailed design under my supervision.

Q: What role did Stan Mazor play in the 8080?

F.F.: After the 8080 project was approved, Stan wanted to get involved in it and make suggestions at all costs. It became quite an annoyance, because the architecture and the product specification were pretty much frozen and had been agreed upon with marketing. Nonetheless, he wanted to suggest a number of new instructions, most of which did not make sense or were far too complicated to implement.
I remember that I actually complained to Vadasz about Stan wasting Shima’s and my time, and asked him to talk to Ted Hoff, Stan’s boss, to keep Stan from interrupting our work. Eventually we ended up using one or two of the instructions he recommended. After I left Intel to start Zilog, Intel’s management decided to erase my contributions from the history of the microprocessor, thus encouraging many to take credit for my work. Since that time, Stan has publicly declared himself as the architect of the 8080, even though his contribution was rather small.

Q: What role did Masatoshi Shima play in the 8080?

F.F.: Shima was engaged to do the detailed design of the 8080. Before accepting the job at Intel, he had been working for Ricoh in Japan after leaving Busicom. When he joined Intel, around the end of 1972, I gave him my architecture and specifications for the 8080, as well as some preliminary designs. I also extended the design methodology I had created for the 4004 to include MOS transistors made with the new high-voltage N-channel technology that was to be used in the 8080. I spent a couple of months working very closely with him, teaching him how to do chip design, because this was his first chip; I then left him alone, simply overseeing his work through our weekly meetings and helping him solve specific problems as needed. Shima became an excellent designer, very precise and hard-working.

Q: Why did you leave Intel to start Zilog?

F.F.: The main reason was my frustration at the fact that Intel was not really committed to the microprocessor. Intel was a memory company that saw the microprocessor as a way of selling more memory chips. I had worked very hard for five years, personally designing or leading the design of all of Intel’s microprocessors from the very beginning. I often had to fight to get my ideas accepted, and then, after making them work successfully, it was as if nothing had happened. Intel’s management did not really believe, as I did, in the intrinsic power of the microprocessor as a foundation for many new markets; therefore I felt I could do better by striking out on my own.
There were other reasons as well. For example, I was very upset when I found out in 1974 that Vadasz – who had been my supervisor already at Fairchild – had later patented for Intel the “buried contact”, an idea I had implemented in a test chip at Fairchild and had disclosed to him at that time. Also, Andy Grove was rising in power, and his management style did not appeal to me. The timing to leave was perfect because I had completed the key projects under way and wanted to go before starting new ones: I wanted to be free to come up with new ideas that did not belong to Intel. It was also a good time because the market was beginning to recognize the microprocessor as an important new product category. I felt there was now room in the market for a company totally dedicated to microprocessors.

Q: How did the Z80 come about?

F.F.: When I left Intel in October 1974, I had no idea what kind of product I wanted to make. I just knew that I wanted to start a company dedicated to microprocessors, get out of Intel and take some time to think about the correct strategy.
I started Zilog with Ralph Ungermann, who at that time was one of the section managers reporting to me. At the beginning of Zilog, Ralph and I did some consulting in testing and testers to get some money coming into the company while we prepared its business plan. My first product idea was a single-chip microcomputer for control applications (many years later this kind of device would be called a microcontroller), and I worked for a few weeks on its definition. From a business point of view, however, I soon became concerned about the appropriateness of this product for a company that didn’t have its own manufacturing capability: I expected it to be very price sensitive, which would make it difficult to be profitable as a fabless company.
At the end of 1974 the US economy was in the middle of a recession and venture capital had practically disappeared, so I considered it impossible for us to raise the investment required to build a wafer-fabrication facility at that point in the economic cycle. I therefore decided that this product had to be delayed until we had our own manufacturing capability. This product idea was the seed that later became the Zilog Z8. As I started thinking about another, less price-sensitive product, the Z80 idea suddenly came to mind as a gestalt. The idea was to create a powerful family of components: an 8-bit CPU, software compatible with the 8080 but much more capable, along with a number of intelligent peripheral components that would interface seamlessly with the CPU. The family would be designed around a 5-volt, very fast N-channel process with depletion-load devices, the same process technology that Mostek had pioneered and that I had used at Intel for a very fast static RAM chip, the 2102A.
In the following several months I developed the Z80 architecture as well as the preliminary functional descriptions of the other four members of the Z80 family. This was the start of the product family on which Zilog was founded. It is rewarding to me that both the Z80 and the Z8 are still in high-volume production today, in 2006, more than 30 years after I conceived them.

Q: What role did Zilog play in the microprocessor business?

F.F.: For many years, Zilog replaced Intel as the pacesetter of the microprocessor business, a position Intel had held since the inception of that business. Zilog also demonstrated to Intel and the rest of the industry that it was possible to have a company totally dedicated to microprocessors. Zilog retained its leadership position until IBM chose the 8088 for its personal computer. This choice, made in 1980 and mostly dictated by non-technical reasons, spelled the end of Zilog’s ascendancy and returned the leadership to Intel.

Q: Why did IBM choose Intel’s microprocessor over Zilog’s for their PC?

F.F.: There were several reasons. The most important one was that Zilog, being a subsidiary of Exxon Enterprises (the original and only investor in Zilog), was considered a competitor by IBM. In the late ’70s Exxon Enterprises had decided to create a major information company, competing with IBM, by combining a number of companies it had invested in over the years, including Zilog. This decision created a major problem for our company. Word has it that IBM’s CEO, Frank T. Cary, sent out a memo forbidding the use of Zilog’s components in any IBM product. Therefore Zilog was not even in the running.

Q: What role do you see yourself having played in the history of the microprocessor?

F.F.: I see myself as the person who provided the creative energy, the determination, the skills and the hard work required to design the first microprocessor, in a very short time and in an environment with no random-logic design infrastructure. I created that infrastructure, designed the 4004 and made it work before the competition did. I also saw, from the very beginning, the potential of the microprocessor, and I played an important role in convincing Intel’s management to make it broadly available in the marketplace rather than relegating it to being just another custom product. I played the leading role in the design of all of Intel’s early microprocessors: the 4004, 8008, 4040 and 8080. With regard to the 4040 and the 8080, in addition to leading the design, I also had the product idea, did the architecture and convinced Intel’s management to let me develop them.
When I saw that Intel continued to be lukewarm about microprocessors, I ventured out on my own, risking my reputation and betting my career to start a company devoted to them. I did not have any experience in starting and running a company, and I was not a born entrepreneur, but I saw a tremendous opportunity and felt compelled to act because I truly believed in the microprocessor. I can definitely say that I was one of the main actors in the first 10 years of microprocessor history, providing the vision, the energy, the skills and the courage that helped create an industry, and I did it despite the difficulties that were thrown in my way.