Computer Architects, Computer-related Professors, Technologists, etc.
If you are qualified to perform a critical review of this work you are welcome to do so. Please forward your review as well as your qualifications (or a link to your qualifications) to Mr Ray Holt raymholt [at] g mail [dot] com. All comments from qualified reviews will be made available in this section.
“Analysis of the CADC F-14 Flight Computer”
by Russell Fish (rhfish@dhc.net)
September 4, 1998
THE BIG QUESTION
Is the CADC designed in 1968 by Ray Holt and Steve Geller a microprocessor? The answer is “Yes”.
The chip Ray and Steve called the SLF (Special Logic Function) is a single-chip CPU. It contains an ALU (Arithmetic Logic Unit), instruction decoding, and control logic. Not only did they design a microprocessor; they designed an entire family of compatible support chips, similar to the 1974 Motorola 6800 family of chips.
Their architecture included many advanced features, including execution pipelines, built-in boundary-scan self-test, math co-processing, and multiprocessing. Their ROM (read-only memory) included its own program counter, seven years before David Chung at Fairchild did the same with the F8. Their PDU and PMU chips are probably the world’s first math coprocessors, a concept reintroduced by INTEL with the 8087 in 1979.
Furthermore, not only is the CADC a microprocessor, it is an advanced type of microprocessor called a DSP (Digital Signal Processor) commercially pioneered in 1980 by Texas Instruments.
In 1967 an engineer one year out of Cal Poly was asked by his employer to design an electronic F-14 flight computer to replace the mechanical computers used on the F-111. Without computer assist, the variable geometry F-14 was thought to be possibly unflyable.
The computer would be known as the CADC (Central Air Data Computer). It was to receive commands from the pilot and perform the actual control of the moveable surfaces (rudder, ailerons, etc.) in an optimized manner, taking into account the variable sweep position of the wings. This type of aircraft control is called “fly-by-wire” since there is no mechanical connection between the pilot’s movement of the stick and the control surfaces. The advantage of fly-by-wire is that the flight computer can make decisions to optimize aircraft performance that are not possible with human reaction time. In 1968 this was a very advanced state of the art. The F-14 was the first production fly-by-wire fighter. (NOTE: The Air Force claims it flew the first fly-by-wire fighter, a modified F-4, in 1972. According to Grumman, the first F-14 flew in December 1970 and it began deployment to the fleet in 1972.)
THE MILITARY REQUIREMENT
The computation requirement was staggering for the time. Very small, very fast, very smart, and very rugged. Minicomputers of the same vintage executed instructions in the range of 500 kHz to 1 MHz at room temperature, required a hundred or more watts of power, and consumed five or six cubic feet of space. The Navy required computing power more advanced than the leading minicomputers of the day, shrunk to the size of a paperback book, and able to perform in boiling water.
Heat is the enemy of computers. At high temperature, the electrons inside the transistors scatter more frequently, slowing them down. Today’s PCs will drop dead if the fan quits running, long before reaching the temperature of boiling water.
The most popular minicomputer of the time was Digital Equipment’s PDP-8, a 12-bit machine. Twelve bits means that the computer could count to 4,095. To control the F-14, the Navy required a 20-bit computer which could count to 1,048,575.
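The word widths above translate directly into counting ranges: an n-bit word holds unsigned values up to 2**n − 1. A quick illustrative sketch (the machine labels are just the examples discussed in this article):

```python
# The maximum unsigned value an n-bit word can hold is 2**n - 1.
for bits, machine in [(4, "INTEL 4004"), (12, "DEC PDP-8"), (20, "CADC")]:
    print(f"{bits:2d}-bit ({machine}): counts to {2**bits - 1:,}")
# 4 bits -> 15, 12 bits -> 4,095, 20 bits -> 1,048,575
```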
Simply stated, given the 1968 state of the art, the specification was not possible to meet. What was required was a significant advance of the state of the art.
Three decades later, Mr. Holt’s solution must still be viewed with awe. It is by any definition a technical tour-de-force of the first order. In an era of $400 toilet seats, the CADC must be considered military procurement at its best.
Furthermore, it is unfortunate that the project was classified, since had it been incorporated into a commercial product of the time, it could certainly have dominated the minicomputers of the day in real-time process control and possibly could have accelerated the microprocessor industry by five years and DSPs by a decade. AMI might have been INTEL or at least Motorola.
HOW THEY DID IT
The solution was a multiprocessor general purpose microcomputer implemented in the most advanced P-MOS process of the time. Not only did the design team create a single-chip microcomputer, but they designed the memory chips to work with it, and the multiplier and divide co-processor chips to accelerate math operations.
The solution was so mathematically intensive it was not possible with a single processor. The Holt/Geller design allowed as many as sixteen processors to be connected together. The F-14 version used three. Only last year INTEL was struggling to get its four-processor Pentium systems to work.
THE INTEL 4004 AND CONTEMPORARY MICROCOMPUTER CONTEXT
The INTEL 4004, introduced around 1971, was essentially an early calculator chip, and not even the best in the industry. The motivation for its creation was the commercial “adding machine” market. The generalized solution chosen by Ted Hoff created a product that was less than optimum for a calculator but was a solution which might have utility in other areas.
The INTEL genius was marketing this calculator chip as a general purpose computing solution and continuing to invest in and enhance the microprocessor concept.
The 4004 was a 4-bit computer. It could count to 15. To perform the calculator operation it broke the problem down into many operations performed one after another. The business calculator it powered performed the four-function calculator operations: addition, subtraction, multiplication, and division, in response to a human pressing keys one at a time.
The CADC on the other hand was evaluating sixth-order polynomial expressions rapidly enough to move the control surfaces of a dogfighting swing-wing supersonic fighter. The 4004 could not interface to the real-world analog signals required by a flight computer or any process controller. It was at least a factor of twenty too slow to perform the complex math of the CADC. The 4004 absolutely could not have controlled a fighter aircraft. Not a dozen 4004s. Neither could the 4040, its successor introduced around 1973, nor the 8008 shortly thereafter.
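To give a sense of that workload, here is a minimal sketch (mine, not the CADC’s actual code, and with made-up coefficients) of evaluating a sixth-order polynomial by Horner’s rule — six multiply-accumulate steps per evaluation, exactly the kind of operation the PMU and PDU multiply/divide chips existed to accelerate:

```python
def horner(coeffs, x):
    """Evaluate a polynomial by Horner's rule.
    coeffs are ordered highest degree first: [a6, a5, ..., a0]."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c  # one multiply-accumulate per coefficient
    return acc

# Hypothetical sixth-order curve: f(x) = x**6 + 2*x + 1
print(horner([1, 0, 0, 0, 0, 2, 1], 2.0))  # -> 69.0
```

The CADC had to perform many such evaluations per second on live sensor data; the 4004, which had no hardware multiply instruction at all, would have had to synthesize every multiply from repeated adds and shifts.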
The INTEL 8080 and 6800 introduced in 1974 were 8-bit processors and could count to 255. The first commercial 16-bit microcomputers were the National Semiconductor PACE and a design by General Instruments, both introduced around 1975. The Nitron division of McDonnell Douglas produced a classified military 16-bit processor called the Actron in the 1973 time-frame. There may have been other classified projects of the early 70’s yet to be disclosed. The National, GI, and McDonnell Douglas chips, while only 16 bits, came close to the performance of the CADC. They were also five to seven years later.
The CADC was optimized for computer control of time critical operations, like the much later DSP chips. DSP’s process very complex mathematical problems at high speed. Many minicomputers of 1968 were doing similar control of factory processes and machine tools. Some of the highest volume microprocessors today are found performing similar functions in automobile emission control, fuel injection systems, and just about every military and large commercial aircraft.
INTEL’s place in microprocessor history is secured by a certain amount of technical prowess including the process wizardry of Andy Grove, the manufacturing genius of Bob Noyce, and the marketing skill of Gordon Moore.
In the strictly technical dimension INTEL historically wasn’t always first or even the best. The INTEL 1103 DRAM was inferior to AMI DRAMs which had on-chip sense amps. The 8080 was slower than the 6800, 6502, and even Signetics 2650. The INTEL 8088 was definitely architecturally inferior to the Motorola 68000. The hated segment registers are a standing joke in the industry to this day. Just as the quarterback with the strongest arm and fleetest feet doesn’t always win, the product with the greatest technical merit doesn’t always make the most money. INTEL has consistently won the financial game despite being occasionally late or lacking in technology.
They have done it with a little luck, a lot of foresight, and intense execution.
INTEL’s success in no way diminishes the recently declassified accomplishments of Mr. Holt and Mr. Geller three decades ago.
WHAT MIGHT HAVE BEEN
After completing the CADC, Mr. Holt was hired by AMI where he completed the design of an advanced microprocessor dubbed the 7200. One day in 1972, AMI management determined that there was no future in microprocessors, and fired the entire microprocessor staff of 18. In its last fiscal year, AMI reported sales of $266 million. INTEL reported sales of $21 billion.
by Dave Patterson, Professor of Computer Science at UC Berkeley (patterson@cs.berkeley.edu)
September 25, 1998
While I was a grad student I worked at Hughes Aircraft on the computer for the radar system, starting in 1971 or 1972, so it brought back old memories.
Notice that he has lots of references to microprogramming in the document, which is typical of that era. (The aerospace computers I did were microprogrammed too.) The word “microprocessor” was used before the 4004 by some people of the time to mean the processor in a microprogrammed computer, so it’s not shocking to see the word microprocessor in a document of that era. He kind of used the term ambiguously, but it was a single chip CPU. [Ray – Why does a microprocessor have to be single chip? It appears that is revisionist thinking.]
What this document describes is a microprogrammed set of custom LSI chips that can be configured into multiple configurations and then microprogrammed, apparently in binary, to perform some application. [Ray – All computers are binary programmed until software tools are developed.]
I think the minimum system has 3-4 chips for the CPU, but I’m not positive from the stuff I read; you could use many more to get more performance. It depends whether you include the ROM (microcode memory) and the RAM (“RAS”) as part of the CPU. Then it would be at least 2 more chips (5-6 total) in the CPU. It’s plausible to include them since the RAM only had 16 words, so it’s like a register file, and the microcoded CPUs of the time would include the ROM in the CPU. His design scaled, so there could also be lots more chips in his computer, without it being a multiprocessor. The designer just had to write the microcode that made it all work. [Ray – The multiple chips allow this chip set to operate with parallel execution which might be called co-processing but whatever the terms used it was decades ahead of other applications. Intel apparently does not include ROM and RAM in its CPU definition.]
You should classify it as a microprogrammed special purpose computer, using a variable number of custom chips packaged as DIPs that could provide good performance in a small footprint. A classic aerospace thing to do, although most aerospace engineers of the time would design computers using standard TTL chips to reduce development costs. [Ray – “The thing to do” is just the point. Any engineer would recognize the final design fits the specification which was not a desktop calculator. Apparently a working microcomputer system on 14 sq inches working in a military environment is the same as standard TTL.]
I agree that it’s likely that the 4004 wasn’t fast enough [Ray – nor did it even exist until two years later]; it had to fit into a single chip, and that meant sacrificing performance to make it fit [Ray – The 4004 chip set was NOT single chip nor was it intended to be]. Holt used the technology to solve a fixed problem, and that problem wasn’t a desktop calculator (which led to the 4004) but signal processing for the F14A. [Ray – The 4004 was TWO years later so what is the point? This microprocessor used the EXACT same technology as the later 4004 and operated at mil-spec and had denser chips and worked. If the 4004 was an accomplishment then this was a great accomplishment.]
No way Holt’s computer is a microprocessor, using the word as we mean it today. [Ray – Apparently, Intel disagreed, and “today” is 30 years later.] Arguing that it is one is simply revisionist history, trying to claim a glory that isn’t deserved. Unless there is some patent deal going on, I don’t know why people would do this. Holt must understand the real issues; he looks to be a good computer engineer. [Ray – Mr Patterson, what is your point? What glory? Defending the United States with a high tech microprocessor? Revisionist? How? Two years before the 4004 is being a revisionist? Why not look at this design for what it is instead of defending some design that was not even on the drawing board at this time? This design is probably worth mentioning in a textbook on architecture.]
Hughes actually did a wafer-scale integrated CPU a couple of years later, which I worked on, and it was probably one of the first real ones, but who cares? There was no path from it to anything that had commercial impact, and it’s not like engineers at commercial companies were studying aerospace computers to get ideas to steal and put them in their commercial computers. [Ray – Did the Hughes “wafer” work? Was it published? Or is it just talk and paper design?] [Ray – See my Legacy link to see if this design had any impact on the microprocessor world. There might be some big surprises.]
[Ray – It appears that Mr Patterson published a book in 1997, “Computer Organization and Design: The Hardware/Software Interface”, and the CADC announcement made the book obsolete. It is also interesting that none of his later books mentioned this major accomplishment. Why would Mr Patterson ignore one of the major accomplishments in computer history? I am sure he understands the real issues; he seems to be a real professor. I don’t know why professors do this.]
[Ray – I also find it interesting that Mr Patterson has never replied to my comments. Maybe his knee-jerk reaction to this major accomplishment was too sensitive and embarrassing to his Emeritus career. Comments are still welcome.] August 6, 2016