Speculations on Armed Conflict

In a Time of Free Silicon


Chapter 1


Despite the waning of military technology competition, information technology, driven by burgeoning commercial markets, is likely to continue its rapid pace of development for a decade or two. Such advances are most logically deployed in distributed rather than concentrated form.

The influence of technology on conflict over the next several decades will be the result of a great irony. Just as the political motivation for developing military technology has declined, the information technology fungible to conflict is about to accelerate.

Military Competition Quiescent

The years 1939 to 1989, which included World War II and the Cold War, saw intense technological competition between the United States and its adversaries -- first Nazi Germany and then the Soviet Union. During both Hot and Cold Wars our national security was perceived as directly threatened -- any slackening on our part could put us on the wrong side of a deep strategic abyss, with our survival at risk. Our adversaries felt the same hot breath of competition.

Such fears put a premium on developing military technology rapidly lest one side develop an advantage the other could not trump. The strategic arena hosted the nuclear contests, bomber gaps, missile gaps, windows of opportunity and Star Wars. The conventional arena saw submarines vie with ships, tanks with antitank missiles, stealth aircraft with radar-based air defenses, chemical weapons with antidotes and the entire panoply of electronic warfare including counter and counter-counter. Our advances sparked theirs; theirs sparked ours. Military technology evolved under hothouse conditions, and military equipment became ever more sophisticated than its commercial counterparts.

The end of the Cold War has retarded military technology competition. Although the United States (and others) may respond with new technology to emergent means of war (e.g., SCUDs used as instruments of terror), no country can respond to our innovations as the Soviets did. Other motivations are also muted. Tomorrow's improved jet fighter may trespass third-world airspace with less loss of life. Yet its successful development would be less likely to influence the global balance of power than preceding developments did.

The same trends have, if anything, heightened commercial competition from former Warsaw Pact technologists and from the growing electronics manufacturing base of a more market-oriented China. Thus commercial information technology will continue to advance at a rapid clip. With every year, more and more technology comes from the commercial side. Even before the Cold War ended, the leading role of defense acquisition had begun to fade. Military electronics started lagging behind commercial electronics and could only hope to stay current through spin-ons of commercial technologies.

It is precisely as the motivation for conducting revolutions in national security technology slows down that the means of doing so accelerates. The latter may yet overcome the inertia of the former. At that point, the world of conflict will be radically transformed. Although most elements of the new battlefield will arrive by 2010, exactly when every aspect appears and is demonstrated will depend on who is fighting whom and where. Yet once someone exhibits such capabilities, others will try to follow close behind. Military competition, though usually latent, does not tolerate fudging when it emerges.

The impact of the information revolution in civil affairs is likely to follow a smoother but no less radical pace. Personal computers, networks, facsimile machines, and cellular telephones have rendered large chunks of the West's workspace unrecognizable. Their spread to the South -- with its far different societies -- is likely to promote even greater discontinuity. In some ways, present conditions in underdeveloped nations resemble past conditions in developed ones: Korea circa 1988 equals Japan circa 1964. In other ways, undeveloped nations are a syncretic mix of the old and the new. Because Java, Indonesia's core island, is underdeveloped, it should resemble nineteenth-century America. Yet three-quarters of all households own color televisions, telephone service is increasingly skipping the wireline phase and jumping straight to cellular, and a coterie of Western-educated technocrats supports a highly competitive aircraft industry. It is precisely the combination of traditional mores, rapid urbanization, a lagging overall living standard, but cheap high-technology goods that will make the third world such an interesting stew.

Information Technology Ascendant

Information technology doubles in capability roughly every one and a half to three years. Each successive generation is not only faster but cheaper, smaller, and less power-hungry as well. Free silicon is inevitable; more precisely, unlimited amounts of information acquisition, processing, storage, and transmission capability will be available from indefinitely small and inexpensive packages. Limitations on information processing capability will constrain the conduct of neither military nor civilian operations. In a narrow sense, ending these limits, to echo Freud, leaves behind all the other constraints in life. In a broader sense, as information gets cheaper, it substitutes for activities that are not information intensive.

Both the breadth and speed of these advances mark the flood. IBM introduced its PC in 1981 based on the Intel 8088 chip running 250,000 instructions per second. Pentium-based PCs introduced in 1993 run 30,000,000 instructions per second -- for roughly the same cost. The 300 bit per second (bps) modem of 1981 cost more than the 14,400 bps modem of 1993. The IBM PC's original 16K DRAM contrasts with the (slightly more expensive) 16M DRAM expected in 1994. In 1981 the Internet had 213 hosts connected with 56,000 bps digital lines; in 1993, the Internet has more than 1.5 million hosts, a core of 45 million bps digital lines today, and billion bps lines due in two years. The IBM XT of 1983 had a then-enormous 10-megabyte hard disk in a 5 1/4 inch box. Today's choices range from a (much cheaper) 20-megabyte hard disk in a 1 1/2 inch box to a (somewhat more expensive) 600-megabyte disk array in a 3 1/2 inch box. Technologies with no direct precedent in 1981 -- cellular telephones, compact disks, electron tunneling microscopes, and global positioning systems -- add fizz to the torrent.

Many observers argue that information technologies will be no exception to the rule that while progress can be rapid for a while, eventually all such revolutions peter out. For example, every new generation of jet aircraft and engines over a quarter-century period was far more capable than its predecessors. But by the late 1960s only evolutionary change was left; the Phantom F-4 and the Boeing 747 are today still cost-effective for many missions. The rate of new product introductions and market growth for plastics and other petrochemicals was swift in the 1950s and 1960s. (Recall the singular advice, "Plastics," offered to the protagonist of the 1968 film The Graduate.) After 1975 both rates declined sharply. If this theme is generally true, how much oomph does the information technology revolution have? Today's best microprocessors use .5-micron features. One commonly cited barrier to further progress is that feature size can only shrink so much (and thus speed can only rise so fast); some say this limit, .25 microns, will be reached in the late 1990s. Advances below that would require a very expensive transition from optical (and/or ultraviolet) to X-ray lithography or something equally powerful. At even finer geometries, quantum effects may play havoc with any chip howsoever fabricated. Yet these predictions, even if true -- and the boundaries below which optical methods fail keep retreating -- would not necessarily end the information technology revolution.

First, expensive transitions are not necessarily impossible ones. By the time a transition is needed, industry will have had time to work out and finance new equipment (even if, being expensive, it comes late). Quantum effects, while harmful at one level, can be exploited at another for atomic-level microprocessors.

Second, even assuming a limit on fine geometries, other methods exist to flog the performance of electronics. New chemistries help. Gallium arsenide, whose use is currently inhibited by its fragility, permits the same design to run three to five times faster than one in silicon. It would also use less power and tolerate more radiation. Other electronic materials (e.g., indium antimonide) also hold promise. Performance gains may also come from adding three-dimensional aspects to two-dimensional chips (e.g., trench capacitors or a fully three-dimensional chip). Chip architecture (e.g., RISC, instruction pre-fetching, and pipelining) is also improving, which aids all geometries.

Third, better computer architectures multiply the effects of better semiconductors. Massively parallel machines are already in the market; neural networks and good fuzzy-logic chips may soon follow.

Fourth, software is also improving thanks to more efficient algorithms, more reliable programming tools, image and data compression, and more efficient coding of radio transmissions. The technologies of artificial intelligence may also start to bear fruit.

Fifth, ancillary technologies are also improving: photonics (a pure photonic computer was bench-scaled in 1992), purer fiber optics for higher bandwidth, magnetic drives down to the size of a matchbox, ever-denser optical media (e.g., CD-ROMs), the possibility of three-dimensional holographic storage, solid state emitters, and more efficient batteries and solar collectors. Functions that can be transferred from one technology to another will improve system performance even if the technologies themselves have reached a plateau: e.g., switching from slow and power-hungry hard-disk drives to faster and low-power flash memories.

Sixth, not all progress has to be at the leading edge of technology. Steady incremental improvements in the manufacturability of information technology devices spell lower prices, which lead to larger economies of scale, which spell even lower prices, and so on. Since the ubiquity of the Mesh and the Net is based on the favorable economics of deploying millions of low-cost devices, such improvements make a difference.

Seventh, even if both product and process improvements cease, the spread of these devices through normal investment patterns guarantees a continual upgrading of the global information infrastructure.

Eighth, even after such an infrastructure reaches a plateau, people will still be finding uses for it that they had previously overlooked.

The accumulation of all these advances sets the stage for continued and probably rapid improvements in the capabilities of information technology. Maybe the recent doubling times of a year and a half will lengthen. Yet were progress to recede to half its rate (e.g., a doubling time of three years), this would merely postpone the revolution; it would not alter its nature.
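The arithmetic behind that claim can be sketched in a few lines (an illustrative Python fragment, not from the original text; the time spans are arbitrary):

```python
# Capability after a span of years, given a doubling time in years.
def capability(years, doubling_time):
    return 2 ** (years / doubling_time)

# An 18-month doubling time yields a thousandfold gain in 15 years;
# a three-year doubling time yields the same gain -- it just takes 30 years.
print(capability(15, 1.5))  # 1024.0
print(capability(30, 3.0))  # 1024.0
```

Halving the rate of progress doubles the calendar time to any given capability; it changes the schedule of the revolution, not its destination.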

The Logic of Distributed Intelligence

Most of the recent benefits of information technology are going not into more powerful computers but into more widely distributed intelligence. This truism of commercial life can be applied to the battlefield with even greater force. Proliferation in the civil world has its limits -- one person can work at but one computer at a time. In the military realm, though, computers could be slaved to sensors and networked. The use of intelligent devices on the battlefield has no theoretical lower limit. Several factors suggest that such distribution is not only possible but optimal.

The first reason is economics. Until the late 1970s, Grosch's law held that doubling the cost of a computer multiplied its power fourfold. Since then, the cost-performance ratio of computers has flipped; it is better at the lower end than the upper end. Microprocessors deliver more mips (millions of instructions per second) per dollar than their more sophisticated mainframe or even supercomputer rivals. Even supercomputers, these days, are most cost-effective when built from thousands of microcomputer or workstation chips; the best microprocessors are found not in giant machines but in workstations, while the most cost-effective microprocessors are in high-end personal computers. If digital television takes off, the most cost-effective chips may be found within these sets, further validating this generalization.

The cost-effectiveness of employing less sophisticated products manufactured in the millions rather than a handful of very sophisticated products extends to other information products: photographic film, television and computer displays, tape backup (e.g., audio cassette-sized tapes), CD-ROM, and hard-disk drives.

This pattern of the information age stands in direct contrast to historically recognized patterns of the industrial age, where bigger was more cost-effective. For instance, larger submarines tend to be quieter. Full-sized aircraft carriers can launch far more planes yet cost only slightly more than pocket-sized carriers. Heavy space systems can lift a pound into orbit more cheaply than their lighter cousins. The Boeing 747 still offers the lowest cost per seat-mile. Auto factories, nuclear plants, oil refineries, cement kilns, and chemical reactors achieved their greatest economies at the largest sizes. Exceptions aside (steel mini-mills prosper as their integrated cousins fail; high-capacity fiber optic lines are still the most cost-effective way to send a bit), information technology tends to be most cost-effective at the low end; industrial technology, at the high end.

The second reason is that distributed systems put intelligence where it can be used. A central box with a hundred phones may offer the most calls per dollar, but forcing everyone to go to the box would be highly inefficient. Even a hundred desktop terminals may be less cost-effective than networked PCs if users, unable to customize them, avoid using them. One observer has gone so far as to argue that the increase in processing power that PCs brought to the Gulf affected the conflict more than all other computing power combined.

For military operations, efficient area-wide coverage becomes important. A hundred pairs of eyes can find something in the field more easily if they are spread around rather than bunched up. Dispersion is also good for localizing an object. A hundred low-power noses can detect, and more important, track a scent better than a single high-power nose stuck in one place.

Consider a radar looking for a single intrusion. A single large radar may be more cost-effective in terms of power produced per dollar of installation. Yet the strength of a reflected beam, while proportional to its energy, is inversely proportional to the fourth power of the distance to the object. A hundred radars whose maximum distance to the target is ten miles will be, collectively, as sensitive as a single radar, ten thousand times more powerful, whose maximum distance to the target is a hundred miles. Whether the latter is more economic may depend on other factors. If guarding a radar is the largest expense and all radars need the same complement, a hundred small radars may be far more expensive. Conversely, if the small radars can sit in a common truck trailer while the large radar needs a specialized facility, the former may be more cost-effective.
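The fourth-power relationship is easy to check with a line of arithmetic (an illustrative Python sketch; the unit of transmitted power is arbitrary):

```python
# Received echo strength scales as transmitted power / distance ** 4.
def echo_strength(power, distance):
    return power / distance ** 4

small = echo_strength(1.0, 10)        # one of a hundred small radars at its 10-mile limit
large = echo_strength(10_000.0, 100)  # one radar, 10,000 times the power, at its 100-mile limit
print(small == large)  # True: equal sensitivity at maximum range
```

Tenfold range costs ten-thousandfold power, which is why many short-legged radars can match one giant.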

Third, distributed systems are more robust against accidental failure than large ones. The Capital Beltway can carry more cars than four country roads can, but a single overturned tractor-trailer can close it; four identically timed spills are needed to close the country roads. Two independent units of 90-percent reliability yield a 99-percent chance that at least one is available -- but at the price of 100-percent redundant capacity. Fourteen units of 90-percent reliability will keep at least ten on line 99 percent of the time -- only 40-percent redundant capacity. The greater the desired reliability, the greater the advantage of distributing capacity into smaller units. The need for very high reliability can be especially pronounced in a military context. Someone may be willing to wait a year for the opportunities provided during that one hour that the system is down.
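A short binomial calculation (an illustrative Python fragment, not from the text) confirms those reliability figures:

```python
from math import comb

def availability(units, needed, p=0.9):
    """Probability that at least `needed` of `units` independent components,
    each up with probability p, are on line at once."""
    return sum(comb(units, k) * p**k * (1 - p)**(units - k)
               for k in range(needed, units + 1))

print(round(availability(2, 1), 3))    # 0.99  -- two units, one needed
print(round(availability(14, 10), 3))  # 0.991 -- fourteen units, ten needed
```

Both configurations hit 99 percent, but the fourteen-unit system carries far less spare capacity per unit of delivered service.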

Of greater military relevance is that one large item is easier to find than any of a hundred smaller ones. Small size and large numbers work with each other in this case. First, the one large item usually has a greater signature than each of the smaller ones. Second, far more effort is needed to track, hit, and ascertain the destruction of a hundred small ones.

Many vulnerabilities, it is true, are more easily countered by concentration. The same mass may be enclosed in eight one-foot cubes or in a single cube two feet on a side; the former require twice the cladding the latter does. Nevertheless, too great an emphasis on defensive measures can lead to self-defeating cycles. The more valuable a single item, the more self-protection it needs, the more expensive it is, and the fewer are made. The fewer are made, the more important each is, thus the more worth destroying, thus the more protection they need, and so on. The aircraft carrier carries not only attack aircraft but defensive fighters, electronic warfare jets, antisubmarine helicopters, and air refueling capability. It must sail with an Aegis cruiser, picket frigates, and an escort submarine. Everything but the attack aircraft is designed to ward off and defeat potential air, surface, and subsurface attacks on the carrier battle group. Thus a ten-billion-dollar armada of ships and planes exists to support twenty-four attack aircraft in certain high-threat environments.

Attacks are ruled by countervailing principles as well. After a certain point attackers can saturate even well-constructed defenses simply through numbers. Mere confusion (which is rarely so mere) aside, engaging a target takes a certain amount of time, and these sequences often cannot run in parallel. A defender must take a certain minimum time to go through each find-engage-destroy cycle, and can engage an attack from one aspect only before shifting to another. Either way, something gets through.

All this information technology will probably not yield robot soldiers. Robots -- replete with sensors, silicon brains, and artificial legs -- are not impossible. But why must all these be integrated into one package, let alone a man-sized one? Full systems support and integration, if nothing else, is likely to yield a very expensive bionic form, far less capable than a network of cheap objects suitably dispersed.

Coordination-and-Convergence

Replacing complex systems with networks of dispersed computers and communications introduces the problem of a complex command-and-control overlay (in civilian terms: coordination-and-convergence). If one head must guide dispersed fingers, both the head and the nerves out to the fingers are vulnerable. Conversely, if the functions of a complex distributed system are meted out to various components -- each of which must work correctly -- the difficulty of ensuring that each component works rises far faster than the total number of components does.

Within the last five years, considerable theoretical work has been done on architectures of loosely coupled processors. In many ways such systems possess considerable advantages over tightly coupled ones.

Neural net architectures -- used for pattern recognition -- form one archetype. Although neural nets are densely hierarchical -- information flows up to a central determination point -- they are highly robust. Both sensors and intermediate nodes work without central logic. For pattern recognition, each sensor sees part of a picture, forms a sub-judgment on it, and sends a signal to intermediate nodes, which weigh the inputs from sensors and other intermediate nodes and pass them forward for comprehensive assessment. Matching guesses to outcomes sends grades down the line to each node, subnode, and sensor so they can retune their sensing and weighting signals accordingly. Altogether the core does very little work. The system degrades gracefully rather than catastrophically as sensors and sub-nodes go down.
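The forward pass of such a scheme can be caricatured in a few lines (an illustrative Python sketch; the readings, weights, and threshold are invented, and a real neural net learns its weights from those grades rather than taking them as given):

```python
# Each sensor reports a sub-judgment in [0, 1], or None if it is down.
# A node weighs the surviving reports; the core merely applies a threshold.
def assess(readings, weights, threshold=0.5):
    live = [(r, w) for r, w in zip(readings, weights) if r is not None]
    if not live:
        return None  # every sensor down: no judgment at all
    score = sum(r * w for r, w in live) / sum(w for _, w in live)
    return score >= threshold

readings = [0.9, 0.8, None, 0.7, None]  # two of five sensors have failed
weights = [1.0, 0.5, 1.0, 0.8, 0.3]
print(assess(readings, weights))  # True: the system still reaches a judgment
```

The point is graceful degradation: knocking out sensors shrinks the evidence but does not silence the network.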

Other models of complex systems built from simple relationships come from self-organizing systems and complexity theory. The former is based on cell differentiation. Multicellular creatures such as humans start from a single cell that gives rise to hundreds of types of cells through genetic sequences that can switch other genetic sequences on and off. Such n-fold complexity requires only many simple triggers. The latter suggests that very complicated systems can be created from simple homogeneous parts if they interact with their neighbors according to a well-tuned pattern. Tuning matters; outside stimuli sometimes produce no reaction and at other times make the system oscillate to death. Analogously, some people form fixed ideas and never take anything new on board; others react only to new notions and are slaves to trends. Some intermediate method of integrating information can be very efficient at responding to the outside world -- even though no individual piece is.

Another, quite different concept is evolutionary programming. Instead of developing a complex optimized program to handle difficult problems, start with a million programs, each of whose modules are chosen from a certain set (as a Chinese menu might yield thousands of dinner combinations). Each such program attacks the problem; those that do well survive and start mating (swapping modules) with other successful programs to produce multiple offspring. Eventually, good programs predominate and bad ones die.
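A toy version (illustrative Python; the module menu, the fitness test, and the population sizes are all invented for the example) shows survival and module-swapping at work:

```python
import random

random.seed(0)  # fixed seed so the illustration is repeatable

MODULES = list(range(10))    # the menu of interchangeable modules
TARGET = [7, 3, 7, 3, 7, 3]  # an invented problem: assemble this sequence

def fitness(program):
    """Score a program by how many of its module slots solve the problem."""
    return sum(1 for got, want in zip(program, TARGET) if got == want)

def mate(a, b):
    """Offspring swap modules: each slot is inherited from one parent or the other."""
    return [random.choice(pair) for pair in zip(a, b)]

population = [[random.choice(MODULES) for _ in TARGET] for _ in range(200)]
initial_best = max(population, key=fitness)

for generation in range(40):
    population.sort(key=fitness, reverse=True)
    survivors = population[:50]  # those that do well survive...
    offspring = [mate(random.choice(survivors), random.choice(survivors))
                 for _ in range(150)]  # ...and mate with other successful programs
    population = survivors + offspring

best = max(population, key=fitness)
print(fitness(initial_best), fitness(best))
```

Because the fittest programs are carried over each generation, the best score never declines; the interesting question is how fast good module combinations spread through the population.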

These models suggest how systems composed of loosely coupled components can, properly tuned -- and there is a world of sophistication to be tapped -- survive degradation, exhibit complex behavior, and learn from external stimuli.

There Will Be Other Changes

To be sure, tomorrow's world will differ from today's in dimensions unrelated to information technology. Other technologies will advance -- some in ways that may surprise us. Biotechnology, in particular, may go wondrously right or fearfully wrong. Some differences will stem from forces unrelated to technology. Many are negative. Population will grow, mostly in the South, and some large share of that growth will attempt entry into the West. While most regions get richer, some will get poorer. Pockets of preservation aside, the world's ecology will deteriorate -- although how catastrophically is unknown. Resources will be depleted and garbage piles will grow.

Yet the most powerful predictable difference is likely to come through information technology. Vast improvements in information technology are happening now and will continue to happen; that these improvements will change the conduct and context of national security is virtually certain. No other technology now shows comparable momentum, and thus none promises cumulative advances of such deep significance. Technologies that alter society radically -- the automobile, modern medicine, precision warfare, and, yes, phones and computers -- tend to result from a long chain of small discoveries and incremental improvements. Future revolutions should have a visible tail today; predicted revolutions that lack a tail will probably not amount to much even several decades hence.

Material progress does not change society by itself, but it does permit new forms of wealth, power, and social organization. Such opportunities will be seized on by those seeking advantage. Those otherwise disinclined to risk unpredictable changes will be forced to respond. The automobile was not invented to alter the shape of America's cities or the conduct of adolescent mating rituals -- but it did so just the same. The radio, in the hands of charismatic thugs, fomented wars. Future changes in information technology will just as certainly rewrite the assumptions -- both political and military -- upon which national security rests.