  Special Publication 800-12: An Introduction to Computer Security - The NIST Handbook

 

Chapter 4

COMMON THREATS: A BRIEF OVERVIEW

Computer systems are vulnerable to many threats that can inflict various types of damage resulting in significant losses. This damage can range from errors harming database integrity to fires destroying entire computer centers. Losses can stem, for example, from the actions of supposedly trusted employees defrauding a system, from outside hackers, or from careless data entry clerks. Precision in estimating computer security-related losses is not possible because many losses are never discovered, and others are "swept under the carpet" to avoid unfavorable publicity. The effects of various threats vary considerably: some affect the confidentiality or integrity of data while others affect the availability of a system.

This chapter presents a broad view of the risky environment in which systems operate today. The threats and associated losses presented in this chapter were selected based on their prevalence and significance in the current computing environment and their expected growth. This list is not exhaustive, and some threats may combine elements from more than one area.19 This overview of many of today's common threats may prove useful to organizations studying their own threat environments; however, the perspective of this chapter is very broad. Thus, threats against particular systems could be quite different from those discussed here.20

To control the risks of operating an information system, managers and users need to know the vulnerabilities of the system and the threats that may exploit them. Knowledge of the threat21 environment allows the system manager to implement the most cost-effective security measures. In some cases, managers may find it more cost-effective to simply tolerate the expected losses. Such decisions should be based on the results of a risk analysis. (See Chapter 7.)
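
As a concrete illustration of weighing expected losses against safeguard costs, the sketch below compares annualized losses with and without a hypothetical safeguard. This is a minimal example in the annualized-loss style used by many risk analyses, not a method prescribed by this handbook, and all figures are invented.

    # Illustrative only: a toy annualized-loss-expectancy comparison.
    # All figures and the safeguard itself are hypothetical.

    def annualized_loss(single_loss, incidents_per_year):
        """Expected yearly loss: cost of one incident times incidents per year."""
        return single_loss * incidents_per_year

    # Expected loss from a recurring incident with no new safeguard.
    baseline = annualized_loss(single_loss=5_000, incidents_per_year=12)

    # A safeguard (hypothetical cost) that halves the incident rate.
    safeguard_cost = 20_000
    reduced = annualized_loss(single_loss=5_000, incidents_per_year=6)

    net_benefit = baseline - reduced - safeguard_cost
    if net_benefit > 0:
        print(f"Safeguard pays for itself: ${net_benefit:,} saved per year")
    else:
        print(f"Tolerating the losses is cheaper by ${-net_benefit:,} per year")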

4.1 Errors and Omissions

Errors and omissions are an important threat to data and system integrity. These errors are caused not only by data entry clerks processing hundreds of transactions per day, but also by all types of users who create and edit data. Many programs, especially those designed by users for personal computers, lack quality control measures. However, even the most sophisticated programs cannot detect all types of input errors or omissions. A sound awareness and training program can help an organization reduce the number and severity of errors and omissions.
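
The limits of automated checking can be made concrete with a short sketch. The "edit checks" below are invented for this example; they catch malformed or implausible input, but a value that is wrong yet plausible passes every test, which is one reason awareness and training remain necessary.

    # Illustrative sketch (not from this handbook): simple "edit checks" on a
    # data entry record. The ID format and range rule are invented.

    import re

    def validate_hours(employee_id, hours):
        """Return the problems an automated edit check can detect."""
        problems = []
        if not re.fullmatch(r"[A-Z]\d{6}", employee_id):  # hypothetical ID format
            problems.append("malformed employee ID")
        if not 0 <= hours <= 80:                          # plausible weekly range
            problems.append("hours outside plausible range")
        return problems

    print(validate_hours("A123456", 400))  # caught: out of range
    print(validate_hours("A123456", 35))   # passes, even if the clerk meant 53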

Users, data entry clerks, system operators, and programmers frequently make errors that contribute directly or indirectly to security problems. In some cases, the error is the threat, such as a data entry error or a programming error that crashes a system. In other cases, the errors create vulnerabilities. Errors can occur during all phases of the systems life cycle. A long-term survey of computer-related economic losses conducted by Robert Courtney, a computer security consultant and former member of the Computer System Security and Privacy Advisory Board, found that 65 percent of losses to organizations were the result of errors and omissions.22 This figure was relatively consistent across both private and public sector organizations.

Programming and development errors, often called "bugs," can range in severity from benign to catastrophic. In a 1989 study for the House Committee on Science, Space and Technology, entitled Bugs in the Program, the staff of the Subcommittee on Investigations and Oversight summarized the scope and severity of this problem in terms of government systems as follows:

As expenditures grow, so do concerns about the reliability, cost and accuracy of ever-larger and more complex software systems. These concerns are heightened as computers perform more critical tasks, where mistakes can cause financial turmoil, accidents, or in extreme cases, death.23

Since the study's publication, the software industry has changed considerably, with measurable improvements in software quality. Yet software "horror stories" still abound, and the basic principles and problems analyzed in the report remain the same. While there have been great improvements in program quality, as reflected in decreasing errors per 1,000 lines of code, the concurrent growth in program size often seriously diminishes the beneficial effects of these program quality enhancements.

Installation and maintenance errors are another source of security problems. For example, an audit by the President's Council for Integrity and Efficiency (PCIE) in 1988 found that every one of the ten mainframe computer sites studied had installation and maintenance errors that introduced significant security vulnerabilities.24

4.2 Fraud and Theft

Computer systems can be exploited for fraud and theft, both by "automating" traditional methods of fraud and by using new methods. For example, individuals may use a computer to skim small amounts of money from a large number of financial accounts, assuming that small discrepancies may not be investigated. Financial systems are not the only ones at risk. Systems that control access to any resource are targets (e.g., time and attendance systems, inventory systems, school grading systems, and long-distance telephone systems).
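
The arithmetic behind such skimming schemes is what makes them attractive: no single discrepancy is large enough to prompt an investigation, yet the total is substantial. A toy calculation with invented figures:

    # Illustrative arithmetic only; the figures are invented.
    # A half-cent skimmed from each of 200,000 accounts, every month:
    accounts = 200_000
    skim = 0.005  # dollars per account, below most investigation thresholds
    monthly_take = accounts * skim
    print(f"${monthly_take:,.2f} per month, ${monthly_take * 12:,.2f} per year")
    # -> $1,000.00 per month, $12,000.00 per year, while no single account
    #    is ever short by more than half a cent.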

Computer fraud and theft can be committed by insiders or outsiders. Insiders (i.e., authorized users of a system) are responsible for the majority of fraud. A 1993 InformationWeek/Ernst and Young study found that 90 percent of Chief Information Officers viewed employees "who do not need to know" information as threats.25 The U.S. Department of Justice's Computer Crime Unit contends that "insiders constitute the greatest threat to computer systems."26 Since insiders have both access to and familiarity with the victim computer system (including what resources it controls and its flaws), authorized system users are in a better position to commit crimes. Insiders can be general users (such as clerks) as well as technical staff members. An organization's former employees, with their knowledge of an organization's operations, may also pose a threat, particularly if their access is not terminated promptly.

In addition to the use of technology to commit fraud and theft, computer hardware and software may be vulnerable to theft. For example, one study conducted by Safeware Insurance found that $882 million worth of personal computers was lost due to theft in 1992.27

4.3 Employee Sabotage

Common examples of computer-related employee sabotage include:
  • destroying hardware or facilities,
  • planting logic bombs that destroy programs or data,
  • entering data incorrectly,
  • "crashing" systems,
  • deleting data,
  • holding data hostage, and
  • changing data.

Employees are most familiar with their employer's computers and applications, including knowing what actions might cause the most damage, mischief, or sabotage. The downsizing of organizations in both the public and private sectors has created a group of individuals with organizational knowledge, who may retain potential system access (e.g., if system accounts are not deleted in a timely manner).28 The number of incidents of employee sabotage is believed to be much smaller than the instances of theft, but the cost of such incidents can be quite high.
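
One simple administrative control against this exposure is to reconcile system accounts against the current personnel roster on a regular schedule; a minimal sketch follows, with hypothetical names and data sources.

    # A minimal sketch, not a prescribed procedure: reconcile system accounts
    # against the current personnel roster. Names and data are hypothetical;
    # in practice the lists would come from personnel records and the host.

    current_employees = {"asmith", "bjones", "cdoe"}
    system_accounts = {"asmith", "bjones", "cdoe", "rlee"}

    for account in sorted(system_accounts - current_employees):
        print(f"Account '{account}' has no matching employee -- disable and review")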

Martin Sprouse, author of Sabotage in the American Workplace, reported that the motivation for sabotage can range from altruism to revenge:

As long as people feel cheated, bored, harassed, endangered, or betrayed at work, sabotage will be used as a direct method of achieving job satisfaction -- the kind that never has to get the bosses' approval.29

4.4 Loss of Physical and Infrastructure Support

The loss of supporting infrastructure includes power failures (outages, spikes, and brownouts), loss of communications, water outages and leaks, sewer problems, lack of transportation services, fire, flood, civil unrest, and strikes. These losses include such dramatic events as the explosion at the World Trade Center and the Chicago tunnel flood, as well as more common events, such as broken water pipes. Many of these issues are covered in Chapter 15. A loss of infrastructure often results in system downtime, sometimes in unexpected ways. For example, employees may not be able to get to work during a winter storm, although the computer system may be functional.

4.5 Malicious Hackers

The term malicious hackers, sometimes called crackers, refers to those who break into computers without authorization. They can include both outsiders and insiders. Much of the rise in hacker activity is often attributed to increases in connectivity in both government and industry. One 1992 study of a particular Internet site (i.e., one computer system) found that hackers attempted to break in at least once every other day.30

The hacker threat should be considered in terms of past and potential future damage. Although current losses due to hacker attacks are significantly smaller than losses due to insider theft and sabotage, the hacker problem is widespread and serious. One example of malicious hacker activity is that directed against the public telephone system.

Studies by the National Research Council and the National Security Telecommunications Advisory Committee show that hacker activity is not limited to toll fraud. It also includes the ability to break into telecommunications systems (such as switches), resulting in the degradation or disruption of system availability. While these studies were unable to reach a conclusion about the degree of threat or risk, they underscore the ability of hackers to cause serious damage.31, 32

The hacker threat often receives more attention than more common and dangerous threats. The U.S. Department of Justice's Computer Crime Unit suggests three reasons for this.

  • First, the hacker threat is a more recently encountered threat. Organizations have always had to worry about the actions of their own employees and could use disciplinary measures to reduce that threat. However, these measures are ineffective against outsiders who are not subject to the rules and regulations of the employer.

  • Second, organizations do not know the purposes of a hacker -- some hackers browse, some steal, some damage. This inability to identify purposes can suggest that hacker attacks have no limitations.

  • Third, hacker attacks make people feel vulnerable, particularly because their identity is unknown. For example, suppose a painter is hired to paint a house and, once inside, steals a piece of jewelry. Other homeowners in the neighborhood may not feel threatened by this crime and will protect themselves by not doing business with that painter. But if a burglar breaks into the same house and steals the same piece of jewelry, the entire neighborhood may feel victimized and vulnerable.33

4.6 Industrial Espionage

Industrial espionage is the act of gathering proprietary data from private companies or the government34 for the purpose of aiding one or more other companies. Industrial espionage can be perpetrated either by companies seeking to improve their competitive advantage or by governments seeking to aid their domestic industries. Foreign industrial espionage carried out by a government is often referred to as economic espionage. Since information is processed and stored on computer systems, computer security can help protect against such threats; it can do little, however, to reduce the threat of authorized employees selling that information.

Industrial espionage is on the rise. A 1992 study sponsored by the American Society for Industrial Security (ASIS) found that proprietary business information theft had increased 260 percent since 1985. The data indicated 30 percent of the reported losses in 1991 and 1992 had foreign involvement. The study also found that 58 percent of thefts were perpetrated by current or former employees.35 The three most damaging types of stolen information were pricing information, manufacturing process information, and product development and specification information. Other types of information stolen included customer lists, basic research, sales data, personnel data, compensation data, cost data, proposals, and strategic plans.36

Within the area of economic espionage, the Central Intelligence Agency has stated that the main objective is obtaining information related to technology, but that information on U.S. government policy deliberations concerning foreign affairs and information on commodities, interest rates, and other economic factors is also a target.37 The Federal Bureau of Investigation concurs that technology-related information is the main target, but also lists corporate proprietary information, such as negotiating positions and other contracting data, as a target.38

4.7 Malicious Code

Malicious code refers to viruses, worms, Trojan horses, logic bombs, and other "uninvited" software. Sometimes mistakenly associated only with personal computers, malicious code can attack other platforms.

Malicious Software: A Few Key Terms

Virus: A code segment that replicates by attaching copies of itself to existing executables. The new copy of the virus is executed when a user executes the new host program. The virus may include an additional "payload" that triggers when specific conditions are met. For example, some viruses display a text string on a particular date. There are many types of viruses, including variants, overwriting, resident, stealth, and polymorphic.

Trojan Horse: A program that performs a desired task, but that also includes unexpected (and undesirable) functions. Consider as an example an editing program for a multi-user system. This program could be modified to randomly delete one of the users' files each time a user performs a useful function (editing); the deletions are unexpected and definitely undesired!

Worm: A self-replicating program that is self-contained and does not require a host program. The program creates a copy of itself and causes it to execute; no user intervention is required. Worms commonly use network services to propagate to other host systems.
Source: NIST Special Publication 800-5.
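
Because a file-infecting virus must modify its host executable, one common countermeasure is to record a cryptographic checksum of each executable on a known-clean system and verify it later; any change is a warning sign. The sketch below illustrates the idea and is not drawn from this handbook; the paths are hypothetical, and such checking is no substitute for antivirus tools.

    # A minimal sketch of file integrity checking, one common defense against
    # file-infecting viruses (which must modify their host executables).

    import hashlib

    def digest(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Baseline: record digests while the system is known to be clean.
    baseline = {path: digest(path) for path in ["/usr/local/bin/editor"]}

    # Later: any changed digest means the executable was modified.
    for path, known in baseline.items():
        if digest(path) != known:
            print(f"{path} has changed since baseline -- possible infection")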

A 1993 study of viruses found that while the number of known viruses is increasing exponentially, the number of virus incidents is not.39 The study concluded that viruses are becoming more prevalent, but only "gradually."

The rate of PC-DOS virus incidents in medium to large North American businesses appears to be approximately 1 per 1,000 PCs per quarter; the number of infected machines is perhaps 3 or 4 times this figure if we assume that most such businesses are at least weakly protected against viruses.40, 41
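
Applied to an organization of a given size, the quoted rate gives a rough expectation; for example, for a hypothetical firm with 5,000 PCs:

    # Back-of-the-envelope use of the quoted rate: about 1 incident per
    # 1,000 PCs per quarter, with perhaps 3 to 4 times as many machines
    # infected. The organization size is hypothetical.
    pcs = 5_000
    incidents = pcs / 1_000
    print(f"~{incidents:.0f} incidents per quarter, "
          f"~{3 * incidents:.0f}-{4 * incidents:.0f} infected machines")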

Actual costs attributed to the presence of malicious code have resulted primarily from system outages and staff time involved in repairing the systems. Nonetheless, these costs can be significant.

4.8 Foreign Government Espionage

In some instances, threats posed by foreign government intelligence services may be present. In addition to possible economic espionage, foreign intelligence services may target unclassified systems to further their intelligence missions. Some unclassified information that may be of interest includes travel plans of senior officials, civil defense and emergency preparedness, manufacturing technologies, satellite data, personnel and payroll data, and law enforcement, investigative, and security files. Guidance should be sought from the cognizant security office regarding such threats.

4.9 Threats to Personal Privacy

The accumulation of vast amounts of electronic information about individuals by governments, credit bureaus, and private companies, combined with the ability of computers to monitor, process, and aggregate large amounts of information about individuals, has created a threat to individual privacy. The possibility that all of this information and technology may be linked together has arisen as a specter of the modern information age, often referred to as "Big Brother." To guard against such intrusion, Congress has enacted legislation over the years, such as the Privacy Act of 1974 and the Computer Matching and Privacy Protection Act of 1988, which defines the boundaries of the legitimate uses of personal information collected by the government.

The threat to personal privacy arises from many sources. In several cases federal and state employees have sold personal information to private investigators or other "information brokers." One such case was uncovered in 1992 when the Justice Department announced the arrest of over two dozen individuals engaged in buying and selling information from Social Security Administration (SSA) computer files.42 During the investigation, auditors learned that SSA employees had unrestricted access to over 130 million employment records. Another investigation found that 5 percent of the employees in one region of the IRS had browsed through tax records of friends, relatives, and celebrities.43 Some of the employees used the information to create fraudulent tax refunds, but many were acting simply out of curiosity.

As more of these cases come to light, many individuals are becoming increasingly concerned about threats to their personal privacy. A July 1993 special report in MacWorld cited polling data taken by Louis Harris and Associates showing that in 1970 only 33 percent of respondents were concerned about personal privacy. By 1990, that number had jumped to 79 percent.44

While the magnitude and cost to society of the personal privacy threat are difficult to gauge, it is apparent that information technology is becoming powerful enough to warrant fears of both government and corporate "Big Brothers." Increased awareness of the problem is needed.

References

House Committee on Science, Space and Technology, Subcommittee on Investigations and Oversight. Bugs in the Program: Problems in Federal Government Computer Software Development and Regulation. 101st Congress, 1st session, August 3, 1989.

National Research Council. Computers at Risk: Safe Computing in the Information Age. Washington, DC: National Academy Press, 1991.

National Research Council. Growing Vulnerability of the Public Switched Networks: Implication for National Security Emergency Preparedness. Washington, DC: National Academy Press, 1989.

Neumann, Peter G. Computer-Related Risks. Reading, MA: Addison-Wesley, 1994.

Schwartau, W. Information Warfare. New York, NY: Thunder's Mouth Press, 1994 (Rev. 1995).

Sprouse, Martin, ed. Sabotage in the American Workplace: Anecdotes of Dissatisfaction, Mischief, and Revenge. San Francisco, CA: Pressure Drop Press, 1992.


Footnotes:

19. As is true for this publication as a whole, this chapter does not address threats to national security systems, which fall outside of NIST's purview. The term "national security systems" is defined in National Security Directive 42 (7/5/90) as being "those telecommunications and information systems operated by the U.S. Government, its contractors, or agents, that contain classified information or, as set forth in 10 U.S.C. 2315, that involves intelligence activities, involves cryptologic activities related to national security, involves command and control of military forces, involves equipment that is an integral part of a weapon or weapon system, or involves equipment that is critical to the direct fulfillment of military or intelligence missions."
20. A discussion of how threats, vulnerabilities, safeguard selection and risk mitigation are related is contained in Chapter 7, Risk Management.
21. Note that one protects against threats that can exploit a vulnerability. If a vulnerability exists but no threat exists to take advantage of it, little or nothing is gained by protecting against the vulnerability. See Chapter 7, Risk Management.
22. Computer System Security and Privacy Advisory Board, 1991 Annual Report (Gaithersburg, MD), March 1992, p. 18. The categories into which the problems were placed and the percentages of economic loss attributed to each were: 65%, errors and omissions; 13%, dishonest employees; 6%, disgruntled employees; 8%, loss of supporting infrastructure, including power, communications, water, sewer, transportation, fire, flood, civil unrest, and strikes; 5%, water, not related to fires and floods; less than 3%, outsiders, including viruses, espionage, dissidents, and malcontents of various kinds, and former employees who have been away for more than six weeks.
23. House Committee on Science, Space and Technology, Subcommittee on Investigations and Oversight, Bugs in the Program: Problems in Federal Government Computer Software Development and Regulation, 101st Cong., 1st sess., 3 August 1989, p. 2.
24. President's Council on Integrity and Efficiency, Review of General Controls in Federal Computer Systems, October, 1988.
25. Bob Violino and Joseph C. Panettieri, "Tempting Fate," InformationWeek, October 4, 1993: p. 42.
26. Letter from Scott Charney, Chief, Computer Crime Unit, U.S. Department of Justice, to Barbara Guttman, NIST. July 29, 1993.
27. "Theft, Power Surges Cause Most PC Losses," Infosecurity News, September/October, 1993, 13.
28. Charney.
29. Martin Sprouse, ed., Sabotage in the American Workplace: Anecdotes of Dissatisfaction, Mischief and Revenge (San Francisco, CA: Pressure Drop Press, 1992), p. 7.
30. Steven M. Bellovin, "There Be Dragons," Proceedings of the Third Usenix UNIX Security Symposium.
31. National Research Council, Growing Vulnerability of the Public Switched Networks: Implication for National Security Emergency Preparedness (Washington, DC: National Academy Press), 1989.
32. Report of the National Security Task Force, November 1990.
33. Charney.
34. The government is included here because it often is the custodian for proprietary data (e.g., patent applications).
35. The figures of 30 and 58 percent are not mutually exclusive.
36. Richard J. Heffernan and Dan T. Swartwood, "Trends in Competitive Intelligence," Security Management 37, no. 1 (January 1993), pp. 70-73.
37. Robert M. Gates, testimony before the House Subcommittee on Economic and Commercial Law, Committee on the Judiciary, 29 April 1992.
38. William S. Sessions, testimony before the House Subcommittee on Economic and Commercial Law, Committee on the Judiciary, 29 April 1992.
39. Jeffrey O. Kephart and Steve R. White, "Measuring and Modeling Computer Virus Prevalence," Proceedings, 1993 IEEE Computer Society Symposium on Research in Security and Privacy (May 1993): 14.
40. Ibid.
41. Estimates of virus occurrences may not consider the strength of an organization's antivirus program.
42. House Committee on Ways and Means, Subcommittee on Social Security, Illegal Disclosure of Social Security Earnings Information by Employees of the Social Security Administration and the Department of Health and Human Services' Office of Inspector General: Hearing, 102nd Cong., 2nd sess., 24 September 1992, Serial 102-131.
43. Stephen Barr, "Probe Finds IRS Workers Were 'Browsing' in Files," The Washington Post, 3 August 1993, p. A1.
44. Charles Piller, "Special Report: Workplace and Consumer Privacy Under Siege," MacWorld, July 1993, pp. 1-14.

 
