Center for High Assurance Computer Systems (CHACS) Publications

1999

Bharadwaj, Ramesh and Constance Heitmeyer, Hardware/Software Co-Design and Co-Validation Using the SCR Method. In Proc. IEEE International High Level Design Validation and Test Workshop (HLDVT'99), San Diego, Nov 1999. PostScript, PDF

To date, the SCR (Software Cost Reduction) method has been used to specify system requirements. This paper extends the SCR method to hardware/software co-design and co-validation. Our approach consists of three steps. First, the SCR method is used to specify the required system behavior, i.e., the required relation between environmental quantities (called monitored quantities) that the system monitors and environmental quantities (called controlled quantities) that the system controls. Next, the system designers specify the I/O devices required to compute estimates of the monitored quantities and to set values of the controlled quantities. Finally, the required software behavior is specified as three modules: a device-independent module, specifying how the (estimated) monitored quantities are to be used to compute estimates of the controlled quantities, and two device-dependent modules: an input device interface module, specifying how data from the input devices are to be used to compute estimates of the monitored quantities, and an output device interface module, specifying how the values of controlled variables are written to output devices. To illustrate the approach, we use SCR to specify a simple light control system.
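The three-step decomposition described in the abstract can be pictured concretely. The following Python sketch is purely illustrative: the light control system details, the motion sensor, the lamp actuator, and all names are assumptions invented here, not material taken from the paper.

```python
# Illustrative sketch (assumed names, not from the paper) of the three-module split:
# an input device interface estimates the monitored quantity, a device-independent
# module computes the controlled quantity from that estimate, and an output device
# interface writes the result to the output device.

class MotionSensorStub:
    """Stand-in for a real input device."""
    def read_raw(self) -> int:
        return 1  # pretend motion was detected

class LampActuatorStub:
    """Stand-in for a real output device."""
    def write(self, value: int) -> None:
        print(f"lamp register <- {value}")

class InputDeviceInterface:
    """Device-dependent: estimates the monitored quantity 'room occupied'."""
    def __init__(self, sensor: MotionSensorStub):
        self.sensor = sensor
    def estimate_room_occupied(self) -> bool:
        return self.sensor.read_raw() > 0

class DeviceIndependentModule:
    """Device-independent: the required relation between monitored and controlled quantities."""
    @staticmethod
    def compute_lamp_on(room_occupied: bool) -> bool:
        return room_occupied  # lamp is on exactly when the room is occupied

class OutputDeviceInterface:
    """Device-dependent: writes controlled-quantity values to the output device."""
    def __init__(self, actuator: LampActuatorStub):
        self.actuator = actuator
    def set_lamp(self, lamp_on: bool) -> None:
        self.actuator.write(1 if lamp_on else 0)

# One control step: sense, decide, actuate.
inp = InputDeviceInterface(MotionSensorStub())
out = OutputDeviceInterface(LampActuatorStub())
out.set_lamp(DeviceIndependentModule.compute_lamp_on(inp.estimate_room_occupied()))
```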

Bharadwaj, Ramesh and Constance L. Heitmeyer, "Model Checking Complete Requirements Specifications Using Abstraction," Automated Software Engineering, Vol 6, No. 1, pp. 37-68, January 1999, Kluwer Academic Publishers. PDF

Although model checking has proven remarkably effective in detecting errors in hardware designs, its success in the analysis of software specifications has been limited. Model checking algorithms for hardware verification commonly use Binary Decision Diagrams (BDDs) to represent predicates involving the many Boolean variables commonly found in hardware descriptions. Unfortunately, BDD representations may be less effective for analyzing software specifications, which usually contain not only Booleans but variables spanning a wide range of data types. Further, software specifications typically have huge, sometimes infinite, state spaces that cannot be model checked directly using conventional symbolic methods. One promising but largely unexplored approach to model checking software specifications is to apply mathematically sound abstraction methods. Such methods extract a reduced model from the specification, thus making model checking feasible. Currently, users of model checkers routinely analyze reduced models but often generate the models in ad hoc ways. As a result, the reduced models may be incorrect.

This paper, an expanded version of (Bharadwaj and Heitmeyer, 1997), describes how one can model check a complete requirements specification expressed in the SCR (Software Cost Reduction) tabular notation. Unlike previous approaches which applied model checking to mode transition tables with Boolean variables, we use model checking to analyze properties of a complete SCR specification with variables ranging over many data types. The paper also describes two sound and, under certain conditions, complete methods for producing abstractions from requirements specifications. These abstractions are derived from the specification and the property to be analyzed. Finally, the paper describes how SCR requirements specifications can be translated into the languages of Spin, an explicit state model checker, and SMV, a symbolic model checker, and presents the results of model checking two sample SCR specifications using our abstraction methods and the two model checkers.
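One abstraction idea of the kind mentioned above, dropping variables that the property under analysis does not depend on, can be sketched very simply. The dictionary-based dependency table and the variable names below are invented for illustration; this is not the paper's algorithm or tooling.

```python
# Minimal sketch (not the paper's implementation) of property-directed slicing:
# keep only the variables that the property depends on, directly or transitively,
# and abstract the rest away before model checking.

def relevant_variables(depends_on: dict[str, set[str]], property_vars: set[str]) -> set[str]:
    """Backward closure of the property's variables under the dependency relation.

    depends_on maps each variable to the variables its defining function reads.
    """
    relevant, worklist = set(), list(property_vars)
    while worklist:
        v = worklist.pop()
        if v in relevant:
            continue
        relevant.add(v)
        worklist.extend(depends_on.get(v, set()))
    return relevant

# Toy dependency table: the property mentions only 'alarm', so 'display' can be dropped.
deps = {
    "alarm": {"pressure", "mode"},
    "display": {"pressure"},
    "mode": {"block_button", "reset_button"},
}
print(relevant_variables(deps, {"alarm"}))
# every variable except 'display' is relevant to the property
```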

Brackin, Stephen, Catherine Meadows, and Jonathan Millen, "CAPSL Interface for the NRL Protocol Analyzer", Proceedings of ASSET 99, IEEE Computer Society Press, March 1999. PostScript, PDF

The Common Authentication Protocol Specification Language (CAPSL) is a high-level language for applying formal methods to the security analysis of cryptographic protocols. Its goal is to permit a protocol to be specified once in a form that is usable as an interface to any type of analysis tool or technique, given appropriate translation software. This paper describes the first operational CAPSL translator to the language used by the NRL Protocol Analyzer (NPA), a software tool developed specifically for the analysis of cryptographic protocols.

Gargantini, A. and C. Heitmeyer, "Using Model Checking to Generate Tests from Requirements Specifications," Proc., Joint 7th Eur. Software Engineering Conf. and 7th ACM SIGSOFT Intern. Symp. on Foundations of Software Eng. (ESEC/FSE99), Toulouse, FR, Sept. 6-10, 1999. PostScript

Recently, many formal methods, such as the SCR (Software Cost Reduction) requirements method, have been proposed for improving the quality of software specifications. Although improved specifications are valuable, the ultimate objective of software development is to produce software that satisfies its requirements. To evaluate the correctness of a software implementation, one can apply black-box testing to determine whether the implementation, given a sequence of system inputs, produces the correct system outputs. This paper describes a specification-based method for constructing a suite of "test sequences", where a test sequence is a sequence of inputs and outputs for testing a software implementation. The test sequences are derived from a tabular SCR requirements specification containing diverse data types, i.e., integer, boolean, and enumerated types. From the functions defined in the SCR specification, the method forms a collection of predicates called "branches", which cover all possible software behaviors described by the specification. Based on these predicates, the method then derives a suite of test sequences by using a model checker's ability to construct counterexamples. The paper presents the results of applying our method to four specifications, including a sizable component of a contractor specification of a real system.
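The core move, using a model checker's counterexample search to reach each branch, can be illustrated with a toy in place of Spin or SMV. Everything below (the breadth-first search, the saturating counter, the names) is an invented stand-in for the real tool chain; a common way to realize the idea in practice is to emit a trap property asserting the branch is never taken and read the counterexample trace back as the test sequence.

```python
# Illustrative sketch (not the paper's tool) of deriving a test sequence from a
# branch predicate by searching for an execution that exercises the branch.
# A toy explicit-state breadth-first search plays the role of the model checker.

from collections import deque

def find_test_sequence(initial_state, inputs, step, branch_holds, max_depth=10):
    """Return a list of (input, state) pairs leading to a state where the branch fires."""
    frontier = deque([(initial_state, [])])
    seen = {initial_state}
    while frontier:
        state, trace = frontier.popleft()
        if branch_holds(state):
            return trace            # the counterexample-like path is the test sequence
        if len(trace) >= max_depth:
            continue
        for inp in inputs:
            nxt = step(state, inp)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [(inp, nxt)]))
    return None

# Toy system: a counter that saturates at 3; the branch of interest is "counter == 3".
step = lambda s, i: min(s + 1, 3) if i == "inc" else 0
print(find_test_sequence(0, ["inc", "reset"], step, lambda s: s == 3))
# [('inc', 1), ('inc', 2), ('inc', 3)]
```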

Goldschlag, David M., Reed, Michael G., Syverson, Paul F., "Onion Routing for Anonymous and Private Internet Connections," Communications of the ACM, vol. 42, num. 2, February 1999. PostScript, PDF

Preserving privacy means not only hiding the content of messages, but also hiding who is talking to whom (traffic analysis). Much like a physical envelope, the simple application of cryptography within a packet-switched network hides the messages being sent, but can reveal who is talking to whom, and how often. Onion Routing is a general purpose infrastructure for private communication over a public network. It provides anonymous connections that are strongly resistant to both eavesdropping and traffic analysis. The connections are bidirectional, near real-time, and can be used for both connection-based and connectionless traffic. Onion Routing interfaces with off-the-shelf software and systems through specialized proxies, making it easy to integrate into existing systems. Prototypes have been running since July 1997. As of this article's publication, the prototype network is processing more than 1 million Web connections per month from more than six thousand IP addresses in twenty countries and in all six main top-level domains.

Onion Routing operates by dynamically building anonymous connections within a network of real-time Chaum Mixes. A Mix is a store and forward device that accepts a number of fixed-length messages from numerous sources, performs cryptographic transformations on the messages, and then forwards the messages to the next destination in a random order. A single Mix makes tracking of a particular message either by specific bit-pattern, size, or ordering with respect to other messages difficult. By routing through numerous Mixes in the network, determining who is talking to whom becomes even more difficult. Onion Routing's network of core onion-routers (Mixes) is distributed, fault-tolerant, and under the control of multiple administrative domains, so no single onion-router can bring down the network or compromise a user's privacy, and cooperation between compromised onion-routers is thereby confounded.
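As an illustration of the layered encryption behind these anonymous connections, the sketch below wraps a payload in one removable layer per router. It is an assumption-laden toy, not the Onion Routing implementation: it uses symmetric Fernet keys from the third-party cryptography package in place of per-hop key establishment, and it ignores the fixed-length padding and batch reordering that real Mixes perform.

```python
# Toy layered ("onion") encryption: each router can remove exactly one layer,
# learning only its predecessor, its successor, and an opaque inner blob.

from cryptography.fernet import Fernet

def build_onion(route, payload: bytes) -> bytes:
    """route: list of (name, Fernet key) from first hop to last; wrap innermost layer first."""
    onion = b"DELIVER|" + payload
    for i in range(len(route) - 1, -1, -1):
        name, key = route[i]
        next_hop = route[i + 1][0] if i + 1 < len(route) else "EXIT"
        onion = Fernet(key).encrypt(next_hop.encode() + b"|" + onion)
    return onion

def peel(onion: bytes, key) -> tuple[str, bytes]:
    """A router removes its layer, obtaining the next hop and a still-encrypted inner onion."""
    plain = Fernet(key).decrypt(onion)
    next_hop, _, inner = plain.partition(b"|")
    return next_hop.decode(), inner

route = [(name, Fernet.generate_key()) for name in ("R1", "R2", "R3")]
onion = build_onion(route, b"hello")
for name, key in route:
    hop, onion = peel(onion, key)
    print(name, "->", hop)   # R1 -> R2, R2 -> R3, R3 -> EXIT
```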

Heitmeyer, Constance, "Formal Methods for Developing Software Specifications: Paths to Wider Usage," Proceedings of the International Conf. on Parallel and Distributed Processing Techniques and Applications (PDPTA '99), Las Vegas, June 28-July 1, 1999. PostScript

Although many formal methods have been proposed for improving the quality of software specifications, a number of barriers to widespread use of these methods remain. This paper describes four of these barriers---failure to scale, unnatural interfaces, limited analysis capabilities, and insufficient tool integration---and suggests some promising approaches for overcoming them. These approaches include automated abstraction, user interfaces designed for ease of use, and the application of powerful decision procedures. To illustrate the barriers and approaches to overcoming them, several examples are presented based on the SCR (Software Cost Reduction) requirements method.

Kang, Myong H., Froscher, Judith N., Eppinger, Brian J., Moskowitz, Ira S., "A Strategy for an MLS Workflow Management System," Proceedings of the 13th Annual IFIP WG11.3 Working Conference on Database Security, Seattle, Washington (1999). PostScript, PDF

Current DoD information systems need to support many different missions through cooperation with different organizations and allies. In today's fast-paced and dynamic environment, it is almost impossible to design and implement a different information system for each mission. Therefore, DoD needs MLS workflow management systems (WFMS) to enable globally distributed users and existing applications to cooperate across classification domains to achieve mission critical goals. An MLS WFMS that allows users to program multilevel mission logic, securely coordinate widely distributed tasks, and monitor the progress of the workflow across classification domains is required. In this paper, we present requirements for MLS workflow and a strategy for implementing it, especially the method for decomposing an MLS workflow into multiple single-level workflows.

Kang, Myong H., Froscher, Judith N., Sheth, Amit P., Kochut, Krys J., and Miller, John A., "A Multilevel Secure Workflow Management System," Proceedings of the 11th Conference on Advanced Information Systems Engineering, Heidelberg, Germany, 1999. PostScript, PDF

The Department of Defense (DoD) needs multilevel secure (MLS) workflow management systems to enable globally distributed users and applications to cooperate across classification levels to achieve mission critical goals. An MLS workflow management system that allows a user to program multilevel mission logic, to securely coordinate widely distributed tasks, and to monitor the progress of the workflow across classification levels is required. In this paper, we present a roadmap for implementing MLS workflows and focus on a workflow builder that is a graphical design tool for specifying such workflows.

Kirby, James Jr., Myla Archer, and Constance Heitmeyer, SCR: A Practical Approach to Building a High Assurance COMSEC System, Proc. 15th Annual Computer Security Applications Conference (ACSAC '99), IEEE Computer Society Press, December, 1999. To appear. PostScript, PDF

To date, the tabular-based SCR (Software Cost Reduction) method has been applied mostly to the development of embedded control systems. This paper describes the successful application of the SCR method, including the SCR* toolset, to a different class of system, a COMSEC (Communications Security) device called CD that must correctly manage encrypted communications. The paper summarizes how the tools in SCR* were used to validate and to debug the SCR specification and to demonstrate that the specification satisfies a set of critical security properties. The development of the CD specification involved many tools in SCR*: a specification editor, a consistency checker, a simulator, the TAME interface to the theorem prover PVS, and various other analysis tools. Our experience provides evidence that use of the SCR* toolset to develop high-quality requirements specifications of moderately complex COMSEC systems is both practical and low-cost.

Kirby, James Jr., Myla Archer, and Constance Heitmeyer, Applying Formal Methods to an Information Security Device: An Experience Report, Proc. 4th IEEE International Symposium on High Assurance Systems Engineering (HASE '99), IEEE Computer Society Press, November, 1999, pp. 81-88. PostScript, PDF

SCR (Software Cost Reduction) is a formal method for specifying and analyzing system requirements that has previously been applied to control systems. This paper describes a case study in which the SCR method was used to specify and analyze a different class of system, a cryptographic system called CD, which must satisfy a large set of security properties. The paper describes how a suite of tools supporting SCR---a consistency checker, simulator, model checker, invariant generator, theorem prover, and validity checker---was used to detect errors in the SCR specification of CD and to verify that the specification satisfies seven security properties. The paper also describes issues of concern to software developers about formal methods---e.g., ease of use, cost-effectiveness, scalability, how to translate a prose specification into a formal notation, and what process to follow in applying a formal method---and discusses these issues based on our experience with CD. Also described are some unexpected results of our case study.

Kirby, James Jr., Myla Archer, and Constance Heitmeyer, Applying Formal Methods to an Information Security Device: A Case Study, Proc. of the Information Systems Technology Panel Symposium on Protecting NATO Information Systems in the 21st Century, Washington, DC, October, 1999. PostScript, PDF

One approach to assuring information security is to control access to information through an appropriately designed device. A cost-effective way to provide assurance that the device meets its security requirements is to detect and correct violations of these requirements at an early stage of development: when the operational requirements are specified. Once it is demonstrated that an operational requirements specification is complete and consistent, that it captures the intended device behavior, and that the operational specification satisfies the security requirements, this operational specification can be used both to guide development of implementations and to generate test sets for testing implementations. This paper describes the application of the SCR (Software Cost Reduction) requirements method and NRL's SCR* toolset, which includes a set of verification and validation tools, to a US Navy communications security device. It reports on our success in proving that the operational requirements specification satisfies a set of security properties. The paper also discusses the practicality and cost of applying formal methods to the development of security devices.

Meadows, Catherine, and John McLean, "Computer Security and Dependability: Then and Now", Computer Security, Dependability, and Assurance: From Needs to Solutions, Edited by Paul Ammann, Bruce Barnes, Sushil Jajodia, and Edgar Sibley, IEEE Computer Society Press, pp. 166-170, 1999. PostScript

We survey security research from the point of view of the dependability taxonomy developed by IFIP Working Group 10.4 and discuss changes since a similar survey was performed four years ago.

Meadows, Catherine, "Analysis of the Internet Key Exchange Protocol Using the NRL Protocol Analyzer," Proceedings of the 1999 IEEE Symposium on Security and Privacy, IEEE Computer Society Press, May 1999. PostScript, PDF

In this paper we show how the NRL Protocol Analyzer, a special-purpose formal methods tool designed for the verification of cryptographic protocols, was used in the analysis of the Internet Key Exchange (IKE) protocol. We describe some of the challenges we faced in analyzing IKE, which specifies a set of closely related subprotocols, and we show how this led to a number of improvements to the Analyzer. We also describe the results of our analysis, which uncovered several ambiguities and omissions in the specification which would have made possible attacks on some implementations that conformed to the letter, if not necessarily the intentions, of the specifications.

Meadows, Catherine, "A Formal Framework and Evaluation Method for Network Denial of Service," Proceedings of the IEEE Computer Security Foundations Workshop, IEEE Computer Society Press, June 1999. PostScript, PDF

Denial of service is becoming a growing concern. As our systems communicate more and more with others that we know less and less, they become increasingly vulnerable to hostile intruders who may take advantage of the very protocols intended for the establishment and authentication of communication to tie up our resources and disable our servers. Since these attacks occur before parties are authenticated to each other, we cannot rely upon enforcement of the appropriate access control policy to protect us. Instead we must build our defenses, as much as possible, into the protocols themselves. This paper shows how some principles that have already been used to make protocols more resistant to denial of service can be formalized, and indicates the ways in which existing cryptographic protocol analysis tools could be modified to operate within this formal framework.
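One widely used principle of the kind the paper formalizes is that a responder should commit no memory or expensive computation until the initiator proves it can receive traffic at its claimed address. The stateless-cookie sketch below uses illustrative names and parameters, not material from the paper.

```python
# A stateless "cookie": a MAC over the client address under a server-local secret.
# The server can verify the client's reachability without storing per-connection
# state, so the cheap first round cannot exhaust its memory.

import hmac, hashlib, os

SERVER_SECRET = os.urandom(32)   # would be rotated periodically in a real system

def make_cookie(client_addr: str) -> bytes:
    return hmac.new(SERVER_SECRET, client_addr.encode(), hashlib.sha256).digest()

def cookie_valid(client_addr: str, cookie: bytes) -> bool:
    return hmac.compare_digest(make_cookie(client_addr), cookie)

# Round 1: server answers a cheap request with a cookie, storing nothing.
cookie = make_cookie("198.51.100.7")
# Round 2: only if the echoed cookie verifies does the server begin the
# expensive, state-creating part of the handshake.
assert cookie_valid("198.51.100.7", cookie)
assert not cookie_valid("203.0.113.9", cookie)
```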

McLean, J. and C. Meadows, "The Future of Information Security," Themes and Highlights of the New Security Paradigms Workshop, Hilary Hosmer and Catherine Meadows, eds., ACM, to appear. PostScript

In this paper we take a new look at the future of computer security, taking into account the first author's three previous reports and assessing their relevance to the emerging world of security as we see it now.

Moore, Andrew P., J. Eric Klinker, and David M. Mihelcic, "How to Construct Formal Arguments that Persuade Certifiers," chapter in "Industrial Strength Formal Methods in Practice," eds. M.G. Hinchey and J.P. Bowen, Springer-Verlag London Limited, September 1999. PostScript, PDF

Developers of a critical system must argue that the system satisfies its critical requirements -- those that, if not satisfied, could result in human injury or death, substantial loss of capital, or the compromise of national security. Documenting an explicit, persuasive assurance argument is especially important when the system produced must be evaluated and approved by an independent certifier, as is often the case for safety- and security-critical systems. Past experience developing independently evaluated systems using formal methods demonstrates that the presentation of the assurance argument is as important as the rigor of the assurance evidence on which that argument is based. Formal specifications and analyses must be presented coherently in the context of the overall system decomposition or much of their power to persuade may be lost.

This paper describes and illustrates a general framework that supports gathering, integrating, presenting and reviewing the evidence that we can trust a system to conform to its critical requirements. Our framework enables a developer to produce understandable, coherent and relevant assurance arguments that use formal methods. This is not a step-by-step guide, but a set of guidelines within which individual organizations can customize or improve their existing software development process to use formal methods as an effective tool for convincing an independent certifier of a system's trustworthiness. We describe application of our framework to several examples including one for which we produced a verified software/hardware implementation. The real-world examples provided are simple enough to demonstrate concretely and concisely how to put into practice the techniques on which our framework is based.

Stubblebine, Stuart G., Paul F. Syverson, and David M. Goldschlag, "Unlinkable Serial Transactions: Protocols and Applications", ACM Transactions on Information and System Security, Vol. 2, No. 4, Nov. 1999, pp. 354-389. PostScript, PDF

We present a protocol for unlinkable serial transactions suitable for a variety of network-based subscription services. It is the first protocol to use cryptographic blinding to enable subscription services. The protocol prevents the service from tracking the behavior of its customers while protecting the service vendor from abuse due to simultaneous or "cloned" usage from a single subscription. Our basic protocol structure and recovery protocol are robust against protocol termination failures. We evaluate the security of the basic protocol. We extend the basic protocol to include auditing, which further deters subscription sharing. We describe other applications of unlinkable serial transactions for pay-per-use transactions within a subscription, third-party subscription management, multivendor coupons, proof of group membership, and voting.
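For readers unfamiliar with cryptographic blinding, the toy below shows Chaum-style RSA blinding, the generic mechanism the abstract alludes to. The tiny key and all numbers are insecure, illustrative choices, and this is not the paper's actual subscription protocol.

```python
# Chaum-style RSA blinding with a textbook toy key (p=61, q=53): the signer
# signs a blinded message without ever seeing the message itself.

n, e, d = 3233, 17, 2753                      # toy RSA modulus and exponents

def blind(m: int, r: int) -> int:
    return (m * pow(r, e, n)) % n             # customer hides m from the signer

def sign(blinded: int) -> int:
    return pow(blinded, d, n)                 # signer signs without seeing m

def unblind(sig_blinded: int, r: int) -> int:
    return (sig_blinded * pow(r, -1, n)) % n  # customer recovers a signature on m

m, r = 42, 99                                 # r must be invertible mod n
signature = unblind(sign(blind(m, r)), r)
assert signature == pow(m, d, n)              # same as signing m directly
assert pow(signature, e, n) == m              # verifies under the public key
```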

Syverson, Paul, "Towards a Strand Semantics for Authentication Logic," in Electronic Notes in Theoretical Computer Science, vol. 20, eds. Stephen Brookes, Achim Jung, Michael Mislove and Andre Scedrov, 2000. PostScript, PDF

The logic BAN was developed in the late eighties to reason about authenticated key establishment protocols. It uncovered many flaws and properties of protocols, thus generating lots of attention in protocol analysis. BAN itself was also subject of much attention, and work was done examining its properties and limitations, developing extensions and alternatives, and giving it a semantics.

More recently, the strand space approach was developed. This approach gave a graph theoretic characterization of the causally possible interactions between local histories (strands) along with a term algebra to express sent and received messages. This model was designed and has been used by its authors for direct application to authentication protocol analysis. However, it has also quickly attracted the attention of many other researchers in the field as useful in connection to related work, such as model checking approaches.

Here we discuss the idea of using strand spaces as the model of computation underlying a semantics for BAN-style expressions. This will help to integrate some of the approaches to security protocol analysis and to hopefully provide BAN logics with a clearer, more useful underlying model than they have had to date.
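To make the strand-space vocabulary concrete, a strand can be encoded as a finite sequence of signed terms, with '+' marking a transmission and '-' a reception; a bundle then links each transmission to the receptions it causes. The encoding below is an illustrative assumption, not the authors' notation.

```python
# Toy encoding of strands as sequences of signed terms.

from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    sign: str      # '+' = transmission, '-' = reception
    term: str      # message term, e.g. '{Na, A}Kb'

Strand = tuple[Event, ...]

# Initiator strand of a toy two-message exchange.
initiator: Strand = (Event('+', '{Na, A}Kb'), Event('-', '{Na, Nb}Ka'))
# Responder strand: the mirror image; a bundle would connect each '+' node
# to the matching '-' node with a causal edge.
responder: Strand = (Event('-', '{Na, A}Kb'), Event('+', '{Na, Nb}Ka'))
```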

