Fri Apr 8 06:51:39 PDT 2016

Control Architecture: Objectives: What are the protection objectives and how are they applied?


Option 1: Integrity (I)
Option 2: Availability (A)
Option 3: Confidentiality (C)
Option 4: Use control (U)
Option 5: Accountability (T)
Option 6: Transparency (R)
Option 7: Custody (S)

Application 1: Associated with individual content at maximum granularity.
Application 2: Associated with groups of content such as databases, files, or directories.
Application 3: Associated with applications.
Application 4: Associated with systems.
Application 5: Associated with network zones and subzones.
Application 6: Associated with business areas or customer sets.


IF regulatory or competitive drivers compel fine-grained controls, THEN use IACUTRS as identified by each authorized identity and with conservative defaults, AND associated with individual content at maximum granularity,
OTHERWISE IF an enterprise has large business units or customer bases in small and unique niches with all content commonly applied, THEN use IACUTRS ranked based on common requirements AND associated with business areas or customer sets,
OTHERWISE IF a portion of an enterprise is highly IT-centric or if IT concerns dominate other business concerns, THEN rate IACUTRS associated with groups of content such as databases, files, or directories,
OTHERWISE IF a portion of the enterprise is focused on applications and independent of individual users THEN rate IACUTRS associated with applications,
OTHERWISE IF content and usage are highly individualized or virtualization is used to the extent that virtual systems appear to be customized as to content and utility, THEN rate IACUTRS associated with systems,
OTHERWISE IF common communications requirements are tightly aligned with IACUTRS requirements, THEN rate IACUTRS for network zones and subzones based on communication requirements associated with network zones and subzones,
OTHERWISE mix these strategies in the most convenient way and rate IACUTRS associated with different properties as ease of implementation and cost effectiveness dictate.
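The decision chain above can be sketched as a first-match rule evaluation. This is an illustrative sketch only; the predicate names (e.g., fine_grained_mandated) are assumptions introduced here, not terms from the source, and a real implementation would ground each predicate in enterprise assessment data.

```python
def select_application_level(ctx: dict) -> str:
    """Return the level at which IACUTRS objectives are associated,
    following the OTHERWISE-IF ordering of the decision rules above.
    Predicate names are illustrative assumptions."""
    if ctx.get("fine_grained_mandated"):          # regulatory/competitive drivers
        return "individual content at maximum granularity"
    if ctx.get("niche_business_units"):           # common content per business/customer set
        return "business areas or customer sets"
    if ctx.get("it_centric"):                     # IT concerns dominate other concerns
        return "groups of content (databases, files, directories)"
    if ctx.get("application_focused"):            # independent of individual users
        return "applications"
    if ctx.get("individualized_or_virtualized"):  # customized (virtual) systems
        return "systems"
    if ctx.get("comms_aligned"):                  # communications match IACUTRS needs
        return "network zones and subzones"
    return "mixed strategy by cost and convenience"
```

Because the rules are ordered, an enterprise matching more than one condition is placed at the first (most specific) matching level, with the mixed strategy as the default.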


Integrity (I)

In most cases, the integrity of content is most important to its utility because, even if it is available and kept confidential, properly audited, and under use control, if it is wrong, its utility is poor. If it is wrong in specific ways, it can be very harmful. Integrity is often broken down into the integrity of the source, protection from inappropriate or unauthorized changes in the content, and assurance that the content represents an accurate reflection of reality suitable for the purpose.

  • Reflects reality: To the extent that the content at issue conveys information, that information reflects the reality it purports to depict to the extent that the nature of the content is capable of doing so. For example, if the content is an 8-bit value purported to reflect the volume of sound over a span of time, its precision and accuracy, frequency response, etc. are limited, and the modality is limited to sound.
  • Source integrity: The source is what it purports to be. For example, a particular microphone has expected and/or identified properties that should be reflected in the content it provides. Similarly, a person acting as someone else, an organization portraying itself as someone else, or a device that is not what it is identified as may produce accurate content and still not have source integrity.
  • Freedom from alteration: The content is what it was when originated, or in the case of content that has known changes, the content is free from other changes. Many cryptographic technologies are associated with integrity in the sense of freedom from unauthorized change and attribution to some presumed controlled content, but cryptography has serious limitations in integrity protection.
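As a minimal sketch of the freedom-from-alteration sense of integrity, a keyed hash (HMAC) detects any change to content after the tag was produced. Note the caveat above applies: this only detects unauthorized change relative to a key holder; it says nothing about source integrity or whether the content reflects reality. The key value here is illustrative only.

```python
import hmac
import hashlib

def tag(content: bytes, key: bytes) -> bytes:
    # An HMAC binds the content to the holder of the key; any change
    # to the content (or the tag) makes verification fail.
    return hmac.new(key, content, hashlib.sha256).digest()

def unaltered(content: bytes, key: bytes, expected: bytes) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(tag(content, key), expected)

key = b"shared-secret"  # illustrative only; real keys need proper management
t = tag(b"original content", key)
assert unaltered(b"original content", key, t)
assert not unaltered(b"altered content", key, t)
```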

Change control is a vital component of an effective integrity control scheme because it provides redundancy-based controls over changes to verify that they are reasonable, appropriate to the need, and that they operate correctly in the environment before the changes are deployed. Changes also have potentially recursive, complex, and indirect effects that lead to unintended consequences. For example, computer viruses use changes in software to cause transitive spread of the virus from program to program. This is an unintended but predictable consequence of combining general purpose function with transitive information flow and sharing.

Availability (A)

If information is not available in a timely fashion, its utility decreases, but may not completely disappear. Availability is typically expressed through mathematical formulas for the availability and reliability of a function when needed, commonly as a percentage of down time per unit time. For example, hours of system outage per year is used for some systems. Sometimes it is normalized for utility in the enterprise, such as the use of user outage hours per month. It can also be calculated based on mean time to failure (MTTF) and mean time to repair (MTTR) as MTTF/(MTTF+MTTR). Assuming that everything is properly accounted for, these are measurements after the fact; they are less useful for prediction, which is critical for design.
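The steady-state availability formula and its translation into outage hours per year can be sketched directly; the MTTF and MTTR figures below are made-up illustrations, not measurements.

```python
def availability(mttf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability A = MTTF / (MTTF + MTTR)."""
    return mttf_hours / (mttf_hours + mttr_hours)

def downtime_hours_per_year(a: float) -> float:
    # Unavailability (1 - A) scaled to hours in a non-leap year.
    return (1.0 - a) * 24 * 365

# Illustrative figures: fail on average every 999 hours, repair in 1 hour.
a = availability(mttf_hours=999.0, mttr_hours=1.0)   # A = 0.999
# 0.1% unavailability corresponds to about 8.76 outage hours per year.
```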

Confidentiality (C)

If confidentiality is lost, some content may become useless or even dangerous, but this is rare. In most cases the consequences are limited to potential liability. When classified information, trade secrets, or similar content is involved, consequences are higher. Confidentiality is usually controlled based on the clearance of the identity, certainty of the authentication of that identity, classification of the content, and need for the authorized purpose. The means of creating and operating this basis is often more easily attacked than the real-time protection in an operating system or application. Information flow controls are the only really effective way to limit the movement of information from place to place. All other techniques are leaky in one way or another and most can be defeated to great effect by any reasonably astute attacker. These controls are implemented at routers through network separation technologies (e.g., VLANs with quality of service controls to eliminate covert channels), in computer systems through access controls, in physical technologies by separation of systems and networks by distance and with shielding, and in applications through application-level access control.
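The clearance/classification/need basis described above can be sketched as a conjunction of checks. This is an assumption-laden illustration: the level names, numeric authentication strengths, and function signature are invented here, not drawn from any standard or from the source.

```python
# Illustrative classification lattice; names and ordering are assumptions.
LEVELS = {"public": 0, "proprietary": 1, "secret": 2}

def may_read(clearance: str, auth_strength: int,
             classification: str, need_to_know: bool,
             required_strength: int = 2) -> bool:
    # All factors must pass: the clearance dominates the classification,
    # the identity is authenticated with sufficient certainty, and there
    # is a demonstrated need for the authorized purpose.
    return (LEVELS[clearance] >= LEVELS[classification]
            and auth_strength >= required_strength
            and need_to_know)

assert may_read("secret", 2, "proprietary", True)
assert not may_read("proprietary", 2, "secret", True)   # insufficient clearance
assert not may_read("secret", 1, "proprietary", True)   # weak authentication
```

As the text notes, the machinery that assigns clearances and classifications is often a softer target than a check like this one.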

Use control (U)

If use control is lost, either content is not usable by those who are supposed to be able to use it, which corresponds to a loss of availability, or content is usable by those who should not be able to use it. This can lead to loss of integrity, availability, or confidentiality, depending on the specifics of the uses permitted. Use control generally associates authentication requirements with identified parties for authorized uses. The basic notion underlying use control is that identified individuals, or systems acting on their behalf, are granted appropriate use based on their identity and the demonstrated extent of authenticity of that identity. If the current level of authentication is inadequate to the need, additional authentication is required to meet the level required for the use. Use may be more permanently disabled via fail safe if warranted, for example by disabling system use for a period of time.
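The step-up behavior described above can be sketched as follows; the use names and numeric authentication levels are hypothetical examples introduced here for illustration.

```python
# Hypothetical mapping from uses to the authentication level they require.
REQUIRED_LEVEL = {"read": 1, "update": 2, "transfer_funds": 3}

def check_use(session_auth_level: int, use: str) -> str:
    """Permit the use if the session's demonstrated authentication level
    meets the requirement; otherwise demand step-up authentication."""
    need = REQUIRED_LEVEL[use]
    if session_auth_level >= need:
        return "permit"
    return f"step-up to level {need}"

assert check_use(1, "read") == "permit"
assert check_use(1, "transfer_funds") == "step-up to level 3"
```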

Accountability (T)

Loss of accountability reduces the certainty with which proper operation can be verified either now or in the future. Accountability is often considered in terms of attribution of actions to actors, the accurate identification and recording of the situation, and the association of the activity with the actor in the situation.
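A minimal accountability record ties the three elements above together: the actor, the action, and the recorded situation. The field names below are illustrative assumptions; a real audit trail would also need tamper-evident storage and reliable time.

```python
import time

def audit_record(actor: str, action: str, situation: dict) -> dict:
    """Associate an action with an actor in a recorded situation.
    Field names are illustrative, not from any standard."""
    return {
        "actor": actor,            # attribution of the action to an actor
        "action": action,          # what was done
        "situation": situation,    # the identified and recorded context
        "recorded_at": time.time(),
    }

rec = audit_record("operator-17", "config-change",
                   {"system": "db-03", "terminal": "console-2"})
```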

Transparency (R)

Loss of transparency reduces the trust others are likely to place in processes and the results they produce. Transparency is often considered in terms of openness about process, implementation, and history, allowing the truth of what happened, by whom, when, where, how, and why to be revealed, and allowing others to make their own judgments rather than trusting yours.

Custody (S)

Loss of custody implies loss of control: an inability to verify that what is being presented is what it purports to be and nothing else, and that others have not had access to or tampered with what is presented. It supports a general inability to be certain. Custody is often considered in terms of source, chain of events and possessions, and status over time.
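The chain-of-events-and-possessions view of custody can be sketched as a hash-chained log, where each entry commits to its predecessor so that tampering or reordering breaks verification. The entry fields are illustrative assumptions, and such a chain is only as trustworthy as the custody of the log itself.

```python
import hashlib
import json

def add_entry(chain: list, holder: str, event: str) -> None:
    # Each entry hashes over its own fields plus the previous entry's
    # hash, so any alteration breaks every subsequent link.
    prev = chain[-1]["hash"] if chain else "genesis"
    body = {"holder": holder, "event": event, "prev": prev}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain: list) -> bool:
    prev = "genesis"
    for e in chain:
        body = {k: e[k] for k in ("holder", "event", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True
```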

Associated with individual content at maximum granularity.

Control objectives can be associated at maximum granularity, for example, by granting bank customers access to only their own account information. However, maintaining this across the enterprise for all content is potentially very costly and complex to manage.

Associated with groups of content such as databases, files, or directories.

This approach is typically taken for systems management purposes and is aligned with how computers work as opposed to people or businesses.

Associated with applications.

This approach associates controls with applications, components of which may reside across locations, entities, infrastructures, systems, databases, and anything else. Applications tend to be logical groupings based on business functions.

Associated with systems.

This approach breaks down decisions by associating systems with content. It is particularly effective when a single system, or virtual system, is used for a certain class of content.

Associated with network zones and subzones.

This approach uses the zoning architecture to differentiate controls to provide common protective mechanisms associated with assuring the desired properties. It gains an economy of scale through commonality of mechanism.

Associated with business areas or customer sets.

This approach aligns protection objectives with businesses or customers. Alignment with businesses sometimes makes sense at a gross level, but usually only when a business is highly specialized and in a tight niche with common protection objectives for all relevant content. Customer sets are similar in that certain classes of customers may have very similar needs and therefore tight alignment of protection objectives.

Copyright(c) Fred Cohen, 1988-2015 - All Rights Reserved