A Framework for Deception
Draft Report


Table of Computer Deceptions

Attack Mechanism, Deception Mechanism, and Level
cable cuts: A cable is cut resulting in disrupted communications, usually requiring emergency response, and otherwise disrupting normal operations. Deception: Cable cuts simulate natural events and restrict access to information or force the use of alternative communications paths thus inhibiting or altering cognition.
Level: Hardware
fire: A fire occurs causing physical damage and permanent as well as temporary faults, requiring emergency response, and otherwise disrupting normal operations. Deception: Fire simulates natural events and restricts access to information or processing capabilities or forces the use of alternative capabilities thus inhibiting or slowing cognition.
Level: Hardware
flood: A flood occurs causing physical damage and permanent as well as temporary faults, requiring emergency response, and otherwise disrupting normal operations. Deception: Flood simulates natural events and restricts access to information or processing capabilities or forces the use of alternative capabilities thus inhibiting or slowing cognition.
Level: Hardware
earth movement: The Earth moves causing physical damage and permanent as well as temporary faults, requiring emergency response, and otherwise disrupting normal operations. Deception: Earth movement simulates natural events and restricts access to information or processing capabilities or forces the use of alternative capabilities thus inhibiting or slowing cognition.
Level: Hardware
environmental control loss: Environmental controls required to maintain proper operating conditions for equipment fail, causing disruption of services. Example causes include air conditioning failures, heating failures, temperature cycling, smoke, dust, vibration, corrosion, gases, fumes, and chemicals. Deception: Environmental control loss simulates natural events and restricts access to information or processing capabilities or forces the use of alternative capabilities thus inhibiting or slowing cognition.
Level: Hardware
system maintenance: System maintenance causes periods of time when systems operate differently than normal and may result in temporary or permanent inappropriate or unsafe configurations. Maintenance can also be exploited by attackers to create forgeries of sites being maintained, to exploit temporary openings in systems created by the maintenance process, or for other similar purposes. Maintenance can also accidentally introduce viruses, leave improper settings in place, or produce other similar accidental effects. Deception: Maintenance can be used to alter the cognitive function of the system, and during maintenance periods, cognitive function tends to be reduced.
Level: All
Trojan horses: Unintended components or operations are placed in hardware, firmware, software, or wetware causing unintended and/or inappropriate behavior. Examples include time bombs, use or condition bombs, flawed integrated circuits, additional components on boards, additional instructions in memory, operating system modifications, name-overloaded programs placed in an execution path, added or modified circuitry, mechanical components, false connectors, false panels, radios placed in network connectors, displays, wires, or other similar components. Deception: Trojan horses can be introduced to alter the cognitive function of the computer at all levels.
Level: All
fictitious people: Impersonations or false identities are used to bypass controls, manage perception, or create conditions amenable to attack. Examples include spies, impersonators, network personae, fictional callers, and many other false and misleading identity-based methods. Deception: Fictitious people allow modified access thus introducing the potential for internal modification of cognitive function, deceptive data and observation of internal cognitive function.
Level: All
resource availability manipulation: Resources are manipulated so as to make functions requiring those resources operate differently than normal. Examples include e-mail overflow used to disrupt system operation, [Cohen93] file handle consumption used to prevent audits from operating, [Cohen91] and overloading unobservable network paths to force communications to use observable paths. Deception: Altering resources impacts available resources for cognition and may prevent certain cognitive functions from acting, thus altering or disrupting the cognitive structure of the system temporarily.
Level: Hardware, OS
spoofing and masquerading: Creating false or misleading information in order to fool a person or system into granting access or information not normally available. Examples include operator spoofing to trick the operator into making an error or giving away a password, location spoofing to trick a person or system into believing a false location, login spoofing which creates a fictitious login screen to get users to provide identification and authentication information, email spoofing which forges email to generate desired results, and time spoofing which creates false impressions of relative or absolute time in order to gain advantage. Deception: This provides falsified inputs to the cognitive system which are interpreted as if they were legitimate unless special precautions are used to inhibit this effect.
Level: All
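
To make the login spoofing example concrete, the following minimal sketch (in C, with a hypothetical capture file name) mimics a login banner, records whatever credentials are typed, and then reports a failure so the victim simply retries at the genuine prompt and suspects nothing:

    #include <stdio.h>

    int main(void) {
        char user[64], pass[64];

        printf("login: ");                  /* mimic the genuine banner */
        fflush(stdout);
        if (scanf("%63s", user) != 1) return 1;
        printf("Password: ");
        fflush(stdout);
        if (scanf("%63s", pass) != 1) return 1;

        /* record the captured credentials (hypothetical drop file) */
        FILE *f = fopen("/tmp/.captured", "a");
        if (f) { fprintf(f, "%s:%s\n", user, pass); fclose(f); }

        /* report failure so the victim retries on the real login */
        printf("Login incorrect\n");
        return 0;
    }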
infrastructure interference: Interfering with infrastructure so as to disrupt services and/or redirect activities. Examples include creating an accident on a particular road at a particular place and time in order to cause a shipment to be rerouted through a checkpoint where components are changed, taking down electrical power in order to deny information services, modifying a domain name server on the Internet in order to alter the path through which information flows from point to point, and cutting a phone line in order to sever communications. Deception: This inhibits communication and thus alters observables and cognitive function.
Level: Hardware
insertion in transit: Insertion of information in transit so as to forge desired communications. Examples include adding transactions to a transaction sequence, insertion of routing information packets so as to reroute information flow, and insertion of shipping address information to replace an otherwise defaulted value. Deception: This provides falsified inputs to the cognitive system which are interpreted as if they were legitimate unless special precautions are used to inhibit this effect. Inserted components can also be used to alter cognitive structure and function.
Level: All
modification in transit: Modification of information in transit so as to modify communications as desired. Examples include removing end-of-session requests and providing suitable replies, then taking over the unterminated communications link, modification of an amount in an electronic funds transfer request, and rewriting Web pages so as to reroute subsequent traffic through the attacker's site. Deception: This provides falsified inputs to the cognitive system which are interpreted as if they were legitimate unless special precautions are used to inhibit this effect. Altered content may also be used to alter cognitive structure and function.
Level: All
sympathetic vibration: Creating or exploiting positive feedback loops or underdamped oscillatory behaviors so as to overload a system. Examples include electrical or acoustic wave enhancement, the creation of packets in the Internet which form infinite communications loops, and protocol errors causing cascade failures in telephone systems. Deception: This can disrupt cognitive function and reduce available cognitive resources.
Level: All
cascade failures: Design flaws in tightly coupled systems that cause error recovery procedures to induce further errors under select conditions. Examples include the electrical cascade failures in the U.S. power grid, [WSCC96] telephone system cascade failures causing widespread long distance service outages, [Pekarske90] and inter-system cascades such as power failures bringing down telephone switches required to bring back up power stations. Deception: Cascade failures result in massive disruption of cognitive function.
Level: All
invalid values on calls: Invalid values are used to cause unanticipated behavior. Examples include system calls with pointer values leading to unauthorized memory areas and requests for data from databases using system escape characters to cause interprocess communications to operate improperly. Deception: Cognitive system function can be avoided or altered.
Level: OS and above
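
A minimal sketch of an invalid value on a call, assuming a Unix-like system: the kernel is handed a buffer pointer outside the caller's authorized address space. A correct implementation rejects the pointer; the attacks described above exploit implementations that act on such values instead:

    #include <errno.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void) {
        /* ask the kernel to copy data from an address we do not own */
        ssize_t n = write(1, (const void *)0, 16);
        if (n < 0)
            printf("kernel refused the pointer: %s\n", strerror(errno));
        return 0;
    }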
undocumented or unknown function exploitation: Functions not included in the documentation or unknown to the system owners or operators are exploited to perform undesirable actions. Examples include back doors placed in systems to facilitate maintenance, undocumented system calls commonly inserted by vendors to enable special functions resulting in economic or other market advantages, and program sequences accessible in unusual ways as a result of improperly terminated conditionals. Deception: Cognitive system function can be avoided or altered.
Level: All
excess privilege exploitation: A program, device, or person is granted privileges not strictly required in order to perform their function and the excess privilege is exploited to gain further privilege or otherwise attack the system. Examples include Unix-based SetUID programs granted root access exploited to grant attackers unlimited access, access to unauthorized need-to-know information by a systems administrator granted too-flexible maintenance access to a network control switch, and user-programmable DMA devices reprogrammed to access normally unauthorized portions of memory. Deception: Cognitive system function can be avoided or altered and operative system level can be moved closer to the hardware level.
Level: Application
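
A minimal sketch of excess privilege, assuming a hypothetical SetUID-root utility intended only to display a user's quota file: because it carries full root privilege rather than just read access to that one file, the unchecked argument lets any user read any file on the system:

    #include <stdio.h>

    /* hypothetical: installed SetUID root as /usr/local/bin/showquota */
    int main(int argc, char *argv[]) {
        if (argc != 2) return 1;
        /* intended use:  showquota /var/quota/smith
           abusive use:   showquota /etc/shadow      */
        FILE *f = fopen(argv[1], "r");   /* opens as root, no check */
        if (!f) return 1;
        for (int c; (c = getc(f)) != EOF; )
            putchar(c);
        fclose(f);
        return 0;
    }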
environment corruption: The computing environment upon which programs or people depend for proper operation is corrupted so as to cause those other programs to operate incorrectly. Examples include manipulating the Unix IFS environment variable so as to cause command interpretation to operate unusually, altering the PATH (or similar) variable in multi-user systems to cause unintended programs to be used, and manipulation of a paper form so as to change its function without alerting the person filling it out. In the physical domain, this includes the introduction of gases, dust, or other particles, chemicals, or elements into the physical environment. In the electromagnetic realm, it includes waveforms. In the human sense, sound, smell, feel, and other sensory input corruption is included. Deception: The context for cognitive function can be altered thus altering function or interpretation of content.
Level: All
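
The PATH manipulation case can be sketched in a few lines. Assuming a privileged program that invokes a command by bare name, any caller who controls the environment controls which program actually runs:

    #include <stdlib.h>

    int main(void) {
        /* "ls" is resolved through the caller-supplied PATH; a caller
           who runs this program with PATH=/tmp/evil:$PATH causes
           /tmp/evil/ls to execute with this program's privileges
           instead of /bin/ls */
        return system("ls");
    }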
device access exploitation: Access to a device is exploited to alter its function or cause its function to be used in unanticipated ways. Examples include removing shielding from a wire so as to cause more easily received electromagnetic emanations, reprogramming a bus device to deny services at a hardware level, and altering microcode so as to associate attacker-defined hardware functions with otherwise unused operation codes. Deception: Physical level access granted to devices can provide arbitrary modification of cognitive function or content.
Level: Hardware, Driver
modeling mismatches: Mismatches between models and the realities they are intended to model cause the models to break down in ways exploitable by attackers. Examples include use of the Bell-LaPadula model of security [Bell73] as a basis for designing secure operating systems - thus leaving disruption uncovered, modeling attacks and defenses as if they were statistically independent phenomena for risk analysis - thus ignoring synergistic effects, and modeling misconfigurations as mis-set protection bits - when the content of configuration files remains uncovered. Deception: The cognitive system is unable to properly interpret information and makes judgments based on irrelevant rules.
Level: Application and above
simultaneous access exploitations: Two or more simultaneous or split multi-part access attempts are made, resulting in an improper decision or loss of audit information. Examples include the use of large numbers of access attempts over a short period of time so as to cause grant/refuse decision software to act in a previously unanticipated and untested fashion, the execution of sequences of operations required for system takeover by multiple user identities, and the holding of a resource required for some other function to proceed so as to deny completion of that service. Deception: Cognitive system may act on the wrong data, misinterpret data, fail to operate, or collapse entirely.
Level: All
implied trust exploitation: Programs operating in a shared environment inappropriately trust the information supplied to them by untrustworthy programs. Examples include forged data from Domain Name Servers in the Internet used to reroute information through attackers, forged replies from authentication daemons causing untrusted software to be run by access control software, forged Network Information Service packets causing wrong password entries to be used in authenticating attackers, and network-based administration programs that can be fooled into forwarding incorrect administrative controls. Deception: Cognitive decisions based on trusted sources fail because the sources are not trustworthy.
Level: All
interrupt sequence mishandling: Unanticipated or incorrectly handled interrupt sequences cause system operation to be altered unpredictably. Examples include stack frame errors induced by incorrect interrupt handling, the incorrect swapping out of the swapping daemon on unanticipated conditions, and denial of services resulting from improper prioritization of interrupts. Deception: Cognitive system may act on the wrong data, misinterpret data, fail to operate, or collapse entirely.
Level: Driver, OS
emergency procedure exploitation: An emergency condition is induced resulting in behavioral changes that reduce or alter protection to the advantage of the attacker. Examples include fires, during which access restrictions are often changed or less rigorously enforced, power failures during which many automated alarm and control systems fail in a safe mode with respect to some - possibly exploitable - criteria, and computer incident response during which systems administrators commonly deviate - perhaps exploitably - from their normal behavioral patterns. Deception: Cognitive system operating in an emergency response mode ignores some things it would otherwise attend to and attends to things it would otherwise ignore. This alters the observables and processing resources available for different cognitive functions.
Level: All
desynchronization and time-based attacks: Systems that depend on synchronization are desynchronized causing them to fail or operate improperly. Examples include DCE servers that may deny services network-wide when caused to become desynchronized beyond some threshold, cryptographic systems which, once desynchronized, may take a substantial amount of time to resynchronize, automated software and systems maintenance tools which may make complex decisions based on slight time differences, and time-based locks which may be caused to open or close at the wrong times. Deception: Cognitive system may act on the wrong data, misinterpret data, fail to operate, or collapse entirely. Specific failures to make proper time associations or inability to properly detect sequences may significantly alter data interpretation.
Level: All
imperfect daemon exploits: Daemon programs designed to provide privileged services upon request have imperfections that are exploited to provide privileges to the attacker. Examples include Web, Gopher, Sendmail, FTP, TFTP, and other server daemons exploited to gain access to the server from over a network, internal use only daemons such as the Unix cron facility exploited to gain root privileges by otherwise unprivileged users, and automated backup and recovery daemons exploited to overwrite current versions of programs with previous - more vulnerable - versions. Deception: Cognitive system function can be avoided or altered and operative system level can be moved closer to the hardware level.
Level: Library, Application
multiple error inducement: The introduction of multiple errors is used to cause otherwise reliable software to fail in unanticipated ways. Examples include the creation of an input syntax error with a previously locked error-log file resulting in an inconsistent data state, the premature termination of a communications protocol during an error recovery process - possibly causing a cascade failure, and the introduction of simultaneous interleaved attack sequences causing normal detection methods to fail. [Hecht93] [Thyfault92] Deception: The cognitive system starts to break down, losing the ability to properly fuse content, making stored data interpretation questionable, and possibly inducing incorrect memory and state changes.
Level: All
viruses: Programs that reproduce and possibly evolve. Examples include the 11,000 or so known viruses, custom file viruses designed to act against specific targets, and process viruses that cause denial of service or thrashing within a single system. Deception: Viruses can be introduced to alter the cognitive function of the computer at all levels and through reproduction, overwhelm cognitive systems and spread like a disease throughout the cognitive system at all levels.
Level: All
data diddling: Modification of data through unauthorized means. Examples include non-database manipulation of database files accessible to all users, modification of configuration files used to set up further machines, and modification of data residing in temporary files such as intermediate files created during compilation by most compilers. Deception: Content alteration results in cognition based on incorrect observables or stored memories.
Level: OS and above
electronic interference: Jamming signals are introduced to cause failures in electronic communications systems. Examples include the patented method and apparatus for altering a region in the Earth's atmosphere, ionosphere, and/or magnetosphere, and common radio jamming techniques. Deception: Arbitrary disruption of cognitive function.
Level: Hardware
repair-replace-remove information: Repair processes are exploited to extract, modify, or destroy information. Examples include computer repair shops copying information and reselling it and maintenance people introducing computer viruses. Deception: Controlled alteration of all levels of cognitive function and stored content.
Level: All
wire closet attacks: Breaking into the wire closet and altering the physical or logical network so as to grant, deny, or alter access. Examples include wire tapping techniques, malicious destruction of wiring causing service disruption, and the introduction of video tape players into surveillance channels to hide physical access. Deception: Alteration of mapping between reality and perception. This also facilitates insertion of content.
Level: Hardware
process bypassing: Bypassing a normal process in order to gain advantage. Examples include retail returns department employees entering false return data in order to generate refund checks, use of computer networks to generate additional checks after the legitimate checks have passed the last integrity checks, and altering pricing records to reflect false inventory levels to cover up thefts. Deception: Select cognitive functions are bypassed.
Level: All
content-based attacks: The content sent to an interpretive mechanism causes that mechanism to act inappropriately. Examples include Web-based URLs that bypass firewalls by causing the browser within the firewall to launch attacks against other inside systems, macros written in spreadsheet or word processing languages that cause those programs to perform malicious acts, and compressed archives that contain files with name clashes causing key system files to be overwritten when the archive is decompressed. Deception: Content fed to the cognitive system in the form of observables are controlled by the attacker.
Level: Library and above
restoration process corruption or misuse: The process used to restore information from backup tapes is corrupted or misused to the attacker's advantage. Examples include the creation of fake backups containing false information, alteration of tape head alignments so that restoration fails, and the use of privileged restoration programs to grant privilege by restoring protection settings or ownerships to the wrong information. Deception: Arbitrary alteration of cognitive function or memory starting at restoration time. Inability to properly recover from other system faults.
Level: Library and above
hangup hooking: Activity termination protocols fail or are interrupted so that termination does not complete properly and the protocol is taken over by the attacker. Examples include modem hangup failures leaving logged-in terminal sessions open to abuse, interrupted telnet sessions taken over by attackers, preventing proper protocol completion as in the Internet SYN attacks so as to deny subsequent services, and refusing to completely disconnect from a call-back modem at the central office (CO), causing the call-back mechanism to become ineffective. Deception: Inability to sever observables properly thus forcing undesired observables into the cognitive system.
Level: Hardware, Library, Driver, OS
call forwarding fakery: Call forwarding capabilities are abused. Examples include the use of computer controlled call forwarding to forward calls from call-back modems so that attackers get the call-backs, forwarding calls to illegitimate locations so as to intercept communications and provide false or misleading information, and the use of programmable call forwarding to cause long distance calls to be billed to the forwarding party's account. Deception: Misassociation of data sources.
Level: Hardware
input overflow: Excessive input is used to overrun input buffers, thus overwriting program or data storage so as to grant the attacker undesired access. Examples include sendmail overflows resulting in unlimited system access from attackers over the Internet, Web server overflows granting Internet attackers unlimited access to Web servers, buffer overruns in privileged programs allowing users to gain privilege, and excessive input used to overrun input buffers causing loss of critical data so as to deny services or disrupt operations. Deception: Alteration of the cognitive function and memory at all levels.
Level: All
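
The classic form of the flaw fits in a few lines of C. The sketch below reads input with no bounds check, so anything past the buffer's end overwrites adjacent stack storage, including the saved return address:

    #include <stdio.h>

    int main(void) {
        char buf[16];

        /* gets() imposes no length limit: input longer than 15 bytes
           overruns buf; the bounded replacement is
           fgets(buf, sizeof buf, stdin) */
        gets(buf);
        printf("hello, %s\n", buf);
        return 0;
    }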
illegal value insertion: Values not permitted by the specification but allowed to pass the implementation are used to cause abnormal results. Examples include negative dates producing negative interest which accrues to the benefit of the attacker, cash withdrawal values which overflow signed integers in balance adjustment causing large withdrawals to appear as large deposits, and pointer values sent to system calls that point to areas outside of authorized address space for the calling party. Deception: The cognitive system sees things it is not prepared to deal with resulting in potentially bizarre behavior and alteration of cognitive function and memory.
Level: All
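
The overflow-in-balance-adjustment case can be illustrated as follows (a minimal sketch assuming a 32-bit int and a 64-bit long; the amounts are hypothetical). The attacker-supplied value passes the implementation's sign check only because conversion wraps it negative:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int balance = 100000;                 /* account balance in cents */

        /* the attacker enters 4294917296; converted to a 32-bit int
           it wraps to -50000, a value the specification never permits */
        long raw = strtol("4294917296", NULL, 10);
        int request = (int)raw;

        if (balance - request >= 0)           /* the naive check passes */
            balance -= request;               /* the "withdrawal" deposits */

        printf("balance: %d\n", balance);     /* prints 150000 */
        return 0;
    }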
privileged program misuse: Programs with privilege are misused so as to provide unauthorized privileged functions. Examples include the use of a backup restoration program by an operator to intentionally restore the wrong information, misuse of an automated script processing facility by forcing it to make illicit copies of legitimate records, and the use of configuration management tools to create vulnerabilities. Deception: Exploitation of trusted cognitive processes to inappropriate ends.
Level: Application, OS, Driver
error-induced misoperation: Errors caused by the attacker induce incorrect operations. Examples include the creation of a faulty network connection to deny network services, the intentional introduction of incorrect data resulting in incorrect output (i.e., garbage in - garbage out), and the use of a scratched and bent diskette in a disk drive to cause the drive to permanently fail. Deception: Disruption of some elements of cognitive function or memory is exploited to cause systemic breakdowns in select areas.
Level: All
audit suppression: Audit trails are prevented from operating properly. Examples include overloading audit mechanisms with irrelevant data so as to prevent proper recording of malicious behavior, network packet corruption to prevent network-based audit trails from being properly recorded, and consuming some resource critical to the auditing process so as to prevent audit records from being generated or kept. Deception: Inability of the cognitive system to recall elements of what took place.
Level: All
induced stress failures: Stresses induced on a system cause it to fail. Examples include paging monsters that result in excessive paging and reduced performance, process viruses that consume various system resources, and large numbers of network packets per unit time which tie up systems by forcing excessive high-priority network interrupt processing. Deception: Overloading of the cognitive system causing cognitive dissonance, mishandling of sequential cognitive functions, and sporadic loss of memory.
Level: All
false updates: Causing illegitimate updates to be made. Examples include sending a forged update disk containing attack code to a victim, interrupting the normal distribution channel and introducing an intentionally flawed distribution tape to be delivered, and substituting a false update disk for a real one at the vendor or customer site. Deception: Alteration of cognitive function and memory at all levels.
Level: All
network service and protocol attacks: Characteristics of network services are exploited by the attacker. Examples include the creation of infinite protocol loops which result in denial of services (e.g., echo packets under IP), the use of information packets under the Network News Transfer Protocol to map out a remote site, and use of the Source Quench protocol element to reduce traffic rates through select network paths. Deception: Low-level cognitive disruption causing actions at low levels that never reach higher cognitive functions for evaluation.
Level: Hardware, Driver, Protocol
distributed coordinated attacks: A set of attackers use a set of vulnerable intermediary systems to attack a set of victims. Examples include a Web-based attack causing thousands of browsers used by users at sites all around the world to attack a single victim site, a set of simultaneous attacks by a coordinated group of attackers to try to overwhelm defenses, and an attack where thousands of intermediaries were fooled into trying to gain access to a victim site. Deception: Massive overload of observables disrupting or dazzling observations and overwhelming cognitive system, possibly resulting in overall cognitive breakdown or cognitive overload.
Level: All
man-in-the-middle: The attacker positions forces between two communicating parties and both intercepts and relays information between the parties so that each believes they are talking directly to the other when, in fact, both are communicating through the attacker. Examples include attacks on public key cryptosystems permitting a man-in-the-middle to fool both parties, attacks wherein an attacker takes over an ongoing telecommunications session when one party decides to terminate it, and attacks wherein an attacker inserts transactions and prevents responses to those transactions from reaching the legitimate user. Deception: Cognitive decisions based on trusted sources fail because the sources are not trustworthy.
Level: Hardware, Protocol
replay attacks: Communicated information is replayed and causes unanticipated side effects. Examples include the replay of encrypted funds transfer transmissions so as to cause multiples of an original sum of money to be transferred, replay of coded messages causing the repeated movement of troops, replay of transaction sequences that simulate behavior so as to cover up actual behavior, and the delayed replay of events such as races so as to deceive a victim. Deception: Previously used cognitive sequences are regenerated to induce predictable compliance effect.
Level: Protocol, Application and above
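
A minimal sketch of why replay works, with a hypothetical transfer format and a stand-in authenticator: the receiver checks only that the message is genuine, never that it is fresh, so recorded bytes can be resubmitted verbatim. The standard countermeasures are sequence numbers, timestamps, or challenge nonces bound into the authenticator:

    #include <stdio.h>

    struct transfer { int from, to, amount; unsigned mac; };

    /* stand-in for a real message authentication code */
    static unsigned mac_of(const struct transfer *t) {
        return (unsigned)t->from * 31u + (unsigned)t->to * 17u
             + (unsigned)t->amount;
    }

    /* flaw: authenticity is checked, freshness is not */
    static int accepted(const struct transfer *t) {
        return mac_of(t) == t->mac;
    }

    int main(void) {
        struct transfer t = {1001, 2002, 50000, 0};
        t.mac = mac_of(&t);              /* legitimately authorized once */

        for (int i = 0; i < 3; i++)      /* attacker replays the same bytes */
            if (accepted(&t))
                printf("transfer of %d accepted (copy %d)\n", t.amount, i + 1);
        return 0;
    }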
error insertion and analysis: Errors are induced into systems to reveal values stored in those systems. Examples include recent demonstrations of methods for inducing errors so as to reveal keys stored in smart-cards and other similar key-transportation devices, the introduction of multiple errors into redundant systems so as to cause the redundancy to fail, and the introduction of errors designed to cause systems to no longer be used in critical applications. Deception: Cognitive function is indirectly left open to examination for analysis and subsequent exploitation.
Level: All
reflexive control: Reflexive reactions are exploited by the attacker to induce desired behaviors. Examples include the creation of attacks that appear to come from a friend so as to cause automated response systems to shut down friendly communication, induction of select flaws into the power grid so as to cause SCADA systems to reroute power to the financial advantage of select suppliers, and the use of forged or interrupted signals so as to cause friendly fire incidents. Deception: Known cognitive functions are exploited by induction and inhibition of observables to induce compliant behavior.
Level: All
dependency analysis and exploitation: Interdependencies of systems and components are analyzed so as to determine indirect effects and attack weak points upon which strong points depend. Examples include attacking medical information systems in order to disrupt armed forces deployments, attacking the supply chain in order to corrupt information within an organization, and attacking power grid elements in order to disrupt financial systems. Deception: Relationships are examined and analyzed to find sequences of induced or inhibited observables resulting in compliant behavior.
Level: All
interprocess communication attacks: Interprocess communications channels are attacked in order to subvert normal functioning. Examples include the introduction of false interprocess signals in a network interprocess communications protocol causing misbehavior of trusted programs, the disruption of interprocess communications by resource exhaustion so as to prevent proper checking or reduce or eliminate functionality, and observation of interprocess communications stored in shared temporary data files so as to gain unauthorized information. Deception: Internal cognitive communication is altered to modify the cognitive process.
Level: OS, Library, Protocol, Application
below-threshold attacks: Attack detection based on thresholds of activity that differentiate between attacks and similar non-malicious behaviors is exploited by launching attacks that operate below the detection threshold. Examples include breadth-first password guessing attacks, breadth-first port scanning attacks, and low bandwidth covert channel exploitations. Deception: Signals below observable levels to some level of the cognitive system are used so as to achieve compliant behavior while avoiding higher level cognitive processing.
Level: All
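
A minimal sketch of breadth-first password guessing, with hypothetical account and guess lists and a stand-in try_login(): because the outer loop runs over guesses rather than accounts, each account accumulates only one failure per pass and never trips a per-account lockout or alarm threshold:

    #include <stdio.h>

    /* stand-in for a real authentication attempt */
    static int try_login(const char *user, const char *pass) {
        (void)user; (void)pass;
        return 0;
    }

    int main(void) {
        const char *users[]   = {"alice", "bob", "carol"};
        const char *guesses[] = {"password", "letmein"};

        /* breadth-first: each account sees one failure per pass, so a
           per-account threshold of, say, three is never reached */
        for (int g = 0; g < 2; g++)
            for (int u = 0; u < 3; u++)
                if (try_login(users[u], guesses[g]))
                    printf("hit: %s:%s\n", users[u], guesses[g]);
        return 0;
    }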
peer relationship exploitation: The transitive trust relationships created by peer-networking are exploited so as to expand privileges to the transitive closure of peer trust. Examples include the activities carried out by the Morris Internet virus in 1988, the exploitation of remote hosts (.rhosts) files in many networks, and the exploitation of remote software distribution channels as a channel for attack. Deception: Trust associated with specific peers is exploited by falsification so that untrustworthy content is used as if it were trusted.
Level: Protocol, Application, and above
piggybacking: Exploiting a (usually false) association to gain advantage. Examples include walking into a secure facility with a group of other people as one of the crowd, acting like an ex-policeman to gain intelligence about ongoing police activities, and adding a floppy disk to a series of floppy disks delivered as part of a normal update process. Deception: False association is exploited to subvert normal cognitive processing inhibitions.
Level: All
collaborative misuse: Collaboration of several parties or identities in order to misuse a system. Examples include creation of a false identity by one party and entry of that identity into a computer database by a second party, provision of attack software by an outsider to an insider who is participating in an information theft, partitioning of elements of an attack into multiple parts for coordinated execution so as to conceal the fact of or source of an attack, and the provision of alibis by one party to another when they collaborated in a crime. Deception: Cognitive systems designed to base decisions on social conditions or preponderance of evidence are exploited by providing adequate content to support compliant decisions.
Level: All
race conditions: Interdependent sequences of events are interrupted by other sequences of events that destroy critical dependencies. Examples include the change of conditions tested in one step and depended upon for the next step (e.g., checking for the existence of a file before creating it interrupted by the creation of a file of the same name by another owner), changes between one step in a process and another step assuming that no such change has been made (e.g., the replacement of a mounted file system previously loaded with data in a start-up process), and waiting for non-locked resources available in one step but not in the next (e.g., the mounting of a different tape between an initial read-through and a subsequent restoration). Deception: Timing limitations of the cognitive system are exploited to induce compliant behavior.
Level: All
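
The check-then-use case is worth a concrete sketch (file names hypothetical): the access() check and the open() use are separate operations, and an attacker who wins the race between them redirects the privileged write:

    #include <fcntl.h>
    #include <unistd.h>

    int main(void) {
        const char *path = "/tmp/report";

        if (access(path, W_OK) == 0) {       /* time of check */
            /* window: before the open below, an attacker replaces
               /tmp/report with a symbolic link to a protected file */
            int fd = open(path, O_WRONLY);   /* time of use */
            if (fd >= 0) {
                write(fd, "data\n", 5);
                close(fd);
            }
        }
        return 0;
    }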
kiting: Inherent delays are exploited by creating a ring of events that chase each others' tails, thus creating the dynamic illusion that things are different than the static case would support. Examples include check kiting schemes where delays in processing checks cause temporary conditions where the sum of the balances indicated in a set of accounts is far greater than the total amount of money actually invested, techniques for avoiding payments of debts for a long time based on legally imposed delays in and rules regarding the collection of debts by third parties, and the use of revoked keys in key management systems without adequate revocation protocols. Deception: Cognitive system timing limitations are exploited by a system that can operate at a time frame within the decision cycle of the target.
Level: Application and above
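
The arithmetic behind check kiting is simple enough to work through (amounts hypothetical). During the clearing delay, the receiving bank credits a check before the paying bank debits it, so the sum of the reported balances briefly exceeds the real money:

    #include <stdio.h>

    int main(void) {
        long a = 100, b = 0;       /* real money in the system: 100 */
        long check = 100;          /* A writes B a check for 100 */

        long b_shown = b + check;  /* B's bank credits immediately */
        long a_shown = a;          /* A's bank has not yet debited */

        printf("real total: %ld, apparent total: %ld\n",
               a + b, a_shown + b_shown);   /* 100 versus 200 */
        return 0;
    }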
salami attacks: Many small transactions are used together for a larger aggregated effect. Examples include taking round-off error amounts from financial interest computations and adding them to the thief's account balance (resulting in no net loss to the system), the slow leakage of information through covert channels at rates below normal detection thresholds, and economic intelligence gathering efforts involving the aggregation of small amounts of information from many sources to derive an overall picture of an organization. Deception: Content that the cognitive system ignores, used in volumes below the level the cognitive system can meaningfully fuse, is aggregated into a larger overall result than the system would otherwise allow to go unnoticed.
Level: Application and above
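
The round-off variant reduces to a few lines (a minimal sketch with hypothetical balances and rate): each account loses only a fraction of a cent to truncation, the books still balance to the cent, but the diverted fractions accumulate:

    #include <stdio.h>

    int main(void) {
        long balances[] = {123457, 987654, 500003};  /* cents */
        double rate = 0.05 / 12.0;                   /* monthly interest */
        double pool = 0.0;                           /* thief's fractions */

        for (int i = 0; i < 3; i++) {
            double interest = balances[i] * rate;    /* exact interest */
            long credited = (long)interest;          /* truncated to cents */
            balances[i] += credited;
            pool += interest - credited;             /* sub-cent remainder */
        }
        printf("cents skimmed this cycle: %ld\n", (long)pool);
        return 0;
    }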
repudiation: A transaction or other operation is repudiated by the party recorded as initiating it. Examples include repudiating a stock trade, claiming your account was broken into and that you didn't do it, and asserting that an electronic funds transfer was not done. Deception: The cognitive system is unable to validate the information, lacks the time or resources to validate it properly, and thus makes a premature decision which is exploited.
Level: Application and above