Glossary

Orange Book Glossary, E to O
Orange Book Glossary, P to Z

Borrowed and adapted from Computer Security Basics, by Deborah Russell and G. T. Gangemi Sr.

access
The ability of a subject to view, change, or communicate with an object in a computer system. Typically, access involves a flow of information between the subject and the object (for example, a user reads a file, a program creates a directory).

access control
Restrictions on the ability of a subject (e.g., a user) to use a system or an object (e.g., a file) in that system. Such controls limit access to authorized users only. Access control mechanisms may include hardware or software features, operating procedures, management procedures, or any combination of these.

access control list (ACL)
For a particular object, a list of the subjects authorized to access that object. The list usually indicates what type of access is allowed for each user. Typical types of access may include read, write, execute, append, modify, delete, and create.
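
The structure described above can be sketched as a mapping from objects to authorized subjects and their allowed access types (all names hypothetical, a minimal illustration rather than any particular system's ACL format):

```python
# Hypothetical ACL: for each object, the subjects authorized to access it
# and the type of access each is allowed.
acl = {
    "payroll.txt": {
        "alice": {"read", "write"},
        "bob": {"read"},
    },
}

def is_allowed(subject, obj, access_type):
    """True if the ACL grants `subject` the given type of access to `obj`."""
    return access_type in acl.get(obj, {}).get(subject, set())
```

Absence of an entry means no access: a subject not on the list for an object is denied every access type by default.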

accountability
A security principle stating that individuals must be uniquely identifiable. With accountability, violations or attempted violations of system security can be traced to individuals, who can then be held responsible for their actions.

accreditation
Official authorization and approval, granted to a computer system or network, to process sensitive data in a particular operational environment. Accreditation is performed by specific technical personnel after a security evaluation of the system's hardware, software, configuration, and security controls.

accuracy
A security principle that keeps information from being modified or otherwise corrupted either maliciously or accidentally. Accuracy protects against forgery or tampering. Synonymous with integrity.

acquisition
One of the stated purposes of the Orange Book evaluation criteria: to provide a basis for specifying security requirements in acquisition specifications. The Orange Book gives purchasers a clear way of specifying a coordinated set of security functions.

active threat
A type of threat that involves the alteration, not simply the interception, of information. For example, an active tap is a type of wiretapping that accesses and compromises data, usually by generating false messages or control signals, or by altering communications between legitimate users. The primary danger of an active threat is to the authenticity of the information being transmitted. Contrast with passive threat.

administrative security
Management rules and procedures that result in protection of a computer system and its data. Sometimes called procedural security.

assurance
A measure of confidence that a system's security features have been implemented and work properly. Assurance is one of the primary issues addressed by the Orange Book.

attack
An attempt to bypass security controls on a system. An active attack alters data. A passive attack releases data. Whether or not an attack will succeed depends on the vulnerability of the system and the effectiveness of existing countermeasures.

audit
To independently record system activity (e.g., logins and logouts, file accesses, security violations) and examine it later. An audited activity is a security-related activity that relates to a subject's access of an object. In audit terms, such activities are often called events, and auditing itself is sometimes called event logging.

audit trail
The chronological set of records that provides evidence of system activity. These records can be used to reconstruct, review, and examine transactions from inception to output of final results. The records can also be used to track system usage and detect and identify intruders.
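
The chronological record-keeping described above can be sketched as a minimal append-only event log (field names and events are illustrative, not from any standard):

```python
import datetime

audit_trail = []  # chronological, append-only set of records

def log_event(subject, action, obj):
    # Record who did what to which object, and when, so that activity
    # can later be reconstructed in chronological order.
    audit_trail.append({
        "time": datetime.datetime.now(datetime.timezone.utc),
        "subject": subject,
        "action": action,
        "object": obj,
    })

log_event("alice", "login", "system")
log_event("alice", "read", "payroll.txt")
```

A real audit trail would also protect the records themselves against modification, since an intruder's first target is often the log that would reveal the intrusion.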

authenticate
To validate or prove the claimed identity of someone or something.

authentication
The process of proving that a subject (e.g., a user or a system) is what the subject claims to be. Authentication is a measure used to verify the eligibility of a subject and the ability of that subject to access certain information. It protects against the fraudulent use of a system or the fraudulent transmission of information. There are three classic ways to authenticate oneself: something you know, something you have, and something you are. See also identification.

authenticity
A security principle that ensures a message is received in exactly the form in which it was sent. See also message authentication and message authentication code.

authorized
See authorization.

authorization
The granting of rights to a user, a program, or a process. For example, certain users may be authorized to access certain files in a system, whereas only the system administrator may be authorized to export data from a trusted system.

authorized user
A user that has been given authorization and is permitted access to certain information in a system.

automatic data processing (ADP)
The automated, self-governing handling and storage of data by computer equipment; a general term for computer-based data processing.

availability
A security principle that ensures the ability of a system to keep working efficiently and to keep information accessible. Contrast with denial of service.

back door
See trap door.

bandwidth
A characteristic of a communication channel. The amount of information that can pass through the channel in a given amount of time. See covert channel.

Bell-LaPadula model
The computer security policy model on which the Orange Book requirements are based. From the Orange Book definition: "A formal state transition model of computer security policy that describes a set of access control rules. In this formal model, the entities in a computer system are divided into abstract sets of subjects and objects. The notion of a secure state is defined and it is proven that each state transition preserves security by moving from secure state to secure state; thus, inductively proving that the system is secure. A system state is defined to be "secure" if the only permitted access modes of subjects to objects are in accordance with a specific security policy. In order to determine whether or not a specific access mode is allowed, the clearance of a subject is compared to the classification of the object and a determination is made as to whether the subject is authorized for the specific access mode."
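
The two best-known Bell-LaPadula access rules can be sketched as follows, a toy illustration using linear classification levels only (category sets omitted; the level names follow the military example used elsewhere in this glossary):

```python
# Toy Bell-LaPadula check with linear classification levels only.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def may_read(subject_level, object_level):
    # Simple security property: no read up.
    return LEVELS[subject_level] >= LEVELS[object_level]

def may_write(subject_level, object_level):
    # Star (*-) property: no write down.
    return LEVELS[subject_level] <= LEVELS[object_level]
```

The model proves that if every state transition obeys rules like these, a system that starts in a secure state stays in one.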

Biba model
An integrity model of computer security policy that describes a set of rules. In this model, a subject may not depend on any object or other subject that is less trusted than itself.

bomb
A type of programmed threat similar to a trojan horse. It is used to release a virus, a worm, or some other system attack. Usually planted by a system developer or a programmer. A bomb works by triggering some kind of unauthorized action when a particular date, time, or condition occurs. Also called a logic bomb.

capability
In capability-based systems, a token that identifies an object (e.g., a file) and specifies the access rights of the subject (e.g., the user) who possesses the capability (sometimes called a ticket).

category
An item in the nonhierarchical portion (the category set) of a sensitivity label. (The hierarchical portion is called the classification.) A category represents a distinct area of information in a system. When included in a sensitivity label in a system supporting mandatory access controls, it is used to limit access to those who need to know information in this particular category. Synonymous with compartment.

certification
The technical evaluation performed as part of, and in support of, the accreditation process that establishes the extent to which a particular computer system or network design and implementation meets a prespecified set of security requirements.

challenge-response
A type of authentication in which a user responds correctly (usually by performing some calculation) to a challenge (usually a numeric, unpredictable one).
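
The "calculation" can take many forms; a minimal sketch using an HMAC over a shared secret (one common construction, with illustrative names; not the only way to build challenge-response):

```python
import hashlib
import hmac
import secrets

def make_challenge():
    # An unpredictable challenge, as the definition requires.
    return secrets.token_bytes(16)

def respond(shared_secret, challenge):
    # The calculation the user performs: keyed hash of the challenge.
    return hmac.new(shared_secret, challenge, hashlib.sha256).digest()

def verify(shared_secret, challenge, response):
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(respond(shared_secret, challenge), response)
```

Because the challenge is fresh and unpredictable, a recorded response cannot be replayed against a later challenge.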

channel
A path used for information transfer within a system.

checksum
Numbers summed according to a particular set of rules and used to verify that transmitted data has not been modified during transmission.
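
A minimal illustration of the idea (an additive checksum; real protocols use stronger functions such as CRCs or cryptographic hashes, which catch more kinds of modification):

```python
def checksum(data: bytes) -> int:
    # Sum the bytes according to a fixed rule (here: modulo 256).
    return sum(data) % 256

def verify(data: bytes, expected: int) -> bool:
    # Recompute on receipt and compare with the transmitted value.
    return checksum(data) == expected
```

The sender transmits the data together with its checksum; the receiver recomputes it and rejects the data if the two values disagree.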

cipher
See ciphertext.

ciphertext
In cryptography, the unintelligible text that results from encrypting original text. Sometimes called "codetext", "cryptotext", or "cipher."

Clark-Wilson model
An integrity model for computer security policy designed for a commercial environment. It addresses such concepts as nondiscretionary access control, privilege separation, and least privilege.

classification
The hierarchical portion of a sensitivity label. (The nonhierarchical portion is called the "category set" or the "compartments.") A classification is a single level in a stratified set of levels. For example, in a military environment, each of the levels UNCLASSIFIED, CONFIDENTIAL, SECRET, and TOP SECRET is more trusted than the level beneath it. When included in a sensitivity label in a system supporting mandatory access controls, a classification is used to limit access to those cleared at that level.

clearance
A representation of the sensitivity level (the classification and the categories) associated with a user in a system supporting mandatory access controls. A user with a particular clearance can typically access only information with a sensitivity label equal to or lower than the user's clearance.

closed security environment
An environment in which both of the following conditions are true:

1. Application developers have sufficient clearances and authorizations to provide an acceptable presumption that they have not introduced malicious logic.

2. Configuration control provides sufficient assurance that applications and equipment are protected against the introduction of malicious logic prior to and during the operation of system applications.

communications channel
A path used for information transfer external to a system.

communications security
Protection of information while it's being transmitted, particularly via telecommunications. A particular focus of communications security is message authenticity.

compartment
See category.

compartmentalization
1. The isolation of the operating system, user programs, and data files from one another in a computer system to provide protection against unauthorized access by other users or programs.

2. The breaking down of sensitive data into small, isolated blocks to reduce the risk of unauthorized access.

compartmented mode workstation (CMW)
A trusted workstation that contains enough built-in security to be able to function as a trusted computer. A CMW is trusted to keep data of different security levels and categories in separate compartments.

compromise
Unauthorized disclosure or loss of sensitive information.

computer system security
Protects the information stored in the computer system from being lost, changed either maliciously or accidentally, or read or modified by those not authorized to access it.

confidentiality
A security principle that keeps information from being disclosed to anyone not authorized to access it. Synonymous with secrecy.

configuration management
The identification, control, accounting for, and auditing of all changes to system hardware, software, firmware, documentation, test plans, and test results throughout the development and operation of the system.

confinement
The prevention of sensitive data leaking out of a program.

confinement property
See star (*-property).

controlled access protection
The C2 system class. Security in this class is much more stringent than in C1 systems. The additional user-visible features are accountability of individual users, more detailed discretionary access controls, and object reuse. System resources must be protected by access control features. More rigorous testing and documentation are required in this class.

countermeasure
An action, device, procedure, technique, or other measure that reduces the vulnerability of, or the threat to, a system.

covert channel
A communications channel that allows a process to transfer information in a way that violates a system's security policy. It is a path that is not normally used for communication in a system and therefore isn't protected by the system's normal security mechanisms.

covert channel analysis
Analysis of the potential for covert channels in a trusted computer system. In theory, virtually every piece of information stored in, or processed by, a secure computer system is a potential covert channel.

covert storage channel
A covert channel that allows a storage location (e.g., a location on disk) to be written by one process and read by another process. The two processes are typically at different security levels.

covert timing channel
A covert channel that allows one process to signal information to another process by modulating the use of system resources (e.g., CPU time) in a way that affects the response time observed by the second process.

cryptography
The study of encryption and decryption. From the Greek "kryptos" meaning "hidden" and "graphia" meaning "writing."

data access controls
Controls that monitor who can access what data, and for what purpose. The system may support discretionary access controls or mandatory access controls, depending on the system security class.

data hiding
A design principle in which a layer in the hierarchy has no access to data outside itself: data handled by other layers is hidden from it.

decipher
The process in cryptography that decrypts ciphertext to original text.

decryption
The transformation of encrypted text (called ciphertext) into original text (called plaintext). It may also be applied to any data that can be represented in byte format. Sometimes called "deciphering".

denial of service
An action or series of actions that prevents a system or any of its resources from functioning efficiently and reliably. Contrast with availability.

descriptive top-level specification (DTLS)
A top-level specification of the TCB written in natural language, an informal program design notation, or a combination of the two. It is used to show that the system implements its security policy.

design documentation
The focus of the design documentation is on "the manufacturer's philosophy of protection and...how this philosophy is translated into the TCB." A key task is to define the boundaries of the system and to clearly distinguish between those portions of the system that are security-relevant and those that are not. Depending on the security class, design documentation may take the form of informal or formal descriptions and proofs of the security policy.

design specification and verification
Requires a mathematical and automated proof that the design description for a system is consistent with the system's security policy. At each level of security beginning with B1, the Orange Book requires an increasingly formal model of the system's security policy, along with increasing proof that the system design is consistent with this model.

device driver
Software that provides an interface between the system model and hardware. Usually the system side of the device driver interface matches the programming model of the operating system. The purpose of the software is to convert the parameters of the system interface to the functional operations required of the driver, and then to translate these operations to the system hardware.

When the device driver interfaces directly with the kernel (as required for the higher security classes), it is also required to operate with hardware segmentation.

device label
Each physical device attached to a system must have minimum and maximum security levels associated with it. These levels are to be used to "enforce constraints imposed by the physical environments in which the devices are located." For a multilevel device, the lowest and highest levels of information that may be sent to that device are specified. For a single-level device, the minimum is the same as the maximum. Device labels are an integral part of the export of information to devices. See mandatory access control.

digital signature
An authentication tool that verifies the origin of a message and the identity of the sender and receiver. Can be used to resolve any authentication issues between the sender and receiver. A digital signature is unique for every transaction.

discretionary access control (DAC)
An access policy that restricts access to system objects (e.g., files, directories, devices) based on the identity of the users and/or groups to which they belong. "Discretionary" means that a user with certain access permissions is capable of passing those permissions to another user (e.g., letting another user modify a file). Contrast with mandatory access control.
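
The discretionary passing of permissions can be sketched as follows (ownership-based granting is just one possible policy; object names and structure are illustrative):

```python
# Hypothetical objects table: each object records its owner and the
# permissions granted so far.
objects = {
    "report.txt": {"owner": "alice", "grants": {"alice": {"read", "write"}}},
}

def allowed(user, obj, mode):
    return mode in objects[obj]["grants"].get(user, set())

def grant(granter, obj, user, mode):
    # "Discretionary": the owner may pass permissions on at will.
    if objects[obj]["owner"] != granter:
        raise PermissionError("only the owner may grant access")
    objects[obj]["grants"].setdefault(user, set()).add(mode)
```

Under mandatory access control, by contrast, no such grant call would exist: the sensitivity labels of subject and object alone would decide access.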

discretionary protection
See discretionary access control.

discretionary security protection
The C1 system class. It consists of rather limited security features. The Orange Book describes C1 systems as an environment of "cooperating users processing data at the same level(s) of security." Security features are primarily intended to prevent users from making honest mistakes that could damage the system (eg. by writing over system memory or critical software) or from interfering with other users' work (by deleting or modifying their programs or data). The security features are insufficient to keep a determined intruder out. The system architecture must be capable of protecting system code from user programs. It must be tested to ensure proper operation and that security features can't be bypassed in any obvious way. There are also specific documentation requirements.

Two main user-visible features required in this class are passwords and discretionary protection of files and other objects.

domain
The set of objects that a subject is allowed to access.

dominate
A relationship between security levels in a system supporting mandatory access controls. One security level dominates another if the first level's classification is greater than or equal to the second level's classification, and if the first level's categories include at least all of the second level's categories.
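
The relation can be sketched directly from the definition, representing a security level as a (classification, category set) pair (classification ordering and category names are illustrative):

```python
CLASSIFICATIONS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def dominates(level_a, level_b):
    """level_a and level_b are (classification, set-of-categories) pairs."""
    class_a, cats_a = level_a
    class_b, cats_b = level_b
    return (CLASSIFICATIONS[class_a] >= CLASSIFICATIONS[class_b]
            and cats_a >= cats_b)  # >= on sets: cats_a is a superset of cats_b
```

Note that dominance is a partial order: two levels with incomparable category sets do not dominate each other in either direction.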