Glossary

Orange Book Glossary, A to D
Orange Book Glossary, E to O

Borrowed and adapted from Computer Security Basics, by Deborah Russell and G. T. Gangemi Sr.

passive threat
A type of threat that involves the interception, but not the alteration, of information. For example, a passive tap is a type of wiretapping that involves eavesdropping, monitoring, and/or recording of information, but not the generation of false messages or control signals. The danger of a passive threat is primarily to the secrecy of the information being transmitted. Contrast with active threat.

password
A secret sequence of characters that's used to authenticate a user's identity, usually during a login process.
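
One common way a login process checks a password without ever storing it in the clear is to keep only a salted hash. A minimal sketch using Python's standard library (the iteration count and parameter choices here are illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; the system stores (salt, digest), never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash from the candidate password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # True
print(verify_password("guess", salt, digest))   # False
```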

penetration
A successful, unauthorized access to a computer system.

penetration testing
A type of testing in which testers attempt to circumvent the security features of a system in an effort to identify security weaknesses.

perimeter
See security perimeter.

permission
A type of interaction a subject can have with an object. For example, file permissions specify the actions particular users or classes of users can perform on a file. Examples are read, write, and execute.
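
The file-permission example above can be sketched as a mapping from files and users to sets of allowed actions. A minimal illustration (the file and user names are hypothetical):

```python
# Each file maps users to the set of actions they may perform on it.
READ, WRITE, EXECUTE = "read", "write", "execute"

permissions = {
    "report.txt": {"alice": {READ, WRITE}, "bob": {READ}},
}

def allowed(user, action, filename):
    """Return True if the user holds the given permission on the file."""
    return action in permissions.get(filename, {}).get(user, set())

print(allowed("bob", READ, "report.txt"))   # True
print(allowed("bob", WRITE, "report.txt"))  # False
```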

personal trait
Something you are that determines access to a system. It may be a fingerprint, handprint, retina pattern, voice, signature, or keystroke pattern.

physical access device
Generally, something you have to gain entry to a system. It may be a key, token, badge, or smart card.

plaintext
In cryptography, the original text that is being encrypted. Synonymous with cleartext.

policy
The set of laws, rules, and practices that regulate how an organization manages, protects, and distributes the subject of the policy. See also security policy.

privacy
A security principle that protects individuals from the collection, storage, and dissemination of information about themselves and the possible compromises resulting from unauthorized release of that information.

private key encryption
A type of encryption that uses a single key to both encrypt and decrypt information. Also called symmetric, or single-key, encryption. Contrast with public key encryption.
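
The defining property above — one shared key both encrypts and decrypts — can be shown with a toy keystream cipher. This is an illustration only, not a secure construction; real systems use vetted symmetric ciphers such as AES:

```python
import hashlib

def keystream(key):
    """Expand a key into an endless stream of bytes by hashing a counter (toy only)."""
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
        yield from block

def crypt(key, data):
    """XOR the data with the keystream: the SAME key both encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

ciphertext = crypt(b"shared-secret", b"attack at dawn")
print(crypt(b"shared-secret", ciphertext))  # b'attack at dawn'
```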

privilege
A right granted to a user, a program, or a process. For example, certain users may have the privileges that allow them to access certain files in a system. Only the system administrator may have the privileges necessary to export data from the trusted system.

protocol
A set of rules and formats for the exchange of information, particularly over a communications network.

protocol layer
A component process within an overall communications process. Typically, each layer provides specific functions and communicates with the layers above and beneath it.

protocol model
The conceptual basis for describing how to communicate within a network.

public key encryption
A type of encryption that uses two mathematically related keys. The public key is known within a group of users. The private key is known only to its owner. Contrast with private key encryption.
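
The relationship between the two keys can be illustrated with textbook RSA over toy-sized primes (for illustration only; real keys are thousands of bits and use padding schemes):

```python
# Textbook RSA: the public key (e, n) is shared; the private exponent d
# is known only to the owner. The primes here are deliberately tiny.
p, q = 61, 53
n = p * q                   # public modulus
phi = (p - 1) * (q - 1)
e = 17                      # public exponent, known within the group
d = pow(e, -1, phi)         # private exponent, known only to its owner

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
recovered = pow(ciphertext, d, n)  # only the private-key owner can decrypt
print(recovered)  # 65
```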

rainbow series
A series of books interpreting Orange Book requirements. They are known as the Rainbow Series because each has a different colour cover. There are more than 20 books in the Rainbow Series.

read
An operation involving the flow of information from an object to a subject. It does not involve the alteration of that information.

recovery
The actions necessary to restore a system and its data files after a system failure or intrusion.

reference monitor
From the Orange Book definition, "An access control concept that refers to an abstract machine that mediates all accesses to objects by subjects."

requirement
A requirement is a predisposing condition of a system.

requirement dependency
A requirement dependency arises when a requirement of a system is dependent, or co-dependent, on another requirement.

requirement conflict
A requirement conflict is the special situation where requirement dependencies of a system are incompatible.

repudiation
The denial by a message sender that the message was sent, or by a message recipient that the message was received.

residue
Data left in storage or on a medium before the data has been rewritten or eliminated in some other way.

risk
The probability that a particular security threat will exploit a particular system vulnerability.

risk assessment
An analysis of the system's information needs and vulnerabilities to determine how likely they are to be exploited in different ways and the costs of losing and/or recovering the system or its information.

scaling
The process of modifying the size of an object by use of a multiplicative factor (any positive real number). Scaling to smaller dimensions can introduce non-linearities and dysfunctional artifacts.

Although software may be referred to as being scaled, it usually undergoes additive or subtractive operations instead of factoring. Scaling a software system to smaller dimensions may also produce dysfunctional artifacts through the loss of functions that other components rely on.

secrecy
A security principle that keeps information from being disclosed to anyone not authorized to access it. Synonymous with confidentiality.

secure state
A condition in which none of the subjects in a system can access objects in an unauthorized manner.

security
Freedom from risk or danger. Safety and the assurance of safety.

security administrator
A person whose role is limited to those system administration functions that strictly affect security, and to those functions essential to performing the security role effectively.

security control
A function that affects the security operations of a system.

security domain
The B3 system class. There are no new user-visible security features from the B2 level, but system design and assurance features are substantially more rigorous. Trusted facility management is required, as is trusted recovery and the ability to signal the administrator immediately if the system detects an "imminent violation of security policy." The system must satisfy the reference monitor requirement by being simple, tamperproof, and impossible to bypass. Systems must be "highly resistant to penetration." The TCB is very tightly structured to exclude code not required for the enforcement of system security.

security feature
An aspect of system security behaviour that forms part of the user interface.

security features user's guide (SFUG)
Documentation that is aimed at the ordinary, unprivileged system user. It tells them everything they need to know about system security features and how to use them.

security kernel
From the Orange Book definition, "The hardware, firmware, and software elements of a Trusted Computing Base that implement the reference monitor concept. It must mediate all accesses, be protected from modification, and be verifiable as correct." It is typically the bottom layer in a multiple-layer design.

security level
A representation of the sensitivity of information, derived from a sensitivity label (consisting of classification and categories).

security model
A precise statement of the security rules of a system.

security perimeter
The imaginary boundary which contains the security kernel, as well as other security-related system functions. The TCB must be designed and implemented in such a way that system elements included in it (those inside the security perimeter) are designed to perform security functions, while those elements excluded from the TCB (those outside the security perimeter) need not be trusted. The interfaces across the security perimeter must be precisely defined and enforced by the security components of the system. The kernel, kernel extensions, Java Virtual Machine and trusted device drivers define the JOS security kernel. The bytecode/JVM interface may then be defined as the security perimeter in a JOS system.

security policy
From the Orange Book definition: "The set of laws, rules, and practices that regulate how an organization manages, protects, and distributes sensitive information."

security policy model
A precise statement of the security laws, rules, and practices of a system that regulate how an organization manages, protects, and distributes sensitive information.

security principle
A fundamental aspect of security.

security requirement
A defined aspect of system operation that is necessary to a given class of system security.

security testing
A type of testing in which testers determine whether the security features of a system are implemented as designed. Security testing may include hands-on functional testing, penetration testing, and formal verification.

segmentation
A hardware protection feature using virtual memory divided into segments. A process will use as many segments as it needs when it executes. The result of segmentation is that unprivileged user processes cannot access or modify the memory used by the system.

self/group/public controls
A form of discretionary access control in which file access is determined by category. File permissions or some other scheme allow the owner of the file to specify what permissions he or she (self) will have, what permissions a group of users will have, and what permissions the rest of the world (public) will have. Typical permissions include read, write, and execute.
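A sketch of the self/group/public resolution described above, in the style of Unix mode bits: three permission sets, selected by the requester's relationship to the file's owner (the names and values are illustrative):

```python
# Each file carries an owner, a group, and three permission sets.
FILE = {
    "owner": "alice",
    "group": "staff",
    "modes": {"self": {"read", "write"}, "group": {"read"}, "public": set()},
}

def category(user, user_groups, file):
    """Classify the requester as self, group, or public for this file."""
    if user == file["owner"]:
        return "self"
    if file["group"] in user_groups:
        return "group"
    return "public"

def allowed(user, user_groups, action, file):
    """Check the action against the permission set for the user's category."""
    return action in file["modes"][category(user, user_groups, file)]

print(allowed("alice", [], "write", FILE))      # True  (self)
print(allowed("bob", ["staff"], "read", FILE))  # True  (group)
print(allowed("eve", [], "read", FILE))         # False (public)
```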

sensitive information
Information that, if lost or compromised, would negatively affect the owner of the information, would jeopardize the ability of the system to continue processing, and/or would require substantial resources to recreate. According to the U.S. government (NTISSP 2), "information the disclosure, alteration, loss, or destruction of which could adversely affect national security or other federal government interests."

sensitivity label
A label representing the security level of an object and describing the sensitivity of the data in the object. The label consists of two parts: a hierarchical classification and a set of non-hierarchical categories or compartments. In systems supporting mandatory access controls, sensitivity labels determine whether a particular subject will be allowed to access a particular object.
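The two-part label above admits a natural dominance test: label A dominates label B when A's classification is at least B's and A's category set contains all of B's. A minimal sketch with illustrative classification names:

```python
# Hierarchical classifications, ordered low to high.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def dominates(a, b):
    """a and b are (classification, categories) pairs; True if a dominates b."""
    (class_a, cats_a), (class_b, cats_b) = a, b
    return LEVELS[class_a] >= LEVELS[class_b] and cats_a >= cats_b

doc = ("secret", {"crypto"})
user = ("top secret", {"crypto", "nuclear"})
print(dominates(user, doc))  # True
print(dominates(doc, user))  # False
```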

simple security condition
From the Orange Book definition: "A Bell-LaPadula security model rule allowing a subject read access to an object only if the security level of the subject dominates the security level of the object."

single-level
Used to describe data or devices. Single-level security allows a system to be accessed at any time only by users at the same sensitivity level. A single-level device is one used to process only data of a single security level at any time. Contrast with multilevel.

spoof
A trick that causes an authorized user to perform an action that violates system security or that gives away information to an intruder.

star property (*-property)
From the Orange Book definition: "A Bell-LaPadula security model rule allowing a subject write access to an object only if the security level of the subject is dominated by the security level of the object. Also known as the confinement property."
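The simple security condition and the *-property defined above can be sketched as two checks over security levels (integers here for brevity; a full label would add category sets, as in the sensitivity label entry):

```python
def may_read(subject_level, object_level):
    """Simple security condition: read only if the subject dominates the object."""
    return subject_level >= object_level

def may_write(subject_level, object_level):
    """*-property: write only if the object dominates the subject, so
    information can never flow down to a lower level."""
    return subject_level <= object_level

SECRET, CONFIDENTIAL = 2, 1
print(may_read(SECRET, CONFIDENTIAL))   # True  (read down is allowed)
print(may_write(SECRET, CONFIDENTIAL))  # False (write down is forbidden)
```

Together the two rules confine information: a subject may read at or below its level and write at or above it, so data can only move upward.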

storage object
A system object that stores information. It may be resident in RAM, ROM, CD-ROM, DVD, cache memory, CPU registers, HD, HD controller cache, floppy, tape, video buffer, animation frame buffers, network cache, printing cache, print server, communications server, and other storage devices.

structured protection
The B2 system class. The B1 system class features are extended without adding user-visible security features. Additional assurance is required that the features were designed, and work, properly. The Orange Book says that B2 systems must be "relatively resistant to penetration."

This class requires labeling to include all objects and devices in the system, a trusted path and least privilege access. Hardware modularity features isolate security-related functions from those that are not security related. A formal, mathematical statement of the system's security policy, more stringent testing and related documentation are required. A configuration management system must manage all changes to the system code and documentation and system designers must conduct a search for covert channels.

subject
From the Orange Book definition, "An active entity, generally in the form of a person, process, or device that causes information to flow among objects or changes the system state."

subject sensitivity labels
The idea of subject sensitivity labels is that the user must always know what security level he or she is working at. Trusted systems typically display the user's clearance during log-in, and redisplay it if the security level changes, either automatically or at the user's request. A user may not always work at the highest security level he or she is allowed to see.

system
The hardware, firmware, software, and operational procedures that make up the system.

system access controls
Controls that ensure unauthorized users don't gain entry to the system, typically enforced by means of passwords, physical access devices, or personal traits.

system administrator
A person who carries out an organization's security policy and is responsible for either making or breaking the security of the system. Some of the system administrator's functions are to run daily backups, train system users, set up and protect the password file and other system-critical files, and examine log files.

system architecture
This requirement has to do with the way a system is designed to make security possible - if not inevitable. Some systems have a clear distinction between user and system areas. Other systems have more complex, ring-based architectures in which there may be as many as ten distinct, increasingly more privileged domains. In a typical ring-based architecture, the TCB, or security kernel, occupies the innermost ring; user programs occupy the outermost ring; in between are such intermediate processes as operating system services and administrative programs.

system design
The process by which a designer uses the basic hardware and software to best advantage to enhance system security characteristics.

system documentation
See design documentation.

system high
The highest security level supported by a system at a particular time or in a particular environment.

system integrity
The requirement that the hardware and firmware must work and must be tested to ensure that they keep working. The Orange Book states that "Hardware and Software features shall be provided that can be used to periodically validate the correct operation of the on-site hardware and firmware elements of the TCB."

system low
The lowest security level supported by a system at a particular time or in a particular environment.

system resource
A component (eg. subject, storage object) of the system.

system security
The implementation, management and enforcement of security requirements of a given security class on a verified system.

system security and administration
The offline procedures that make or break a secure system: clearly delineating system administrator responsibilities, training users appropriately, and monitoring users to make sure that security policies are observed. Also involves determining the security threats that face a system and the cost of protection against them.

system user
See user.

test documentation
Documentation that must "show how the security mechanisms were tested, and results of the security mechanisms' functional testing." Usually voluminous. It must contain a test plan, assumptions about the test environment, the test procedures, expected results, and actual results. The key question the testing and test documentation must address is whether any design or implementation flaws in the TCB would permit a user to read, change, or delete data that he or she normally wouldn't be authorized to access.

threat
A possible danger to a computer system. See also active threat and passive threat.

ticket
See capability.

token
A physical item that's used to establish identity. Typically an electronic device that can be inserted in a door or a computer system to gain access.

top-level specification
A nonprocedural description of system behaviour at an abstract level: for example, a functional specification that omits all implementation details.

topology
A network configuration: the way the nodes of a network are connected together. Examples include bus, ring, and star topologies.

traffic
The message flow across a network. Analysis of message characteristics (eg. length, frequency, destination) can sometimes provide information to an eavesdropper.

trap door
A hidden mechanism that allows normal system protection to be circumvented. Trap doors are often planted by system developers to allow them to test programs without having to follow security procedures or other user interfaces. They are typically activated in some unobvious way (eg. by typing a particular sequence of keys). Synonymous with back door.

trojan horse
A type of programmed threat. An independent program that appears to perform a useful function but that hides another unauthorized program inside it. When an authorized user performs the apparent function, the trojan horse performs the unauthorized function as well (often usurping the privileges of the user).

trust
Reliance on the ability of a system to meet its specifications.

trusted computing base (TCB)
From the Orange Book definition: "The totality of protection mechanisms within a computer system - including hardware, firmware and software - the combination of which is responsible for enforcing a security policy. A TCB consists of one or more components that together enforce a unified security policy over a product or system. The ability of a TCB to correctly enforce a security policy depends solely on the mechanisms within the TCB and on the correct input by system administrative personnel of parameters (eg. a user's clearance) related to the security policy."

trusted distribution
The process of distributing a trusted system in a way that ensures that the system that arrives at the customer site is the exact, evaluated system shipped by the vendor.

trusted facility management
The management of a trusted system in a way that assures separation of duties (eg. separate operator, system administrator, and security administrator roles), with duties clearly delineated for each role.

trusted facility manual (TFM)
Documentation that is aimed at system administrators and/or security administrators. It tells them everything they need to know about setting up the system so it will be secure, enforcing system security, interacting with user requests, and making the system work to its best advantage.

trusted path
A mechanism that allows a user to communicate directly with the Trusted Computing Base. The mechanism can be activated only by the user or the TCB and cannot be initiated by untrusted software. With a trusted path, there is no way an intermediary program can mimic trusted software. Trusted path mechanisms authenticate the TCB; they guarantee to the user that she or he is communicating with trusted software, not with some intermediary program or process that might mimic the trusted software - for example, to spoof a user into giving away a password. Often trusted paths are difficult to implement on personal computers and will require the development of a TCB on the PC itself.

trusted recovery
The set of procedures involved in restoring a system and its data in trusted fashion after a system crash or some other type of system failure.

trusted system
A system designed and developed in accordance with Orange Book criteria and evaluated according to those criteria.

unauthorized
The state of security access when a user fails to pass authentication procedures.

untrusted system
A system that has not been designed and developed in accordance with Orange Book criteria and evaluated according to those criteria.

user
A person or a process that accesses a computer system.

user ID
A unique code or string of characters with which the system identifies a specific user.

validation
The performance of tests and evaluations to determine whether a system complies with security specifications and requirements.

verification
The process of comparing two levels of system specification to ensure a correspondence between them: for example, security policy model with top-level specification, top-level specification with source code, or source code with object code. The process may be automated. See also formal verification.

verified design
The A1 system class. The only additional feature beyond B3 requirements is trusted distribution. It requires additional assurance from formal analysis and mathematical proof that the system design matches the system's security policy and its design specifications.

The Orange Book does discuss the possibility of defining requirements for systems that exceed current A1 requirements in areas of system architecture, testing, and formal verification.

verified protection
Requires a mathematical and automated proof that the design description for a system is consistent with the system's security policy. See verification.

virus
A type of programmed threat. A code fragment (not an independent program) that reproduces by attaching to another program. It may damage data directly, or it may degrade system performance by taking over system resources which are then not available to authorized users.

vulnerability
A weakness in a computer system, or a point where the system is susceptible to attack. The weakness could be exploited to violate system security.

worm
A type of programmed threat. An independent program that reproduces by copying itself from one system to another, usually over a network. Like a virus, a worm may damage data directly, or it may degrade system performance by tying up system resources and even shutting down a network.

write
An operation involving the flow of information from a subject to an object (eg. the alteration of that information).