The ability of a subject to view, change or communicate with an object in a computer system. Typically, access involves a flow of information between the subject and the object (for example, a user reads a file, a program creates a directory).
Restrictions on the ability of a subject (e.g. a user) to use a system or an object (e.g. a file) in that system. Such controls limit access to authorised users only. Access control mechanisms may include hardware or software features, operating procedures, management procedures, or any combination.
Access control list (ACL)
For a particular object, a list of the subjects authorised to access that object. The list usually indicates what type of access is allowed for each user. Typical types of access may include, read, write, execute, append, modify, delete and create.
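The lookup described above can be sketched in Python; the object, subject names, and data structure here are illustrative assumptions, not drawn from any particular system:

```python
# Minimal sketch of an access control list (ACL) check.
# Maps each object to the subjects authorised to access it,
# and for each subject the types of access allowed.
ACL = {
    "payroll.txt": {"alice": {"read", "write"}, "bob": {"read"}},
}

def is_allowed(subject: str, obj: str, access: str) -> bool:
    """Return True if the object's ACL grants the subject this access type."""
    return access in ACL.get(obj, {}).get(subject, set())
```

A subject absent from the list, or an access type not granted, is simply denied.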
A security principle stating that individuals must be able to be identified. With accountability, violations or attempted violations of system security can be traced to individuals who can then be held responsible for their actions.
Official authorisation and approval, granted to a computer system or network, to process sensitive data in a particular operational environment. Accreditation is performed by specific technical personnel after a security evaluation of the system’s hardware, software, configuration, and security controls.
A security principle that keeps information from being modified or otherwise corrupted either maliciously or accidentally. Accuracy protects against forgery or tampering. Synonymous with integrity.
A type of threat that involves the alteration, not simply the interception, of information. For example, an active tap is a type of wiretapping that accesses and compromises data, usually by generating false messages or control signals, or by altering communications between legitimate users. The danger of an active threat is primarily to the authenticity of the information being transmitted. Contrast with passive threat.
The addition of security products and layers to an existing computer system.
Management rules and procedures that result in protection of a computer system and its data. Sometimes called procedural security.
A measure of confidence that a system’s security features have been implemented and work properly. Assurance is one of the primary issues addressed by the Orange Book.
An attempt to bypass security controls on a system. An active attack alters data. A passive attack releases data. Whether or not an attack will succeed depends on the vulnerability of the system and the effectiveness of existing countermeasures.
To record independently and later examine system activity (e.g. logins and logouts, file accesses, security violations).
The chronological set of records that provides evidence of system activity. These records can be used to reconstruct, review, and examine transactions from inception to output of final results. The records can also be used to track system usage and detect and identify intruders.
The process of proving that a subject (e.g. a user or a system) is what the subject claims to be. Authentication is a measure used to verify the eligibility of a subject and the ability of that subject to access certain information. It protects against the fraudulent use of a system or the fraudulent transmission of information. There are three classic ways to authenticate oneself: something you know, something you have, and something you are. See also identification.
A security principle that ensures that a message is received in exactly the same form in which it was sent. See also message authentication and message authentication code.
The granting of rights to a user, a program, or a process. For example, certain users may be authorised to access certain files in a system, whereas only the system administrator may be authorised to export data from a trusted system.
A security principle that ensures the ability of a system to keep working efficiently and to keep information accessible. Contrast with denial of service.
See trap door.
Copying of data to a medium from which the data can be restored if the original data is destroyed or compromised. Full backups copy all data in the system. Incremental backups copy only data that’s been changed since the last full backup. A sound backup plan involves keeping backup media off-site and developing procedures for replacing system components, if necessary, after a system failure.
A characteristic of a communications channel. The amount of information that can pass through the channel in a given amount of time. See covert channel.
The computer security policy model on which the Orange Book requirements are based. From the Orange Book definition: “A formal state transition model of computer security policy that describes a set of access control rules. In this formal model, the entities in a computer system are divided into abstract sets of subjects and objects. The notion of a secure state is defined and it is proven that each state transition preserves security by moving from secure state to secure state; thus, inductively proving the system is secure. A system state is defined to be “secure” if the only permitted access modes of subjects to objects are in accordance with a specific security policy. In order to determine whether or not a specific access mode is allowed, the clearance of a subject is compared to the classification of the object and a determination is made as to whether the subject is authorised for the specific access mode.”
An integrity model of computer security policy that describes a set of rules. In this model, a subject may not depend on any object or other subject that is less trusted than itself.
The statistical study of biological data. In computer security, the use of unique, quantifiable physiological, behavioural, and morphological characteristics to provide positive personal identification. Examples of such characteristics are fingerprints, retina patterns, and signatures.
A security procedure used with modems connected to terminals dialling into computer systems. When the computer system answers a call, it doesn’t allow a direct login at that time. First, it calls back the telephone number associated with the authorised user’s account.
In capability-based systems, an identifier that identifies an object (e.g. a file) and specifies the access rights for the subject (e.g. the user) who possesses the capability (sometimes called a ticket).
An item in the non-hierarchical portion (the category set) of a sensitivity label. (The hierarchical portion is called the classification). A category represents a distinct area of information in a system. When included in a sensitivity label in a system supporting mandatory access controls, it is used to limit access to those who need to know information in this particular category. Synonymous with compartment.
The technical evaluation performed as part of, and in support of, the accreditation process that establishes the extent to which a particular computer system or network design and implementation meet a prespecified set of security requirements.
A type of authentication in which a user responds correctly (usually by performing some calculation) to a challenge (usually a numeric, unpredictable one).
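A sketch of such an exchange in Python, using an HMAC over a random challenge as the “calculation”; the shared secret and function names are illustrative assumptions, not a specific protocol:

```python
import hashlib
import hmac
import secrets

def make_challenge() -> bytes:
    # The verifier issues an unpredictable challenge (a nonce).
    return secrets.token_bytes(16)

def respond(shared_secret: bytes, challenge: bytes) -> bytes:
    # The user performs the agreed calculation over the challenge.
    return hmac.new(shared_secret, challenge, hashlib.sha256).digest()

def verify(shared_secret: bytes, challenge: bytes, response: bytes) -> bool:
    # The verifier repeats the calculation and compares in constant time.
    expected = hmac.new(shared_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because each challenge is fresh and unpredictable, recording one exchange doesn’t let an eavesdropper answer the next one.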
A path used for information transfer within a system.
Numbers summed according to a particular set of rules and used to verify that transmitted data has not been modified during transmission.
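A toy illustration of the idea, summing bytes modulo a fixed value; real systems use stronger rules (e.g. CRCs or cryptographic hashes), and the modulus here is an arbitrary assumption:

```python
def checksum(data: bytes, modulus: int = 65536) -> int:
    # Sum the byte values according to a simple rule: addition modulo 65536.
    return sum(data) % modulus

def verify_checksum(data: bytes, expected: int) -> bool:
    # Recompute on receipt; a mismatch indicates the data was modified.
    return checksum(data) == expected
```

Note that a simple sum detects accidental corruption but not deliberate tampering, since an attacker can adjust the data to preserve the sum.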
In cryptography, the unintelligible text that results from encrypting original text. Sometimes called “codetext”, “crypotext,” or “cipher.”
An integrity model for computer security policy designed for a commercial environment. It addresses such concepts as nondiscretionary access control, privilege separation, and least privilege.
The hierarchical portion of a sensitivity label. (The non-hierarchical portion is called the “category set” or the “compartments”.) A classification is a single level in a stratified set of levels. For example, in a military environment, each of the levels UNCLASSIFIED, CONFIDENTIAL, SECRET and TOP SECRET is more trusted than the level beneath it. When included in a sensitivity label in a system supporting mandatory access controls, a classification is used to limit access to those cleared at that level.
A representation of the sensitivity level (the classification and the categories) associated with a user in a system supporting mandatory access controls. A user with a particular clearance can typically access only information with a sensitivity label equal to or lower than the user’s clearance.
Closed security environment
An environment in which both of the following conditions are true:
- Application developers have sufficient clearances and authorisation to provide an acceptable presumption that they have not introduced malicious logic.
- Configuration control provides sufficient assurance that applications and equipment are protected against the introduction of malicious logic prior to and during the operation of system applications.
Protection of information while it’s being transmitted, particularly via telecommunications. A particular focus of communications security is message authenticity.
Short for communications security. The government program whose focus is the techniques (e.g. encryption) that prevent information from being modified or accessed without authorisation while it’s being transmitted.
- The isolation of the operating system, user programs, and data files from one another in a computer system to provide protection against unauthorised access by other users or programs.
- The breaking down of sensitive data into small, isolated blocks to reduce the risk of unauthorised access.
Compartmented mode workstation (CMW)
A trusted workstation that contains enough built-in security to be able to function as a trusted computer. A CMW is trusted to keep data of different security levels and categories in separate compartments.
Unauthorised disclosure or loss of sensitive information.
Short for computer security. The government program whose focus is the techniques (e.g. trusted systems) that prevent unauthorised access to information while it’s being processed or stored.
Protection of information while it’s being processed or stored.
A security principle that keeps information from being disclosed to anyone not authorised to access it. Synonymous with secrecy.
The identification, control, accounting for, and auditing of all changes to system hardware, software, firmware, documentation, test plans, and test results throughout the development and operation of the system.
Prevention of a program from leaking sensitive data.
See star property (*-property).
A plan for responding to a system emergency. The plan includes performing backups, preparing critical facilities that can be used to facilitate continuity of operations in the event of an emergency, and recovering from a disaster. Synonymous with disaster recovery plan.
An action, device, procedure, technique, or other measure that reduces the vulnerability of a system to a threat.
A communications channel that allows a process to transfer information in a way that violates a system’s security policy.
Covert channel analysis
Analysis of the potential for covert channels in a trusted computer system.
Covert storage channel
A covert channel that allows a storage location (e.g. a location on disk) to be written by one process and read by another process. The two processes are typically at different security levels.
Covert timing channel
A covert channel that allows one process to signal information to another process by modulating the use of system resources (e.g. CPU time) in a way that affects the response time observed by the second process.
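A toy simulation of the idea in Python; real channels modulate actual response times, whereas here the “timings” are plain numbers, and the thresholds are illustrative assumptions:

```python
# Simulated covert timing channel: the sending process chooses a slow or
# fast "response time" to signal each bit; the receiver measures and decodes.
SLOW, FAST = 0.30, 0.05   # illustrative response times, in seconds
THRESHOLD = 0.15

def send_bits(bits):
    # A response time above the threshold signals a 1, below it a 0.
    return [SLOW if b else FAST for b in bits]

def receive_bits(timings):
    # The second process recovers the bits from the observed delays.
    return [1 if t > THRESHOLD else 0 for t in timings]
```

The channel exists even though no data is ever written anywhere the receiver can read, which is why covert channel analysis examines shared resources, not just storage.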
The study of encryption and decryption. From the Greek “kryptos” meaning “hidden” and “graphia” meaning “writing”.
A private key encryption algorithm adopted as the federal standard for the protection of sensitive unclassified information and used extensively for the protection of commercial data as well.
The transformation of encrypted text (called ciphertext) into original text (called plaintext). Sometimes called “deciphering”.
To demagnetise magnetic media (typically tapes) in a way that leaves a very low residue of magnetic induction on the media. This process effectively erases the tape.
Denial of service
An action or series of actions that prevents a system or any of its resources from functioning efficiently and reliably.
An authentication tool that verifies the origin of a message and the identity of the sender and the receiver. Can be used to resolve any authentication issues between the sender and the receiver. A digital signature is unique for every transaction.
Disaster recovery plan
See contingency plan.
Discretionary access control (DAC)
An access policy that restricts access to system objects (e.g. files, directories, devices) based on the identity of the users and/or groups to which they belong. “Discretionary” means that a user with certain access permissions is capable of passing those permissions to another user (e.g. letting another user modify a file). Contrast with mandatory access control.
The set of objects that a subject is allowed to access.
A relationship between security levels in a system supporting mandatory access controls. One subject dominates another if the first subject’s classification is greater than or equal to the second subject’s classification, and if the first subject’s categories include at least all of the second subject’s categories.
Unauthorised interception of information. Usually refers to the passive interception (receiving information), rather than active interception (changing information).
Electrical and electromagnetic signals emitted from electrical equipment (e.g. computers, terminals, printers, cabling) and transmitted through the air or through conductors. If the information carried by these emanations is intercepted and deciphered, sensitive information may be compromised. Also called “emissions”.
The transformation of original text (called plaintext) into unintelligible text (called ciphertext). Sometimes called “enciphering”.
A type of encryption in which a message is encrypted when it is transmitted and is decrypted when it is received. Contrast with link encryption.
Removal of signals recorded on magnetic media. Simply reinitialising a disk or tape doesn’t erase data; it simply makes the data harder to access. Someone who knows how to bypass ordinary volume checking mechanisms may still be able to access sensitive data on reinitialised disks or tapes.
Transfer of information from one system to another. Often used to refer to the transfer of information from a trusted system to an untrusted system.
A code associated with a file that indicates the file type and associated file access. Typical classes are public (anyone can read or change the file), read-only (anyone can read, but only the owner and the system administrator can write the file), and private (only the owner and the system administrator can read or change the file).
Protection of files stored on a computer system through discretionary access control and/or mandatory access control.
A biometric system that compares a fingerprint pattern with a stored pattern to determine whether there’s a match.
An error, omission, or loophole in a system that allows security mechanisms to be bypassed.
From the Orange Book definition: “A complete and convincing mathematical argument, presenting the full logical justification for each proof step, for the truth of a theorem or set of theorems. The formal verification process uses formal proofs to show the truth of certain properties of formal specification and for showing that computer programs satisfy their specifications.”
Formal security policy model
From the Orange Book definition: “A mathematically precise statement of a security policy. To be adequately precise, such a model must represent the initial state of a system, the way in which the system progresses from one state to another, and a definition of a “secure” state of the system. To be acceptable as a basis for a TCB, the model must be supported by a formal proof that if the initial state of the system satisfies the definition of a “secure” state and if all assumptions required by the model hold, then all future states of the system will be secure. Some formal modelling techniques include: state transition models, temporal logic models, denotational semantics models, algebraic specification models.”
An automated tool used in designing and testing highly trusted systems. The process of using formal proofs to demonstrate two types of consistency:
- Design verification: consistency between a formal specification of a system and a formal security policy model.
- Implementation verification: consistency between a formal specification of a system and its high-level program implementation.
Typically, a system that is attached to two systems, devices, or networks that otherwise do not communicate with each other. Communications from one system or network to another are routed through the gateway. A gateway system may be used as a guardian or “firewall” between trusted and untrusted systems or networks. The gateway filters out any information that’s not allowed to pass from the trusted system to the untrusted system or network, or vice versa.
The relative fineness or coarseness by which a mechanism can be adjusted. In the Orange Book, the phrase “to the granularity of a single user” means that an access control mechanism can be adjusted to include or exclude any single user.
A set of users in a system. A system security policy may give certain access rights to every member of a group.
A biometric system that compares a handprint pattern with a stored pattern to determine whether there’s a match.
The process of telling a system the identity of a subject (e.g. user or another system). Usually, this is done by entering a name or presenting a token to the system. See also authentication.
Posing as an authorised user, usually in an attempt to gain access to a system. Synonymous with masquerade.
A label associated with a particular subject or object in a system (e.g. file, process, window). Information labels are used in compartmented mode workstations and are similar to sensitivity labels. However, they differ from sensitivity labels in several ways:
- In addition to a classification and a set of categories, information labels also contain dissemination markings and handling caveats (e.g. EYES ONLY).
- They simply represent the sensitivity of the information in the subject or object; in contrast, sensitivity labels are used to make access decisions.
- They are automatically adjusted as the information in a subject or object changes (for example, the contents of a window); in contrast, sensitivity labels remain static.
The security level implied by an information label’s classification and categories.
Protection of information.
Short for information security. The government program whose focus is the techniques that increase the security of computer systems, communication systems, and the information they process or transmit.
Transfer of information into a system. Often used to refer to the transfer of information from an untrusted system to a trusted system.
A security principle that keeps information from being modified or otherwise corrupted either maliciously or accidentally. Integrity protects against forgery or tampering. Synonymous with accuracy.
See security kernel.
In cryptography, a secret value that’s used to encrypt and decrypt messages. A sequence of symbols (often a large number) that’s usually known only to the sender and the receiver of the message. See also private key encryption and public key encryption.
A system that compares a pattern of keystrokes with a stored pattern to determine whether there’s a match.
In a system supporting mandatory access controls, the assignment of sensitivity labels to every subject and object.
A security principle stating that a user or a process should be granted the most restrictive set of privileges needed to perform a task, and should hold them only for the duration of that task. Least privilege limits the damage that can occur because of accident or system attack.
See security level.
Confidence that a trusted system is designed, developed and maintained with formal and rigidly controlled standards. In the Orange Book, the set of life-cycle assurances includes security testing, design specification and verification, configuration management, and trusted distribution.
A type of encryption in which a message is encrypted when it is transmitted and is decrypted and then encrypted again each time it passes through a network communications node. Sometimes called “online encryption.” Contrast with end-to-end encryption.
A type of programmed threat. A mechanism for releasing a system attack of some kind. It is triggered when a particular condition (e.g. a certain date or system operation) occurs.
The process of identifying oneself to, and having one’s identity authenticated by, a computer system.
A measure of the density of the magnetic flux remaining after a magnetic force has been removed. Data remaining on a magnetic medium such as tape.
Code that is included in a system for an unauthorised purpose.
Mandatory access control (MAC)
An access policy that restricts access to system objects (e.g. files, directories, devices) based on the sensitivity of the information in the object (represented by the object’s label) and the authorisation of the subject (usually represented by the user’s clearance) to access information at that sensitivity level. “Mandatory” means that the system enforces the policy; users do not have the discretion to share their files. Contrast with discretionary access control.
Posing as an authorised user, usually in an attempt to gain access to a system. Synonymous with impersonation.
Ensuring, typically with a message authentication code, that a message received (usually via a network) matches the message sent.
Message authentication code
A code calculated during encryption and appended to a message. If the message authentication code calculated during decryption matches the appended code, the message was not altered during transmission.
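The append-and-verify pattern can be sketched in Python. Note the sketch uses a keyed hash (HMAC), a common modern way to produce such a code, rather than deriving it from the encryption step as the definition describes; key and message are illustrative:

```python
import hashlib
import hmac

MAC_LEN = 32  # SHA-256 digest size in bytes

def append_mac(key: bytes, message: bytes) -> bytes:
    # Calculate the authentication code and append it to the message.
    return message + hmac.new(key, message, hashlib.sha256).digest()

def check_mac(key: bytes, received: bytes) -> bool:
    # Split off the trailing code, recompute it, and compare in constant time.
    message, tag = received[:-MAC_LEN], received[-MAC_LEN:]
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

If even one bit of the message is altered in transit, the recomputed code no longer matches the appended one.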
See security model.
A device that connects a computer or a terminal to a telephone line. Short for modulator/demodulator.
Used to describe data or devices. Multi-level security allows users at different sensitivity levels to access a system concurrently. The system permits each user to access only the data that he or she is authorised to access. A multi-level device is one on which a number of different levels of data can be processed. Contrast with single-level.
A security principle stating that a user should have access only to the data he or she needs to perform a particular function.
A data communications system that allows a number of systems and devices to communicate with each other.
A system connected to a network.
From the Orange Book definition: “A passive entity that contains or receives information. Access to an object potentially implies access to the information it contains. Examples of objects are: records, blocks, pages, segments, files, directories, directory trees, and programs, as well as bits, bytes, words, fields, processors, video displays, keyboards, clocks, printers, network nodes, etc.”
The reassignment to a subject (e.g., a user) of a medium that previously contained an object (e.g., a file). The danger of object reuse is that the object may still contain information that the subject may not be authorised to access. Examples are magnetic tapes that haven’t been erased, workstations that hold information in local storage, and X Window System objects that haven’t been cleared before they’re reassigned.
A type of encryption in which a cipher is used only once. Two copies of a pad are created; one copy goes to the sender, and the other to the recipient. The pad contains a random number for each character in the original message. The pad is destroyed after use. Sometimes called a “one-time pad.”
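A sketch of the scheme in Python, using XOR to combine each message byte with the corresponding pad value; the function names are illustrative:

```python
import secrets

def make_pad(length: int) -> bytes:
    # One random value per character of the message;
    # sender and recipient each hold a copy, used once and then destroyed.
    return secrets.token_bytes(length)

def apply_pad(message: bytes, pad: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(m ^ p for m, p in zip(message, pad))
```

The scheme is information-theoretically secure only if the pad is truly random, as long as the message, and never reused.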
Open security environment
An environment in which at least one of the following conditions is true:
- Application developers do not have sufficient clearance or authorisation to provide an acceptable presumption that they have not introduced malicious logic.
- Configuration control does not provide sufficient assurance that applications are protected against the introduction of malicious logic prior to and during the operation of system applications.
Confidence that a trusted system’s architecture and implementation enforce the system’s security policy. In the Orange Book, the set of operational assurances includes system architecture, system integrity, covert channel analysis, and trusted recovery.
A type of threat that involves the interception, but not the alteration, of information. For example, a passive tap is a type of wiretapping that involves eavesdropping, monitoring, and/or recording of information, but not the generation of false messages or control signals. The danger of a passive threat is primarily to the secrecy of the information being transmitted. Contrast with active threat.
A secret sequence of characters that’s used to authenticate a user’s identity, usually during a login process.
A successful, unauthorised access to a computer system.
A type of testing in which testers attempt to circumvent the security features of a system in an effort to identify security weaknesses.
See security perimeter.
A type of interaction a subject can have with an object. For example, file permissions specify the actions particular users or classes of users can perform on the file. Examples are read, write and execute.
Personal identification number (PIN)
A number or code of some kind that’s unique to an individual and can be used to prove identity. Often used with automatic teller machines and access devices.
Protection of physical computer systems and related buildings and equipment from fire and other natural and environmental hazards, as well as from intrusion. Also covers the use of locks, keys and administrative measures used to control access to computer systems and facilities.
In cryptography, the original text that is being encrypted. Synonymous with cleartext.
The recording of a legitimate message and the later, unauthorised resending of the message. Synonymous with replay.
See security policy.
A security principle that protects individuals from the collection, storage, and dissemination of information about themselves and the possible compromises resulting from unauthorised release of that information.
Private Key encryption
A type of encryption that uses a single key to both encrypt and decrypt information. Also called symmetric, or single key encryption. Contrast with public key encryption.
A right granted to a user, a program, or a process. For example, certain users may have the privileges that allow them to access certain files in a system. Only the system administrator may have the privileges necessary to export data from a trusted system.
See administrative security.
A set of rules and formats for the exchange of information, particularly over a communications network.
A component process within an overall communication process. Typically, each layer provides specific functions and communicates with the layers above and beneath it.
The conceptual basis for describing how to communicate within a network.
Public key encryption
A type of encryption that uses two mathematically related keys. The public key is known within a group of users. The private key is known only to its owner. Contrast with private key encryption.
An operation involving the flow of information from an object to a subject. It does not have to involve the alteration of that information.
The actions necessary to restore a system and its data files after a system failure or intrusion.
From the Orange Book definition: “An access control concept that refers to an abstract machine that mediates all accesses to objects by subjects.”
See magnetic remanence.
The recording of a legitimate message and the later, unauthorised resending of the message. Synonymous with playback.
The denial by a message sender that the message was sent, or by a message recipient that the message was received.
Data left in storage or on a medium before the data has been rewritten or eliminated in some other way.
A biometric system that compares a retina blood vessel pattern with a stored pattern to determine whether there’s a match.
The probability that a particular security threat will exploit a particular system vulnerability.
An analysis of a system’s information needs and vulnerabilities to determine how likely they are to be exploited in different ways and the costs of losing and/or recovering the system or its information.
The overwriting of sensitive information on magnetic media; degaussing may also be used. Sometimes called “scrubbing.”
A security principle that keeps information from being disclosed to anyone not authorised to access it. Synonymous with confidentiality.
A condition in which none of the subjects in a system can access objects in an unauthorised manner.
Freedom from risk or danger. Safety and the assurance of safety.
From the Orange Book definition: “The hardware, firmware, and software elements of a Trusted Computing Base that implement the reference monitor concept. It must mediate all accesses, be protected from modification, and be verifiable as correct.”
A representation of the sensitivity of information, derived from a sensitivity label (consisting of classification and categories).
A precise statement of the security rules of a system.
An imaginary boundary between the Trusted Computing Base (inside the perimeter) and other system functions (outside the perimeter). In a networking environment, sometime used to refer to the boundary between trusted and untrusted systems.
From the Orange Book definition: “The set of laws, rules, and practices that regulate how an organisation manages, protects, and distributes sensitive information.”
A type of testing in which testers determine whether the security features of a system are implemented as designed. Security testing may include hands-on functional testing, penetration testing, and formal verification.
A form of discretionary access control in which file access is determined by category. File permissions or some other scheme allow the owner of the file to specify what permissions he or she (self) will have, what permissions a group of users will have, and what permissions the rest of the world (public) will have. Typical permissions include read, write, and execute.
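The three-way decision can be sketched in Python; the metadata layout and user/group names are illustrative assumptions in the style of Unix file permissions:

```python
# Sketch of self/group/public permission checking.
# file_meta holds the owner, the owning group, and one permission
# set for each of the three categories.

def allowed(user: str, user_group: str, file_meta: dict, access: str) -> bool:
    if user == file_meta["owner"]:
        perms = file_meta["self"]            # the owner's own permissions
    elif user_group == file_meta["group"]:
        perms = file_meta["group_perms"]     # permissions for the group
    else:
        perms = file_meta["public"]          # permissions for everyone else
    return access in perms
```

The first matching category decides: an owner is checked against the "self" set even if the group or public set would grant more.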
Information that, if lost or compromised, would negatively affect the owner of the information, would jeopardise the ability of the system to continue processing and/or would require substantial resources to recreate. According to the US government (NTISSP 2), “information the disclosure, alteration, loss or destruction of which could adversely affect national security or other federal government interests.”
A label representing the security level of an object and describing the sensitivity of the data in the object. The label consists of two parts: a hierarchical classification and a set of non-hierarchical categories or compartments. In systems supporting mandatory access controls, sensitivity labels determine whether a particular subject will be allowed to access a particular object.
Separation of duty
A security principle that assigns security-related tasks to several distinct individuals. Usually, each of them has the least number of privileges needed to perform those tasks.
In TEMPEST technology, a container built around a piece of electronic equipment so the signals emanating from the equipment can’t be intercepted and deciphered.
A biometric system that compares a signature with a stored pattern to determine whether there’s a match.
Simple security condition
From the Orange Book definition: “A Bell-LaPadula security model rule allowing a subject read access to an object only if the security level of the subject dominates the security level of the object.” See also dominate.
Used to describe data or devices. Single-level security allows a system to be accessed at any one time only by users at the same sensitivity level. A single-level device is one used to process only data of a single security level at any one time. Contrast with multi-level.
An access card containing encoded information and sometimes a microprocessor and a user interface. The information on the card, or the information generated by the processor, is used to gain access to a facility or a computer system.
A trick that causes an authorised user to perform an action that violates system security or that gives away information to an intruder.
Star property (*-property)
From the Orange Book definition: “A Bell-LaPadula security model rule allowing a subject write access to an object only if the security level of the subject is dominated by the security level of the object. Also known as the confinement property.” See also dominate.
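The simple security condition and the star property are often summarised as “no read up, no write down.” A minimal sketch of the two rules as access checks, using bare integers for security levels purely for illustration:

```python
def may_read(subject_level, object_level):
    # Simple security condition: the subject must dominate the object.
    return subject_level >= object_level

def may_write(subject_level, object_level):
    # Star property (*-property): the object must dominate the subject.
    return object_level >= subject_level

SECRET, CONFIDENTIAL = 2, 1
print(may_read(SECRET, CONFIDENTIAL))    # True: reading down is allowed
print(may_write(SECRET, CONFIDENTIAL))   # False: writing down is forbidden
print(may_write(CONFIDENTIAL, SECRET))   # True: writing up is allowed
```

Together the two rules prevent information from flowing from a higher security level to a lower one, whether by a subject reading above its level or writing below it.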
From the Orange Book definition: “An active entity, generally in the form of a person, process, or device that causes information to flow among objects or changes in the system state.”
A type of cipher that replaces the characters being encrypted with substitute characters.
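A toy example of a substitution cipher is the classical Caesar cipher, in which each letter is replaced by the letter a fixed number of positions later in the alphabet. The sketch below is illustrative only; no real system should use such a cipher.

```python
import string

def substitute(text, shift=3):
    """Replace each letter with the letter `shift` positions later (Caesar cipher)."""
    alphabet = string.ascii_uppercase
    table = str.maketrans(alphabet, alphabet[shift:] + alphabet[:shift])
    return text.upper().translate(table)

print(substitute("ATTACK"))  # -> "DWWDFN"
```

Because the same substitute always stands for the same character, simple substitution ciphers preserve letter frequencies and are easily broken by frequency analysis.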
In TEMPEST technology, an approach taken to build equipment in such a way that signals don’t emanate from the equipment and thus can’t be intercepted and deciphered.
The lowest security level supported by a system at a particular time or in a particular environment.
The highest security level supported by a system at a particular time or in a particular environment.
System high workstation (SHW)
A type of compartmented mode workstation. Like a compartmented mode workstation, a system high workstation handles multiple compartments. Unlike a compartmented mode workstation, users must be cleared for all compartments on a system high workstation.
A government program that prevents the compromising electrical and electromagnetic signals that emanate from computers and related equipment from being intercepted and deciphered.
A possible danger to a computer system. See also active threat and passive threat.
A physical item that’s used to provide identity. Typically an electronic device that can be inserted in a door or computer system to gain access.
Top level specification
A nonprocedural description of system behaviour at an abstract level; for example, a functional specification that omits all implementation details.
A network configuration; the way the nodes of a network are connected together. Examples include bus, ring, and star topologies.
The messages flowing across a network. Analysis of message characteristics (e.g. length, frequency, destination) can sometimes provide information to an eavesdropper.
A type of cipher that rearranges the order of the characters being encrypted, but does not change the actual characters.
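A toy example of a transposition cipher is columnar transposition: the plaintext is written into rows of a fixed width and read out column by column, so every original character survives but in a different order. A minimal illustrative sketch:

```python
def transpose(text, width=4):
    """Columnar transposition: write text in rows of `width`, read by column."""
    rows = [text[i:i + width] for i in range(0, len(text), width)]
    return "".join(row[c] for c in range(width) for row in rows if c < len(row))

print(transpose("ATTACKATDAWN"))  # -> "ACDTKATAWATN"
```

Unlike a substitution cipher, a transposition cipher leaves letter frequencies intact; the two techniques are often combined in practice.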
A hidden mechanism that allows normal system protection to be circumvented. Trap doors are often planted by system developers to allow them to test programs without having to follow normal security procedures or user interfaces. They are typically activated in some unobvious way (e.g. by typing a particular sequence of keys). Synonymous with back door.
A type of programmed threat. An independent program that appears to perform a useful function but that hides another unauthorised program inside it. When an authorised user performs the apparent function, the Trojan horse performs the unauthorised function as well (often usurping the privileges of the user).
Reliance on the ability of a system to meet its specifications.
Trusted Computing Base (TCB)
From the Orange Book definition: “The totality of protection mechanisms within a computer system – including hardware, firmware, and software – the combination of which is responsible for enforcing a security policy. A TCB consists of one or more components that together enforce a unified security policy over a product or system. The ability of a TCB to correctly enforce a security policy depends solely on the mechanisms within the TCB and on the correct input by system administrative personnel of parameters (e.g. a user’s clearance) related to security policy.”
The process of distributing a trusted system in a way that assures that the system that arrives at the customer site is the exact, evaluated system shipped by the vendor.
Trusted facility management
The management of a trusted system in a way that assures separation of duty (e.g. separate operator, system administrator, and security administrator roles), with duties clearly delineated for each role.
A mechanism that allows a terminal user to communicate directly with the Trusted Computing Base. The mechanism can be activated only by the person or the TCB and cannot be initiated by untrusted software. With a trusted path, there is no way an intermediary program can mimic trusted software.
The set of procedures involved in restoring a system and its data in trusted fashion after a system crash or some other type of system failure.
A person or a process that accesses a computer system.
A unique code or string of characters with which the system identifies a specific user.
The performance of tests and evaluations to determine whether a system complies with security specifications and requirements.
The process of comparing two levels of system specification to ensure a correspondence between them; for example, security policy model with top-level specification, top-level specification with source code, or source code with object code. The process may be automated. See also formal verification.
A type of programmed threat. A code fragment (not an independent program) that reproduces by attaching to another program. It may damage data directly, or it may degrade system performance by taking over system resources which are then not available to authorised users.
A biometric system that compares a vocal pattern with a stored pattern to determine whether there’s a match.
A weakness in a computer system, or a point where the system is susceptible to attack. The weakness could be exploited to violate system security.
The attaching of an unauthorised device to a communications circuit to obtain access to data. Taps may be active or passive; see active threat and passive threat.
A type of programmed threat. An independent program that reproduces by copying itself from one system to another, usually over a network. Like a virus, a worm may damage data directly, or it may degrade system performance by tying up system resources and even shutting down a network.
An operation involving the flow of information from a subject to an object (e.g. the alteration of that information).