Thursday, February 4, 2010

Glossary of Vulnerability Testing Terminology


Editors: OUSPG crew (OUSPG), Ari Takanen (Codenomicon)

Table Of Contents


  1. Glossary of Vulnerability Testing Terminology
    1. Table Of Contents
    2. ABSTRACT
    3. OUSPG Glossary
    4. References
    5. Other glossaries




    ABSTRACT


    Several glossaries are available from different fields of expertise in software engineering and information security. Yet the terminology used in the context of implementation-level vulnerabilities has not stabilised. This document collects the relevant definitions from our main areas of interest. Terms are introduced with a reference to the source. When multiple sources present the same details on a term, usually only one is noted. An attempt is made to preserve the form of definition used in the original source. The glossary, with original wording and reference details, has been found useful within the group, and thus we are making it publicly available here. Please do not cite this glossary; refer to the original sources instead.

    OUSPG Glossary


    Abstract Syntax Notation One (ASN.1)

  • The language used by the OSI protocols for describing abstract syntax. This language is also used to encode SNMP packets. ASN.1 is defined in ISO documents 8824.2 and 8825.2. See also: Basic Encoding Rules. [1]

  • (C) OSI standards use ASN.1 to specify data formats for protocols. OSI defines functionality in layers. Information objects at higher layers are abstractly defined to be implemented with objects at lower layers. A higher layer may define transfers of abstract objects between computers, and a lower layer may define transfers concretely as strings of bits. Syntax is needed to define abstract objects, and encoding rules are needed to transform between abstract objects and bit strings. (See: Basic Encoding Rules.) [2]
Ad hoc
  • Something that is ad hoc or that is done on an ad hoc basis happens or is done only when the situation makes it necessary or desirable, rather than being arranged in advance or being part of a general plan. [3]
Ad hoc testing
  • Testing carried out using no recognised test case design technique. [4]
Ad-lib test
  • (also ad hoc test), a test executed without prior planning; especially if the expected test outcome is not predicted beforehand. An undocumented test. [5]
Anomaly
  • An anomaly is a rule or practice that is different from what is normal or usual, and which is therefore unsatisfactory. [3]
  • Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents. [6]
Attack
  • An attempt to bypass security controls on a computer. The attack may alter, release, or deny data. Whether an attack will succeed depends on the vulnerability of the computer system and the effectiveness of existing countermeasures. [7]
  • The act of trying to bypass security controls on a system. An attack may be active, resulting in the alteration of data; or passive, resulting in the release of data. Note: The fact that an attack is made does not necessarily mean that it will succeed. The degree of success depends on the vulnerability of the system or activity and the effectiveness of existing countermeasures. [8]
Attack potential
  • The perceived potential for success of an attack, should an attack be launched, expressed in terms of an attacker's expertise, resources and motivation. [9]
Audit
  • (missing definition)
Availability
  • Assuring information and communications services will be ready for use when expected. [7]
Availability of data
  • The state when data are in the place needed by the user, at the time the user needs them, and in the form needed by the user. [8]
Backus-Naur Form
  • (also Backus normal form, BNF), a metalanguage used to formally describe the syntax of another language. [5]
  • A metalanguage used to formally describe the syntax of a language. [4]
Basic Encoding Rules (BER)
  • Standard rules for encoding data units described in ASN.1. Sometimes incorrectly lumped under the term ASN.1, which properly refers only to the abstract syntax description language, not the encoding technique. See also: Abstract Syntax Notation One. [Source: NNSC] [1]
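As an illustration (our own sketch, not from the cited sources), the BER type-length-value scheme can be shown for two simple types. The tag values 0x02 (INTEGER) and 0x04 (OCTET STRING) are standard; everything else here is simplified (short-form lengths only):

```python
def ber_len(n):
    # Short-form length octet, valid only for lengths < 128 (enough for this sketch)
    assert n < 128
    return bytes([n])

def ber_integer(value):
    # Tag 0x02 = INTEGER; content octets are minimal two's-complement
    content = value.to_bytes((value.bit_length() + 8) // 8 or 1, "big", signed=True)
    return bytes([0x02]) + ber_len(len(content)) + content

def ber_octet_string(data):
    # Tag 0x04 = OCTET STRING; content octets are the raw bytes
    return bytes([0x04]) + ber_len(len(data)) + data

print(ber_integer(5).hex())           # 020105
print(ber_integer(128).hex())         # 02020080 (leading zero keeps it non-negative)
print(ber_octet_string(b"hi").hex())  # 04026869
```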
Black-box testing
  • Functional test case design: Test case selection that is based on an analysis of the specification of the component without reference to its internal workings. [4]
  • Functional testing. Testing that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to the selected inputs and execution conditions. [6]
Boundary value
  • A data value that corresponds to a minimum or maximum input, internal, or output value specified for a system or component. See also: stress testing. [6]
  • An input value or output value which is on the boundary between equivalence classes, or an incremental distance either side of the boundary. [4]
Boundary value analysis
  • (NBS) A selection technique in which test data are chosen to lie along "boundaries" of the input domain [or output range] classes, data structures, procedure parameters, etc. Choices often include maximum, minimum, and trivial values or parameters. This technique is often called stress testing. [10]
  • A test case design technique for a component in which test cases are designed which include representatives of boundary values. [4]
Boundary value coverage
  • The percentage of boundary values of the component's equivalence classes which have been exercised by a test case suite. [4]
Boundary value testing
  • A testing technique using input values at, just below, and just above, the defined limits of an input domain; and with input values causing outputs to be at, just below, and just above, the defined limits of an output domain. See: boundary value analysis, stress testing. [10]
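A sketch of how such boundary inputs might be derived for a numeric input domain (our own illustration; the helper name is invented):

```python
def boundary_values(lo, hi):
    # Values at, just below, and just above the defined limits of an input domain
    return sorted({lo - 1, lo, lo + 1, hi - 1, hi, hi + 1})

# For an input field specified as 1..100:
print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```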
Branch coverage
  • Metric of the number of branches executed under test; "100% branch coverage" means that every branch in a program has been executed at least once under some test (also link coverage). [5]
Breach
  • The successful defeat of security controls which could result in a penetration of the system. A violation of controls of a particular information system such that information assets or system components are unduly exposed. [7]
Brute force attack
  • (I) A cryptanalysis technique or other kind of attack method involving an exhaustive procedure that tries all possibilities, one-by-one. [2]
  • (C) For example, for ciphertext where the analyst already knows the decryption algorithm, a brute force technique to finding the original plaintext is to decrypt the message with every possible key. [2]
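A toy illustration of the exhaustive procedure (our own sketch; the single-byte XOR "cipher" is deliberately trivial, real ciphers have vastly larger key spaces):

```python
def xor_encrypt(plaintext, key):
    # Trivial cipher: XOR every byte with a single-byte key
    return bytes(b ^ key for b in plaintext)

def brute_force(ciphertext, known_word):
    # Try all 256 possible keys, one by one, until a decryption looks plausible
    for key in range(256):
        candidate = xor_encrypt(ciphertext, key)
        if known_word in candidate:
            return key, candidate
    return None

ct = xor_encrypt(b"attack at dawn", 0x5A)
key, pt = brute_force(ct, b"dawn")
print(key, pt)  # 90 b'attack at dawn'
```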
Buffer overflow
  • This happens when more data is put into a buffer or holding area than the buffer can handle. This is due to a mismatch in processing rates between the producing and consuming processes. This can result in system crashes or the creation of a back door leading to system access. [7]
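Python itself bounds-checks its buffers, so the following is only a simulation of the C-style failure mode described above (the flat memory layout and the adjacent bytes standing in for a saved return address are invented for illustration):

```python
# Simulated flat memory: an 8-byte buffer followed by adjacent bytes that
# stand in for a saved return address
memory = bytearray(8) + bytearray(b"\xde\xad\xbe\xef")

def unsafe_copy(dest_offset, data):
    # No length check against the 8-byte buffer; mimics C's strcpy
    memory[dest_offset:dest_offset + len(data)] = data

unsafe_copy(0, b"A" * 12)   # 12 bytes copied into an 8-byte buffer
print(memory[8:])           # adjacent "return address" clobbered: bytearray(b'AAAA')
```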
Bug
  • A fault in a program which causes the program to perform in an unintended or unanticipated manner. See: anomaly, defect, error, exception, fault. [10]
Certification
  • The comprehensive evaluation of the technical and nontechnical security features of an AIS and other safeguards, made in support of the accreditation process, that establishes the extent to which a particular design and implementation meet a specified set of security requirements. [8]
Classification
  • A classification is the separation or ordering of objects (or specimens) into classes [WEBOL 1998]. Classifications that are created non-empirically are called a priori classifications [...; Simpson 1961; WEBOL 1998]. Classifications that are created empirically by looking at the data are called a posteriori classifications [...; Simpson 1961; WEBOL 1998]. [11]
Code coverage
  • An analysis method that determines which parts of the software have been executed (covered) by the test case suite and which parts have not been executed and therefore may require additional attention. [4]
Component
  • An object of testing. An integrated assembly of one or more units and/or associated data objects or one or more components and/or associated data objects. By this (recursive) definition, a component can be anything from a unit to a system. [5]
Compromise
  • An intrusion into a computer system where unauthorized disclosure, modification or destruction of sensitive information may have occurred. [7]
  • A violation of the security policy of a system such that unauthorized disclosure of sensitive information may have occurred. [8]
Confidentiality
  • Assuring information will be kept secret, with access limited to appropriate persons. [7]
  • The concept of holding sensitive data in confidence, limited to an appropriate set of individuals or organizations. [8]
Cost-risk analysis
  • The assessment of the costs of providing data protection for a system versus the cost of losing or compromising the data. [8]
COTS Software
  • Commercial Off the Shelf - Software acquired by government contract through a commercial vendor. This software is a standard product, not developed by a vendor for a particular government project. [7]
Coverage
  • Any metric of completeness with respect to a test selection criterion. Without qualification, usually means branch or statement coverage. [5]
Crash
  • The sudden and complete failure of a computer system or component. [6]
Debugger
  • One who engages in the intuitive art of correctly determining the cause (e.g., bug) of a set of symptoms. [5]
Defect
  • Nonconformance to requirements. [12]
Denial of Service
  • Action(s) which prevent any part of an AIS from functioning in accordance with its intended purpose. [7]
  • Any action or series of actions that prevent any part of a system from functioning in accordance with its intended purpose. This includes any action that causes unauthorized destruction, modification, or delay of service. Synonymous with interdiction. [12]
  • Intentional degradation or blocking of computer or network resources. [13]
  • (I) The prevention of authorized access to a system resource or the delaying of system operations and functions. (See: availability, critical (resource of a system), flooding.) [2]
Disclosure of information
  • Dissemination of information to anyone who is not authorized to access that information. [13]
Dynamic analysis
  • The process of evaluating a system or component based on its behavior during execution. [6]
  • (NBS) Analysis that is performed by executing the program code. Contrast with static analysis. See: testing. [10]
Error
  • (1) The difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition. (2) An incorrect step, process, or data definition. Also: fault. (3) An incorrect result. Also: failure. (4) A human action that produces an incorrect result. Also: mistake. [6]
  • (ISO) A discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition. See: anomaly, bug, defect, exception, fault. [10]
  • An error is a mistake made by a developer. It might be a typographical error, a misreading of a specification, a misunderstanding of what a subroutine does, and so on (IEEE 1990). An error might lead to one or more faults. Faults are located in the text of the program. More precisely, a fault is the difference between the incorrect program and the correct version (IEEE 1990). [11]
Error guessing
  • A test case design technique where the experience of the tester is used to postulate what faults might occur, and to design tests specifically to expose them. [4]
Error seeding
  • The process of intentionally adding known faults to those already in a computer program for the purpose of monitoring the rate of detection and removal, and estimating the number of faults remaining in the program. [6]
  • Contrast with mutation analysis. [10]
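The classic seeding estimate can be sketched as follows (our own illustration; the formula assumes seeded and indigenous faults are equally detectable):

```python
def estimate_remaining(seeded_total, seeded_found, indigenous_found):
    # If testing found the same fraction of indigenous faults as of seeded
    # faults, estimated total indigenous faults = found / detection rate
    detection_rate = seeded_found / seeded_total
    estimated_total = indigenous_found / detection_rate
    return estimated_total - indigenous_found

# 20 faults seeded, 15 of them detected, alongside 30 indigenous faults found:
print(estimate_remaining(20, 15, 30))  # 10.0 faults estimated to remain
```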
Evaluation
  • Evaluation is a decision about the significance, value, or quality of something, based on careful study of its good and bad features. [3]
  • Assessment of a PP [Protection Profile], an ST [Security Target] or a TOE [Target of Evaluation], against defined criteria. [9]
Exception
  • An event that causes suspension of normal program execution. Types include addressing exception, data exception, operation exception, overflow exception, protection exception, underflow exception. [6]
Exercised
  • A program element is exercised by a test case when the input value causes the execution of that element, such as a statement, branch, or other structural element. [4]
Exhaustive testing
  • A test case design technique in which the test case suite comprises all combinations of input values and preconditions for component variables. [4]
  • (NBS) Executing the program with all possible combinations of values for program variables. Feasible only for small, simple programs. [10]
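A sketch of exhaustive testing over a domain small enough to make it feasible (the 4-bit saturating adder is an invented example component):

```python
import itertools

def saturating_add(a, b):
    # Component under test: 4-bit saturating addition
    return min(a + b, 15)

# Exhaustive testing: every combination of input values.
# Feasible here because the domain is only 16 x 16 = 256 cases.
for a, b in itertools.product(range(16), repeat=2):
    assert 0 <= saturating_add(a, b) <= 15
print("256 cases passed")
```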
Exploit
  • (verb) To, in some way, take advantage of a vulnerability in a system in the pursuit or achievement of some objective. All vulnerability exploitations are attacks but not all attacks exploit vulnerabilities. [14]
  • (noun) Colloquially for exploit script: a script, program, mechanism, or other technique by which a vulnerability is used in the pursuit or achievement of some information assurance objective. It is common speech in this field to use the terms exploit and exploit script to refer to any mechanism, not just scripts, that uses a vulnerability. [14]
Exploitation (of vulnerability)
  • The exploitation of an access control vulnerability is whatever causes the operating system to perform operations that are in conflict with the security policy as defined by the access control matrix. [11]
External IT entity
  • Any IT product or system, untrusted or trusted, outside of the TOE [Target of Evaluation] that interacts with the TOE. [9]
Failure
  • Deviation of the software from its expected delivery or service. [4] (after Fenton)
  • The inability of a system or component to perform its required functions within specified performance requirements. [6]
False Negative
  • Occurs when an actual intrusive action has occurred but the system allows it to pass as non-intrusive behavior. [7]
False Positive
  • Occurs when the system classifies an action as anomalous (a possible intrusion) when it is a legitimate action. [7]
Fault
  • An incorrect step, process, or data definition in a computer program. [6]
  • A manifestation of an error in software. A fault, if encountered may cause a failure. [4] (after do178b)
  • An incorrect step, process, or data definition in a computer program which causes the program to perform in an unintended or unanticipated manner. See: anomaly, bug, defect, error, exception. [10]
Fault injection
  • The hypothesized errors that software fault injection uses are created by either: (1) adding code to the code under analysis, (2) changing the code that is there, or (3) deleting code from the code under analysis. Code that is added to the program for the purpose of either simulating errors or detecting the effects of those errors is called instrumentation code. To perform fault injection, some amount of instrumentation is always necessary, and although this can be added manually, it is usually performed by a tool. [15]
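A minimal sketch of instrumentation wrapped around a unit to simulate an error (the wrapper approach and all names are our own illustration, not taken from [15]):

```python
def divide(a, b):
    # Unit under analysis
    return a / b

def inject_fault(func, corrupt):
    # Instrumentation code added around the unit: it corrupts the
    # return value to simulate an internal error
    def instrumented(*args):
        return corrupt(func(*args))
    return instrumented

faulty_divide = inject_fault(divide, lambda r: r + 1)  # simulated off-by-one fault
print(divide(6, 3), faulty_divide(6, 3))  # 2.0 3.0
```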
Fault Tolerance
  • The ability of a system or component to continue normal operation despite the presence of hardware or software faults. [7]
Flaw hypothesis methodology
  • A systems analysis and penetration technique in which specifications and documentation for the system are analyzed and then flaws in the system are hypothesized. The list of hypothesized flaws is then prioritized on the basis of the estimated probability that a flaw exists and, assuming a flaw does exist, on the ease of exploiting it, and on the extent of control or compromise it would provide. The prioritized list is used to direct a penetration attack against the system. [8]
Formal
  • Expressed in a restricted syntax language with defined semantics based on well-established mathematical concepts. [9]
Formal specification
  • (I) A specification of hardware or software functionality in a computer-readable language; usually a precise mathematical description of the behavior of the system with the aim of providing a correctness proof. [2]
Format
  • The organization of information according to preset specifications (usually for computer processing) [syn: formatting, data format, data formatting] [16]
Glossary
  • A glossary is an alphabetical list of words or expressions and the special or technical meanings that they have in a particular book, subject, or activity. [3]
Hacker
  • A person who enjoys exploring the details of computers and how to stretch their capabilities. A malicious or inquisitive meddler who tries to discover information by poking around. A person who enjoys learning the details of programming systems and how to stretch their capabilities, as opposed to most users who prefer to learn only the minimum necessary. [7]
Implementation under test, IUT
  • The particular portion of equipment which is to be studied for testing. The implementation may include one or more protocols. [17]
Implementation vulnerability
  • A vulnerability resulting from an error made in the software or hardware implementation of a satisfactory design. [13]
Information warfare
  • (missing definition)
Injection vector
  • (missing definition)
Input
  • A variable (whether stored within a component or outside it) that is read by the component. [4]
Instrument
  • 1. A tool or device that is used to do a particular task. 2. A device that is used for making measurements of something. [3]
  • In software and system testing, to install or insert devices or instructions into hardware or software to monitor the operation of a system or component. [6]
Instrumentation
  • Instrumentation is a group or collection of instruments, usually ones that are part of the same machine. [3]
  • Devices or instructions installed or inserted into hardware or software to monitor the operation of a system or component. [6]
  • The insertion of additional code into the program in order to collect information about program behaviour during program execution. [4]
  • (NBS) The insertion of additional code into a program in order to collect information about program behavior during program execution. Useful for dynamic analysis techniques such as assertion checking, coverage analysis, tuning. [10]
Integrity
  • Assuring information will not be accidentally or maliciously altered or destroyed. [7]
  • Sound, unimpaired or perfect condition. [8]
Interface
  • (1) A shared boundary across which information is passed. (2) A hardware or software component that connects two or more other components for the purpose of passing information from one to the other. (3) To connect two or more components for the purpose of passing information from one to the other. (4) To serve as a connecting or connected component as in (2). [6]
  • (1) (ISO) A shared boundary between two functional units, defined by functional characteristics, common physical interconnection characteristics, signal characteristics, and other characteristics, as appropriate. The concept involves the specification of the connection of two devices having different functions. (2) A point of communication between two or more processes, persons, or other physical entities. (3) A peripheral device which permits two or more devices to communicate. [10]
Interface testing
  • Testing conducted to evaluate whether systems or components pass data and control correctly to each other. [6]
  • Integration testing where the interfaces between system components are tested. [4]
Language
  • Any means of conveying or communicating ideas; specifically, human speech; the expression of ideas by the voice; sounds, expressive of thought, articulated by the organs of the throat and mouth. [16]
Least privilege
  • Feature of a system in which operations are granted the fewest permissions possible in order to perform their tasks. [18]
  • The principle that requires that each subject be granted the most restrictive set of privileges needed for the performance of authorized tasks. The application of this principle limits the damage that can result from accident, error, or unauthorized use. [8]
Liability
  • Liability for something such as debt or crime is the legal responsibility for it; a technical term in law. [3]
Malicious code, malicious logic, malware
  • (I) Hardware, software, or firmware that is intentionally included or inserted in a system for a harmful purpose. (See: logic bomb, Trojan horse, virus, worm.) [2]
  • Hardware, software, or firmware that is intentionally included in a system for an unauthorized purpose; e.g., a Trojan horse. [8]
Mutation analysis
  • (NBS) A method to determine test set thoroughness by measuring the extent to which a test set can discriminate the program from slight variants [mutants] of the program. Contrast with error seeding. [10]
  • A method to determine test case suite thoroughness by measuring the extent to which a test case suite can discriminate the program from slight variants (mutants) of the program. See also error seeding. [4]
Mutation testing
  • A testing methodology in which two or more program mutations are executed using the same test cases to evaluate the ability of the test cases to detect differences in the mutations. [6]
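A small sketch of the idea (our own illustration; `grade` and its mutant are invented). A test set "kills" a mutant when some test case produces different outcomes from the two program versions:

```python
def grade(score):
    # Original program
    return "pass" if score >= 50 else "fail"

def grade_mutant(score):
    # Mutant: ">=" replaced by ">"
    return "pass" if score > 50 else "fail"

def kills(test_inputs):
    # The test set discriminates the mutant if any input distinguishes them
    return any(grade(x) != grade_mutant(x) for x in test_inputs)

print(kills([40, 60]))      # False: weak test set misses the boundary value
print(kills([40, 50, 60]))  # True: the boundary case 50 kills the mutant
```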
Mutually suspicious
  • The state that exists between interacting processes (subsystems or programs) in which neither process can expect the other process to function securely with respect to some property. [8]
Negative tests
  • Tests aimed at showing that software does not work (also called dirty testing); often the most effective tests. [5]
Network protocol stack
  • Software package that provides general purpose networking services to application software, independent of the particular type of data link being used. [18]
Operational testing
  • Testing conducted to evaluate a system or component in its operational environment. [6]
Oracle
  • A mechanism to produce the predicted outcomes to compare with the actual outcomes of the software under test. [4] (after Adrion)
  • Any (often automated) means that provides information about the (correct) expected behavior of a component (HOWD86). Without qualification, this term is often used synonymously with input/outcome oracle. [5]
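One common kind of oracle checks an inverse property rather than storing predicted outcomes. A sketch (our own illustration, with `math.sqrt` standing in for the component under test):

```python
import math

def sqrt_under_test(x):
    # Component under test (here simply the library routine)
    return math.sqrt(x)

def oracle(x, actual, tolerance=1e-9):
    # Inverse-function oracle: squaring the result should recover the input
    return abs(actual * actual - x) <= tolerance

for x in [0.0, 2.0, 144.0]:
    assert oracle(x, sqrt_under_test(x))
print("oracle accepted all outcomes")
```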
Path coverage
  • Metric applied to all path-testing strategies: in a hierarchy by path length, where length is measured by the number of graph links traversed by the path or path segment; e.g. coverage with respect to path segments two links long, three links long, etc. Unqualified, this term usually means coverage with respect to the set of entry/exit paths. Often used erroneously as synonym for statement coverage. [5]
Penetration
  • (I) Successful, repeatable, unauthorized access to a protected system resource. (See: attack, violation.) [2]
  • The successful unauthorized access to an automated system. [7]
  • The successful act of bypassing the security mechanisms of a system. [8]
Penetration Testing
  • The portion of security testing in which the evaluators attempt to circumvent the security features of a system. The evaluators may be assumed to use all system design and implementation documentation, that may include listings of system source code, manuals, and circuit diagrams. The evaluators work under the same constraints applied to ordinary users. [7] [8]
  • (C) Penetration testing may be performed under various constraints and conditions. However, for a TCSEC evaluation, testers are assumed to have all system design and implementation documentation, including source code, manuals, and circuit diagrams, and to work under no greater constraints than those applied to ordinary users. [2]
Point of Control and Observation, PCO
  • A place (point) within a testing environment where the occurrence of test events is to be controlled and observed as defined by the particular abstract test method used. [17]
Precondition
  • Environmental and state conditions which must be fulfilled before the component can be executed with a particular input value. [4]
Proprietary
  • (I) Refers to information (or other property) that is owned by an individual or organization and for which the use is restricted by that entity. [2]
Protection profile
  • An implementation-independent set of security requirements for a category of TOEs [Target of Evaluation] that meet specific consumer needs. [9]
Protocol
  • A set of conventions that govern the interaction of processes, devices, and other components within a system. [6]
  • (ISO) A set of semantic and syntactic rules that determines the behavior of functional units in achieving communication. [10]
  • (I) A set of rules (i.e., formats and procedures) to implement and control some type of association (e.g., communication) between systems. (E.g., see: Internet Protocol.) [2]
  • Agreed-upon methods of communications used by computers. A specification that describes the rules and procedures that products should follow to perform activities on a network, such as transmitting data. If they use the same protocols, products from different vendors should be able to communicate on the same network. [7]
  • A set of rules and formats, semantic and syntactic, that permits entities to exchange information. [8]
  • Code of correct conduct: "safety protocols"; "academic protocol".[16]
  • Forms of ceremony and etiquette observed by diplomats and heads of state.[16]
Protocol Data Unit, PDU
  • A PDU is a message of a given protocol comprising payload and protocol-specific control information, typically contained in a header. PDUs pass over the protocol interfaces which exist between the layers of protocols (per OSI model). [17]
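A sketch of a PDU as protocol-specific header plus payload (the 4-byte header layout of version, type, and payload length is invented for illustration, not taken from any OSI protocol):

```python
import struct

VERSION = 1

def encode_pdu(msg_type, payload):
    # Hypothetical header: version (1 byte), type (1 byte), payload length (2 bytes)
    return struct.pack("!BBH", VERSION, msg_type, len(payload)) + payload

def decode_pdu(data):
    version, msg_type, length = struct.unpack("!BBH", data[:4])
    return version, msg_type, data[4:4 + length]

pdu = encode_pdu(0x02, b"hello")
print(pdu.hex())        # 0102000568656c6c6f
print(decode_pdu(pdu))  # (1, 2, b'hello')
```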
Regression testing
  • Retesting of a previously tested program following modification to ensure that faults have not been introduced or uncovered as a result of the changes made. [4]
Reliability
  • The probability of a given system performing its mission adequately for a specified period of time under the expected operating conditions. [8]
  • Software reliability is the probability that software will provide failure-free operation in a fixed environment for a fixed interval of time. Probability of failure is the probability that the software will fail on the next input selected. Software reliability is typically measured per some unit of time, whereas probability of failure is generally time independent. These two measures can be easily related if you know the frequency with which inputs are executed per unit of time. Mean-time-to-failure is the average interval of time between failures; this is also sometimes referred to as Mean-time-before-failure. [15]
Residual risk
  • The portion of risk that remains after security measures have been applied. [8]
  • (I) The risk that remains after countermeasures have been applied. [2]
Risk
  • The probability that a particular threat will exploit a particular vulnerability of the system. [8]
  • (I) An expectation of loss expressed as the probability that a particular threat will exploit a particular vulnerability with a particular harmful result. [2]
Risk analysis
  • The process of identifying security risks, determining their magnitude, and identifying areas needing safeguards. Risk analysis is a part of risk management. Synonymous with risk assessment. [8]
  • (C) The analysis lists risks in order of cost and criticality, thereby determining where countermeasures should be applied first. It is usually financially and technically infeasible to counteract all aspects of risk, and so some residual risk will remain, even after all available countermeasures have been deployed. [FP031, R2196] [2]
Risk assessment
  • A study of vulnerabilities, threats, likelihood, loss or impact, and theoretical effectiveness of security measures. The process of evaluating threats and vulnerabilities, known and postulated, to determine expected loss and establish the degree of acceptability to system operations. [7]
Risk management
  • The total process of identifying, controlling, and eliminating or minimizing uncertain events that may affect system resources. It includes risk analysis, cost benefit analysis, selection, implementation and test, security evaluation of safeguards, and overall security review. [8]
  • (I) The process of identifying, controlling, and eliminating or minimizing uncertain events that may affect system resources. (See: risk analysis.) [2]
Robustness
  • The degree to which a system or component can function correctly in the presence of invalid inputs or stressful environmental conditions. [6]
  • See: software reliability. [10]
Safety
  • (DOD) Freedom from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property, or damage to the environment. [12]
  • (I) The property of a system being free from risk of causing harm to system entities and outside entities. [2]
  • Software is deemed safe if it is impossible (or at least highly unlikely) that the software could ever produce an output that would cause a catastrophic event for the system that the software controls. Examples of catastrophic events include loss of physical property, physical harm, and loss-of-life. [15]
Safety-critical software
  • Safety-critical software is any software that can directly or indirectly contribute to the occurrence of a hazardous system state. [19]
Security
  • A condition that results from the establishment and maintenance of protective measures that ensure a state of inviolability from hostile acts or influences. [7]
  • The subfield of information science concerned with ensuring that information systems are imbued with the condition of being secure, as well as the means of establishing, testing, auditing, and otherwise maintaining that condition. [14]
  • (I) (1.) Measures taken to protect a system. (2.) The condition of a system that results from the establishment and maintenance of measures to protect the system. (3.) The condition of system resources being free from unauthorized access and from unauthorized or accidental change, destruction, or loss. [2]
  • Security is concerned with the protection of assets from threats, where threats are categorised as the potential for abuse of protected assets. All categories of threats should be considered; but in the domain of security greater attention is given to those threats that are related to malicious or other human activities. [9]
Security evaluation
  • An evaluation done to assess the degree of trust that can be placed in systems for the secure handling of sensitive information. One type, a product evaluation, is an evaluation performed on the hardware and software features and assurances of a computer product from a perspective that excludes the application environment. The other type, a system evaluation, is done for the purpose of assessing a system's security safeguards with respect to a specific operational mission and is a major step in the certification and accreditation process. [8]
Security flaw
  • An error of commission or omission in a system that may allow protection mechanisms to be bypassed. [8]
Security function
  • A part or parts of the TOE [Target of Evaluation] that have to be relied upon for enforcing a closely related subset of the rules from the TSP [TOE Security Policy]. [9]
Security measures
  • Elements of software, firmware, hardware, or procedures that are included in a system for the satisfaction of security specifications.[8]
Security requirement
  • Security requirements generally include both requirements for the presence of desired behaviour and requirements for the absence of undesired behaviour. It is normally possible to demonstrate, by use or testing, the presence of the desired behaviour. It is not always possible to perform a conclusive demonstration of absence of undesired behaviour. Testing, design review, and implementation review contribute significantly to reducing the risk that such undesired behaviour is present. [9]
Security target
  • A set of security requirements and specifications to be used as the basis for evaluation of an identified TOE [Target of Evaluation]. [9]
Security testing
  • Testing whether the system meets its specified security objectives. [4]
  • Security testing attempts to verify that protection mechanisms built into a system will, in fact, protect it from improper penetration. ... Given enough time and resources, good security testing will ultimately penetrate a system. [20] (p.652)
  • A process used to determine that the security features of a system are implemented as designed. This includes hands-on functional testing, penetration testing, and verification. [8]
Silver bullet
  • A methodology, practice, or prescription that promises miraculous results if followed - e.g., structured programming will rid you of all bugs, as will human sacrifices to the Atlantean god Fugawe. Named either after the Lone Ranger whose silver bullets always brought justice or, alternatively, as the only known antidote to werewolves. [5]
Smart testing
  • Tests that, based on theory or experience, are expected to have a high probability of detecting specified classes of bugs; tests aimed at specific bug types. [5]
Snake oil
  • Derogatory term applied to a product whose developers describe it with misleading, inconsistent, or incorrect technical statements. [18]
Sneaker
  • An individual hired to break into places in order to test their security; analogous to tiger team. [7]
Software reliability
  • (IEEE) (1) The probability that software will not cause the failure of a system for a specified time under specified conditions. The probability is a function of the inputs to and use of the system as well as of the existence of faults in the software. The inputs to the system determine whether existing faults, if any, are encountered. (2) The ability of a program to perform its required functions accurately and reproducibly under stated conditions for a specified period of time. [10]
Statement coverage
  • Metric of the number of source language statements executed under test. [5]
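As an illustration of this metric, the following sketch (hypothetical code, not from any cited source) has each executable statement record itself in a set, so that coverage can be computed as the fraction of statements a test run actually exercised:

```python
# Statement coverage sketch: each tracked statement adds its number to
# `hits`; coverage is the fraction of statements executed so far.

hits = set()
TOTAL_STATEMENTS = 4

def classify(x):
    hits.add(1)              # statement 1: always reached
    if x < 0:
        hits.add(2)          # statement 2: negative branch
        return "negative"
    hits.add(3)              # statement 3: non-negative branch
    if x == 0:
        hits.add(4)          # statement 4: zero case
        return "zero"
    return "positive"

def coverage():
    return len(hits) / TOTAL_STATEMENTS

classify(5)
print(coverage())   # 0.5 -- one test input exercises half the statements
classify(-1)
classify(0)
print(coverage())   # 1.0 -- the three inputs together cover every statement
```

Note that full statement coverage here needs three inputs, which is why the metric is usually a minimum requirement rather than evidence of thorough testing.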
Static analysis
  • The process of evaluating a system or component based on its form, structure, content, or documentation. Contrast with: dynamic analysis. [6]
  • Analysis of a program carried out without executing the program. [4]
  • (NBS) Analysis of a program that is performed without executing the program. [10]
Stress testing
  • Testing in which a system is subjected to unrealistically harsh inputs or load with inadequate resources with the intention of breaking it. [5]
  • Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements. See also: boundary value. [6]
  • Stress tests are designed to confront programs with abnormal situations. ... Stress testing executes a system in a manner that demands resources in abnormal quantity, frequency, or volume. ... Essentially, the tester attempts to break the program. [20] (p.652-653)
Structural testing
  • Testing that takes into account the internal mechanism of a system or component. Types include branch testing, path testing, statement testing. Syn: glass-box testing; white-box testing. Contrast with: functional testing (1). [6]
  • (1) (IEEE) Testing that takes into account the internal mechanism [structure] of a system or component. Types include branch testing, path testing, statement testing. (2) Testing to ensure each program statement is made to execute during testing and that each program statement performs its intended function. Contrast with functional testing. Syn: white-box testing, glass-box testing, logic driven testing. [10]
Subtest
  • The smallest identifiable part of a test consisting of at least one input and one outcome. [5]
Symbolic execution
  • A software analysis technique in which program execution is simulated using symbols, such as variable names, rather than actual values for input data, and program outputs are expressed as logical or mathematical expressions involving these symbols. [6]
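A toy sketch of the idea (illustrative only; the path conditions are written out as strings rather than produced by a real constraint solver): the input is a symbol, not a value, and each feasible path through an absolute-value routine is summarised as a path condition plus a symbolic result.

```python
# Symbolic execution sketch: "run" abs() on a symbolic input and report,
# for each path, the condition under which it is taken and the symbolic
# expression it produces.

def symbolic_abs(x):
    """Return (path condition, symbolic output) pairs for abs(x)."""
    return [
        ("%s < 0" % x, "-%s" % x),   # path taken when the input is negative
        ("%s >= 0" % x, "%s" % x),   # path taken otherwise
    ]

for condition, result in symbolic_abs("x"):
    print("if %s: result = %s" % (condition, result))
```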
Syntax
  • The structural or grammatical rules that define how symbols in a language are to be combined to form words, phrases, expressions, and other allowable constructs. [10]
Syntax testing
  • A test case design technique for a component or system in which test case design is based upon the syntax of the input. [4]
System testing
  • The testing of a complete system prior to delivery. The purpose of system testing is to identify defects that will only surface when a complete system is assembled. That is, defects that cannot be attributed to individual components or the interaction between two components. System testing includes testing of performance, security, configuration sensitivity, startup and recovery from failure modes. [21]
System Under Test, SUT
  • The real open system in which the Implementation Under Test (IUT) resides. [17]
Target of evaluation, TOE
  • An IT product or system and its associated administrator and user guidance documentation that is the subject of evaluation. [9]
Taxonomy
  • A scheme that partitions a body of knowledge and defines the relationships among the pieces. It is used for classifying and understanding the body of knowledge. [22]
  • A taxonomy is the theoretical study of classification, including its bases, principles, procedures and rules [Simpson 1945; ...; WEBOL 1998]. [11]
Technical attack
  • An attack that can be perpetrated by circumventing or nullifying hardware and software protection mechanisms, rather than by subverting system personnel or other users. [8]
Technical vulnerability
  • A hardware, firmware, communication, or software flaw that leaves a computer processing system open for potential exploitation, either externally or internally, thereby resulting in risk for the owner, user, or manager of the system. [8]
Test
  • (1) An activity in which a system or component is executed under specified conditions, the results are observed or recorded and an evaluation is made of some aspect of the system or component. (2) To conduct an activity as in (1). (3) A set of one or more test cases. (4) A set of one or more test procedures. (5) A set of one or more test cases and procedures. [6]
  • Subtests are grouped into tests, which must be run as a set, typically because the outcome of one subtest is the input or the initial condition for the next subtest in the test. Tests can be run independently of one another but are typically defined over the same database. [5] (p.447)
Test bed
  • An environment containing the hardware, instrumentation, simulators, software tools, and other support elements needed to conduct a test. [6]
  • Any system whose primary purpose is to provide a framework within which other systems can be tested. Test beds are usually tailored to a specific programming language and implementation technique, and often to a specific application. Typically a test bed provides some means of simulating the environment of the system under test, of test-data generation and presentation, and of recording test results. [23] according to Dictionary of Computing, Valerie Illingworth, C1996
Test bed configuration
  • This includes many things: hardware physical configuration, platform software configuration, operating system version, sysgen details, test terminals, test tools, etc. It must be possible to precisely recreate the entire test situation... [5] (p.448)
Test case
  • (1) A set of test inputs, execution conditions, and expected results developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement [do178b?]. (2) Documentation specifying inputs, predicted results, and a set of execution conditions for a test item [24]. See also: test case generator; test case specification. [6]
  • A document describing a single test instance in terms of input data, test procedure, test execution environment and expected outcome. Test cases also reference test objectives such as verifying compliance with a particular requirement or execution of a particular program path. [21]
Test case generator
  • A software tool that accepts as input source code, test criteria, specifications, or data structure definitions; uses these inputs to generate test input data; and, sometimes, determines expected results. Syn: test data generator, test generator. [6]
Test case specification
  • A document that specifies the test inputs, execution conditions, and predicted results for an item to be tested. Syn: test description, test specification. [6]
Test case suite
  • A collection of one or more test cases for the software under test. [4]
Test cycle
  • A formal test cycle consists of all tests performed. In software development, it can consist of, for example, the following tests: unit/component testing, integration testing, system testing, user acceptance testing and the code inspection. [25]
Test design
  • Documentation specifying the details of the test approach for a software feature or combination of software features and identifying the associated tests. [6] [24]
Test driver
  • A program or testing tool used to execute and control testing. Includes initialization, data object support, preparation of input values, call to tested object, recording and comparison of outcomes to required outcomes. [5]
  • A software module used to invoke a module under test and, often, provide test inputs, control and monitor execution, and report test results. Syn: test harness. [6]
  • A program or test tool used to execute software against a test case suite. [4]
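The definitions above can be made concrete with a minimal driver (hypothetical names, illustrative only): it prepares input values, invokes the unit under test for each case in a test case suite, and compares actual outcomes against required outcomes.

```python
# Test driver sketch: execute a unit under test against a small test case
# suite and report any cases whose outcome differs from the expected one.

def add(a, b):          # the unit under test
    return a + b

# test case suite: (input arguments, expected result)
CASES = [((1, 2), 3), ((0, 0), 0), ((-1, 1), 0)]

def run_driver():
    failures = []
    for args, expected in CASES:
        actual = add(*args)
        if actual != expected:
            failures.append((args, expected, actual))
    return failures

print(run_driver())   # [] -- every subtest produced its expected outcome
```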
Test environment
  • A description of the hardware and software environment in which the tests will be run, and any other software with which the software under test interacts when under test including stubs and test drivers. [4]
Test execution
  • The processing of a test case suite by the software under test, producing an outcome. [4]
Test generator
  • A program that generates tests in accordance to a specified strategy or heuristic. [5]
  • See: test case generator. [6]
Test item
  • A software item which is an object of testing. [6] [24]
Test log
  • A chronological record of all relevant details about the execution of a test. [6]
Test plan
  • A document describing the scope, approach, resources, and schedule of intended test activities. It identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning. [6] [24]
  • A record of the test planning process detailing the degree of tester independence, the test environment, the test case design techniques and test measurement techniques to be used, and the rationale for their choice. [4]
Test procedure
  • (1) Detailed instructions for the set-up, execution, and evaluation of results for a given test case. (2) A document containing a set of associated instructions as in (1). (3) Documentation specifying a sequence of actions for the execution of a test. [24] [6]
  • (NIST) A formal document developed from a test plan that presents detailed instructions for the setup, operation, and evaluation of the results for each defined test. See: test case. [10]
Test report
  • A document that summarizes the outcome of testing in terms of items tested, summary of results (e.g. defect density), effectiveness of testing and lessons learned. [21]
  • A document that describes the conduct and results of the testing carried out for a system or component. Syn: test summary report. [6]
Test result analyzer
  • A software tool used to test output data reduction, formatting, and printing. [10]
Test strategy
  • Any method for generating tests based on formally or informally defined criteria of test completeness (also test technique). [5]
Test suite
  • A test suite is a set of related tests, usually pertaining to a group of features or software component and usually defined over the same database. Suites are combined into groups. [5] (p.448)
  • A group of tests with a common purpose and database, usually run as a group. [5]
Tester
  • One who writes and/or executes tests of software with the intention of demonstrating that the software does not work. Contrast with programmer whose tests (if any) are intended to show that the program does work. [5]
Testing
  • The purpose of testing is to discover errors. Testing is the process of trying to discover every conceivable fault or weakness in a work product. [26]
  • (1) The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component. (2) The process of analyzing a software item to detect the differences between existing and required conditions, (that is, bugs) and to evaluate the features of the software items. [6]
Thrashing
  • A state in which a computer system is expending most or all of its resources on overhead operations, such as swapping data between main and auxiliary storage, rather than on intended computing functions. [6]
Threat
  • The means through which the ability or intent of a threat agent to adversely affect an automated system, facility, or operation can be manifest. A potential violation of security. [7]
  • Any circumstance or event with the potential to cause harm to a system in the form of destruction, disclosure, modification of data, and/or denial of service. [8]
Threat analysis
  • The examination of all actions and events that might adversely affect a system or operation. [8]
Tiger team
  • [U.S. military jargon] 1. Originally, a team (of sneakers) whose purpose is to penetrate security, and thus test security measures. ... Serious successes of tiger teams sometimes lead to early retirement for base commanders and security officers. 2. Recently, and more generally, any official inspection team or special firefighting group called in to look at a problem. A subset of tiger teams are professional crackers, testing the security of military computer installations by attempting remote attacks via networks or supposedly `secure' comm channels. The term has been adopted in commercial computer-security circles in this more specific sense. [27]
  • Government- and industry-sponsored teams of computer experts who attempt to break down the defenses of computer systems in an effort to uncover, and eventually patch, security holes. [7]
Trojan Horse
  • An apparently useful and innocent program containing additional hidden code which allows the unauthorized collection, exploitation, falsification, or destruction of data. [7]
Underflow
  • (ISO) The state in which a calculator shows a zero indicator for the most significant part of a number while the least significant part of the number is dropped. For example, if the calculator output capacity is four digits, the number .0000432 will be shown as .0000. See: arithmetic underflow. [10]
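The analogous effect in binary floating point (arithmetic underflow) can be shown with a short sketch (illustrative, not from any cited source): results smaller than the smallest positive normal double first lose precision as subnormals and, below the smallest subnormal, are flushed to exactly zero.

```python
import sys

# Arithmetic underflow sketch for IEEE 754 doubles.

tiny = sys.float_info.min     # smallest positive *normal* double, ~2.2e-308
subnormal = tiny / 2          # still representable, as a subnormal value
underflowed = tiny / 1e20     # far below the subnormal range (~5e-324)

print(subnormal > 0.0)        # True: gradual underflow keeps a nonzero value
print(underflowed == 0.0)     # True: the result has underflowed to zero
```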
Unit
  • The smallest piece of software that can be independently tested (i.e., compiled or assembled, loaded, and tested). Usually the work of one programmer consisting of a few hundred lines of source code. [5]
Validation
  • The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements. [6]
  • (1) (FDA) Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes. Contrast with data validation. [10]
Vendor
  • A person or an organization that provides software and/or hardware and/or firmware and/or documentation to the user for a fee or in exchange for services. Such a firm could be a medical device manufacturer. [10]
  • A 'vendor' is any entity that produces networking or computing technology, and is responsible for the technical content of that technology. Examples of 'technology' include hardware (desktop computers, routers, switches, etc.), and software (operating systems, mail forwarding systems, etc.). Note that the supplier of a technology is not necessarily the ' vendor' of that technology. As an example, an Internet Service Provider (ISP) might supply routers to each of its customers, but the 'vendor' is the manufacturer, since the manufacturer, rather than the ISP, is the entity responsible for the technical content of the router. [28]
Vulnerability
  • Hardware, firmware, or software flaw that leaves an AIS open for potential exploitation. A weakness in automated system security procedures, administrative controls, physical layout, internal controls, and so forth, that could be exploited by a threat to gain unauthorized access to information or disrupt critical processing. [7]
  • A weakness in system security procedures, system design, implementation, internal controls, etc., that could be exploited to violate system security policy. [8]
  • (I) A flaw or weakness in a system's design, implementation, or operation and management that could be exploited to violate the system's security policy. [2]
  • (C) Most systems have vulnerabilities of some sort, but this does not mean that the systems are too flawed to use. Not every threat results in an attack, and not every attack succeeds. Success depends on the degree of vulnerability, the strength of attacks, and the effectiveness of any countermeasures in use. If the attacks needed to exploit a vulnerability are very difficult to carry out, then the vulnerability may be tolerable. If the perceived benefit to an attacker is small, then even an easily exploited vulnerability may be tolerable. However, if the attacks are well understood and easily made, and if the vulnerable system is employed by a wide range of users, then it is likely that there will be enough benefit for someone to make an attack. [2]
  • "A state-space vulnerability is a characterization of a vulnerable state which distinguishes it from all non-vulnerable states. If generic, the vulnerability may characterize many vulnerable states; if specific, it may characterize only one..." [Bishop and Bailey 1996] [11]
  • The Data & Computer Security Dictionary of Standards, Concepts, and Terms [Longley and Shain 1990] defines computer vulnerability as: 1) In computer security, a weakness in automated systems security procedures, administrative controls, internal controls, etc., that could be exploited by a threat to gain unauthorized access to information or to disrupt critical processing. 2) In computer security, a weakness in the physical layout, organization, procedures, personnel, management, administration, hardware or software that may be exploited to cause harm to the ADP system or activity. The presence of a vulnerability does not itself cause harm. A vulnerability is merely a condition or set of conditions that may allow the ADP system or activity to be harmed by an attack. 3) In computer security, any weakness or flaw existing in a system. The attack or harmful event, or the opportunity available to a threat agent to mount that attack. [11]
  • [Amoroso 1994] defines a vulnerability as an unfortunate characteristic that allows a threat to potentially occur. A threat is any potential occurrence, malicious or otherwise, that can have an undesirable effect on the assets and resources associated with a computer system. [11]
  • ...a fuzzy vulnerability is a violation of the expectations of users, administrators, and designers, particularly when the violation of these expectations is triggered by an external object. [11]
  • Software can be vulnerable because of an error in its specification, development, or configuration. A software vulnerability is an instance of an error in the specification, development, or configuration of software such that its execution can violate the security policy. [11]
  • A feature or a combination of features of a system that allows an adversary to place the system in a state that is both contrary to the desires of the people responsible for the system and increases the risk (probability or consequence) of undesirable behavior in or of the system. A feature or a combination of features of a system that prevents the successful implementation of a particular security policy for that system. A program with a buffer that can be overflowed with data supplied by the invoker will usually be considered a vulnerability. A telephone procedure that provides private information about the caller without prior authentication will usually be considered to have a vulnerability. [14]
  • A flaw or weakness in a system's design, implementation, or operation and management that could be exploited to violate the system's security policy. [29]
  • A 'vulnerability' is a characteristic of a piece of technology which can be exploited to perpetrate a security incident. For example, if a program unintentionally allowed ordinary users to execute arbitrary operating system commands in privileged mode, this "feature" would be a vulnerability. [28]
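The RFC 3067 example above can be sketched concretely (hypothetical code; nothing is executed here, we only construct the command strings that would be run): pasting untrusted input into a shell command line is the vulnerability, while proper quoting neutralises it.

```python
import shlex

# Command-injection sketch: a user-supplied name is joined into a shell
# command. Unquoted, shell metacharacters in the input become a second
# command; quoted, the input stays a single harmless argument.

def lookup_command_unsafe(user):
    return "finger " + user               # attacker-controlled text joins the command line

def lookup_command_safe(user):
    return "finger " + shlex.quote(user)  # metacharacters are neutralised

attack = "alice; rm -rf /"
print(lookup_command_unsafe(attack))  # finger alice; rm -rf /   <- injected command
print(lookup_command_safe(attack))    # finger 'alice; rm -rf /' <- single argument
```

Passing arguments as a list (e.g. to `subprocess.run` without a shell) avoids the problem entirely, since no shell ever parses the input.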
Vulnerability analysis
  • Systematic examination of an AIS or product to determine the adequacy of security measures, identify security deficiencies, provide data from which to predict the effectiveness of proposed security measures, and confirm the adequacy of such measures after implementation. [7]
  • The systematic examination of systems in order to determine the adequacy of security measures, identify security deficiencies, and provide data from which to predict the effectiveness of proposed security measures. [8]
Vulnerability assessment
  • A measurement of vulnerability which includes the susceptibility of a particular system to a specific attack and the opportunities available to a threat agent to mount that attack. [8]
Vulnerability case
  • (missing definition)
Worm
  • Independent program that replicates from machine to machine across network connections, often clogging networks and information systems as it spreads. [7]
  • (I) A computer program that can run independently, can propagate a complete working version of itself onto other hosts on a network, and may consume computer resources destructively. (See: Morris Worm, virus.) [2]
  • A computer program which replicates itself and is self-propagating. Worms, as opposed to viruses, are meant to spawn in network environments. Network worms were first defined by Shoch & Hupp of Xerox in ACM Communications (March 1982). The Internet worm of November 1988 is perhaps the most famous; it successfully propagated itself on over 6,000 systems across the Internet. See also: Trojan Horse, virus. [1]
Other possible sources of terms
These are some works that we should find and look through to check for possibly useful terminology and to cross-reference the terms present.
  • Dictionary of Computing, Valerie Illingworth, C1996
  • National Bureau of Standards [NBS] Special Publication 500-75 Validation, Verification, and Testing of Computer Software, 1981.
  • NBS. Special Publication 500-56, "Validation, Verification, and Testing for the Individual Programmer." (February 1980).
  • NBS. S.P. 500-93 Software Validation, Verification, and Testing Technique and Tool Reference Guide, 1982, 138 pp.
  • NBS. Special Publication 500-98, "Planning for Software Validation, Verification, and Testing." (November 1982).
  • Federal Information Processing Standards [FIPS] Publication 101, Guideline For Lifecycle Validation, Verification, and Testing of Computer Software, 1983.
  • American National Standard for Information Systems, Dictionary for Information Systems, American National Standards Institute, 1991.
  • Pressman, R., Software Engineering, A Practitioner's Approach, Third Edition, McGraw-Hill, Inc., 1992.
  • Myers, G., The Art of Software Testing, Wiley Interscience, 1979.
  • Beizer, B., Software Testing Techniques, Second Edition, Van Nostrand Reinhold, 1990.
  • Voas
  • The New IEEE Standard Dictionary of Electrical and Electronics Terms, IEEE Std. 100-1992.
  • IEEE Standards Collection, Software Engineering, 1994 Edition, published by the Institute of Electrical and Electronic Engineers Inc.
  • do178b
  • FDA Technical Report, Software Development Activities, July 1987.
  • FDA Guide to Inspection of Computerized Systems in Drug Processing, 1983.
  • FDA Guideline on General Principles of Process Validation, May 1987.
  • Reviewer Guidance for Computer Controlled Medical Devices Undergoing 510(k) Review, Office of Device Evaluation, CDRH, FDA, August 1991.
  • HHS Publication FDA 90-4236, Preproduction Quality Assurance Planning.
  • BSI. BS 4778-89 Glossary of Terms used in Quality Assurance

References

[1]
[2]
[3]
  • COBUILD English Language Dictionary. (1990). Collins.
[4]
[5]
  • Beizer, B.. Software Testing Techniques. Second edition. (1990). Van Nostrand Reinhold. ISBN: ISBN 1850328803.
[6]
  • (1991). "Standard Glossary of Software Engineering Terminology (ANSI)". The Institute of Electrical and Electronics Engineers Inc..
[7]
[8]
[9]
  • (1999). "Common Criteria for Information Technology Security Evaluation - Part 1". Common Criteria.
[10]
[11]
  • Krsul, I.. (1998). "Software Vulnerability Analysis". Department of Computer Sciences, Purdue University.
[12]
[13]
[14]
[15]
[16]
[17]
[18]
[19]
  • Nancy G. Leveson. Safeware: system safety and computers. (1995). Addison-Wesley Publishing Company Inc.. ISBN: ISBN 0-201-11972-2.
[20]
  • Pressman, R.. Software Engineering, A Practitioner's Approach. Third edition. (1992). McGraw-Hill.
[21]
[22]
  • (1986). "Standard Taxonomy for Software Engineering Standards (ANSI)". The Institute of Electrical and Electronics Engineers Inc..
[23]
[24]
  • (1983). "Standard for Software Test Documentation". The Institute of Electrical and Electronics Engineers Inc..
[25]
  • Perkins, J.. Advanced Microsoft Visual Basic 5, Chapter 10: Well, at Least It Compiled OK! The Value of Software Testing. Microsoft Press.
[26]
  • Myers, G.. The Art of Software Testing. (1979). Wiley Interscience.
[27]
[28]
[29]
  • J. Arvidsson, A. Cormack, Y. Demchenko, J. Meijer. (2001). "TERENA's Incident Object Description and Exchange Format Requirements". The Internet Society. http://www.ietf.org/rfc/rfc3067.txt.

Other glossaries

These do not contain terms that we were interested in, but might be useful for others.
"Glossary of Communications, Computer, Data, and Information Security Terms"
"Payment and security glossaries and taxonomies"
"TestWorks & Testing Technology Glossary"
"Valtionhallinnon tietoturvakäsitteistö"
"Tietotekniikan termitalkoot"
"ATK-sanakirja"
"HSTYA-projektin terminologiaa"
