False Positive
A false positive is a test that incorrectly indicates the existence of something that doesn't exist. For example, a test that incorrectly indicates there is lead in a water sample when there is no lead present.
False Negative
A false negative is a test that incorrectly indicates the nonexistence of a condition. For example, a test that says you don't have a disease when in fact you do have it.
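The two error types above can be counted directly from test outcomes. The following is a minimal sketch with invented sample data, where each pair records whether a condition actually exists and what the test reported:

```python
# Counting false positives and false negatives for a hypothetical test.
# The sample data below is invented for illustration.
samples = [
    # (condition_actually_present, test_reported_present)
    (False, True),   # false positive: test flags something that isn't there
    (True,  False),  # false negative: test misses something that is there
    (True,  True),   # true positive
    (False, False),  # true negative
    (False, True),   # another false positive
]

false_positives = sum(1 for actual, test in samples if test and not actual)
false_negatives = sum(1 for actual, test in samples if actual and not test)

print(false_positives)  # 2
print(false_negatives)  # 1
```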
Random Error
A random error is an unpredictable error. For example, a wireless data transmission that is disrupted by electromagnetic interference from the environment, resulting in random changes to the data.
Human Error
An error caused by a person, such as a distracted driver who hits something.
Latent Human Error
Latent human error is a system, process or tool that encourages or fails to prevent human error. It is common to blame humans for things that are actually larger failures of organizations or design. For example, a cumbersome process for disabling a malfunctioning automated system on an aircraft is a design that is likely to fail in tragic ways.
Gross Errors
A gross error is a large mistake. These typically involve humans but should be caught by processes and systems, such that they are often latent human errors. For example, a stock trader who is instructed to buy 1,000,000 shares for $30 each who enters these numbers in reverse, ordering 30 shares for $1,000,000 each. If this trade executes, it is more of a system error than a human error.
Systemic Error
An error that is built into systems, processes or machines. For example, a flight navigation system that is updated with a software flaw that performs a critical calculation incorrectly.
Glitch
A glitch is a transitory systemic error or random error that doesn't have much impact. For example, if you were trapped in a simulated reality it might have small temporary imperfections such as briefly corrupted data.
Halt and Catch Fire
Halt and catch fire is a humorous analogy for poor reliability engineering, such as an operating system that completely crashes when it encounters errors. Historically, it was common for coders to feel it was dangerous to continue processing data when an error occurs, such that they escalated errors as opposed to trying to proceed. The logical conclusion of this is the machine instruction halt and catch fire, whereby the safest thing to do in the face of an error is to ask the machine to self-destruct.
Observational Error
An incorrect measurement or observation. This can be a human error, systemic error or random error. For example, a weighing scale that is miscalibrated such that all the measurements in an experiment are wrong.
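The miscalibrated scale example can be sketched in a few lines. The offset and weights below are invented; the point is that a calibration flaw shifts every measurement by the same amount, so all observations are wrong in a consistent way:

```python
# A miscalibrated scale adds a constant offset to every reading, producing
# a systematic observational error. All values are hypothetical.
true_weights = [10.0, 20.0, 30.0]   # kilograms
calibration_offset = 0.5            # hypothetical miscalibration

measured = [w + calibration_offset for w in true_weights]
errors = [m - t for m, t in zip(measured, true_weights)]

print(errors)  # every measurement is off by the same 0.5 kg
```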
Invalid Assumption
A stated or unstated assumption that is incorrect. For example, a driver who assumes a bridge isn't icy and slippery because the car ahead of them doesn't appear to have any trouble crossing.
Error of Judgment
Poor decisions that are fundamentally unreasonable can be viewed as errors. Decisions involve uncertainty and risk. As such, if a decision doesn't work out well, it wasn't necessarily an error of judgment. In order to qualify as an error of judgment, a decision must be unreasonable given the information available at the time. For example, randomly picking a fight with someone who you know to be a successful professional boxer might be reasonably classified as an error of judgment.
Fallacies
A fallacy is an error of logic. For example, the prosecutor's fallacy, whereby the effect of false positives over a large number of samples is misinterpreted.
An artificial intelligence monitors shoppers in a shop for the purposes of loss prevention. The system has a false positive rate of 1% and scans one million shoppers on Tuesday. On Tuesday, one thing is stolen from the shop. Amy is caught by the system. What is the probability she is innocent?
Fallacy: Many people focus on the 1% false positive rate and conclude there is a 1% chance Amy is innocent.
Answer: The system scans one million people on Tuesday, such that the false positive rate of 1% will generate 10,000 false positives. As only one thing was actually stolen, this means that 99.99% of people identified by the system are innocent. As such, there is also a 99.99% probability that Amy is innocent.
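The arithmetic behind this answer can be worked through in a short sketch, assuming the one actual thief is also flagged by the system:

```python
# The numbers from the scenario above.
scanned = 1_000_000
false_positive_rate = 0.01
actual_thefts = 1

# 1% of a million scans are false positives.
false_positives = scanned * false_positive_rate   # 10,000 innocent people flagged

# Assume the system also flags the one real thief.
flagged = false_positives + actual_thefts         # 10,001 people flagged in total

# Probability that a flagged person, such as Amy, is innocent.
p_innocent = false_positives / flagged

print(round(p_innocent * 100, 2))  # 99.99
```

This is why the base rate (one theft among a million scans) matters far more than the 1% false positive rate itself.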
Error of Omission
A failure to include something that should be included. For example, a study of the long term returns of the stock market that fails to consider firms that were delisted from markets, often due to complete financial failure.
Erroneous Inaction
Errors can result from both action and inaction. For example, a self-driving car that fails to stop for a pedestrian because it ran into computational delays.
Resource Exhaustion
Errors due to the capacity of resources. For example, an out of memory error that causes a software application to crash.
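Resource exhaustion can be illustrated with a fixed-capacity pool that fails once its limit is reached. The `ResourcePool` class and its capacity below are hypothetical, not from any particular library:

```python
# Illustrative sketch: a bounded resource that raises once exhausted,
# analogous to running out of memory, file handles or connections.
class ResourcePool:
    def __init__(self, capacity):
        self.capacity = capacity
        self.in_use = 0

    def acquire(self):
        if self.in_use >= self.capacity:
            raise RuntimeError("resource exhausted")
        self.in_use += 1

    def release(self):
        if self.in_use > 0:
            self.in_use -= 1

pool = ResourcePool(capacity=2)
pool.acquire()
pool.acquire()
try:
    pool.acquire()  # a third acquisition exceeds capacity
except RuntimeError as error:
    print(error)  # resource exhausted
```

A real out of memory error works the same way in principle: requests succeed until the underlying resource runs out, at which point the failure surfaces as an error rather than a result.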
Error vs Mistake
An error is unambiguously wrong. If a scale tells you that you weigh 400,000 kilograms, the machine has made an error, assuming you are in a 1g gravitational environment such as the gravity of Earth. The term error is often applied to technologies, processes and procedures where an unambiguously correct result is expected.

A mistake is wrong but in a more ambiguous way. For example, if you write a brilliant essay and your professor gives you a low grade, you may feel that this is a mistake. The term mistake is often applied to human situations as it is often debatable whether something is wrong.

The term error is applied to human actions, strategy, decisions and communications where a high degree of precision and accuracy can be reasonably expected. For example, a bank that publishes an incorrect interest rate on its website would likely admit this is an error as opposed to a mere mistake.

The term mistake is also used to describe complexities such as systems where "correct" and "incorrect" actions are often incalculable. For example, a well-intended government policy that has unintended consequences.
Definition: Information or a course of action that is unambiguously incorrect.