**Core Concept**
Sensitivity is a key performance metric for screening tests, which are used to detect diseases or conditions in asymptomatic populations. It measures the proportion of true cases that are correctly identified by the test. In other words, it estimates the likelihood that a person with the disease will test positive.
**Why the Correct Answer is Right**
The sensitivity of a screening test is calculated as the number of true positives (individuals with the disease who test positive) divided by the sum of true positives and false negatives (individuals with the disease who test negative): Sensitivity = TP / (TP + FN). A highly sensitive test therefore misses few cases, making it valuable for ruling out disease. For example, in HIV screening, a test with high sensitivity will correctly identify nearly all individuals who are infected with the virus.
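The calculation above can be sketched in a few lines of Python; the HIV screening counts below are illustrative assumptions, not data from any real study.

```python
# Sensitivity from confusion-matrix counts.
# The screening numbers used below are hypothetical, for illustration only.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Sensitivity = TP / (TP + FN): the fraction of diseased
    individuals the test correctly flags as positive."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical screen: 1,000 infected people tested;
# 995 test positive (TP), 5 test negative (FN).
print(sensitivity(995, 5))  # 0.995, i.e. 99.5% sensitivity
```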
**Why Each Wrong Option is Incorrect**
**Option A:** The false positive proportion describes individuals *without* the disease who test positive; the false positive rate equals 1 − specificity, so it relates to the test's specificity, not its sensitivity.
**Option B:** The false negative proportion describes individuals with the disease who test negative; the false negative rate equals 1 − sensitivity, so it is the complement of sensitivity, not sensitivity itself.
**Option C:** The true negative proportion describes individuals without the disease who test negative; this is the definition of specificity (TN / (TN + FP)), not sensitivity.
**Clinical Pearl / High-Yield Fact**
Remember that a highly sensitive screening test is not necessarily a highly specific one. For tests with an adjustable positivity cutoff, the two trade off: lowering the cutoff to raise sensitivity typically lowers specificity, and vice versa. A useful mnemonic pair: SnNout (a highly Sensitive test with a Negative result helps rule out disease) and SpPin (a highly Specific test with a Positive result helps rule in disease).
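The trade-off can be demonstrated with a toy example; the test scores below are synthetic assumptions chosen only to show how moving the cutoff shifts the two metrics in opposite directions.

```python
# Toy illustration of the sensitivity/specificity trade-off as the
# positivity cutoff moves. All scores are synthetic, not real data.

diseased = [0.4, 0.6, 0.7, 0.8, 0.9]   # test scores in diseased people
healthy = [0.1, 0.2, 0.3, 0.5, 0.6]    # test scores in healthy people

def sens_spec(cutoff: float) -> tuple[float, float]:
    """Classify score >= cutoff as positive; return (sensitivity, specificity)."""
    tp = sum(s >= cutoff for s in diseased)
    fn = len(diseased) - tp
    tn = sum(s < cutoff for s in healthy)
    fp = len(healthy) - tn
    return tp / (tp + fn), tn / (tn + fp)

for cutoff in (0.3, 0.5, 0.7):
    sens, spec = sens_spec(cutoff)
    print(f"cutoff={cutoff}: sensitivity={sens:.2f}, specificity={spec:.2f}")
# As the cutoff rises, sensitivity falls while specificity rises.
```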
**Correct Answer:**
D. True positive.