If each value of a given group of observations is multiplied by 10, the standard deviation of the resulting observations is –
First, I need to recall the properties of standard deviation under linear transformations. If you add a constant to each data point, the standard deviation remains the same because adding shifts all values equally without changing their spread. But if you multiply each value by a constant, the standard deviation should also be multiplied by that constant. So multiplying by 10 would scale the standard deviation by 10 as well.
Wait, let me verify. Suppose the original standard deviation is σ. If each data point is multiplied by 10, the new standard deviation becomes 10σ. That makes sense because the deviations from the mean are also scaled by 10. So the answer should be that the standard deviation becomes 10 times the original.
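As a quick sanity check, here is a minimal Python sketch; the dataset is made up purely to illustrate the scaling property:

```python
import statistics

# Made-up observations, chosen only so the SD comes out to a round number
data = [2, 4, 4, 4, 5, 5, 7, 9]
scaled = [10 * x for x in data]

sd_original = statistics.pstdev(data)    # population SD of the original data
sd_scaled = statistics.pstdev(scaled)    # population SD after multiplying by 10

print(sd_original)                    # 2.0
print(sd_scaled)                      # 20.0
print(sd_scaled == 10 * sd_original)  # True: SD scales linearly with the constant
```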
Looking at the options provided, the correct answer is likely option C: 10 times the original standard deviation. The distractors target predictable misconceptions. Someone who believes standard deviation is unaffected by scaling would pick the unchanged value, and someone who applies the variance rule would pick 100 times. I need to make sure I explain why each wrong option is incorrect.
The key concept here is understanding how linear transformations affect statistical measures. Variance, on the other hand, would be multiplied by the square of the constant (so 100 times), but standard deviation is the square root of variance. Therefore, standard deviation scales linearly with the constant.
So the correct answer is that the standard deviation is multiplied by 10. The clinical pearl here is to remember that scaling data affects standard deviation linearly, while variance is affected quadratically. This is a common point in biostatistics exams.
**Core Concept**
Standard deviation quantifies the spread of data around the mean. When all data points are multiplied by a constant, the standard deviation scales linearly by the same factor, while variance scales by the square of the constant. This is a fundamental property of descriptive statistics.
**Why the Correct Answer is Right**
Multiplying each observation by 10 scales every deviation from the mean by 10. Standard deviation is the square root of the average squared deviation, so the factor of 10 becomes 100 inside the square and returns to 10 after the square root. If the original SD is σ, the new SD is 10σ; the variance, however, becomes 100σ² (a 100-fold increase).
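The same fact in one line of algebra, using the population-variance definition with observations \(x_i\), mean \(\bar{x}\), and transformed values \(10x_i\):

$$
\operatorname{Var}(10X) = \frac{1}{n}\sum_{i=1}^{n}\bigl(10x_i - 10\bar{x}\bigr)^2
= 100 \cdot \frac{1}{n}\sum_{i=1}^{n}\bigl(x_i - \bar{x}\bigr)^2
= 100\,\sigma^2,
\qquad
\operatorname{SD}(10X) = \sqrt{100\,\sigma^2} = 10\,\sigma.
$$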
**Why Each Wrong Option is Incorrect**
**Option A:** Suggesting no change ignores the linear scaling effect of multiplication on spread.
**Option B:** Suggesting a 100× increase incorrectly applies the variance rule (quadratic scaling) to standard deviation.
**Option D:** Suggesting division by 10 reverses the direction of scaling; multiplying every observation by a constant greater than 1 spreads the data out, it never compresses it.
**Clinical Pearl / High-Yield Fact**
Remember: *Multiplication scales SD linearly, variance quadratically. Addition/subtraction shifts the mean but leaves SD unchanged.* This distinction is critical for interpreting scaled or unit-converted data and is a frequent target on biostatistics exams.