Is the sample standard deviation (s, or the calculator's σn-1) different from the normal standard deviation? How do you calculate it? Is there a formula?
The standard deviation = sqrt(variance), where variance = SUM((x – m)^2)/N for the population or SUM((x – m)^2)/(N – 1) for the sample. Here m = SUM(nx)/N is the mean, where n is the frequency (weight) of each distinct value x and N = SUM(n); for ungrouped data every n = 1, so this is just m = SUM(x)/N.
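As a quick sketch in Python (the data values are made up just for illustration), the formulas above translate directly:

```python
data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical sample values
N = len(data)

m = sum(data) / N  # the mean (each value has weight n = 1 here)

pop_var = sum((x - m) ** 2 for x in data) / N        # divide by N
samp_var = sum((x - m) ** 2 for x in data) / (N - 1) # divide by N - 1

sigma = pop_var ** 0.5   # population standard deviation
s = samp_var ** 0.5      # sample standard deviation
print(sigma, s)
```

Note that s comes out larger than sigma, as discussed below.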
s is the symbol commonly used for the sample std dev (the N – 1 denominator) and σ (sigma) for the population (the N denominator). N is the number of data points included in the variance.
As you can see, s and σ differ only in the denominator: N – 1 in the sample equation and N in the population one. Since N – 1 < N, dividing by the smaller number yields a larger variance, and hence a larger std dev, for the sample than for the population, representing the extra uncertainty.
And if you think about it, that makes sense. When taking just a few items as a sample from a larger population, you should expect more uncertainty (i.e., a bigger variance) when inferring things about the whole population. And that's why N – 1, rather than N, is used in the denominator for the standard deviation of a sample.
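Python's standard library exposes both versions, which makes the s > σ relationship easy to check (the data here is made up for illustration):

```python
import statistics

data = [1, 2, 3, 4, 5]  # hypothetical data

sigma = statistics.pstdev(data)  # population std dev (divides by N)
s = statistics.stdev(data)       # sample std dev (divides by N - 1)

print(sigma, s)  # s is the larger of the two
```

The gap between the two shrinks as N grows, since N – 1 and N become nearly equal for large samples.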