The standard deviation (SD) of a dataset is its average amount of variability. It indicates how far the data values in a given distribution deviate from the mean, or center, of the distribution. For normal distributions, a larger standard deviation means that the values are generally far from the mean, while a smaller standard deviation indicates that they cluster closer to the mean.

Variance is the average of the squared deviations from the mean. To calculate the variance, first subtract the mean from each value and square the results to obtain the squared differences. The average of those squared differences is the variance.

**Note** − Standard deviation is calculated by taking the square root of the variance.
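As a minimal sketch of these definitions (the list `data` is purely illustrative), the following Python snippet computes the variance and standard deviation by hand, using only the standard `math` module:

```python
import math

def variance(values):
    # Average of the squared differences from the mean.
    mean = sum(values) / len(values)
    squared_diffs = [(x - mean) ** 2 for x in values]
    return sum(squared_diffs) / len(values)

def standard_deviation(values):
    # Square root of the variance.
    return math.sqrt(variance(values))

data = [4, 8, 6, 5, 3, 7]          # illustrative values
print(variance(data))              # ≈ 2.9167
print(standard_deviation(data))    # ≈ 1.7078
```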

Variance and standard deviation help us analyze things that cannot be measured just by taking averages. For example, imagine that you have three cousins: one is 13 and the other two are 10-year-old twins, so their average age is 11. Now imagine that your three cousins are instead aged 17, 12, and 4. The average age is still 11, but the variance and standard deviation are quite different, as the sketch below shows.
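Here is a short sketch of that example using Python's built-in `statistics` module; the two age lists are taken from the paragraph above:

```python
import statistics

group_a = [13, 10, 10]  # one 13-year-old and 10-year-old twins
group_b = [17, 12, 4]   # same average age, much more spread out

for ages in (group_a, group_b):
    mean = statistics.mean(ages)
    var = statistics.pvariance(ages)  # population variance: divide by n
    sd = statistics.pstdev(ages)      # population standard deviation
    print(f"ages={ages} mean={mean} variance={var:.2f} sd={sd:.2f}")

# ages=[13, 10, 10] mean=11 variance=2.00 sd=1.41
# ages=[17, 12, 4] mean=11 variance=28.67 sd=5.35
```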

Knowing the difference between a population and a sample is important when dealing with statistical measurements. To compute the standard deviation (or variance) of a population, you would need to collect data for everyone in the group; for a sample, you only need measurements from a subset of that population.

If we assume that a group of data is a population, nothing further needs to be done before calculating the SD as described above. However, if we treat it as a sample, the sample variance and sample standard deviation are calculated differently: instead of dividing the sum of squared differences by the sample size n, we divide by n − 1, one less than the sample size.
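As a sketch of the difference, Python's built-in `statistics` module implements both conventions: `pvariance` and `pstdev` divide by n (population), while `variance` and `stdev` divide by n − 1 (sample):

```python
import statistics

data = [17, 12, 4]  # the same numbers, treated two different ways

# Population formulas divide the sum of squared differences by n.
print(statistics.pvariance(data))  # ≈ 28.67
print(statistics.pstdev(data))     # ≈ 5.35

# Sample formulas divide by n - 1, giving a slightly larger estimate.
print(statistics.variance(data))   # 43
print(statistics.stdev(data))      # ≈ 6.56
```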

Variance and standard deviation form the basis of many statistical calculations. For example, standard deviation is needed to convert test scores into Z-scores, and both measures appear in statistical tests such as t-tests. SD and variance therefore play an important role in statistics as well as in finance.
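For instance, a Z-score expresses how many standard deviations a raw score lies above or below the mean, z = (x − mean) / SD. A minimal sketch with hypothetical test scores:

```python
import statistics

scores = [62, 70, 75, 81, 88, 94]  # hypothetical test scores
mean = statistics.mean(scores)
sd = statistics.pstdev(scores)     # population SD of this score set

# Convert each raw score into a Z-score.
z_scores = [(x - mean) / sd for x in scores]
print([round(z, 2) for z in z_scores])  # ≈ [-1.52, -0.78, -0.31, 0.25, 0.9, 1.46]
```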
