How to Calculate the Variance in Salaries

by James Green; Updated September 26, 2017

The variance of a set of data measures how far the observations are spread out from their mean. To calculate the variance of any distribution, you need the data's mean and the number of observations. A spreadsheet simplifies larger calculations: it not only makes the arithmetic easier but also provides built-in functions that calculate the variance automatically.

Step 1

Gather data regarding salaries. It's important that such data is within the same time frame, for example, in the same month, quarter or year. Databases for salaries in the United States may be obtained from the Bureau of Labor Statistics. Compiling your data into a spreadsheet will make the calculation process easier.

Step 2

Calculate the mean of the sample of salaries. This is accomplished by adding up all of the salaries, then dividing by the sample size. The sample size is the number of observations in your sample. So if you have observed salaries of $18,000, $12,000 and $14,000 a year, adding these up and dividing by three gives a mean of $14,666.67, rounded to the nearest penny.
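This step can be sketched in a few lines of Python, using the same three example salaries:

```python
# Example salaries from the text (annual, in dollars)
salaries = [18000, 12000, 14000]

# Mean = sum of observations divided by the sample size
mean = sum(salaries) / len(salaries)
print(round(mean, 2))  # 14666.67
```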

Step 3

Subtract the mean from each observed salary. Using the same example, the results are 3,333.33, -2,666.67 and -666.67. Square each of these results. This leaves you with 11,111,088.89 for the first observation, 7,111,128.89 for the second and 444,448.89 for the third. Take the sum of all of these results, which adds up to 18,666,666.67.
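Continuing the same example in Python, the squared deviations and their sum can be computed directly (working with the unrounded mean gives the same total to the nearest penny):

```python
# Example salaries from the text (annual, in dollars)
salaries = [18000, 12000, 14000]
mean = sum(salaries) / len(salaries)

# Deviation of each salary from the mean, then its square
deviations = [s - mean for s in salaries]
squared = [d ** 2 for d in deviations]

# Sum of squared deviations
print(round(sum(squared), 2))  # 18666666.67
```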

Step 4

Divide your result by the number of observations minus one to obtain the variance. Using the same example, dividing by two gives a variance of 9,333,333.33. Taking the square root of this number gives the standard deviation, which equals $3,055.05. Because the standard deviation is expressed in the same units as the original data (dollars rather than dollars squared), many people find it easier to interpret than the variance.
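Python's standard library can perform the whole calculation in one step; `statistics.variance` and `statistics.stdev` use the same sample formula (dividing by n - 1) as the steps above:

```python
import statistics

# Example salaries from the text (annual, in dollars)
salaries = [18000, 12000, 14000]

# Sample variance: sum of squared deviations divided by (n - 1)
variance = statistics.variance(salaries)

# Sample standard deviation: square root of the variance
std_dev = statistics.stdev(salaries)

print(round(variance, 2))  # 9333333.33
print(round(std_dev, 2))   # 3055.05
```

Spreadsheets offer equivalent built-in functions, such as VAR.S and STDEV.S in Excel.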
