Omitting correlated variables

  • L. Jenkins
  • M. Anderson


Data collected on the physical, biological, or man-made world are often highly correlated, raising the question of whether fewer variables would contain almost as much information. A crude solution is simply to inspect the Pearson correlation matrix and omit one of each pair of highly correlated variables. A more systematic method is to condition on one or more variables and examine the resulting partial covariance matrix. If the remaining variables have little variance after the conditioning, then the conditioning variables contain most of the information in the original set. Paralleling the usual tests for judging how many principal components are sufficient to represent the data, we use the amount of variance explained by the conditioning variable(s) as a measure of information content. The paper reviews earlier work in this area, explains the computation, and includes examples using published data sets. The approach is found to be highly competitive with principal components, and has the obvious advantage of simply omitting some of the original variables from further consideration rather than replacing them with linear combinations. The method has been coded as Visual Basic add-ins to an Excel spreadsheet.
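The conditioning step described above can be sketched as a Schur complement of the sample covariance matrix: the partial covariance of the retained variables given the conditioning variables is the covariance of the retained block minus the part predictable from the conditioning block. The following is a minimal illustrative sketch in Python/NumPy (the paper's own implementation is in Visual Basic for Excel, so the function names and the trace-based variance-explained measure here are assumptions, not the authors' code):

```python
import numpy as np

def partial_covariance(X, keep_idx, cond_idx):
    """Partial covariance of the kept variables after conditioning on
    the conditioning variables, computed as the Schur complement
    S_yy - S_yz S_zz^{-1} S_zy of the sample covariance matrix."""
    S = np.cov(X, rowvar=False)
    Syy = S[np.ix_(keep_idx, keep_idx)]
    Syz = S[np.ix_(keep_idx, cond_idx)]
    Szz = S[np.ix_(cond_idx, cond_idx)]
    return Syy - Syz @ np.linalg.solve(Szz, Syz.T)

def variance_explained(X, keep_idx, cond_idx):
    """Fraction of the retained variables' total variance (trace of
    their covariance block) accounted for by the conditioning
    variables -- one plausible analogue of the proportion-of-variance
    criterion used for principal components."""
    S = np.cov(X, rowvar=False)
    Syy = S[np.ix_(keep_idx, keep_idx)]
    P = partial_covariance(X, keep_idx, cond_idx)
    return 1.0 - np.trace(np.atleast_2d(P)) / np.trace(np.atleast_2d(Syy))
```

If a candidate conditioning subset leaves the partial covariance matrix with little remaining variance (variance explained close to 1), the omitted variables add little information beyond the retained ones.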
Research Articles