PCR (Principal Component Regression)
https://doi.org/10.1098/rsta.2015.0202
Principal Component Regression (PCR) is a technique that combines Principal Component Analysis (PCA) with multiple linear regression. It is used primarily when the predictor variables in a regression analysis suffer from multicollinearity or when the number of predictor variables exceeds the number of observations. PCR first reduces the dimensionality of the predictor variables using PCA and then fits a linear regression model on the transformed predictors.
Configurable Parameters:
- Targets: The response variable(s) that the model aims to predict. This could be a single variable or a multivariate target, depending on the problem at hand.
- Number of components: Defines the number of principal components to be retained for the regression model. This parameter impacts both the performance and accuracy of the regression model, balancing between underfitting and overfitting.
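As a rough illustration of how these two parameters come together, here is a minimal sketch assuming a Python environment with scikit-learn; the names `X`, `y`, and the pipeline itself are illustrative and not tied to any specific toolbox.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Hypothetical data: 20 observations of 6 predictor variables and one target.
X = np.random.rand(20, 6)
y = np.random.rand(20)          # "Targets" parameter: the response to predict

# "Number of components" parameter: how many principal components feed the regression.
pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
pcr.fit(X, y)
print(pcr.predict(X[:3]))       # predictions for the first three observations
```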
Visual Example
We train the model using a spectral database with known values. This enables the machine to recognize subtle changes in the spectra related to the contents, thereby allowing it to predict the contents of spectra with unknown values.
Internally, the machine learning algorithm generates a calibration curve, positioning both known and unknown samples along it and quantifying them accordingly.
Key Concepts
- Principal Components: Derived from the PCA step, these components are the new variables obtained from an orthogonal transformation, intended to capture the significant variance and patterns in the original variables. As in PCA, the principal components serve as inputs to the subsequent regression analysis.
- Regression Coefficients: These coefficients correspond to each principal component and indicate its contribution to the target variable. They are calculated during the regression phase, where the dependent variable (e.g., wine quality) is predicted from the retained principal components.
- Variance Ratio: Indicates what proportion of the variance in the original variables is retained in the selected principal components. This metric guides the selection of the number of components: it is crucial for retaining enough variability to maintain model accuracy while also promoting simplicity.
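To make the variance-ratio idea concrete, here is a small sketch of choosing the number of components from the cumulative explained variance. It assumes scikit-learn, and the 95% threshold is an illustrative choice rather than a fixed rule.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X = np.random.rand(50, 10)                 # hypothetical data: 50 samples, 10 variables
X_std = StandardScaler().fit_transform(X)

pca = PCA().fit(X_std)                     # fit with all components first
cum_var = np.cumsum(pca.explained_variance_ratio_)
n_components = int(np.argmax(cum_var >= 0.95) + 1)   # smallest k reaching 95% of the variance
print(n_components, cum_var[n_components - 1])
```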
Data Example
Let's consider an example dataset consisting of different types of vehicles and their measured attributes, used to predict CO2 emissions.
| Vehicle | Weight | Engine Size | CO2 Emissions |
| --- | --- | --- | --- |
| Car 1 | 1500 | 1.6 | 120 |
| Car 2 | 2000 | 2.0 | 135 |
| Car 3 | 1800 | 2.2 | 150 |
| Car 4 | 1200 | 1.4 | 110 |
After applying PCA and retaining 2 PCs, the transformed data (Scores) might look like this:
| Vehicle | PC1 | PC2 |
| --- | --- | --- |
| Car 1 | -2.15 | 0.30 |
| Car 2 | 1.85 | -0.75 |
| Car 3 | 0.95 | 1.20 |
| Car 4 | -0.65 | -0.75 |
PCR then uses these scores to model the CO2 emissions. Suppose the regression model gives us these coefficients:
| Coefficient | Value |
| --- | --- |
| Intercept | 100 |
| PC1 | 8.0 |
| PC2 | 1.5 |
The predicted CO2 emissions are estimated using the formula:

$$\hat{y}_{\text{CO}_2} = 100 + 8.0 \cdot \text{PC1} + 1.5 \cdot \text{PC2}$$
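Expressed as a sketch in Python with scikit-learn (an assumption; the scores and coefficients it produces will not match the illustrative numbers in the tables above):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Weight and Engine Size for Car 1..Car 4, and their CO2 emissions.
X = np.array([[1500, 1.6],
              [2000, 2.0],
              [1800, 2.2],
              [1200, 1.4]])
y = np.array([120, 135, 150, 110])

X_std = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(X_std)   # PC1, PC2 scores per vehicle

reg = LinearRegression().fit(scores, y)
print("intercept:", reg.intercept_)   # analogous to the Intercept row
print("coefficients:", reg.coef_)     # analogous to the PC1 / PC2 rows
print("predicted CO2:", reg.predict(scores))
```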
Mathematical Explanation of PCR
- Standardize the data (same as in PCA): $z_{ij} = \frac{x_{ij} - \bar{x}_j}{s_j}$, where $\bar{x}_j$ and $s_j$ are the mean and standard deviation of variable $j$.
- Apply PCA to the standardized data to obtain principal components.
- Select the number of components based on the cumulative explained variance.
- Fit a linear regression model using the selected principal components as predictors.
The regression equation can be represented as:

$$y = \beta_0 + \beta_1 \text{PC}_1 + \beta_2 \text{PC}_2 + \dots + \beta_k \text{PC}_k + \varepsilon$$

Where:
- $\beta_1, \dots, \beta_k$ are the coefficients corresponding to each principal component, and $\beta_0$ is the intercept.
- $\varepsilon$ is the model's error term.
- Use the regression model to predict and interpret results.
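The steps above can also be spelled out with plain NumPy. The following is a minimal sketch under illustrative assumptions (synthetic data, a 95% cumulative-variance threshold), not a production implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 8))                                   # hypothetical predictors
y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=30)    # hypothetical target

# 1. Standardize the data.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Apply PCA via the eigendecomposition of the covariance matrix.
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]            # sort components by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 3. Select the number of components by cumulative explained variance.
cum_var = np.cumsum(eigvals) / eigvals.sum()
k = int(np.argmax(cum_var >= 0.95) + 1)
T = Z @ eigvecs[:, :k]                       # scores on the retained components

# 4. Fit y = b0 + b1*PC1 + ... + bk*PCk by least squares.
A = np.column_stack([np.ones(len(T)), T])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# 5. Use the model to predict and interpret results.
print("intercept:", beta[0])
print("component coefficients:", beta[1:])
print("predictions:", (A @ beta)[:5])
```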
The calculated regression coefficients indicate the influence of each principal component on the target variable. The intercept is the predicted value when all principal components are zero; because the components are centered, this corresponds to the mean of the target.
PCR is especially useful when dealing with high-dimensional data: it helps prevent overfitting by reducing the number of predictors while retaining the components that capture the most variance in the original variables.