Multivariate statistics provide algorithms and functions to analyze multiple variables. Typical applications include dimensionality reduction by feature transformation and feature selection, and exploring relationships between variables using visualization techniques, such as scatter plot matrices and classical multidimensional scaling.
Fitting an Orthogonal Regression Using Principal Component Analysis (Example)
Implement Deming regression (total least squares).
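The idea behind this example can be sketched outside MATLAB: orthogonal (Deming, total least squares) regression fits the line through the data mean along the first principal component of the 2x2 covariance matrix, minimizing perpendicular rather than vertical distances. Below is a minimal pure-Python sketch, not the toolbox implementation; the function name `orth_fit` is hypothetical, and the closed-form slope assumes the covariance of x and y is nonzero.

```python
import math

def orth_fit(x, y):
    """Orthogonal-regression line via PCA: direction of the first
    principal component of the centered (x, y) data.
    Assumes nonzero covariance between x and y."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    # Slope of the eigenvector belonging to the largest eigenvalue of
    # the 2x2 covariance matrix [[sxx, sxy], [sxy, syy]].
    slope = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    intercept = my - slope * mx
    return slope, intercept
```

Unlike ordinary least squares, this fit treats x and y symmetrically, which is why it is appropriate when both variables are measured with error.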
Feature transformation (sometimes called feature extraction) is a dimensionality reduction technique that transforms existing features into new features (predictor variables), allowing less descriptive features to be dropped. The toolbox offers several approaches to feature transformation, such as principal component analysis (PCA).
Partial Least Squares Regression and Principal Component Regression (Example)
Model a response variable in the presence of highly correlated predictors.
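The principal component regression half of this example can be illustrated with a small sketch: center the predictors, project them onto the first principal component, regress the response on that score, and map the coefficient back to the original variables. The pure-Python function below (`pcr_fit` is an illustrative name, not a toolbox function) handles only two predictors and assumes they are correlated (nonzero cross-product), which is exactly the situation where PCR helps.

```python
import math

def pcr_fit(x1, x2, y):
    """Two-predictor principal component regression: regress y on the
    score of the first principal component, then express the fit as
    coefficients on the original predictors."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    c1 = [a - m1 for a in x1]
    c2 = [a - m2 for a in x2]
    cy = [a - my for a in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))  # assumed nonzero
    # Leading eigenpair of the 2x2 scatter matrix [[s11, s12], [s12, s22]].
    lam = (s11 + s22 + math.sqrt((s11 - s22) ** 2 + 4 * s12 ** 2)) / 2
    vx, vy = s12, lam - s11
    norm = math.hypot(vx, vy)
    vx, vy = vx / norm, vy / norm
    # Scores on the first component, then least squares on the score.
    t = [vx * a + vy * b for a, b in zip(c1, c2)]
    c = sum(ti * yi for ti, yi in zip(t, cy)) / sum(ti * ti for ti in t)
    b1, b2 = c * vx, c * vy
    return b1, b2, my - b1 * m1 - b2 * m2
```

Because the regression uses only the leading component, directions of low predictor variance (where collinearity makes ordinary least squares unstable) are discarded.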
Feature selection is a dimensionality reduction technique that selects only the subset of measured features (predictor variables) that provides the best predictive power in modeling the data. It is useful when you are dealing with high-dimensional data or when collecting data for all features is cost-prohibitive. Feature selection methods include sequential feature selection and stepwise regression. Feature selection can be used to improve prediction accuracy, make models easier to interpret, and reduce overfitting.
Selecting Features for Classifying High-Dimensional Data (Example)
Select important features for cancer detection.
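Sequential forward selection, one common feature selection strategy, can be sketched generically: starting from an empty set, greedily add whichever feature most improves a user-supplied criterion, and stop when no candidate helps or a size budget is reached. The function below is an illustrative sketch (`forward_select` is not a toolbox API), with the criterion passed in as a callable where lower values are better.

```python
def forward_select(features, criterion, k):
    """Greedy sequential forward selection: repeatedly add the feature
    that most improves criterion(selected) (lower is better), stopping
    after k features or when no candidate improves the score."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best = min(remaining, key=lambda f: criterion(selected + [f]))
        if selected and criterion(selected + [best]) >= criterion(selected):
            break  # no remaining feature improves the criterion
        selected.append(best)
        remaining.remove(best)
    return selected
```

In practice the criterion would be, e.g., cross-validated misclassification error of a model trained on the candidate subset; the toy set-cover criterion in the usage below just counts items left uncovered.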
Statistics Toolbox provides graphs and charts to explore multivariate data visually, including scatter plot matrices, dendrograms, biplots, parallel coordinate plots, Andrews plots, and glyph plots.