Performance assessment and parameter tuning

⚠️ NOTE: This page describes a new model assessment and tuning workflow in the development version of mixOmics. We invite you to try it and share feedback during beta testing! To install, use:
devtools::install_github("mixOmicsTeam/mixOmics", ref = "6.31.4")
If you’re using the stable Bioconductor version, this page won’t apply.

This page provides an overview of model performance assessment and hyperparameter tuning in mixOmics, focusing on supervised models such as PLS-DA. Key functions discussed include tune(), perf.assess(), auroc(), and predict(). Performance can be assessed on held-out test data or via cross-validation (Mfold or loo). For classification models, we explore misclassification error rates (overall error rate, ER, and balanced error rate, BER), prediction distance metrics (max.dist, centroids.dist, mahalanobis.dist), and AUC-ROC analysis; for regression models, performance is measured with MSEP, RMSEP, R2, and Q2. Guidelines for choosing cross-validation parameters (folds, nrepeat) help ensure reliable model evaluation.
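A minimal sketch of this workflow on the srbct data is shown below. It assumes the development version of mixOmics noted above; the arguments to perf.assess() (validation, folds, nrepeat, dist) are assumed to mirror those of the long-standing perf() function and may differ in the release you install.

```r
library(mixOmics)

data(srbct)          # small round blue cell tumour gene expression study
X <- srbct$gene      # predictors: gene expression matrix
Y <- srbct$class     # outcome: tumour class (factor)

# Fit a PLS-DA model with 3 components
model <- plsda(X, Y, ncomp = 3)

# Assess performance with repeated M-fold cross-validation;
# error rates (ER, BER) are reported per component and distance metric
perf.out <- perf.assess(model, validation = "Mfold", folds = 5,
                        nrepeat = 10, dist = "max.dist")

# AUC-ROC analysis for each component
auc.out <- auroc(model)
```

For a leave-one-out scheme, set validation = "loo" (nrepeat is then not needed, since there is no random fold assignment to average over).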

Data used on this page:
srbct

Key functions used on this page:
tune(), perf.assess(), auroc(), predict()

Related case studies:
Case Study: sPLS-DA SRBCT

See more:
Selecting your method
Parameter tuning
Performance assessment of your final model
More details on Distance Metrics