
Explainable Hyperparameters Optimization using Hilbert-Schmidt Independence Criterion

Abstract: Tackling new machine learning problems with neural networks always involves optimizing numerous hyperparameters that define their structure and strongly impact their performance. In this work, we use a robust system conception approach to build an explainable hyperparameter optimization method. This approach consists in searching for a parametrization of a system (a neural network) that optimizes its output response (the prediction error) to an input stimulation (the test set). To that end, we study the use of the Hilbert-Schmidt Independence Criterion (HSIC), a measure of dependence between probability distributions widely used for sensitivity analysis in robust system conception, in the context of hyperparameter optimization. Hyperparameter spaces can be complex and awkward, with hyperparameters of different natures (categorical, discrete, Boolean, continuous), interactions, and interdependencies, which makes applying HSIC nontrivial. We alleviate these difficulties to make HSIC applicable in this context and obtain an analysis tool that quantifies the relative impact of each hyperparameter on a neural network's final error. Notably, we show how this knowledge allows us to obtain competitive neural networks that are naturally much more cost-effective.
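To make the underlying dependence measure concrete, below is a minimal, self-contained sketch of the standard biased HSIC estimator (Gretton et al., 2005), applied to rank a hyperparameter by its dependence with the observed error. The Gaussian-kernel bandwidths and the toy learning-rate/error data are illustrative assumptions, not the authors' exact experimental setup.

    # Minimal sketch of the biased HSIC estimator, HSIC_b = (n-1)^-2 * tr(K H L H).
    # Bandwidths and toy data are hypothetical, for illustration only.
    import numpy as np

    def gaussian_gram(x, sigma):
        """Gram matrix of a Gaussian kernel on a 1-D sample."""
        d2 = (x[:, None] - x[None, :]) ** 2
        return np.exp(-d2 / (2 * sigma ** 2))

    def hsic(x, y, sigma_x=1.0, sigma_y=1.0):
        """Biased HSIC estimate between two 1-D samples x and y."""
        n = len(x)
        H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
        K = gaussian_gram(x, sigma_x)
        L = gaussian_gram(y, sigma_y)
        return np.trace(K @ H @ L @ H) / (n - 1) ** 2

    # Illustrative use: samples of one hyperparameter vs. observed test errors.
    rng = np.random.default_rng(0)
    learning_rate = rng.uniform(1e-4, 1e-1, size=200)
    error = 0.1 + 0.5 * np.abs(np.log10(learning_rate) + 2) + rng.normal(0, 0.02, 200)
    print(hsic(learning_rate, error))  # larger value -> stronger dependence

In practice one would compute such a score for each hyperparameter and compare them; the paper's contribution lies in making this comparison meaningful across mixed hyperparameter types (categorical, discrete, Boolean, continuous).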

https://hal.archives-ouvertes.fr/hal-03128298
Contributor: Paul Novello
Submitted on: Thursday, February 25, 2021 - 4:23:18 PM
Last modified on: Wednesday, April 14, 2021 - 3:33:53 AM

File

ExplainableHSIC___preprint.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03128298, version 2

Citation

Paul Novello, Gaël Poëtte, David Lugato, Pietro Congedo. Explainable Hyperparameters Optimization using Hilbert-Schmidt Independence Criterion. 2021. ⟨hal-03128298v2⟩


Metrics

Record views: 135
File downloads: 52