scnn.regularizers
Regularizers for training neural networks by convex reformulation.
- class scnn.regularizers.FeatureGL1(lam: float)
A feature-wise group-L1 regularizer.
This regularizer produces feature sparsity in the final model, meaning that some features will not be used after training. The regularizer has the form,
\[R(U) = \lambda \sum_{i = 1}^d \|U_{\cdot, i}\|_2,\]where \(\lambda\) is the regularization strength.
- lam
the regularization strength.
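To make the column-wise grouping concrete, here is a minimal NumPy sketch of the penalty (an illustration, not the library's implementation), assuming `U` is a p × d weight matrix with one row per neuron and one column per input feature:

```python
import numpy as np

def feature_gl1(U, lam):
    """Feature-wise group-L1 penalty: lam * sum of column Euclidean norms."""
    # axis=0 takes the L2 norm of each column U_{., i}, i.e. one norm per feature.
    return lam * np.linalg.norm(U, axis=0).sum()

# A feature whose entire column is zero contributes nothing to the penalty;
# the group structure is what pushes whole columns (features) to exactly zero.
U = np.array([[3.0, 0.0],
              [4.0, 0.0]])
feature_gl1(U, lam=0.1)  # 0.1 * (5.0 + 0.0) = 0.5
```

Because the L2 norm of a column is non-differentiable at zero, the optimizer can drive entire columns exactly to zero, which is how feature sparsity arises.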
- class scnn.regularizers.L1(lam: float)
The L1 norm regularizer.
The regularizer has the form,
\[R(U) = \lambda \sum_{i = 1}^p \|U_{i}\|_1,\]where \(\lambda\) is the regularization strength.
- lam
the regularization strength.
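Summing the L1 norms of the rows \(U_i\) is the same as summing the absolute values of every entry of \(U\), so a NumPy sketch (an illustration, not the library's implementation) is one line:

```python
import numpy as np

def l1(U, lam):
    """L1 penalty: lam * sum of |U_ij| over all entries."""
    return lam * np.abs(U).sum()

U = np.array([[1.0, -2.0],
              [0.0,  3.0]])
l1(U, lam=0.5)  # 0.5 * (1 + 2 + 0 + 3) = 3.0
```

Unlike the group penalties, L1 acts entry-wise, so it zeros out individual weights rather than whole neurons or features.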
- class scnn.regularizers.L2(lam: float)
The standard squared-L2 norm regularizer, sometimes called weight decay.
The regularizer has the form,
\[R(U) = \lambda \sum_{i = 1}^p \|U_{i}\|^2_2,\]where \(\lambda\) is the regularization strength.
- lam
the regularization strength.
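Summing \(\|U_i\|_2^2\) over the rows is just the squared Frobenius norm of \(U\), i.e. the sum of all squared entries. A NumPy sketch (an illustration, not the library's implementation):

```python
import numpy as np

def l2_squared(U, lam):
    """Squared-L2 (weight decay) penalty: lam * sum of U_ij^2 over all entries."""
    # Equivalent to lam * np.linalg.norm(U, 'fro') ** 2.
    return lam * np.square(U).sum()

U = np.array([[1.0, 2.0],
              [2.0, 0.0]])
l2_squared(U, lam=0.1)  # 0.1 * (1 + 4 + 4 + 0) = 0.9
```

Because the squared norm is smooth, this penalty shrinks weights toward zero but does not produce exact sparsity.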
- class scnn.regularizers.NeuronGL1(lam: float)
A neuron-wise group-L1 regularizer.
This regularizer produces neuron sparsity in the final model, meaning that some neurons will be completely inactive after training. The regularizer has the form,
\[R(U) = \lambda \sum_{i = 1}^p \|U_i\|_2,\]where \(\lambda\) is the regularization strength.
- lam
the regularization strength.
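This is the row-wise counterpart of FeatureGL1: the groups are the rows \(U_i\) (one per neuron) instead of the columns. A minimal NumPy sketch (an illustration, not the library's implementation), again assuming `U` is a p × d matrix with one row per neuron:

```python
import numpy as np

def neuron_gl1(U, lam):
    """Neuron-wise group-L1 penalty: lam * sum of row Euclidean norms."""
    # axis=1 takes the L2 norm of each row U_i, i.e. one norm per neuron.
    return lam * np.linalg.norm(U, axis=1).sum()

# A neuron whose entire row is zero contributes nothing to the penalty;
# zeroed rows correspond to neurons that can be pruned after training.
U = np.array([[3.0, 4.0],
              [0.0, 0.0]])
neuron_gl1(U, lam=0.1)  # 0.1 * (5.0 + 0.0) = 0.5
```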
- class scnn.regularizers.Regularizer(lam: float)
Base class for all regularizers.