Activation functions between network layers
activation(values = values_activation)
activation_2(values = values_activation)
values_activation: An object of class character of length 23.
This parameter is used in parsnip models for neural networks, such as
parsnip::mlp().
values_activation
#> [1] "celu" "elu" "exponential" "gelu" "hardshrink"
#> [6] "hardsigmoid" "hardtanh" "leaky_relu" "linear" "log_sigmoid"
#> [11] "relu" "relu6" "rrelu" "selu" "sigmoid"
#> [16] "silu" "softmax" "softplus" "softshrink" "softsign"
#> [21] "swish" "tanh" "tanhshrink"
activation()
#> Activation Function (qualitative)
#> 23 possible values include:
#> 'celu', 'elu', 'exponential', 'gelu', 'hardshrink', 'hardsigmoid', 'hardtanh',
#> 'leaky_relu', 'linear', 'log_sigmoid', 'relu', 'relu6', 'rrelu', 'selu',
#> 'sigmoid', 'silu', 'softmax', 'softplus', …, 'tanh', and 'tanhshrink'
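As a sketch of typical use, the `values` argument can restrict the parameter to a subset of `values_activation` before sampling candidate values (assuming the dials package is attached; `value_sample()` draws from a qualitative parameter's value set):

```r
library(dials)

# Restrict the parameter to a few common activation functions;
# any subset of values_activation is valid here
act <- activation(values = c("relu", "tanh", "sigmoid"))

# Draw two candidate values from the restricted set
value_sample(act, 2)
```

A restricted set like this is often passed to a tuning grid so that only supported activations for a given engine are tried.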