These functions generate tuning parameters that are useful for neural network models.
range: A two-element vector holding the defaults for the smallest and largest possible values, respectively. If a transformation is specified, these values should be in the transformed units.
trans: A transformation object from the scales package, such as
scales::transform_log10() or scales::transform_reciprocal(). If not provided,
the default matches the units used in range. If no transformation is
needed, NULL.
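As a sketch of how the trans argument behaves, batch_size() stores its range on the log2 scale by default, so custom bounds passed alongside a log2 transformation should also be given in log2 units (this example assumes the dials and scales packages are installed):

```r
library(dials)
library(scales)

# Default batch_size(): range c(5, 10) on the log2 scale,
# i.e. 32 to 1024 in natural units
batch_size()

# Custom bounds of 16 to 256, expressed in transformed units:
# log2(16) = 4 and log2(256) = 8
batch_size(range = c(4, 8), trans = transform_log2())
```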
dropout(): The parameter dropout rate. (See parsnip::mlp().)
epochs(): The number of iterations of training. (See parsnip::mlp().)
hidden_units(): The number of hidden units in a network layer.
(See parsnip::mlp().)
batch_size(): The mini-batch size for neural networks.
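Once created, these parameter objects can supply candidate values for tuning. A minimal sketch using dials' value helpers:

```r
library(dials)

# An evenly spaced sequence of five candidate values from the default range
value_seq(hidden_units(), n = 5)

# Three randomly sampled epoch counts from the default range
value_sample(epochs(), n = 3)
```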
dropout()
#> Dropout Rate (quantitative)
#> Range: [0, 1)
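These parameters can also be combined into a tuning grid. A short sketch, assuming the dials package is attached:

```r
library(dials)

# A regular grid with three levels per parameter
# (3 x 3 x 3 = 27 candidate combinations)
grid_regular(dropout(), epochs(), hidden_units(), levels = 3)
```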