mlp() defines a multilayer perceptron model (a.k.a. a single-layer, feed-forward neural network).

There are different ways to fit this model. See the engine-specific pages for more details:

• keras

• nnet (default)

More information on how parsnip is used for modeling is at https://www.tidymodels.org/.

mlp(
  mode = "unknown",
  engine = "nnet",
  hidden_units = NULL,
  penalty = NULL,
  dropout = NULL,
  epochs = NULL,
  activation = NULL
)

## Arguments

• mode: A single character string for the prediction outcome mode. Possible values for this model are "unknown", "regression", or "classification".

• engine: A single character string specifying what computational engine to use for fitting.

• hidden_units: An integer for the number of units in the hidden model.

• penalty: A non-negative numeric value for the amount of weight decay.

• dropout: A number between 0 (inclusive) and 1 denoting the proportion of model parameters randomly set to zero during model training.

• epochs: An integer for the number of training iterations.

• activation: A single character string denoting the type of relationship between the original predictors and the hidden unit layer. The activation function between the hidden and output layers is automatically set to either "linear" or "softmax" depending on the type of outcome. Possible values are: "linear", "softmax", "relu", and "elu".
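As a sketch of how these arguments combine with an engine, the snippet below builds a classification specification for the keras engine; the particular values (hidden_units = 10, dropout = 0.2, and so on) are illustrative, not recommendations:

```r
library(parsnip)

# Main arguments are set in mlp(); any left NULL fall back to engine defaults.
# dropout and activation are only consumed by engines that support them (e.g. keras).
mlp(hidden_units = 10, dropout = 0.2, epochs = 50, activation = "relu") %>%
  set_engine("keras") %>%
  set_mode("classification")
```

Printing the resulting specification lists the main arguments and the chosen computational engine, as in the Examples section below.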

## Details

This function only defines what type of model is being fit. Once an engine is specified, the method to fit the model is also defined.

The model is not trained or fit until the fit.model_spec() function is used with the data.
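A minimal sketch of that two-step workflow, using the default nnet engine and the built-in mtcars data purely for illustration (the argument values here are arbitrary):

```r
library(parsnip)

# Step 1: define the model specification; nothing is estimated yet
spec <- mlp(mode = "regression", hidden_units = 5, penalty = 0.01, epochs = 100)

# Step 2: training happens only when fit() is called with a formula and data
fitted <- fit(spec, mpg ~ disp + wt, data = mtcars)

predict(fitted, new_data = mtcars[1:3, ])
```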

## References

fit.model_spec(), set_engine(), update(), keras engine details, nnet engine details

## Examples

show_engines("mlp")
#> # A tibble: 4 × 2
#>   engine mode
#>   <chr>  <chr>
#> 1 keras  classification
#> 2 keras  regression
#> 3 nnet   classification
#> 4 nnet   regression
mlp(mode = "classification", penalty = 0.01)
#> Single Layer Neural Network Specification (classification)
#>
#> Main Arguments:
#>   penalty = 0.01
#>
#> Computational engine: nnet
#>