If parameters of a model specification need to be modified, update() can be used in lieu of recreating the object from scratch.

# S3 method for boost_tree
update(
  object,
  parameters = NULL,
  mtry = NULL,
  trees = NULL,
  min_n = NULL,
  tree_depth = NULL,
  learn_rate = NULL,
  loss_reduction = NULL,
  sample_size = NULL,
  stop_iter = NULL,
  fresh = FALSE,
  ...
)

# S3 method for decision_tree
update(
  object,
  parameters = NULL,
  cost_complexity = NULL,
  tree_depth = NULL,
  min_n = NULL,
  fresh = FALSE,
  ...
)

# S3 method for gen_additive_mod
update(
  object,
  select_features = NULL,
  adjust_deg_free = NULL,
  parameters = NULL,
  fresh = FALSE,
  ...
)

# S3 method for linear_reg
update(
  object,
  parameters = NULL,
  penalty = NULL,
  mixture = NULL,
  fresh = FALSE,
  ...
)

# S3 method for logistic_reg
update(
  object,
  parameters = NULL,
  penalty = NULL,
  mixture = NULL,
  fresh = FALSE,
  ...
)

# S3 method for mars
update(
  object,
  parameters = NULL,
  num_terms = NULL,
  prod_degree = NULL,
  prune_method = NULL,
  fresh = FALSE,
  ...
)

# S3 method for mlp
update(
  object,
  parameters = NULL,
  hidden_units = NULL,
  penalty = NULL,
  dropout = NULL,
  epochs = NULL,
  activation = NULL,
  fresh = FALSE,
  ...
)

# S3 method for multinom_reg
update(
  object,
  parameters = NULL,
  penalty = NULL,
  mixture = NULL,
  fresh = FALSE,
  ...
)

# S3 method for nearest_neighbor
update(
  object,
  parameters = NULL,
  neighbors = NULL,
  weight_func = NULL,
  dist_power = NULL,
  fresh = FALSE,
  ...
)

# S3 method for proportional_hazards
update(
  object,
  parameters = NULL,
  penalty = NULL,
  mixture = NULL,
  fresh = FALSE,
  ...
)

# S3 method for rand_forest
update(
  object,
  parameters = NULL,
  mtry = NULL,
  trees = NULL,
  min_n = NULL,
  fresh = FALSE,
  ...
)

# S3 method for surv_reg
update(object, parameters = NULL, dist = NULL, fresh = FALSE, ...)

# S3 method for survival_reg
update(object, parameters = NULL, dist = NULL, fresh = FALSE, ...)

# S3 method for svm_linear
update(
  object,
  parameters = NULL,
  cost = NULL,
  margin = NULL,
  fresh = FALSE,
  ...
)

# S3 method for svm_poly
update(
  object,
  parameters = NULL,
  cost = NULL,
  degree = NULL,
  scale_factor = NULL,
  margin = NULL,
  fresh = FALSE,
  ...
)

# S3 method for svm_rbf
update(
  object,
  parameters = NULL,
  cost = NULL,
  rbf_sigma = NULL,
  margin = NULL,
  fresh = FALSE,
  ...
)

Arguments

object

A model specification.

parameters

A 1-row tibble or named list with main parameters to update. Use either parameters or the main arguments directly when updating, but not both for the same parameter. If a main argument is supplied directly, it supersedes the corresponding value in parameters. Passing engine-specific arguments in this object will result in an error.
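As a sketch of both update styles and the precedence rule (assuming parsnip is attached; the rand_forest() spec is just illustrative):

```r
library(parsnip)

# A spec whose main arguments we want to change later
spec <- rand_forest(trees = 500, min_n = 5)

# Update via a named list (or 1-row tibble) of main parameters
update(spec, parameters = list(trees = 1000))

# Main arguments given directly supersede entries in `parameters`
update(spec, parameters = list(trees = 1000), min_n = 10)
```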

mtry

A number for the number (or proportion) of predictors that will be randomly sampled at each split when creating the tree models (specific engines only).

trees

An integer for the number of trees contained in the ensemble.

min_n

An integer for the minimum number of data points in a node that is required for the node to be split further.

tree_depth

An integer for the maximum depth of the tree (i.e. number of splits) (specific engines only).

learn_rate

A number for the rate at which the boosting algorithm adapts from iteration-to-iteration (specific engines only).

loss_reduction

A number for the reduction in the loss function required to split further (specific engines only).

sample_size

A number for the number (or proportion) of data that is exposed to the fitting routine. For xgboost, the sampling is done at each iteration while C5.0 samples once during training.

stop_iter

The number of iterations without improvement before stopping (specific engines only).

fresh

A logical for whether the arguments should be modified in-place or replaced wholesale.

...

Not used for update().

cost_complexity

A positive number for the cost/complexity parameter (a.k.a. Cp) used by CART models (specific engines only).

select_features

TRUE or FALSE. If TRUE, the model has the ability to eliminate a predictor (via penalization). Increasing adjust_deg_free will increase the likelihood of removing predictors.

adjust_deg_free

If select_features = TRUE, then acts as a multiplier for smoothness. Increase this beyond 1 to produce smoother models.

penalty

A non-negative number representing the total amount of regularization (specific engines only).

mixture

A number between zero and one (inclusive) that is the proportion of L1 regularization (i.e. lasso) in the model. When mixture = 1, it is a pure lasso model while mixture = 0 indicates that ridge regression is being used (specific engines only).
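A minimal sketch of moving along the lasso/ridge spectrum with update() (assuming parsnip is attached; the penalty value is arbitrary):

```r
library(parsnip)

spec <- linear_reg(penalty = 0.01, mixture = 1)  # mixture = 1: pure lasso

update(spec, mixture = 0)    # mixture = 0: ridge regression
update(spec, mixture = 0.5)  # an even blend of L1 and L2 penalties
```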

num_terms

The number of features that will be retained in the final model, including the intercept.

prod_degree

The highest possible interaction degree.

prune_method

The pruning method.

hidden_units

An integer for the number of units in the hidden layer.

dropout

A number between 0 (inclusive) and 1 denoting the proportion of model parameters randomly set to zero during model training.

epochs

An integer for the number of training iterations.

activation

A single character string denoting the type of relationship between the original predictors and the hidden unit layer. The activation function between the hidden and output layers is automatically set to either "linear" or "softmax" depending on the type of outcome. Possible values are: "linear", "softmax", "relu", and "elu".

neighbors

A single integer for the number of neighbors to consider (often called k). For kknn, a value of 5 is used if neighbors is not specified.

weight_func

A single character for the type of kernel function used to weight distances between samples. Valid choices are: "rectangular", "triangular", "epanechnikov", "biweight", "triweight", "cos", "inv", "gaussian", "rank", or "optimal".

dist_power

A single number for the parameter used in calculating Minkowski distance.
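For example, a nearest_neighbor() spec can swap its kernel and neighbor count in one call (a sketch, assuming parsnip is attached):

```r
library(parsnip)

spec <- nearest_neighbor(neighbors = 5, weight_func = "rectangular")

# Change several main arguments at once; unspecified ones are kept
update(spec, neighbors = 10, weight_func = "gaussian", dist_power = 2)
```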

dist

A character string for the outcome distribution. "weibull" is the default.

cost

A positive number for the cost of predicting a sample within or on the wrong side of the margin.

margin

A positive number for the epsilon in the SVM insensitive loss function (regression only).

degree

A positive number for polynomial degree.

scale_factor

A positive number for the polynomial scaling factor.

rbf_sigma

A positive number for the radial basis function's sigma parameter.
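These SVM arguments update the same way as the others; a sketch for svm_rbf() (assuming parsnip is attached; the values are arbitrary):

```r
library(parsnip)

spec <- svm_rbf(cost = 1, rbf_sigma = 0.01)

# Loosen the cost penalty and widen the kernel
update(spec, cost = 10, rbf_sigma = 0.1)

# fresh = TRUE replaces the arguments wholesale: cost is dropped
update(spec, rbf_sigma = 0.2, fresh = TRUE)
```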

Value

An updated model specification.

Examples

model <- boost_tree(mtry = 10, min_n = 3)
model
#> Boosted Tree Model Specification (unknown)
#>
#> Main Arguments:
#>   mtry = 10
#>   min_n = 3
#>
#> Computational engine: xgboost
#>

update(model, mtry = 1)
#> Boosted Tree Model Specification (unknown)
#>
#> Main Arguments:
#>   mtry = 1
#>   min_n = 3
#>
#> Computational engine: xgboost
#>

update(model, mtry = 1, fresh = TRUE)
#> Boosted Tree Model Specification (unknown)
#>
#> Main Arguments:
#>   mtry = 1
#>
#> Computational engine: xgboost
#>

param_values <- tibble::tibble(mtry = 10, tree_depth = 5)

model %>% update(param_values)
#> Boosted Tree Model Specification (unknown)
#>
#> Main Arguments:
#>   mtry = 10
#>   min_n = 3
#>   tree_depth = 5
#>
#> Computational engine: xgboost
#>

model %>% update(param_values, mtry = 3)
#> Boosted Tree Model Specification (unknown)
#>
#> Main Arguments:
#>   mtry = 10
#>   min_n = 3
#>   tree_depth = 5
#>
#> Computational engine: xgboost
#>

param_values$verbose <- 0
# Fails due to engine argument
# model %>% update(param_values)

model <- linear_reg(penalty = 10, mixture = 0.1)
model
#> Linear Regression Model Specification (regression)
#>
#> Main Arguments:
#>   penalty = 10
#>   mixture = 0.1
#>
#> Computational engine: lm
#>

update(model, penalty = 1)
#> Linear Regression Model Specification (regression)
#>
#> Main Arguments:
#>   penalty = 1
#>   mixture = 0.1
#>
#> Computational engine: lm
#>

update(model, penalty = 1, fresh = TRUE)
#> Linear Regression Model Specification (regression)
#>
#> Main Arguments:
#>   penalty = 1
#>
#> Computational engine: lm
#>