
nearest_neighbor() defines a model that uses the K most similar data points from the training set to predict new samples. This function can fit classification and regression models.

There are different ways to fit this model, and the method of estimation is chosen by setting the model engine. The engine-specific page for this model is listed below.

  kknn¹

¹ The default engine.

More information on how parsnip is used for modeling is available on the tidymodels website.

nearest_neighbor(
  mode = "unknown",
  engine = "kknn",
  neighbors = NULL,
  weight_func = NULL,
  dist_power = NULL
)


mode
A single character string for the prediction outcome mode. Possible values for this model are "unknown", "regression", or "classification".

engine
A single character string specifying what computational engine to use for fitting.

neighbors
A single integer for the number of neighbors to consider (often called k). For kknn, a value of 5 is used if neighbors is not specified.

weight_func
A single character for the type of kernel function used to weight distances between samples. Valid choices are: "rectangular", "triangular", "epanechnikov", "biweight", "triweight", "cos", "inv", "gaussian", "rank", or "optimal".

dist_power
A single number for the parameter used in calculating Minkowski distance.


This function only defines what type of model is being fit. Once an engine is specified, the method to fit the model is also defined. See set_engine() for more on setting the engine, including how to set engine arguments.

The model is not trained or fit until the fit() function is used with the data.
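To illustrate the point above, here is a short sketch (not part of the original page) showing a specification being completed and then trained with fit(); it assumes the parsnip and kknn packages are installed and uses the built-in mtcars data:

```r
library(parsnip)

# nearest_neighbor() only declares the model; nothing is estimated
# until fit() is called with a formula (or x/y) and data.
knn_fit <- nearest_neighbor(neighbors = 5) |>
  set_mode("regression") |>
  set_engine("kknn") |>            # requires the kknn package
  fit(mpg ~ ., data = mtcars)
```

Before the final fit() call, the pipeline is still just a model specification and can be printed, tuned, or passed to other tidymodels functions.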

Each of the arguments in this function other than mode and engine is captured as a quosure. To pass values programmatically, use the injection operator like so:

value <- 1
nearest_neighbor(argument = !!value)


show_engines("nearest_neighbor")
#> # A tibble: 2 × 2
#>   engine mode          
#>   <chr>  <chr>         
#> 1 kknn   classification
#> 2 kknn   regression    

nearest_neighbor(neighbors = 11)
#> K-Nearest Neighbor Model Specification (unknown mode)
#> Main Arguments:
#>   neighbors = 11
#> Computational engine: kknn
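As a further sketch of the workflow (again assuming the kknn engine package is available), a classification specification like the one printed above can be fit and then used with predict(); the iris data here is only an illustrative choice:

```r
library(parsnip)

# Specification with engine-specific arguments set up front
knn_spec <- nearest_neighbor(neighbors = 11, weight_func = "triangular") |>
  set_mode("classification") |>
  set_engine("kknn")

# Train on the built-in iris data
knn_fit <- fit(knn_spec, Species ~ ., data = iris)

# predict() returns a tibble of predicted classes for the new data
predict(knn_fit, new_data = head(iris))
```

For classification fits, predict() returns a tibble with a .pred_class column by default; other prediction types (such as class probabilities) can be requested via the type argument.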