Title: | Model Wrappers for Tree-Based Models |
Version: | 0.3.2 |
Description: | Bindings for additional tree-based model engines for use with the 'parsnip' package. Models include gradient boosted decision trees with 'LightGBM' (Ke et al., 2017), conditional inference trees and conditional random forests with 'partykit' (Hothorn and Zeileis, 2015; Hothorn et al., 2006 <doi:10.1198/106186006X133933>), and accelerated oblique random forests with 'aorsf' (Jaeger et al., 2022 <doi:10.5281/zenodo.7116854>). |
License: | MIT + file LICENSE |
URL: | https://bonsai.tidymodels.org/, https://github.com/tidymodels/bonsai |
BugReports: | https://github.com/tidymodels/bonsai/issues |
Depends: | parsnip (≥ 1.0.1), R (≥ 4.0) |
Imports: | cli, dials, dplyr, glue, purrr, rlang (≥ 1.1.0), stats, tibble, utils, withr |
Suggests: | aorsf (≥ 0.1.5), covr, knitr, lightgbm, Matrix, modeldata, partykit, rmarkdown, rsample, testthat (≥ 3.0.0), tune |
VignetteBuilder: | knitr |
Config/Needs/website: | tidyverse/tidytemplate |
Config/testthat/edition: | 3 |
Encoding: | UTF-8 |
RoxygenNote: | 7.3.2 |
NeedsCompilation: | no |
Packaged: | 2025-02-11 22:55:36 UTC; simoncouch |
Author: | Daniel Falbel [aut],
Athos Damiani [aut],
Roel M. Hogervorst [aut],
Max Kuhn [aut],
Simon Couch [aut, cre],
Posit Software, PBC [cph, fnd] |
Maintainer: | Simon Couch <simon.couch@posit.co> |
Repository: | CRAN |
Date/Publication: | 2025-02-11 23:30:06 UTC |
bonsai: Model Wrappers for Tree-Based Models
Description
Bindings for additional tree-based model engines for use with the 'parsnip' package. Models include gradient boosted decision trees with 'LightGBM' (Ke et al., 2017), conditional inference trees and conditional random forests with 'partykit' (Hothorn and Zeileis, 2015; Hothorn et al., 2006, doi:10.1198/106186006X133933), and accelerated oblique random forests with 'aorsf' (Jaeger et al., 2022, doi:10.5281/zenodo.7116854).
Author(s)
Maintainer: Simon Couch simon.couch@posit.co (ORCID)
Authors:
Daniel Falbel dfalbel@curso-r.com
Athos Damiani adamiani@curso-r.com
Roel M. Hogervorst hogervorst.rm@gmail.com (ORCID)
Max Kuhn max@posit.co (ORCID)
Other contributors:
Posit Software, PBC [copyright holder, funder]
See Also
Useful links:
https://bonsai.tidymodels.org/
https://github.com/tidymodels/bonsai
Report bugs at https://github.com/tidymodels/bonsai/issues
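As a brief illustration (a minimal sketch, assuming the parsnip and lightgbm packages are installed), loading bonsai registers the additional engines so that they can be used with ordinary parsnip model specifications:

```r
library(bonsai)   # registers the lightgbm, partykit, and aorsf engines
library(parsnip)

# A boosted regression tree specification using the LightGBM engine
spec <- boost_tree(trees = 100, learn_rate = 0.1) |>
  set_engine("lightgbm") |>
  set_mode("regression")

# Fit on the built-in mtcars data and predict
lgb_fit <- fit(spec, mpg ~ ., data = mtcars)
predict(lgb_fit, new_data = head(mtcars))
```

The same specification pattern applies to the partykit and aorsf engines via `set_engine("partykit")` or `set_engine("aorsf")` on the corresponding parsnip models.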
Internal functions
Description
Not intended for direct use.
Usage
predict_lightgbm_classification_prob(object, new_data, ...)
predict_lightgbm_classification_class(object, new_data, ...)
predict_lightgbm_classification_raw(object, new_data, ...)
predict_lightgbm_regression_numeric(object, new_data, ...)
## S3 method for class '_lgb.Booster'
multi_predict(object, new_data, type = NULL, trees = NULL, ...)
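For instance (a sketch assuming lightgbm is installed), the `multi_predict()` method returns predictions at several ensemble sizes from a single fit, without refitting:

```r
library(bonsai)
library(parsnip)

lgb_fit <- boost_tree(trees = 200) |>
  set_engine("lightgbm") |>
  set_mode("regression") |>
  fit(mpg ~ ., data = mtcars)

# One row per observation, with predictions nested across the
# requested numbers of trees
multi_predict(lgb_fit, new_data = mtcars[1:3, ], trees = c(10, 50, 200))
```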
Objects exported from other packages
Description
These objects are imported from other packages. Follow the links below to see their documentation.
- parsnip
Boosted trees with lightgbm
Description
train_lightgbm() is a wrapper for lightgbm tree-based models, where all of the model arguments are in the main function.
Usage
train_lightgbm(
x,
y,
weights = NULL,
max_depth = -1,
num_iterations = 100,
learning_rate = 0.1,
feature_fraction_bynode = 1,
min_data_in_leaf = 20,
min_gain_to_split = 0,
bagging_fraction = 1,
early_stopping_round = NULL,
validation = 0,
counts = TRUE,
quiet = FALSE,
...
)
Arguments
x |
A data frame or matrix of predictors |
y |
A vector (factor or numeric) or matrix (numeric) of outcome data. |
weights |
A numeric vector of sample weights. |
max_depth |
An integer for the maximum depth of the tree. |
num_iterations |
An integer for the number of boosting iterations. |
learning_rate |
A numeric value between zero and one to control the learning rate. |
feature_fraction_bynode |
Fraction of predictors that will be randomly sampled at each split. |
min_data_in_leaf |
A numeric value for the minimum sum of instances needed in a child to continue to split. |
min_gain_to_split |
A number for the minimum loss reduction required to make a further partition on a leaf node of the tree. |
bagging_fraction |
Subsampling proportion of rows. Setting this argument
to a non-default value will also set bagging_freq = 1. |
early_stopping_round |
The number of iterations without an improvement in the objective function before training is halted. |
validation |
The proportion of the training data that are used for performance assessment and potential early stopping. |
counts |
A logical; should feature_fraction_bynode be interpreted as the number of predictors that will be randomly sampled at each split? TRUE indicates that the argument will be interpreted as a count of predictors; FALSE indicates a proportion. |
quiet |
A logical; should logging by lightgbm::lgb.train() be muted? |
... |
Other options to pass to lightgbm::lgb.train(). |
Details
This is an internal function, not meant to be directly called by the user.
Value
A fitted lightgbm.Model object.
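As a sketch of how the counts argument behaves (count_to_fraction() below is a hypothetical helper for illustration, not part of bonsai): when counts = TRUE, a predictor count such as parsnip's mtry is converted to the proportion that lightgbm expects for feature_fraction_bynode:

```r
# Hypothetical helper illustrating the count-to-proportion conversion
# performed internally when counts = TRUE: sampling 5 of 20 predictors
# at each split corresponds to feature_fraction_bynode = 0.25.
count_to_fraction <- function(mtry, n_predictors) {
  mtry / n_predictors
}

count_to_fraction(5, 20)  # 0.25
```

With counts = FALSE, the value is passed through to lightgbm as a proportion directly.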