[doc] Add more detailed explanations for advanced objectives (#10283)

---------

Co-authored-by: Jiaming Yuan <[email protected]>
david-cortes and trivialfis authored Jul 8, 2024
1 parent 2266db1 commit 8d0f2bf
Showing 7 changed files with 760 additions and 4 deletions.
12 changes: 12 additions & 0 deletions R-package/R/xgb.train.R
@@ -102,6 +102,18 @@
#' It might be useful, e.g., for modeling total loss in insurance, or for any outcome that might be
#' \href{https://en.wikipedia.org/wiki/Tweedie_distribution#Applications}{Tweedie-distributed}.}
#' }
+#'
+#' For custom objectives, one should pass a function taking as input the current predictions (as a numeric
+#' vector or matrix) and the training data (as an `xgb.DMatrix` object) that will return a list with elements
+#' `grad` and `hess`, which should be numeric vectors or matrices with the number of rows matching the number
+#' of rows in the training data (same shape as the predictions that are passed as input to the function).
+#' For multi-valued custom objectives, these should have shape `[nrows, ntargets]`. Note that negative values of
+#' the Hessian will be clipped, so one might consider using the expected Hessian (Fisher information) if the
+#' objective is non-convex.
+#'
+#' See the tutorials \href{https://xgboost.readthedocs.io/en/stable/tutorials/custom_metric_obj.html}{
+#' Custom Objective and Evaluation Metric} and \href{https://xgboost.readthedocs.io/en/stable/tutorials/advanced_custom_obj.html}{
+#' Advanced Usage of Custom Objectives} for more information about custom objectives.
#' }
#' \item \code{base_score} the initial prediction score of all instances, global bias. Default: 0.5
#' \item{ \code{eval_metric} evaluation metrics for validation data.
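For concreteness, here is a minimal sketch of the custom-objective contract described in the doc text above, written in Python, where the function receives the same (predictions, DMatrix) inputs but returns a (grad, hess) tuple instead of the R list(grad = ..., hess = ...). The squared-error objective and all names below are illustrative assumptions, not code from this commit:

import numpy as np
import xgboost as xgb

def squared_error_obj(predt: np.ndarray, dtrain: xgb.DMatrix):
    """Squared error as a custom objective: grad and hess match predt's shape."""
    y = dtrain.get_label()
    grad = predt - y            # first derivative of 0.5 * (predt - y) ** 2
    hess = np.ones_like(predt)  # second derivative is the constant 1
    return grad, hess

# Hypothetical usage, assuming an existing DMatrix named dtrain:
# booster = xgb.train({"tree_method": "hist"}, dtrain, num_boost_round=10,
#                     obj=squared_error_obj)

Since the Hessian here is identically 1, the clipping of negative values mentioned above never triggers; for a non-convex objective, the expected Hessian (Fisher information) is the usual positive substitute.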
12 changes: 12 additions & 0 deletions R-package/man/xgb.train.Rd

Some generated files are not rendered by default.

9 changes: 6 additions & 3 deletions demo/guide-python/custom_softmax.py
@@ -6,7 +6,8 @@
XGBoost returns transformed prediction for multi-class objective function. More details
in comments.
-See :doc:`/tutorials/custom_metric_obj` for detailed tutorial and notes.
+See :doc:`/tutorials/custom_metric_obj` and :doc:`/tutorials/advanced_custom_obj` for
+detailed tutorial and notes.
'''

@@ -39,7 +40,9 @@ def softmax(x):


def softprob_obj(predt: np.ndarray, data: xgb.DMatrix):
-'''Loss function. Computing the gradient and approximated hessian (diagonal).
+'''Loss function. Computing the gradient and upper bound on the
+Hessian with a diagonal structure for XGBoost (note that this is
+not the true Hessian).
Reimplements the `multi:softprob` inside XGBoost.
'''
Expand All @@ -61,7 +64,7 @@ def softprob_obj(predt: np.ndarray, data: xgb.DMatrix):

eps = 1e-6

-# compute the gradient and hessian, slow iterations in Python, only
+# compute the gradient and hessian upper bound, slow iterations in Python, only
# suitable for demo. Also the one in native XGBoost core is more robust to
# numeric overflow as we don't do anything to mitigate the `exp` in
# `softmax` here.
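For reference, the gradient and diagonal Hessian upper bound that the revised docstring and comment describe can be sketched in vectorized form as follows. This is an illustration under stated assumptions (the function name and eps default are made up here), not the demo's actual per-row loop:

import numpy as np

def softprob_grad_hess(predt: np.ndarray, labels: np.ndarray, eps: float = 1e-6):
    """predt: raw margins, shape [nrows, nclasses]; labels: integer class ids."""
    # Row-wise softmax, subtracting the row max for numerical stability.
    e = np.exp(predt - predt.max(axis=1, keepdims=True))
    p = e / e.sum(axis=1, keepdims=True)
    # Gradient of softmax cross-entropy: p_k, minus 1 when k is the true class.
    grad = p.copy()
    grad[np.arange(labels.size), labels.astype(int)] -= 1.0
    # 2 * p * (1 - p) upper-bounds the true diagonal term p * (1 - p);
    # clipping at eps keeps every entry strictly positive.
    hess = np.maximum(2.0 * p * (1.0 - p), eps)
    return grad, hess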
