xgboost
XGBoost gradient boosting library. Use for tabular ML.
Source: g1joshi/agent-skills
NPX Install
npx skill4agent add g1joshi/agent-skills xgboost
XGBoost
XGBoost is the most successful algorithm in Kaggle competition history for tabular data. v2.1 (2025) brings native Blackwell GPU support and Polars integration.
When to Use
- Tabular Data: It usually beats Deep Learning on structured tables.
- Speed: Extremely optimized C++ backend.
Core Concepts
Gradient Boosting
Builds decision trees sequentially, with each new tree correcting the errors of the previous ones.
DMatrix
XGBoost's internal optimized data structure for training data.
Device Parameter
device="cuda" selects GPU execution for training and prediction.
Best Practices (2025)
Do:
- Use device="cuda": GPU training is up to 10x faster.
- Use Early Stopping: Stop training when validation error rises.
- Pass Polars DataFrames: No need to convert to Pandas/NumPy first.
Don't:
- Don't use one-hot encoding: Use native categorical support (enable_categorical=True).