XGBoost

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the gradient boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems quickly and accurately.
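As a brief sketch of the objective behind this (following the regularized learning objective in the Chen & Guestrin paper cited in the description below), XGBoost fits an additive ensemble of K regression trees f_k and minimizes a regularized loss, where T is the number of leaves in a tree and w its vector of leaf weights:

$$\hat{y}_i = \sum_{k=1}^{K} f_k(x_i), \qquad \mathcal{L}(\phi) = \sum_i l(\hat{y}_i, y_i) + \sum_k \Omega(f_k), \qquad \Omega(f) = \gamma T + \tfrac{1}{2}\,\lambda\,\lVert w \rVert^2$$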

Machine learning · Gradient boosting

Contributor

Initial contribution: 2020-12-17

Authorship

Authorship is unclear; you can claim this item.

Classification(s)

Method-focused categories → Data-perspective → Intelligent computation analysis

Model Description


Quoted from: https://dl.acm.org/doi/abs/10.1145/2939672.2939785 and https://xgboost.ai/

XGBoost is an open-source software library which provides a gradient boosting framework for C++, Java, Python, R, Julia, Perl, and Scala. It works on Linux, Windows, and macOS. From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". It runs on a single machine, as well as the distributed processing frameworks Apache Hadoop, Apache Spark, and Apache Flink. It has gained much popularity and attention recently as the algorithm of choice for many winning teams of machine learning competitions.
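As an illustration of basic single-machine use, the following is a minimal sketch with the native Python API; the synthetic data and parameter values are illustrative assumptions, not recommendations:

```python
# Minimal single-machine training sketch with the native Python API.
# The synthetic data and hyperparameter values are illustrative only.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# DMatrix is XGBoost's internal data container.
dtrain = xgb.DMatrix(X[:800], label=y[:800])
dtest = xgb.DMatrix(X[800:], label=y[800:])

params = {
    "objective": "binary:logistic",  # binary classification
    "max_depth": 4,
    "eta": 0.1,                      # learning rate
    "eval_metric": "logloss",
}

booster = xgb.train(
    params,
    dtrain,
    num_boost_round=100,
    evals=[(dtest, "validation")],
)

preds = booster.predict(dtest)       # predicted probabilities
print(preds[:5])
```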

XGBoost initially started as a research project by Tianqi Chen as part of the Distributed (Deep) Machine Learning Community (DMLC) group. It began as a terminal application which could be configured using a libsvm configuration file. It became well known in the ML competition circles after its use in the winning solution of the Higgs Machine Learning Challenge. Soon after, the Python and R packages were built, and XGBoost now has package implementations for Java, Scala, Julia, Perl, and other languages. This brought the library to more developers and contributed to its popularity among the Kaggle community, where it has been used for a large number of competitions.

It was soon integrated with a number of other packages, making it easier to use in their respective communities. It has now been integrated with scikit-learn for Python users and with the caret package for R users. It can also be integrated into data flow frameworks like Apache Spark, Apache Hadoop, and Apache Flink using the abstracted Rabit and XGBoost4J. XGBoost is also available on OpenCL for FPGAs. An efficient, scalable implementation of XGBoost has been published by Tianqi Chen and Carlos Guestrin.
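For the scikit-learn integration mentioned above, a minimal sketch (with illustrative, assumed hyperparameters) looks like this: XGBClassifier implements the scikit-learn estimator interface, so it plugs directly into utilities such as cross_val_score or Pipeline.

```python
# Sketch of the scikit-learn integration: XGBClassifier behaves like any
# other scikit-learn estimator. Dataset and hyperparameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

clf = XGBClassifier(
    n_estimators=200,
    max_depth=4,
    learning_rate=0.1,
)

# 5-fold cross-validated accuracy via the standard scikit-learn machinery.
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(scores.mean())
```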

Model Metadata

No overview, design, or usage information has been provided for this model yet.

