Lightweight GAUSS wrapper for the popular XGBoost library.
XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework.
The GAUSS gxgboost library can be installed and updated directly in GAUSS using the GAUSS package manager.

To build from source, first clone the repository and create a build directory:

$ git clone --recursive https://github.com/aptech/gxgboost
$ cd gxgboost
$ mkdir build
$ cd build

On Windows:

$ cmake -G"NMake Makefiles" -DCMAKE_BUILD_TYPE=Release ..
$ nmake install OR jom install

On Linux and macOS:

$ cmake -G"Unix Makefiles" -DCMAKE_BUILD_TYPE=Release ..
$ make -jN install

The gxgboost.zip release folder can also be installed using the GAUSS Application Installer, as shown below:
- Download the zipped folder gxgboost.zip from the gxgboost release page.
- Select Tools > Install Application from the main GAUSS menu.
- Follow the installer prompts, making sure to navigate to the downloaded gxgboost.zip.

Before using the functions created by gxgboost, you will need to load the newly created gxgboost library. This can be done in a number of ways:

- Navigate to the library tool view window and click the small wrench located next to the gxgboost library. Select Load Library.
- Enter library gxgboost in the program input/output window.
- Put the line library gxgboost; at the beginning of your program files.
You can also use the GAUSS package installer CLI (gpkg(.exe)), located in the GAUSS installation directory:

$ gpkg install gxgboost.zip
OpenMP must be installed on the target system to successfully compile or use the gxgboost library.

Windows: Install the Visual C++ Redistributable for Visual Studio 2015 if it is not already installed.

Linux: gxgboost is built against GCC's OpenMP runtime (libgomp1):

# Ubuntu
$ sudo apt install libgomp1
# RHEL/CentOS
$ sudo yum install libgomp

macOS: Install LLVM's OpenMP runtime library:

# Homebrew
$ brew install libomp

Refer to <data/gxgboost.e> for a full example and data/gxgboost.sdf for a full description of parameters.
Initialize a control structure for the boosting method of your choice.
The following code samples are intentionally verbose to showcase the use of the associated control structures. Providing a control structure is not necessary if you wish to use the default values, which are all documented in <data/gxgboost.sdf>.
Tree booster:

library gxgboost;
/*
** Declare instance of the
** xgbTree control structure
*/
struct xgbTree ctl;
/*
** Initialize control structure with
** default values
*/
ctl = xgbCreateCtl("tree");
// Declare instance of xgbModel structure
struct xgbModel model;
// Call xgbTreeFit
model = xgbTreeFit(labels, train_data, ctl);
Dart booster:

library gxgboost;
/*
** Declare instance of the
** xgbDart control structure
*/
struct xgbDart ctl;
/*
** Initialize control structure with
** default values
*/
ctl = xgbCreateCtl("dart");
// Declare instance of xgbModel structure
struct xgbModel model;
// Call xgbDartFit
model = xgbDartFit(labels, train_data, ctl);
Linear booster:

library gxgboost;
/*
** Declare instance of the
** xgbLinear control structure
*/
struct xgbLinear ctl;
/*
** Initialize control structure with
** default values
*/
ctl = xgbCreateCtl("linear");
// Declare instance of xgbModel structure
struct xgbModel model;
// Call xgbLinearFit
model = xgbLinearFit(labels, train_data, ctl);
Prediction works the same way for all three boosters; pass the fitted xgbModel structure to xgbPredict:

pred = xgbPredict(model, test_data);
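For classification-style problems, a quick sanity check on the fitted model might look like the sketch below. Note that test_labels, the 0/1 label coding, and the 0.5 threshold are all illustrative assumptions, not part of the gxgboost examples above:

```
// Sketch: evaluate hold-out accuracy.
// test_labels is a hypothetical vector of 0/1 hold-out labels (not defined above).
pred = xgbPredict(model, test_data);

// Threshold predicted scores at 0.5 (assumed) and compare element-wise
// to the assumed 0/1 labels; meanc gives the fraction of matches.
accuracy = meanc((pred .> 0.5) .== test_labels);
print "Hold-out accuracy:" accuracy;
```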
© Contributors, 2018. Licensed under an Apache-2 license.