
Allstate Claims Severity project

This project provides a sample solution to the Allstate Claims Severity competition on Kaggle.

Requirements

Dataset

The dataset is available for free on Kaggle's competition page.

Software

This project uses the following software (where no version number is given, the latest version is recommended):

  • Python stack: python 2.7, numpy, scipy, sklearn, pandas, matplotlib.
  • XGBoost: XGBoost is short for “Extreme Gradient Boosting”; the term “gradient boosting” was proposed in the paper Greedy Function Approximation: A Gradient Boosting Machine by Friedman (see the training sketch below).
  • LightGBM: A fast, distributed, high performance gradient boosting (GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
  • Keras: Keras is a high-level neural networks library. The linked post shows how to access GPUs through the Amazon Web Services (AWS) infrastructure to speed up training of the deep learning models.
  • BayesianOptimization: A Python implementation of global optimization with Gaussian processes (see the tuning sketch below).
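
Example usage

The following is a minimal sketch, not the author's actual pipeline: it assumes the Kaggle file train.csv (with an id column, categorical columns cat1 ... cat116, continuous columns cont1 ... cont14 and the loss target) is in the working directory, and the model settings shown are illustrative defaults rather than tuned values.

    import numpy as np
    import pandas as pd
    import xgboost as xgb
    from sklearn.model_selection import train_test_split

    train = pd.read_csv('train.csv')

    # Label-encode the categorical columns (cat1 ... cat116 in this dataset).
    for c in [col for col in train.columns if col.startswith('cat')]:
        train[c] = train[c].astype('category').cat.codes

    X = train.drop(['id', 'loss'], axis=1)
    y = np.log1p(train['loss'])  # log-transform the heavily skewed target

    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2,
                                                random_state=0)

    model = xgb.XGBRegressor(
        n_estimators=500,
        learning_rate=0.05,
        max_depth=6,
        subsample=0.8,
        colsample_bytree=0.8,
    )
    model.fit(X_tr, y_tr)

    # The competition is scored on MAE of the raw (untransformed) loss.
    val_mae = np.mean(np.abs(np.expm1(model.predict(X_val)) - np.expm1(y_val)))
    print('validation MAE: %.2f' % val_mae)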

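Hyper-parameters can then be tuned with BayesianOptimization. Again, this is a sketch rather than the author's settings: the search bounds and the 3-fold cross-validation helper are illustrative, and the bayes_opt >= 1.x interface is assumed (older releases of the package expose a slightly different API).

    import numpy as np
    import pandas as pd
    import xgboost as xgb
    from bayes_opt import BayesianOptimization
    from sklearn.model_selection import cross_val_score

    # Prepare the data as in the training sketch above.
    train = pd.read_csv('train.csv')
    for c in [col for col in train.columns if col.startswith('cat')]:
        train[c] = train[c].astype('category').cat.codes
    X = train.drop(['id', 'loss'], axis=1)
    y = np.log1p(train['loss'])

    def xgb_cv(max_depth, subsample, colsample_bytree):
        # Return the negative cross-validated MAE (on the log target),
        # so that maximizing this function minimizes the error.
        model = xgb.XGBRegressor(
            n_estimators=200,
            learning_rate=0.05,
            max_depth=int(max_depth),
            subsample=subsample,
            colsample_bytree=colsample_bytree,
        )
        return cross_val_score(model, X, y, cv=3,
                               scoring='neg_mean_absolute_error').mean()

    optimizer = BayesianOptimization(
        f=xgb_cv,
        pbounds={'max_depth': (4, 12),
                 'subsample': (0.6, 1.0),
                 'colsample_bytree': (0.6, 1.0)},
        random_state=0,
    )
    optimizer.maximize(init_points=5, n_iter=20)
    print(optimizer.max)  # best score (negative MAE) and parameters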