Gradient boosting machines (GBMs) and related tree-based algorithms have proven highly effective and accurate across a wide range of machine learning tasks, and they are among the most popular algorithms used by participants in Kaggle competitions. This talk will introduce the basic concepts of gradient boosting machines, commonly used Python libraries for GBMs such as XGBoost and LightGBM, and ways to tune a model's hyper-parameters to avoid over- or under-fitting and achieve better results. Attendees are expected to have a basic understanding of machine learning. Code and slides will be provided on GitHub after the talk.
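To give a flavor of the core idea behind gradient boosting, here is a minimal pure-Python sketch for regression with decision stumps as weak learners: each round fits a stump to the current residuals and adds it to the ensemble with a small learning rate. All function names here are illustrative, not part of XGBoost or LightGBM, and this omits the regularization and tree-growing machinery those libraries provide.

```python
def fit_stump(x, residuals):
    """Find the split threshold and leaf means minimizing squared error
    on the residuals (a depth-1 regression tree, i.e. a decision stump)."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue  # skip splits that put every point on one side
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    if best is None:  # all x identical: fall back to a constant stump
        return lambda xi: 0.0
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def predict(xi, base, stumps, lr=0.1):
    """Additive prediction: base value plus scaled stump outputs."""
    return base + lr * sum(s(xi) for s in stumps)

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Each round fits a stump to the residuals of the current ensemble
    (the negative gradient of squared-error loss)."""
    base = sum(y) / len(y)
    stumps = []
    for _ in range(n_rounds):
        preds = [predict(xi, base, stumps, lr) for xi in x]
        residuals = [yi - pi for yi, pi in zip(y, preds)]
        stumps.append(fit_stump(x, residuals))
    return base, stumps

# Example: learn a step function y = 1[x > 3]
x = [1, 2, 3, 4, 5, 6]
y = [0, 0, 0, 1, 1, 1]
base, stumps = gradient_boost(x, y)
```

The learning rate illustrates the over/under-fitting trade-off the talk covers: smaller values need more rounds but generally generalize better, which is also how `learning_rate`/`n_estimators` interact in the real libraries.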