Training XGBoost in a distributed environment

shis Member Posts: 4

I've read in the docs that XGBoost supports multiprocessing. I have successfully run it in a single-node environment, where I verified that it was using all the cores. However, when I try it in a distributed setting, it does not give an error; it just gets stuck at 0%. I'm not sure whether it currently does not support multi-node training, or whether I'm doing something wrong.