Reputation: 1405
I have run into some inexplicable behaviour in Vowpal Wabbit. Sometimes it simply does not save a model when the -f
flag is specified, and it does not raise any error.
The command is composed automatically by a script and has the following form (file names are changed):
vw -d ./data/train_set -p ./predictions
-f ./model --cache --passes 3
--ftrl_alpha 0.106920149657 --ignore T -l 0.83184072971
-b 29 --loss_function logistic --ftrl_beta 0.97391780827
--ftrl -q SE -q SZ -q DR
It trains normally and prints the usual diagnostic output, but the model file is never written.
The strangest part is that everything works fine with other parameter configurations!
The context: I'm working on hyperparameter optimization, and my script successively composes vw
training and validation commands. It always gets through the first five iterations and always fails on the sixth (with exactly the same command). Any help would be appreciated.
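Since the commands are composed by a driver script anyway, one defensive check is to verify after each run that the model file was actually written. Below is a minimal sketch, assuming a Python driver; the run_vw_and_check wrapper and the subprocess-based invocation are my own illustration, and only the vw flags are taken from the command above.

    import subprocess
    from pathlib import Path

    def run_vw_and_check(model_path="./model"):
        """Run one vw training command and verify the model file was written.

        The flags mirror the command from the question; the wrapper itself
        is a hypothetical sketch, not part of the original script.
        """
        cmd = [
            "vw", "-d", "./data/train_set", "-p", "./predictions",
            "-f", model_path, "--cache", "--passes", "3",
            "--ftrl_alpha", "0.106920149657", "--ignore", "T",
            "-l", "0.83184072971", "-b", "29",
            "--loss_function", "logistic", "--ftrl_beta", "0.97391780827",
            "--ftrl", "-q", "SE", "-q", "SZ", "-q", "DR",
        ]
        result = subprocess.run(cmd, capture_output=True, text=True)

        # As described above, the run can finish without any error yet leave
        # no model behind, so also check that the file exists and is non-empty.
        model_file = Path(model_path)
        if (result.returncode != 0
                or not model_file.is_file()
                or model_file.stat().st_size == 0):
            raise RuntimeError(
                f"vw did not produce a model at {model_path}:\n{result.stderr}"
            )

    run_vw_and_check()

This does not fix the underlying problem, but it makes the silent failure visible to the hyperparameter-optimization loop instead of letting it continue with a stale or missing model.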
Upvotes: 0
Views: 557
Reputation: 1405
This was a bug in the Vowpal Wabbit source code. It has since been fixed, and models are now saved as expected. Here is the issue on GitHub: https://github.com/JohnLangford/vowpal_wabbit/issues/859
Upvotes: 1