Reputation: 2160
I created data from 1000 sessions of a board game simulator I ran. I'm trying to figure out what the winning strategies are and tracked several features in the data.
I loaded the result into an Azure Machine Learning diagram and connected the dataset to a model that uses linear regression.
I click "Train Model" and go to "View Output". After clicking through the ensuing links, I can locate 9 files. I don't see anything that looks like "column 9 is the best predictor of column 1" or something like that.
Instead I see an iLearner file with a lot of binary I can't read. I see a schema file. There are also a lot of metadata files about what version of conda ran it, data types, and so on.
How do I see which features best predict the label I selected?
EDIT:
As suggested, I added Score Model and Evaluate Model.
I did see some error metrics under the evaluate results -> Visualize.
Train Model had a "view output" and a "view log", but no "visualize" for me. When I went to "view output" there were a lot of files like convert_to_dataset.yaml and boosted_decision_tree_regression.yaml. There was also a directory there called "trained model" which had files with names like data_type.json and score.py. It seemed like it was all metadata and nothing like "Column 1 best predicted X ...".
I am still not seeing anything that indicates what best predicts the outcome.
Upvotes: 0
Views: 1019
Reputation: 18714
You need to add Score Model to test the model by predicting with the test data set (that's important!). It looks like Azure doesn't let you skip it either.
You have to connect the data and model to that piece of the 'tree', then right-click and run it. Then you need to add Evaluate Model, right-click, and run that. Then you can go to View Output -> Evaluation Results -> Visualize.
The Evaluate Results functionality provides a few different error metrics and R2 (the coefficient of determination = explained variance = R2). You can read about the metrics here: https://learn.microsoft.com/en-us/azure/machine-learning/algorithm-module-reference/evaluate-model#metrics-for-regression-models
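To make those numbers concrete, here is a minimal sketch (plain Python, not Azure-specific, with made-up toy data) of how the regression metrics Evaluate Model reports are computed from actual vs. predicted values:

```python
# Toy illustration of the regression metrics in Evaluate Model:
# MAE, RMSE, and R2 (coefficient of determination).
import math

actual    = [3.0, 5.0, 7.0, 9.0, 11.0]   # made-up label values
predicted = [2.8, 5.3, 6.9, 9.4, 10.6]   # made-up model predictions

n = len(actual)
mae  = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

mean_a = sum(actual) / n
ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
ss_tot = sum((a - mean_a) ** 2 for a in actual)
r2 = 1 - ss_res / ss_tot   # 1.0 = perfect fit, 0.0 = no better than the mean

print(f"MAE={mae:.3f}  RMSE={rmse:.3f}  R2={r2:.3f}")
```

An R2 close to 1 means the features jointly explain most of the variance in the label, but note these metrics describe overall fit, not which individual feature matters most.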
The per-predictor weights (the coefficient estimates, i.e. the β coefficients) are found by going to Train Model, right-clicking, then Trained Model -> Visualize.
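For intuition about what those coefficients mean, here is a hedged sketch of the same idea outside Azure ML: fitting ordinary least squares with NumPy and inspecting the per-feature coefficients. The feature names and numbers are invented for illustration, not taken from the asker's data.

```python
# Illustrative only: linear regression coefficients via ordinary least squares.
import numpy as np

# 6 game sessions x 3 tracked features (toy numbers)
X = np.array([
    [1.0, 0.0, 2.0],
    [2.0, 1.0, 1.0],
    [3.0, 0.0, 4.0],
    [4.0, 2.0, 3.0],
    [5.0, 1.0, 5.0],
    [6.0, 3.0, 2.0],
])
y = np.array([3.1, 5.0, 7.2, 9.1, 11.0, 13.2])  # label column

# Prepend an intercept column, then solve for the beta coefficients
A = np.hstack([np.ones((X.shape[0], 1)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

for name, coef in zip(["intercept", "feature_1", "feature_2", "feature_3"], beta):
    print(f"{name}: {coef:+.3f}")
```

The feature with the largest absolute coefficient (assuming the features are on comparable scales, e.g. standardized) is the strongest linear predictor of the label, which is the "column 9 best predicts column 1" style answer the question is after.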
I just typed 'score' in the search bar and both Score Model and Evaluate Model came up.
Upvotes: 2