The second step in the Explore phase is called ‘Review predictions’. After the Discover phase and some training in shuffle mode, the model will have started making predictions for many of the labels in your taxonomy.
The purpose of this step is to review these predictions label by label, confirming those that are correct and correcting those that aren't, thereby providing many more training examples for the model.
There are therefore two key actions in this step when reviewing label predictions:
- Where the predictions are correct, you should confirm/accept them
- Where they are incorrect, you should either dismiss them or add the correct label(s) that do apply
The images below show how predictions look in Re:infer for data with and without sentiment. Hovering your mouse over the label will also show the confidence the model has that the specific label applies.
The transparency of the predicted label provides a visual indicator of Re:infer’s confidence. The darker the colour, the higher Re:infer’s confidence is in that label applying, and vice versa:
Predictions without sentiment enabled
Predictions with sentiment enabled
|Remember: telling a model that a label does not apply is just as important as telling it what does|
- Select the unreviewed filter in Explore (this shows verbatims not yet reviewed by a human)
- Select the label you wish to train
- Re:infer will now present you with unreviewed verbatims that have predictions for the label you have selected. These are shown in descending order of confidence that the selected label applies, i.e. with the most confident first and the least confident at the bottom
- To confirm a label applies, simply click on it
- In this example, the top unreviewed verbatim for the ‘Spa and pool’ label has a confidence rating of 90%, which means the model is 90% confident that the label applies
- To add a label that is different from, or in addition to, those predicted, click the ‘+’ button and type it in. This is also how to correct wrong predictions: add the correct label and leave the incorrectly predicted labels unclicked
Remember: always add any other labels that apply to the verbatim you're reviewing
To delete a label applied in error, hover over it and click the ‘X’ that appears.
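The ordering described above can be pictured with a small sketch. This is purely illustrative (it is not the Re:infer API, and the verbatim texts and confidence scores are made up); it just shows a review queue sorted most-confident-first, as in the Explore view.

```python
# Hypothetical (label, confidence) pairs for the ‘Spa and pool’ label.
# These example texts and scores are invented for illustration only.
predictions = [
    ("Great spa, but the pool was closed", 0.90),
    ("The gym equipment was dated", 0.35),
    ("Loved the jacuzzi and sauna", 0.78),
]

# Sort in descending order of model confidence, mirroring how
# Explore presents unreviewed verbatims for a selected label.
review_queue = sorted(predictions, key=lambda p: p[1], reverse=True)

for text, confidence in review_queue:
    print(f"{confidence:.0%}  {text}")
```

Reviewing from the top of such a queue means you confirm the easy, confident cases first and work downwards towards the less certain ones.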
Which predictions should you review?
- When reviewing predictions, it is most beneficial to focus on those at or below 90% confidence
- This is because when the model is very confident (i.e. above 90%), confirming the prediction tells it little that is new; it is already confident that the label applies
- If a high-confidence prediction is wrong, however, it is important to dismiss it and apply the correct label(s)
- Predictions are shown in descending order of confidence, so you may need to scroll down the page to find lower-confidence predictions to review
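The intuition behind the 90% guideline can be made concrete with the standard information-theoretic notion of surprisal, -log2(p): the more confident the model already is, the fewer bits of new information a confirmation carries. This is only an illustration of the general principle, not a description of how Re:infer's model actually trains.

```python
import math

def surprisal(p: float) -> float:
    """Self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

# Confirming a 99% prediction carries far less new information
# than confirming one the model was only 60% sure about.
for confidence in (0.99, 0.90, 0.60):
    bits = surprisal(confidence)
    print(f"confirming a {confidence:.0%} prediction carries "
          f"{bits:.2f} bits of information")
```

Running this shows the information carried by a confirmation shrinking rapidly as confidence rises, which is why your review time is better spent on the mid- and low-confidence predictions.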