Calibrating Predictions

Selectify relies on machine learning to predict your selectors, given a text description of what you're looking for and the current state of the page. Calibrating (also known as training) lets you approve or modify the predictions that the model makes so it performs better in the wild.

Using the SDK

To improve Selectify's performance on your domain, you can train it on the selectors you're looking for. If you're using one of our SDKs, enable training mode by setting SELECTIFY_TRAINING=true before running your testing harness. Then run your tests as normal; whenever a selector needs approval, the console will pause and wait for your input.
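As a rough sketch of the convention, the SDK treats the flag as enabled only when it is exactly the string "true". The helper below is hypothetical (the real SDK reads the environment variable itself); it only illustrates how the flag might be parsed:

```typescript
// Hypothetical helper -- not part of the Selectify SDK.
// Illustrates the assumed convention: training mode is on only when
// SELECTIFY_TRAINING is exactly the string "true".
function isTrainingEnabled(env: Record<string, string | undefined>): boolean {
  return env["SELECTIFY_TRAINING"] === "true";
}

// Example: pass process.env in a real Node.js harness.
console.log(isTrainingEnabled({ SELECTIFY_TRAINING: "true" })); // training on
console.log(isTrainingEnabled({}));                             // training off
```

In a Node.js harness you would call this with process.env; any value other than "true" (including an unset variable) is assumed to leave training mode off.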

Console output

$ export SELECTIFY_TRAINING=true
$ npm run pipeline

⠴ Uploading page assets...
✅ Uploaded assets.
selectify [training]: Waiting for confirmation...
⏰ Workflow is waiting until you confirm the selectors.
🤖 Visit https://console.selectify.ai/workflows/1/sessions/1/1?selector=20 to see the options.

⠼ Waiting for confirmation...

Click the link or copy it into your browser. The Selectify Console will open directly at your current page state, where you can navigate the page and approve the recommended selectors.

Once you've confirmed, the testing flow continues automatically. If no annotations are made on the server within 10 minutes, the confirmation step times out.
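The wait-then-timeout behavior above can be sketched as a polling loop. This is an illustrative stand-in, not the SDK's actual implementation; waitForConfirmation and its poll callback are hypothetical names:

```typescript
// Illustrative sketch of a confirmation wait with a 10-minute timeout.
// waitForConfirmation and poll are hypothetical; the SDK's internals may differ.
const CONFIRMATION_TIMEOUT_MS = 10 * 60 * 1000; // 10 minutes

async function waitForConfirmation(
  poll: () => Promise<boolean>, // resolves true once selectors are approved
  intervalMs = 2000,
  timeoutMs = CONFIRMATION_TIMEOUT_MS,
): Promise<boolean> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (await poll()) return true; // annotations confirmed on the server
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false; // no annotations within the timeout window
}
```

A harness built on this sketch would fail the run (or skip the selector) when the function resolves false, mirroring the automatic timeout described above.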