I'm one of the people on the team that built this. We basically used technology similar to what self-driving cars use to recognise their environment, to teach and guide an AI system to execute goal-oriented interactions. It measures the quality of those interactions, both visually and technically (performance), and sends alerts if something is completely broken (e.g. adding a product to a shopping cart does nothing).

Let me know if you have any questions. I can't spill all the secrets of how it works, but maybe some :)
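As a rough illustration only (this is not the actual implementation, and every name below is made up), the "did the interaction actually do anything" check from the shopping-cart example could be sketched as comparing page state before and after the interaction:

```python
from dataclasses import dataclass

@dataclass
class PageState:
    # Hypothetical snapshot of what the system observes on the page
    cart_count: int      # visually recognised number of items in the cart
    load_time_ms: float  # how long the page took to respond

def check_add_to_cart(before: PageState, after: PageState,
                      max_load_ms: float = 3000.0) -> list:
    """Return a list of alert messages; an empty list means the flow looks healthy."""
    alerts = []
    # Functional check: the interaction must have a visible effect
    if after.cart_count <= before.cart_count:
        alerts.append("add-to-cart had no visible effect")
    # Technical check: flag a performance regression
    if after.load_time_ms > max_load_ms:
        alerts.append("slow response: %.0f ms" % after.load_time_ms)
    return alerts

# Healthy flow: cart count went up, response was fast
print(check_add_to_cart(PageState(0, 120.0), PageState(1, 240.0)))
# Broken flow: clicking "add to cart" changed nothing
print(check_add_to_cart(PageState(0, 120.0), PageState(0, 240.0)))
```

The real system presumably derives these signals from visual recognition of the rendered page rather than hand-coded state fields, but the alerting logic boils down to comparisons like these.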