Intelligence-driven strategies and tactics in team sports

In sports, as in other domains, explainable artificial intelligence (XAI) is becoming increasingly important. Data science has not always delivered on the promises made at the start of a project. One reason is that domain experts do not understand, and therefore tend not to trust, the results produced by machine learning models.

XAI aims to change this by developing new methods that make the rationale behind machine learning models understandable to domain experts. Only then can team managers and game analysts have the confidence to explain and defend their decisions to management, fans, players, and the press. Beyond trust, explainability ultimately offers the chance to correct the model when it outputs a clearly wrong decision, i.e., an opportunity to provide feedback to the system.
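To make this concrete, the sketch below shows one common way of exposing a model's rationale: model-agnostic feature attribution via permutation importance. The match features, synthetic data, and random-forest model are illustrative assumptions, not a description of any particular system discussed here; the point is simply that an analyst can see which inputs actually drive the prediction.

```python
# Minimal, hypothetical sketch: explaining a match-outcome model with
# permutation importance. Features, data, and model are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-match features an analyst might recognise.
feature_names = ["possession_pct", "shots_on_target", "pass_accuracy",
                 "distance_covered_km", "counter_attacks"]
X = rng.normal(size=(500, len(feature_names)))
# Synthetic outcome: win/loss driven mainly by shots on target and possession.
y = (0.8 * X[:, 1] + 0.4 * X[:, 0] + rng.normal(scale=0.5, size=500)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Model-agnostic explanation: how much does shuffling each feature hurt accuracy?
result = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
for name, mean, std in sorted(zip(feature_names,
                                  result.importances_mean,
                                  result.importances_std),
                              key=lambda t: -t[1]):
    print(f"{name:>22}: {mean:.3f} +/- {std:.3f}")
```

In this toy setup, the ranking would highlight shots on target and possession as the dominant factors, giving the analyst a human-readable account of the model's behaviour and a basis for flagging predictions that rest on implausible features.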