West Midlands Police Chief Constable Blames Microsoft's Copilot for Error in Football Intelligence Report
The chief constable of West Midlands Police has admitted that an error in a football intelligence report originated with Microsoft's Copilot AI assistant. The report cited a nonexistent match between West Ham and Maccabi Tel Aviv and was used to justify banning Israeli fans from a Europa League fixture last year. According to Chief Constable Craig Guildford, the force relied on Copilot's output without fact-checking it.
Guildford made the admission in a letter to the Home Affairs Committee earlier this week, having previously denied that AI was used to prepare the report and blamed "social media scraping" for the error. He now accepts responsibility for relying on the faulty information Copilot produced.
The episode highlights the risks of relying on AI-powered tools without proper fact-checking. Microsoft itself warns in Copilot's interface that the assistant can make mistakes. The incident is a stark reminder of the need for accountability and transparency in the use of such technology.
Microsoft has yet to respond to our queries about the error, and it remains unclear what steps, if any, the company will take to prevent similar incidents in the future.