Regulators in Canada and Italy have already opened proceedings aimed at stopping the chatbot from using personal information.
OpenAI’s chatbot, ChatGPT, is trained in part on data scraped from the open web, leading regulators in a growing number of countries to push back against the practice in order to protect their citizens’ personal information.
Italy’s data protection agency has temporarily banned the chatbot while it investigates OpenAI.
These governments, along with several experts in the field, argue that greater oversight is needed of the data used to train systems such as ChatGPT.
Earlier in April, Italy’s data protection agency placed a temporary ban on the chatbot to give the country time to investigate OpenAI. More recently, Canada’s privacy commissioner announced an investigation of OpenAI as well.
Both the Italian and Canadian agencies said that data privacy was the primary concern driving their investigations.
“You might say, ‘Oh, maybe it feels a bit heavy handed,’” said Katrina Ingram, founder of the consulting company Ethically Aligned AI, as quoted in a recent CBC News report. “On the other hand, a company decided that it was just going to drop this technology onto the world and let everybody deal with the consequences. So that doesn’t feel very responsible as well.”
Concerns have already been raised about the transparency of ChatGPT and other AI chatbots.
In the rush not to be left behind, Google and Microsoft have recently launched similar AI chatbot products of their own. These systems are trained on openly available information found online to answer users’ questions and generate other output. That said, privacy and AI experts have cautioned that it isn’t entirely clear exactly what information goes into this process.
According to Ingram, improved oversight is needed as AI products like ChatGPT continue to evolve rapidly.
“One of the challenges right now is that I think we may not know enough about what’s going on under the hood. An investigation can help to clarify that,” said Teresa Scassa, a University of Ottawa law professor and the Canada Research Chair in Information Law and Policy.