The recent, widespread adoption of AI-based technologies has opened a debate on their potential impact on user privacy.
This is where ChatGPT, OpenAI's chatbot recently blocked in Italy by the Garante della Privacy, comes in [1].
Let's start by exploring the reasons for the block on ChatGPT in Italy, recounting how the Italian Data Protection Authority has required OpenAI to comply with the European Union's GDPR, since the company behind ChatGPT does not appear to guarantee the protection of users' privacy, particularly when it comes to the collection and storage of personal data.
Let's continue by delving into the ongoing discussions between OpenAI and the Italian Privacy Authority: OpenAI has been asked to ensure respect for users' privacy and the rights of minors by implementing a transparent and accessible policy. Users will also have to declare that they are of legal age in order to use the ChatGPT service.
Finally, we look at the initiatives put in place by OpenAI to improve the security of its chatbot. Among other things, the company has launched the Bug Bounty Program, a collaborative program aimed at detecting bugs and flaws in its systems. Rewards for participants range from a minimum of $200 up to $20,000 for particularly serious discoveries.
Let us therefore take a closer look at the connections between Artificial Intelligence and privacy, summarizing the dispute between OpenAI and the Italian Data Protection Authority and highlighting the initiatives OpenAI has implemented to ensure the security of its users.
ChatGPT (GPT stands for Generative Pre-trained Transformer) is an Artificial Intelligence chatbot developed by OpenAI and launched on November 30, 2022 [2].
It is based on a family of Large Language Models (LLMs) – computer programs for Natural Language Processing – and has been refined through fine-tuning and reinforcement learning from human feedback.
These models are trained on huge amounts of data available online to generate answers to user questions.
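To make the interaction pattern concrete, here is a minimal sketch of how an application might send a user's question to a GPT-family model and read back the generated answer. It uses the Python openai client library as it existed in 2023; the API key, model name, and prompt are illustrative placeholders, not details taken from the article.

    import openai

    # Authenticate with an API key (placeholder; a real key is required).
    openai.api_key = "sk-..."

    # Send a single user question to a chat model.
    # "gpt-3.5-turbo" is an illustrative model name.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": "Explain the GDPR in one sentence."}
        ],
    )

    # The generated answer is in the first choice of the response.
    print(response["choices"][0]["message"]["content"])

In this request-response pattern, every prompt a user submits is sent to OpenAI's servers for processing, which is precisely why the handling of that data falls under the GDPR questions discussed below.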
The potential of ChatGPT and the similar tools gradually appearing on the market [3] is nothing short of enormous: after the release of ChatGPT, OpenAI's valuation was estimated at $29 billion in 2023 [4]!
ChatGPT's tremendous success, however, has encountered an initial obstacle to its spread in Italy: a few days ago, the Italian Data Protection Authority (Garante per la Protezione dei Dati Personali) ordered, with immediate effect, a halt to ChatGPT until compliance with the privacy rules set out in the GDPR – the General Data Protection Regulation in force in the European Union – is ensured.
At the same time, the Authority opened an investigation against OpenAI, the U.S. company that developed and operates the platform, noting that users and other data subjects are not adequately informed about how OpenAI collects their data.
It also emerged that there was no legal basis for the massive collection and storage of personal data.
In addition, although OpenAI does not allow children under the age of 13 to use the service, the filters used to verify users' age are not sufficient, according to the Garante. This means that minors may be exposed to responses that are not age-appropriate. The Garante has therefore given OpenAI a short period of time to communicate and implement the measures it intends to take to comply with these requests.
Failure to do so may result in a fine of up to €20 million or up to 4% of annual global turnover, whichever is higher.
When AI-based tools are used productively, they open up entirely new ways of getting work done, dramatically reducing the time spent on repetitive tasks and freeing people to focus on more creative, higher-value activities.
This is probably one of the reasons why the Italian Privacy Authority stressed that it has no intention of putting a brake on AI development [5], following a further meeting with OpenAI held by video conference on April 5, 2023.
The meeting, which Sam Altman, CEO of OpenAI, joined for the opening, was attended by Che Chang, Deputy General Counsel of OpenAI, Anna Makanju, Head of Public Policy, and Ashley Pantuliano, Associate General Counsel, together with the Authority's board (Pasquale Stanzione, Ginevra Cerrina Feroni, Agostino Ghiglia, Guido Scorza).
There, OpenAI confirmed its willingness to cooperate with the Italian Privacy Authority, with the goal of resolving the critical issues raised by the Garante regarding ChatGPT.
The game therefore still seems to be open: by April 30, OpenAI will have to bring the processing of personal data collected through ChatGPT up to the standards required by the Privacy Authority.
There are essentially two measures that OpenAI will have to take at very short notice.
First, the U.S. company will have to prepare a notice explaining the methods and logic behind the processing of the data needed to operate ChatGPT, as well as the rights of users and non-user data subjects. The notice will have to be easily accessible and positioned so that users connecting from Italy can read it before they start using the service.
The second condition concerns the legal basis for processing users' personal data to train the algorithms that make ChatGPT work: the Garante has ordered OpenAI to remove any reference to the performance of a contract and to rely instead on consent or legitimate interest as the basis for using such data, without prejudice to the Authority's powers of verification and assessment. As a consequence, anyone who wants to use ChatGPT will have to declare that they are of legal age.
The Authority went on to make medium-term demands: OpenAI must put users in a position to request the rectification of personal data about them that ChatGPT generates inaccurately, or to have it deleted.
Users of the service must also be given the option to exclude their data from being used to train the algorithms.
In addition, by May 31, 2023, the watchdog has ordered the implementation of an information campaign on radio and television, in newspapers, and on the web to inform people about the use of their personal data for algorithm-training purposes.
Finally, by September 30, 2023, a system must also be implemented to verify the age of those who intend to use the bot [6].
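Purely by way of illustration – the article does not describe the mechanism OpenAI will actually adopt – a minimal age gate could ask users to declare a date of birth and compare it against a minimum age before granting access. The sketch below is a hypothetical Python example: the 13-year threshold echoes the age limit in OpenAI's terms mentioned above, and all function names are assumptions.

    from datetime import date
    from typing import Optional

    # Hypothetical minimum age; OpenAI's terms bar children under 13,
    # while the Garante also asks users to declare that they are of legal age.
    MINIMUM_AGE = 13

    def age_in_years(birth_date: date, today: date) -> int:
        """Return a person's age in whole years on the given day."""
        years = today.year - birth_date.year
        # Subtract one year if the birthday has not yet occurred this year.
        if (today.month, today.day) < (birth_date.month, birth_date.day):
            years -= 1
        return years

    def may_use_service(birth_date: date, today: Optional[date] = None) -> bool:
        """Return True if the declared date of birth meets the minimum age."""
        today = today or date.today()
        return age_in_years(birth_date, today) >= MINIMUM_AGE

    # Example: a user declaring a 2012 birth date is turned away in April 2023.
    print(may_use_service(date(2012, 6, 15), today=date(2023, 4, 30)))  # False

A self-declared date of birth is, of course, only the weakest form of verification; the Garante's point is precisely that some check must be in place before the service can be used.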
Meanwhile, OpenAI has decided to launch the Bug Bounty Program, inviting the community to research and identify bugs in its systems and products, including ChatGPT.
OpenAI announced that it has partnered with the crowdsourced cybersecurity platform Bugcrowd.
Rewards for participants who detect bugs and other flaws in its systems range from a minimum of $200 for "low severity findings" to a maximum of $20,000 for "exceptional discoveries."
Excluded, however, are rewards for jailbreaking ChatGPT or getting it to generate malicious code or text: "Issues related to the content of model prompts and responses are strictly outside the scope and will not be rewarded," reads the OpenAI statement [7].
The reason is that jailbreaking would allow users to circumvent the safety filters, making the current situation even worse by opening up scenarios that are potentially dangerous for individuals and for society.
OpenAI's bug bounty program is not the first of its kind.
Other companies have also employed experts to unearth anomalies and bugs in their systems: these include the likes of Amazon, Coinbase, and Google.
The bounty hunting season is open!
The use of Artificial Intelligence is growing rapidly, with new software such as OpenAI's ChatGPT revolutionizing the way people interact with technology to improve their work.
However, the debate over user privacy is becoming increasingly heated, and the Italian Data Protection Authority's decision to block ChatGPT in Italy has highlighted the critical issues related to data collection and retention.
It is clear that AI has the potential to revolutionize the way we live and work, but the development of such technologies must go hand in hand with the protection of user rights and compliance with privacy regulations.
1. See the article by Politico titled ChatGPT is entering a world of regulatory pain in Europe
2. See the ChatGPT entry on Wikipedia
3. See the Al Jazeera article titled China’s Baidu unveils ChatGPT rival Ernie
4. See the Business Insider article titled ChatGPT creator OpenAI is in talks to sell shares in a tender offer that would double the startup's valuation to $29 billion
5. See the press release by the Italian Data Protection Authority which came out on April 6, 2023, titled ChatGPT: OpenAI collaborating with Italian SA on commitments for protecting Italian users
6. All this information can be found in the press release of the Italian Data Protection Authority issued on April 12, 2023 and titled ChatGPT: Italian SA to lift temporary limitation if OpenAI implements measures 30 April set as deadline for compliance. This statement summarizes the April 11, 2023 Provision of the Authority.
7. See the details of OpenAI's program