Japan's privacy regulator has warned OpenAI, the Microsoft-backed startup behind the ChatGPT chatbot, against collecting sensitive data without individuals' consent. In a statement, the Personal Information Protection Commission said OpenAI should minimise its collection of sensitive data for machine learning purposes, and added that it may take further action if additional concerns arise.


With the rise of generative artificial intelligence (AI), which can create text and images, regulators worldwide are scrambling to establish rules governing its use. Proponents of the technology liken its impact to the advent of the internet. While Japan has lagged in some recent technology trends, the country is increasingly motivated to keep pace with advances in AI and robotics to maintain productivity, particularly in the face of a shrinking population.


The privacy watchdog acknowledged the need to balance privacy concerns against the potential benefits of generative AI, highlighting its promise in fostering innovation and addressing challenges such as climate change.


According to Similarweb, Japan is the third-largest source of traffic to OpenAI's website, indicating significant interest and engagement from Japanese users. OpenAI CEO Sam Altman met Prime Minister Fumio Kishida in April, signalling the company's interest in expanding its presence in Japan. The meeting took place ahead of the Group of Seven (G7) leaders' summit, where Kishida led discussions on AI regulation.


The European Union (EU), a pioneer in tech regulation, is currently drafting what could be the first comprehensive set of rules governing AI. Altman recently said OpenAI has no plans to leave Europe, despite earlier suggestions that the company might consider such a move if compliance with EU regulations proved too onerous.