Samsung workers have unwittingly leaked top-secret data whilst using ChatGPT to 
help them with tasks.

The company allowed engineers at its semiconductor arm to use the AI 
writer<https://www.techradar.com/best/ai-writer> to help fix problems with 
their source code. But in doing so, the workers entered confidential data, 
such as the source code for a new program, internal meeting notes, and data 
relating to their hardware.

The upshot is that in just under a month, there were three recorded incidents 
of employees leaking sensitive information via ChatGPT. Since ChatGPT retains 
user input data to further train itself, these trade secrets from Samsung are 
now effectively in the hands of OpenAI, the company behind the AI service.

In response, Samsung Semiconductor is now developing its own in-house AI for 
internal use by employees, though prompts to it will be limited to 1024 bytes 
in size.
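
For context, 1024 bytes is small: a few dozen lines of source code or a short 
page of notes would already exceed it. The Python snippet below is a minimal, 
purely illustrative sketch of how such a byte cap might be enforced before a 
prompt reaches an internal model; the check_prompt helper and the enforcement 
point are assumptions, since Samsung has not disclosed how its tool applies 
the limit.

    # Hypothetical sketch only; not Samsung's actual implementation.
    MAX_PROMPT_BYTES = 1024  # cap reported for the planned in-house AI

    def check_prompt(prompt: str) -> bytes:
        """Encode a prompt and reject it if it exceeds the 1024-byte cap."""
        encoded = prompt.encode("utf-8")
        if len(encoded) > MAX_PROMPT_BYTES:
            raise ValueError(
                f"Prompt is {len(encoded)} bytes; the limit is {MAX_PROMPT_BYTES}."
            )
        return encoded

    # Example: a short request passes; a pasted source file would not.
    check_prompt("Summarise these meeting notes in three bullet points.")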


In one of the aforementioned cases, an employee asked ChatGPT to optimize test 
sequences for identifying faults in chips, sequences which are confidential. 
Making this process as efficient as possible has the potential to save chip 
firms considerable time in testing and verifying processors, reducing costs as 
well.


In another case, an employee used ChatGPT to convert meeting notes into a 
presentation, the contents of which Samsung clearly would not have wanted 
external third parties to know.

Samsung Electronics sent out a warning to its workers on the potential dangers 
of leaking confidential information in the wake of the incidents, saying that 
such data is impossible to retrieve as it is now stored on the servers 
belonging to OpenAI. In the semiconductor industry, where competition is 
fierce, any sort of data leak could spell disaster for the company in question.

It doesn't seem as if Samsung has any recourse to request the retrieval or 
deletion of the sensitive data OpenAI now holds. Some have 
argued<https://www.sydney.edu.au/news-opinion/news/2023/02/08/chatgpt-is-a-data-privacy-nightmare.html>
that this very fact makes ChatGPT non-compliant with the EU's GDPR, as the 
ability to have one's data deleted is one of the core tenets of the law 
governing how companies collect and use data. It is also one of the reasons 
why Italy has now banned the use of ChatGPT 
nationwide<https://www.bbc.co.uk/news/technology-65139406>.


https://www.techradar.com/news/samsung-workers-leaked-company-secrets-by-using-chatgpt
