Society: Out-Painting and Face-Swap Features in Snow and Soda, Apps from South Korea's NAVER Subsidiary, Generated Porn. An Accident? Never

September 10, 2024

In Soda, a mobile app from Snow, a subsidiary of NAVER in South Korea, a backward country in AI, a user tried the out-painting service, which uses artificial intelligence to expand a photo and draw what lies outside its frame, whether the photo was taken with the smartphone camera or uploaded to the app. The result reportedly depicted the user in an obscene manner; his complaint was ignored, and the unwanted photo was merely reported as obscene and deleted.

The Soda app in question is currently listed on Google Play.

A Soda app out-painting result, removed because it could be mistaken for the user's own photo and for pornography.

In addition, the Snow app, created by the same NAVER subsidiary Snow, has a face-swap AI feature that lets users take or upload a photo of their face and then synthesize different hair colors and hairstyles onto it. This feature also allegedly returned a nude, pornographic photo of the user.

A nude photo received while using the AI Hair Shop feature of the Snow application (app). The photo above is a mosaicked version of the original. Photo courtesy of a reader.

The reason these problems occurred one after another is that the photos users take with the camera or upload to the app are transformed by an artificial intelligence application installed on the company's servers. The real problem is that those scumbags at the NAVER subsidiary apparently built the out-painting and face-swap functions on top of open-source applications such as Stable Diffusion, because that is the easy way to implement them. If so, it is a scumbag company that is all about the money.

Image-generation applications such as Stable Diffusion are distributed for free and used by many people around the world, and they are known to be very popular tools for creating the deepfakes and AI-generated pornography that have recently flooded the Internet. While such an app is easy to install and maintain, it is difficult to implement proper controls that keep it from mass-producing pornography.

Everywhere except Korea, artificial intelligence image-generation applications such as Stable Diffusion are installed for free and used by many individual users. But in Korea, which sits behind a language barrier and a "rice wall" barrier and is becoming the Galapagos of global technology, these applications are instead wrapped up and offered to ordinary users as paid mobile-app services.

The problem is that this kind of accident happened precisely because an uncontrollable open-source application was hooked up to a commercial paid app to provide these AI services.

Even NAVER's subsidiary Snow should have offered this feature only with an image-generation model behind an NSFW filter, so that such pornography could not be generated in the first place, or it should have had dedicated Stable Diffusion operators inside the company strictly managing the app.
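For reference, with the open-source diffusers library such a guard is only a few lines of code. The sketch below is my own assumption about what a minimally responsible backend could look like, not Snow's actual code; the model name is just an example.

```python
# Minimal sketch (my assumption, not Snow's actual backend): keep the
# built-in NSFW safety checker enabled and refuse to return flagged images.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example model; the real service model is unknown
    torch_dtype=torch.float16,
)  # do NOT pass safety_checker=None, which is how the filter gets disabled
pipe = pipe.to("cuda")

result = pipe("a portrait photo, studio lighting")
if result.nsfw_content_detected and any(result.nsfw_content_detected):
    raise RuntimeError("NSFW output detected; refusing to return this image to the user")
result.images[0].save("output.png")
```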

After examining the related broadcasts and news articles, I strongly suspect that the out-painting function, which uses artificial intelligence to draw the area outside the photo, and the hairstyle-change function (essentially a face-swap function) were not implemented on the servers of NAVER's subsidiary Snow at all, but were provided by connecting to a third-party online service company through an API.
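Whoever actually runs the backend, the out-painting technique itself is no secret. Below is a minimal sketch of how it is commonly done with an open-source Stable Diffusion inpainting pipeline; this is my assumption about the general technique, not the actual implementation of Snow or any third-party vendor. The photo is pasted onto a larger canvas, the empty border is masked, and the model is asked to fill it in.

```python
# Minimal out-painting sketch (an assumption about the general technique,
# not any company's actual code). File names are hypothetical.
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",  # example inpainting model
    torch_dtype=torch.float16,
).to("cuda")

photo = Image.open("user_photo.png").convert("RGB").resize((256, 384))

# Paste the photo onto a larger blank canvas; everything outside it is "unknown".
canvas = Image.new("RGB", (512, 512), "gray")
canvas.paste(photo, (128, 64))

# White = area the model should paint, black = area to keep untouched.
mask = Image.new("L", (512, 512), 255)
mask.paste(0, (128, 64, 128 + 256, 64 + 384))

result = pipe(
    prompt="a person standing outdoors, natural continuation of the photo",
    image=canvas,
    mask_image=mask,
).images[0]
result.save("outpainted.png")
```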

Currently, image-generating AI applications of the Stable Diffusion type are centered on AI models that draw pictures according to text prompt commands, but they also offer a limited set of other functions, such as transforming an input picture into a different shape or expanding it outward. The function that draws a picture when a prompt is entered is called txt2img, and the function that outputs a picture when a photo or picture is entered is called img2img. img2img is the function mainly used to generate adult material such as deepfakes; there are not that many AI models dedicated to img2img, and most of the ones in circulation are trained to generate adult material and are not blocked by any NSFW filter.
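As a rough illustration of the two functions, here is a sketch using the open-source diffusers library; the model name, file names, and prompts are my own placeholders, not anything taken from these apps.

```python
# Sketch of txt2img vs. img2img (illustrative only; names are hypothetical).
import torch
from PIL import Image
from diffusers import StableDiffusionPipeline, StableDiffusionImg2ImgPipeline

model_id = "runwayml/stable-diffusion-v1-5"  # example base model

# txt2img: text prompt in, new picture out.
txt2img = StableDiffusionPipeline.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")
generated = txt2img("a watercolor painting of a lighthouse").images[0]

# img2img: existing photo in, transformed picture out.
img2img = StableDiffusionImg2ImgPipeline.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")
source = Image.open("user_photo.png").convert("RGB").resize((512, 512))
transformed = img2img(
    prompt="same person with a short blonde hairstyle",
    image=source,
    strength=0.6,  # how far the output is allowed to drift from the input photo
).images[0]
```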

In addition, if the NAVER subsidiary Snow installed such an application on its own servers and provided these functions itself, one can infer that the staff operating those servers accessed them from time to time to draw pornographic images, and that the operators did this so frequently that pornographic images ended up being synthesized for ordinary users as a reflection of that usage pattern. This is because these applications also tend to produce pictures and drawings that reflect how they have been used.

At this point, I think the police should seize and search the servers of these companies. If it is true that these functions are provided online through the camera apps, then the photos of users' faces that they enter or upload are stored as input data in a specific folder on the server where Stable Diffusion is installed.

Additionally, here is how the artificial intelligence models that generate only the face of a specific celebrity work in image-generating AI apps built on the likes of Stable Diffusion: photos of the celebrity's face, which flood the Internet, are collected; a LoRA (Low-Rank Adaptation) model is trained on those face photos so that it generates only images with that celebrity's face; the LoRA is installed in the app; and from then on, a few prompts related to the celebrity are all it takes to generate images with it. There is currently no way to prevent such a model from being used in such an app.
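To show how little friction there is, here is a minimal sketch of dropping a trained LoRA into a standard pipeline with the open-source diffusers library; the file name and trigger word are hypothetical, and this is not code from any particular app.

```python
# Sketch of loading a celebrity-face LoRA into a base pipeline
# (file name and trigger word are hypothetical; illustrative only).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# A LoRA is a tiny add-on file (often a few tens of MB) that steers the
# base model toward one face; it is loaded on top of the base weights.
pipe.load_lora_weights("some_celebrity_face_lora.safetensors")

# A short prompt containing the LoRA's trigger word is all it takes.
image = pipe("photo of <celebrity_token>, studio portrait").images[0]
image.save("lora_output.png")
```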

Previously, it took 500 to 1,000 photos to create such a LoRA AI model, and training one meant running a computer with powerful hardware for several hours. Now a personal gaming PC can train the model with about 10 photos of the celebrity's face and a few tens of minutes of training, depending on the machine's performance. There are already AI apps that make this very easy, and information is being shared on the Internet about how individual developers can improve these training tools further and build the models on their own personal computers with even cheaper hardware. Of course, there is a language barrier, and in Korea, a technological Galapagos, few people are doing this, so most of the people mass-training these AI models on Korean celebrities' photos and posting them on related websites are Chinese commies.

Now that the Pandora's box of image-generating AI apps has been opened, there is no way to close it. By using an online service that alters your photos with AI, you are handing your facial biometric information to these garbage companies that only want to make more money. Unless it is an image-generating AI app installed on your own personal computer, it is really, really stupid to use an online photo-altering AI service, and paying for it with your own hard-earned money is downright absurd.

Do you really think the ignorant scumbag reporters of the propaganda yellow journalism media are going to realize the dangers of this and report it properly? I don't think so.

It is pathetic to watch such an AI-backward country, where trashy companies that do not even know the basics of AI offer these dangerous generative-image AI apps online as paid services without proper internal controls, without proper management, and without the slightest hesitation even after such an accident. What future does AI have in such a backward scumbag country?

This scumbag country will prosecute middle schoolers who make deepfake porn pictures out of curiosity, but it will never prosecute big IT service companies doing the same thing.
