Painting the future: a neural network learned to add nonexistent details to photos

Thanks to neural networks, the graphical editors of the future will be capable of things that are impossible today. For example, can you imagine an ordinary computer program drawing a smile on a person's face, removing their glasses, or changing their hairstyle? It sounds like science fiction, but such a program has been created and posted on GitHub, so any developer can run it on their own computer and edit photos. The results are impressive.

The SC-FEGAN neural network

At the core of the application is a generative adversarial network called SC-FEGAN. Networks of this kind are used in many similar projects and consist of two parts. Here, they are a U-Net-like image generator and an SN-PatchGAN discriminator: the first produces candidate images, and the second decides whether they look real.
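To see how these two parts pull against each other, here is a minimal sketch of the adversarial objective. The real SC-FEGAN generator is a U-Net-like CNN and the discriminator is SN-PatchGAN; the tiny linear maps below are stand-ins chosen purely for illustration, not the project's actual architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W_g = rng.normal(size=(4, 8))   # toy "generator": latent (4,) -> sample (8,)
W_d = rng.normal(size=(8,))     # toy "discriminator": sample (8,) -> realness score

real = rng.normal(loc=1.0, size=(8,))   # a sample treated as "real"
fake = rng.normal(size=(4,)) @ W_g      # a generated ("fake") sample

d_real = sigmoid(W_d @ real)            # discriminator's belief the real sample is real
d_fake = sigmoid(W_d @ fake)            # discriminator's belief the fake sample is real

# The discriminator is trained to push d_real toward 1 and d_fake toward 0 ...
d_loss = -np.log(d_real) - np.log(1.0 - d_fake)
# ... while the generator is trained to push the same d_fake toward 1.
g_loss = -np.log(d_fake)
```

Training alternates gradient steps on `d_loss` and `g_loss`; at equilibrium, the generator's outputs are ones the discriminator can no longer tell apart from real images.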

The program's interface is quite simple: the user uploads a person's photo and paints new details onto it. These can be extra strands of hair, a smile, or even various pieces of jewelry. The network can also remove objects such as glasses and change the color of eyes and hair.
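Systems of this kind typically hand the generator the photo with the painted region erased, together with the user's mask, strokes, and color hints stacked as extra channels. The sketch below illustrates that input assembly; the exact channel layout (incomplete image, mask, sketch, color strokes, noise) is an assumption about this general setup, not the repository's confirmed format.

```python
import numpy as np

H, W = 64, 64
rng = np.random.default_rng(0)

image = rng.random((H, W, 3))                 # the uploaded photo (RGB)
mask = np.zeros((H, W, 1))
mask[20:40, 20:40] = 1.0                      # region the user painted over
sketch = np.zeros((H, W, 1))                  # user's stroke lines inside the mask
color = np.zeros((H, W, 3))                   # desired colors inside the mask
noise = rng.random((H, W, 1))                 # randomness for the generator

incomplete = image * (1.0 - mask)             # erase the region to be repainted
gen_input = np.concatenate(
    [incomplete, mask, sketch, color, noise], axis=-1
)
print(gen_input.shape)  # (64, 64, 9)
```

The generator then fills in the erased region so that it matches both the surrounding photo and the user's strokes.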


Unfortunately, for now the program can only be run by following an installation procedure that is complicated for ordinary users. But don't despair: in the near future, the SC-FEGAN network may find its way into browser-based or mobile image-editing apps.

Similar projects have not been rare in recent years. In mid-February, a developer at the taxi service Uber launched a website that generates the face of a nonexistent person each time it is loaded. You can read about that project and watch a video of the artificial faces being created in our earlier article.

What other kinds of programs do you think neural networks should be built into? What new features could they bring? Share your ideas in the comments or in our Telegram chat!
