Apple introduces an open-source “neurophotoshop”
Apple, in collaboration with researchers from the University of California, has introduced MGIE, a neural network that edits images based on natural-language instructions. The new approach to photo editing, presented as part of the joint project, opens the door to convenient, intuitive interaction with images without the need for traditional editing tools.
According to early feedback from journalists, MGIE is highly accurate at understanding queries and can perform a range of editing operations. Its main functions include:
- Applying filters
- Color correction
- Cropping and rotating images
- Editing individual objects in a photo
- Adjusting contrast, brightness, and saturation
Notably, the neural network identifies objects in a photo on its own and edits them according to the user’s request. For example, a user can ask it to reduce the size of a person’s face, increase the contrast of the moon, or make the colors of clothing less bright.
In one test, a photo was sent to the neural network with the request to “make the sky a little redder.” MGIE interpreted the command correctly and produced a result that matched the user’s expectations.
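For illustration, an instruction-guided edit of this kind could be expressed in a few lines of Python. The sketch below is hypothetical: `mgie_wrapper`, `load_mgie_model`, and `edit_image` stand in for whatever loading and inference helpers the released code actually exposes and are not part of a published API.

```python
from PIL import Image

# Hypothetical helpers standing in for the real MGIE loading/inference code;
# the actual entry points live in Apple's GitHub repository.
from mgie_wrapper import load_mgie_model, edit_image  # assumed module, not a real package

model = load_mgie_model()            # load model weights (assumed helper)
photo = Image.open("landscape.jpg")  # any input image

# The instruction is plain natural language, as in the article's example.
result = edit_image(model, photo, "make the sky a little redder")
result.save("landscape_redder_sky.png")
```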
At the moment, MGIE is available as a working demo on Hugging Face, where users can try out its functionality. The neural network’s source code is also available on GitHub.
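If the demo is hosted as a Gradio Space, it could in principle also be called programmatically with the `gradio_client` package. The Space identifier and the endpoint’s parameters in this sketch are assumptions, not confirmed details of the actual demo.

```python
from gradio_client import Client, handle_file

# Assumed Space identifier; replace with the real MGIE demo Space if known.
client = Client("username/MGIE-demo")

# Assumed endpoint signature: an input image plus a text instruction.
output_path = client.predict(
    handle_file("landscape.jpg"),    # input image (assumption)
    "make the sky a little redder",  # natural-language instruction
    api_name="/predict",             # assumed default endpoint name
)
print(output_path)
```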
Although the technology is open and freely available, it is not yet clear whether Apple will use MGIE in its own products. Still, a solution with these image editing capabilities could find wide use in a variety of photo processing and graphic design applications.