NEURAL STYLE TRANSFER: A COMPARATIVE STUDY USING VGG NETWORK


Animesh Singh
Ayush Gautam
Merin Meleet

Abstract

Style transfer is an optimization technique that creates a new image by blending a content image with a style image (for example, an artwork by a well-known painter): the content image is re-rendered so that it retains its own content but appears painted in the style of the style image. The core method enabling style transfer is the convolutional neural network (CNN). This paper analyses key methods for performing style transfer on photographs and briefly compares several models and their results. We primarily contrast a pioneering technique, developed by Leon Gatys in 2015, with more recent work, the High-Resolution Network from 2019, for photorealistic style transfer. Gatys's Neural Algorithm of Artistic Style produces excellent results, but the output does not preserve the content image's structural characteristics, and the paper gives no concise description of the internal mechanism of the Gram matrix. The later approach improves on Gatys's neural style transfer by converting it to photorealistic style transfer, which helps preserve the structure and defining characteristics of the content image. Both methods use VGG, a network trained on the ImageNet dataset for image classification. We built our own dataset with five classes and used transfer learning in a VGG network to perform style transfer.
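To make the Gram-matrix mechanism mentioned above concrete, the following is a minimal NumPy sketch (not the paper's implementation) of how a style loss is typically computed from CNN feature maps: the Gram matrix captures channel-wise correlations of a feature map, and the style loss is the squared distance between the Gram matrices of the generated image and the style image. The function names and the normalization constant are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """Normalized Gram matrix of one CNN feature map.

    features: array of shape (channels, height, width), e.g. a VGG
    activation for a single image.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)    # flatten the spatial dimensions
    # Channel-by-channel correlations, normalized by feature-map size
    # (normalization choice varies between implementations).
    return (f @ f.T) / (c * h * w)

def style_loss(generated_feats: np.ndarray, style_feats: np.ndarray) -> float:
    """Squared-error distance between the two Gram matrices."""
    g_gen = gram_matrix(generated_feats)
    g_sty = gram_matrix(style_feats)
    return float(np.sum((g_gen - g_sty) ** 2))
```

In the full algorithm this loss is evaluated at several VGG layers and summed with a content loss; the generated image's pixels are then updated by gradient descent to minimize the total.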

Article Details

How to Cite
Singh, A., Gautam, A., & Meleet, M. (2023). NEURAL STYLE TRANSFER: A COMPARATIVE STUDY USING VGG NETWORK. Multidisciplinary Journal of Research in Engineering and Technology, 10(1), 16–24. Retrieved from https://journals.mriindia.com/index.php/mjret/article/view/1166