Transfer of style from an ensemble of van Gogh paintings to Martian landscape imagery via deep convolutional neural networks
George Cann, Imaging Group, Mullard Space Science Laboratory, Department of Space and Climate Physics, University College London, London, UK. Email: george.cann.15@ucl.ac.uk.
Abstract
A novel application of the transfer of style via deep convolutional neural networks (CNNs) is presented, utilising an ensemble of paintings by Vincent van Gogh to stylise Martian landscape imagery. Neural style transfer (NST)[1][2] can be used to create artistic work by rendering a content image in the form of a style image. Through applying NST to Martian landscape imagery, a collection of artwork is presented that emulates what van Gogh might have painted had the artist visited Mars.
Introduction
Since the dawn of humanity, celestial phenomena have inspired art, science and exploration. Throughout history, these phenomena have been viewed beneath the terrestrial sky. In the 21st century, through technological advances in spacecraft reusability and artificial intelligence, new celestial phenomena may be viewed by astronauts beneath extra-terrestrial skies for the first time.[3]
The Dutch post-impressionist painter Vincent van Gogh (1853-1890) is widely regarded as one of the most influential artists in Western art. Many of van Gogh’s most famous artworks include celestial objects, such as the stars in The Starry Night (1889)[4] and Starry Night over the Rhône (1888)[5], and the Moon in Landscape with Wheat Sheaves and Rising Moon (1889)[6]. Here a collection of work highlighting the achievements of the planetary science community in exploring the Martian environment is presented. This collection imagines what Vincent van Gogh might have painted had the artist visited the Red Planet.
Methods
NST was introduced in [1] and can be used to create novel artistic work by rendering one image in the style of another. NST uses a pre-trained CNN, the VGG-19 network: a 19-layer neural network trained on the large ImageNet[7] dataset. Training on this large dataset of images allows the VGG-19 network to recognise both low-level and high-level features in images.
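For concreteness, the sketch below shows one way to load the pre-trained VGG-19 and extract the intermediate feature maps used for style and content. This is not the author's exact code; layer names follow the tf.keras VGG19 convention, where e.g. block4_conv2 corresponds to conv4_2.

```python
# Minimal sketch: extract VGG-19 feature maps with tf.keras
# (assumes TensorFlow 1.14 as listed in the Compute Environment).
import numpy as np
import tensorflow as tf

# VGG-19 pre-trained on ImageNet, without the classification head.
vgg = tf.keras.applications.VGG19(include_top=False, weights='imagenet')
vgg.trainable = False

# Layers used for style (conv1_1 ... conv5_1) and content (conv4_2).
style_layers = ['block1_conv1', 'block2_conv1', 'block3_conv1',
                'block4_conv1', 'block5_conv1']
content_layer = 'block4_conv2'

# Model mapping an input image to the chosen feature maps.
outputs = [vgg.get_layer(name).output for name in style_layers + [content_layer]]
feature_extractor = tf.keras.Model(inputs=vgg.input, outputs=outputs)

# Example: features for a single toy 400x400 RGB image.
image = np.random.uniform(0, 255, (1, 400, 400, 3)).astype('float32')
features = feature_extractor.predict(
    tf.keras.applications.vgg19.preprocess_input(image))
```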
A key aspect of NST involves defining and minimising the content and style cost functions. Once these functions are defined, they are combined into a total cost function, which encourages the generated image to follow the content of the content image and the style of the style image simultaneously. This total cost is minimised with the Adam[8] optimisation algorithm, and after many iterations a stylised image emerges. For further details the author refers the reader to the literature[1][2].
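For reference, the cost functions combined here take the standard form from [1]; normalisation constants vary between implementations, so the following is a sketch rather than the author's precise definition:

\[ J_{\text{total}}(G) = \alpha\,J_{\text{content}}(C,G) + \beta\,J_{\text{style}}(S,G) \]
\[ J_{\text{content}}(C,G) = \tfrac{1}{2}\sum_{i,j,k}\bigl(a^{[l]}_{ijk}(C) - a^{[l]}_{ijk}(G)\bigr)^{2} \]
\[ J_{\text{style}}(S,G) = \sum_{l}\lambda^{[l]}\,\frac{1}{4\,(n^{[l]}_{C})^{2}\,(n^{[l]}_{H} n^{[l]}_{W})^{2}}\sum_{k,k'}\bigl(G^{[l](S)}_{kk'} - G^{[l](G)}_{kk'}\bigr)^{2}, \qquad G^{[l]}_{kk'} = \sum_{i,j} a^{[l]}_{ijk}\,a^{[l]}_{ijk'} \]

Here \(a^{[l]}\) denotes the layer-\(l\) VGG-19 activations, \(G^{[l]}\) their Gram matrix, \(\alpha\) and \(\beta\) the content and style coefficients (10 and 40 in this work; see Compute Environment), and \(\lambda^{[l]}\) the per-layer style weights (0.2 for each of conv1_1 to conv5_1).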
In this research, 30 paintings by Vincent van Gogh featuring the Sun, Moon or stars were identified; these artworks form the collection of style images used with NST. Similarly, 30 well-known images of the Martian landscape were selected, including images from NASA’s Spirit, Opportunity and Curiosity rovers and the Viking 1 and 2 landers; these images form the collection of content images used with NST. The author found that creating a montage of a particular feature, e.g. the sky or cobbled streets, and then applying NST from [2] with a linear colour transfer provided aesthetically better results than semantic style transfer with guided Gram matrices and masks.
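One common way to implement a linear colour transfer is to match the per-channel mean and covariance of one image to those of another; the sketch below illustrates this approach. The author's exact implementation is not specified and may differ.

```python
# Minimal sketch of linear colour transfer via mean/covariance matching.
import numpy as np

def linear_colour_transfer(source, target, eps=1e-5):
    """Transform `source` (H, W, 3, floats in [0, 1]) so its per-channel
    mean and covariance match those of `target`."""
    src = source.reshape(-1, 3)
    tgt = target.reshape(-1, 3)

    mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
    cov_s = np.cov(src, rowvar=False) + eps * np.eye(3)
    cov_t = np.cov(tgt, rowvar=False) + eps * np.eye(3)

    # Symmetric square root of a covariance matrix via eigendecomposition.
    def sqrtm(C):
        vals, vecs = np.linalg.eigh(C)
        return vecs @ np.diag(np.sqrt(np.maximum(vals, 0.0))) @ vecs.T

    # Whiten with the source statistics, re-colour with the target's.
    A = sqrtm(cov_t) @ np.linalg.inv(sqrtm(cov_s))
    out = (src - mu_s) @ A.T + mu_t
    return np.clip(out, 0.0, 1.0).reshape(source.shape)
```

Used, for example, to transform the van Gogh style image so that its palette matches the Martian content image before running NST.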
Results
Figure 1: (Left) The content image: a Martian eclipse, in which Phobos transits the Sun at sunset, NASA, 2010[10]. (Right) The result: A Martian eclipse, Phobos transits the Sun at sunset, captures the beauty of nature in the solar system, showing a blue sunset eclipse of the Sun by the Martian moon Phobos. This work has been rendered in the style of Vincent van Gogh’s The Cafe Terrace on the Place du Forum, Arles, at Night (1888)[9].
Conclusions
This research has described a novel application of NST[1][2] to Martian landscape imagery, presenting a collection of Martian van Gogh artworks and showing that deep convolutional neural networks offer a new way of visualising the Martian environment.
Acknowledgements
The author would like to thank the UK Space Agency for their support through the Aurora Science Studentship, STFC:535385. This work is freely available to download from Oxia Palus at: https://www.oxia-palus.com/.
Compute Environment
Instance: GCP Compute Engine. Image: c3-deeplearning-tf-1-14-cu100-20190724. Hardware: NVIDIA 16 GB V100, 8-CPU GCP Deep Learning VM with CUDA enabled. Software tools: TensorFlow 1.14, CUDA 10.0, scipy, numpy, matplotlib, PIL. Training protocol: style transfer run for 10,000 iterations with an RGB style image; content cost coefficient alpha = 10 and style cost coefficient beta = 40; style layer weights conv1_1 = 0.2, conv2_1 = 0.2, conv3_1 = 0.2, conv4_1 = 0.2 and conv5_1 = 0.2; layer conv4_2 (weight 1) used as the content layer; Adam optimiser with learning rate 1.0; colour channels = 3; noise image ratio = 0.6.
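A minimal, self-contained sketch of this protocol in TensorFlow 1.x graph style is given below. The VGG-19 content and style costs are replaced by a simple pixel-space proxy so the snippet runs on its own, and variable names are illustrative rather than the author's.

```python
# Minimal sketch of the optimisation loop (TensorFlow 1.14 graph style).
# In the real pipeline, total_cost = 10 * J_content + 40 * J_style,
# computed from VGG-19 feature maps and Gram matrices.
import numpy as np
import tensorflow as tf

NOISE_RATIO = 0.6       # noise image ratio
LEARNING_RATE = 1.0     # Adam learning rate
NUM_ITERATIONS = 10000  # style-transfer iterations

# Toy stand-in for a Martian landscape content image, shape (1, H, W, 3).
content = np.random.uniform(0.0, 255.0, (1, 300, 400, 3)).astype('float32')

# Initialise the generated image as a blend of noise and the content image.
noise = np.random.uniform(-20.0, 20.0, content.shape).astype('float32')
generated = tf.Variable(NOISE_RATIO * noise + (1.0 - NOISE_RATIO) * content,
                        name='generated_image')

# Placeholder cost standing in for the NST content + style costs.
total_cost = tf.reduce_mean(tf.square(generated - content))

train_step = tf.train.AdamOptimizer(LEARNING_RATE).minimize(
    total_cost, var_list=[generated])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(NUM_ITERATIONS):
        sess.run(train_step)
    stylised = np.clip(sess.run(generated), 0.0, 255.0)
```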
References
How to cite: Cann, G.: Transfer of style from an ensemble of van Gogh paintings to Martian landscape imagery via deep convolutional neural networks, Europlanet Science Congress 2020, online, 21 September–9 Oct 2020, EPSC2020-904, https://doi.org/10.5194/epsc2020-904, 2020