TY - JOUR
T1 - Multispectral Texture Synthesis Using RGB Convolutional Neural Networks
AU - Ollivier, Sélim
AU - Gousseau, Yann
AU - Lefebvre, Sidonie
N1 - Publisher Copyright:
© 1980-2012 IEEE.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - State-of-the-art red–green–blue (RGB) texture synthesis algorithms rely on style distances that are computed through statistics of deep features. These deep features are extracted by classification neural networks that have been trained on large datasets of RGB images. Extending such synthesis methods to multispectral images is not straightforward, since the pretrained networks are designed for and have been trained on RGB images. In this work, we propose two solutions to extend these methods to multispectral imaging (MSI). Neither of them requires additional training of the neural network from which the second-order neural statistics are extracted. The first one involves optimizing over batches of random triplets of spectral bands during training. The second one projects multispectral pixels onto a 3-D space. We further explore the benefit of a color transfer operation upstream of the projection to avoid the potentially abnormal color distributions induced by the projection. Our experiments compare the performance of the various methods through different metrics. We demonstrate that they can be used to perform exemplar-based texture synthesis, achieve good visual quality, and come close to state-of-the-art methods on RGB bands.
AB - State-of-the-art red–green–blue (RGB) texture synthesis algorithms rely on style distances that are computed through statistics of deep features. These deep features are extracted by classification neural networks that have been trained on large datasets of RGB images. Extending such synthesis methods to multispectral images is not straightforward, since the pretrained networks are designed for and have been trained on RGB images. In this work, we propose two solutions to extend these methods to multispectral imaging (MSI). Neither of them requires additional training of the neural network from which the second-order neural statistics are extracted. The first one involves optimizing over batches of random triplets of spectral bands during training. The second one projects multispectral pixels onto a 3-D space. We further explore the benefit of a color transfer operation upstream of the projection to avoid the potentially abnormal color distributions induced by the projection. Our experiments compare the performance of the various methods through different metrics. We demonstrate that they can be used to perform exemplar-based texture synthesis, achieve good visual quality, and come close to state-of-the-art methods on RGB bands.
KW - Multispectral imaging (MSI)
KW - texture synthesis
UR - https://www.scopus.com/pages/publications/105001292992
U2 - 10.1109/TGRS.2025.3554931
DO - 10.1109/TGRS.2025.3554931
M3 - Article
AN - SCOPUS:105001292992
SN - 0196-2892
VL - 63
JO - IEEE Transactions on Geoscience and Remote Sensing
JF - IEEE Transactions on Geoscience and Remote Sensing
M1 - 5402914
ER -