Vol. 51 No. 1 (2024): CAA2024 Across the Horizon. Proceedings of the 51st Conference on Computer Applications and Quantitative Methods in Archaeology

Statistical image processing (Decorrelation Stretch) and deep learning (CycleGANs) to restore images of faded artworks

DOI
https://doi.org/10.64888/caaproceedings.v51i1.899
Submitted
December 14, 2025
Published
January 12, 2026

Abstract

Cultural properties gradually fade and deteriorate over time due to various factors, including exposure to ultraviolet and infrared radiation, fluctuations in temperature and humidity that degrade organic materials, and the accumulation of dust and other contaminants on the surface. These factors diminish the visibility of the artwork, often requiring professional restoration techniques that are time-consuming, costly, and potentially irreversible. This study proposes a method to digitally restore the visual clarity of cultural properties by combining statistical image processing with deep learning. First, Decorrelation Stretch (DStretch) is applied to transform the color profile of photographed images, enhancing the visibility of faded motifs such as characters and illustrations. Next, Cycle-Consistent Generative Adversarial Networks (CycleGANs) are applied to learn the color-domain transformation between the DStretch-processed images of faded artworks and reference images of similar, minimally faded artworks. This enables the generation of estimated pre-faded images without requiring any physical intervention on the actual cultural property. The proposed method facilitates the digital restoration and utilization of cultural properties, contributing to their preservation and broader public access. Furthermore, by reusing and reanalyzing processed image data, this approach fosters reproducibility in cultural heritage research, which is often susceptible to subjectivity.
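The decorrelation stretch step can be sketched as follows. This is a minimal NumPy implementation of the general technique (PCA-decorrelate the colour bands, equalise their variances, rotate back to RGB), not the DStretch software used in the paper; the function name and the `target_sigma` parameter are illustrative assumptions.

```python
import numpy as np

def decorrelation_stretch(img, target_sigma=50.0):
    """Decorrelation stretch of an RGB image of shape (H, W, 3).

    PCA-decorrelates the colour bands, rescales every principal
    component to a common standard deviation (exaggerating subtle
    colour differences), then rotates back into RGB space.
    """
    h, w, c = img.shape
    pixels = img.reshape(-1, c).astype(np.float64)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # cov = V diag(l) V^T
    # Scale each principal axis to the target standard deviation.
    scale = target_sigma / np.sqrt(np.maximum(eigvals, 1e-12))
    transform = eigvecs @ np.diag(scale) @ eigvecs.T
    stretched = (pixels - mean) @ transform + mean
    return np.clip(stretched, 0, 255).reshape(h, w, c)
```

The intuition: the pixels of a faded motif occupy a thin, elongated cluster in RGB space, so the colour bands are highly correlated. Equalising the variance along every principal axis spreads that cluster out, which is why faint pigment traces become visible.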
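The CycleGAN step rests on a cycle-consistency objective: with a generator G mapping the faded domain towards the reference domain and a generator F mapping back, training penalises round trips that do not return the input, L_cyc = E[‖F(G(x)) − x‖₁] + E[‖G(F(y)) − y‖₁]. This is what allows learning from unpaired images. A toy sketch of the loss, with simple linear colour maps standing in for the trained convolutional generators (everything below is an illustrative assumption, not the paper's model):

```python
import numpy as np

# Toy stand-ins for the two CycleGAN generators: G maps the faded
# colour domain towards the reference (vivid) domain, F maps back.
def G(x):
    return np.clip(1.25 * x + 10.0, 0, 255)

def F(y):
    return np.clip((y - 10.0) / 1.25, 0, 255)

def cycle_consistency_loss(x, y):
    """L_cyc = E[|F(G(x)) - x|] + E[|G(F(y)) - y|] (L1 norm).

    Penalises generator pairs whose round trip fails to reproduce
    the input image.
    """
    return np.mean(np.abs(F(G(x)) - x)) + np.mean(np.abs(G(F(y)) - y))
```

In a real CycleGAN this term is combined with adversarial losses from two discriminators, one per colour domain; the sketch isolates only the cycle term.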

References

  1. Agapiou, A.; Lysandrou, V.; Sarris, A.; Hadjimitsis, D. 2019. The use of remote sensing for the detection of buried archaeological remains. Journal of Archaeological Science 104, 56–68.
  2. Belard, R. 2010. The May 1st Sutra: conservation of a Nara-period handscroll. Journal of the Institute of Conservation 33(1), 93–109. https://doi.org/10.1080/19455220903516767
  3. Chiang, C. H.; Yeh, H. T.; Lin, Y. T. 2020. Multispectral imaging and decorrelation stretch for cultural heritage documentation. Journal of Cultural Heritage 45, 148–156.
  4. Fiske, B.; Morenus, L. S. 2004. Ultraviolet and Infrared Examination of Japanese Woodblock Prints: Identifying Reds and Blues. The Book and Paper Group Annual 23. Available online: https://cool.culturalheritage.org/coolaic/sg/bpg/annual/v23/bpga23-05.pdf
  5. Gillespie, A. R.; Kahle, A. B.; Walker, R. E. 1986. Color enhancement of highly correlated images. I. Decorrelation and HSI contrast stretches. Remote Sensing of Environment 20(3), 209–235. https://doi.org/10.1016/0034-4257(86)90044-1
  6. Harman, J. 2005. Using Decorrelation Stretch to Enhance Rock Art Images. In American Rock Art Research Association Annual Meeting, 28 May 2005; updated 2006, 2008. Available online: https://www.dstretch.com/AlgorithmDescription.pdf
  7. Kobayashi, Y. 1986. Problems of discolouration and fading in museums: Preliminary observations using Japanese painting pigments. The Annual Report of the Historical Museum of Hokkaido 14, 133–140. https://doi.org/10.24484/sitereports.128292-114338
  8. Li, Y. 2017. Chinese Objects Recovered from Sutra Mounds in Japan, 1000–1300. In Visual and Material Cultures in Middle Period China, Ebrey, P. B.; Huang, S.-s. S., eds.; Brill: Leiden, The Netherlands, 284–318.
  9. Pietroni, E.; Ferdani, D. 2021. Virtual Restoration and Virtual Reconstruction in Cultural Heritage: Terminology, Methodologies, Visual Representation Techniques and Cognitive Models. Information 12(4), 167. https://doi.org/10.3390/info12040167
  10. Putranti, N. E.; Chang, S.-J.; Raffiudin, M. 2025. Revitalizing Art with Technology: A Deep Learning Approach to Virtual Restoration. JISKA (Jurnal Informatika Sunan Kalijaga) 10(1), 87–99. https://doi.org/10.14421/jiska.2025.10.1.87-99
  11. Sun, P.; Hou, M.; Lyu, S.; Wang, W.; Li, S.; Mao, J.; Li, S. 2022. Enhancement and Restoration of Scratched Murals Based on Hyperspectral Imaging—A Case Study of Murals in the Baoguang Hall of Qutan Temple. Sensors 22(24), 9780. https://doi.org/10.3390/s22249780
  12. Thumas, J. 2022. Buried Scripture and the Interpretation of Ritual. Cambridge Archaeological Journal 32(4), 585–599. https://doi.org/10.1017/S0959774322000038
  13. Tolosana, R.; Vera-Rodriguez, R.; Fierrez, J.; Morales, A.; Ortega-Garcia, J. 2020. Deep fakes and beyond: A survey of face manipulation and fake detection. Information Fusion 64, 131–148. https://doi.org/10.48550/arXiv.2001.00179
  14. Uyeda, T. 2022. Reuse and Recycling: Implications in Japanese Painting Conservation. Ars Orientalis 52, 222–243. https://doi.org/10.3998/ars.3994
  15. Wei, Z.; Feng, Z.; Tan, H. 2023. Key to the conservation of calligraphy and painting relics in collection: proposing a lighting damage evaluation method. Heritage Science 11. https://doi.org/10.1186/s40494-023-00945-0
  16. Zhu, J. Y.; Park, T.; Isola, P.; Efros, A. A. 2017. Unpaired image-to-image translation using Cycle-Consistent Adversarial Networks. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2223–2232. Available online: https://arxiv.org/abs/1703.10593