Publication:
Unsupervised Stereoscopic Video Style Transfer

dc.contributor.authorImani, Hassan
dc.contributor.authorIslam, Md Baharul
dc.contributor.authorAhad, Md Atiqur Rahman
dc.contributor.institutionImani, Hassan, Department of Computer Engineering, Bahçeşehir Üniversitesi, Istanbul, Turkey
dc.contributor.institutionIslam, Md Baharul, Bahçeşehir Üniversitesi, Istanbul, Turkey, Florida Gulf Coast University, Fort Myers, United States
dc.contributor.institutionAhad, Md Atiqur Rahman, Department of Computer Science and Digital Technologies, University of East London, London, United Kingdom
dc.date.accessioned2025-10-05T15:07:05Z
dc.date.issued2023
dc.description.abstractCreative style transfer across photos and videos transfers the style of one image or video to another while maintaining the content of the recipient. This is particularly challenging for stereoscopic video, where stereoscopic properties must also be preserved. This paper proposes a stereoscopic video style transfer method that maintains temporal, depth, and stylization features. To resolve the conflict between style transfer and temporal consistency, we relax the objective function for each of the left and right views, making the stylization loss term more robust to motion. We apply a zero-shot video style transfer framework to each left and right video frame. To keep the stylized features consistent with the stereo features and to exploit cross-view information in the stylized stereo video, we extend the parallax attention mechanism (PAM) into ePAM and use it to combine information from the left and right views. We compare our method quantitatively and qualitatively with other image and video style transfer methods. Experimental results demonstrate its competitive performance over the state-of-the-art.
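The cross-view fusion the abstract describes can be illustrated with a toy sketch. This is not the paper's ePAM module: the function name, the single-scale feature shapes, and the 0.5 blending weight are assumptions for illustration only. It shows the general parallax-attention idea of matching left and right features along the horizontal (epipolar) axis and warping one view into the other.

```python
import numpy as np

def parallax_attention_fuse(feat_left, feat_right):
    """Toy PAM-style fusion: for each image row, attend from left-view
    features to right-view features along the horizontal axis, warp the
    right features into the left view, and blend the two views.

    feat_left, feat_right: arrays of shape (H, W, C).
    Returns fused left-view features of shape (H, W, C).
    """
    H, W, C = feat_left.shape
    fused = np.empty_like(feat_left)
    for y in range(H):
        q = feat_left[y]                               # (W, C) queries, left view
        k = feat_right[y]                              # (W, C) keys/values, right view
        scores = q @ k.T / np.sqrt(C)                  # (W, W) epipolar similarity
        scores -= scores.max(axis=1, keepdims=True)    # numerical stability
        attn = np.exp(scores)
        attn /= attn.sum(axis=1, keepdims=True)        # row-wise softmax
        warped = attn @ k                              # right features aligned to left positions
        fused[y] = 0.5 * (q + warped)                  # assumed 50/50 blend of both views
    return fused
```

Because attention is computed only within matching rows, the sketch reflects the rectified-stereo assumption that corresponding points differ only by a horizontal disparity.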
dc.identifier.conferenceName2023 Innovations in Intelligent Systems and Applications Conference, ASYU 2023
dc.identifier.conferencePlaceSivas Cumhuriyet University, Sivas, Turkey
dc.identifier.doi10.1109/ASYU58738.2023.10296716
dc.identifier.isbn9798350306590
dc.identifier.scopus2-s2.0-85178293926
dc.identifier.urihttps://doi.org/10.1109/ASYU58738.2023.10296716
dc.identifier.urihttps://hdl.handle.net/20.500.14719/8202
dc.language.isoen
dc.publisherInstitute of Electrical and Electronics Engineers Inc.
dc.subject.authorkeywordsDisparity
dc.subject.authorkeywordsPAM
dc.subject.authorkeywordsStereo Video
dc.subject.authorkeywordsStyle Transfer
dc.subject.indexkeywordsComputer vision
dc.subject.indexkeywordsStereo image processing
dc.subject.indexkeywordsZero-shot learning
dc.subject.indexkeywordsAttention mechanisms
dc.subject.indexkeywordsCreatives
dc.subject.indexkeywordsDisparity
dc.subject.indexkeywordsParallax attention mechanism
dc.subject.indexkeywordsProperty
dc.subject.indexkeywordsStereo video
dc.subject.indexkeywordsStereoscopic video
dc.subject.indexkeywordsStyle transfer
dc.subject.indexkeywordsTransfer method
dc.subject.indexkeywordsVideo contents
dc.subject.indexkeywordsGeometrical optics
dc.titleUnsupervised Stereoscopic Video Style Transfer
dc.typeConference Paper
dcterms.referencesGatys, Leon A., "Texture Synthesis Using Convolutional Neural Networks," Advances in Neural Information Processing Systems, 2015-January, pp. 262-270, 2015
dcterms.referencesGatys, Leon A., "Image Style Transfer Using Convolutional Neural Networks," Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2016-December, pp. 2414-2423, 2016
dcterms.referencesHuang, Xun, "Arbitrary Style Transfer in Real-Time with Adaptive Instance Normalization," Proceedings of the IEEE International Conference on Computer Vision, 2017-October, pp. 1510-1519, 2017
dcterms.referencesJohnson, Justin, "Perceptual Losses for Real-Time Style Transfer and Super-Resolution," Lecture Notes in Computer Science, 9906 LNCS, pp. 694-711, 2016
dcterms.referencesChen, Dongdong, "StyleBank: An Explicit Representation for Neural Image Style Transfer," Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2017-January, pp. 2770-2779, 2017
dcterms.referencesRuder, Manuel, "Artistic Style Transfer for Videos," Lecture Notes in Computer Science, 9796 LNCS, pp. 26-36, 2016
dcterms.referencesHuang, Haozhi, "Real-Time Neural Style Transfer for Videos," 2017-January, pp. 7044-7052, 2017
dcterms.referencesECCV, 2018
dcterms.referencesEilertsen, Gabriel, "Single-Frame Regularization for Temporally Stable CNNs," Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2019-June, pp. 11168-11177, 2019
dcterms.referencesWang, Wenjing, "Consistent Video Style Transfer via Relaxation and Regularization," IEEE Transactions on Image Processing, 29, pp. 9125-9139, 2020
dspace.entity.typePublication
local.indexed.atScopus
person.identifier.scopus-author-id54796733900
person.identifier.scopus-author-id57204631897
person.identifier.scopus-author-id23491419800