Object-Coherence Warping for Stereoscopic
Image Retargeting

Shih-Syun Lin     Chao-Hung Lin     Shu-Huai Chang     Tong-Yee Lee    

Accepted for publication in IEEE Transactions on Circuits and Systems for Video Technology.

Teaser

Combining cropping with the proposed method. Left: original stereoscopic image; middle: retargeting using the proposed method with cropping; right: retargeting using the proposed method only.

Abstract

This paper addresses the topic of content-aware stereoscopic image retargeting. The key to this topic is consistently adapting a stereoscopic image to fit displays with various aspect ratios and sizes while preserving visually salient content. Most methods focus on preserving the disparities and shapes of visually salient objects through nonlinear image warping, in which distortions caused by warping are propagated to homogeneous and low-significance regions. However, disregarding the consistency of object deformation sometimes results in apparent distortions in both the disparities and shapes of objects. An object-coherence warping scheme is proposed to reduce this unwanted distortion. The basic idea is to utilize the information of matched objects rather than that of matched pixels in warping. Such information implies object correspondences in a stereoscopic image pair, which allows the generation of an object significance map and the consistent preservation of objects. This strategy enables our method to consistently preserve both the disparities and shapes of visually salient objects, leading to good content-aware retargeting. In the experiments, qualitative and quantitative analyses of various stereoscopic images show that our results are better than those generated by related methods in terms of consistency of object preservation.
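The core idea above, assigning significance at the object level so that matched objects in the left and right views deform consistently, can be sketched as follows. This is an illustrative simplification, not the paper's exact formulation: it assumes per-pixel saliency maps, integer object label maps, and a given list of left/right object correspondences, and it propagates one shared significance value (here, the mean saliency over both views) to each matched object.

```python
import numpy as np

def object_significance_maps(sal_L, sal_R, objs_L, objs_R, matches):
    """Illustrative sketch of an object significance map.

    sal_L, sal_R   : per-pixel saliency maps (H x W floats in [0, 1])
    objs_L, objs_R : integer object label maps, 0 = background
    matches        : list of (label_L, label_R) object correspondences
    """
    sig_L = sal_L.copy()
    sig_R = sal_R.copy()
    for lab_L, lab_R in matches:
        mask_L = objs_L == lab_L
        mask_R = objs_R == lab_R
        # One significance value per matched object: the mean saliency
        # over the object's pixels in both views (a simplifying assumption),
        # so a warp driven by this map treats the object coherently.
        s = (sal_L[mask_L].sum() + sal_R[mask_R].sum()) / (
            mask_L.sum() + mask_R.sum()
        )
        sig_L[mask_L] = s
        sig_R[mask_R] = s
    return sig_L, sig_R
```

Because both views of a matched object receive the same significance, a warp guided by these maps shrinks or preserves the object by the same amount in the left and right images, which is what keeps disparities and shapes consistent.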

Experimental Results


Comparison of shape preservation with the seam-carving-based method (SSC), the mesh-warping-based method (SMW), linear scaling (LS), and the proposed method. The left image and the stereoscopic image are shown at the top and bottom, respectively.

URL

Our experimental results can be downloaded from [Link].

In addition, some of the test images are selected from the CMU/VASC [1] and NVIDIA [2] datasets.

[1] “CMU/VASC Image Database: Stereo Image,” http://vasc.ri.cmu.edu/idb/html/stereo/.

[2] “NVIDIA Image Database: Stereo Image,” http://photos.3dvisionlive.com/NVIDIA/.