Motion-based Video Retargeting with Optimized Crop-and-Warp

Yu-Shuen Wang1,2, Hui-Chih Lin1, Olga Sorkine2, Tong-Yee Lee1

ACM Transactions on Graphics (Proceedings of SIGGRAPH 2010)

1National Cheng Kung University, Taiwan, 2New York University

(U.S. Provisional Patent Application No. 61/334,953, May 14, 2010)

 

Our video retargeting framework has motion information at its core, using it to define the temporal persistence of video content and to formulate temporal coherence constraints. Left: we combine cropping and warping by forcing all informative video content inside the target video cube, without constraining the size of each frame a priori. Right: parts of the bunny and the squirrel are allowed to be cropped (top) since they will fully appear later in the video (bottom).

 

Abstract

We introduce a video retargeting method that achieves high-quality resizing to arbitrary aspect ratios for complex videos containing diverse camera and dynamic motions. Previous content-aware retargeting methods mostly concentrated on spatial considerations, attempting to preserve the shape of salient objects in each frame by removing or distorting homogeneous background content. However, such expendable space is fundamentally limited in videos, since object motion correlates foreground and background regions, causing waving and squeezing artifacts. We solve the retargeting problem by explicitly employing motion information and by distributing distortion in both the spatial and temporal dimensions. We combine novel cropping and warping operators: the cropping removes temporally recurring content, and the warping utilizes available homogeneous regions to mask deformations while preserving motion. Variational optimization allows us to find the best balance between the two operations, enabling retargeting of challenging videos with complex motions, numerous prominent objects and arbitrary depth variability. Our method compares favorably with state-of-the-art retargeting systems, as demonstrated in the examples and widely supported by the conducted user study.
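To make the kind of optimization described above more concrete, below is a minimal, self-contained sketch in Python (NumPy/SciPy). It is not the paper's implementation: it reduces retargeting to 1-D column positions per frame and solves a single sparse least-squares system with three soft terms: saliency-weighted shape preservation, temporal coherence between frames, and deliberately weak boundary attraction, so that low-saliency borders may drift outside the target window (i.e., be cropped) rather than be squeezed. The toy saliency map, the 1-D column model and all weights are illustrative assumptions.

# A minimal crop-and-warp sketch, NOT the authors' implementation: retarget the
# width of a short clip by solving for per-frame column positions x[f, i] in
# target coordinates with a single sparse least-squares system.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import lsqr


def crop_and_warp_columns(saliency, src_w, dst_w,
                          w_shape=1.0, w_scale=0.05, w_time=2.0, w_bound=0.05):
    """saliency: (frames, columns) array in [0, 1].
    Returns (frames, columns + 1) column positions in target coordinates;
    positions outside [0, dst_w] correspond to cropped-away content."""
    F, C = saliency.shape
    n = C + 1                        # grid positions per frame
    src_step, dst_step = src_w / C, dst_w / C
    rows, cols, vals, rhs = [], [], [], []

    def idx(f, i):                   # flatten (frame, position) -> variable index
        return f * n + i

    def add(coeffs, b):              # append one weighted linear equation
        r = len(rhs)
        for c, v in coeffs:
            rows.append(r)
            cols.append(c)
            vals.append(v)
        rhs.append(b)

    for f in range(F):
        for i in range(C):
            # shape preservation: salient columns keep their original width
            w = w_shape * saliency[f, i]
            add([(idx(f, i + 1), w), (idx(f, i), -w)], w * src_step)
            # weak pull of every column toward the uniform target width
            add([(idx(f, i + 1), w_scale), (idx(f, i), -w_scale)], w_scale * dst_step)
        # weak boundary attraction: low weight lets low-saliency borders be cropped
        add([(idx(f, 0), w_bound)], 0.0)
        add([(idx(f, C), w_bound)], w_bound * dst_w)

    # temporal coherence: column trajectories change slowly from frame to frame
    for f in range(F - 1):
        for i in range(n):
            add([(idx(f + 1, i), w_time), (idx(f, i), -w_time)], 0.0)

    A = sparse.csr_matrix((vals, (rows, cols)), shape=(len(rhs), F * n))
    return lsqr(A, np.array(rhs))[0].reshape(F, n)


if __name__ == "__main__":
    # toy clip: a salient block drifting right over a homogeneous background
    F, C, src_w, dst_w = 30, 40, 400.0, 240.0
    sal = np.full((F, C), 0.05)
    for f in range(F):
        sal[f, 10 + f // 3: 20 + f // 3] = 1.0
    grid = crop_and_warp_columns(sal, src_w, dst_w)
    inside = ((grid >= 0.0) & (grid <= dst_w)).mean()
    print("fraction of grid positions kept inside the target window:", round(inside, 3))

In this toy setup the salient block keeps close to its original width, the homogeneous background absorbs most of the squeezing, and a small fraction of the boundary positions ends up outside the target window, which is the crop. In the actual method the temporal terms are motion-compensated and the problem is solved over a 2-D warping grid per frame, which this sketch does not attempt to reproduce.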

Paper

User Study

UserStudyData.pdf

http://cims.nyu.edu/~yw572/

Accompanying Video

VideoRetargeting.mp4

Supplemental Results

MARcomp.mp4    SVRcomp.mp4    Widening.mp4    MultiRes.mp4    limitation.mp4

Bibtex

   @article{CropWarp:2010,
      author  = {Yu-Shuen Wang and Hui-Chih Lin and Olga Sorkine and Tong-Yee Lee},
      title   = {Motion-based Video Retargeting with Optimized Crop-and-Warp},
      journal = {ACM Transactions on Graphics (Proceedings of ACM SIGGRAPH)},
      year    = {2010},
      volume  = {29},
      number  = {4},
      pages   = {article no.\ 90},
   }

Acknowledgement

We thank the anonymous reviewers for their constructive comments. We are grateful to Alexander Hornung and Manuel Lang for insightful discussions and for helping us with the comparisons. We are also grateful to Tino Weinkauf for his comments, to Alec Jacobson for narrating the accompanying video, to Joyce Meng for her help with the video materials and licensing, to the members of the Computer Graphics Group/Visual System Lab, National Cheng Kung University, in particular Kun-Chuan Feng, for helping to conduct the user evaluation, and to all the users who participated in the user study. Use of the video clips is permitted by ARS Film Production, Blender Foundation and MAMMOTH HD. This work was supported in part by the Landmark Program of the NCKU Top University Project (contract B0008) and by an NYU URCF grant.