.TH QCCVIDMESHMOTIONESTIMATIONSEARCH 3 "QCCPACK" ""
.SH NAME
QccVIDMeshMotionEstimationWarpMesh,
QccVIDMeshMotionEstimationSearch,
QccVIDMeshMotionEstimationCreateCompensatedFrame
\- routines for motion estimation and compensation
using regular triangle meshes
.SH SYNOPSIS
.B #include "libQccPack.h"
.sp
.BI "int QccVIDMeshMotionEstimationWarpMesh(const QccRegularMesh *" reference_mesh ", QccRegularMesh *" current_mesh ", const QccIMGImageComponent *" motion_vectors_horizontal ", const QccIMGImageComponent *" motion_vectors_vertical );
.br
.sp
.BI "int QccVIDMeshMotionEstimationSearch(const QccIMGImageComponent *" current_frame ", QccRegularMesh *" current_mesh ", const QccIMGImageComponent *" reference_frame ", const QccRegularMesh *" reference_mesh ", QccIMGImageComponent *" motion_vectors_horizontal ", QccIMGImageComponent *" motion_vectors_vertical ", int " block_size ", int " window_size ", int " subpixel_accuracy ", int " constrained_boundary ", int " exponential_kernel );
.br
.sp
.BI "int QccVIDMeshMotionEstimationCreateCompensatedFrame(QccIMGImageComponent *" motion_compensated_frame ", const QccRegularMesh *" current_mesh ", const QccIMGImageComponent *" reference_frame ", const QccRegularMesh *" reference_mesh ", int " subpixel_accuracy );
.SH DESCRIPTION
.BR QccVIDMeshMotionEstimationSearch()
and
.BR QccVIDMeshMotionEstimationCreateCompensatedFrame()
perform motion estimation and compensation, respectively,
between two video frames using a regular triangle mesh
rather than square blocks as in the ubiquitous block-based
motion estimation/compensation (i.e., see
.BR QccVIDMotionEstimationFullSearch (3)).
This regular triangle mesh is created by dividing the reference
frame into square blocks and then splitting each block along its diagonal.
.LP
For motion estimation via
.BR QccVIDMeshMotionEstimationSearch() ,
the triangle vertices, or "control points," of the regular mesh are
tracked from the reference frame to the current frame via a
simple, block-based motion-estimation strategy due to Eckert
.IR "et al" .
In this approach, motion into the current frame is estimated by
centering a small block at each vertex in the
reference-frame mesh and finding the best-matching block
in the current frame.
.LP
Once motion is estimated in this manner
and a field of motion vectors determined,
.BR QccVIDMeshMotionEstimationWarpMesh()
can be used to create a motion-compensated version of the
mesh in the current frame from the mesh
in the reference frame. In essence,
.BR QccVIDMeshMotionEstimationWarpMesh()
"warps" the reference-frame mesh into the current frame
by adding to each vertex of the mesh
its corresponding motion vector.
.LP
Once the motion-compensated mesh is available in the current frame,
.BR QccVIDMeshMotionEstimationCreateCompensatedFrame()
can be used to perform motion compensation between the frames.
That is,
.BR QccVIDMeshMotionEstimationCreateCompensatedFrame()
uses affine transforms between the two meshes
to construct a motion-compensated prediction of the current frame from the
reference frame.
This motion-compensated frame is constructed by
creating, for each triangle in the reference-frame mesh,
an affine transform that maps the triangle into the current-frame mesh.
This affine transform is then used to map
the pixels corresponding to the triangle
in the reference frame into the current frame,
with bilinear interpolation between
the surrounding four integer-pixel locations used to resolve
subpixel positions produced by the affine mapping.
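.LP
The following sketch (not part of the QccPack distribution) illustrates one
plausible encoder-side calling sequence at integer-pixel accuracy. The
wrapper-function name is hypothetical, the block and window sizes are
arbitrary illustrative choices, and the frames, meshes, motion-vector fields,
and motion-compensated frame are assumed to have been allocated and
initialized elsewhere by the appropriate QccPack routines.
.sp
.nf
#include "libQccPack.h"

/* Sketch only; all structures are assumed to be allocated and
   initialized by the caller. */
static int mesh_memc_sketch(const QccIMGImageComponent *current_frame,
                            const QccIMGImageComponent *reference_frame,
                            QccRegularMesh *current_mesh,
                            const QccRegularMesh *reference_mesh,
                            QccIMGImageComponent *motion_vectors_horizontal,
                            QccIMGImageComponent *motion_vectors_vertical,
                            QccIMGImageComponent *motion_compensated_frame)
{
  /* Track the mesh vertices from the reference frame into the current
     frame; the warped current-frame mesh is returned in current_mesh. */
  if (QccVIDMeshMotionEstimationSearch(current_frame,
                                       current_mesh,
                                       reference_frame,
                                       reference_mesh,
                                       motion_vectors_horizontal,
                                       motion_vectors_vertical,
                                       8,                    /* block_size (illustrative) */
                                       15,                   /* window_size (illustrative) */
                                       QCCVID_ME_FULLPIXEL,  /* integer-pixel accuracy */
                                       1,                    /* constrained_boundary */
                                       1))                   /* exponential_kernel */
    return 1;

  /* Predict the current frame from the reference frame using affine
     transforms between the two meshes. */
  if (QccVIDMeshMotionEstimationCreateCompensatedFrame(motion_compensated_frame,
                                                       current_mesh,
                                                       reference_frame,
                                                       reference_mesh,
                                                       QCCVID_ME_FULLPIXEL))
    return 1;

  return 0;
}
.fi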
.SS "Motion Estimation"
.BR QccVIDMeshMotionEstimationSearch()
performs a motion-estimation search to produce a motion-vector field
between
.I reference_frame
and
.IR current_frame .
.IR reference_mesh
is the regular mesh in the reference frame, and
.BR QccVIDMeshMotionEstimationSearch()
estimates the motion of the
vertices of this mesh, producing a motion-vector field
which is returned in
.I motion_vectors_horizontal
and
.IR motion_vectors_vertical .
.BR QccVIDMeshMotionEstimationSearch()
calls
.BR QccVIDMeshMotionEstimationWarpMesh()
(see below)
to produce the corresponding motion-compensated mesh in the current frame,
which is returned as
.IR current_mesh .
.LP
.IR block_size
gives the size of the square block that is centered at each
mesh vertex in order to determine the vertex motion, following
the block-based vertex-motion estimation procedure outlined by Eckert
.IR "et al" .
.LP
.IR window_size
gives the size of the motion-estimation search window about the current
vertex location.
.LP
.I subpixel_accuracy
is one of
.BR QCCVID_ME_FULLPIXEL ,
.BR QCCVID_ME_HALFPIXEL ,
.BR QCCVID_ME_QUARTERPIXEL ,
or
.BR QCCVID_ME_EIGHTHPIXEL ,
indicating full-, half-, quarter-, or eighth-pixel accuracy.
If anything other than integer-pixel accuracy is used,
.BR QccVIDMotionEstimationCreateReferenceFrame (3)
must be called on both
.IR current_frame
and
.IR reference_frame
to interpolate them to the appropriate subpixel accuracy
prior to calling
.BR QccVIDMeshMotionEstimationSearch() .
.LP
If
.IR constrained_boundary
is 1,
.BR QccVIDMeshMotionEstimationSearch()
constrains all vertices that lie on the boundary of the reference frame
to have zero-valued motion vectors. In doing so, the resulting
.IR current_mesh
is guaranteed to cover the entire
.IR current_frame
with no "gaps."
If
.IR constrained_boundary
is 0, no such guarantee is in place, and motion vectors for the image-boundary
vertices can take on any value, perhaps moving into the interior of
the image or beyond the
bounds of the image.
This latter, unconstrained approach
may permit better motion estimation at the
expense of some "gaps" possibly arising in the
corresponding motion-compensated frame.
.LP
If
.IR exponential_kernel
is 1,
an exponential function is used to create a kernel for the block-based
search process that estimates the motion of the mesh vertices. This
exponential kernel provides greater weight to the pixels in the
center of the block (i.e., corresponding to the vertex of interest itself) and
exponentially decreasing weight to pixels distant from the center.
If
.IR exponential_kernel
is 0, all pixels in the block are weighted the same in the motion-estimation
search.
See Eckert
.IR "et al" .
and Schroder and Mech.
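.LP
As an illustration of these parameters, the following fragment (a sketch,
not part of the QccPack distribution) invokes the search at half-pixel
accuracy with an unconstrained boundary and a uniform (non-exponential)
kernel. It assumes that
.I current_frame
and
.I reference_frame
have already been interpolated to half-pixel accuracy via
.BR QccVIDMotionEstimationCreateReferenceFrame (3)
and that all structures have been allocated elsewhere; the block and window
sizes are again arbitrary illustrative choices.
.sp
.nf
/* Half-pixel search; both frames are assumed to have been interpolated
   to half-pixel accuracy beforehand. */
if (QccVIDMeshMotionEstimationSearch(current_frame,
                                     current_mesh,
                                     reference_frame,
                                     reference_mesh,
                                     motion_vectors_horizontal,
                                     motion_vectors_vertical,
                                     16,                   /* block_size (illustrative) */
                                     31,                   /* window_size (illustrative) */
                                     QCCVID_ME_HALFPIXEL,
                                     0,   /* unconstrained boundary vertices */
                                     0))  /* uniform block weighting */
  return 1;
.fi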
.SS "Motion Compensation"
.BR QccVIDMeshMotionEstimationCreateCompensatedFrame()
constructs the motion-compensated prediction of
the current frame from
.I reference_frame
using affine transforms between the reference-frame mesh,
.IR reference_mesh ,
and the current-frame mesh,
.IR current_mesh .
The motion-compensated frame is returned in
.IR motion_compensated_frame ,
which must be allocated prior to calling
.BR QccVIDMeshMotionEstimationCreateCompensatedFrame() .
.LP
.IR subpixel_accuracy
is one of
.BR QCCVID_ME_FULLPIXEL ,
.BR QCCVID_ME_HALFPIXEL ,
.BR QCCVID_ME_QUARTERPIXEL ,
or
.BR QCCVID_ME_EIGHTHPIXEL ,
indicating full-, half-, quarter-, or eighth-pixel accuracy.
If anything other than integer-pixel accuracy is used,
.BR QccVIDMotionEstimationCreateReferenceFrame (3)
must be called on
.IR reference_frame
to interpolate it to the appropriate subpixel accuracy
prior to calling
.BR QccVIDMeshMotionEstimationCreateCompensatedFrame() .
On the other hand,
.IR motion_compensated_frame
must be the same size as the original current and reference frames
in all cases (i.e., it is not interpolated to subpixel accuracy).
.LP
.BR QccVIDMeshMotionEstimationCreateCompensatedFrame()
uses
.BR QccTriangleCreateAffineTransform (3)
to construct an affine transform between each pair of
reference-frame and current-frame triangles, and
.BR QccPointAffineTransform (3)
to map a pixel from the reference frame to the current frame.
.SS "Mesh Warping"
.BR QccVIDMeshMotionEstimationWarpMesh()
constructs a mesh in the current frame,
.IR current_mesh ,
from a mesh in the reference frame,
.IR reference_mesh ,
by adding motion vectors to each vertex of
.IR reference_mesh .
The motion vectors are specified by
.IR motion_vectors_horizontal
and
.IR motion_vectors_vertical
and are usually obtained via
.BR QccVIDMeshMotionEstimationSearch() .
.IR current_mesh
must be allocated to the same size as
.IR reference_mesh
prior to calling
.BR QccVIDMeshMotionEstimationWarpMesh() .
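.LP
For completeness, the following fragment (again a sketch, not part of the
QccPack distribution) shows decoder-side use of these two routines: given
motion-vector fields that have already been recovered (for example, via
.BR QccVIDMotionVectorsDecode (3)),
the reference-frame mesh is warped into the current frame and the
motion-compensated prediction is then formed at integer-pixel accuracy.
All structures are assumed to be allocated elsewhere, with
.I current_mesh
allocated to the same size as
.IR reference_mesh .
.sp
.nf
/* Warp the reference-frame mesh into the current frame by adding the
   decoded motion vectors to its vertices. */
if (QccVIDMeshMotionEstimationWarpMesh(reference_mesh,
                                       current_mesh,
                                       motion_vectors_horizontal,
                                       motion_vectors_vertical))
  return 1;

/* Form the motion-compensated prediction of the current frame. */
if (QccVIDMeshMotionEstimationCreateCompensatedFrame(motion_compensated_frame,
                                                     current_mesh,
                                                     reference_frame,
                                                     reference_mesh,
                                                     QCCVID_ME_FULLPIXEL))
  return 1;
.fi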
.SH "RETURN VALUE"
These routines return 0 on success, and 1 on failure.
.SH "SEE ALSO"
.BR QccVIDMotionVectorsEncode (3),
.BR QccVIDMotionVectorsDecode (3),
.BR QccRegularMesh (3),
.BR mesh_memc (1),
.BR QccPackVID (3),
.BR QccPackENT (3),
.BR QccPack (3)
.LP
Y. Altunbasak, A. M. Tekalp, and G. Bozdagi,
"Two-Dimensional Object-based Coding Using a Content-based Mesh
and Affine Motion Parameterization," in
.IR "Proceedings of the International Conference on Image Processing" ,
Washington, DC, October 1995, vol. 2, pp. 394-397.
.LP
M. Eckert, D. Ruiz, J. I. Ronda, and N. Garcia,
"Evaluation of DWT and DCT for Irregular Mesh-based
Motion Compensation in Predictive Video Coding," in
.IR "Visual Communications and Image Processing" ,
K. N. Ngan, T. Sikora, and M.-T. Sun, Eds., Proc. SPIE 4067,
June 2000, pp. 447-456.
.LP
K. Schroder and R. Mech,
"Combined Description of Shape and Motion in an Object Based
Coding Scheme Using Curved Triangles," in
.IR "Proceedings of the International Conference on Image Processing" ,
Washington, DC, October 1995, vol. 2, pp. 390-393.
.LP
Y. Wang, S. Cui, and J. E. Fowler,
"3D Video Coding Using Redundant-Wavelet Multihypothesis and
Motion-Compensated Temporal Filtering," in
.IR "Proceedings of the International Conference on Image Processing" ,
Barcelona, Spain, September 2003, vol. 2, pp. 755-758.
.LP
Y. Wang, S. Cui, and J. E. Fowler,
"3D Video Coding with Redundant-Wavelet Multihypothesis,"
.IR "IEEE Transactions on Circuits and Systems for Video Technology" ,
submitted July 2003; revised April 2004, March 2005.
.SH AUTHOR
Copyright (C) 1997-2007 James E. Fowler
.\"  The programs herein are free software; you can redistribute them and/or
.\"  modify them under the terms of the GNU General Public License
.\"  as published by the Free Software Foundation; either version 2
.\"  of the License, or (at your option) any later version.
.\"
.\"  These programs are distributed in the hope that they will be useful,
.\"  but WITHOUT ANY WARRANTY; without even the implied warranty of
.\"  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
.\"  GNU General Public License for more details.
.\"
.\"  You should have received a copy of the GNU General Public License
.\"  along with these programs; if not, write to the Free Software
.\"  Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
