
Detail Preserving Shape Deformation in Image Editing



  1. Detail Preserving Shape Deformation in Image Editing SIGGRAPH 2007 Hui Fang and John C. Hart

  2. Abstract • We propose an image editing system that deforms the shape of an image region • Detail and orientation are preserved by resynthesizing texture from the source image • A patch-based texture synthesis aligns texture features with image features

  3. Introduction • A novel image editing system that lets a user select and move one or more image feature curves • Any texture stretched by the deformation is replaced with resynthesized texture • An anisotropic, feature-aligned texture synthesis step preserves texture detail • The texture coordinates of each patch are distorted to align with the target image features • Patches are merged with GraphCut textures [Kwatra et al. 2003]

  4. Introduction • A new method that distorts the texture coordinates of each patch • Image Analogies [Hertzmann et al. 2001] can synthesize a texture to adhere to a given feature line, but yields more high-frequency noise than modern patch-based synthesis • Image Quilting [Efros and Freeman 2001] could fill different silhouettes with a texture, but boundary patches appeared to repeat • Feature matching and deformation for texture synthesis [Wu and Yu 2004] distorted neighboring patches to connect their feature lines, but is not as global as our approach

  5. Overview • Deformation: draw feature curves in the source image, then move them to their desired destination positions • Curvilinear Coordinates: define curvilinear coordinates from the curve tangent vectors using Euler integration • Textured Patch Generation: a pair of curvilinear coordinate grids is generated, and texture is synthesized over the destination grid from the source • Image Synthesis: finalize the synthesis via GraphCut

  6. Deformation (figure: source feature curve pi(t) and its moved copy p'i(t)) • Boundary conditions on the deformation field: D(p'f) = pf − p'f on the feature curves, D(∂I') = 0 on the image boundary
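A minimal sketch of how such a deformation field might be computed: the slide only gives the boundary conditions, and slide 20 mentions an ordinary Laplacian deformation, so this diffuses the constrained displacements over the interior by Jacobi relaxation. The array names and iteration count are illustrative, not the authors' implementation.

```python
import numpy as np

def diffuse_deformation(shape, constrained, target_disp, iters=2000):
    """Diffuse a displacement field D over the destination image I' by
    Jacobi relaxation of the discrete Laplace equation.

    shape       -- (H, W) of the destination image
    constrained -- bool (H, W) mask: True on destination feature-curve
                   pixels p'_f and on the image boundary dI'
    target_disp -- (H, W, 2) holding D at constrained pixels:
                   p_f - p'_f on the feature curves, 0 on the boundary
    """
    H, W = shape
    D = np.zeros((H, W, 2))
    D[constrained] = target_disp[constrained]
    for _ in range(iters):
        # Each unconstrained pixel becomes the average of its 4 neighbours.
        avg = 0.25 * (np.roll(D, 1, 0) + np.roll(D, -1, 0) +
                      np.roll(D, 1, 1) + np.roll(D, -1, 1))
        avg[constrained] = target_disp[constrained]   # re-impose constraints
        D = avg
    return D
```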

  7. Deformation (figure: original vs. deformed image)

  8. Curvilinear Coordinates (figure: destination feature curve p'i(t) with its diffused tangent field T')

  9. Curvilinear Coordinates • Since the parametrization of each feature curve is arbitrary, one can encounter global orientation inconsistencies • A separate tangent field is calculated for each curve, and only the field of the closest curve is used • We integrate these diffused tangents to construct a local curvilinear coordinate system extending from any chosen “origin” pixel

  10. Curvilinear Coordinates (figure: curvilinear grid around p'i(t), with spine direction j and rib direction k)

  11. Curvilinear Coordinates • Time-step ɛ = 1 • 30 to 40 pixels along spines (j direction) • 15 to 30 pixels wide ribs (k direction) • Stop two pixels short of a nearby feature curve to prevent overlap • Smooth the coordinates with several Laplacian iterations (λ = 0.7) • This removes singularities and self-intersections that can occur • It does not completely solve the problem, but residual artifacts are not very noticeable

  12. Curvilinear Coordinates
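A minimal sketch of the grid construction from slides 9 to 11, assuming the diffused tangent field is available as a function T(p) returning a unit tangent at point p: spines follow T (the j direction), ribs follow its perpendicular (the k direction), with forward Euler steps of size ɛ = 1 and λ = 0.7 Laplacian smoothing. The helper names are illustrative, and the stopping test near feature curves is omitted.

```python
import numpy as np

def curvilinear_grid(origin, T, n_spine=35, n_rib=20, eps=1.0):
    """Trace a curvilinear coordinate grid q[j, k] from an origin pixel.

    origin -- (x, y) of the chosen origin pixel q_{0,0}
    T      -- function returning the diffused unit tangent at a point
    """
    def perp(t):                        # rotate the tangent 90 degrees
        return np.array([-t[1], t[0]])

    grid = np.zeros((n_spine, n_rib, 2))
    p = np.asarray(origin, dtype=float)
    for j in range(n_spine):            # march along the spine (j)
        q = p.copy()
        for k in range(n_rib):          # march along the rib (k)
            grid[j, k] = q
            q = q + eps * perp(T(q))
        p = p + eps * T(p)
    return grid

def smooth_grid(grid, lam=0.7, iters=5):
    """Laplacian smoothing of interior grid points (lambda = 0.7)."""
    g = grid.copy()
    for _ in range(iters):
        avg = 0.25 * (g[:-2, 1:-1] + g[2:, 1:-1] +
                      g[1:-1, :-2] + g[1:-1, 2:])
        g[1:-1, 1:-1] += lam * (avg - g[1:-1, 1:-1])
    return g
```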

  13. Textured Patch Generation • Source origin q0,0 = D(q'0,0) • A bilinear filter looks up the color in the source image at the non-integer source coordinates • A unit-radius cone filter centered at each destination pixel accumulates the synthesized texture • This filtering causes a small reduction in the resolution of the resynthesized texture detail
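Since the mapped source positions generally land between source pixels, the bilinear lookup can be sketched as follows; the indexing convention and clamping are assumptions, not the paper's code.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Bilinearly interpolate img (H x W x C) at real-valued coords (x, y)."""
    H, W = img.shape[:2]
    x0 = int(np.clip(np.floor(x), 0, W - 2))
    y0 = int(np.clip(np.floor(y), 0, H - 2))
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0]     + fx * img[y0, x0 + 1]
    bot = (1 - fx) * img[y0 + 1, x0] + fx * img[y0 + 1, x0 + 1]
    return (1 - fy) * top + fy * bot
```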

  14. Image Synthesis • Use GraphCut [Kwatra et al. 2003] • Patches are generated individually, using a priority queue so that patches whose origin pixel is closest to the feature curve and adjacent to a previously synthesized patch are generated first • A pool of candidate textured patches is synthesized from source patches grown from origins randomly chosen in an 11×11 pixel region surrounding the point D(q'0,0) • The candidate with the least overlap difference against previously synthesized neighboring patches is chosen
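A minimal sketch of the candidate-selection step, assuming each candidate patch has already been resampled into destination pixel space; the sum-of-squared-differences cost and the mask handling are illustrative, not necessarily the authors' exact measure.

```python
import numpy as np

def pick_candidate(candidates, synthesized, mask, y0, x0):
    """Choose the candidate patch that best matches already-synthesized pixels.

    candidates  -- list of (h, w, 3) float textured patches for one grid cell
    synthesized -- (H, W, 3) destination image filled in so far
    mask        -- (H, W) bool, True where pixels are already synthesized
    (y0, x0)    -- top-left corner where the patch will be placed
    """
    best, best_cost = None, np.inf
    for patch in candidates:
        h, w = patch.shape[:2]
        region  = synthesized[y0:y0 + h, x0:x0 + w]
        overlap = mask[y0:y0 + h, x0:x0 + w]
        if not overlap.any():
            return patch                    # nothing to compare against yet
        cost = ((patch - region) ** 2)[overlap].sum()
        if cost < best_cost:
            best, best_cost = patch, cost
    return best                             # merged via GraphCut afterwards
```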

  15. Image Synthesis • The selected patch is merged into the destination via GraphCut • Poisson Image Editing is used when the seam produced by GraphCut is unsatisfactory

  16. Scale Adaptive Retexturing • The deformation field D can compress a large source area into a small target area • This causes blocky artifacts and seams • These occur when the origin pixels of neighboring patches in the target map to source positions with different texture characteristics • Can be overcome by altering the texture synthesis sampling

  17. Scale Adaptive Retexturing

  18. Scale Adaptive Retexturing • We detect these potential problems with a (real-valued) compression field C' • The compression field is clamped to values in [1,3] to limit its effect • The “spine” length and “rib” breadth of patches are reduced by a factor of C'(x,y)
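The slides do not spell out how C' is computed; below is a minimal sketch of one plausible definition, measuring how much the destination-to-source map s(x') = x' + D(x') locally expands area via a finite-difference Jacobian and then clamping to [1,3] as stated. The paper's exact definition may differ.

```python
import numpy as np

def compression_field(D):
    """Estimate a local compression factor C'(x, y) from the displacement
    field D (H x W x 2), where source position = destination + D."""
    H, W, _ = D.shape
    ys, xs = np.mgrid[0:H, 0:W]
    sx = xs + D[..., 0]                      # source x for each dest pixel
    sy = ys + D[..., 1]                      # source y for each dest pixel
    # Finite-difference Jacobian of the destination -> source map.
    dsx_dy, dsx_dx = np.gradient(sx)
    dsy_dy, dsy_dx = np.gradient(sy)
    area = np.abs(dsx_dx * dsy_dy - dsx_dy * dsy_dx)
    C = np.sqrt(area)                        # linear compression factor
    return np.clip(C, 1.0, 3.0)              # clamp to [1, 3] as on the slide

# Patch "spine" and "rib" sizes are then divided by C' at the patch origin.
```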

  19. Scale Adaptive Retexturing

  20. Results • Accelerated the construction of source feature curves by using portions of the segmentation boundary produced by Lazy Snapping [Li et al. 2004] • Feature curves do not need to match feature contours exactly, as deformed features were often aligned by the texture search • Used the ordinary Laplacian deformation for interactive preview • Denoted some feature curves as “passive” to aid texture orientation

  21. Results • The filtering used for curvilinear grid resampling removes some of the high-frequency detail • This could be recovered by sharpening with histogram interpolation and matching [Matusik et al. 2005]

  22. Results

  23. Results

  24. Results

  25. Failure case

  26. Results • Sharp image changes (like shading changes) should be identified by feature curves • A lack of feature curves will cause unrealistic discontinuities in the result • Poisson image editing hides some of these artifacts by softly blending the misaligned features

  27. Results • Timings measured on a 3.40 GHz Pentium 4 CPU (31 × 31 search domain for the beach example)

  28. Conclusion • Stretched texture details can be adequately recovered by a local retexturing around user-defined feature curves • Assumes that the orientation of texture detail of an image is related to the orientation of nearby feature curves • Matting can be used to eliminate unwanted artifacts (Fig. 5) • In practice the success of this approach depends primarily on the selection of the feature curves • The most promising direction of future work in this topic would be to add the automatic detection and organization of image feature curves
