
Planning Tracking Motions for an Intelligent Virtual Camera



  1. Planning Tracking Motions for an Intelligent Virtual Camera Tsai-Yen Li & Tzong-Hann Yu Presented by Chris Varma May 22, 2002

  2. Presentation Outline
  • Problem considered: definition, related work, similar problems
  • Problem space: general formulation, specific formulation, actual formulation
  • Search criteria: planning efficiency, tracking direction, view distance, overall movement, view angle
  • Approach: best-first planning (BFP), cost function, post-processing steps
  • Implementation
  • Experiments
  • Improvements
  • Q&A

  3. Problem Definition
  • How to automatically generate viewpoint motions for a virtual camera according to the pre-planned trajectory of the guide in an interactive tour guide application
  • Different from active vision problems: requires maintaining constant visibility of the target while optimizing certain camera-specific criteria
  • Different from traditional path planning:
  • Must consider visibility constraints
  • The geometry that obstructs the camera's view is not the same as the geometry that obstructs its motion

  4. Related Work
  • Similar visibility-related problems in robotics:
  • Installing a minimal number of static sensors (or cameras) is the art gallery problem
  • Allowing sensors (or cameras) to move actively is the pursuit-evasion problem

  5. Similar Problems
  • Similar object-tracking problems were solved by Gleicher and Witkin (1992)
  • Used a dynamic programming approach to generate motion for an observer (or camera) tracking a moving target
  • But the target's trajectory was only partially known
  • If the motion of the target is predictable, an optimal solution can be found off-line
  • Otherwise, an on-line solution must be used
  • The planner was tested for off-line use: it took about 20 sec.
  • But that approach is not used here because:
  • The tour guide app is interactive, and 20 sec. is too slow
  • Our target's motion is predictable, so better solutions are possible off-line

  6. General Formulation
  • Target t and viewpoint v (or camera)
  • Parameterization: q_t = (x_t, y_t, θ_t) and q_v = (x_v, y_v, θ_v)
  • Free C-spaces of t and v: C_t^free and C_v^free
  • Composite free C-space of t and v: X_free = C_t^free × C_v^free, a 6D space in which the solution path should reside
  • Suppose the target's trajectory is given as a function of time t and all q_t are collision-free: CT(x_v, y_v, θ_v, t) = (x_t(t), y_t(t), θ_t(t), x_v, y_v, θ_v) is the configuration-time space
  • But not all configurations in CT are legal, because the visibility and velocity constraints of the camera must also be satisfied
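
Since the planner is implemented in Java (see the implementation slide), a minimal sketch of how these spaces might be represented follows; the type and field names are illustrative assumptions, not taken from the paper, which only defines the mathematics.

```java
// Illustrative data types for the composite configuration-time space;
// all names here are assumptions for the sketch.

/** A planar configuration (x, y, theta) for the target or the viewpoint. */
record Config(double x, double y, double theta) {}

/** The target's trajectory, given in advance as a collision-free function of time. */
interface Trajectory {
    Config at(double t); // q_t(t) = (x_t(t), y_t(t), theta_t(t))
}

/** A point in configuration-time space CT: the viewpoint's configuration at time t. */
record CTPoint(Config viewpoint, double t) {}
```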

  7. Specific Formulation
  • Specify v's configuration with respect to t's coordinate system: q_v' = (ø, l, φ), so CT' = C_v' × T
  • Tracking direction ø: orientation of the vector connecting t and v
  • Preferred view distance l
  • View angle φ: angle between the direction the camera is pointing and the vector connecting v and t
  • S: fixed width of the view cone
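
A hedged sketch of how this relative parameterization might map back to a world-space camera pose, reusing the Config record from the sketch above. The convention that the tracking direction ø is measured relative to the target's heading is an assumption; the slide leaves it implicit.

```java
/**
 * Convert the relative parameters (trackDir = ø, viewDist = l, viewAngle = φ)
 * into a world-space viewpoint pose. Assumes ø is measured relative to the
 * target's heading; the slides do not fix this convention.
 */
static Config toWorld(Config target, double trackDir, double viewDist, double viewAngle) {
    double dirToCamera = target.theta() + trackDir;            // direction from t to v
    double xv = target.x() + viewDist * Math.cos(dirToCamera);
    double yv = target.y() + viewDist * Math.sin(dirToCamera);
    // The camera looks back toward the target, offset by the view angle φ.
    double heading = Math.atan2(target.y() - yv, target.x() - xv) + viewAngle;
    return new Config(xv, yv, heading);
}
```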

  8. Actual Formulation
  • Technical issue: a 4-dimensional space such as CT' is too large to search for an interactive app
  • Solution: simplify further by decoupling φ, because:
  • v can be modeled as an enclosing circle, so no orientation of the circle violates the configuration constraints
  • Assuming v can rotate as fast as the moving target, the view angle can be adjusted passively to maintain visibility of t
  • Account for φ after the other parameters are set
  • So, first search the 3D configuration-time space CT'' = (t, ø, l), where t = time
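
A sketch of the discretized search space. The nine-successor scheme below (time always advances one slice, while ø and l each stay put or move one cell) is an assumption consistent with a slice-by-slice grid search, not a detail given on the slides.

```java
import java.util.ArrayList;
import java.util.List;

/** A grid node in the discretized search space CT'' = (t, ø, l). */
record Node(int t, int dir, int dist) {}

/** Successors: advance time by one slice; the tracking direction and view
 *  distance may each stay or move one cell (9 successors per node). */
static List<Node> successors(Node n) {
    List<Node> next = new ArrayList<>();
    for (int dd = -1; dd <= 1; dd++)
        for (int dl = -1; dl <= 1; dl++)
            next.add(new Node(n.t() + 1, n.dir() + dd, n.dist() + dl));
    return next;
}
```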

  9. Search Criteria: planning efficiency
  • Since this is an interactive app, efficiency is the most important criterion
  • The planner returns the first feasible trajectory satisfying:
  • Configuration constraints
  • Visibility constraints
  • Corresponds to the time dimension in CT' and CT''

  10. Search Criteria: tracking direction
  • Since the camera simulates following the tour guide, we want the camera to stay behind the guide
  • So force the tracking direction ø to stay within a range of orientations centered at the orientation directly behind the target

  11. Search Criteria: view distance
  • To maintain a clear image of the tour guide, near and far clipping distances are applied to further constrain the viewpoint motion
  • So maintain the preferred view distance l as closely as possible

  12. Search Criteria: overall movement
  • Want to minimize the overall movement of the viewpoint because:
  • Frequent movement causes scene discrepancy and motion sickness
  • Frequent movement provides less opportunity for 3D rendering speedup
  • Thus reduce the movement, denoted d, in each step of the tracking trajectory
  • d is a function of the current and previous ø and l

  13. Search Criteria: view angle
  • Technical issue: the moving target may move out of sight if the view angle φ falls outside its legal range
  • Solution: keep the target close to the center of the view whenever possible without introducing frequent scene changes
  • This is a tradeoff
  • The balance is user-defined

  14. Approach: Best-first Planning
  • The search starts from q_i = (t_s, ø_i, l_i) and tries to find a path to a legal goal q_g = (t_e, *, *) in the last time slice
  • The first feasible path is returned if one exists
  • A configuration is considered legal iff:
  • Its parameters are within legal bounds
  • The viewpoint does not collide with an obstacle
  • The view cone is not obstructed
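
A minimal best-first planning sketch over the Node grid from the earlier sketch. The legality test and cost function are passed in as abstractions, and details such as queue discipline and tie-breaking are assumptions rather than the paper's exact algorithm.

```java
import java.util.*;
import java.util.function.Predicate;
import java.util.function.ToDoubleFunction;

/** Expand the cheapest open node until any node in the last time slice is
 *  reached; return the first feasible path found, or an empty list. */
static List<Node> bestFirstPlan(Node start, int endSlice,
                                ToDoubleFunction<Node> cost, Predicate<Node> legal) {
    PriorityQueue<Node> open = new PriorityQueue<>(Comparator.comparingDouble(cost));
    Map<Node, Node> parent = new HashMap<>();
    Set<Node> visited = new HashSet<>(List.of(start));
    open.add(start);
    while (!open.isEmpty()) {
        Node cur = open.poll();
        if (cur.t() == endSlice) {                 // any legal goal (t_e, *, *)
            LinkedList<Node> path = new LinkedList<>();
            for (Node n = cur; n != null; n = parent.get(n)) path.addFirst(n);
            return path;
        }
        for (Node nb : successors(cur)) {
            if (!visited.add(nb) || !legal.test(nb)) continue;
            parent.put(nb, cur);
            open.add(nb);
        }
    }
    return List.of();                              // no feasible path exists
}
```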

  15. Approach: Cost Function (1)
  • f1: cost function for the distance between the current and the ending time slices; t_e is the ending time
  • f2: normalized cost function for the tracking direction; ø_0 is a preferred neutral tracking direction
  • f3: normalized cost function for the view distance; l_0 is a preferred neutral view distance
  • f4: normalized cost function for the Euclidean distance moved from the parent configuration
  • p: returns the previous position of the viewpoint for the given approaching direction
  • dist: returns the distance between two positions
  • dir: an integer indicating the direction from which the current configuration was created

  16. Approach: Cost Function (2)
  • The overall cost function is a linear combination of the individual cost functions
  • The weights are user-specified
  • For the tour guide app, a large w1 makes f1 dominant, because planning time is most important
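
A hedged sketch of the weighted combination. Only the weighted-sum structure and the roles of f1 through f4 come from the slides; the shapes of the individual terms below are plausible normalizations chosen for illustration, not the paper's exact formulas.

```java
/** Linear combination of the four cost terms from the previous slide.
 *  Term shapes are illustrative assumptions. */
static double cost(Node n, Node parentNode, int endSlice,
                   int neutralDir, int neutralDist,
                   double w1, double w2, double w3, double w4) {
    double f1 = (double) (endSlice - n.t()) / endSlice;   // distance to the ending time slice
    double f2 = Math.abs(n.dir() - neutralDir);           // deviation from neutral tracking direction ø_0
    double f3 = Math.abs(n.dist() - neutralDist);         // deviation from neutral view distance l_0
    double f4 = parentNode == null ? 0.0                  // movement from the parent configuration
              : Math.hypot(n.dir() - parentNode.dir(), n.dist() - parentNode.dist());
    return w1 * f1 + w2 * f2 + w3 * f3 + w4 * f4;
}
```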

  17. Approach: Post-processing
  • BFP returns a path consisting of a sequence of configurations indexed by time
  • Post-processing:
  • The path is smoothed by replacing portions with straight-line segments of the same length in CT-space, such that the accumulated costs of the new segments are smaller
  • The view angle φ is unlocked and allowed to change so as to minimize the viewpoint's rotational movement
  • i.e., do not rotate the viewpoint unless the target is about to exit the view cone
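
A simplified sketch of the smoothing step, under the assumption that consecutive path nodes are one time slice apart. The slide also requires the new segment's accumulated cost to be smaller; that check is omitted here for brevity.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

/** Shortcut-style smoothing: try to replace a portion of the path with a
 *  straight-line segment spanning the same time slices, keeping it only if
 *  every interpolated node is legal. */
static List<Node> smooth(List<Node> path, Predicate<Node> legal) {
    List<Node> out = new ArrayList<>(path);
    for (int i = 0; i + 2 < out.size(); i++) {
        for (int j = out.size() - 1; j > i + 1; j--) {
            List<Node> seg = interpolate(out.get(i), out.get(j), j - i);
            if (seg.stream().allMatch(legal)) {
                for (int k = i + 1; k < j; k++) out.set(k, seg.get(k - i - 1));
                break;
            }
        }
    }
    return out;
}

/** Linear interpolation in (ø, l) across a fixed number of time slices. */
static List<Node> interpolate(Node a, Node b, int steps) {
    List<Node> seg = new ArrayList<>();
    for (int k = 1; k < steps; k++) {
        double s = (double) k / steps;
        seg.add(new Node(a.t() + k,
                         (int) Math.round(a.dir() + s * (b.dir() - a.dir())),
                         (int) Math.round(a.dist() + s * (b.dist() - a.dist()))));
    }
    return seg;
}
```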

  18. Implementation
  • Written in Java
  • The planner has 2 parts:
  • Path planning computes sequences of holonomic motions for the target
  • Motion tracking computes the tracking motion for the viewpoint
  • Running time is linear in the number of time steps
  • Actual run time depends on the volume of the regions visited during the search process
  • Worst case (no feasible path): all nodes in CT are visited, taking a few seconds
  • Average case: run time ranges from a fraction of a second to a few seconds

  19. Experiments

  20. Improvements
  • In the current planner, configuration collisions and visibility occlusions are computed on the fly
  • But in a complex environment, it would be better to preprocess these, since the planning is done off-line anyway
  • So the forbidden regions in CT-space could be computed systematically as a preprocessing step, with a hash table used to check for collisions
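
A sketch of this suggested preprocessing. The grid bounds and the blocked predicate (standing in for the collision and occlusion tests) are illustrative placeholders.

```java
import java.util.HashSet;
import java.util.Set;
import java.util.function.Predicate;

/** Sweep the discretized CT-space once, recording every forbidden node in a
 *  hash set, so the planner's legality test becomes a constant-time lookup. */
static Set<Node> precomputeForbidden(int slices, int dirs, int dists,
                                     Predicate<Node> blocked) {
    Set<Node> forbidden = new HashSet<>();
    for (int t = 0; t < slices; t++)
        for (int d = 0; d < dirs; d++)
            for (int l = 0; l < dists; l++) {
                Node n = new Node(t, d, l);
                if (blocked.test(n)) forbidden.add(n); // collision or occlusion, tested once
            }
    return forbidden;
}

// At planning time: Predicate<Node> legal = n -> !forbidden.contains(n);
```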

  21. Q&A
