The input to our system is a single image. The user first points coarsely at image features (edges), which are then extracted automatically and reproducibly in real time. The user next assigns high-level labels to the curves (e.g. limb edge, cross-section) and specifies relations between edges (e.g. symmetry, surface, or part). NURBS curves serve as the working representation of image edges. The objects described by these user-specified, qualitative relationships are then reconstructed either as a set of connected parts modeled as Generalized Cylinders, or as a set of 3-D surfaces for bilaterally symmetric 3-D objects. In both cases, the texture is also extracted from the image. Our system runs in real time on a PC.
Keywords: Interactive segmentation; NURBS fitting; Bilateral symmetry; Volumetric inference; 3-D modeling.