INuiFusionReconstruction::AlignPointClouds Method

Aligns two sets of overlapping oriented point clouds and calculates the camera's relative pose.

Syntax

public:
HRESULT AlignPointClouds(
         const NUI_FUSION_IMAGE_FRAME *pReferencePointCloudFrame,
         const NUI_FUSION_IMAGE_FRAME *pObservedPointCloudFrame,
         USHORT maxAlignIterationCount,
         const NUI_FUSION_IMAGE_FRAME *pDeltaFromReferenceFrame,
         FLOAT *pAlignmentEnergy,
         Matrix4 *pReferenceToObservedTransform
)

Parameters

  • pReferencePointCloudFrame
    Type: NUI_FUSION_IMAGE_FRAME
    A reference point cloud frame. This image must be the same size and have the same camera parameters as the pObservedPointCloudFrame parameter.

  • pObservedPointCloudFrame
    Type: NUI_FUSION_IMAGE_FRAME
    An observed point cloud frame. This image must be the same size and have the same camera parameters as the pReferencePointCloudFrame parameter.

  • maxAlignIterationCount
    Type: USHORT
    The maximum number of alignment iterations to run.

  • pDeltaFromReferenceFrame
    Type: NUI_FUSION_IMAGE_FRAME
    Optional pre-allocated color image frame that receives color-coded data from the camera tracking. This image can be used as input to additional vision algorithms, such as object segmentation. If specified, this image must be the same size and have the same camera parameters as the pReferencePointCloudFrame and pObservedPointCloudFrame parameters. If you do not need this output image, specify NULL. (A sketch after the Return value section shows one way to inspect this image.)

    The values in the received image vary depending on whether the pixel was a valid pixel used in tracking (inlier) or failed in different tests (outlier). Inliers are color shaded depending on the residual energy at that point. Higher discrepancy between vertices is indicated by more saturated colors. Less discrepancy between vertices (less information at that pixel) is indicated by less saturated colors (that is, more white). If the pixel is an outlier, it will receive one of the values listed in the following table.

    Value       Description
    ----------  -----------
    0xFF000000  The input vertex was invalid (for example, a vertex with an input depth of zero), or the vertex had no correspondences between the two point cloud images.
    0xFF008000  The outlier vertices were rejected due to too large a distance between vertices.
    0xFF800000  The outlier vertices were rejected due to too large a difference in normal angle between the point clouds.
  • pAlignmentEnergy
    Type: FLOAT
    Pointer to a float that receives a value in the range [0.0f, 1.0f] describing how well the observed frame aligns to the model with the calculated pose (the mean distance between matching points in the point clouds). A lower value indicates better alignment.

  • pReferenceToObservedTransform
    Type: Matrix4
    A matrix that supplies the initial guess at the transform on input and receives the calculated transform when tracking succeeds. Tracking failure is indicated by the identity matrix.
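
A minimal tracking-step sketch may help tie the parameters together. It assumes the caller has already created and populated the reference and observed point cloud frames (and, optionally, a delta frame) and holds the previous camera pose; the helper name, the iteration count, and the include line are illustrative and not part of this method's contract.

#include <NuiKinectFusionApi.h>   // assumed umbrella header for the Kinect Fusion API;
                                  // the interface itself is documented under nuikinectfusionvolume.h below

// Illustrative helper (not part of the SDK): runs one point cloud alignment step
// using frames the caller has already created and filled.
HRESULT TrackCameraWithPointClouds(
    INuiFusionReconstruction *pVolume,
    const NUI_FUSION_IMAGE_FRAME *pReferencePointCloud,  // for example, ray cast from the volume
    const NUI_FUSION_IMAGE_FRAME *pObservedPointCloud,   // for example, from the current depth frame
    const NUI_FUSION_IMAGE_FRAME *pDeltaFromReference,   // optional color delta image, or NULL
    Matrix4 *pWorldToCameraTransform)                    // in: previous pose; out: updated pose
{
    const USHORT maxIterations = 20;   // illustrative value; more iterations are slower but converge better
    FLOAT alignmentEnergy = 0.0f;

    // Seed the transform with the best available guess, typically the previous frame's pose.
    Matrix4 referenceToObserved = *pWorldToCameraTransform;

    HRESULT hr = pVolume->AlignPointClouds(
        pReferencePointCloud,
        pObservedPointCloud,
        maxIterations,
        pDeltaFromReference,
        &alignmentEnergy,          // receives the mean residual distance in [0.0f, 1.0f]
        &referenceToObserved);     // updated with the calculated transform when tracking succeeds

    if (SUCCEEDED(hr))
    {
        // Adopt the calculated pose; a smaller alignmentEnergy indicates a closer fit.
        *pWorldToCameraTransform = referenceToObserved;
    }
    // On tracking failure, referenceToObserved is set to the identity matrix and the previous pose is kept.

    return hr;
}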

Return value

Type: HRESULT
S_OK if successful; otherwise, returns a failure code.
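
The outlier codes listed in the pDeltaFromReferenceFrame table above can be tallied directly from the delta image after a call to this method. The sketch below is illustrative only: it assumes 32-bit pixels that compare directly against the table values, and that the frame exposes its buffer through pFrameBuffer->pBits and pFrameBuffer->Pitch; verify the member names against the SDK headers.

#include <NuiKinectFusionApi.h>   // assumed umbrella header for the Kinect Fusion API

// Illustrative sketch: counts the documented outlier codes in a delta image
// previously filled in by AlignPointClouds.
void CountTrackingOutliers(const NUI_FUSION_IMAGE_FRAME *pDeltaFromReference,
                           UINT *pInvalidOrNoCorrespondence,
                           UINT *pDistanceRejected,
                           UINT *pNormalAngleRejected)
{
    *pInvalidOrNoCorrespondence = 0;
    *pDistanceRejected = 0;
    *pNormalAngleRejected = 0;

    if (pDeltaFromReference == NULL || pDeltaFromReference->pFrameBuffer == NULL)
    {
        return;
    }

    const BYTE *pRow = pDeltaFromReference->pFrameBuffer->pBits;
    const UINT pitch = pDeltaFromReference->pFrameBuffer->Pitch;

    for (UINT y = 0; y < pDeltaFromReference->height; ++y, pRow += pitch)
    {
        const UINT *pPixel = reinterpret_cast<const UINT *>(pRow);
        for (UINT x = 0; x < pDeltaFromReference->width; ++x)
        {
            switch (pPixel[x])
            {
            case 0xFF000000: ++(*pInvalidOrNoCorrespondence); break; // invalid vertex or no correspondence
            case 0xFF008000: ++(*pDistanceRejected);          break; // rejected: distance between vertices too large
            case 0xFF800000: ++(*pNormalAngleRejected);       break; // rejected: normal angle difference too large
            default:                                          break; // inlier, shaded by residual energy
            }
        }
    }
}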

Requirements

Header: nuikinectfusionvolume.h

Library: TBD