With the widespread adoption of consumer electronics, AR technology has come into broad use in games, e-commerce, home decoration, and other fields. In interior design, for example, AR technology lets us try out virtual soft furnishings in a real scene: the motion tracking capability of Huawei AR Engine outputs 3D coordinate information about the indoor environment in real time, determining the transformation relationship between the real indoor environment and the virtual furnishings so that they can be placed stably and accurately in the indoor space.

As a basic capability of Huawei AR Engine, motion tracking continuously and stably tracks changes in the position and attitude of the terminal device relative to its surroundings, while outputting three-dimensional coordinate information about environmental features. It thus provides the framework on which practical AR applications are built, acting as a bridge between the real world and the virtual world.

Feature introduction

By tracking changes in the position and attitude of the terminal device relative to its surroundings, the motion tracking capability determines the transformation between the device's virtual coordinate system and the world coordinate system of the environment, and unifies the former into the latter. Virtual objects are then rendered from the observer's perspective and superimposed on the camera image, achieving geometric fusion of the virtual and the real.
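To make the coordinate-system unification above concrete, the sketch below (plain Java, not the AR Engine API; all names are illustrative) applies a hypothetical camera-to-world pose, stored as a column-major 4x4 matrix in the style of `android.opengl.Matrix`, to a point expressed in camera coordinates:

```java
import java.util.Arrays;

public class PoseTransformDemo {
    // Apply a column-major 4x4 transform to a 3D point in homogeneous coordinates.
    static float[] transformPoint(float[] m, float[] p) {
        float[] out = new float[3];
        for (int row = 0; row < 3; row++) {
            // Columns 0-2 hold the rotation; column 3 holds the translation.
            out[row] = m[row] * p[0] + m[4 + row] * p[1] + m[8 + row] * p[2] + m[12 + row];
        }
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical camera pose: the camera sits 2 m along +Z in world space,
        // with no rotation (identity upper-left 3x3, translation in column 3).
        float[] cameraToWorld = {
            1, 0, 0, 0,
            0, 1, 0, 0,
            0, 0, 1, 0,
            0, 0, 2, 1   // translation (0, 0, 2)
        };
        // A virtual object placed 1 m in front of the camera (camera-space -Z).
        float[] objInCamera = {0, 0, -1};
        float[] objInWorld = transformPoint(cameraToWorld, objInCamera);
        System.out.println(Arrays.toString(objInWorld)); // world position (0, 0, 1)
    }
}
```

As the device moves, motion tracking updates this pose every frame, so the same world-space position always projects to the right spot in the camera image.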

For example, in the AR auto show scene below, motion tracking is needed to follow the camera's pose and trajectory relative to the surroundings in real time, so that the virtual vehicle can be placed precisely in the scene.

The basic precondition for virtual-real fusion is tracking the motion of the terminal device in real time and updating the state of virtual objects accordingly, thereby establishing a stable connection between the real and virtual worlds. The accuracy and quality of motion tracking therefore directly affect the overall effect of an AR application: any delay or error causes virtual objects to jitter or drift, severely undermining the realism and immersion of the AR experience.

Feature advantages

The recent Huawei AR Engine 3.0 adopts SLAM 3.0 technology, further improving its technical indicators:

  1. A 6DoF motion tracking mode (world tracking mode) lets users observe virtual objects from different distances, directions, and angles, creating a more realistic AR environment.
  2. Monocular ATE (absolute trajectory error) as low as 1.6 cm keeps virtual objects stable for a better experience.
  3. Plane detection takes less than one second, with faster plane recognition and expansion.
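For readers unfamiliar with the ATE metric quoted above, the sketch below (plain Java, illustrative only) computes a simplified absolute trajectory error: the RMSE of positional error between an estimated camera trajectory and ground truth. It assumes the two trajectories are already expressed in the same coordinate frame; real evaluations first align them, e.g. with a least-squares fit.

```java
public class AteDemo {
    // Simplified ATE: root-mean-square positional error over corresponding
    // poses of an estimated trajectory and an aligned ground-truth trajectory.
    static double absoluteTrajectoryError(double[][] estimated, double[][] groundTruth) {
        double sumSq = 0;
        for (int i = 0; i < estimated.length; i++) {
            double dx = estimated[i][0] - groundTruth[i][0];
            double dy = estimated[i][1] - groundTruth[i][1];
            double dz = estimated[i][2] - groundTruth[i][2];
            sumSq += dx * dx + dy * dy + dz * dz;
        }
        return Math.sqrt(sumSq / estimated.length);
    }

    public static void main(String[] args) {
        double[][] truth = {{0, 0, 0}, {1, 0, 0}, {2, 0, 0}};
        // Estimated trajectory with ~1.6 cm drift on two of the three poses.
        double[][] est = {{0, 0, 0}, {1, 0.016, 0}, {2, 0, 0.016}};
        System.out.printf("ATE: %.4f m%n", absoluteTrajectoryError(est, truth));
    }
}
```

A lower ATE means the tracked camera path deviates less from the true path, which translates directly into steadier virtual objects.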

Integration steps

1. Log in to the official website of Huawei Developer Alliance and create an application.

2. Integrate AR Engine SDK.

  1. Open the project-level "build.gradle" file in Android Studio and add the Maven repository (a Gradle plugin version below 7.0 is used as an example here):
    Configure the Maven repository address of the HMS Core SDK under "buildscript > repositories".
    Configure the Maven repository address of the HMS Core SDK under "allprojects > repositories".
 buildscript {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address of the HMS Core SDK.
        maven { url "https://developer.huawei.com/repo/" }
    }
}
allprojects {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address of the HMS Core SDK.
        maven { url "https://developer.huawei.com/repo/" }
    }
}
  2. Open the app-level "build.gradle" file in your project and add the AR Engine SDK dependency.
 dependencies {
    implementation 'com.huawei.hms:arenginesdk:3.1.0.1'
}

3. Code development

  1. Check whether AR Engine is installed on the current device. If it is, the app runs normally; if not, jump to AppGallery and ask the user to install AR Engine.
 private boolean arEngineAbilityCheck() {
    boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
    if (!isInstallArEngineApk && isRemindInstall) {
        // The user was already asked to install AR Engine and declined.
        Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
        finish();
    }
    LogUtil.debug(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
    if (!isInstallArEngineApk) {
        // Jump to AppGallery to install the AR Engine service APK.
        startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
        isRemindInstall = true;
    }
    return AREnginesApk.isAREngineApkReady(this);
}
  2. Check runtime permissions before running.
 Configure the camera permission in "AndroidManifest.xml":
<uses-permission android:name="android.permission.CAMERA" />

private static final int REQUEST_CODE_ASK_PERMISSIONS = 1;
private static final int MAX_ARRAYS = 10;
private static final String[] PERMISSIONS_ARRAYS = new String[]{Manifest.permission.CAMERA};
List<String> permissionsList = new ArrayList<>(MAX_ARRAYS);
boolean isHasPermission = true;

for (String permission : PERMISSIONS_ARRAYS) {
    if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
        isHasPermission = false;
        break;
    }
}
if (!isHasPermission) {
    for (String permission : PERMISSIONS_ARRAYS) {
        if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
            permissionsList.add(permission);
        }
    }
    ActivityCompat.requestPermissions(activity,
        permissionsList.toArray(new String[permissionsList.size()]), REQUEST_CODE_ASK_PERMISSIONS);
}
  3. Call the ARWorldTrackingConfig interface to create a motion-tracking ARSession.
 private ARSession mArSession;
private ARWorldTrackingConfig mConfig;

mArSession = new ARSession(this);
mConfig = new ARWorldTrackingConfig(mArSession);
mConfig.setCameraLensFacing(ARConfigBase.CameraLensFacing.FRONT);   // Configure scene parameters via the config's setXXX methods.
mConfig.setPowerMode(ARConfigBase.PowerMode.ULTRA_POWER_SAVING);
mArSession.configure(mConfig);
mArSession.resume();


mSession.setCameraTextureName(mTextureDisplay.getExternalTextureId());
ARFrame arFrame = mSession.update();  // Obtain one frame of data from the ARSession.

// Set the environment texture probe and mode after the camera is initialized.
setEnvTextureData();
ARCamera arCamera = arFrame.getCamera();  // Obtain the ARCamera from the ARFrame. The ARCamera object provides the camera projection matrix used to render the window.

// The size of the projection matrix is 4 * 4.
float[] projectionMatrix = new float[16];
arCamera.getProjectionMatrix(projectionMatrix, PROJ_MATRIX_OFFSET, PROJ_MATRIX_NEAR, PROJ_MATRIX_FAR);
mTextureDisplay.onDrawFrame(arFrame);
StringBuilder sb = new StringBuilder();
updateMessageData(arFrame, sb);
mTextDisplay.onDrawFrame(sb);

// The size of ViewMatrix is 4 * 4.
float[] viewMatrix = new float[16];
arCamera.getViewMatrix(viewMatrix, 0);
for (ARPlane plane : mSession.getAllTrackables(ARPlane.class)) {    // Obtain all trackable planes from the ARSession.

    if (plane.getType() != ARPlane.PlaneType.UNKNOWN_FACING
        && plane.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
        hideLoadingMessage();
        break;
    }
}
drawTarget(mSession.getAllTrackables(ARTarget.class), arCamera, viewMatrix, projectionMatrix);
mLabelDisplay.onDrawFrame(mSession.getAllTrackables(ARPlane.class), arCamera.getDisplayOrientedPose(),
    projectionMatrix);
handleGestureEvent(arFrame, arCamera, projectionMatrix, viewMatrix);
ARLightEstimate lightEstimate = arFrame.getLightEstimate();
ARPointCloud arPointCloud = arFrame.acquirePointCloud();
getEnvironmentTexture(lightEstimate);
drawAllObjects(projectionMatrix, viewMatrix,  getPixelIntensity(lightEstimate));
mPointCloud.onDrawFrame(arPointCloud, viewMatrix, projectionMatrix);

ARHitResult hitResult = hitTest4Result(arFrame, arCamera, event.getEventSecond());
if (hitResult != null) {
    mSelectedObj.setAnchor(hitResult.createAnchor());  // Create an anchor at the hit position so that AR Engine keeps tracking it.

}
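Why anchor to the hit position rather than to a fixed camera-space offset? The toy sketch below (plain Java, not the AR Engine API; `worldToCamera` is a made-up helper with a translation-only view transform) shows that an anchor's world-space position stays fixed while its camera-relative position changes as the user moves, which is what keeps a placed object visually stable across frames:

```java
import java.util.Arrays;

public class AnchorDemo {
    // Toy view transform: camera-space position of a world-space point,
    // assuming a translation-only camera pose (no rotation) for simplicity.
    static double[] worldToCamera(double[] cameraPosWorld, double[] anchorPosWorld) {
        return new double[]{
            anchorPosWorld[0] - cameraPosWorld[0],
            anchorPosWorld[1] - cameraPosWorld[1],
            anchorPosWorld[2] - cameraPosWorld[2]
        };
    }

    public static void main(String[] args) {
        double[] anchor = {0, 0, -1};  // fixed world position obtained from a hit test
        double[] cam1 = {0, 0, 0};
        double[] cam2 = {0.5, 0, 0};   // the user steps half a metre to the right
        System.out.println(Arrays.toString(worldToCamera(cam1, anchor)));
        System.out.println(Arrays.toString(worldToCamera(cam2, anchor)));
        // The anchor's world position never changes; only its position relative
        // to the camera does, so each frame re-renders it at the correct spot.
    }
}
```

In the real SDK, motion tracking supplies the per-frame camera pose, and the anchor's pose is additionally refined as AR Engine's understanding of the environment improves.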
  4. Draw the desired virtual object at the anchor position.
 mEnvTextureBtn.setOnCheckedChangeListener((compoundButton, b) -> {
    mEnvTextureBtn.setEnabled(false);
    handler.sendEmptyMessageDelayed(MSG_ENV_TEXTURE_BUTTON_CLICK_ENABLE,
        BUTTON_REPEAT_CLICK_INTERVAL_TIME);
    mEnvTextureModeOpen = !mEnvTextureModeOpen;
    if (mEnvTextureModeOpen) {
        mEnvTextureLayout.setVisibility(View.VISIBLE);
    } else {
        mEnvTextureLayout.setVisibility(View.GONE);
    }
    int lightingMode = refreshLightMode(mEnvTextureModeOpen, ARConfigBase.LIGHT_MODE_ENVIRONMENT_TEXTURE);
    refreshConfig(lightingMode);
});

Learn more details:

Visit the official website of Huawei AR Engine to learn more.

Get the Huawei AR Engine development guide.

Huawei AR Engine open-source repositories: GitHub, Gitee.

Visit the official website of the HUAWEI Developer Alliance for more related content.

Get the development guidance documents.

Follow us to be the first to learn the latest technical information about HMS Core.

