People are increasingly fond of using emojis and stickers when chatting and posting online: they are a personalized way of expressing oneself, and they convey one's current state of mind. Expression packs (sticker sets) have become indispensable. With the rise of the metaverse in recent years, 3D avatars are also widely used: users can drive an avatar's expressions with their own, and build a series of exclusive expression packs that make the avatar more vivid.
So, how do you give an avatar the same changing expressions as a human? The facial expression tracking capability of HMS Core AR Engine calculates, in real time, the parameter value of each facial expression. Users can control a virtual character's expressions with their own facial movements, producing vivid expressions that convey emotion in a more interesting form and greatly simplify scenarios such as expression production for virtual characters.
For example, in social apps, people who don't want to show their faces can convey their emotions through an avatar's expressions, adding fun while protecting privacy. In livestreaming and e-commerce apps, merchants can use the vivid expressions of virtual hosts to avoid homogeneous content, bring users more lively shopping scenarios and novel interactive experiences, and spark young people's interest in immersive virtual entertainment and digital consumption. In short-video and camera apps, users control an avatar's expressions with their own facial expressions to display and express themselves, shortening the distance between people. In education apps, the capability can parse a person's facial expressions in real time so that a virtual tutor explains and teaches more vividly, stimulating users' interest in learning.
Implementation
AR Engine provides the face expression tracking capability, which tracks and obtains face image information in real time, calculates the pose of the face, parses the person's facial expressions, and converts them into expression parameters that can directly drive an avatar's expression. AR Engine currently provides 64 expressions in total, covering the major facial organs: eyes, eyebrows, eyeballs, mouth, and tongue. There are 21 eye expressions, including eyeball movement, opening and closing the eyes, and micro-movements of the eyelids; 28 mouth expressions, including opening the mouth, pouting, pulling down the corners of the mouth, pursing the lips, and tongue movements; and 5 eyebrow expressions, including raising the eyebrows and moving a single eyebrow up or down. The remaining expression parameters are listed in the FaceAR design specification. Once a face tracking session is running (see the development steps below), these parameter values can be read on every frame, as in the following sketch.
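This is a minimal, hedged sketch: ARFace and its getFaceBlendShapes() accessor come from the AR Engine SDK, but the exact getter names on ARFaceBlendShapes (getBlendShapeData(), getBlendShapeCount()) should be verified against your SDK version; TAG is assumed to be a log tag defined elsewhere.

mArSession.update();  // refresh trackable data for the current frame
for (ARFace face : mArSession.getAllTrackables(ARFace.class)) {
    if (face.getTrackingState() != ARTrackable.TrackingState.TRACKING) {
        continue;
    }
    ARFaceBlendShapes blendShapes = face.getFaceBlendShapes();
    // One float in [0, 1] per expression (eyes, eyebrows, mouth, tongue...).
    FloatBuffer data = blendShapes.getBlendShapeData();
    for (int i = 0; i < blendShapes.getBlendShapeCount(); i++) {
        Log.d(TAG, "expression parameter " + i + " = " + data.get(i));
    }
}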
Demo
Development steps
Development environment requirements:
JDK 1.8.211 and above.
Install Android Studio 3.0 and above:
minSdkVersion 26 and above
targetSdkVersion 29 (recommended)
compileSdkVersion 29 (recommended)
Gradle 6.1.1 and above (recommended)
Download the AR Engine server APK from AppGallery on the Huawei device (search for "Huawei AR Engine") and install it on the device.
Device for testing the app: see the AR Engine feature software and hardware dependency table. If you use multiple HMS Core kits at the same time, take the highest version requirement among them.
Development preparation
- Before developing an app, you need to register as a developer on the HUAWEI Developer Alliance website and complete real-name authentication. For details, see Account Registration and Authentication.
- Huawei provides the AR Engine SDK through Maven repository integration. Before starting development, you need to integrate the AR Engine SDK into your development environment.
- The repository configuration in Android Studio differs for Gradle plugin versions below 7.0, version 7.0, and versions 7.1 and above. Select the configuration procedure that matches your current Gradle plugin version.
- Take 7.0 as an example:
Open the Android Studio project-level "build.gradle" file and add the Maven repository.
Configure the Maven repository address of the HMS Core SDK in "buildscript > repositories".
buildscript {
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}
Open the project-level "settings.gradle" file and configure the Maven repository address of the HMS Core SDK.
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}
- Add dependencies: add the following build dependency to the "dependencies" block of the app-level "build.gradle" file:
dependencies {
    implementation 'com.huawei.hms:arenginesdk:{version}'
}
Application development
- Verification before running: check whether AR Engine is installed on the current device. If it is, the app runs normally; if not, the app should remind the user to install AR Engine in an appropriate way, for example by jumping to the app market and requesting installation. Sample code:
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk) {
    // ConnectAppMarketActivity is the activity that jumps to the app market.
    startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
    isRemindInstall = true;
}
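Face tracking also opens the device camera, so before creating the session the app needs the CAMERA runtime permission. A minimal sketch using the standard Android permission APIs (the request code is an arbitrary value chosen here):

private static final int REQUEST_CODE_CAMERA = 1;  // arbitrary request code

private void requestCameraPermissionIfNeeded() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
                this, new String[]{Manifest.permission.CAMERA}, REQUEST_CODE_CAMERA);
    }
}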
- Create AR scenarios: AR Engine provides five scenario configurations: motion tracking (ARWorldTrackingConfig), face tracking (ARFaceTrackingConfig), hand recognition (ARHandTrackingConfig), body tracking (ARBodyTrackingConfig), and image recognition (ARImageTrackingConfig).
Call the ARFaceTrackingConfig interface to create face tracking.
// Create an ARSession.
mArSession = new ARSession(this);
// Initialize the ARSession with the Config class that matches your scenario.
ARFaceTrackingConfig config = new ARFaceTrackingConfig(mArSession);
After creating the face tracking ARSession, you can configure scene parameters through the config.setXXX methods:
// Set how the camera is opened: externally or internally. External opening is
// available only in ARFace; opening the camera internally is recommended.
mArConfig.setImageInputMode(ARConfigBase.ImageInputMode.EXTERNAL_INPUT_ALL);
- Configure the face tracking AR scene parameters and start the face tracking scene:
mArSession.configure(mArConfig);
mArSession.resume();
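Both calls can fail at runtime, for example when another app holds the camera. A hedged sketch of guarding them is below; the exception class name follows the SDK's com.huawei.hiar.exceptions package and should be checked against your version:

try {
    mArSession.configure(mArConfig);
    mArSession.resume();
} catch (ARCameraNotAvailableException e) {
    // The camera is occupied by another app: release the session and notify the user.
    mArSession.stop();
    mArSession = null;
}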
- Create the FaceGeometryDisplay class, which obtains the face geometry data and renders it on the screen:
public class FaceGeometryDisplay {
    // Initialize the OpenGL ES rendering related to the face geometry,
    // including creating the shader program.
    void init(Context context) {
        ...
    }
}
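One possible body for init() is sketched below using plain OpenGL ES 2.0 calls; VERTEX_SHADER_SOURCE, FRAGMENT_SHADER_SOURCE, and the mProgram field are illustrative placeholders, not part of the SDK:

void init(Context context) {
    // Compile the vertex and fragment shaders from placeholder source strings.
    int vertexShader = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
    GLES20.glShaderSource(vertexShader, VERTEX_SHADER_SOURCE);
    GLES20.glCompileShader(vertexShader);

    int fragmentShader = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
    GLES20.glShaderSource(fragmentShader, FRAGMENT_SHADER_SOURCE);
    GLES20.glCompileShader(fragmentShader);

    // Link them into the program used to draw the face mesh.
    mProgram = GLES20.glCreateProgram();
    GLES20.glAttachShader(mProgram, vertexShader);
    GLES20.glAttachShader(mProgram, fragmentShader);
    GLES20.glLinkProgram(mProgram);
}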
- In the FaceGeometryDisplay class, create the onDrawFrame method and call face.getFaceGeometry() to obtain the face mesh:
public void onDrawFrame(ARCamera camera, ARFace face) {
    ARFaceGeometry faceGeometry = face.getFaceGeometry();
    updateFaceGeometryData(faceGeometry);
    updateModelViewProjectionData(camera, face);
    drawFaceGeometry();
    faceGeometry.release();
}
- Create the updateFaceGeometryData() method in the FaceGeometryDisplay class to receive the face mesh data and use OpenGL to set the expression parameters:
private void updateFaceGeometryData(ARFaceGeometry faceGeometry) {
    FloatBuffer faceVertices = faceGeometry.getVertices();
    // Obtain the array of face mesh texture coordinates, used together with
    // the vertex data returned by getVertices() during rendering.
    FloatBuffer textureCoordinates = faceGeometry.getTextureCoordinates();
}
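onDrawFrame above also calls updateModelViewProjectionData(). A hedged sketch of it follows: getProjectionMatrix(), getViewMatrix(), and ARPose.toMatrix() mirror the SDK's camera and pose accessors (verify against your version), and mModelViewProjection is a hypothetical field holding the result:

private void updateModelViewProjectionData(ARCamera camera, ARFace face) {
    float[] projection = new float[16];
    camera.getProjectionMatrix(projection, 0, 0.1f, 100.0f);  // near and far clip planes

    float[] view = new float[16];
    camera.getViewMatrix(view, 0);

    float[] model = new float[16];
    face.getPose().toMatrix(model, 0);  // face pose in world space

    // Combine into the model-view-projection matrix used by the shaders.
    float[] modelView = new float[16];
    Matrix.multiplyMM(modelView, 0, view, 0, model, 0);
    Matrix.multiplyMM(mModelViewProjection, 0, projection, 0, modelView, 0);
}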
- Create the FaceRenderManager class, which manages rendering related to face data:
public class FaceRenderManager implements GLSurfaceView.Renderer {
    // The constructor initializes the context and activity.
    public FaceRenderManager(Context context, Activity activity) {
        mContext = context;
        mActivity = activity;
    }

    // Set the ARSession to obtain the latest data.
    public void setArSession(ARSession arSession) {
        if (arSession == null) {
            LogUtil.error(TAG, "Set session error, arSession is null!");
            return;
        }
        mArSession = arSession;
    }

    // Set the ARConfigBase to obtain the configuration mode.
    public void setArConfigBase(ARConfigBase arConfig) {
        if (arConfig == null) {
            LogUtil.error(TAG, "setArFaceTrackingConfig error, arConfig is null.");
            return;
        }
        mArConfigBase = arConfig;
    }

    // Set whether the camera is opened externally.
    public void setOpenCameraOutsideFlag(boolean isOpenCameraOutsideFlag) {
        isOpenCameraOutside = isOpenCameraOutsideFlag;
    }

    ...

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mFaceGeometryDisplay.init(mContext);
    }
}
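The renderer's other callbacks are elided above. A sketch of what the per-frame callback typically does, reusing the classes from the previous steps (error handling trimmed; verify frame.getCamera() and getAllTrackables() against your SDK version):

@Override
public void onDrawFrame(GL10 unused) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    if (mArSession == null) {
        return;
    }
    ARFrame frame = mArSession.update();  // fetch the latest frame
    ARCamera camera = frame.getCamera();  // camera pose for this frame
    for (ARFace face : mArSession.getAllTrackables(ARFace.class)) {
        mFaceGeometryDisplay.onDrawFrame(camera, face);
    }
}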
- Finally, call these methods in FaceActivity to achieve the final effect:
public class FaceActivity extends BaseActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mFaceRenderManager = new FaceRenderManager(this, this);
        mFaceRenderManager.setDisplayRotationManage(mDisplayRotationManager);
        mFaceRenderManager.setTextView(mTextView);
        glSurfaceView.setRenderer(mFaceRenderManager);
        glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
    }
}
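The session must also follow the activity lifecycle; a minimal sketch, mirroring the typical AR Engine samples (exception handling omitted):

@Override
protected void onResume() {
    super.onResume();
    if (mArSession != null) {
        mArSession.resume();
        glSurfaceView.onResume();
    }
}

@Override
protected void onPause() {
    super.onPause();
    if (mArSession != null) {
        glSurfaceView.onPause();
        mArSession.pause();
    }
}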
For the full implementation, please refer to the sample code.
Learn more details>>
Visit the official website of Huawei Developer Alliance
Get development guidance documents
Huawei Mobile Services open source repository addresses: GitHub, Gitee
Follow us to be the first to learn about the latest HMS Core technical information~