The popularity of the Internet and smart devices has driven the emergence and rapid growth of e-commerce, while new technologies keep pushing the field to upgrade. The pandemic made people more accustomed to shopping online, but many have grown tired of the traditional online shopping model. The e-commerce market urgently needs new models that re-engage existing users, make shopping more convenient, stimulate the desire to buy, and attract new users through trending topics. Against this background, we use the intelligent image processing capabilities provided by HMS Core to recognize a user's facial and body features and, combined with a 3D display mode, let users preview the wearing effect of a product directly on their phones for a more convenient shopping experience.
Scenarios
In shopping apps such as Taobao, JD.com, and Tmall, and in product-recommendation apps such as Xiaohongshu, Dewu, and What Is Worth Buying, AR product previews let consumers see how a product actually looks on them, reducing the rate of subsequent returns and exchanges.
A quick look at the effect
Open the app.
Tap the product image to view its 3D model, which can be rotated and zoomed.
Development preparation
Configure the Huawei Maven repository address
Open the build.gradle file in the Android Studio project.
Add the Maven repository address of the SDK under "buildscript > repositories" and "allprojects > repositories":
buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
Add build dependencies
Open the app-level build.gradle file.
Add the Scene Kit SDK (the Full-SDK variant) and the AR Engine SDK to dependencies:
dependencies {
    ...
    implementation 'com.huawei.scenekit:full-sdk:5.0.2.302'
    implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
}
For details about the preceding steps, see the app development guide on the Huawei developer website.
Add permissions in AndroidManifest.xml
Open the AndroidManifest.xml file in main and add the camera permission before <application>:
<!-- Camera permission -->
<uses-permission android:name="android.permission.CAMERA" />
Development steps
MainActivity configuration
Add two buttons to the layout file of MainActivity: one with its background set to the product preview image, and one with the text "Try it on!" to invite users to try the product on.
<Button
    android:layout_width="260dp"
    android:layout_height="160dp"
    android:background="@drawable/sunglasses"
    android:onClick="onBtnShowProduct" />
<Button
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Try it on!"
    android:textAllCaps="false"
    android:textSize="24sp"
    android:onClick="onBtnTryProductOn" />
Clicking the first button calls onBtnShowProduct to load the 3D model of the product; clicking the second calls onBtnTryProductOn to enter the AR try-on screen.
Product 3D model display
1. Create a SceneSampleView that inherits from SceneView
public class SceneSampleView extends SceneView {
    public SceneSampleView(Context context) {
        super(context);
    }

    public SceneSampleView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }
}
Override surfaceCreated to complete the creation and initialization of the SceneView. loadScene loads the model file to be rendered and displayed; glTF and GLB formats are currently supported. loadSkyBox, loadSpecularEnvTexture, and loadDiffuseEnvTexture load the skybox texture, the specular environment map, and the diffuse environment map respectively; DDS files in cubemap format are currently supported.
The materials to be loaded are stored under src/main/assets/SceneView.
@Override
public void surfaceCreated(SurfaceHolder holder) {
    super.surfaceCreated(holder);
    // Load the model to render.
    loadScene("SceneView/sunglasses.glb");
    // Call loadSkyBox to load the skybox texture.
    loadSkyBox("SceneView/skyboxTexture.dds");
    // Call loadSpecularEnvTexture to load the specular environment map.
    loadSpecularEnvTexture("SceneView/specularEnvTexture.dds");
    // Call loadDiffuseEnvTexture to load the diffuse environment map.
    loadDiffuseEnvTexture("SceneView/diffuseEnvTexture.dds");
}
2. Create a SceneViewActivity that inherits from Activity. In onCreate, call setContentView and pass in the layout file in which the SceneSampleView is declared with an XML tag:
public class SceneViewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sample);
    }
}
The SceneSampleView is declared in the layout file as follows:
<com.huawei.scene.demo.sceneview.SceneSampleView
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
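For reference, the complete layout file might look like the sketch below. The file name activity_sample.xml comes from the setContentView call above; using the custom view as the root element (with the required XML namespace declaration) is an assumption, and any container layout would work as well.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- activity_sample.xml: the SceneSampleView fills the whole screen. -->
<com.huawei.scene.demo.sceneview.SceneSampleView
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```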
3. Create an onBtnShowProduct method in MainActivity. When the corresponding button is clicked, it starts SceneViewActivity, which loads, renders, and displays the 3D model of the product:
public void onBtnShowProduct(View view) {
    startActivity(new Intent(this, SceneViewActivity.class));
}
Product AR try-on display
With the facial recognition, image rendering, and AR display capabilities provided by HMS Core, an AR try-on for the product is straightforward to implement.
1. Create a FaceViewActivity that inherits from Activity, together with its layout file.
Add a FaceView named face_view to the layout to display the try-on effect:
<com.huawei.hms.scene.sdk.FaceView
    android:id="@+id/face_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:sdk_type="AR_ENGINE" />
Also create a switch so users can compare the effect with and without the product:
<Switch
    android:id="@+id/switch_view"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_alignParentTop="true"
    android:layout_marginTop="15dp"
    android:layout_alignParentEnd="true"
    android:layout_marginEnd="15dp"
    android:text="Try it on"
    android:theme="@style/AppTheme"
    tools:ignore="RelativeOverlap" />
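The Switch uses layout_alignParentTop and layout_alignParentEnd, which require a RelativeLayout parent. A complete activity_face_view.xml could therefore be sketched as follows; the RelativeLayout root and namespace declarations are assumptions based on those attributes:

```xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Camera preview plus rendered try-on model. -->
    <com.huawei.hms.scene.sdk.FaceView
        android:id="@+id/face_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:sdk_type="AR_ENGINE" />

    <!-- Toggle in the top-end corner to switch the product on and off. -->
    <Switch
        android:id="@+id/switch_view"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentTop="true"
        android:layout_marginTop="15dp"
        android:layout_alignParentEnd="true"
        android:layout_marginEnd="15dp"
        android:text="Try it on"
        android:theme="@style/AppTheme"
        tools:ignore="RelativeOverlap" />
</RelativeLayout>
```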
2. Override the onCreate method in FaceViewActivity to obtain the FaceView:
public class FaceViewActivity extends Activity {
    private FaceView mFaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_face_view);
        mFaceView = findViewById(R.id.face_view);
    }
}
3. Set a change listener on the switch. When the switch is turned on, load the 3D model of the product with loadAsset; the LandmarkType parameter selects the facial landmark the model is anchored to:
mSwitch.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
        mFaceView.clearResource();
        if (isChecked) {
            // Load the material.
            int index = mFaceView.loadAsset("FaceView/sunglasses.glb", LandmarkType.TIP_OF_NOSE);
        }
    }
});
The size and position of the model can be adjusted with setInitialPose. Create position, rotation, and scale arrays and pass in the desired values:
final float[] position = { 0.0f, 0.0f, -0.15f };
final float[] rotation = { 0.0f, 0.0f, 0.0f, 0.0f };
final float[] scale = { 2.0f, 2.0f, 0.3f };
Then add the following after the loadAsset call:
mFaceView.setInitialPose(index, position, scale, rotation);
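The four-component rotation array suggests a quaternion. If you find Euler angles easier to reason about, a small conversion helper can produce the array; this is a generic math sketch, and the {x, y, z, w} component order expected by setInitialPose is an assumption you should verify against the Scene Kit documentation:

```java
public final class QuaternionUtil {
    private QuaternionUtil() {}

    // Convert Euler angles (in radians, applied in yaw-pitch-roll order)
    // to a quaternion laid out as {x, y, z, w}.
    public static float[] fromEuler(float roll, float pitch, float yaw) {
        double cr = Math.cos(roll * 0.5), sr = Math.sin(roll * 0.5);
        double cp = Math.cos(pitch * 0.5), sp = Math.sin(pitch * 0.5);
        double cy = Math.cos(yaw * 0.5), sy = Math.sin(yaw * 0.5);
        return new float[] {
            (float) (sr * cp * cy - cr * sp * sy), // x
            (float) (cr * sp * cy + sr * cp * sy), // y
            (float) (cr * cp * sy - sr * sp * cy), // z
            (float) (cr * cp * cy + sr * sp * sy)  // w
        };
    }
}
```

For example, QuaternionUtil.fromEuler(0f, 0f, (float) Math.PI / 2) yields a quaternion for a 90-degree rotation around the Z axis, and fromEuler(0f, 0f, 0f) yields the identity rotation {0, 0, 0, 1}.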
4. Create an onBtnTryProductOn method in MainActivity. When the button is clicked, it checks the camera permission and starts FaceViewActivity so the user can see the AR try-on effect:
public void onBtnTryProductOn(View view) {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
            this, new String[]{ Manifest.permission.CAMERA }, FACE_VIEW_REQUEST_CODE);
    } else {
        startActivity(new Intent(this, FaceViewActivity.class));
    }
}
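The code above passes FACE_VIEW_REQUEST_CODE to requestPermissions, so the activity should also define that constant and handle the result callback. A minimal sketch is shown below; the constant's value is an assumption, and any request code unique within the app works:

```java
private static final int FACE_VIEW_REQUEST_CODE = 1;

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == FACE_VIEW_REQUEST_CODE
            && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // The user granted the camera permission; open the AR try-on screen.
        startActivity(new Intent(this, FaceViewActivity.class));
    }
}
```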
Visit the Huawei Developer Alliance official website
Get the development guide documents
Huawei HMS Core official forum
AR Engine open-source repository: GitHub, Gitee
To solve integration problems, go to Stack Overflow
Follow us to learn about the latest HMS Core technical information