When developing any application, it is essential to iterate as quickly as possible; having to build to the device to test functionality is frustrating and can dramatically increase development time and cost.

At Unity, we strive to make your job as a developer easier and more efficient. Since the release of Apple’s ARKit in mid-2017, we have been working hard to streamline AR development for ARKit with our ARKit plugin and ARKit Remote. ARKit Remote allows developers to iterate on ARKit experiences right inside the Unity Editor, without building to the device each time. Today we are happy to announce that you can now access ARKit Remote functionality for Face Tracking on iPhone X by downloading or updating the ARKit plugin for Unity.

Build ARKit Remote

To use ARKit Remote for Face Tracking, you will first need to build the ARKit Remote scene as an app to your iPhone X. You will need an iPhone X since it is currently the only device with the front-facing TrueDepth camera, which is required for Face Tracking. Follow these steps to build the app to the device:

1. Get the latest Unity ARKit Plugin project from Bitbucket or the Asset Store and load it up in the Unity Editor.

2. Open the “Assets/UnityARKitPlugin/ARKitRemote/UnityARKitRemote” scene.

3. Select the “Assets/UnityARKitPlugin/Resources/UnityARKitPlugin/ARKitSettings” file and enable the “ARKit Uses Facetracking” checkbox.

4. Select PlayerSettings (in the menu: Edit/Project Settings/Player) and make sure you have some text in the “Camera Usage Description” entry.

5. Select BuildSettings (in menu File/Build Settings…) and check the Development Build checkbox.

6. Now build this scene to your iPhone X as you would normally build an app via Xcode.

Here’s a video of the steps needed for building the ARKit Remote.

[Video demo]

Connect Editor to ARKit Remote

The steps in the previous section only need to be done once to build ARKit Remote to your device. The following steps can be repeated over and over again to iterate on ARKit Face Tracking in the Editor:

1. Connect the iPhone X to your Mac development machine via USB.

2. Start up the ARKit Remote app on the device. You should see a “Waiting for connection…” screen.

3. In the Unity Editor, connect to your iPhone X by opening the Console window and selecting the iPhone X connected via USB from its connection menu.

4. Load up one of the Face Tracking examples in the project, e.g. “Assets/UnityARKitPlugin/Examples/FaceTracking/FaceAnchorScene”, and press Play in the Editor.

5. You should see a green screen with a button on top that says “Start ARKit Face Tracking Session.” Press that button and you should see your front camera video feed in the Editor “Game” window. If your face is in view, the device will also send ARKit Face Tracking data to the Editor.

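Under the hood, the button in step 5 boils down to running the ARKit session with a face tracking configuration. Here is a minimal sketch of that call, assuming the plugin’s `UnityARSessionNativeInterface` and `ARKitFaceTrackingConfiguration` types (member names may differ slightly between plugin versions; check the shipped FaceTracking examples for the exact code):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit plugin

// Hedged sketch: roughly what the "Start ARKit Face Tracking Session"
// button does in the FaceTracking example scenes.
public class StartFaceTracking : MonoBehaviour
{
    // Wire this up to the UI button's OnClick handler.
    public void OnStartButtonPressed()
    {
        var config = new ARKitFaceTrackingConfiguration();
        config.enableLightEstimation = true; // directional light estimate derived from the face

        // Run (or re-run) the ARKit session with the face tracking configuration.
        UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
    }
}
```

When running in the Editor with ARKit Remote connected, this same call is what triggers the remote session on the device instead of a local one.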
Here is a video that demonstrates the connection steps:

[Video demo]

Play with ARKit Face Tracking Data

Once you have connected your ARKit Face Tracking scene to ARKit Remote, all the Face Tracking data (face anchor, face mesh, blendshapes, directional lighting) is sent from device to Editor. You can then manipulate that data in the Editor to affect the scene immediately. Here are a couple of videos to demonstrate this:

[Video demo]

[Video demo]
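To consume that Face Tracking data in your own scripts, you can subscribe to the plugin’s face anchor events, whether the data comes from a local session or from ARKit Remote. A hedged sketch, with event and field names assumed from the plugin’s face tracking examples (the blendshape key follows ARKit’s coefficient naming):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit plugin

// Hedged sketch: listen for face anchor updates and read one blendshape
// coefficient. Event/type names are taken from the plugin but may vary
// between plugin versions.
public class BlendshapeLogger : MonoBehaviour
{
    void Start()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // blendShapes maps ARKit coefficient names to values in [0, 1].
        Dictionary<string, float> shapes = anchorData.blendShapes;
        float jawOpen;
        if (shapes.TryGetValue("jawOpen", out jawOpen))
            Debug.Log("jawOpen: " + jawOpen);
    }

    void OnDestroy()
    {
        // Always unsubscribe to avoid dangling handlers.
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
    }
}
```

Because the Editor receives the same events the device would raise, a script like this behaves identically whether you press Play with ARKit Remote connected or run the built app on the iPhone X.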

New, Streamlined ARKit Remote Workflow!

As part of adding Face Tracking functionality to ARKit Remote, we also made it much easier to work with ARKit Remote without altering your original ARKit scene in the Unity Editor. Previously, you had to add a GameObject that connects your scene to ARKit Remote. Now, we detect when you try to initialize an ARKit configuration from the Editor and automatically add the RemoteConnection GameObject to your scene at runtime.

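Conceptually, that automatic step behaves like the following Editor-only check. This is an illustrative sketch only: the real logic lives inside the plugin, and the prefab name used here is hypothetical.

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit plugin

// Illustrative sketch of the new workflow: in the Editor, ensure a
// remote-connection object exists so the scene talks to the ARKit Remote
// app instead of trying to start a local ARKit session.
public class RemoteConnectionBootstrapSketch : MonoBehaviour
{
#if UNITY_EDITOR
    void Awake()
    {
        if (FindObjectOfType<ARKitRemoteConnection>() == null)
        {
            // "ARKitRemoteConnection" is a hypothetical Resources path here;
            // the plugin performs the equivalent spawn internally.
            var prefab = Resources.Load<GameObject>("ARKitRemoteConnection");
            if (prefab != null)
                Instantiate(prefab);
        }
    }
#endif
}
```

The practical upshot is that your scene no longer needs any Remote-specific objects: the same scene runs unmodified in the Editor (via ARKit Remote) and on the device.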
Have fun playing around with ARKit Face Tracking in the Unity Editor!

Translated from: https://blogs.unity3d.com/2018/01/16/arkit-remote-now-with-face-tracking/
