Reputation: 1591
I'd appreciate it if someone could provide any sources that can help me understand how the Vuforia samples for Android work. I've installed everything and can run them on my phone, but it's difficult to understand the project structure.
It would be great if there were a tutorial on how to create the simplest AR app with Android Studio (not Unity).
I've also learnt how to create AR scenes with Unity, export them to Android, and run them on the device, but I still don't understand how to work further with the exported project in Android Studio.
My goal is to have one image target and several 3D objects. I want a simple UI like a ListView to choose which object to place on the target.
Also, is it possible to build a ListView with Android and, on its item's click event, switch one 3D object for another in a single scene created in Unity? I know I can dynamically load 3D models with a Unity C# script, so can I trigger some function in that script via Android?
I'd really appreciate any advice.
Upvotes: 4
Views: 1054
Reputation: 10701
Summary:
First you upload a picture to the Vuforia cloud, which returns an .xml and a .dat file. Those are stored in StreamingAssets. The .dat file contains all the info about your marker in binary format. The .xml file contains info about the name and size and is linked to the C# component.
Vuforia also allows you to create runtime markers or cloud markers, but we shall leave those out for now. The idea remains the same.
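For completeness, here is a minimal sketch of loading such a dataset at runtime with Vuforia's Unity API (class and method names as in the Vuforia SDK of that era; "MyDataset" is a placeholder for your .xml/.dat pair, so verify the exact calls against your SDK version):

using UnityEngine;
using Vuforia;

public class DataSetLoader : MonoBehaviour
{
    void Start()
    {
        // The ObjectTracker manages image-target datasets
        ObjectTracker tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
        DataSet dataSet = tracker.CreateDataSet();

        // "MyDataset" must match the .xml/.dat pair stored in StreamingAssets
        if (dataSet.Load("MyDataset"))
        {
            tracker.Stop();
            tracker.ActivateDataSet(dataSet);
            tracker.Start();
        }
    }
}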
When you run the app, the camera hardware CH (not the Unity camera; keep that distinction in mind) provides a feed. That feed is rendered on a texture in the Unity scene, and a Unity camera UC faces that texture. Both are fixed in space; only the content of the texture is updated each frame with what the CH provides. This is the reality of your app.
At the same time, Vuforia scans the CH feed and performs pattern recognition (https://en.wikipedia.org/wiki/Pattern_recognition), trying to find a match with the .dat file you provided. When a pattern is found, it performs a second pass to determine the distance and rotation of that pattern with respect to the CH. This is possible because the .xml file contains the dimensions of your real marker. If the .xml says 50x50 and your marker is 25x25, the model will appear half the expected size, since the system believes the marker is further away than it actually is.
When the marker is recognized, Vuforia calls the state listener on DefaultTrackableEventHandler (check the script on the parent of the model), which implements this method:
public void OnTrackableStateChanged(
    TrackableBehaviour.Status previousStatus,
    TrackableBehaviour.Status newStatus)
{
    // Any of the three "visible" statuses counts as a tracking hit
    if (newStatus == TrackableBehaviour.Status.DETECTED ||
        newStatus == TrackableBehaviour.Status.TRACKED ||
        newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
    {
        OnTrackingFound();
    }
    else
    {
        OnTrackingLost();
    }
}
Basically, if Vuforia detects a change, it calls that method. Then you can propagate the event further by making OnTrackingFound/Lost a public event that anything can register to, or by creating a new script that implements ITrackableEventHandler (a sketch follows below). This is only about listening for whether a model was found or lost. In the example, the model is shown when found and hidden when lost. This is the most likely and basic scenario, but anything can be triggered.
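As a hedged sketch, such a relay could look like this (assumes the Vuforia Unity SDK where TrackableBehaviour exposes RegisterTrackableEventHandler; the class name TrackingEventRelay is made up for illustration):

using System;
using UnityEngine;
using Vuforia;

public class TrackingEventRelay : MonoBehaviour, ITrackableEventHandler
{
    // Public events other scripts can subscribe to
    public event Action TrackingFound;
    public event Action TrackingLost;

    void Start()
    {
        // Put this script on the ImageTarget, next to DefaultTrackableEventHandler
        TrackableBehaviour trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            if (TrackingFound != null) TrackingFound();
        }
        else
        {
            if (TrackingLost != null) TrackingLost();
        }
    }
}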
The result of the calculations is a Transform (position, rotation). That Transform is passed on to a second Unity camera in the scene. The coordinates are defined with (0,0,0) at the position of the marker, so the camera most likely aims at the 3D model you placed there. Note that you can place the model anywhere in the scene; it will just be offset. The Vuforia camera CANNOT be controlled: if you try to pass values to its Transform, they get overwritten by Vuforia. You are just not meant to play with those values. You can, on the other hand, turn it on and off, affect some of the rendering, and so on.
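To illustrate the offset idea, a minimal sketch using plain Unity parenting (nothing Vuforia-specific; the field names are placeholders):

using UnityEngine;

public class PlaceOnMarker : MonoBehaviour
{
    public Transform imageTarget; // the Vuforia ImageTarget; (0,0,0) is the marker
    public Transform model;       // your 3D model

    void Start()
    {
        // Parented under the target, localPosition becomes an offset in marker space
        model.SetParent(imageTarget, false);
        model.localPosition = new Vector3(0f, 0.05f, 0f); // hover slightly above the marker
    }
}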
The first UC has a lower depth, so it renders the real scene first; the second is rendered on top and augments the reality with the 3D model. With a set of layer masks, the second camera ignores the rest of the scene, so only the model is considered.
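Conceptually, that two-camera setup amounts to something like this (illustrative only, since Vuforia wires this up for you; the layer name "AugmentedModel" is invented for the example):

using UnityEngine;

public class TwoCameraSetup : MonoBehaviour
{
    public Camera backgroundCamera; // UC looking at the feed texture
    public Camera arCamera;         // the camera driven by Vuforia

    void Start()
    {
        backgroundCamera.depth = 0; // lower depth renders first
        arCamera.depth = 1;         // renders on top of the background

        // The AR camera only sees the model's layer, ignoring the feed quad
        arCamera.cullingMask = LayerMask.GetMask("AugmentedModel");
        arCamera.clearFlags = CameraClearFlags.Depth; // keep the background visible
    }
}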
You do not really want to play around with the background feed, but you surely want to interact with the model. To do so, proceed just like in any normal scene: grab the Camera component of the Vuforia camera and raycast from it in the forward direction. Check your hit and perform your action.
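A minimal sketch of that raycast (standard Unity API; assumes the model has a Collider and that arCamera is dragged in from Vuforia's ARCamera):

using UnityEngine;

public class ModelPicker : MonoBehaviour
{
    public Camera arCamera; // the Camera component on Vuforia's ARCamera

    void Update()
    {
        // On tap/click, cast a ray from the camera through the touch point
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = arCamera.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                Debug.Log("Hit: " + hit.transform.name);
                // do your action here, e.g. highlight or swap the model
            }
        }
    }
}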
Upvotes: 3