Nitisha Sharma

Reputation: 303

Place image from Gallery on a Wall using ARKit

I have a list of images coming from a server and stored in the gallery. I want to pick any image and place it on a live wall using ARKit, and I want to convert the images into 3D objects so I can perform operations like zooming, moving the image, etc.

Can anybody please guide me on how to create a custom object in AR?

Upvotes: 8

Views: 4781

Answers (1)

PongBongoSaurus

Reputation: 7385

To detect vertical surfaces (e.g. walls) in ARKit you first need to set up an ARWorldTrackingConfiguration and then enable planeDetection within your app.

So under your Class Declaration you would create the following variables:

@IBOutlet var augmentedRealityView: ARSCNView!
let augmentedRealitySession = ARSession()
let configuration = ARWorldTrackingConfiguration()

Then initialise your ARSession in viewDidLoad, for example:

override func viewDidLoad() {
    super.viewDidLoad()

    //1. Set Up Our ARSession
    augmentedRealityView.session = augmentedRealitySession

    //2. Assign The ARSCNViewDelegate
    augmentedRealityView.delegate = self

    //3. Set Up Plane Detection
    configuration.planeDetection = .vertical

    //4. Run Our Configuration
    augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])

}

Now that you are all set to detect vertical planes, you need to hook into the following ARSCNViewDelegate method:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) { }

Which simply:

Tells the delegate that a SceneKit node corresponding to a new AR anchor has been added to the scene.

In this method we are going to explicitly look for any ARPlaneAnchors that have been detected, which provide us with:

Information about the position and orientation of a real-world flat surface detected in a world-tracking AR session.

As such, placing an SCNPlane onto a detected vertical plane is as simple as this:

//-------------------------
//MARK: - ARSCNViewDelegate
//-------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have Detected An ARPlaneAnchor
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        //2. Get The Size Of The ARPlaneAnchor
        let width = CGFloat(planeAnchor.extent.x)
        let height = CGFloat(planeAnchor.extent.z)

        //3. Create An SCNPlane Which Matches The Size Of The ARPlaneAnchor
        let imageHolder = SCNNode(geometry: SCNPlane(width: width, height: height))

        //4. Rotate It
        imageHolder.eulerAngles.x = -.pi/2

        //5. Set Its Colour To Red
        imageHolder.geometry?.firstMaterial?.diffuse.contents = UIColor.red

        //6. Add It To Our Node & Thus The Hierarchy
        node.addChildNode(imageHolder)

    }
}
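
As a side note, ARKit keeps refining an ARPlaneAnchor's extent after it is first detected, so (purely as an optional sketch based on the node set up above) you could also implement the didUpdate callback in the same extension to keep your SCNPlane in sync:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {

    //1. Check The Updated Anchor Is An ARPlaneAnchor & We Have A Child SCNPlane
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let planeNode = node.childNodes.first,
          let plane = planeNode.geometry as? SCNPlane else { return }

    //2. Resize The SCNPlane To Match The Refined Extent Of The Anchor
    plane.width = CGFloat(planeAnchor.extent.x)
    plane.height = CGFloat(planeAnchor.extent.z)

    //3. Re-Centre The Plane Node On The Anchor
    planeNode.simdPosition = simd_float3(planeAnchor.center.x, 0, planeAnchor.center.z)
}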

Applying This To Your Case:

In your case we need to do some additional work, as you want to allow the user to apply an image to the vertical plane.

As such your best bet is to make the node you have just added a variable e.g.

class ViewController: UIViewController {

    @IBOutlet var augmentedRealityView: ARSCNView!
    let augmentedRealitySession = ARSession()
    let configuration = ARWorldTrackingConfiguration()
    var nodeWeCanChange: SCNNode?

}

As such your Delegate Callback might look like so:

//-------------------------
//MARK: - ARSCNViewDelegate
//-------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. If We Haven't Created Our Interactive Node Then Proceed
        if nodeWeCanChange == nil{

            //a. Check We Have Detected An ARPlaneAnchor
            guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

            //b. Get The Size Of The ARPlaneAnchor
            let width = CGFloat(planeAnchor.extent.x)
            let height = CGFloat(planeAnchor.extent.z)

            //c. Create An SCNPlane Which Matches The Size Of The ARPlaneAnchor
            nodeWeCanChange = SCNNode(geometry: SCNPlane(width: width, height: height))

            //d. Rotate It
            nodeWeCanChange?.eulerAngles.x = -.pi/2

            //e. Set Its Colour To Red
            nodeWeCanChange?.geometry?.firstMaterial?.diffuse.contents = UIColor.red

            //f. Add It To Our Node & Thus The Hierarchy
            node.addChildNode(nodeWeCanChange!)
        }


    }
}

Now that you have a reference to nodeWeCanChange, setting its image at any time is simple!

Each SCNGeometry has a set of Materials, each of which is:

A set of shading attributes that define the appearance of a geometry's surface when rendered.

In our case we are looking for the material's diffuse property, which is:

An object that manages the material’s diffuse response to lighting.

And then its contents property, which is:

The visual contents of the material property—a color, image, or source of animated content.

Obviously you need to handle the full logistics of this yourself; however, a very basic example might look like so, assuming you have stored your images in an Array of UIImage, e.g:

let imageGallery = [UIImage(named: "StackOverflow"), UIImage(named: "GitHub")]
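
Since your images live in the device's photo gallery, you might instead let the user pick one at runtime; a minimal sketch using UIImagePickerController (the presentImagePicker helper is just illustrative, and you would need the NSPhotoLibraryUsageDescription key in your Info.plist) might look like so:

//----------------------------------------
//MARK: - UIImagePickerControllerDelegate
//----------------------------------------

extension ViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    /// Presents The Photo Library So The User Can Pick An Image
    func presentImagePicker() {

        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true)
    }

    /// Applies The Picked Image To Our Node's Geometry
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {

        //1. Get The Original Image & Set It As The Diffuse Contents Of Our Node
        if let selectedImage = info[.originalImage] as? UIImage {
            nodeWeCanChange?.geometry?.firstMaterial?.diffuse.contents = selectedImage
        }

        //2. Dismiss The Picker
        picker.dismiss(animated: true)
    }
}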

Going back to the array, I have created an IBAction which will change the image of our SCNNode's Geometry based on the tag of the UIButton pressed, e.g:

/// Changes The Material Of Our SCNNode's Geometry To The Image Selected By The User
///
/// - Parameter sender: UIButton
@IBAction func changeNodesImage(_ sender: UIButton){

    guard let imageToApply = imageGallery[sender.tag], let node = nodeWeCanChange else { return }
    node.geometry?.firstMaterial?.diffuse.contents = imageToApply

}
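
Finally, since you also mentioned zooming, one very rough way of doing that would be to attach a UIPinchGestureRecognizer to the ARSCNView and scale nodeWeCanChange; a minimal sketch (the scaleNode name is just illustrative) might look like so:

//1. In viewDidLoad Register A Pinch Gesture On The ARSCNView
augmentedRealityView.addGestureRecognizer(UIPinchGestureRecognizer(target: self, action: #selector(scaleNode(_:))))

/// Scales Our SCNNode When The User Pinches The Screen
///
/// - Parameter gesture: UIPinchGestureRecognizer
@objc func scaleNode(_ gesture: UIPinchGestureRecognizer) {

    //2. Make Sure We Have A Node & The Pinch Is In Progress
    guard let node = nodeWeCanChange, gesture.state == .changed else { return }

    //3. Apply The Pinch Scale To The Node's Current Scale
    let pinchScale = Float(gesture.scale)
    node.scale = SCNVector3(node.scale.x * pinchScale, node.scale.y * pinchScale, node.scale.z * pinchScale)

    //4. Reset The Gesture Scale So The Scaling Is Incremental
    gesture.scale = 1
}

A UIPanGestureRecognizer, combined with a hit-test against the ARSCNView, could be used in much the same way to move the node around the wall.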

This is more than enough to point you in the right direction... Hope it helps...

Upvotes: 14
