Objective-C and ARKit: Building Augmented Reality Apps for iOS

Augmented Reality (AR) has revolutionized the way we interact with digital content, seamlessly blending the virtual world with the real one. If you’re an iOS developer looking to venture into the exciting world of AR, this guide is your gateway. We’ll explore how to harness the power of Objective-C and ARKit to build immersive AR apps that engage and captivate your users.


1. Why Choose Objective-C for ARKit Development?

Before we dive into ARKit, let’s address the choice of programming language. While Swift has become the default language for iOS development, Objective-C still holds its ground, especially for developers with a legacy codebase or a strong background in the language.

Here are some reasons to consider Objective-C for your ARKit project:

1.1. Legacy Code Compatibility

If you have an existing Objective-C project, integrating ARKit becomes easier without rewriting your entire codebase in Swift. Objective-C and Swift can coexist in the same project, allowing for a smooth transition.

1.2. Experienced Objective-C Developers

If you or your team are more comfortable with Objective-C, it makes sense to stick with it. ARKit is accessible from both languages, so you can leverage your existing skills.

1.3. Proven Stability

Objective-C has been around for decades and is known for its stability and reliability. This is crucial when building complex AR applications where performance and robustness are paramount.

Now that we’ve established why Objective-C is a viable choice, let’s delve into the world of ARKit and discover how to create incredible AR experiences.

2. Getting Started with ARKit

2.1. Setting Up the Project

First, create a new iOS project in Xcode or open an existing one, and make sure it is configured for Objective-C. ARKit requires a device with an A9 or later processor, so check that your target device meets this requirement.
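Because not every device that runs your app will support ARKit, it's good practice to verify support at runtime before starting a session. A minimal sketch (also remember to add an NSCameraUsageDescription entry to your Info.plist, since ARKit uses the camera):

```objectivec
#import <ARKit/ARKit.h>

// Check at runtime that the device supports world tracking (A9 or later)
if ([ARWorldTrackingConfiguration isSupported]) {
    // Safe to create and run an AR session
} else {
    // Fall back to a non-AR experience or show an explanatory message
}
```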

2.2. Importing ARKit

To start using ARKit, import the ARKit framework into your Objective-C class:

```objectivec
#import <ARKit/ARKit.h>
```

2.3. Configuring ARSession

ARKit relies on an ARSession to manage the AR experience. Create an instance of ARSession and set it up in your view controller:

```objectivec
ARSession *arSession = [[ARSession alloc] init];
ARWorldTrackingConfiguration *configuration = [[ARWorldTrackingConfiguration alloc] init];
[arSession runWithConfiguration:configuration];
```

This code sets up a basic AR session with world tracking, which allows your app to track the device’s position and orientation in the real world.
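Sessions can fail or be interrupted (for example, when the app moves to the background), so it's worth assigning a session delegate to handle these cases. A minimal sketch, assuming your view controller adopts ARSessionDelegate:

```objectivec
// e.g. in viewDidLoad, after creating the session:
// arSession.delegate = self;

- (void)session:(ARSession *)session didFailWithError:(NSError *)error {
    // Surface the error to the user, or retry with a simpler configuration
    NSLog(@"AR session failed: %@", error.localizedDescription);
}

- (void)sessionWasInterrupted:(ARSession *)session {
    // Tracking is paused, e.g. while the app is in the background
}
```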

2.4. Adding ARView

To display the AR scene, add an ARSCNView to your view controller’s view hierarchy:

```objectivec
ARSCNView *arView = [[ARSCNView alloc] initWithFrame:self.view.frame];
arView.session = arSession;
[self.view addSubview:arView];
```

This ARSCNView will render the AR content and provide the user with a window into the augmented world.
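During development it helps to visualize what ARKit is tracking. ARSCNView supports debug overlays, and assigning its delegate here sets up the anchor callbacks used later for plane detection. A small sketch:

```objectivec
// Show detected feature points and the world origin while debugging
arView.debugOptions = ARSCNDebugOptionShowFeaturePoints | ARSCNDebugOptionShowWorldOrigin;

// Receive scene/anchor callbacks (assumes the view controller adopts ARSCNViewDelegate)
arView.delegate = self;

// Let SceneKit add a default light so models are visible without manual lighting
arView.autoenablesDefaultLighting = YES;
```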

3. Creating and Manipulating 3D Objects

One of the most exciting aspects of AR development is placing and manipulating 3D objects in the real world. Let’s explore how to do this using ARKit and Objective-C.

3.1. Adding 3D Models

To add a 3D model to your AR scene, you’ll need a 3D asset in a supported format (e.g., .dae, .usdz). Import the asset into your Xcode project and load it into your scene:

```objectivec
SCNScene *scene = [SCNScene sceneNamed:@"your_model.usdz"];
SCNNode *modelNode = scene.rootNode.childNodes[0];
[modelNode setScale:SCNVector3Make(0.1, 0.1, 0.1)]; // Scale the model
```

Here, we load the 3D model and scale it down to an appropriate size. You can adjust the scale and position to place the model where you want in the AR world.
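To actually display the loaded model, attach its node to the view's scene graph. For example, this sketch places it half a meter in front of the world origin (which is where the camera starts):

```objectivec
// Position the model 0.5 m in front of the world origin, then add it to the scene
modelNode.position = SCNVector3Make(0, 0, -0.5);
[arView.scene.rootNode addChildNode:modelNode];
```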

3.2. Placing Objects in the Real World

Now that you have your 3D model, you can place it in the real world using ARKit’s ARAnchor:

```objectivec
ARAnchor *anchor = [[ARAnchor alloc] initWithTransform:modelNode.simdTransform];
[arSession addAnchor:anchor];
```

This code creates an anchor at the position and orientation of your 3D model and adds it to the AR session. Note that an anchor by itself doesn't render anything: attach the model's node to the anchor in your ARSCNView delegate (for example, by returning it from renderer:nodeForAnchor: or adding it as a child in renderer:didAddNode:forAnchor:), and ARKit will keep it fixed to the real-world position you specified.

3.3. Interacting with 3D Objects

ARKit makes it easy to implement interactions with 3D objects. You can add gesture recognizers to your ARSCNView to enable tap or swipe gestures, allowing users to interact with the objects in the AR scene.

```objectivec
UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[arView addGestureRecognizer:tapGestureRecognizer];
```

In the handleTap: method, you can implement the logic to respond to the user’s interaction with the 3D object.

```objectivec
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer {
    CGPoint tapPoint = [gestureRecognizer locationInView:arView];
    NSArray<ARHitTestResult *> *hitTestResults = [arView hitTest:tapPoint types:ARHitTestResultTypeExistingPlaneUsingExtent];

    if (hitTestResults.count > 0) {
        ARHitTestResult *hitResult = [hitTestResults firstObject];
        // The last column of the hit's world transform holds the tapped surface position
        SCNVector3 position = SCNVector3FromFloat3(hitResult.worldTransform.columns[3].xyz);

        // Implement your interaction logic here, e.g. place or select a node at `position`
    }
}
```

In this example, we use a tap gesture to detect the user’s touch on the AR view. We then perform a hit test to find any real-world surfaces (planes) where the user tapped. If a hit is detected, you can implement your interaction logic.

4. Adding Real-World Tracking

ARKit’s real-world tracking capabilities enable your AR app to detect and interact with the physical environment. Let’s look at how to incorporate this functionality into your Objective-C ARKit app.

4.1. Plane Detection

To enable plane detection, you can configure your ARWorldTrackingConfiguration:

```objectivec
ARWorldTrackingConfiguration *configuration = [[ARWorldTrackingConfiguration alloc] init];
configuration.planeDetection = ARPlaneDetectionHorizontal;
[arSession runWithConfiguration:configuration];
```

Here, we set planeDetection to ARPlaneDetectionHorizontal, which enables the detection of horizontal surfaces like tables and floors.
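The plane detection options can be combined; on iOS 11.3 and later you can also detect vertical surfaces such as walls:

```objectivec
// Detect both horizontal and vertical surfaces (vertical requires iOS 11.3+)
configuration.planeDetection = ARPlaneDetectionHorizontal | ARPlaneDetectionVertical;
```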

4.2. Handling Plane Detection Results

When ARKit detects a horizontal plane, it triggers a delegate method that you can use to place objects on the detected plane:

```objectivec
- (void)renderer:(id<SCNSceneRenderer>)renderer didAddNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if ([anchor isKindOfClass:[ARPlaneAnchor class]]) {
        ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;

        // Create and add a plane node to represent the detected plane
        SCNPlane *plane = [SCNPlane planeWithWidth:planeAnchor.extent.x height:planeAnchor.extent.z];
        SCNNode *planeNode = [SCNNode nodeWithGeometry:plane];
        planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);

        // Rotate the plane to lie flat (SCNPlane is vertical by default);
        // using eulerAngles preserves the position set above, which assigning
        // a whole transform would overwrite
        planeNode.eulerAngles = SCNVector3Make(-M_PI / 2.0, 0, 0);

        [node addChildNode:planeNode];
    }
}
```

In this delegate method, we create a visual representation of the detected plane using an SCNPlane and add it to the AR scene. This gives users a visual cue for where they can place AR objects.
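Detected planes grow and refine as ARKit gathers more data, so you'll typically also implement the corresponding update callback to keep the visualization in sync. A sketch, assuming the plane node added above is the anchor node's first child:

```objectivec
- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if ([anchor isKindOfClass:[ARPlaneAnchor class]]) {
        ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;
        SCNNode *planeNode = node.childNodes.firstObject;
        SCNPlane *plane = (SCNPlane *)planeNode.geometry;

        // Resize and recenter the visualization to match the refined plane estimate
        plane.width = planeAnchor.extent.x;
        plane.height = planeAnchor.extent.z;
        planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
    }
}
```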

4.3. Adding Objects to Detected Planes

To place objects on detected planes, you can extend the handleTap: method we discussed earlier. The same hit test against existing planes now returns the surfaces found by plane detection, and the hit result's world transform gives you a real-world placement point:

```objectivec
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer {
    CGPoint tapPoint = [gestureRecognizer locationInView:arView];
    NSArray<ARHitTestResult *> *hitTestResults = [arView hitTest:tapPoint types:ARHitTestResultTypeExistingPlaneUsingExtent];

    if (hitTestResults.count > 0) {
        ARHitTestResult *hitResult = [hitTestResults firstObject];

        // Create a 3D object and place it at the hit point on the detected plane
        SCNNode *objectNode = [self createObjectNode];
        objectNode.position = SCNVector3Make(
            hitResult.worldTransform.columns[3].x,
            hitResult.worldTransform.columns[3].y,
            hitResult.worldTransform.columns[3].z
        );

        // Nodes are added to the scene graph, not the session
        [arView.scene.rootNode addChildNode:objectNode];
    }
}
```

In this modified handleTap: method, we use hit testing to find the detected plane where the user tapped. We then create a 3D object and place it on that plane.
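The createObjectNode helper isn't defined above; one possible implementation, purely illustrative, returns a simple colored box:

```objectivec
// Hypothetical helper: returns a 10 cm red box to place in the scene
- (SCNNode *)createObjectNode {
    SCNBox *box = [SCNBox boxWithWidth:0.1 height:0.1 length:0.1 chamferRadius:0.005];
    box.firstMaterial.diffuse.contents = [UIColor redColor];
    return [SCNNode nodeWithGeometry:box];
}
```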

5. Adding Interactivity and Animation

To make your AR experience more engaging, you can add interactivity and animation to your 3D objects. Let’s explore how to achieve this with Objective-C and ARKit.

5.1. Adding Gesture Recognizers

To enable user interactions with AR objects, you can add gesture recognizers to your scene. For example, you can add a rotation gesture recognizer to allow users to rotate objects:

```objectivec
UIRotationGestureRecognizer *rotationGestureRecognizer = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(handleRotation:)];
[arView addGestureRecognizer:rotationGestureRecognizer];
```

In the handleRotation: method, you can apply the rotation to the selected object (this sketch assumes your view controller declares selectedNode and initialRotation properties to track gesture state):

```objectivec
// Assumes the view controller declares:
//   @property (nonatomic, strong) SCNNode *selectedNode;
//   @property (nonatomic, assign) SCNVector3 initialRotation;
- (void)handleRotation:(UIRotationGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        // Determine which object was selected
        CGPoint tapPoint = [gestureRecognizer locationInView:arView];
        NSArray<SCNHitTestResult *> *hitTestResults = [arView hitTest:tapPoint options:nil];

        if (hitTestResults.count > 0) {
            SCNHitTestResult *hitResult = [hitTestResults firstObject];

            // Store the node and its starting rotation
            self.selectedNode = hitResult.node;
            self.initialRotation = self.selectedNode.eulerAngles;
        }
    } else if (gestureRecognizer.state == UIGestureRecognizerStateChanged) {
        // Apply the gesture's rotation around the y-axis
        CGFloat rotation = gestureRecognizer.rotation;
        self.selectedNode.eulerAngles = SCNVector3Make(self.initialRotation.x,
                                                       self.initialRotation.y + rotation,
                                                       self.initialRotation.z);
    }
}
```

In this example, we use a rotation gesture recognizer to rotate the selected object when the user performs a rotation gesture.

5.2. Adding Animation

To add animation to your AR objects, you can leverage SceneKit’s animation capabilities. For instance, you can animate the scaling of an object when it’s tapped:

```objectivec
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer {
    CGPoint tapPoint = [gestureRecognizer locationInView:arView];
    NSArray<ARHitTestResult *> *hitTestResults = [arView hitTest:tapPoint types:ARHitTestResultTypeExistingPlaneUsingExtent];

    if (hitTestResults.count > 0) {
        ARHitTestResult *hitResult = [hitTestResults firstObject];

        // Create a 3D object and place it on the detected plane
        SCNNode *objectNode = [self createObjectNode];
        objectNode.position = SCNVector3Make(
            hitResult.worldTransform.columns[3].x,
            hitResult.worldTransform.columns[3].y,
            hitResult.worldTransform.columns[3].z
        );

        // Animate the node's scale from small to full size over half a second
        CABasicAnimation *scaleAnimation = [CABasicAnimation animationWithKeyPath:@"scale"];
        scaleAnimation.fromValue = [NSValue valueWithSCNVector3:SCNVector3Make(0.1, 0.1, 0.1)];
        scaleAnimation.toValue = [NSValue valueWithSCNVector3:SCNVector3Make(1.0, 1.0, 1.0)];
        scaleAnimation.duration = 0.5;
        [objectNode addAnimation:scaleAnimation forKey:nil];

        // Nodes are added to the scene graph, not the session
        [arView.scene.rootNode addChildNode:objectNode];
    }
}
```

Here, we add a scaling animation to the object using CABasicAnimation. This animation smoothly scales the object from a small size to a full-size representation over half a second.
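SceneKit also offers SCNAction as a higher-level alternative to Core Animation. For example, to keep a placed node spinning continuously:

```objectivec
// Spin the node around its y-axis, one full turn every 4 seconds, forever
SCNAction *spin = [SCNAction rotateByX:0 y:2 * M_PI z:0 duration:4.0];
[objectNode runAction:[SCNAction repeatActionForever:spin]];
```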

Conclusion

Objective-C remains a robust choice for developing ARKit applications on iOS, especially for those with a background in the language or existing Objective-C projects. In this comprehensive guide, we explored the fundamental steps to get started with ARKit, from setting up your project to adding and interacting with 3D objects and enabling real-world tracking. We also covered how to incorporate interactivity and animation, making your AR experiences more engaging and immersive.

As AR continues to gain momentum in various industries, mastering Objective-C and ARKit will open up a world of opportunities for creating cutting-edge augmented reality applications. Whether you’re building educational tools, gaming experiences, or innovative business solutions, ARKit and Objective-C provide the tools and flexibility you need to bring your ideas to life in the AR realm. So, grab your Mac, fire up Xcode, and start building the next generation of AR apps for iOS!
