What do I need to know about iOS 11 app development? – TechTarget

Posted: August 17, 2017 at 3:47 pm



Every new generation of Apple's iOS operating system for iPhone and iPad brings leaps in technology, and version 11, set to launch in September 2017, is no different. Among the key new capabilities app developers must learn are augmented reality, through the ARKit framework, and on-device machine learning, through Core ML.


ARKit, according to Apple, is a framework for creating augmented reality experiences for the iPhone and iPad. It merges camera sensor data with data from a device's accelerometer and gyroscope -- known collectively as Core Motion data.
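As a rough illustration of how little setup that fusion requires, here is a minimal Swift sketch of starting an ARKit world-tracking session. The class and property names are our own, not Apple's.

```swift
import UIKit
import ARKit

// A minimal sketch of kicking off an ARKit world-tracking session.
// Class and property names are illustrative.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with Core Motion data
        // (accelerometer and gyroscope) to track the device in space.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```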

Core ML, according to Apple, is "a foundational machine learning framework" that runs learning models directly on a mobile device, eliminating the server round trips necessary in the past. Through the companion Vision framework, face tracking, face detection, landmark detection, text detection, rectangle detection, barcode detection and object tracking will enable developers to build vision-based machine learning into applications.
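As a sketch of what eliminating the server round trip looks like in practice, the snippet below classifies an image entirely on-device by running a Core ML model through Vision. `MobileNet` stands in for whatever .mlmodel file has been added to the project; Xcode generates that Swift class from the model file automatically.

```swift
import CoreML
import Vision

// A sketch of on-device image classification with Core ML and Vision.
// `MobileNet` is a placeholder for any Core ML model added to the project.
func classify(_ image: CGImage) {
    guard let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Results come back sorted by confidence -- no server involved.
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }

    try? VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```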

To understand what app developers need to know about ARKit and Core ML, SearchCloudApplications asked an expert. Mark Price is a veteran mobile app developer, with nearly 60 projects completed. He has taught 16 different courses to more than 130,000 students at Udemy, an online learning academy that offers courses covering an array of topics, from technology to personal development.

Before we get into specifics, is moving from iOS 10 going to cause problems in iOS 11 app development?

Mark Price: The switch from iOS 10 to iOS 11 introduced very few breaking changes. Even moving from Swift 3 to Swift 4, nothing really broke. This transition was much smoother than the last one, which introduced APIs that didn't support older devices. It's important to understand which technologies older devices cannot use.

ARKit is brand-new. How does this affect iOS 11 app development?

Price: Apple built this complex framework under the hood that does surface detection. If you have multiple surfaces in your camera view, it does a good job of estimating how far they are off the ground and how far they are from you. Before ARKit, that was incredibly complex. You'd have to get a library -- if it even existed. This is all built in now. You can throw 3D models into your project, and [they] will maintain their position in space. You had to write code for that in the past; now, it happens automatically.
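The surface detection Price describes arrives through a delegate callback. Continuing the controller from the earlier sketch, and assuming `sceneView.delegate = self` has been set, each detected surface shows up as an ARPlaneAnchor:

```swift
import ARKit

// ARKit reports each flat surface it finds as an ARPlaneAnchor.
extension ARViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        // ARKit estimates the surface's position and extent for you;
        // no hand-rolled computer vision is required.
        print("Detected plane at \(plane.center), extent \(plane.extent)")
    }
}
```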

What can developers do with this capability?

Price: We don't know what people are going to make with it. With ARKit plus facial recognition, you can literally rebuild Snapchat-style filters in a fraction of the time. Some of my students have used it to build rulers that can measure objects. It opens the door for developers who didn't have the skills to work with these types of complex systems. A Pokemon Go-style app, for instance, would have been much more complex to build before. Now, a developer can go in, look at the APIs and build something like that in a fraction of the time.
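The ruler idea works by hit-testing two screen points against the tracked scene and measuring the distance between the resulting world positions. A minimal sketch, assuming a running ARSCNView:

```swift
import ARKit
import simd

// A sketch of the "ruler" idea: hit-test two screen points against the
// tracked scene and measure the real-world distance between them, in meters.
func distance(in sceneView: ARSCNView, from pointA: CGPoint, to pointB: CGPoint) -> Float? {
    guard let hitA = sceneView.hitTest(pointA, types: .featurePoint).first,
          let hitB = sceneView.hitTest(pointB, types: .featurePoint).first else { return nil }

    // The last column of each hit's transform is its world-space position.
    let a = hitA.worldTransform.columns.3
    let b = hitB.worldTransform.columns.3
    let d = a - b
    return (d.x * d.x + d.y * d.y + d.z * d.z).squareRoot()
}
```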

How can developers use Core ML to provide accessibility for people with, say, a vision disability?

Price: Right out of the gate, using Core ML, you can do image recognition with the built-in libraries. Apple provides prebuilt models, one of which can recognize thousands of kinds of images. In one of our classes, we built an app that will look at an item and use the Siri speech synthesizer to speak the item's name. The app will see a cup and speak the word 'cup.' We built the app in just a few hours.
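Here is a rough sketch of the pieces such an app would combine: the top Vision classification from a request like the one shown earlier, handed to AVSpeechSynthesizer to be spoken aloud. The camera wiring is omitted, and the confidence threshold is our own choice.

```swift
import AVFoundation
import Vision

// A sketch of the accessibility idea: speak the top image classification.
let synthesizer = AVSpeechSynthesizer()

func speakLabel(from observations: [VNClassificationObservation]) {
    // Only announce predictions the model is reasonably sure about;
    // the 0.5 threshold is an arbitrary choice for illustration.
    guard let best = observations.first, best.confidence > 0.5 else { return }
    let utterance = AVSpeechUtterance(string: best.identifier)  // e.g. "cup"
    synthesizer.speak(utterance)
}
```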

Facial recognition will be a big component of iOS 11 app development. Do you see that being used in different ways?


Price: With facial recognition, you could take a picture of someone with your camera and look them up with Facebook APIs to identify them. You can do this now with machine learning and the right technology stack. Apple made it really easy to integrate.

In the past, you had to pay a lot of money for machine learning models and cognitive services, but this is all part of iOS for free now. You could use Core ML to match your own face as a different mode of authentication for your app.
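A heavily hedged sketch of the first step in that face-matching idea: iOS 11's Vision framework ships face detection, not identification, so finding the face is built in, while the actual match against an enrolled face would need a custom Core ML model of your own.

```swift
import Vision

// Detect a face with Vision's built-in request. Deciding whether the
// face matches an enrolled user would require a custom Core ML model;
// that step is deliberately left as a comment.
func detectFace(in image: CGImage) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        guard let face = (request.results as? [VNFaceObservation])?.first else { return }
        print("Face found at \(face.boundingBox)")
        // A custom embedding model would compare this face crop
        // against the enrolled user's face here.
    }
    try? VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```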

As an expert already building apps in iOS 11, what is your key message to developers?

Price: It's in your best interest to go to the Worldwide Developers Conference 2017 website and learn about all the new features. What you'll be able to do in your job is delight your bosses with new things -- and Apple has added so many. Research all the new APIs to see what you can incorporate into your apps -- not just at the technical level, but also at the user experience level.

What advice do you have for developers regarding these new technologies in the bigger sense, not just as they pertain to iOS 11 app development?

Price: Invest time learning about machine learning -- not just the framework, but why it is important. You're going to see more jobs that require virtual and augmented reality skills. There's money in it: Become an expert in iOS augmented reality, and it will put you above other people who are applying for jobs. If I was looking for a job, I'd totally be mastering the Apple ARKit right now. We're going to see lots of opportunities there.

Joel Shore is news writer for TechTarget's Business Applications and Architecture Media Group. Write to him at jshore@techtarget.com or follow @JshoreTT on Twitter.






