Apple devs get IBM AI: Watson Services for Core ML
Apple and IBM have been working together for a few years.
Thought to be an unlikely couple by some, the two firms established a partnership in 2014 to work on integration projects.
Fast forward to 2018 and the two tech giants are pairing up again, but this time at a more granular, developer-focused level designed to inject IBM’s Watson Artificial Intelligence (AI) platform and services directly into Apple iOS mobile applications.
The product/service launches are IBM Watson Services for Core ML and the IBM Cloud Developer Console for Apple.
In terms of what’s actually happening here, Core ML is Apple’s software framework for running machine learning models on Apple iOS devices – the focus is very much on the iPhone in the first instance, the iPad second… and then on Apple’s technologies for smart watches and television.
First released in iOS 11, Core ML runs machine learning models locally on the device, meaning they work even when the device is offline. Existing models written in Caffe, Keras, scikit-learn and other frameworks can be converted to the Core ML format.
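By way of illustration, the sketch below shows one way an iOS app might compile and load a converted Core ML model at runtime so that inference runs entirely on the device; the “HousePricer” model file name is a placeholder, not part of Apple’s or IBM’s tooling.

```swift
import CoreML

// Minimal sketch: compile a converted .mlmodel file at runtime (useful for models
// downloaded on demand) and load it for on-device, offline inference.
// "HousePricer" is a placeholder model name.
func loadLocalModel() throws -> MLModel {
    guard let modelURL = Bundle.main.url(forResource: "HousePricer",
                                         withExtension: "mlmodel") else {
        throw CocoaError(.fileNoSuchFile)
    }
    // Compilation produces an optimised .mlmodelc representation on the device,
    // so no network connection is needed once the model file is present.
    let compiledURL = try MLModel.compileModel(at: modelURL)
    return try MLModel(contentsOf: compiledURL)
}
```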
Under this agreement, developers using Core ML can now access IBM Watson AI functions inside the apps they are building.
According to Apple, “A trained model is the result of applying a machine learning algorithm to a set of training data. The model makes predictions based on new input data. For example, a model that’s been trained on a region’s historical house prices may be able to predict a house’s price when given the number of bedrooms and bathrooms.”
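To make that concrete, here is a minimal Swift sketch of what such a prediction call can look like against a generic Core ML model; the “bedrooms”, “bathrooms” and “price” feature names are hypothetical and would need to match the names defined in the actual model.

```swift
import CoreML

// Hypothetical house-price model: two numeric inputs, one numeric output.
func predictPrice(using model: MLModel, bedrooms: Double, bathrooms: Double) throws -> Double? {
    // Wrap the new input data in a feature provider keyed by the model's input names.
    let input = try MLDictionaryFeatureProvider(dictionary: [
        "bedrooms": MLFeatureValue(double: bedrooms),
        "bathrooms": MLFeatureValue(double: bathrooms)
    ])
    // Run the model on the device and read back the predicted output feature.
    let output = try model.prediction(from: input)
    return output.featureValue(for: "price")?.doubleValue
}
```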
Continuous learning
Developers can now build AI-powered apps that securely connect to enterprise data, are optimised to run offline as well as in the cloud… and that continuously learn, adapt and improve through each user interaction.
“IBM Watson is available as a set of cloud-based services for developers to build AI applications, and Core ML delivers advanced machine learning to apps on Apple devices. Together, these technologies can deliver faster, smarter insights with continuous learning capabilities, transforming AI for enterprise mobility,” said Mahmoud Nagshineh, general manager, IBM partnerships & alliances.
The Coca-Cola Company is said to be working with this technology to transform in-field capabilities for its field service engineers. Initial functions being analysed include problem identification via custom visual recognition, cognitive diagnosis and augmented reality-assisted repair.
Watson Visual Recognition is a service on IBM Cloud that uses machine learning to quickly and accurately tag and classify visual content, and to train custom classifiers. It has built-in classifiers for objects such as faces and food.
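Custom classifiers trained with Watson Visual Recognition can be exported in Core ML format and run on the device. The sketch below shows one way an iOS app might classify an image with such a locally deployed model, using Apple’s Vision framework rather than IBM’s own SDK; the exported model is simply passed in as an MLModel.

```swift
import Vision
import CoreML
import UIKit

// Classify an image with a locally deployed Core ML classifier, for example a
// custom model exported from Watson Visual Recognition.
func classify(image: UIImage, with model: MLModel,
              completion: @escaping ([(label: String, confidence: Float)]) -> Void) throws {
    guard let cgImage = image.cgImage else { return }

    // Wrap the Core ML model for use with the Vision framework.
    let visionModel = try VNCoreMLModel(for: model)
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Each observation carries a class label (identifier) and a confidence score.
        let results = (request.results as? [VNClassificationObservation]) ?? []
        completion(results.map { ($0.identifier, $0.confidence) })
    }

    // Run the request against the image entirely on the device.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
}
```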
The new IBM Cloud Developer Console for Apple provides tools, including pre-configured starter kits, which aim to simplify Swift development, along with AI, data and mobile services optimised for Swift.
“For example, we’re introducing IBM Cloud Hyper Protect Starter Kit to enable iOS developers to safeguard credentials, services and data using the IBM Cloud Hyper Protect services,” said Nagshineh.
IBM has confirmed that its smartphone focus here extends to Apple devices only at this time, i.e. Android does not appear to be in the picture. IBM points to the way Apple has integrated its hardware and software platforms, saying this provides the level of enterprise-grade security and control the firm was looking for in order to make this agreement happen.