On a technical point, I was impressed by their CoreML model format. The specification is open, optimized for maximum transferability, it can convert models from Keras (with TensorFlow), scikit-learn and others, and their Python model converter is also open source: <a href="https://pypi.python.org/pypi/coremltools" rel="nofollow">https://pypi.python.org/pypi/coremltools</a><p>Once you have your model as a CoreML file, it is stupidly simple to incorporate it into an app (the live demo with the flower model was very impressive), and Xcode will convert it to a machine-optimized version.<p>I was skeptical when I saw the announcement, but honestly it seems like a game changer -- to be able to drag-and-drop models that other people have trained into an app and use them with virtually zero boilerplate is just great.<p>Video here: <a href="https://developer.apple.com/videos/play/wwdc2017/703/" rel="nofollow">https://developer.apple.com/videos/play/wwdc2017/703/</a>
I am an Apple fanboy but I am also a privacy fanboy. As AI, IoT, and consumer tracking continue to invade our daily lives, I think Apple will continue to do quite well if they maintain their pro-privacy heading. Privacy is a feature, it's something people want, and unfortunately it's becoming a luxury. That is something that attracts Apple's customers.
> Six years later, the technology giant is struggling to find its voice in AI.<p>I know that this is the most common take on the situation, but the article -as many other articles that have argued this before- provides no proof other than analysts speaking in broad strokes.<p>I use Siri, Alexa and Assistant several times a week if not every day, and I'd hesitate a lot before saying that Siri is behind. Alexa is more responsive, but its domain is also narrower. Siri does a pretty good job when I dictate a message or want to set a timer. It's limited, sure, but so are the other assistants I use. I might be the odd case, but the speech recognition of Siri vs Assistant is pretty comparable in my experience.<p>Moving on to services. Google Photos is outstanding, yes. But I fail to see any other example that proves the big advantage other players have over Apple in the ML or AI field.<p>Am I missing something? I feel like a lot of these articles argue their view from the experience they got from Siri 5 years ago.
This article ticks me off. Apple announced some really good features on Monday (CoreML and a bunch of ML-driven features). Having this kind of article after WWDC is unfair. As a consumer, I applaud Apple's approach. The so-called AI pursued by Google and other tech companies isn't AI at all. True AI won't need as much data. You do need data for machine learning (i.e. pattern matching), and that's what other tech firms are pursuing. Apple, being a hardware, systems and frameworks company, is doing the right thing by respecting my privacy. They leave data mining and personalization to 3rd party apps. If and when true AI happens, there's a good chance Apple will be leading it. (Why? True AI won't happen overnight, and true AI won't require access to a billion customers' emails or browsing history.)
Me: Hey Siri, what cheese goes well with fruit?<p>Siri: Here's what I found on the web for "What cheese goes well with fruit..."<p>Me: Hey Google, what cheese goes well with fruit?<p>Google: "Edam"
Well, yes: they don't publish, they don't have a reputation (no teams like FAIR, DeepMind, MetaMind, MSFT AI, etc.), they don't attract the top AI talent, and anecdotally, they don't even know how to hire for AI. A brilliant colleague of mine working in AI was turned away because he didn't know some facets of the Python language when they hit him with Leetcode whiteboard questions; IMO they aren't in a position with AI where they get to flex programmer egos.
This is interesting because while Apple's (comparative) dedication to privacy is endearing, it's a long-term existential threat.<p>Google knows all about me and its assistant is, usually, great. Amazon has troves of data on what I buy, and I get to yell at Alexa to order more TP as soon as I see we're on the last roll.<p>Apple knows much less about me and, while I'm still an Apple fan and am tied to iPhones/Macs thanks to iMessage, Siri stinks as a result.<p>If voice assistants based on machine learning (specifically, personalized voice assistants) are the next big thing, Apple's privacy ethos will separate it from its major tech competitors – either in a great way, or a very negative way.
I've still not seen any AI that really gets me excited. Google Now does some neat things - but if the home automation products from Amazon and Google are representative of the state of the art, we have an awfully long way to go before AI is any kind of game changer.<p>More likely AI/ML is another trip down buzzword lane. It can hang out with IoT, VR, Big Data, and hell, even containers and microservices.<p>It feels like everyone is casting about for that next revolutionary technological innovation, but maybe we just need to be at a plateau for a while.
Google and Amazon may get there faster, but personally, I'm quite ok with it taking a bit longer to get a truly smart digital assistant if that means I get to keep my privacy.
I love Apple's commitment to privacy, but the notion that privacy precludes powerful AI strikes me as a false dichotomy, and a cop-out. There must be some middle-ground.
Honest question: in the context of the realistic range of consumer AI applications in 2017, what are some meaningful shortcomings Apple products have? And do those shortcomings have any interest to the majority of consumers (and not say, to developers or analysts)?
One thing to remember is that it's easier to build AI and ML when you are willing to use lots of user data, but it's not impossible if you put limits on your access to that data.<p>For example, when you meet someone, they don't need to know everything about your life to be useful. The same is the case for building internet AIs: use more pre-trained models, ask the right questions, and listen to the responses.<p>Sure, it's easier if you have access to user data and are willing to use it to build those predictive models. But there is some real value in the world for companies who respect user privacy AND build predictive models.
> ... the company hired Russ Salakhutdinov, a Carnegie Mellon professor whose expertise is in an area of artificial intelligence known as “deep” or “unsupervised” learning, a complex branch of machine learning in which computers are trained to replicate the way the brain’s neurons fire when they recognize objects or speech.<p>The author of the article clearly made a mistake saying "deep learning" is the same as "unsupervised learning".
Regarding privacy: when human beings meet a new person, they do not need tons of data about that person to understand their spoken language. We learn English or any other language once and then rarely need to adapt to new people, unless they have a strong accent or speak a truly different dialect of the language. This suggests that the much-touted speech recognition technologies from Google, Amazon, and Apple do not match human-level performance. Are they really exceeding the marginal performance of the Hidden Markov Model-based Dragon NaturallySpeaking, which took around two months of training to achieve its maximum accuracy ten years ago? Or are they just running similar models with huge numbers of adjustable parameters tuned to each user in the "cloud"?<p>If Apple invested in genuinely new, creative AI technologies that matched human-level speech recognition and other tasks, then they could preserve their purported emphasis on privacy. They would not need to collect huge amounts of personal data on most customers, unlike Google or Amazon.
I probably won't receive any medals for this, but it is my most humble opinion that most of these AI gimmicks (Siri, Alexa, etc...) are useless.<p>I will concede I get moderate gag value out of Google automatically creating animated gifs from a string of photographs.
AI is a funny field where you may still be in a great position just by having millions of your capable devices distributed to people. Software-wise, it's relatively easy to adopt the state of the art without necessarily doing the research yourself. On the cloud side they have the capacity as well, so that's not a blocker for them.
I haven't made up my mind on this yet: I really like Apple's privacy policies, but I also like Google Assistant on my iPhone.<p>As usual, for a business trip I took this week, I went all-in with Google (location, my email forwarded to my gmail account, the Google Travel app, etc.). Very convenient for keeping my schedule, travel arrangements, etc. in order.<p>That said, when I returned home last night, as usual after a trip, I stopped forwarding my email to gmail, uninstalled most Google apps from my iPhone, and turned off location.<p>When I am at home (most of the time) I like using DuckDuckGo and just Apple's software.<p>EDIT: I have been working in the field of AI and machine learning for about 30 years. I think that Apple will find a sweet spot between good privacy and making Siri into a useful digital assistant.
Collecting location data, visited URLs and phone books is absolutely not required to build speech recognition. You don't have to train computer vision algorithms on people's private photos: you can use public photos instead. Even "context-based computing" does not seem to work better if data from all users is put into a single database. Why are "AI" and surveillance usually treated as related?<p>The creepiest forms of surveillance are used for ads (though I'm not sure it even works; I always see completely irrelevant ads), and Apple is not an advertising company.
Currently, the only documentation I've seen for extending Siri depends on an app running on a particular device. More than anything else, this device-centric focus is holding Siri back when compared to Alexa and Google Home.<p>Am I missing something? Has there been any announcement that third-party developers could add skills or actions across all Siri instances via the cloud?
Apple is struggling to become an AI powerhouse, but it just deployed top-notch machine learning frameworks to millions of portable devices around the world, in a way that lets its hundreds of thousands of developers use machine learning now. I want to be struggling at investments in the same way. :)
Why on earth does Apple want to become an "AI powerhouse"? Please, Apple, do what you're good at: design great consumer electronic products. You should be hiring the top talent in fields like HCI, AR/VR, and wearable computing, not AI.
Does anyone have details on the AIKit & Apple CNN support for AMD graphics cards? This was previously the exclusive territory of Nvidia, and this week Apple promised to expand it to the other half of discrete graphics cards. Will there be Python APIs?
So the deep learning machinery is becoming part of the OS. I am wondering if part of the model (the relatively stable layers) will also be released with the OS.
Two reasons. One: Apple is currently a lifestyle company and not a technology company, so AI is well outside the wheelhouse of what they have spent the last 5 years doing.<p>Two: Apple is never really bleeding edge with its products; it waits to see what the industry is doing and then tries to take something and make it better, which is tough to do with AI/ML. You can't really just take someone's idea and build on it with $$; there needs to be a lot of groundwork done first to even get to that point, and it looks like Apple has barely done any (looking straight at Siri for an example).