Apple is developing its own generative AI, codenamed MM1

Even if the brand with the bitten apple ends up relying on Google's artificial intelligence for its iPhones, that arrangement is probably only temporary: the company could offer its own model within a few months.

While Meta, Microsoft and even Google are making big bets on AI, and Samsung is touting its Galaxy S24 as "artificially intelligent" above all else, one tech giant has so far remained very discreet: Apple. We recently learned that the brand with the bitten apple, whose Siri assistant is seriously starting to show its age, is reportedly negotiating with Google to integrate the Gemini AI into the next version of iOS. However, given the company's strong preference for independence, it is hard to imagine it relying on a competitor's solution in the long term. As it turns out, a research paper published late last week by Apple teams suggests that the company is indeed developing its own generative AI model. Its name: MM1, probably short for MultiModal 1.

An AI to calculate the price of beer

The paper describes an MLLM (multimodal large language model), capable of working with both text and images. The authors argue that "after the rise of LLMs, MLLMs emerge as the next frontier of foundation models".

Below is one of the examples shown: a photo of a table with a few beer bottles on it, accompanied by an image of the menu. MM1 must determine how much it would cost to drink all the beers.

An AI fed on beer © Apple
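Once the model has read the two images, the task it demonstrates boils down to simple arithmetic: match each bottle detected on the table to its price on the menu and sum the result. A minimal sketch of that final step (the item names and prices here are hypothetical, not taken from Apple's example):

```python
# Hypothetical data standing in for what a multimodal model would
# extract from the two photos in Apple's example.
menu_prices = {"lager": 6.00, "stout": 5.50}   # read from the menu image
bottles_on_table = ["lager", "lager", "stout"]  # detected in the table photo

# Total cost of drinking every beer on the table.
total = sum(menu_prices[beer] for beer in bottles_on_table)
print(f"${total:.2f}")  # → $17.50
```

The hard part, of course, is not the sum but the visual grounding: reading prices off a menu photo and counting bottles in a scene is exactly what the multimodal pre-training is meant to enable.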

Judging by its number of parameters, MM1 is a fairly average-sized model. The paper states: "We build MM1, a family of multimodal models, including both dense variants up to 30B and MoE [mixture-of-experts] variants up to 64B, that are SOTA in pre-training metrics and achieve competitive performance after supervised fine-tuning on a range of established multimodal benchmarks."

For clarity, SOTA stands for "state of the art": the term refers to the best performance achieved to date with the latest techniques.
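The "MoE" in the quote refers to mixture-of-experts layers: a small router sends each token to only a few expert sub-networks, so a 64B-parameter model activates just a fraction of its weights per token. A toy illustration of top-k routing (NumPy, purely illustrative — this is the general technique, not Apple's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
d, num_experts, top_k = 8, 4, 2

# Each "expert" is a tiny linear layer; the router is another linear map.
experts = [rng.normal(size=(d, d)) for _ in range(num_experts)]
router = rng.normal(size=(d, num_experts))

def moe_layer(x):
    logits = x @ router                   # router score for each expert
    top = np.argsort(logits)[-top_k:]     # keep only the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the chosen experts
    # Only top_k of num_experts experts run per token: that sparsity is
    # why an MoE model can hold more parameters than it uses at once.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.normal(size=d)
out = moe_layer(token)
print(out.shape)  # (8,)
```

Here only 2 of the 4 experts compute anything for a given token; scaled up, that is how a 64B MoE model can cost roughly as much to run as a much smaller dense one.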

According to Boston University professor and machine learning expert Kate Saenko, quoted by Wired, it would nevertheless be dangerous to draw conclusions at this stage. She suggests that Apple's engineers are experimenting with different training methods on a small scale before moving on to more complex models. For his part, Brandon McKinzie, lead author of the MM1 paper, explained on X: "This is just the beginning. The team is already hard at work on the next generation of models."

iPhone-Gemini, a fixed-term relationship?

As mentioned above, Apple is reportedly considering integrating Google's Gemini chatbot into its iPhones, no doubt for a fee (as part of the deal making the Mountain View giant's search engine the default on the iPhone, Google is said to have paid Apple more than $18 billion in 2021, according to Wired). In that light, it may seem counterintuitive that its teams are working on an in-house model.

But remember that it was not until 2012 that Apple released Apple Maps as an alternative to Google Maps, until then the default navigation app on the iPhone. Moreover, working with Google may simply be a way to buy time. In recent years, the Cupertino company has shown that it prefers to take its time and release polished, high-performing products rather than rush to market. The late launch of the Vision Pro headset, superior to many competing products in several respects, is proof of this. The company's absence from the foldable smartphone market reflects the same strategy, especially since multiple indications suggest that foldable iPhones have been in the works for several months.

Regardless, when it comes to generative AI, Apple CEO Tim Cook has promised investors announcements later this year.


Source:

Apple, via Wired
