Artificial intelligence requires massive amounts of RAM to run locally


Google has stated that running AI locally demands massively powerful devices

Google’s artificial intelligence model, Gemini Nano, has been rolling out to a few mobile devices since earlier in 2024, but the newest flagship phone from the company was not included on that release list.

While Gemini Nano may be headed to the Pixel 8, it will be released exclusively as a developer option

An Android app update delivered the artificial intelligence model to some devices, but not to the Pixel 8. The reason, Google said at the Mobile World Congress, was that Gemini Nano simply won’t run on the smartphone due to certain hardware limitations that were not identified during the announcement.

Google logo on company building. Credit: Photo by depositphotos.com

Since that time, however, more information has been released, and developers who have looked at the details have come up with an explanation for Google’s decision.

For instance, Seang Chau, Vice President of Devices and Services Software at Google, recently spoke on the Made by Google podcast. He explained that the 12 GB of RAM in the Pixel 8 Pro makes it well suited to loading Gemini Nano. The standard Pixel 8, by contrast, has 8 GB of RAM, which makes a substantial difference when running the AI model.

Since first announcing that the artificial intelligence wouldn’t be on the phone, Google has changed its plans

Google initially wasn’t going to launch the AI on its basic Pixel 8 at all. That said, it has since shifted its strategy, deciding to roll out Gemini Nano as a developer option for Pixel 8 users.

While this won’t mean much for the average device user, it could be good news for developers who know how to enable it.

During the podcast, Chau explained that the decision was initially made not to roll out the AI to Pixel 8 users at all because Google didn’t want the less powerful phone to “degrade” the artificial intelligence experience. However, the company is now aiming to make some of its AI-enabled features, such as smart reply, “RAM resident.” What this means is that the feature will permanently occupy a certain amount of the device’s memory so that it is ready whenever the user wants it.
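For readers curious about what “RAM resident” implies in practice, the sketch below is a minimal, purely hypothetical Kotlin illustration; NanoModel, SmartReplyEngine, and the memory figure are invented for the example and are not Google’s implementation or API. The idea it shows is simply that the model is loaded into memory once and kept there, so each smart-reply request is served instantly instead of paying a reload cost.

```kotlin
// Hypothetical sketch only: NanoModel and SmartReplyEngine are stand-ins,
// not real Google APIs, and the memory figure is illustrative.
class NanoModel(private val weightsMb: Int) {
    fun reply(prompt: String): String =
        "[suggested reply for: $prompt]" // placeholder for on-device inference
}

object SmartReplyEngine {
    // "RAM resident": the model is loaded once and kept in memory for the
    // lifetime of the process, trading a fixed RAM cost for instant replies.
    private val residentModel: NanoModel by lazy {
        println("Loading model into RAM (one-time cost)...")
        NanoModel(weightsMb = 2_000) // assumed footprint, for illustration only
    }

    fun suggestReply(message: String): String = residentModel.reply(message)
}

fun main() {
    // The first call pays the load cost; later calls reuse the resident model.
    println(SmartReplyEngine.suggestReply("Are we still on for lunch?"))
    println(SmartReplyEngine.suggestReply("Can you send the report?"))
}
```

The trade-off Chau describes falls out of this pattern: a phone with more RAM can afford to keep the model permanently loaded, while one with less memory either evicts it or never loads it at all.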
