Aamir Siddiqui / Android Authority

TL;DR
- Google has introduced Personal Intelligence in Gemini, allowing the AI to remember your personal details by connecting to your Google apps.
- Gemini can now think across your data, not just pull a single email or photo on command.
- Instead of stuffing everything into the model, Gemini selectively surfaces only the most relevant info when needed.
- The feature is now live for US Google AI Pro and AI Ultra subscribers, with plans to expand further.

Google is finally tackling a common frustration with AI assistants: their tendency to forget personal details. Starting today, Gemini is getting a memory upgrade called Personal Intelligence.
The feature, now rolling out in beta, lets the AI access your Google apps, like your emails and photos, so it can give answers that feel much more relevant to you. Instead of making you jump between Gmail, Photos, Search, and YouTube, Gemini can now bring all that information together. If you choose to opt in, Gemini can access your Google apps and use that information in real time, making your personal data more useful.

Until now, Gemini could fetch things (an email here, a photo there), but it didn’t really think across them.
Personal Intelligence changes this by addressing what Google calls the “context packing problem.” In short, your life produces far more data than even large AI models can process at once. Google’s answer is a new system that selects the most relevant emails, images, or searches and delivers them to Gemini only when needed.

This is powered by Gemini 3, Google’s most advanced model family yet. It brings improved reasoning, stronger tool use, and a massive one-million-token context window.
That still isn’t enough to swallow an entire inbox or photo library, so Google uses context packing to surface just the right details at the right moment.

Google gives a simple example to show how this works. If you ask Gemini about tire options for your car, it won’t just list specifications. It can find your exact car model from Gmail, get tire sizes from your Photos, and consider your road-trip habits before making suggestions.
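Google hasn’t published how context packing works under the hood, but conceptually it resembles retrieval-style context selection: score each piece of personal data against the question, then send only the top items that fit within a token budget to the model. The sketch below is a toy illustration of that idea, not Google’s implementation; the PersonalItem type, the word-overlap relevance scorer, and the token budget are all assumptions made for the example.

```python
# Toy sketch of "context packing": rather than feeding an entire inbox or
# photo library to the model, rank items by relevance to the query and keep
# only the best ones that fit a token budget. All names here are illustrative.

from dataclasses import dataclass


@dataclass
class PersonalItem:
    source: str   # e.g. "gmail", "photos", "search"
    text: str     # extracted text or caption for the item
    tokens: int   # rough token count, used for budgeting


def relevance(item: PersonalItem, query: str) -> float:
    """Toy relevance score: fraction of query words that appear in the item."""
    query_words = set(query.lower().split())
    item_words = set(item.text.lower().split())
    return len(query_words & item_words) / max(len(query_words), 1)


def pack_context(items: list[PersonalItem], query: str, token_budget: int) -> list[PersonalItem]:
    """Pick the most relevant items that fit under the token budget."""
    ranked = sorted(items, key=lambda it: relevance(it, query), reverse=True)
    packed, used = [], 0
    for item in ranked:
        if used + item.tokens <= token_budget:
            packed.append(item)
            used += item.tokens
    return packed


if __name__ == "__main__":
    inbox = [
        PersonalItem("gmail", "Purchase receipt for your Subaru Outback", 7),
        PersonalItem("photos", "Photo of the tires on the Subaru showing size 225 65 R17", 11),
        PersonalItem("gmail", "Your dentist appointment is confirmed for Friday", 8),
    ]
    selected = pack_context(inbox, "which tires fit my subaru outback", token_budget=20)
    prompt_context = "\n".join(f"[{it.source}] {it.text}" for it in selected)
    print(prompt_context)  # only the car-related items make it into the prompt
```

The point of the budget is the same trade-off Google describes: even a one-million-token window can’t hold everything, so the selection step decides what the model actually sees for a given question.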
If you’re planning a trip, Gemini can check your past travel emails, saved places, and Photos to recommend places that match your real interests instead of just tourist spots. It also works across text, images, and videos. Gemini might pull a license plate number from a photo, confirm a trim level from an email receipt, then combine that with Search results, all in one response.

Personal Intelligence is turned off by default. You decide which apps to connect and can disconnect them whenever you want. You can also get answers without personalization. Gemini tries to show where it got its information, so you know the source.

More importantly, Google says Gemini doesn’t train on your Gmail or Photos data. Those sources are referenced to answer questions, not absorbed into the model. Training focuses on prompts and responses, with personal data filtered or obfuscated first.

Google is open about the current issues. Sometimes Gemini might get too personal, mix up timelines, or misunderstand relationships. For example, if Gemini thinks you love golf but you’re just a supportive parent, you should correct it, and Google encourages this feedback.

Personal Intelligence will start rolling out on January 14 to Google AI Pro and AI Ultra subscribers in the US, with plans to expand later. It works on the web, Android, and iOS, and will soon be available in Search’s AI Mode. For now, it’s only for personal accounts, not Workspace users.