00:00 – Mishaal Rahman: AI is promising to transform the way we read books, but will it be for the better?
00:04 – C. Scott Brown: And Google is transforming Gemini into a one-stop shop for all your purchases, projects, and pictures.
00:11 – Mishaal Rahman: I’m Mishaal Rahman.
00:12 – C. Scott Brown: And I’m C. Scott Brown and this is the Authority Insights Podcast where we break down the latest news and leaks surrounding the Android operating system.
00:21 – Mishaal Rahman: So Amazon recently rolled out a new AI assistant in the Kindle app for iOS that basically allows you to ask it questions about what you’re reading. And we found evidence that Google may be following suit with its own Gemini assistant chatbot in the Google Play Books app. But should people really be reading books with an AI assistant at their side?
00:39 – C. Scott Brown: And should we be preparing for a future where apps like Gemini become super apps? Because it sure seems like Google wants us to spend more and more time in the app.
Google also wants us to spend more time digging through our Google Account settings and more time chatting with Gemini, even if it’s not inside the Gemini app. The company is testing a new Gemini-powered chatbot for answering your burning questions about your own Google account.
01:06 – Mishaal Rahman: Man, Gemini, Gemini, Gemini. Seems like this week is all about Gemini. And you’re not going to stop hearing about AI because the holidays are around the corner and right after that CES kicks off and it’s going to be chock full of Gemini news.
And speaking of the holidays and CES, there won’t be another episode of the Authority Insights Podcast for a couple of weeks. Next week is the Christmas holiday, so both Scott and I will be out. The week after that, New Year’s week, we’re just not going to have enough time to record either. During CES, we doubt we’ll be able to get something out, but we’ll have to wait and see, and we’ll keep you updated on when the next episode will air. For now, let’s get back to the main stories: the trio of Gemini news.
Starting with the news that Google is bringing Gemini into the Google Play Books app. For those who don’t know, we discovered evidence within the latest version of the Google Play Books app suggesting that Google is working on a new Ask Gemini feature. The Ask Gemini button will appear in the selection menu whenever you highlight text inside a book you’re reading, sitting alongside the add note, translate, copy, and define buttons. It’s not functional yet, but I think it’s pretty obvious what it’s going to do: you highlight a block of text, a couple of paragraphs or a couple of words, you tap Ask Gemini, and it’ll ask the AI assistant to explain what that text means, or maybe summarize it, or something else. Of course, this is a work in progress, so we don’t know whether it will open an in-app interface or the dedicated Gemini app. Either way, you’ll be interfacing with an AI chatbot about whatever book you’re reading and whatever you want to ask it. And we can’t help but notice that this is coming very shortly after Amazon announced its own version of the feature, called Ask This Book, which is currently available in the iOS version of the Kindle app. That feature lets you ask questions about whatever you’re reading, the characters or the plot, and get spoiler-free answers based on how far along you currently are in the book, which is a pretty significant part of how it’s structured. You don’t want to be reading the Harry Potter novels, ask, “Is Snape a bad guy?” and have it tell you the most infamous spoiler in history, right? You don’t want it to accidentally reveal that in case you haven’t already heard that spoiler by now.
So I think I have mixed feelings about this idea. But like, what do you think Scott? What do you think of this idea of adding an AI chatbot to your book?
03:55 – C. Scott Brown: I could see it being really beneficial for reading that you kind of have to do, whether that’s a textbook or a reference manual or something, and being able to just ask the assistant, who is this person? Or can you give me some background on this that maybe I missed earlier in the book? But for reading a fiction book, it seems a little superfluous. To me, half the fun of reading a story is engrossing yourself in it: getting to know the characters, getting to feel the vibe, getting to feel how the writer writes. You could read Dune and then read The Lord of the Rings and they’re two very different stories by two very different writers, even though they’re both in that fantasy vein. Interrupting that immersion by asking an AI assistant to explain something to you… I don’t know, to me it feels like you’re in the middle of a movie and you’re that person who leans over and asks, “Who’s that guy? What’s he doing there?” And that’s just annoying. Nobody likes that. So I don’t think I would use this, but I could also see it for people who want to read more but find they lack the ability to sit down and focus on a story. Being able to check in with an AI and get reaffirmation, like, “Oh, I forgot who this character was. I haven’t read this book in two weeks. Can you remind me who this is?” That’s cool. I could see it being useful, but I don’t know how it would fit into my life, I guess is what I’m trying to say.
05:41 – Mishaal Rahman: Yeah, like with all generalized AI chatbot features, I think it really depends on what you’re using it for. If you’re a student and you’re using it to skip out on studying, that’s really bad, because you’re not learning anything; you’re just offloading getting the answer to an AI chatbot. But like you mentioned, if you’re a busy adult who hasn’t touched this book in weeks but really wants to finish it, and you actually enjoy reading it, but you just forgot what that one character did ten chapters ago, because a lot of books use foreshadowing, some subtle, some more obvious, you can just ask, “What did that character do? I forgot.” And it gives you a little recap to get you back up to speed. So you’re not like the random person who goes online and gets accused of lacking reading comprehension just because you’re not binge-reading the novel from beginning to end and don’t remember every single little plot point. A lot of books require you to really immerse yourself and focus your attention on everything being said to understand what’s going on, but not everyone is able to read at the same pace, so I can see it being helpful for a lot of people.
07:00 – C. Scott Brown: Yeah, we’ll just have to wait and see how it actually works in practice. We’ve seen what we assume to be the inspiration for this feature from Amazon. But will Google do the exact same thing? Or maybe Google is going to lean into it, like, yeah, if you want to spoil the book, go for it. We’re not going to try to stop you. We’ll have to wait and see how it pans out.
07:26 – Mishaal Rahman: I would hope at least that if Google implements it, they implement it in a way that recognizes what page you’re on, so it will actively try to avoid giving you information that happens later in the novel. If it’s even possible to do that; I would imagine the only way would be to have the entire contents of the book uploaded to Gemini for analysis. Otherwise, how would it know that a certain plot point was revealed on page 352 and not page 200? It would just tell you that information and, congratulations, you’ve just spoiled a major plot point for yourself.
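For anyone curious what that position awareness could look like under the hood, here’s a rough sketch. Nothing here is confirmed about how Google or Amazon actually do it, and all the function and variable names are hypothetical; the idea is simply to cut the book’s text off at the reader’s current position before it ever reaches the model.

```python
def build_spoiler_safe_prompt(pages: list[str], current_page: int, question: str) -> str:
    """Build a prompt whose context ends at the reader's current page.

    pages        -- the book's text, one string per page (hypothetical input format)
    current_page -- 1-indexed page the reader has reached
    question     -- the reader's question about the highlighted text
    """
    # Only include text the reader has already seen, so the model
    # cannot quote plot points from later pages.
    seen_text = "\n".join(pages[:current_page])
    return (
        "You are a reading assistant. Answer ONLY from the excerpt below, "
        "which ends at the reader's current position. If the answer would "
        "require later events, say you can't answer without spoilers.\n\n"
        f"--- BOOK SO FAR (through page {current_page}) ---\n"
        f"{seen_text}\n\n"
        f"--- QUESTION ---\n{question}"
    )

# Example: a three-page book where the twist lands on page 3.
book = ["Snape watches Harry.", "Snape protects Harry.", "The twist is revealed."]
prompt = build_spoiler_safe_prompt(book, current_page=2, question="Is Snape a bad guy?")
```

Truncating the context this way sidesteps the page-352-versus-page-200 problem, since the model never sees text past your bookmark, though it still relies on the model obeying the instruction not to draw on outside knowledge of the book.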
08:00 – C. Scott Brown: Yeah, and because Gemini can now ask follow-up questions, that could come into it too. You ask, “Who is this character and what’s going on here?” and Gemini gives you a recap of what the character has done. Then you ask another question, and Gemini follows up with, “I can give you the answer to this, but do you want to know? This is going to be a spoiler.” And you can say, “Yes, please spoil it for me,” and Gemini answers. Having that kind of back-and-forth might be cool, because then it’s like you’re talking to somebody who’s already read the book. A little more of a free-flowing conversation, almost like a book club with a friend, except instead of a real friend, it’s Gemini.
08:50 – Mishaal Rahman: A parasocial relationship with Gemini.
08:53 – C. Scott Brown: Yeah, you know, you know, people have parasocial relationships with Taylor Swift. They can have it with Gemini.
08:58 – Mishaal Rahman: Oh, people are already going crazy over ChatGPT. Just look at the way people reacted when they changed the personality from 4o, I think.
09:07 – C. Scott Brown: Yeah, a lot of people their relationships ended.
09:12 – Mishaal Rahman: Yeah. A little problematic there.
09:14 – C. Scott Brown: Little creepy. Little creepy.
09:17 – Mishaal Rahman: And speaking of problematic, like I mentioned, the only way for Gemini to really know the information up to a certain point in the novel is for it to have ingested that novel, to be able to draw on it and answer from it. I’m sure a lot of authors are already pissed off at the way many AI models have been trained on their books, potentially without their consent. So I’m wondering if Google is reaching out to these authors and saying, “Hey, we want your book to opt in to this Ask Gemini experience. Will you provide your consent?” Or is Google just going to enable this on all existing books in the Play Books library? Is this something we can expect for all books, or only a handful whose authors or publishers have explicitly opted in? That’s what I’m also curious to find out.
10:14 – C. Scott Brown: Yeah, I mean, there could be a way around that. You could still have Gemini do what’s necessary for the feature to work without learning from it. Almost like NotebookLM, a private thing where you feed the information in, but it locks that information within that one gem or whatever. That might be a way around it: “We’re not using your copyrighted information to enhance our product; we’re just providing a service that uses a product people have already paid for.” But you’re right, some authors might just say, “No, I don’t want this at all.” And if Google doesn’t offer an opt-out, that might cause problems.
10:58 – Mishaal Rahman: I did want to quickly highlight an editorial written by our colleague Dhruv on Android Authority, who used a bit of a hacky approach to add Google Gemini to his Kindle, because that’s not officially supported right now. Basically, he used the KOReader app with an assistant plugin, then added his own API key to integrate it with Gemini, so that whenever he’s reading any kind of book, a historical book or science fiction or anything that requires deep technical knowledge, he can just highlight the text and ask Gemini whatever he wants about it. And he thinks it’s a game changer. I’m sure a lot of people are thinking, why would you ever use this? This is just bad for learning and bad for knowledge. But I would highly recommend giving his editorial a read, because I think he gives some pretty good justification for why you might want to do something like this, and why Google and Amazon are justified in rolling out this kind of feature in their respective book-reading services. So definitely give that a read on the website.
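For the curious, a DIY setup like that ultimately boils down to one HTTPS call per highlight: the plugin pastes the highlighted passage into a prompt and sends it to Google’s public generateContent REST endpoint with your API key. Here’s a minimal sketch of that request; the endpoint and JSON body shape are from Google’s published Gemini API, but the model choice and the helper function names are our own, not anything from KOReader’s actual plugin.

```python
import json
import urllib.request

# Google's public Gemini REST endpoint (model name is our choice here).
GEMINI_URL = (
    "https://generativelanguage.googleapis.com/v1beta/"
    "models/gemini-1.5-flash:generateContent"
)

def make_payload(highlight: str, question: str) -> dict:
    """Wrap the highlighted passage and the reader's question in the
    request body shape the generateContent endpoint expects."""
    prompt = (
        "Regarding this passage from the book I am reading:\n"
        f'"{highlight}"\n\n{question}'
    )
    return {"contents": [{"parts": [{"text": prompt}]}]}

def ask_gemini(api_key: str, highlight: str, question: str) -> str:
    """Send the request and return the first candidate's text.
    (Requires network access and a valid API key.)"""
    req = urllib.request.Request(
        f"{GEMINI_URL}?key={api_key}",
        data=json.dumps(make_payload(highlight, question)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response shape: candidates -> content -> parts -> text
    return body["candidates"][0]["content"]["parts"][0]["text"]

payload = make_payload("Fear is the mind-killer.", "Who says this line?")
```

That’s the whole trick: the e-reader just needs a way to capture the selection and fire off that request, which is what the assistant plugin handles.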
12:03 – Mishaal Rahman: All right, moving on to our next story. As we alluded to earlier, Google seems to be trying to transform the Gemini app into a super app. Instead of just opening the app, asking a question, and interacting with a chatbot, they’ve added so much to it. You can generate images and videos, ask it to compile a research paper or write code for you; it has the Canvas feature, it creates files. There’s so much you can do in the Gemini app that it now literally has a dedicated section called “My Stuff,” which gathers everything you’ve created with Gemini in one single location. The problem is that right now, that “My Stuff” collection is just a reverse-chronological list of every single thing you’ve made, whether it’s images, code, or things you’ve looked up. There’s no organization to that list. But our Authority Insights team discovered evidence that Google is preparing to make that “My Stuff” list a bit more organized with categories. You’ll have categories like Media, Documents, and, very interestingly, Purchases. Documents will have any documents you’ve created; Purchases will have things you’ve purchased, which is a relatively new feature of the Gemini app where it can do shopping for you, the agentic shopping feature; and Media will have any images or videos you’ve created with the Gemini app. We’ve talked about this several times in the past, Scott, where we debate: is it better to have these AI experiences in an all-in-one application like the Gemini app, or is it better for users to experience Gemini and AI integrated within the applications they already use?
Do you think we’re going to reach a point where everything is done through a single, all-encompassing AI app, where you don’t open your email app anymore, you just ask Gemini to look through your emails and summarize them for the day, and everything is contained within one big AI assistant app? Or do you think this is just Google trying to create organization where everything was chaotic before in the Gemini app?
14:21 – C. Scott Brown: I mean, I think it’s a real possibility. The core caveat will be that it’s only for products that Google owns. You and I have discussed this a few times now; Uber is just the first app that came to my mind. Uber wants you to use the Uber app. Uber wants that data, that deep connection with the app and its service. It doesn’t want you to be able to do everything through Gemini, because that removes the user from the Uber ecosystem. So it would be a very hard sell for Google to go to Uber and say, “Hey, wouldn’t it be cool if people didn’t go to the Uber app and just did everything here?” And Uber would be like, “No, we actively do not want that.” So Google’s never going to be able to make Gemini the super app where you do everything. But it could be the super app where you do everything Google. I can imagine all of your email funneling into Gemini, all of your web browsing history from Chrome feeding into there, your messages, your phone calls, everything Google has a degree of control over being fed into this one place. And then, just off the top of my head, querying about an email and a phone number and a recipe online all in one query. Like, “My grandmother sent me an email with a recipe in it, but she also told me there’s a recipe online that does a little better. Can you find that recipe for me, mix them together, and then message it to my grandmother so she has it?” That kind of thing would involve multiple Google products working together, and that would be really cool. Having the ability to bring your entire Google life into this one spot. But the wall that will come up is that Google just won’t be able to do it with third parties.
So if Google really wants something, it’s going to have to make it itself, or go on a buying spree and start acquiring companies left and right to bring that in. But Google is so huge that I think they have plenty of room to start. I for one would welcome it. I think it would be really cool to have this one spot where my entire digital life lives; to me that would be very useful in both my work and personal life. But whether that’s what Google’s actually going for, we’ll have to wait and see.
17:04 – Mishaal Rahman: Yeah, I’d imagine even if they do transform Gemini into the Google super app, you’d still need UIs for a lot of things. For example, if you want to create notes and lists, you have Google Tasks and Google Keep for that, right? How would they implement and integrate that into the Gemini app? Right now, to access your lists and tasks within Gemini, you basically have to explicitly ask it, and then it can pull up those lists. But say you want to edit each individual list, add images to them, or change a due date; a lot of that is cumbersome to do within the Gemini app. Potentially they could integrate that stuff directly, or have a way to link out to those dedicated surfaces. Kind of like how the Gmail app for a long time literally had Chat integrated into it; the Gmail app was basically the Google Chat app at the same time, and it also has Meet integrated. Gmail is basically the super app for not just Gmail but Chat, previously Hangouts, and currently Meet, right? What if they do something similar with Gemini, where it becomes the Google super app with all the Google apps integrated as tabs and interfaces within it? You open the Gemini app and your main experience is just this big search bar where you can ask it anything, and it connects to all the different Google services, including Google Maps, Google Home, and so on. You don’t need different apps; everything is in one app. I think that would be something really cool to have. And whether they’re able to integrate that with third-party services remains to be seen.
I know they’re trying to open it up and make it work better with third-party services, but like you said, Uber is not going to sit by and let that happen. They want people to use their app and their services and give them the data, not Google.
18:59 – C. Scott Brown: Yeah, and there are certain apps that I don’t think Google would be able to replace with Gemini, like Maps. It’s just way easier to open Maps, have a map of your current location, zoom in and out, find the nearest coffee shop or bar or whatever you’re looking for, hit the button, and say navigate. That would actually be faster and more efficient than pulling up Gemini and saying, “I’m looking for a nearby coffee shop. Please make sure it has a good chai latte and is within a 10-minute walking distance.” That’s going to be slower. It’s more efficient to be tactile with it. So it’s not going to kill any of these Google services, but being able to integrate it all into one space for when you do want them there, that’s going to be a big deal. The biggest thing, though, will be the apps integrating with each other. “Oh, I went to Cape Cod last year and I went to this great restaurant. I don’t remember what it was. Where did I go?” And Gemini says, “Here’s your Maps history,” and then, “Do you want me to text this?” And you say, “Yes, please text this restaurant name to my wife.” That kind of thing could be incredibly useful, and it kind of represents the future of AI as it was sold to us. Right now a lot of people are really anti-AI; I see a lot of anti-AI sentiment online, which is totally valid, and I can 100% see it. But at the same time, we were promised AI that would literally be like having a personal assistant, a human being standing next to you saying, “I will do whatever you want me to do. Just let me know what it is and I’ll do it.” That’s what we were sold, and we still don’t have it. This kind of represents that future we’ve been sold on.
Can Google pull it off? That’s always the big question. Can Google pull this off? I feel like we could end every podcast with, “can Google pull this off?”
21:02 – Mishaal Rahman: I did want to go back to that Google Maps example you made. Right now, instead of asking Gemini to “find me the nearest coffee shop that has a good chai latte,” a lot of people would prefer to open Maps, have the tactile feeling of typing in “coffee shop,” and check that there’s a good chai latte on the menu. That’s the way a lot of us prefer to do it, because that’s the way we’ve been doing it for a long time. But what if, in the future, people grow up just asking a chatbot for everything? They might prefer the first option, just asking the chatbot that query, and the old-school approach of looking at the map dies down. We’re seeing a generational shift like this right now: millennials like us grew up doing Google searches in a search bar for everything, while a lot of the younger generation literally do their searches on TikTok. They search a video app for the things we used to use Google Search for, so their concept of search is very different from ours. Maybe the next generation’s concept of search will be entirely AI-driven queries, with nothing to do with applications. Who knows what the future will hold in terms of how we search and interact with apps and services? Maybe it won’t be app-driven at all.
22:31 – C. Scott Brown: Totally possible, yeah. I think we’re very far away from that happening, but you’re right, that could be the trajectory we find ourselves on. And then the question becomes: is that what we want? There’s a phrase I can’t remember off the top of my head, but it’s basically about how everything gets so automated and so convenient that people forget how anything works. They wake up one day and realize they don’t actually understand how a car goes, because it’s all automated. And that’s happened: I don’t know how a car goes. I get into my car, I turn the ignition, and I’m off, but if somebody said, “Scott, explain how this happens,” I couldn’t. Obviously there are still plenty of people around the world who do understand that, but that’s the basic concept: we don’t understand how this works anymore. So is that where we want to go? Do we really want people asking these questions of AI and not knowing how to do it themselves? This is an extreme example, just because we brought up the Maps thing, but what if you handed someone a Maps app, not a physical map but a Maps app, told them to find the nearest coffee shop, and they literally didn’t know how to do it? That would be wild.
23:55 – Mishaal Rahman: Kind of like how a lot of people who rely on GPS navigation have no sense of direction anymore, because they’re wholly dependent on GPS. So yeah, I can see a lot of these important skills going away because everyone is so dependent on asking an AI chatbot for everything. They don’t even understand the concept of distance anymore, because they just say “nearest” and take whatever the AI chatbot says for granted.
24:20 – C. Scott Brown: Yeah, it’s certainly one of those things with all of AI right now. Like Jeff Goldblum: “We were so preoccupied with whether or not we could, we didn’t stop to think if we should.” There are a lot of things in AI I could pinpoint like that. But the core idea here, of Gemini linking with your entire Google life, I think is cool. Yes, it might create a slippery slope, but to me that’s no reason not to go for it, because it could be so helpful to so many different people for so many different reasons.
25:00 – Mishaal Rahman: There are definitely arguments to be made both for having an AI chatbot do things like looking up places on a map for you and for just doing it yourself in a Maps app. But for this next feature, I think there’s no argument: it’s a good thing to have as an AI-driven feature. It’s the news that Google is working on an AI chatbot for managing and searching through your Google Account settings. I don’t know if you’ve ever opened the Google Account settings, but there are so many different settings, layers upon layers of menus and submenus to look through, that I frequently just do a Google search for “How do I change this?” and let it lead me directly to the page. I never go through the account settings screen by itself, setting by setting; there’s just too much. So the rumor that Google is working on a Gemini-powered AI chatbot for the Google Account settings page is a big deal, I think. You can basically pull up this search bar and ask it, “How do I change my password?” and it’ll pull up the exact page you need to go to, or maybe even guide you through the process. We spotted evidence for this feature in a recent version of the Google Play Services application, and as you can see in the images, there’s a floating chat bar at the bottom of the Google Account management screen where you can type natural-language questions about whatever you want help changing in your settings. Of course it’s going to be powered by Gemini, but we don’t know when, or even if, this is going to roll out.
I think this is pretty unambiguously a good thing to have. There are potentially some concerns with how the feature is constructed, especially if it’s trained on the information in your account. Probably not; it’s probably just going to be trained on the kinds of features available in the Google Account settings. But I can see this being really, really helpful, especially for non-tech-savvy people, the elderly for example, who need to figure out how to change something in their Google account and might otherwise be led into getting assistance from some “helpful” third party who is actually just trying to phish them. So this would definitely be a big step up from having that happen to them. What do you think, Scott?
27:15 – C. Scott Brown: Yeah, no, I think this is pretty much a no-brainer. You and I both work at Android Authority, and tech is a huge part of our lives, and we can freely admit that the Google Account is crazy to try and navigate, even for simple things like changing your password or finding all the devices you’re logged into. I don’t like that I always have to look that up. Working at Android Authority, I log in with my Google Account on multiple devices all over the place, and every now and then I think, I should probably log out of all of those. I always have to Google how to find where those devices are so I can remove them. So for things like that, I think this is a totally no-brainer thing. I have one concern, though, which is Gemini’s ability to keep up. I’ve noticed that if I’m asking Gemini about something relatively recent, it will sometimes argue with me and say that it doesn’t exist. I remember in August, when the Pixel 10 series came out, I was crafting articles, making video scripts, and using Gemini as a soundboard for fleshing out ideas. And constantly, Gemini would say this phone is speculative, it hasn’t actually launched yet. I was like, it launched. It launched a week ago. So let’s say Google brings in a new feature for your Google Account, which it does all the time. You didn’t always have the ability to go into your Google Account and see dark web results and things like that; these are features Google has introduced over the years. So let’s say Google introduces a new feature on March 1st. Is Gemini going to know about it on March 2nd?
Or do you have to wait until June 2nd before Gemini accepts that this is a real feature? Obviously, Google controls Gemini and Google controls your Google Account, so it should be able to work that out, but that’s been my biggest complaint with Gemini as a whole: how slow it is to be updated with what’s going on in the world. So that would be my concern. I would go and clickety-clack and type in my command, and Gemini would argue with me and say that command doesn’t exist or that setting isn’t there. And I’d be like, “I know it is.”
29:40 – Mishaal Rahman: Speaking of hallucinations, as you mentioned, the training data cutoff is one problem. But another potential problem is if, in the future, they evolve this feature so that instead of just showing you where the setting is in your Google Account settings, it offers to change it for you. Say, for example, “Can you turn off location tracking?” or toggling the ability to save your account data, or having your account data deleted every three months; certain things you can toggle in your Google Account settings. If Gemini offers to do it for you, can you trust that it’s doing the right thing? Can you trust it when it says, “Okay, I took care of that, I changed it for you,” only for you to later find out it didn’t? You’d be pretty pissed off, because you trusted this chatbot built by Google to change a setting in your own Google Account and it didn’t do it. Can it be trusted to do that faithfully and consistently, all the time, for every user? So I think Google would have to properly train Gemini on everything the Google Account settings are capable of and keep it consistently updated on any new features that roll out. And from what I can tell, the Gemini we see in most Google services is still the generalized Gemini chatbot; it’s not a customized version for every Google service. It’s just, say, Gemini 3 Flash for everything, whether you’re talking to it in Google Docs or in Gmail. It’s still Gemini 3 Flash, probably with a different system prompt, but its training data is still the same generalized data, whatever Gemini 3 was trained on. So I can see there potentially being some problems.
Hopefully Google figures out a way to kind of have each chatbot be tailored and personalized in terms of what it’s trained on for each different Google service to avoid the issues that Scott and I just brought up.
31:37 – C. Scott Brown: Yeah, and you brought up a really good point, especially when it comes to deleting your data. Like, what if you tell Gemini, “every three months, delete the previous three months of data”? So in other words, you always have three months of data in your history, but anything past three months is gone. What if you try to explain that to Gemini and Gemini just erases all your data? And then all of a sudden, all your Google Account data is just gone. You know, like, that would be bad. That would not be good.
32:10 – Mishaal Rahman: Or especially the Maps location data. When that bug happened and people’s location history was wiped, people were pissed, because they had that location data saved for a decade or more.
32:28 – C. Scott Brown: Yeah. Yeah, wasn’t there a story recently where someone had their entire hard drive wiped by an agentic AI system or something?
32:35 – Mishaal Rahman: Yeah yeah. Happens all the time.
32:37 – C. Scott Brown: Yeah. That’s not good. That’s really scary, you know, especially with something as important as your Google Account data. So yeah, I can see there being a lot of issues here, but the core idea to me is a universal yes, like, please do this immediately. So I’m very excited to see this come to form. But maybe Google will just keep it simple. Maybe Google will just let you query Gemini and it gives you a link. It says, click this link, you get taken to the page, and from there you’re on your own. That might be the easiest, simplest, safest way to go, rather than what we’re describing, where Gemini is kind of doing the work for you. But yeah, I’m really excited for this, because like you’re saying, for people who are older or less tech savvy, this will be really helpful.
33:26 – Mishaal Rahman: I’m betting that when this does roll out, it’ll initially just look things up for you and point you in the right direction in terms of what page to go to for whatever you’re asking it about. But eventually it will probably start doing things for you, because, if you don’t know, the big hot buzzword in the field of AI this year is agentic AI. And this is one of those things where you ask an AI to change something for you; that’s one of the key ways agentic AI works. You ask it to do something and it does it for you. So I do think we’ll probably eventually have this feature start changing Google Account settings for you, but we’ll have to wait and see.
And we probably don’t have to wait long, because Google has been doing a lot of cooking this month, you know. Before they let everyone go for the holidays, they’re pumping out as much stuff as they can in terms of new Gemini features. Just in the past couple of weeks we got Nano Banana Pro, Gemini 3 Pro, and Gemini 3 Flash. We’ve gotten new updates to Gemini basically every single week, and I wouldn’t be surprised if everything we just talked about rolls out before the end of the year, because that’s how fast they’re moving.
34:34 – C. Scott Brown: Got to get it all done before everybody goes away for the holidays.
34:37 – Mishaal Rahman: Yeah. But if they do end up rolling all this stuff out, we unfortunately won’t be able to tell you about it on this podcast, because we won’t be back before the end of the year. As I mentioned, we’re not going to have an episode next week or the week after. But if these features do roll out, you can of course catch all those stories on androidauthority.com; you’ll definitely find out whenever any of them does roll out. But for now, that’s everything we’ve got for you this week. You can find links to all the stories mentioned in this episode down in the show notes, and you can find more amazing stories to read over on androidauthority.com.
35:08 – C. Scott Brown: Thanks for listening to the Authority Insights Podcast. We publish every week on YouTube, Spotify, and other podcast platforms. You can follow us everywhere on social media @AndroidAuthority and you can follow me personally on Instagram, Bluesky, and my own YouTube channel at @CScottBrown.
35:25 – Mishaal Rahman: As for me, I’m on most social media platforms posting day in and day out about Android. If you want to keep up with the latest in Android news, be sure to follow me on X, Threads, Mastodon, or Telegram at Mishaal Rahman. Thanks for listening.


