2025 AI: 10 Things Coming In 2025 (A.I In 2025 Major Predictions)

16.45k views · 6512 words
TheAIGRID
Join my AI Academy - https://www.skool.com/postagiprepardness šŸ¤ Follow Me on Twitter https://twitt...
Video Transcript:
In this video I've scoured through everything I could personally find that I know is coming in 2025, so let's not waste any time and jump into exactly what is going to happen in AI. One of the first things is, of course, agents. OpenAI, one of the biggest companies in the world in terms of AI, is going to launch an AI agent in 2025 called Operator, an agent that can use a computer to take actions on a person's behalf, such as writing code or booking travel, according to people familiar with the matter. This was leaked via The Information, which is a very reputable source in AI, so it's quite likely that we will get this in 2025. Now that we know OpenAI is going to produce this agent, we can expect something of high quality; many other companies have already rushed out AI agents, but OpenAI is likely waiting on something that will blow us away. If you're wondering about a timeline for this agent, I made a video when this information was released, but the CliffsNotes version is that in a staff meeting that Wednesday, OpenAI basically announced this is going to happen in January. In the AI industry timelines do differ due to a variety of factors, but essentially we could be experiencing this in just a month, so I won't be surprised if next month we get some kind of research preview that OpenAI demos to show us what an agent is capable of doing. It could also be February, since AI agents have been a bit tricky and we haven't got them to work completely reliably. If you're wondering what other kinds of agents OpenAI is going to release in 2025, they are also working on several related agent research projects, and another one is not just one that works on your computer but one that
executes tasks in your web browser, so this is going to be something that lives in your browser and can do things for you, and I've already seen examples of web-browser agents that I'm going to cover in a moment. Sam Altman has actually spoken about agents in 2025 and why he believes they will change everything: "Agents are the thing everyone is talking about, I think for good reason. This idea that you can give an AI system a pretty complicated task, the kind of task you'd give to a very smart human, that takes a while to go off and do, and uses a bunch of tools, and creates something of value, that's the kind of thing I'd expect next year, and that's a huge deal. If that works as well as we hope it does, that can really transform things." He also gives a very interesting example, not really a demo, of the sort of thing we're going to get with AI agents in this extended talk on his channel: "Maybe I can give the following example. When people talk about an AI agent acting on their behalf, the main example they seem to give fairly consistently is that you can ask the agent to go book you a restaurant reservation, and either it can use OpenTable or it can call the restaurant, or whatever. And okay, sure, that's a mildly annoying thing to have to do, and it maybe saves you some work. One of the things I think is interesting is a world where you can just do things that you wouldn't or couldn't do as a human. What if, instead of calling one restaurant to make a reservation, my agent would call, like, 300 and figure out which one had the best food for me, or some special thing available? You would say, well, that's really annoying if your agent is calling 300 restaurants, but not if it's an agent answering each of
those 300 places; then it's no problem, and it can be this massively parallel thing that a human can't do. That's a trivial example, but there are these limitations of human bandwidth that maybe these agents won't have. The category I think is more interesting, though, is not the one people normally talk about, where you have this thing calling restaurants for you, but something that's more like a really smart senior coworker that you can collaborate with on a project, where the agent can go do a two-day or two-week task really well, ping you when it has questions, and come back to you with a great work product." Now, about that voice agent he was talking about: OpenAI actually did a very short demo. This is something you probably didn't see, because I didn't even cover it and no other channel really has, since it was a private demo, but it showcases what OpenAI's agents are going to be doing in 2025: they're going to be able to call different stores and do things on your behalf. You can see right here that you can say, hey, can you call these stores, and this assistant can do many different things for you: "Okay, calling now." "Hello, is this the tech store? I'm calling on behalf of a customer." "Yes, it is, how can I help you?" "Great, I'm calling to check if you have a OnePlus 12 in flowy emerald in stock; our customer Kia is interested in picking it up today." "Oh, actually we do have the OnePlus 12, but we don't have that color, I'm sorry." "Ah, thank you, and what time do you close today? Kia will likely come by later to pick up the phone if she's happy with the color." "Yeah, I mean, we don't have emerald, but we do have silver and black, so I guess she can come by; we close at 7 p.m."
"Perfect, thank you. I'll let Kia know that you have silver and black available and that she can come by before 7 p.m. Thanks for your help and have a good day." "Thank you." Now, they aren't the only ones doing this; Google has actually showcased their customer agent demo, and I suspect this will likely roll out in the later part of 2025: "We just launched a customer agent, and it leverages Gemini and vector search to deliver a seamless shopping experience. 'What can we help you find?' Well, I'd like that shirt, but I have a few other specifications as well: find me a checkered shirt like the keyboard player is wearing, I'd like to see prices, where to buy it, and how soon I can be wearing it, and include the video. The customer agent is using Gemini's multimodal reasoning to analyze the text and video to identify exactly what I'm looking for, then Gemini turns it into a searchable format. How cool is this? It found the checkered shirt I'm looking for, and some other great options, in no time, and that's because these results harness Google's trusted search technologies, which ensure customers like me get the right results in record time. The suggested products are grounded in Cymbal Fashion's inventory and historical performance data to make sure customers leave happy and with that purchase in hand. The first one is perfect; it looks like I can have it delivered in 4 days or pick it up nearby today. Like I said, I want to be wearing it tomorrow night, so I'm going to go with the local store to be safe. Of course, it never fails: they only have three left in my size. I don't want to miss out on wearing this shirt, so I'm going to give the store a call and ask them to set it aside for me. But first, let me tell you what's happening behind the scenes: Cymbal Fashion's customer agent is using Google Cloud's full suite of AI capabilities to offer customized support interactions, facilitate transactions like purchases and returns, and ensure that I'm receiving the most up-to-date information in real time." I'm so close to having this
shirt for the concert, so let's give the store a call: "Hi there, this is the Cymbal Fashion customer agent at South Las Vegas Boulevard, am I speaking with Amanda?" "Yes, this is Amanda." "Great, thanks for reaching out, Amanda. I see you had a session on another device; I've sent you an SMS message with a link to our live chat companion, so if you would like to switch to chat, please click the link. How can I help you today?" "I'd like to purchase the shirt in my cart with the card I have on file." "Absolutely. I see you're also a Cymbal Fashion Rewards member, and it looks like you have a 20% off voucher available; would you like to apply it to this purchase?" "Yes please, that would be great." "The shirt you're purchasing goes well with these items, also available for pickup in your preferred size; would any of these interest you?" "Absolutely, please add the white shirt and the boots to my cart." "Great, your total is $23.76, okay to proceed with the card on file?" "Yes." "Your purchase is confirmed, do you need anything else today?" "No, I'm all set, thank you." Incredible. Now, when we take a look at browser agents, Google released this thing called Deep Research, and it's absolutely insane; people are still sleeping on this incredible AI tool. It's called Gemini Deep Research, and essentially it allows an AI agent to browse millions of different websites for you and create a full research report. This is something I've been using every single day, and it is incredibly underrated, but I do suspect that in the future, research is going to be done by AI agents that can browse millions of different websites, and then you come back to your computer and get a ton of information that you can easily sift through or simply sort however you want. This is something I'm currently using. Now, if you want to know exactly the kinds of AI agents that are going to be there, and the other kinds
of AI agents: Google released something called Project Mariner. Project Mariner is essentially a prototype experiment, and in this video you're going to see exactly what it does, because it's absolutely insane: basically you have a one-click tool, it says Project Mariner, and you can do a variety of different tasks in your browser. I'll let Google take it away for the next two minutes and then we can get on to the next point: "Today I want to tell you about Project Mariner. It's a research prototype exploring the future of human-agent interaction, and it's built on Gemini 2.0."
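The deep-research pattern described above reduces to a simple loop: fan out over many sources, summarize each one, and compile a single report. This is a minimal sketch of that loop, not Google's implementation; `fetch` and `summarize` are stubs standing in for real web retrieval and an LLM call, and the URLs and page text are made up.

```python
# Sketch of a deep-research agent loop: visit each source, summarize
# it, and compile one report. fetch() and summarize() are stubs
# standing in for real web retrieval and an LLM summarizer.

def fetch(url):
    # Stub corpus; a real agent would download and parse the page here.
    corpus = {
        "site-a.example": "Humanoid robots moved into pilot deployments.",
        "site-b.example": "Agent frameworks went from demos to products.",
    }
    return corpus.get(url, "")

def summarize(text):
    # Stub summarizer; a real agent would call an LLM here.
    return text.split(".")[0].strip()

def research(question, sources):
    findings = [f"- {summarize(page)} ({url})"
                for url in sources
                if (page := fetch(url))]
    return "\n".join([f"Report: {question}"] + findings)

report = research("AI trends for 2025", ["site-a.example", "site-b.example"])
print(report)
```

The same skeleton covers the Project Mariner task shown next (read a list, visit each site, record one extracted fact per site); only the `fetch` and `summarize` steps change.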
You know, like with all new technology, it's important for us to build this responsibly, which is why we're starting small. We'll be getting feedback from a group of trusted testers and using their experiences to shape how Project Mariner evolves. Let me show you how it works. Project Mariner works in the browser as an experimental Chrome extension. I'm going to start by entering a prompt: here I have a list of outdoor companies in Google Sheets, and I want to find their contact information, so I'll ask the agent to take this list of companies, find their websites, and look up a contact email I can use to reach them. This is a simplified example of a tedious multi-step task that someone could encounter at work. Now the agent has read the Google Sheet and knows the company names; it starts by searching Google for Benchmark Climbing, and now it's going to click into the website. You can see how this research prototype only works in your active tab; it doesn't work in the background. Once it finds the email address, it remembers it and moves on to the next company. At any point in this process you can stop the agent or hit pause. What's cool is that you can actually see the agent's reasoning in the user interface, so you can better understand what it is doing, and it will do the same thing for the next two companies, navigating your browser, clicking links, scrolling, and recording information as it goes. You're seeing an early-stage research prototype, so we sped this up for demo purposes; we're working with trusted testers to make it faster and smoother, and it's so important to keep a human in the loop. After the fourth website, the agent has completed its task and lists out the email addresses for me to use. And there you have it: we're really just scratching the surface of what's possible when you bring agentic AI to computers, and we're really excited to see where this goes next." Now, there was also a very interesting demo that not too much was revealed about, but this
was the Runner H agent. This is the kind of agent that can scan the web and do a variety of different tasks for you, and it's really interesting because it is currently state-of-the-art at the things it can do, so I will be intrigued to see how the other labs manage to outperform it, because they just came out of stealth and dropped a bomb. You can currently sign up for the waitlist, which I have, so I'm going to be pretty excited to see if this manages to compete with any of the competing labs. I also want to see how it does with the proprietary models it uses; according to my research it currently uses, I think, a 2B model and a 3B model, so it's going to be fascinating to see how they managed to execute on this. It can scan a website and do a billion different things, so internet use is going to get really powerful. Another thing that I think is going to do really well is this kind of workflow agent. Workflow agents are essentially where you connect a number of different apps together, and this is one that I built to get business details: you can see it runs through a few APIs and uses Claude to store all those business details in a nicely organized table. This is basically something I'm building for my community, because someone wanted an automated way to get leads, such as websites, with all the contact info gathered automatically, just an automated way to grow your business and find leads. I feel like stuff like that is going to become even more commonplace in 2025. Of course, if you want something like that, don't forget to check out the agent workflow section of my community, where you can see I've got ready-to-use agent workflows to automate your income streams and scale your
business. Then, of course, we have physical AI. Physical AI will be one of the biggest themes for 2025, because many companies are going to be pushing the boundaries of making these robots super humanlike and realistic. The reason is that a variety of advances now make it easier to develop humanoid robots: you can make them a lot more realistic, there are a lot more environments you can test things in, and we're likely going to see a real increase in the number of humanoids around, doing a lot of human work. If you aren't familiar with the Boston Dynamics demo, I don't want to say it broke the internet, but I don't think people truly realized what it meant for society. I've already seen several companies and countries showing their humanoid robots, but this was one of the first where we actually got to see a fully humanoid robot doing something autonomously, and not only was it doing things autonomously, it was even sometimes making its own mistakes, correcting them, and continuing. This is of course what many companies are trying to do; companies like Tesla and Figure are really trying to expand the workforce by building automated workers, and these kinds of robots are going to be the kind of employees whose numbers we're going to see increase every year. It's going to be really interesting in 2025, because it's quite likely that some other companies are going to come out of the woodwork and show us the humanoids they've been working on, and it's quite likely that we're going to see more and more extended demos of these robots performing certain tasks and
of course doing them really effectively, like what we see here: the robot makes a mistake, it's like, wait a minute, what just happened, and then it fixes it. This is also a completely electric robot, 100% electric, which means it's better for the environment and a lot more lightweight; it isn't bulky and hydraulic like the previous version of Atlas, which is what we had before. Something like this is going to be super interesting, because I think once we get these robots at sheer scale, able to do a lot of human tasks, it's really going to start to boost the economy, and I think certain things will change. I don't think we'll immediately get robots completely running factories; there will definitely be some working in factories, but I don't think the full scale comes until later years. I do expect some really impressive demos, though, and one of the companies that managed to do something incredible is this company right here: 1X Robotics. They've been able to make a robot that is remarkably effective at humanlike motions; it's really smooth and really effective, and a lot of people have seen it. We've seen them testing their robots, and we do know that some major updates are coming in 2025. And the robot AI startup Physical Intelligence actually raised $400 million from Bezos and OpenAI, because they built the first foundation model for robotics, called pi-zero, a generalist policy that can do things autonomously. These robots literally take two arms and do a variety of different things completely autonomously, which is pretty crazy when we take a look at what it's able
to do: it can fold clothes, pack a cardboard box, and do a variety of different things that are really incredible. Honestly, looking at it, you wouldn't think this is autonomous; you would think it's teleoperated, but no, this is just a generalist policy that is automating this, completely autonomously. Previously, when people said, ah, this demo is not autonomous, why do we even care, well, this actually is autonomous, so it's something you should probably be paying attention to, because this is where the real innovation is coming from and where things are really starting to change. This is one of those things where, like I said before, most people aren't paying attention but need to be, and it's quite likely that future iterations of their generalist policy, like a pi-2 or pi-3, are going to be rapid improvements. When I watched this, I made an entire video on it, and I was quite surprised that this was the first time we'd seen it; I don't even think the video got more than 20,000 views. Later in the year we started to see more robotics demos, because their policies actually do generalize to other robots, which is what we've seen, so I might include some of those demos here. It's quite likely that robots are going to get a huge boost, and that's also because of Genesis, the robot simulator, which essentially simulates reality, not actual reality, but a simulation of it. Because it can simulate reality really accurately, a robot can be trained in that simulated reality, and then when it's brought into the real world it's going to perform a lot more effectively. That is why I think this
Genesis thing is going to speed everything up tenfold, and that is why I think that in 2025 we're going to have tons of robotics demos that are super surprising and will probably blow some people's minds. We've already seen robots work really effectively, and we've already seen robots be completely autonomous and do tasks that humans would do, so it wouldn't be surprising if we now get demos that are very effective. And here we have the head of Microsoft AI talking about how we are going to be getting infinite memory, which is essentially where these models are never going to forget anything, and this brings up a very interesting debate: if a model, an LLM, an AI system never forgets anything you tell it, why would you switch to another AI system? So I think companies are rushing to get this unlimited-memory kind of thing out, and it's going to be really important, but he literally says here that this will come online in 2025: "Memory is the critical piece, because today, every time you go to your AI, you have a new session, and it has a little bit of memory for what you talked about last time, or maybe the time before, but because it doesn't remember the session five or ten times ago, it's quite a frustrating experience. You don't go and invest deeply and really share a lot and build on what you've talked about previously, because you know it's going to forget, so you sort of tap out after a while and it turns into a shallower experience. But we have prototypes that we've been working on that have near-infinite memory, and so it just doesn't forget, which is truly transformative. You talk about inflection points; memory is clearly an inflection point, because it means it's worth you investing the time, because everything that you say to it you're going to get back in a useful way in
the future. You will be supported, you will be advised, and in time it will take care of planning your day and organizing how you live your life. So that capability alone, which I expect to come online in 2025, is going to be truly transformative." And for those of you who think that might just be one clip, he actually said this twice; in another interview he said memory is going to be done, that they will nail memory in 2025: "I'm really confident 2025, memory is done, permanent memory. I mean, if you think about it, we already have memory on the web; we retrieve from the web all the time." Interestingly, we also have an OpenAI employee, a member of technical staff, I'm not entirely sure, as they are quite shrouded in secrecy, but we can see right here that they've posted "soon", because someone actually posted, "$200 a month and no increase in user memory is whack, what's the outlook on that?", and you can see he says "soon" with this infinity icon right here. So it's quite likely that we're going to get infinite memory very soon; it's just a matter of when they decide to roll it out. Now, if we take a look as well, there are going to be more thinking models. Thinking models are models that are a lot smarter, and for most people you won't actually realize this at all, you won't really know what's going on, and I say this because currently the o1 models, from what I've seen, are only really applicable to those in some form of really extreme academia, like mathematics, physics, science, or coding; anyone at that level is going to get value from them. So for those of you who aren't in an area where you're going to be using these thinking models, this is going to be something that isn't of value to you, but it is essentially
the new paradigm in 2025, simply because the GPT series is slowing down a little bit, and because it's slowing down, they're finding new ways to scale these models and make them smarter, which they are doing with the o1 series of thinking models. You can see right here that Google is adding their weight when it comes to thinking models; they've introduced Gemini 2.0 Flash Thinking, an experimental model that explicitly shows its thoughts. This is really cool, because it's a little bit smarter than the other models: rather than responding instantly, it thinks through the problem, searches through its reasoning, and then comes up with a response. I actually tested this myself; I asked the model how to solve the Riemann hypothesis in 2025, hypothetically of course, since I don't think that's going to happen in 2025, and you can see that if you open it up, you can see exactly what the model is thinking and how it got to its response. I personally think that in 2025 this is going to be valuable, because sometimes when you input a prompt and see that the model makes a mistake, you can look back at the steps and trace where it went wrong in the output, which is really useful. Now, when we look at how the iteration cycles are going to come, we also need to look at the fact that there is going to be an incredible level of progress. You can see right here that this person at OpenAI talks about how o3 is very performant, and the progress from o1 to o3 took only three months, meaning that from o1 to o3, essentially their second iteration of the model, they only needed three months, which shows you how fast the progress is going to be
in the new paradigm of chain-of-thought scaling of inference compute, which is way faster than the pre-training paradigm of a new model every one to two years. Previously they would have to spend so much time pre-training, so much time gathering data, so much time analyzing the data, and then fine-tuning it to specific domains; now all of that takes three months, because this is simply a faster way to scale. This is all explained here: "We'll actually see models improve faster in the next six months to a year than we saw them improve in the last year, because there's this new axis of synthetic data generation, and in the amount of compute we can throw at it, we're still right here on the scaling law; we haven't pushed it to billions of dollars spent on synthetic data generation, functional verification, reasoning training. We've only spent millions, tens of millions of dollars, so what happens when we scale that up? There's a new axis of spending money, and then there's test-time compute as well, spending time at inference to get better and better. So it's possible, and in fact many people at these labs believe, that the next year of gains, or the next six months of gains, will be faster, because they've unlocked this new axis through a new methodology. And it's still scale, because this requires stupendous amounts of compute: you're generating so much more data than exists on the web, and then you're throwing away most of it, but you're generating so much data that you have to run the model constantly." And you can see the previous state-of-the-art when we look at the benchmark that is FrontierMath: Epoch AI's FrontierMath had models at 2%, and then we can see the second iteration of the model, and I know it says o1 to o3, which might confuse some people into thinking that o1 is the first model and o3 is
the third model; it's just a naming thing, and o3 is actually the second model, which is confusing, but just trust me. You can see right here that this is where we actually get that result from, and I think this is really interesting, because this model was able to get 25% on a test it hadn't really seen before, which is one of the first times we have seen that, and it's the first indication that we're really on a new level when it comes to AI. So this is something I think is going to be really interesting, and I'm quite excited to see how it changes everything. Now, I also think this is going to happen: AI movies. Previously we've seen some incredible progress in the rate of production and the quality of video clips from AI models. This clip right here, from Jason Sahara, is absolutely incredible; every shot was done via text-to-video with Google's Veo 2. If you haven't seen Veo 2, it's honestly incredible. This guy basically made an AI short film: he generated a bunch of different clips, stitched them together, and of course choreographed the short film, but I want to talk about how good this is compared to previous AI models. We do know that this is just the best you can get right now, and as the old saying goes in AI, this is the worst it will ever be. So if this is the worst AI video will ever be, we can imagine what it's going to look like at the end of 2025, when there are going to be vast improvements; it's quite likely we're going to get 1080p and 4K resolution, a lot more consistency, things not morphing at all. Right now it already looks absolutely incredible, and the quality probably doesn't even look that good here because I downloaded this clip, so when you watch the original, which is linked in the description, you're going to see just
how good this thing really is, so that is something you need to check out as well. Now, of course, there is also going to be AI hardware. In late 2025 I think we're going to get an explosion in AI hardware, because if agents manage to work, then AI hardware will manage to work. We've already seen that Jony Ive has confirmed he's working on a new device with OpenAI; the confirmation came in a major New York Times profile about what he's been up to since leaving Apple. So one of Apple's key designers has left to make something himself, and now he's working with OpenAI and building some kind of AI product. I will be intrigued to see exactly what he is doing; since he's working with OpenAI, it's going to be some kind of physical AI product, whatever it turns out to be. This is going to be something that takes off once agents are worked out. I'm not entirely sure how long it will take to get agents running smoothly; there are a few kinks that some have said could take until 2026, but I do think it's quite likely we will get AI hardware products, and not just from OpenAI: we've also seen that Amazon's new Alexa voice assistant will use Claude. Currently we haven't had any major updates just yet, but when this thing does come out, it's quite likely we'll have a rather effective model on our hands. Amazon Alexa has taken kind of a backseat in AI, even though it was one of the first consumer AI products people had in their homes, kind of like the advanced voice mode of its decade, but now that
is going to be using Claude AI, it's going to be a powerhouse; they're literally calling it "Remarkable Alexa" here, and it was expected to launch in October and require a subscription fee, but we haven't actually heard anything on that just yet, so I will be intrigued by more voice assistants. I do think they will become commonplace, I'm not sure exactly why yet, but I do expect that in 2025 AI hardware might be a trend that some of these companies take advantage of. And I actually recently looked at this website called friend.com, and I think that AI is going to become a little bit more human. One of the things I think as well is that right now AI is, I guess you could say, a bit weird, in the sense that even when you talk to advanced voice mode, sometimes it's not human enough, and I can't believe I'm saying that, because it sounds remarkably human, but sometimes you'll know that it's a robot. Imagine an AI that calls you and messages you and has its own quirks and problems and things like that; I think that is going to be a theme moving into late 2025, and this is one of the devices: this is a Friend.
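A companion like this only works if it remembers you, which ties back to the "infinite memory" prediction earlier in the video: persist everything a user says, retrieve the relevant pieces later. Here's a toy keyword-overlap sketch of that idea; real systems would use embeddings and a vector store, and all the names and notes below are made up for illustration.

```python
# Toy version of the "never forgets" memory idea: persist every user
# note and retrieve the most relevant ones by keyword overlap. Real
# systems would use embeddings and a vector database; this is only an
# illustrative sketch.

class MemoryStore:
    def __init__(self):
        self.notes = []

    def remember(self, text):
        self.notes.append(text)

    def recall(self, query, k=3):
        query_words = set(query.lower().split())
        scored = sorted(
            self.notes,
            key=lambda note: len(query_words & set(note.lower().split())),
            reverse=True,
        )
        return scored[:k]

mem = MemoryStore()
mem.remember("User prefers the silver OnePlus 12.")
mem.remember("User is planning a concert outfit for tomorrow night.")
mem.remember("User asked about humanoid robots.")

best = mem.recall("which phone does the user prefer", k=1)
print(best[0])
```

The point of the sketch is the product argument from the Microsoft AI clip: once every interaction is stored and retrievable, the assistant compounds in value the longer you use it, which is exactly why switching away becomes unattractive.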
Copyright Ā© 2025. Made with ā™„ in London by YTScribe.com