[Applause] Good morning and welcome to Build. Here we are in 2025, building out this open agentic web at scale. We're going from a few apps with vertically integrated stacks to a platform that enables an open, scalable agentic web.
More importantly, it's all about expanding that opportunity for developers across every layer. We have a bunch of new updates rolling out at Build, starting with Visual Studio. It is the most powerful IDE for .NET and C++, and we're making it even better: .NET 10 support, live preview at design time, improvements to Git tooling, a new debugger for cross-platform apps, and much more.
And when it comes to VS Code, just a couple of weeks ago we shipped our 100th release in the open. It included improved multi-window support and made it easier to view and stage changes directly from within the editor. And GitHub continues to be the home for developers.
GitHub Enterprise has tremendous momentum in the enterprise. And we're doubling down for developers building any application. Trust, security, compliance, auditability, and data residency are even more critical today.
As GitHub Copilot has evolved inside VS Code, AI has become central to how we code. And that's why we're open sourcing Copilot in VS Code. We're really excited about this.
Starting today, we will integrate these AI-powered capabilities directly into the core of VS Code, bringing them into the same open-source repo that powers the world's most loved dev tool. In fact, we're building app modernization right into agent mode. Copilot is now capable of upgrading frameworks, say from Java 8 to Java 21, or .NET 6 to .NET 9, and migrating any on-premises app to the cloud. And the next thing we're introducing is an autonomous agent for site reliability engineering, or SRE. The SRE agent automatically starts triaging, root-causing, and mitigating the issue, and then it logs the incident management report as a GitHub issue with all the repair items.
And from there, you can even assign the repair items to GitHub Copilot: a full coding agent built right into GitHub, taking Copilot from being a pair programmer to a peer programmer. You can assign issues to Copilot, whether bug fixes, new features, or code maintenance, and it'll complete these tasks autonomously.
And today, I'm super excited that it's now available to all of you. I don't think we've had an update of this level since Teams launched. It really brings together chat, search, notebooks, create, and agents all into one intuitive scaffolding. I always say this is the UI for AI. Chat, for example, is grounded on both web data and your work data, and that's the game changer, especially with pages. And search works across all of your applications, whether it's Confluence, Google Drive, Jira, or ServiceNow, not just M365 data.
With notebooks, I can now create these heterogeneous collections of data. In fact, I can have chats, pages, and any documents and emails all in that collection. And then I can get audio reviews or podcasts out of it.
I can use create to turn a PowerPoint into a new explainer video or generate an image. And when it comes to agents, we have a couple of special agents, like Researcher. It's been perhaps the biggest game changer for me, because it's synthesizing across both web and enterprise sources,
applying deep chain-of-thought reasoning to any topic or any project. Analyst goes from raw data across multiple source files; I can just upload a bunch of Excel files. It will get the insights, do forecasts, and do all the visualizations.
All of the agents you build can now show up in Teams and in Copilot. And you can ask questions, assign action items, or kick off a workflow just by at-mentioning an agent in a chat or meeting. And with the Teams AI library, building multiplayer agents is easier than ever.
It now supports MCP, and with just one line of code you can even enable A2A. You can add things like episodic or semantic memory by using Azure AI Search and a new retrieval system, which I'll talk about later. And as a developer, you can now publish, and this is the biggest thing: you can build an agent, publish it to the agent store, and have it discovered and distributed across both Copilot and Teams, giving you access to hundreds of millions of users and unlocking that opportunity. Today, we're introducing a new class of enterprise-grade agents you can build using models fine-tuned on your company's data, workflows, and style. We call it Copilot Tuning.
Copilot can now learn your company's unique tone and language, and soon it'll go even further, understanding all of your company's specific expertise and knowledge. All you need to do is seed the training environment with a small set of references and kick off a training run. The customized model inherits the permissions of all the source content.
And once integrated into the agent, it can be deployed to authorized users. Our new model router will automatically choose the best OpenAI model for the job. No more manual model selection.
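To make the routing idea concrete, here is a minimal toy sketch of what a model router does: inspect the request and pick a deployment. The model names and the keyword heuristic are entirely made up for illustration; the real router is a trained component, not a keyword match.

```python
# Toy model router: choose a model by estimated task complexity.
# Names and thresholds are hypothetical, for illustration only.

REASONING_HINTS = ("prove", "derive", "step by step", "plan", "debug")

def route(prompt: str) -> str:
    """Return a model name for the given prompt (toy heuristic)."""
    if any(h in prompt.lower() for h in REASONING_HINTS):
        return "reasoning-model"   # slower, better at multi-step tasks
    if len(prompt) < 200:
        return "small-fast-model"  # cheap default for short prompts
    return "general-model"         # balanced choice otherwise

print(route("Summarize this paragraph."))               # small-fast-model
print(route("Debug this failing test, step by step."))  # reasoning-model
```

The point of the sketch is the shape of the abstraction: callers send one request, and the platform, not the caller, decides which model serves it.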
Our approach today goes from having the apps or agents you build bind to only one model to truly becoming multi-model. That's why today we're thrilled to announce that Grok from xAI is coming to Azure. And when you have multiple models, you need a new capability in how you use them.
Now you can provision throughput once on Foundry and use all of that provisioned throughput across multiple models, including Grok. That's just a game changer in terms of how you think about models and model provisioning. The Foundry Agent Service lets you build declarative agents with just a few lines of code, right in the portal, and for complex workflows it supports multi-agent orchestration. I'm excited to share that the Agent Service is now generally available. We're making it straightforward, for example, to connect Foundry to your container app or functions, and to deploy any open-source model into AKS, whether in the cloud or in hybrid mode with Arc. And you can now take a model, fine-tune or post-train it in Foundry, and then drop it right into Copilot Studio, so you can use that post-trained model to automate a workflow or build an agent. This healthcare agent orchestrator that Stanford used is now available to everyone in Foundry.
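The idea of provisioning throughput once and drawing on it from any model can be sketched as a single shared token budget per time window. This is a toy illustration of the concept, not the actual Foundry quota mechanism; the numbers and model names are invented.

```python
# Toy sketch: one provisioned token pool shared by every model deployment.
import time

class SharedThroughput:
    def __init__(self, tokens_per_minute: int):
        self.capacity = tokens_per_minute
        self.used = 0
        self.window_start = time.monotonic()

    def try_consume(self, model: str, tokens: int) -> bool:
        """Charge a request from any model against the one shared pool."""
        if time.monotonic() - self.window_start >= 60:
            self.used, self.window_start = 0, time.monotonic()
        if self.used + tokens > self.capacity:
            return False               # over the provisioned limit
        self.used += tokens
        return True

pool = SharedThroughput(tokens_per_minute=1000)
print(pool.try_consume("grok-3", 600))   # True
print(pool.try_consume("gpt-4o", 600))   # False: the pool is shared
```

The design point is that capacity is a property of the pool, not of any one model, so adding a new model doesn't mean re-provisioning.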
It's pretty awesome. We now have new observability features coming to Foundry to help you monitor and manage AI in production. You can track impact, quality, safety, as well as cost, all in one place.
With Entra ID, agents now get their own identity, permissions, policies, and access controls. The agents you build in Foundry and Copilot Studio show up automatically in an agent directory in Entra. We're also partnering with ServiceNow and Workday to bring automated provisioning and management to their agents via Entra.
And when it comes to data governance, Purview now integrates with Foundry. So when you write an agent, Purview automatically ensures end-to-end data protection. That's another massive safety consideration.
And on the security side, Defender now integrates with Foundry. That means your agents are protected by Defender just like an endpoint would be, from threats like wallet abuse or credential theft. Now, we want to bring the power of this app-building capability to the edge and to clients as well, with Foundry Local, which we're announcing today.
It includes a fast, high-performance runtime, models, agents as a service, and a CLI for local app development. And yes, it's fully supported on Windows and the Mac. We're also excited to announce the Windows AI Foundry.
Windows AI Foundry is in fact what we used ourselves internally to build our own features and SDKs, and now we're extending this platform to support the full dev lifecycle, not just on Copilot+ PCs but across CPUs, GPUs, NPUs, and in the cloud. So you can build your application and have it run across all of that silicon. And Foundry Local is built into Windows AI Foundry.
So you can tap into a rich catalog of pre-optimized open-source models that you can run locally on your device. We're also announcing native support for MCP in Windows. Windows will now include several built-in MCP servers, like file system, settings, app actions, as well as windowing.
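For a sense of what those built-in servers speak, here is a minimal sketch of the shape of an MCP server. MCP frames its messages as JSON-RPC 2.0, and `tools/list` is how a client discovers what a server offers. This toy dispatcher handles only that one method; a real server, like the file system or settings servers above, implements the full protocol.

```python
# Toy MCP-style server: answer a JSON-RPC 2.0 "tools/list" request.
# The tool names here are invented stand-ins, not the actual Windows servers.
import json

TOOLS = [
    {"name": "read_file", "description": "Read a file from disk"},
    {"name": "set_wallpaper", "description": "Change a Windows setting"},
]

def handle(raw: str) -> str:
    req = json.loads(raw)
    if req.get("method") == "tools/list":
        result = {"tools": TOOLS}
    else:
        result = {"error": "method not supported in this sketch"}
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"), "result": result})

request = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
response = json.loads(handle(request))
print([t["name"] for t in response["result"]["tools"]])
# ['read_file', 'set_wallpaper']
```

Discovery-by-listing is what makes a registry possible: a client can enumerate any vetted server's tools the same way, without server-specific code.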
And we're adding a native MCP registry that lets MCP-compatible clients discover the secure MCP servers that have been vetted by us for security and performance, all while keeping you in control. We first announced Bash on Ubuntu on Windows nearly 10 years ago. It subsequently became what we call today WSL.
Today, we're making WSL fully open source. And we're also announcing NLWeb today, and you all should go check out the code in the GitHub repo. It's a way for anyone who already has a website or an API to very easily make that website or API an agentic application.
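The NLWeb idea, putting a natural-language front end on an API a site already has, can be sketched in miniature. The "recipe API" below is a plain dict standing in for a real backend, and the keyword lookup stands in for the LLM-plus-structured-data machinery real NLWeb uses; only the shape of the interaction is the point.

```python
# Toy sketch of the NLWeb pattern: answer a natural-language question
# from an existing site API. All names and data here are invented.

RECIPE_API = {  # stand-in for an existing site's API
    "pasta": {"title": "Weeknight pasta", "minutes": 20},
    "curry": {"title": "Chickpea curry", "minutes": 35},
}

def ask(question: str) -> dict:
    """Map a natural-language question onto the existing API's data."""
    for key, recipe in RECIPE_API.items():
        if key in question.lower():
            return {"answer": f"{recipe['title']} takes {recipe['minutes']} minutes.",
                    "source": key}
    return {"answer": "No matching recipe found.", "source": None}

print(ask("How long does the pasta recipe take?"))
```

The key property is that the site's data and API stay as they are; the agentic layer is additive, sitting in front of what already exists.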
We're integrating Cosmos DB directly into Foundry. That means any agent can store and retrieve things like conversational history. And soon they'll also be able to use Cosmos for all their RAG application needs.
And we're taking it further with Azure Databricks, connecting your data in Genie spaces or in AI functions to Foundry. The other very cool capability: inside a PostgreSQL query, you can now have LLM responses directly integrated. We're bringing Cosmos DB to Fabric too,
because AI apps need more than just structured data. They need semi-structured data, whether it's text, images, or audio. With Cosmos in Fabric, your data is instantly available alongside SQL, so you can unify your entire data estate and make it ready for AI.
And there's a lot more. In fact, we're even building our digital twin builder right into Fabric. Now you can very easily build digital twins with no code or low code.
As you can see here, you can map the data from your physical assets and systems super fast. We're also announcing shortcut transformations in OneLake. You can think of this as AI-driven ETL.
You can apply all these pre-built AI-powered transformations, like audio-to-text, sentiment analysis, or summarization, as data is coming in, all powered by Foundry, straight into Fabric. And in fact, the largest GB200-based supercomputer is going to be on Azure. We're very excited about scaling this and making it available to all of you as developers.
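The transformation pipeline described above can be sketched as a list of pre-built enrichment steps applied to each row on ingest. The sentiment and summarize functions here are trivial stubs standing in for Foundry-powered model calls; the pipeline shape, not the stub logic, is what's being illustrated.

```python
# Toy sketch of AI-driven ETL: enrich rows with pre-built transforms
# as they arrive. The transform bodies are invented stand-ins.

def sentiment(text: str) -> str:      # stand-in for a model call
    return "positive" if "great" in text.lower() else "neutral"

def summarize(text: str) -> str:      # stand-in for a model call
    return text.split(".")[0] + "."

TRANSFORMS = [("sentiment", sentiment), ("summary", summarize)]

def ingest(row: dict) -> dict:
    """Apply each pre-built transform to the row's text on the way in."""
    enriched = dict(row)
    for name, fn in TRANSFORMS:
        enriched[name] = fn(row["text"])
    return enriched

print(ingest({"id": 1, "text": "Great launch. Lots of detail."}))
```

Because the transforms are declared as a list, adding another one (say, translation) is a one-line change to the pipeline rather than new plumbing.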
We're bringing together the entire stack I talked about today and applying it to science, to the scientific workflow and the scientific process. That's our ambition with Microsoft Discovery, which we are announcing today. It understands the nuanced knowledge in the scientific domain, from the public domain as well as your own data if you're a biopharma company. Discovery is built on Foundry, bringing advanced agents highly specialized in R&D, not just for reasoning but for conducting research itself. It's great to see how companies across every industry are already using Discovery to accelerate their R&D, and I can't wait to see it in the hands of more R&D labs all over, and what they can do.
So that was a quick, comprehensive, whatever you want to call it, walk through the full stack and how we're creating new opportunity for you across the agentic web. We're taking a real systems approach, a platform approach, which is what you can expect from Microsoft, across every layer of the stack.