I'm Buying These 3 Stocks to Get Rich (Without Getting Lucky)

Ticker Symbol: YOU
Access some of the best late-stage AI companies BEFORE THEY IPO with the Fundrise Innovation Fund: h...
Video Transcript:
Artificial intelligence has a huge problem: a single ChatGPT query can take up to 10 times the electricity of a Google search, and while power demand from data centers has already doubled over the last 5 years, Goldman Sachs predicts that it'll grow by another 160% by 2030. So in this episode, I'll highlight a few companies that are tackling this exact problem, positioning them to win big no matter which AI companies come out on top, making them a great way to get rich without getting lucky. Your time is valuable, so let's get right into it.

First things first: I'm not here to waste your time, so here's exactly what I'm going to cover. I'll go over the big power problem that AI is causing right now. Then I'll talk about Vertiv, a company that provides power and thermal management solutions for data centers; Broadcom, which designs custom, power-efficient AI chips for tech giants like Google and Meta Platforms; and Marvell, a company that also makes efficient AI chips and switches for data centers.

It's important to understand just how much power AI is projected to consume over the next few years, so let me break that down first. A single ChatGPT prompt costs 2.9 watt-hours; that's like keeping a 5W LED bulb on for a little over half an hour. Using AI to generate a single image can cost as much electricity as charging your phone. Compare that to a Google search, which also takes in a query and returns text and images, but only uses around 0.3 watt-hours in the process. A couple of watt-hours may not seem like much, but there are roughly 9 billion Google searches every day, and if we moved them all to generative AI, it would take on the order of 10 terawatt-hours more to serve all of those requests. That's enough electricity to power almost a million homes for an entire year. But this is actually a bad comparison, because people don't use generative AI tools the same way they use Google. For example, ChatGPT tends to be more of a dialogue between the user and an AI model, and that can really rack up the number of queries compared to a Google search.

And that's just the inference side of the story. Training and retraining large AI models is very energy intensive too, especially when we're talking about trillion-parameter models. For example, GPT-4 took over 50 gigawatt-hours to train, or about 0.2% of the electricity generated by the entire state of California over a year.

As an investor, this worries me for three reasons. First, the amount of compute needed to train AI models has been doubling roughly every 6 months; talk about exponential growth. Second, that gets multiplied by the number of foundation models being trained, which is also growing exponentially. And third, even though you can use ChatGPT almost anywhere in the world, it consumes energy only at the server's location. Energy accounts for up to 70% of a data center's total cost of operations, so the hardware and the racks, how the data center facility is designed and laid out, and even the age of its local power grid all really matter. And by the way, the average age of the US power grid is around 40 years, with over a quarter of the grid being 50 years old or older.

AI only makes this problem worse. For example, Nvidia's previous-generation A100 GPUs use about 400 watts, but the current generation of Hopper GPUs runs at 700 watts. That's almost twice the power, and four or five times the power of CPU-based servers. It's worth noting that the H100 GPUs are up to nine times faster at AI training and 30 times faster at inference than the A100s, so the power efficiency of Nvidia's GPUs is going way up with every generation, but power demand is going up faster. So it takes more than just high-performance GPUs to solve this problem.

Let's start with cooling, since that accounts for up to 40% of a data center's energy use, which means 28% of the total cost of operations. Vertiv Holdings (ticker symbol VRT) provides power and thermal management solutions for data centers, like their Liebert liquid cooling systems. These systems are built specifically for high-density deployments like the ones that power intense AI applications, providing cold plates and direct-to-chip cooling in a way that integrates with existing data center infrastructures, which is a big deal because around 90% of all server racks are air cooled today.
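Those back-of-the-envelope figures are easy to check. Here's a quick sketch in Python using the numbers quoted above; the per-household consumption (~10,500 kWh/year) is my own rough assumption for an average US home, not a figure from the video:

```python
# Sanity-checking the energy figures quoted above.
CHATGPT_WH_PER_QUERY = 2.9   # watt-hours per ChatGPT prompt (quoted above)
GOOGLE_WH_PER_QUERY = 0.3    # watt-hours per Google search (quoted above)
SEARCHES_PER_DAY = 9e9       # rough daily Google search volume (quoted above)

# Extra energy if every Google search became a generative AI query
extra_wh_per_day = (CHATGPT_WH_PER_QUERY - GOOGLE_WH_PER_QUERY) * SEARCHES_PER_DAY
extra_twh_per_year = extra_wh_per_day * 365 / 1e12
print(f"Extra demand: ~{extra_twh_per_year:.1f} TWh/year")  # ~8.5 TWh, on the order of 10

# How many homes that powers for a year (assumed: ~10,500 kWh/home/year)
US_HOME_KWH_PER_YEAR = 10_500
homes = extra_twh_per_year * 1e9 / US_HOME_KWH_PER_YEAR
print(f"Roughly {homes / 1e6:.1f} million homes for a year")  # ~0.8 million

# Cooling's share of cost: 40% of energy use, where energy is 70% of total cost
print(f"Cooling share of total cost: {0.40 * 0.70:.0%}")  # 28%
```

Note that the "almost a million homes" line checks out only because a dialogue-heavy ChatGPT session would add even more queries per user than a one-shot search, as the transcript points out.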
In fact, most data centers even run their H100 chips at low enough power that they can be air cooled, so a lot of them will need to make massive infrastructure changes to support direct-to-chip liquid cooling for Nvidia's upcoming Blackwell systems if they want to run those chips at peak performance. That includes hyperscalers like Amazon, Google, and Microsoft, all of which need to support power-hungry AI workloads for thousands of other businesses.

Speaking of which, according to Market.us, the global artificial intelligence market is expected to almost 12x over the next 8 years, which is a compound annual growth rate of 36.8%. But many of the companies building the next generation of AI applications are not publicly traded. Think about the 90s and early 2000s: companies like Amazon and Google went public very early in their growth cycle, but today, companies are waiting an average of 10 years or longer to go public. That means investors like us can miss out on most of the returns from the next Amazon, the next Google, the next Nvidia. So I spent a lot of time digging into this, and the Fundrise Innovation Fund is a great way to invest in some of the best tech companies before they go public. Venture capital is usually only for the ultra-wealthy, but Fundrise's Innovation Fund gives regular investors access to some of the top private, pre-IPO companies on Earth without breaking the bank. The Fundrise Innovation Fund also has an impressive track record, already investing over $100 million into some of the largest, most in-demand AI and data infrastructure companies. So if you want access to some of the best late-stage AI companies before they IPO, check out the Fundrise Innovation Fund with my link below today.

All right, so 90% of all server racks are air cooled today, but industry estimates suggest that up to 80% of data center cooling will become direct-to-chip liquid cooling over time. Direct-to-chip liquid cooling is where a heat-conductive copper plate sits on top of a chip, just like a normal heat sink, but instead of being air cooled by a fan, the plate is connected to two pipes: one pipe brings in cool water to absorb the heat from the plate, and the other pipe moves hot water away. Direct-to-chip liquid cooling is up to 3,000 times more effective than air cooling, and better cooling means that servers can be stacked closer together without overheating. Every data center has a fixed amount of space, so they need to optimize their cooling if they want to squeeze the most compute out of their entire facility. As a result, the liquid cooling market for data centers is expected to more than quadruple by 2030, which would be a compound annual growth rate of 27.6% for the next 6 years.

And Vertiv definitely knows that. According to their Q2 earnings call, they're on track to expand their liquid cooling production capacity by a whopping 45x over the course of 2024. Vertiv also doubled their production capacity for power management products over the last 3 years, and they plan to double it again by the end of 2025. All of these expansions should lead directly to more revenue, since Vertiv is currently limited by supply, not demand: Vertiv had a $7 billion backlog of orders at the end of Q2, which was up 11% quarter-over-quarter and 47% year-over-year. Vertiv stock is already up around 120% year to date, and I definitely think they have plenty of room to run as the AI boom continues.

Compute and connectivity are also energy intensive, so let's talk about them next. Today, Nvidia has a massive share of the data center GPU market, with estimates ranging from 92% all the way to 98%, but GPUs are not the only way to process intense AI workloads. Over the last 3 years, I've spent a lot of time covering the custom chips used by Amazon Web Services, Microsoft Azure, and Google Cloud. These custom chips are called ASICs, application-specific integrated circuits, and they do exactly what their name implies: their design is tailored to a specific application, which simplifies the chip's architecture. The result is a chip that can run a narrow set of workloads extremely efficiently, at the cost of supporting fewer kinds of workloads than more general processors like GPUs and CPUs. So as Amazon, Microsoft, Google, and their cloud clients need more support for a specific kind of workload, like running large language models, synthesizing speech from text, or generating images, they could make a chip for that workload and free up their more expensive Nvidia infrastructure for other tasks. All three hyperscalers are making big investments into their own semiconductor supply chains to reduce their overall reliance on Nvidia over time: Amazon has their Inferentia and Trainium chips for AI inference and training, respectively; Microsoft has their Azure Maia accelerators; and of course, Google has their tensor processing units, or TPUs. The demand for ASICs is so high that even Nvidia is building a new business unit focused on making custom chips for other companies, which could help extend their CUDA ecosystem to new kinds of chips. But Nvidia will have some serious competition in this space from rival companies like Marvell Technology and Broadcom, so let's talk about them next.

Broadcom is the leader of the ASIC market, with a dominant 55 to 60% share and a major focus on AI and data center infrastructure. Broadcom co-designed the last six generations of Google TPUs, and that partnership got extended to the next generation of TPUs as well, which shows just how sticky these chip design relationships can be once they're up and running. Google claims that their sixth-generation Trillium TPUs are 67% more energy efficient than their current generation, with 4.7 times more peak compute performance. JP Morgan analysts estimate that Broadcom's TPU program will generate $8 billion in revenue in 2024 and another $10 billion in 2025, and that's just from Google's TPUs. Broadcom is also behind every generation of the MTIA chips, which are Meta's training and inference accelerators. And Broadcom's ambitions for custom AI chips don't stop with Google or Meta: in July, Broadcom was rumored to be in talks with OpenAI to design ASICs for them as well, but who knows what'll happen now that OpenAI's chief technology officer Mira Murati is leaving and OpenAI is becoming a for-profit company. Let me know if you want a separate deep dive on all of that and the resulting drama. But Broadcom makes more than just custom chips for tech giants. According to Broadcom CEO Hock Tan, more than 99.5% of all internet traffic touches at least one Broadcom chip: "Jim, I got to tell you, in 99...
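As an aside, the growth rates quoted above for the AI and liquid-cooling markets can be sanity-checked with the standard CAGR formula. The multiples (almost 12x over 8 years, more than 4x over 6 years) are the transcript's own figures; the exact percentages below differ slightly from the quoted 36.8% and 27.6% because "almost 12x" and "more than quadruple" are rounded multiples:

```python
# Compound annual growth rate implied by growing `multiple`-fold over `years`.
def cagr(multiple: float, years: int) -> float:
    return multiple ** (1 / years) - 1

# "Almost 12x over the next 8 years" -> ~36.4% per year (quoted: 36.8%)
print(f"AI market: {cagr(12, 8):.1%} per year")

# "More than quadruple by 2030" over 6 years -> ~26.0% per year (quoted: 27.6%,
# consistent with a final multiple a bit above 4x)
print(f"Liquid cooling: {cagr(4, 6):.1%} per year")

# Going the other way: 27.6% compounded for 6 years is ~4.3x
print(f"1.276^6 = {1.276 ** 6:.2f}x")
```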