The AI Investor Podcast

A Top AI Stock to Buy for 2025 + Did OpenAI Just Show Us the Future of AI?

24/7 Wall St. Season 1 Episode 10

In the first episode of the new year, Eric Bleeker and David Hanson discuss OpenAI's latest breakthrough and AI stocks' hot start to 2025. They also touch on NVIDIA's sharp rebound and the company's still-enviable position in the market.

Lastly, Eric shares the latest trade he's making in his $500,000 AI Portfolio.

On today's episode of the AI Investor Podcast from 24/7 Wall Street, we jump into the new year, discuss the latest AI breakthrough that's seriously raising some eyebrows, and share a brand new trade to kick off twenty twenty five. All of that is next. Welcome back to the show, everyone. I am David Hanson, joined by Eric Bleeker. Eric, happy new year. I think we're still within the grace period of it being okay to say happy new year. So happy new year to you. I think you get three days, and it's January third. So you nailed it, David. There we go. So we're into a new year. We've got the market back open after a pretty quiet trading time the last couple of weeks of the year, although there were some stories coming out right at the end of the year that I definitely want to get to, some real headline grabbers in the world of AI. But just recapping the year we came off: in the market at large, we had the S&P five hundred and the Nasdaq both up well over twenty percent. That's two years in a row now of twenty-plus percent gains. Doesn't happen often, although there have been several times in history where we've had these big runs, and it doesn't always mean that we're going to fall off a cliff. Sometimes we have seen a big correction after consecutive years of twenty percent gains. We've also seen years where that rally has continued, most notably in the mid nineteen nineties, when we had some back-to-back twenty percent years and then a thirty percent year. So it's hard to say what we're stepping into, but we've talked a lot in previous shows about market enthusiasm. Again, I just said two back-to-back, twenty-percent-plus years. But we've also seen some things come back to, you know, maybe a little more normalcy, a little less heated than, say, the last couple of months. So I just wanted to get your take as we step into this new year. 
What are some things that are catching your eye from that high-level market perspective right now? Yeah, we were seeing some of the enthusiasm fade from some of the most notable stocks. Tesla had gone up ninety percent from Trump's election to December fifteenth, and then it was down twenty percent plus. MicroStrategy was off nearly forty percent from its peak. Now, of course, we filmed this today on Friday, January third, and MicroStrategy is up thirteen percent. We're back, baby. And Fartcoin is at all-time highs. It is now up to one point six billion dollars. And also, AI stocks are having an incredible day, one of the best days across the board for AI stocks. So we'll talk maybe about some of the enthusiasm there. But you're right, David. The big thing to remember is that there has only been one instance throughout market history where the market saw twenty-percent-plus returns three times in a row. It was during the dot-com boom. We are now in the AI boom. We will see if history rhymes here. But the most important thing is just going to be whether or not we see the promised earnings growth this year across the market. We're trading at around twenty nine times trailing earnings, twenty two times forward. That forward number is predicated on very strong earnings growth this year. If the market's able to deliver on that, maybe we won't see another twenty-five percent return, but you could see a relatively strong year. But if we see a year of earnings growth similar to what we saw in the year behind us, we will probably see a year that, you know, could have negative returns, right? Because again, it's been constantly setting up the situation of, well, earnings are bad this year, but we're going to get a very strong cycle moving forward. So hopefully we'll see that in the year ahead, and hopefully some breadth in the market. 
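A quick back-of-the-envelope check on those multiples. The twenty-nine and twenty-two figures are the ones quoted in the episode; the arithmetic below just shows what they imply about the earnings growth Wall Street is baking in:

```python
# Implied earnings growth from trailing vs. forward P/E.
# With price held fixed: forward P/E = price / next-year earnings, so
#   next-year earnings / trailing earnings = trailing P/E / forward P/E.
trailing_pe = 29.0  # market's trailing-earnings multiple (as quoted)
forward_pe = 22.0   # market's forward-earnings multiple (as quoted)

implied_growth = trailing_pe / forward_pe - 1
print(f"Implied earnings growth: {implied_growth:.1%}")  # Implied earnings growth: 31.8%
```

In other words, the forward multiple only looks reasonable if earnings grow on the order of thirty percent, which is the "very strong earnings growth" Eric is referring to.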
You know, while we are an AI show, being able to see strong growth across the economy would be a really good development, not just for the market, but for the U.S. economy itself. Yeah, it seems like on the earnings side, there are some different paths here. We have the hyperscalers, and when earnings come out in the next month or so, I think early February is when we'll start to get some commentary from the likes of Microsoft: what are they doing in regards to their spend? But then also, all the other companies in the economy that are tangential to AI: are the end users starting to see benefits from it? So we're going to hear from the companies that are investing in the space, and we also want to hear from the companies that are the end users of all this capacity they're buying from the hyperscalers. And then, of course, you have the NVIDIAs of the world, which I don't think we're going to get earnings from until the end of February. So we have a bit of time, although I know we're going to be talking about NVIDIA in just a little bit. We'll still get some news from them in the next couple of weeks. So lots to look forward to. I'm certainly excited. 
Looking at the year ahead: I told you before we started recording that over the holidays I visited with some family, and some of them work for a fairly large bank, and AI came up, and they were like, oh yeah, we're not even allowed to talk about or use that at work. So this is a very large publicly traded company that is basically throwing its hands up, like, yeah, we hope to maybe do something with AI. So it kind of hit home with me: we're still very early in the adoption. I'm sure anyone listening to this podcast is maybe a bit of an AI nerd. I know you're maybe the chief AI nerd, but it underscored that we're still very early in this, and there are lots of misconceptions and people still learning about this. So it feels like even though AI, for a lot of investors, was a big story in twenty twenty four, I don't know if we've totally hit that mainstream point where everyone understands what we're dealing with. So it's going to be fun to cover it on this show, and we're certainly glad to have all of our listeners along for that journey. So I wanted to dive into a story that I alluded to. It came out, I think, when we recorded two weeks ago, maybe later that afternoon. OpenAI was doing their, what was it, twelve days of Christmas, doing all these product releases, and one they saved towards the very end, which we didn't get to capture in our show, was the release of a new model that really raised some eyebrows. So I was hoping that you could walk us through: what was it they released, and why is it so meaningful? Why are so many people commenting on this new model? Yeah, it's really unfortunate. It did come out right after the show. I would have loved to have talked about it in the prior episode, because this is a very big deal. So we're going to spend a little time breaking this down today. And, you know, the biggest breakthrough in AI was a new type of AI model: transformers. We've talked about this. The paper on them first came out in twenty seventeen. 
And we walked through this in our scaling episode, if anyone wants to go back and listen to a more in-depth discussion. But it essentially started a relationship with these AI models, these large language models, where if you are able to train on more data, and you are able to give them more compute, specifically GPUs from NVIDIA, we have seen kind of a linear relationship between those two inputs and the gains in the quality of the models. That's roughly what's called scaling laws, and it's at the center of this AI boom. So the question has been, how long will these scaling laws last? How long are we going to be able to see this almost predictable growth in the quality of AI models? I should note, we're used to these kinds of laws in technology, around different technologies getting better, right? The most famous one that we've talked about on the show before is Moore's Law, which is the growth in transistors. But often, while we're making technologies better, we have to really change the process of how that happens and the technologies behind the law. You look at Moore's Law and processors. David, you probably remember, if you were around for the dot-com bubble, when you used to buy a computer, it would be marketed as two hundred and thirty three megahertz, right? Or seven hundred and thirty three. What that was, basically, was clock speed, and the path to making processors better was being able to market higher clock speed. Now, what happened was, around the middle of the two thousands decade, two thousand five or so, it really became apparent that if we just kept taking the clock speed up, processors were going to fry themselves. We needed a new path forward. And what happened was, essentially, processors were rebuilt. Instead of being a single core, we adjusted how we did software to use multiple cores in processors. So we hit a wall, but we found a new scaling law. 
Moore's Law continued, the number of transistors kept growing, but the process, the way that we were achieving Moore's Law, completely changed. We're looking at a similar situation right now in artificial intelligence, where we have a scaling law, where we are getting better models, but how we're achieving that is changing rapidly. And the newest example of that is OpenAI's new model, which was released on December twentieth, and it's the newest version of what we call reasoning models. So let's talk just a minute about what's so impressive about this model, and then we can pull it back a little and talk about what reasoning models are. So I've got some details here on just what's so impressive about o3, which is the newest model from OpenAI. There's a test that measures whether AI models are as capable as humans across a range of what you'd almost see as visual problems. Its name is ARC-AGI, and it's designed to be easy for humans and hard for AI. It's proven really hard for AI systems to crack. So GPT-4, the model that kind of began this age of AI because it came out and everyone said, wow, this is shockingly good, scored two percent on ARC-AGI, and the magic number where you're at a human level is eighty five percent. The creators of ARC-AGI were hoping, most optimistically, it might be cracked in twenty twenty eight. They were thinking a realistic timeline would be twenty thirty. OpenAI's newest model scored an eighty seven point five, higher than that eighty five. So here's a quote from the creator of ARC-AGI. He said OpenAI's new o3 model represents a significant leap forward in AI's ability to adapt to novel tasks. This is not merely incremental improvement, but a genuine breakthrough, marking a qualitative shift in AI capabilities compared to prior limitations of LLMs. David, another big test is called FrontierMath. 
It's basically a collection of the hardest math problems possible to devise. You look at them, and your brain melts a little bit just trying to comprehend how you could even begin to solve them. Prior to this new OpenAI model, the best that state-of-the-art models could do was solving two percent of FrontierMath. This new OpenAI model can solve twenty five percent. Another key area, when we're just trying to demonstrate what a leap forward we're looking at right now: if you play chess, you've seen Elo scores before. Basically, it's a numerical rating to show how capable you are at playing chess. And there is an Elo score that's basically for competition amongst programmers. They put o3 into this, and its score jumped from eighteen ninety-one to twenty-seven twenty-seven, meaning this new model from OpenAI scored higher than ninety nine point nine five percent of all programmers in the world. And believe me, it's probably even better than that, because you're only taking that test if you're pretty capable. So, you know, David, I don't think any superlatives I could offer would do what we're seeing justice. I follow dozens, probably hundreds, of people involved in AI research, and the reaction on December twentieth was something of complete shock. The progress made by this model was more significant than anyone imagined. And the outcome from this is that it's now pretty clear that by the end of twenty twenty five, maybe twenty twenty six, it's going to be hard to design benchmarks where human intelligence is still higher relative to AI. We're not going to be able to effectively test these systems anymore. And, you know, if you design a benchmark, researchers are going to find a way to optimize towards it. The true test of what you'd call AGI, or artificial general intelligence, is that you're unable to design a benchmark anymore. And we're getting really close to there with this new style of reasoning models. 
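For context on those Elo numbers: the standard Elo formula gives an expected score between two ratings, and it's the same formula whether the rating is for chess or competitive programming. A minimal sketch, using the eighteen ninety-one and twenty-seven twenty-seven figures from the episode:

```python
def elo_expected(rating_a: float, rating_b: float) -> float:
    """Expected score (roughly, win probability) of player A against player B
    under the standard Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

# o3's jump from 1891 to 2727 (figures from the episode): the new rating
# would be expected to beat the old one about 99% of the time.
p = elo_expected(2727, 1891)
print(f"{p:.3f}")  # 0.992
```

An 836-point gap is enormous in Elo terms; every 400 points is roughly a tenfold change in relative odds, which is why the percentile claim in the episode is plausible.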
So the good news for anyone listening to this podcast today, to the point you made about your brother-in-law's bank still not using AI, or not being allowed to, not understanding what's coming, the scale, the scope of the changes, how rapidly it's progressing: if you're listening to this podcast, I truly think, just by the merit of being this interested in AI, you are probably in the point-oh-one percent on what could be rapidly developing into the biggest breakthrough in human history. But, you know, David, and we could pause here, because I know this is a lot to take in. This is very big. But I think December twentieth is going to go down as a date where, if you were in that small group watching people in the field, it was extremely obvious that we are on a new trajectory. We've gone, in a matter of months, from people asking, are these scaling laws dead, could this actually be where an AI bubble breaks, and that being a common question, to now the question being: are these new scaling laws, these new reasoning models being built, more powerful than we can imagine? And what are the implications of that? So I'll pause right here to give you a chance to ask some questions, but this is extremely significant, probably the most significant news we've seen since we began creating this podcast to document the path of AI and the growth of this trend. And we should say the model that was released was a bit of a demo. So before any one of our listeners runs off and signs in to ChatGPT, like, I've got to get my hands on it: it's not available yet. Have they given a date on when this might roll out in some limited form? Or when will this model at least be able to be tested a little more rigorously than, oh, it's a great little demo? Obviously, we have the test scores. But when will this be in the hands of people? Have they said that yet? Or is it still kind of TBD? 
Well, this is where we need some caveats to the enthusiasm. The cost for the high-tuned model, the one that was able to score eighty seven point five on ARC-AGI, is about three thousand dollars per task. To get to the performance it achieved on ARC-AGI, you need ten thousand of NVIDIA's prior-generation best GPUs, H100s, running for about ten minutes. So, you know, David, that's kind of the downside, in a way: we have these models that are getting to be able to potentially solve problems in a way beyond human comprehension, but they're going to be incredibly expensive. One comparison I've seen for this is: we built a jet engine, but we still haven't begun designing the plane, right? It's going to be a lot of work to begin applying these models. Now, on the other side, inference, and how these models are largely getting better is by using new forms of inferencing: the cost of inference in the last three years dropped by a thousandfold. So I'm talking about this model right now costing three thousand dollars per task. If that falls by the same rate that we've seen in the prior three years, that would imply going from three thousand dollars a task to three dollars a task. One quote I saw was from Aaron Levie, the CEO of Box, and he said OpenAI's o3 model appears to be better at reasoning than any other model out there, and it costs way more to operate. But that's irrelevant. What is expensive today is cheap tomorrow. Quality is all that matters, because you know that cost will always drop. So I think from our perspective, as for how long until you're going to be able to work with something at the level of the high-tuned o3, it's not necessarily coming anytime soon. OpenAI had originally announced their video model Sora, and people were blown away by it: when can we touch it? 
Well, it's been a long process, because they wanted costs to come down before it was going to be commercially viable. In the meantime, you're going to be able to work with prior reasoning models like o1, which you can actually get subscriptions for and work with right now. But again, David, for our sake, what we begin asking at a moment like this is: how is this going to impact everything that we're buying? If there's a new way of building superintelligent models, what are the changes in how we're looking at investments? And we'll have a couple implications of that today. I mean, I'll certainly leave a lot of the conclusions for you to draw, but it does seem like, as a company like OpenAI continues to raise the stakes in terms of the capabilities of a model, it puts a lot of pressure on the competitors who are also trying to build their models, and on this general sense of how much do we need to keep investing in AI, even if we haven't seen the full ROI. It feels like that pressure continues with the release of something like this. In these first days of the market, with this enthusiasm to start the year, I do wonder how much of it is this. We're seeing a lot of these infrastructure-type plays in the AI space doing quite well the last couple of days. When you have a model like this that grabs so much attention, and people really see, wow, this raises the stakes, everyone needs to keep investing in this space, it seems like that might be some of the narrative driving the market these last couple of days. You're working with two things. When we look at AI right now, we're looking at the revenue side, and if you're a Microsoft or a big company with specific targets from Wall Street, what kind of ROIC are you saying you're getting from this? How are you justifying your spend? And then we're looking at the FOMO side of an arms race. 
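A quick aside on the cost math Eric cited a moment ago: a thousandfold drop over three years is roughly ten times cheaper per year, and the three-dollar projection is just that rate compounded forward. The three-thousand-dollar figure is the episode's; assuming the same decline rate continues is, of course, an extrapolation:

```python
cost_today = 3000.0      # dollars per task for the high-tuned o3 run (as quoted)
decline_per_3y = 1000.0  # thousandfold inference cost drop over the prior three years (as quoted)

annual_factor = decline_per_3y ** (1 / 3)    # cube root: ~10x cheaper per year
projected_cost = cost_today / decline_per_3y  # cost after three more years at the same rate
print(f"~{annual_factor:.0f}x cheaper per year; ${projected_cost:.0f}/task in three years")
```

This prints roughly "~10x cheaper per year; $3/task in three years", which is the arithmetic behind the Levie quote: today's prohibitive cost becomes trivial if the historical decline rate holds.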
Creating a new kind of paradigm for intelligence through AI, which is what it appears we're seeing with these reasoning models, ignites that arms race. And there's going to be a lot of catching up to this; we'll talk about some new releases from NVIDIA and some other stocks benefiting from it. Now, I should also note one thing I thought was hilarious. I talked about December twentieth maybe being a date that we look back on as when we hit a new inflection point with AI, where with this concept of AGI, we weren't officially there, but we kind of had a soft launch towards it. Well, if you picked up a Wall Street Journal the next day, the headline, I took it down here, was "The Next Great Leap in AI Is Behind Schedule and Crazy Expensive." Well, as we talked about, David, it is crazy expensive, but to me this is a little bit like if, the day after the moon landing, The New York Times had run the headline "Trip to Mars Still Behind Schedule," right? I think celebrating the achievement of what we're seeing is a lot more important at this moment. Also, if you're wondering, again, you look back at past trends and go, that was so obvious, how were you able to make so much money on it? It does come back to a lot of the negativity that's present at the time. It appears very clear that we might be, again, on the precipice of a massive breakthrough that's going to see incredible gains in progress throughout twenty twenty five. And again, the headline from the paper of record for finance is "behind schedule and expensive." So I think that does a great job illustrating why these situations can be so lucrative, even while they might look really obvious in hindsight, right? Yeah, I want to move on to, you mentioned NVIDIA, I want to talk a little bit about them. But yeah, it's going to be fun. I always make the comment that every two weeks there's something new. We had exactly two weeks, and we had o3. 
It's only been two weeks since then. And even the enthusiasm, I follow many of the same people that you do: that first couple of days, people were like, I can't believe what I just saw. But if you go to CNBC.com today, it's still not on the home page. So it's going to be an interesting story to watch as, what did you say, we have the engine, but we don't have the rest of the plane built? I think this year it's going to be interesting to see who starts building that plane. It feels like now is the time that we're going to start seeing some really impressive things. But let's talk about the most impressive company in AI. That is NVIDIA. I mentioned we're not going to get earnings from them until late February. So that's a lot of time. The narrative shifts so much with NVIDIA. A couple of weeks ago, it was like, oh, NVIDIA's already in a correction, it's down ten percent, is Broadcom maybe eating their lunch? And now we're right back to almost one hundred and fifty dollars a share, a price it hasn't quite been able to get past. I wanted to hear: what's the latest news from NVIDIA, and what are you watching in this time between now and earnings from the company? Yeah, so NVIDIA, they are absolutely relentless. And any company that gets as big as NVIDIA needs to get lucky, right? When NVIDIA was founded, its leader, Jensen Huang, was never thinking about this level of GPUs, right? There's some luck, but there's also, again, some tenaciousness. And we've talked a lot about NVIDIA's move towards Blackwell, which people think of as a chip, but it's really a systems approach, meant to protect NVIDIA's competitive advantage, but also to continue driving the, you'd call it total cost of ownership, TCO, down for these AI applications. The scale-out right now has been GB200, which is the first of the systems they're shipping. 
They got a little bit of revenue contribution from it in their last quarter, but it's supposed to begin scaling this quarter. Now, generally you'd see a much slower cadence between major launches. You might have some kind of secondary launch next year, what you'd call a GB200 Ultra, something like that, and then you'd have a new architecture the year after. NVIDIA is trying to accelerate everything as much as possible. They're doing something that's almost never been tried before in semiconductors. What they're launching is an update to GB200, it's going to be called GB300, and it's supposed to be ramping already in the third quarter. Now, what's particularly interesting about this is there's been some analysis on it from SemiAnalysis, which is probably the best research firm in the space on semiconductor matters, and it was glowing about how effective this is. It's an upgrade that is specifically much better for these reasoning models we've been talking about. They believe its economics per token, using NVL, which is part of this system, are ten times better on these reasoning chains. So what's going on here is something called chain of thought. How this reasoning works is, effectively, you have a problem, and the AI model uses chain of thought so it's able to think. And as it's thinking, well, one of the big complaints about AI is that it spits out nonsensical answers. What these models are able to do, in a field that's verifiable, like coding or math, is that if they get to a wrong answer or a dead end, they're able to go back and retrace their steps. And this is partially what makes this reasoning so much more effective. Also, it's creating new training data, because those verified reasoning chains can themselves be used for training. So you're able to help solve this problem we've talked about before of having less data available. 
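The verify-and-backtrack idea Eric describes can be sketched as a toy search loop. To be clear, this is purely illustrative, not how o3 actually works internally (those details aren't public); it just shows why a verifiable domain lets a "reasoning chain" abandon dead ends instead of committing to a wrong answer:

```python
def solve_with_backtracking(numbers, target):
    """Toy 'reasoning chain': pick numbers step by step toward a target sum.
    A verifier checks each partial chain; dead ends trigger backtracking."""

    def verify(chain):
        # A partial answer is only kept if it could still lead to a solution.
        return sum(chain) <= target

    def search(chain, remaining):
        if sum(chain) == target:
            return chain  # verified final answer
        for i, n in enumerate(remaining):
            step = chain + [n]
            if verify(step):  # only extend chains that still check out
                found = search(step, remaining[i + 1:])
                if found:
                    return found
        return None  # dead end: caller backtracks and tries another branch

    return search([], numbers)

# The search tries [5, 9] (too big), then [5, 3, 7] (dead end),
# backtracks, and lands on a verified answer.
print(solve_with_backtracking([5, 9, 3, 7], 12))  # [5, 7]
```

The analogy to math and coding is that the verifier (a proof checker, a test suite) is external and reliable, so wrong intermediate steps get caught rather than propagated.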
Now, the problem with this chain-of-thought structure is that it has incredible memory demands. So what NVIDIA is doing is launching this new system that is a lot stronger on the memory side, which has been one of NVIDIA's greater weaknesses against AMD, and also against Broadcom, frankly. So again, it's hard to really capture how aggressive it is to go for another launch this quickly. But what SemiAnalysis believes is that all of the hyperscalers are now going big into this GB300 upgrade. It's released at just the right moment, optimized for these reasoning models, right as we're getting these new o3 results. And this is probably going to lead to a lot of momentum for NVIDIA, specifically in the back half of the year, not just for their chips, but also for their networking revenue. So, David, we've seen NVIDIA shares up five percent today, and I think they were up three percent yesterday. I'm sure there's going to be a lot of speculation about the cause of that. I think the key reason is that there are probably a lot of research notes coming out from Wall Street that are extremely impressed by how aggressive NVIDIA is. They've got this lead, and everyone's asking, are they really going to be able to protect this lead? They are doing everything in their power, to a degree you've never even seen before from a large semiconductor company, to keep this pace up. So if you're someone who owns NVIDIA, what they're doing right now to cement their lead across, again, a new kind of paradigm, a new scaling law coming up, you couldn't ask for anything more. It is extremely impressive, and again, I think it's what's led to the recent gains for NVIDIA, and it speaks very well for them delivering once again in twenty twenty five after, let's call it what it is, a beyond-belief twenty twenty three and twenty twenty four. Right. So we obviously own NVIDIA in the AI portfolio. 
I think, and I'm not sure what our blended cost basis is, it's lower than where the stock is today. So if someone's sitting there, again, I mentioned earnings coming in a couple months. This is a company that has had an enormous run, and it hasn't had a lot of these air pockets down on earnings. Usually you see these large companies with explosive growth have an earnings report where it's like, oh, it's off eighteen percent, and that's not unusual. NVIDIA has not had that. So if someone has been sitting watching NVIDIA: is this still a price that's reasonable? I know we've talked a lot about looking forward to the earnings and what the P/E multiple would look like there. What's your general feeling? I'm assuming you're still confident holding the stock right now, but do you have any different opinion for someone who's been thinking about it and just hasn't gotten in yet? Yeah, I mean, I think if you didn't own NVIDIA, you should be able to relatively confidently add them right now, because a lot of the concerns have been about competition, specifically from Broadcom, which is designing custom processors at the behest of these large hyperscalers, giving them optionality beyond NVIDIA. NVIDIA just stepped in and said, we have something so impressive, you cannot afford not to buy this. And all of them have apparently said, yes, you are right. And we're going to see a relatively large cycle in the back half of the year. So this probably begins pushing the cycle into twenty twenty six, which was our unknown zone. And again, it reinforces the long-term strategy that NVIDIA has been pursuing around the systems approach, because it certainly looks like a lot of the networking technologies they've built for these Blackwell systems are part of getting latency down in memory and part of being able to provide a superior system. 
And again, it begins emphasizing things that other competitors can't easily copy. You know, AMD can't copy this. The closest thing that can try, that has the complete offerings between networking and chips, is Broadcom. But again, I think this really blunts some near-term momentum from Broadcom, in a way where I think the hyperscalers are really going to be focusing, in the back half of this year, on trying to get as many NVIDIA systems as they can, because this is an outrageously competitive offering. So think about how you felt about NVIDIA a couple of weeks ago. With the developments around these reasoning models, a lot of headlines might say it's inferencing where NVIDIA is weaker. But NVIDIA has taken some perceived weakness and kind of judo-flipped it. It would appear they're handling this very well. So if you were a six on a confidence scale, I would feel like the past two weeks have to take you up to an eight. It's a trend that might be unlike anything we've ever seen before, and it's a company operating at a level we've potentially never seen before. Yep. Just your friendly reminder that NVIDIA was down seventy percent going into twenty twenty three. Just incredible to think about the change that's happened in the market the last couple of years. Really, since that launch of ChatGPT is when it took off, went mainstream, and all the hyperscalers really got on board and started spending with NVIDIA. So just a truly incredible story that we probably can't talk enough about. Just incredible. Now a three and a half trillion dollar company. The question, I guess, will start to be, will it get to four trillion? And is it in a race with, you know, Microsoft? Microsoft had not as good of a twenty twenty four, still good, but we'll see if they maybe close that gap in twenty twenty five. 
So there's NVIDIA. It's in the portfolio. Let's talk about a company that we're adding to the portfolio. We do have a new company we're kicking off the new year with, a new trade. Why don't you tell us a little bit about the company we're adding now? I thought this was a very light show, lacking in any nerdiness, with chain of thought, reasoning models, all that. So I decided, let's also dive into semiconductor equipment. We're going to be adding Lam Research. It's been down about eight percent in the past month, getting a little back today as we record this, but down thirty six percent from July tenth. You know, I've talked about, David, the semiconductor equipment industry, its historical returns, and why it's relatively attractive. You look at this industry: it's about one hundred and thirty three billion dollars in size. There's a general rule of thumb that however big the semiconductor industry is, equipment is going to be about fifteen to twenty percent of it. And all the major companies have seen outstanding returns the past decade. The reason for that is you have a lot of different verticals within semiconductor equipment, and generally you only have two or three major competitors in each vertical. So if we look at the five biggest companies: we've got ASML, which we've talked about before, that's the monopoly on cutting-edge lithography, transferring the chip design onto a wafer. We've got Applied Materials, which has the broadest portfolio. We've got Lam Research and Tokyo Electron, which are involved in etch and deposition, which we'll talk more about momentarily. And then KLA, which dominates metrology, which is looking for defects in chips. Now, between these five companies, they're at ninety three billion in revenue, and I said the industry is about one hundred and thirty three billion in size. So you're able to cover roughly two thirds of the industry just looking at this group. 
You know, I always want to start by giving the high level overview of how you start attacking an opportunity. It's trying to understand from a high level what's going on here. So let's dig into why specifically we're singling out Lam Research today. It's a company with a market cap of ninety billion. It's got fifteen point six billion in revenue and four point one billion in earnings, so that's a pretty attractive margin profile, and four point eight billion in free cash flow. So if we look at multiples based off that: twenty three times trailing earnings, nineteen times cash flow, twenty times forward earnings. Those are all below market multiples. And the reason they're all below market multiples is because on July tenth, there were new export restrictions announced by the US targeting China. That's been an anchor across the entire semiconductor industry, and it's hurt these companies because most of them are forty to fifty percent sales to China. Lam Research is at forty two percent. These restrictions are potentially painful. We've probably seen a pull forward of demand, which could complicate things in the year ahead. But again, David, we're being compensated with a stock that's thirty six percent cheaper, a company that, you know, across the past decade has grown six hundred percent and maintains a very dominant position. So we're going to take some of that pain today for the potential for upside as they work through these restrictions. So let's talk about what Lam Research does. I talked about deposition and etch. Those are their two biggest specialties, and they're two massive markets. ASML gets a lot of attention, but lithography is about twenty two percent of spend for semiconductor equipment. Deposition is twenty eight percent, etch is twenty eight percent, and there are sub verticals there. But these are big areas. Lithography transfers the chip design, but you still need to engrave that design into the wafer. That's where etching comes in.
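The multiples quoted above can be roughly reproduced from the stated figures. A minimal sketch, assuming the approximate market cap, earnings, and free cash flow numbers as spoken (the rounded results land in the same neighborhood as the multiples quoted, which may reflect slightly different underlying data):

```python
# Back-of-envelope valuation multiples from the figures quoted above.
market_cap = 90e9          # Lam Research market cap, ~$90 billion
trailing_earnings = 4.1e9  # trailing earnings, ~$4.1 billion
free_cash_flow = 4.8e9     # free cash flow, ~$4.8 billion

pe_trailing = market_cap / trailing_earnings  # price / trailing earnings
p_fcf = market_cap / free_cash_flow           # price / free cash flow

print(f"Trailing P/E:            {pe_trailing:.1f}x")  # ~22x
print(f"Price / free cash flow:  {p_fcf:.1f}x")        # ~18.8x
```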
The other thing is chips are going in multiple directions. Memory that's specific to AI, and we'll talk about this, is stacking in three D, but transistors are also going three D as well. They're getting more and more complicated to build, and that's where deposition and adding all these layers comes in. So Lam Research has four different growth areas that they see becoming billion dollar opportunities. The first area is technology around making transistors more intricate. It's called gate all around, and that's going to benefit them tremendously. The second is they've developed a dry film that actually reduces the demand for ASML's machines. So part of the reason ASML has struggled a little bit is that what you call the intensity, the need for their machines, has decreased, in part because of innovations from Lam Research. The third is a technology called backside power. I'm not going to get into that, but it is something that should benefit from AI. And the last is advanced packaging. I do want to spend a little more time talking about this, because earlier I talked about this need for more memory. These AI models, you know, why do you need more memory? Because you're having to store the parameters, and you're having to store all the work that you're doing in inferencing. It's just incredibly memory intensive. And there's a special kind of memory we've talked about on this show before. It's called high bandwidth memory, and what you do is essentially stack the memory on the logic die. So you're stacking the memory on top of the chip from Nvidia or Broadcom or AMD, then you're basically drilling through it to directly connect them. Well, it happens that Lam Research is a leader in this space, and their advanced packaging business for this new type of memory grew from about three hundred million to a billion this year. And the forecast for this HBM market is going from something like fifteen billion to a hundred billion by twenty thirty.
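For a sense of how aggressive that HBM forecast is, the implied annual growth rate can be sketched as below. The six-year window is an assumption (roughly now through twenty thirty), and the dollar figures are the approximate ones quoted on the show:

```python
# Implied compound annual growth rate (CAGR) for the HBM forecast above:
# roughly $15 billion today to ~$100 billion by 2030.
start_value = 15e9
end_value = 100e9
years = 6  # assumed window, ~2024 through 2030

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 37% per year
```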
So right now this is only six percent of Lam's sales, but it could be an incredible tailwind at their back, and it's just one of the four opportunities I talked about earlier. So, David, you know, a key theme that we've talked about is being responsive to changes. As we're seeing something like new reasoning models spin up, what does that change? Well, if it's changing memory, and people are saying there's a good chance that memory is going to actually outgrow chip sales for AI, we want to be responsive to that. So we're making an investment targeting this opportunity. We're going to put ten thousand dollars in Lam Research, and I'm going to continue looking at some opportunities in the space. There are some pure play companies entirely focused on advanced packaging, and there are other companies in the semiconductor equipment space, but I wanted to make a first buy that was topical to what we talked about on today's show. And I want to make sure, if people's brains weren't full from talking about reasoning models, that we got even nerdier at the end, David. So that's why we're adding Lam Research, and I don't think this is the last stock we're going to add. If semiconductor equipment tracks at fifteen or twenty percent of the semiconductor industry, you know, you want to have some exposure to that in the portfolio. So we will continue looking for opportunities. I mean, you mentioned the market kind of adjusting for the changes with sales to China. And the outgoing administration has had some combative relations with China. With the incoming administration, well, it's unclear whether it's going to play hardball with China or maybe be more relaxed. Are you viewing that as something that's not going to be resolved? Is any part of your thesis based on that part of the business changing? Or do you feel like we have enough exposure to these other areas?
If we hit an inflection point in one of them, it doesn't really matter if their sales to China are kind of permanently damaged. Yeah. There were some new export control measures that were announced, and Lam Research came out and said, we are not updating our financial guidance. So I think the pessimism is probably a little overpriced in the stock right now. The second thing is, how would you read into how Trump's going to react to China? Man, that's a good question. I do think one thing is, his administration is much more likely to want to make a deal and potentially offer some concessions to be able to get a broader trade deal in other areas. So I think it's hard to play any specific theme around geopolitics, but I do see some scenarios where the environment is actually better in the next four years than it has been trending in the last four. But we'll have to monitor that. And frankly, we have to monitor it for a number of stocks in the portfolio. We talked last week about how Broadcom has a deal with ByteDance, the Chinese company that owns TikTok, to design custom chips, and that could be very meaningful. And even though NVIDIA has to sell different chips to China that are not as powerful, the number two buyer of NVIDIA chips last year was ByteDance. So we're kind of having to watch this situation across the entire portfolio. Lam Research and other semiconductor companies have taken the brunt, and this is weighing on kind of every company across the portfolio right now. All right. Well, that's Lam Research. Next episode, I'll make the promise that we'll do a quick review of the portfolio. If we have any new listeners and you're wondering, what is this AI portfolio they're talking about? A few months ago, Eric took five hundred thousand dollars of his own money and agreed to manage it in full view of the listeners of this podcast.
So next episode, we'll do a quick review of the portfolio and where it stands. It's been a good couple of months. We've had some winners like Credo, really a lot of winners, actually, among the over a dozen companies in the portfolio. So we'll be sure to give a full review of that. And speaking of reviews, if you haven't left us a review on Apple or Spotify, we'd really appreciate it. Go give us the stars. Give us a quick little paragraph on what you find interesting about the show. That would be great. For Eric Bleeker, I'm David Hanson. Happy New Year. We'll see you in two weeks.
