Is $MRVL the next AI play?
Is the AI hype over, or is there still value left? I think there is, but not just in GPUs.
In the newsletter published on New Year’s Day, I wrote that 2023 was the year of semiconductors and that in 2024 the focus would shift to other bottlenecks in AI, with networking being a big part of that.
The company I was looking into at the time was Marvell Technology. Since that post was published, the stock has returned 10.5% (hate it when this happens).
In this post we are going to try to figure out whether there is still some upside left in $MRVL, or whether this is just a momentary pump.
WTF is Marvell Technology?
MRVL is a semiconductor company (Just when I thought I was out…), known for developing integrated circuits for data storage, communications, and consumer electronics. I’m not going to devote loads of time to their consumer, automotive (their chips are in Tesla cars, though), or telecom sectors. Most of their revenue comes from data center and enterprise networking, which are also their biggest segments.
Historically, Marvell has focused on data storage, where they have been one of the main actors in storage controllers: a storage controller is a component that manages communication between a system (Windows, for example) and the storage device. This is of course insanely specific stuff, and I’ll try not to get too bogged down in the details, but suffice it to say that these guys know their stuff.
What I’m most interested in, however, is their data center networking offerings. Data centers are ridiculously complicated, and most of them, depending on size, contain hundreds of kilometers of cables. Keep in mind that most of us consider a 5-meter Ethernet cable “a long cable”. In a data center, everything beyond a handful of meters is done over optical fiber, and Marvell makes transceivers, the equipment used to translate electrical signals from a server into optical signals.
Ok cool, so what does this have to do with AI? One of the biggest problems with AI training and inference (inference is what happens when you use ChatGPT) is that the largest models can’t fit on a single GPU, and the solution is to string many GPUs together in parallel to train and run models like LLMs. While this might sound like a “problem solved” situation where parallelism lets you train models to your heart’s desire, another bottleneck appears.
The Bandwidth Bottleneck
Using multiple GPUs in parallel solves the size problem for you, but it adds a different one. What is the one thing you need when training algorithms? Data. When these GPUs work in parallel they frequently need to exchange data. This communication and movement of data becomes a significant overhead cost, especially as the number of GPUs increases. This scaling issue can lead to a situation where the time spent on data exchange outweighs the computational benefits of parallel processing. In addition, the physical layout of connections between GPUs can impact efficiency.
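To make that intuition concrete, here is a toy back-of-the-envelope model (my own sketch with made-up numbers, and the synchronization cost is crudely modeled as linear in cluster size, which real network topologies don’t strictly follow): compute time shrinks as you split work across more GPUs, while the communication cost grows, so past some cluster size adding GPUs actually makes each training step slower.

```python
# Toy model of the parallel-training bottleneck (illustrative numbers only).
# Compute time shrinks as you add GPUs, but synchronization cost grows with
# cluster size, so total step time eventually gets worse, not better.

def step_time(num_gpus: int,
              total_compute_s: float = 1000.0,   # hypothetical single-GPU workload
              comm_cost_s: float = 0.05) -> float:  # hypothetical per-GPU sync cost
    compute = total_compute_s / num_gpus        # parallelism helps here...
    communication = comm_cost_s * num_gpus      # ...and hurts here
    return compute + communication

for n in (8, 64, 256, 1024, 4096):
    print(f"{n:5d} GPUs -> {step_time(n):7.2f} s per step")
```

In this toy example the sweet spot sits somewhere around a few hundred GPUs; beyond that, the data exchange dominates, which is exactly why faster interconnects are worth paying for.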
All of this points to bandwidth as a key limiting factor, and faster transfer of data becomes a priority. As luck would have it, Marvell makes the high-bandwidth optics needed for faster AI. Since GPU clusters are growing larger and larger, Marvell will benefit by sheer volume: transceiver sales scale with GPU counts, and the reference cluster architecture has 1,024 GPUs (yes, really) served by 2,400 transceivers. Those numbers might not mean much on their own, but estimates show that for every $100 USD spent on AI compute, Marvell gets around $3 through transceivers alone. Sure, I’d rather be NVIDIA, but this is still a non-trivial amount of money, especially considering that companies like Inflection AI have bought 22,000 GPUs for a single cluster from NVIDIA, costing around $800M USD.
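Back-of-the-envelope, using only the figures above (the $3-per-$100 heuristic, the reference cluster ratio, and the Inflection AI deal are the post’s numbers; the rest is arithmetic):

```python
# Rough arithmetic on Marvell's transceiver take, using the figures above.

marvell_take_rate = 3 / 100            # ~$3 of every $100 of AI compute spend

# Reference cluster architecture: 1,024 GPUs served by 2,400 transceivers.
transceivers_per_gpu = 2400 / 1024
print(f"Transceivers per GPU: ~{transceivers_per_gpu:.1f}")   # ~2.3

# Inflection AI's cluster: ~22,000 GPUs for around $800M.
inflection_spend = 800e6
marvell_cut = inflection_spend * marvell_take_rate
print(f"Implied Marvell cut of that one deal: ~${marvell_cut / 1e6:.0f}M")  # ~$24M
```

Roughly $24M from one customer’s single cluster; very crudely apply the same 3% across the hyperscaler build-outs discussed below and the numbers get big fast.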
Billions from AI alone
Semiconductors are a notoriously cyclical industry, and when to invest has a lot to do with the R&D cycle as well as product releases. This is especially true in consumer-facing segments such as automotive, smartphones, and PCs, and I think AI is changing that.
I think $MRVL is now on the right side of the cycle for their data center and networking segment, and this was more or less confirmed in the J.P. Morgan fireside chat with Marvell CEO Matt Murphy.
Murphy spoke about some key drivers of growth, noting that the optics business will be much larger than expected, and the same goes for custom silicon (part of the data center segment). He specifically pointed out that optics for AI had “substantially surpassed our prior projections, for both this year and next year (2025)”. While it is wise to take this information with a grain of salt since it comes from the horse’s mouth (aphorism twofer), it still seems credible, and they are ramping up production of chips in both optics and custom silicon.
Based on what Murphy said, it seems that AI revenue for $MRVL could be in the billions for 2024, as they have a lot of revenue exposure to AI through their transceivers. I’m not going to do a DCF here because I believe that is an exercise best left to the reader: not because a DCF is worthless, but because your assumptions are such a big driver that I’d encourage you to make your own. If you do want a DCF, however, I’ll happily oblige; all you have to do is leave a comment.
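For anyone who wants to roll their own, the mechanics are simple enough to sketch. Every number below is a placeholder, not an estimate of mine; the whole point is that you plug in your own growth, discount rate, and terminal assumptions:

```python
# Bare-bones DCF skeleton -- every input here is a placeholder, not an estimate.

def dcf_value(fcf_next_year: float,       # free cash flow expected in year 1
              growth_rates: list[float],  # your explicit-period growth assumptions
              discount_rate: float,       # your discount rate (WACC) assumption
              terminal_growth: float) -> float:
    value, fcf = 0.0, fcf_next_year
    for year, g in enumerate(growth_rates, start=1):
        value += fcf / (1 + discount_rate) ** year  # discount each year's FCF
        fcf *= 1 + g                                # grow FCF into the next year
    # Gordon-growth terminal value, discounted back to today
    terminal = fcf / (discount_rate - terminal_growth)
    return value + terminal / (1 + discount_rate) ** len(growth_rates)

# Made-up inputs purely for illustration -- swap in your own:
value = dcf_value(fcf_next_year=1.5e9,
                  growth_rates=[0.25, 0.20, 0.15, 0.10, 0.08],
                  discount_rate=0.10,
                  terminal_growth=0.03)
print(f"~${value / 1e9:.1f}B")
```

Play with the growth list and the discount rate and watch how much the answer swings; that sensitivity is exactly why I’d rather you own the assumptions.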
In previous newsletters, I have talked about the fact that there is so much money being spent on data centers at the moment that it borders on the ridiculous: Microsoft is spending $50B USD per year on data centers (and the hardware inside), and they are not alone; Google and Amazon are making similar investments. There is competition for Marvell from actors like Broadcom, but Marvell’s market share in transceivers sits between 60% and 80%.
I think $MRVL has a shot at billions in revenue from AI, on top of their other data center offerings. While the stock has rallied since New Year’s, I still think it has room to grow. I will enter a long position this week.
I understand if, after looking more closely at the company, you have concerns about overvaluation in both absolute and relative terms. This post is not financial advice, so I would encourage you to do your own research. I am also more than happy to keep the conversation going in the comments.