Semiconductors: Nvidia is the best-performing S&P 500 stock in 2023

KeyBanc Capital Markets Equity Research Analyst John Vinh and Interactive Brokers Chief Strategist Steve Sosnick join Yahoo Finance Live to discuss the performance of Nvidia stock, which has been the best-performing stock in the S&P 500 this year.

Video transcript

- Of all the stocks rising on the fervor around AI, Nvidia stands alone. The chipmaker's shares have rocketed well over 100% this year, cementing its status as the best performer in the S&P 500. The company is back in focus this week ahead of Wednesday's earnings report after the bell. CEO Jensen Huang set expectations high back in March when he stated that Nvidia had hit the iPhone moment of AI. He said the capabilities of the technology created a sense of urgency for companies, and that sense of urgency is not exclusive to Nvidia. Investors have been clamoring to get on the AI train amid a FOMO reminiscent of the early days of crypto, or, as Steve told us earlier, the dot-com era. Joining us now is John Vinh, KeyBanc Capital Markets equity research analyst, and, as I mentioned, Steve Sosnick of Interactive Brokers is still with us.

So John, as we look at this huge surge of enthusiasm that we have seen this year, is Nvidia going to be able to meet those expectations?

JOHN VINH: Yeah. I think they are extremely well positioned to monetize this generative AI cycle, which is in the early stages of ramping right now. They've got an extremely dominant position. So if anyone is really working on training a large language model, which now potentially has up to trillions of parameters, there's really only one game in town: Nvidia's GPUs.

STEVE SOSNICK: The question, though, is this: having spent most of my career as a trader, I'm starting to think we're maybe a little bit ahead of ourselves going into a key event. Where do you feel the risk is after earnings, to the upside or to the downside? Do you think expectations, at least in the short term, have gotten a little ahead of themselves?

JOHN VINH: I think expectations are high. And I would say you're absolutely right. In a normalized situation, I'd be a little bit more wary about the setup going into earnings. But when I take a step back and look at Nvidia, it's probably one of the cleanest and best-positioned names within the semiconductor universe right now.

It's widely owned by both tech specialists and generalists. It's got two new product cycles ramping, on the data center side with the H100 as well as in gaming. And I think you guys just talked about FOMO. I think there are a lot of investors who felt like they really missed out on the run. And if there's any sort of pullback just on a mismatch in expectations, I think it's going to be pretty short-lived.

- Where should investors be focusing their attention? Because even thinking back to the number of AI mentions over the last earnings season and recent earnings calls, it was interesting. Meta had perhaps the best layout of what the pillars, or buckets, of AI might look like: the applications, the models underneath those applications, and, even further underneath, the chips. Is there a core segment in there that investors would be right to pay a little more attention to?

JOHN VINH: Yeah. I think what is going to be really key is obviously the data center performance. We do think demand is outstripping supply right now. They are just starting to ramp, so they are going to be supply-constrained. But I think what's going to be really important is the commentary on how their latest-generation Hopper H100 data center GPUs are ramping. This is going to be the core chipset that most cloud providers are going to use to train their new large language models going forward.

- I'm curious about differentiation in the chip business. As you said, Nvidia is far and away the leader in this area. Contrast that with something like a Micron today, which obviously is in a very different business, the memory chip business, and is now facing headwinds because of China regulation. Are we going to see a lot more dispersion within chips as AI ramps up more?

JOHN VINH: I think that's a good point. I think investors are really looking for more AI plays out there, right? And obviously, not everybody within the semiconductor universe benefits from that. Micron obviously is going through a more exacerbated downturn here. Part of it is that you have some bad actors in the memory space who had been overproducing up until recently. But as things bottom and kind of ramp back up, we do expect Micron to become a beneficiary of AI, because its content in an AI server is greater.

But if you're looking at who the big beneficiaries of AI are, it's going to be the compute semi names. The other name I think stands to benefit from AI servers and the ramp of generative AI is AMD. But certainly, Nvidia stands to be the top dog there.

STEVE SOSNICK: When do you see the cycle really picking up in terms of earnings? Because I think the stock price has outkicked the earnings right now. And unless the company has a blowout quarter this week, earnings will actually be down versus a year ago. Obviously, investors are looking forward. When do you really see the ramp-up kicking in?

JOHN VINH: I think we're going to see much more meaningful upside in the second half here. I think we're just getting started. And what gives me confidence, as I mentioned earlier, is that if you look at ChatGPT 3.5, which is in production right now, I believe that large language model was trained on roughly 250 billion parameters.

When you look at GPT 5.0 and some of the newer language models out there that are about to be trained over the next quarter or so, those LLMs have north of a trillion parameters, which means you're going to need a lot more data center GPUs from Nvidia, and you're also going to need their latest-generation GPUs, which carry a pretty significant ASP uplift. You're going from chips that cost maybe ten thousand dollars to over $20,000 for Nvidia here going into the second half.
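
For a rough sense of the scaling arithmetic behind that point, the sketch below is a back-of-envelope estimate, not anything from the interview: it assumes half-precision weights (2 bytes per parameter), a hypothetical 80 GB of memory per data center GPU, and illustrative per-chip prices of $10,000 and $25,000, then compares how many GPUs it takes just to hold the weights of a ~250-billion-parameter model versus a ~1-trillion-parameter one.

```python
# Back-of-envelope: how model size translates into GPU count and chip spend.
# All figures are illustrative assumptions, not numbers from the interview.

BYTES_PER_PARAM = 2        # assume fp16/bf16 weights
GPU_MEMORY_GB = 80         # assume an 80 GB data center GPU
GIB = 1024 ** 3

def gpus_to_hold_weights(num_params: int) -> int:
    """Minimum GPUs needed just to hold the model weights in memory."""
    weight_bytes = num_params * BYTES_PER_PARAM
    gpu_bytes = GPU_MEMORY_GB * GIB
    return -(-weight_bytes // gpu_bytes)   # ceiling division

models = [
    ("~250B-parameter model", 250_000_000_000, 10_000),    # hypothetical $10k/chip
    ("~1T-parameter model", 1_000_000_000_000, 25_000),     # hypothetical $25k/chip
]

for name, params, chip_price in models:
    n = gpus_to_hold_weights(params)
    print(f"{name}: >= {n} GPUs for weights alone, ~${n * chip_price:,} in chips")
```

Real training clusters are many times larger than this floor, since optimizer state, activations, and throughput requirements all multiply the GPU count, but the direction is the analyst's point: more parameters means more, and more expensive, GPUs.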

- John, kind of along those lines with the large language models. How do generative AI companies, or even applications, avoid some of the same security-risk claims from different nations that social media algorithms have already started to encounter, or have already been in the throes of?

JOHN VINH: Yeah. I mean, that's a complex topic, right? And if you set Micron aside, I think that one is less of a security issue; it's more of a retaliation and response to some of the sanctions the US government has placed on Chinese memory companies, namely YMTC. But what you're bringing up is a very complex topic, probably beyond the scope of what Nvidia can manage, and it probably becomes more of a policy question going forward.

- John, really helpful perspective this morning going into those Nvidia earnings. John Vinh, KeyBanc Capital Markets equity research analyst. And our thanks also to Steve Sosnick of Interactive Brokers for spending so much time with us this morning. Great to have you here. Appreciate it.

STEVE SOSNICK: Thanks for having me. This was a blast.

- Yeah. We'll do it again, for sure.

STEVE SOSNICK: Look forward to it.

- Thank you.