No stock can escape market sentiment.
Source | Zhitong Finance APP
A year ago, when I wrote "Nvidia, has the share price sprung a leak?", the logic was not especially rigorous, but readers knew the point was the final sentence: this is a brief pullback, and Nvidia can still rise a lot.
A year later, Nvidia's short-term share price has arrived at almost exactly the same spot.
(Chart source: Wind)
Standing here today, can you still firmly believe that "this is a brief pullback, and Nvidia can still rise a lot"?
If you cannot, what factors are making us waver?
Those risk factors
No stock can escape market sentiment.
The day before Nvidia peaked, Wall Street's biggest bull was arguing the stock still had roughly 50% of upside left: Rosenblatt Securities analyst Hans Mosesmann raised his price target on Nvidia from $140 to $200, implying about 50% more room to rise.
Even without wading through those lengthy research reports, you can tell that Nvidia's near-term results still hinge on the giants' capital-expenditure plans and on sovereign deployments of computing power.
The "sovereign AI" that Huang Renxun has lately been promoting everywhere refers to a country's ability to build AI with its own infrastructure, data, workforce and business networks. Nvidia's "sovereign AI" customers come from Singapore, Japan, France, Italy, India and elsewhere. Nvidia disclosed in May that "sovereign AI" is expected to bring in nearly $10 billion of revenue this year (about one-third of last year's total revenue), up from zero the year before.
Now look at the giants' capital expenditure:
Microsoft's capital expenditure in the first quarter of 2024 was $14 billion, up 79.4% year on year. Microsoft has indicated that fiscal-2024 capex will be 80.8%-89.6% higher than in fiscal 2023.
Google guided that quarterly capex for the rest of the year will be at or slightly above the roughly $12 billion spent in the first quarter.
Meta raised its capital-expenditure guidance for this year from $30-37 billion to $35-40 billion, even though its Q1 capex fell 5% year on year.
Among the tech giants that still need to build out cloud infrastructure at scale, that leaves Amazon. Amazon did not disclose a full-year figure with its quarterly results, but the market expects around US$60 billion, up roughly 4%.
So are the giants doubling their capital expenditure this year?
These four together account for 50%-plus of Nvidia's total sales. Nvidia's total revenue was roughly $27 billion in 2022 and reached the $60 billion level in 2023, more than doubling. From a 2023 base of $60 billion, how likely do you think another double is in 2024?
The giants' capital spending is also cyclical. Even if it doubles again this year, what about next year?
As for Nvidia's penetration into more application layers such as software, healthcare, robotics and intelligent driving, those pies can keep being drawn, but confirming them will take a much longer run of results.
Which brings us to the question: since OpenAI launched ChatGPT at the end of November 2022 and set off the public's fascination with artificial intelligence, Nvidia's stock has risen about 700%.
Should investors sitting on a sevenfold gain weigh whether the next 50% of potential upside is worth holding on for?
Don't treat Nvidia like Tesla
Everyone has heard the saying: computing power is the "oil" of the new era.
Humanity evolved from "people of the Stone Age" into "people of the Bronze Age", then into "people of the Iron Age", then into "people of the carbon era"; next, our descendants may be "people of the silicon era".
Of course, in digital form, part of us, or of our lives, may even live on indefinitely once we become "people of the silicon era". At least Huang Renxun and Musk harbor such ambitions.
Those who mastered the craft of polishing stone tools, those who mass-produced bronze and iron, and those who produce computer hardware and software have, throughout history, become the "kings" of wealth and the economy. Huang Renxun is no exception. As the monopoly producer of AI infrastructure, it was he, and no one else, who ended up on the "Frozen Throne".
(Huang Renxun: computing power is the source of power. Source: Tencent Video)
The growth of Nvidia's market value has been astonishing.
Nvidia's ascent inevitably recalls Cisco, the giant of the internet era: from its IPO in 1990 to 2000, Cisco's stock rose more than 1,000-fold, and it briefly became the world's most valuable company.
Analysts have drawn the comparison too.
On Sunday, June 23, BTIG strategist Jonathan Krinsky warned in a note that Nvidia's run has outstripped that of any American company during the late-1990s tech bubble, with the stock trading roughly 100% above its 200-day moving average. Since 1990, no U.S. company has become the largest by market value while trading more than 80% above its 200-day moving average. The closest case was Cisco in March 2000: at the time its stock was 80% above its 200-day moving average, which marked its all-time high.
Over the past five years Nvidia's stock has risen about 4,280%, almost matching the roughly 4,460% gain Cisco posted in the five years before its market value peaked.
Is this a coincidence of history?
Today Nvidia is the computing-power hegemon of the AI era; will it repeat Cisco's internet-era ending?
As noted above, many people see Nvidia as a hardware company that dominates the world simply because its hardware is good.
An NPU's structure is far simpler than a CPU's. It is a special-purpose chip: it is not general-purpose programmable and needs only a handful of instructions. Compared with a general-purpose chip like a CPU, the control logic is stripped down; what remains is a mass of compute units and caches, delivering huge throughput via parallel computation. And that computation is matrix arithmetic, built on just two operations: multiply and add.
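To make the "just multiply and add" point concrete, here is a minimal CUDA sketch (purely illustrative, not code from Nvidia or any of its libraries): it launches a large grid of lightweight threads, each doing a single multiply-accumulate, which is essentially the primitive that matrix operations decompose into.

```cuda
// Minimal sketch: massive parallel throughput built from nothing but multiply and add.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void multiply_add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one element per thread
    if (i < n) {
        out[i] = a[i] * b[i] + out[i];               // a fused multiply-add
    }
}

int main() {
    const int n = 1 << 20;                           // ~1M elements, illustrative size
    const size_t bytes = n * sizeof(float);
    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; out[i] = 3.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    multiply_add<<<blocks, threads>>>(a, b, out, n); // thousands of threads in flight
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);                 // expect 5.0 = 1*2 + 3
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```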
At the design level, teams that can design CPUs can design NPUs without breaking a sweat. Undeniably, Nvidia does lead in high-end NPUs, but the key to that lead is not superior design; it is a superior manufacturing process.
So in the field of AI, Nvidia's GPU is not the only option, nor necessarily the best one. Google's TPU, for instance, may do better under certain conditions.
What truly drives competitors to despair is not the superiority of Nvidia's NPUs but the "grand unification" achieved by CUDA.
In other words, Nvidia built out a complete ecosystem earlier than anyone else, leaving its opponents daunted and exposed. Parallel computing on the market today basically means Nvidia's CUDA. AMD's ROCm is a competitor in theory, but you rarely hear of anyone actually using it, and plenty of engineers say ROCm is painful to work with.
As for why CUDA has been able to build such a crushing advantage: in the early years Nvidia pushed CUDA with frightening intensity, willing to burn money on both promotion and R&D. Take a look at this screenshot of a discussion and you will get an intuitive feel for it:
It is much like how WeChat developed in Chinese mobile social networking. To this day WeChat dominates. Even if a competitor launches a new messaging app, your friends and colleagues are already on WeChat; how likely are you to migrate?
Can't CUDA simply be copied? Can't competitors translate CUDA's source code? It really doesn't work.
CUDA's high-performance libraries, such as cuBLAS, cuDNN and cuFFT, are all closed source; open-source alternatives such as CUTLASS cannot match their performance. Huang Renxun is not running a charity: he raises the sheep so that one day there is wool to shear.
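As a rough illustration of how deep that lock-in runs, here is a sketch of the kind of call a typical compute stack makes into the closed-source cuBLAS library (the matrix sizes and values are arbitrary assumptions for demonstration). Once code is written against these APIs, moving it to another vendor means rewriting and revalidating it.

```cuda
// Sketch of a single-precision matrix multiply routed through the proprietary cuBLAS library.
#include <cublas_v2.h>
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

int main() {
    const int n = 1024;                                  // square matrices for simplicity
    std::vector<float> hA(n * n, 1.0f), hB(n * n, 2.0f), hC(n * n, 0.0f);

    float *dA, *dB, *dC;
    cudaMalloc(&dA, n * n * sizeof(float));
    cudaMalloc(&dB, n * n * sizeof(float));
    cudaMalloc(&dC, n * n * sizeof(float));
    cudaMemcpy(dA, hA.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);

    const float alpha = 1.0f, beta = 0.0f;
    // C = alpha * A * B + beta * C, executed by the closed-source library.
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                n, n, n, &alpha, dA, n, dB, n, &beta, dC, n);

    cudaMemcpy(hC.data(), dC, n * n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("C[0] = %f\n", hC[0]);                        // expect 2048 = 1024 * (1 * 2)

    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}
```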
Seen from this angle, it is easy to understand why Nvidia cracks down on other chips emulating CUDA and forbids it outright.
By binding its hardware tightly to the CUDA "moat", Nvidia has captured the excess profits of the entire upstream and downstream.
To undermine that position, one of two things has to happen:
First, the downstream players who have been squeezed for so long band together and build their own boats and ladders to escape. The tens of billions the tech giants spend on capex have flowed to Nvidia, and they have long resented it. But it is very hard for any single company to build an ecosystem alone, unless a genuinely powerful open-source alliance emerges (it has already started), the way Android rose to compete with iOS. A year from now we can look back and see how CUDA's moat is holding up.
Second, an entirely new technology tree emerges and eclipses the AI wave. How likely is that in the short term?
So, judging from Nvidia's strategic positioning, it will not fall into Cisco's predicament for the time being; for the foreseeable future Nvidia will remain the hegemon of this field.
But that does not mean "Cult Leader Huang" can sit back on his high throne and relax.
The wind that lifts a pig can also dash it to the ground
Nvidia's rise, after all, has been carried on exactly this wind.
Nvidia's CUDA ecosystem is a moat, but the key to its hardware lead is not design; it is process technology. In other words, the deepest layer of the moat is TSMC's chip manufacturing, and beneath that, ASML's EUV lithography machines.
Then why, since last year, have TSMC and ASML also risen, but nowhere near as much? Because it happens to be Nvidia that is the pig sitting in the AI wind.
Why is the AI tailwind so fierce?
Here is part of a discussion from "Wudaokou Macro" that briefly answers the question:
Over the past two years, at home and abroad, the market's hunger for and pursuit of technological breakthroughs has reached fever pitch. From the metaverse and room-temperature superconductivity to ChatGPT and artificial intelligence, some of these concepts can be falsified with a little common sense in technology or physics (or are simply beyond today's basic theory and hardware), yet financial giants and tech leaders keep piling in. Why? Because we are now near the end of this round of the computer/internet innovation cycle, at the critical juncture where the next innovation cycle has not yet begun and the long debt cycle is at risk of blowing up.
This round of the computer/internet innovation cycle has seen two peaks, driven by the United States and India, and with the final internet climax brought by the pandemic in 2020-2022, the cycle is nearing its end. Whether the new wave of artificial intelligence can kick off a new innovation cycle remains the biggest variable. Compared with the dot-com bubble of 2000, there are both similarities and differences:
The similarity lies in the current fanaticism of the capital markets. This year, primary-market financing for AIGC companies has already exceeded 150 billion yuan, versus only 20-30 billion yuan in all of 2022; the secondary market is even more striking. The Dow has hit record highs, and most of the relevant valuation indicators are close to their 2000 peaks.
The difference is that the previous dot-com bubble ran on storytelling, whereas today's ChatGPT does have a path to real-world deployment. Going forward, U.S. equity performance will remain deeply bound to ChatGPT; how ChatGPT-5 performs in 2024 will determine whether U.S. stocks suffer a large drawdown, which may in turn feed into the Fed's decision on rate cuts.
Nvidia's share-price trajectory, then, involves not only its own results and market sentiment; I am afraid it also cannot be separated from the broader direction of U.S. stocks and the U.S. dollar.
Beyond that, we should not forget that throughout human history, the polishing of stone tools, the mass production of bronze and iron, the monopoly on oil extraction, and the monopoly on computer and software manufacturing, all these channels to the top of the wealth pyramid, have always been cooked together with politics and struggle.
The real chokehold on Old Huang
Back to that familiar line from earlier: computing power is the "oil" of the new era. Do you think it holds up?
No. Computing power is still like stone and bronze; it is not energy.
It can become infrastructure, but only by relying on energy.
If electricity has a ceiling, computing power will hit a bottleneck, unless the energy consumed per unit of computing power becomes much smaller, or falls at a very fast rate.
Each H100 draws up to 700 watts, almost eight times the power consumption of a 60-inch flat-panel TV; the B100's power draw is nearly double the H100's.
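A rough back-of-envelope illustration (the cluster size here is a hypothetical assumption, not a figure from this article): a deployment of 100,000 H100s at 700 watts apiece would draw

100,000 × 700 W = 70,000,000 W = 70 MW

for the accelerators alone, before counting cooling, networking and storage.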
Huang Renxun himself admits that global AI hegemony may well depend on which countries have enough data centers and enough electricity to support the technology.
In practice, data centers can be built quickly, and enough GPUs can be bought quickly to fill them. But building power-generation facilities and transmission networks takes time.
First, rapidly increasing the power supply requires objective conditions to be in place.
In 2023, net power generation in the United States was 4,178.171 billion kWh, down 1.2% year on year. Of that, the electric power sector generated 4,022.339 billion kWh, down 1.3%; the commercial sector's own generation was 16.675 billion kWh, down 0.4%; and the industrial sector's own generation was 139.157 billion kWh, down 0.6%.
EIA data show that total U.S. electricity consumption in 2022 was about 4.06 trillion kWh (the 2023 figure has not yet been released), from which one can roughly infer that U.S. consumption and generation are currently in near balance, with a gap of under 3%. Does the United States have the preconditions for explosive growth in power supply?
OpenAI CEO Sam Altman has said that adapting to this new situation will require an energy breakthrough, most likely from nuclear power.
Amanda Peterson Corio, Google's global head of data center energy, has said, "We need terawatts of green energy," roughly the equivalent of 1,000 nuclear power plants.
Microsoft and Amazon are also betting on nuclear power. Amazon recently bought a nuclear-powered data center in Pennsylvania and has expressed willingness to buy more. For now, though, the path forward remains unclear; nuclear capacity is not something you can conjure up just because you want it.
Second, even when all the conditions are in place, grid construction takes time, and some projects face waits of two to four years.
Data-center provider DataBank built a new data center in Virginia. Such is the frenzy of the AI equipment race that before DataBank had arranged power for "a big customer", the client had already stuffed the building with servers. But there was no electricity. "This is an artificial-intelligence problem," said James Mathes, who manages some of DataBank's facilities. "Right now, artificial intelligence is like a blank check."
Dominion Energy Inc., which supplies power to the "data center capital of the world", has connected 94 data centers that together consume roughly four gigawatts of electricity. Now the company is fielding requests from data-center campuses whose demand will run to many gigawatts, enough to supply hundreds of thousands of households; two or three of them alone may use as much electricity as all the facilities it has connected since 2019 combined. Data-center developers now have to wait longer to plug their projects into the grid. "It could be as little as about two years or as long as four, depending on what needs to be built," Dominion Energy president Edward Baine said in an interview.
From these Bloomberg reports we cannot conclude that the power supply will fail to keep pace with Nvidia's GPU output. But a threatening signal is visible: Nvidia's GPUs are starting to pile up idle in some data centers.
According to media reports, around Christmas last year Huang Renxun convened a series of meetings with company executives to discuss a growing worry: Nvidia's largest customers might be unable to install their AI chips for lack of data-center space, which could in turn hit GPU sales. Another interview confirmed as much. DataBank CEO Raul Martynek said, "Nvidia will not ship unless customers can prove they have enough data-center capacity to house these GPUs."
In fact, the worry gnawing at Old Huang may have begun as a shortage of data-center space. By now, though, the focus is no longer whether data centers can serve as a "buffer pool" to absorb his sales, but something more fundamental: there is not enough electricity.
And even if generation is sufficient, there is still the worry that transmission capacity is not.
Everyone is scrambling for shovels, and whether those shovels will actually dig up gold is still in dispute. But the power to swing the shovels cannot keep up with the growth in the number of shovels: that is the awkward reality about to be faced.
When the giants that have been ramping up capital expenditure find the shovels they grabbed sitting in data centers gathering dust for a year or two, what will they think?
Don't Nvidia's own executives have some sense of this?
Of course they do.
Since Nvidia reported its first fiscal quarter on May 22, more than a third of insiders have chosen to trim their holdings, a multi-year high.
Alas, that is practically showing their hand.
In all likelihood, AI king Nvidia is headed into a long stretch of adjustment, and it may already have begun this year.
What do you think? Feel free to leave a comment and discuss.