
Why Nvidia's Solo Will Continue Through The End Of This Year


This year, Nvidia looks set to remain the hottest name in the market.

There is no bad news in sight, only good news.

Of course, if the broader index corrects, Nvidia will pull back along with it...

But for those who wait, it should deliver an even bigger rally.




I went to AMD Korea press conference yesterday.

The MI300X is competing with Nvidia's AI semiconductors. It is already available to cloud providers, and server vendors will release their products sequentially from May.

When I talked to an acquaintance I know in the server field, they said real deployments would come in the second half of the year. As always, optimization takes time.

Intel also said Gaudi 3 is coming in the second half of the year, even though it has already been unveiled here. They claim the 5th-generation Xeon servers can cope with the inference market, but I wonder whether the stock price will follow.

AMD and Intel say they will take on the H100 with the MI300X and Gaudi 3, but Nvidia says it will start delivering the B200 in the second half of the year.

Of course, by the end of the year, cloud service providers will have more or less diversified options: their own chips, Nvidia's A100, H100, H200, and B200, AMD, Intel, and a number of other global AI semiconductor startups chasing Nvidia.

Oh, and Naver and Samsung Electronics' Mach-1 chip will be released next year, so I hope infrastructure prices will drop a little and more free services will appear.

By the way, I think SK Hynix's outlook is improving. The number of devices using HBM is growing rapidly. AMD and Intel are the obvious examples, and beyond them, cloud operators need HBM for the AI chips they are designing in-house. At AWS re:Invent 2023, held in Las Vegas last year, Amazon said it is taking HBM deliveries from SK Hynix.

I think demand will increase even more, since Microsoft is also making its own AI chips. Why doesn't the CFO talk about this properly instead of seeming buried only in Nvidia? lol
ChatGPT-4 Tips: I Think This Will Become a Business (1)
- YouTube Summary

In the beginning, I wondered what ChatGPT was good for beyond wordplay, but it turns out to be quite useful.

It's usually hard to watch a YouTube video to the end, and it takes a lot of time. Even when you do watch to the end, many videos are all title and no substance, and it feels like a waste of time... it must be hard to make money on YouTube. This is where the tool comes in handy. There are hundreds of other useful things besides this, but I'll just share one: paste a YouTube link, and it organizes the video into a summary for you. How useful is that?

Maybe it avoids reproducing the entire content because of copyright, and it simply helps me out (AI summarizing YouTube). The downside is that the summary is too brief, so there is a high chance that content gets left out. It should get better as GPT performance improves.
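The summarize-a-YouTube-link workflow above can be sketched as a map-reduce over the video's transcript: split the transcript into chunks that fit an LLM context window, summarize each, then combine. Below is a minimal sketch; the transcript fetch and the actual LLM call are placeholders (my assumptions, not part of any specific product), and only the chunking step is concrete.

```python
def chunk_transcript(text: str, max_chars: int = 3000) -> list[str]:
    """Split a long transcript into chunks that fit an LLM context window.

    Splits on sentence boundaries so no chunk exceeds max_chars
    (a single over-long sentence becomes its own chunk).
    """
    sentences = text.replace("\n", " ").split(". ")
    chunks, current = [], ""
    for s in sentences:
        s = s.strip()
        if not s:
            continue
        if not s.endswith("."):
            s += "."
        if current and len(current) + 1 + len(s) > max_chars:
            chunks.append(current)
            current = s
        else:
            current = f"{current} {s}".strip()
    if current:
        chunks.append(current)
    return chunks


def summarize(chunks: list[str]) -> str:
    # Placeholder: in practice each chunk would be sent to an LLM
    # ("Summarize this transcript chunk"), and the partial summaries
    # merged in a final pass. Here we just truncate as a stand-in.
    partials = [c[:80] for c in chunks]
    return "\n".join(partials)
```

This is also why such tools feel "too brief": each chunk is compressed independently, so detail is lost at both the map and the reduce step.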

I think it will be really useful for researchers: various language tasks, Excel editing, picture editing, PDF paper summaries, chart recognition, learning semiconductor design, and so on.

I'm looking forward to GPT5.

Love and thought
#GPT4 #Copilot #Utility #GPTApp #YouTube #Summary

==============================
Educational recap: "CXL Is Dead" - Why Are U.S. Sites Making This Shocking Claim? | Information Market CEO Kang Yong-woon, by Video Summarizer:

In this video, we explore different opinions about the current state and future of Compute Express Link (CXL) technology. Kang Yong-woon presents several evidences to support the claim that CXL technology is not suitable for the AI era, and conducts a detailed analysis on it.

The End of CXL: Experts claim that despite having high expectations in the past, CXL's importance has diminished with the advent of the AI era. Nvidia has already developed NVLink, a more efficient memory transfer system, limiting CXL's potential for development.

Market Forecast and Reality: While some experts predict that the CXL-related market will reach $15 billion by 2028, this may be an over-optimistic view. Indeed, AMD only theoretically supports CXL on its MI300A server processors, with very limited real-world use cases.

Technical Limitations and Bandwidth Issues: CXL technology significantly underperforms other technologies such as NVLink in terms of bandwidth. This is a problem caused by the low bandwidth CXL provides and excessive focus on latency. For this reason, Nvidia prefers its own NVLink over CXL.
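To put that bandwidth gap in rough numbers, here is a back-of-the-envelope comparison. The figures are approximate public spec numbers and are my assumptions, not from the video: CXL today typically rides a PCIe 5.0 x16 link (~64 GB/s per direction), while NVLink 4 on the H100 offers ~900 GB/s of aggregate GPU-to-GPU bandwidth. The raw ratio comes out larger than the ~3x average the video cites, since realized bandwidth depends on workload, topology, and protocol overhead.

```python
# Back-of-the-envelope interconnect bandwidth comparison.
# Approximate public spec figures (assumptions of this sketch):
#   - CXL over PCIe 5.0 x16: ~64 GB/s per direction
#   - NVLink 4 (H100):       ~900 GB/s aggregate (18 links x 50 GB/s)

CXL_PCIE5_X16_GBPS = 64
NVLINK4_H100_GBPS = 900

ratio = NVLINK4_H100_GBPS / CXL_PCIE5_X16_GBPS
print(f"NVLink 4 vs CXL-over-PCIe-5.0: ~{ratio:.0f}x more raw bandwidth")
# → ~14x more raw bandwidth
```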

Looking Ahead: Although CXL can be useful in certain areas, its importance will gradually decrease as major companies like Nvidia choose other solutions, which could have a significant impact on investment and research and development in CXL technology.

Insights based on numbers:

CXL-related market size is expected to be $15 billion by 2028, but this may not be realistic.
The bandwidth difference between NVLink and CXL shows that NVLink delivers on average three times faster performance than CXL.
The use of CXL technology is very limited on AMD 300A server CPUs, suggesting a large gap between theory and real-world use cases.
Example exploratory questions:

What is the main reason why CXL technology doesn't fit the AI era? (Enter 1 to ask)
What are the specific technical reasons why NVLink technology is preferred over CXL? (Enter 2 to ask)
Why is the CXL-related market forecast considered unrealistic? (Enter 3 to ask)
https://www.youtube.com/watch?v=h5j0IQZ4kN4

#ai #NVIDIA
