NVIDIA can’t sell its best AI GPUs to China. The Western alliance led by the United States has long been reluctant to share its most advanced technologies with the Asian giant, but on October 7, 2022, the Biden administration tightened its grip dramatically by placing the entire Chinese semiconductor industry within the scope of its export restrictions.
One of the American companies hit hardest by the sanctions aimed at curbing China’s technological development is NVIDIA itself. The Commerce Department, the body that, among other functions, issues export licenses to US companies, has banned the company led by Jensen Huang from selling its A100 and H100 GPUs, chips designed for advanced artificial intelligence (AI) workloads, to Xi Jinping’s country.
The Chinese market is a very important source of income for some US semiconductor designers and manufacturers, which has led some of them to look for ways to keep doing business there without running afoul of the administration. NVIDIA has done exactly that: its strategy is allowing it to maintain its commercial ties with China while helping the Asian powerhouse’s companies involved in artificial intelligence to continue prospering in this discipline.
Chinese artificial intelligence presses on with ingenuity (and effort)
China cannot afford to be left out of the AI market. Currently its three best-positioned companies in this area are Baidu, Alibaba and SenseTime. All three have artificial intelligence models designed to deliver computer vision, natural language processing and content-creation services, among other capabilities, and without NVIDIA’s support they would probably be unable to continue developing them, at least in the short term.
Chinese AI engineers need roughly three times as many A800 or H800 chips to match the performance of the A100 or H100, to which they no longer have access
The company has the approval of the US Department of Commerce to sell its A800 and H800 GPUs in the Chinese market; these are slightly cut-down versions of the flagship graphics processors mentioned in the second paragraph of this article. In practice, these chips limit the capabilities of the multi-GPU platforms needed to train the most sophisticated artificial intelligence models, but Chinese experts have found ways to keep pace despite the restrictions these processors impose.
According to a group of experts consulted by The Wall Street Journal, the language model that ChatGPT is based on requires between 5,000 and 10,000 NVIDIA A100 GPUs. And according to Yang You, a professor at the National University of Singapore and founder of HPC-AI Tech, Chinese engineers working in artificial intelligence need roughly three times as many A800 or H800 chips to achieve comparable processing power, which drives up the cost of their hardware infrastructure considerably.
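A rough back-of-envelope calculation, assuming the figures above (5,000 to 10,000 A100s for a ChatGPT-class model, and a 3x chip count to match that throughput with A800s or H800s), illustrates the scale gap:

```python
# Illustrative arithmetic only, based on the estimates quoted in the article:
# a ChatGPT-class model needs ~5,000-10,000 A100s, and ~3x as many
# A800/H800 chips are assumed necessary for comparable processing power.
a100_range = (5_000, 10_000)   # GPUs needed with unrestricted A100s
scale = 3                      # assumed A800/H800-to-A100 ratio

a800_range = tuple(n * scale for n in a100_range)
print(a800_range)  # (15000, 30000)
```

Under these assumptions, a Chinese lab would need on the order of 15,000 to 30,000 export-compliant chips for a comparable training run, which is where the extra infrastructure cost comes from.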
Even so, Chinese companies are finding workarounds. Last April, Tencent deployed a cluster of NVIDIA H800 GPUs specifically designed to train its generative artificial intelligence models. Moreover, engineers at Chinese companies are devising ingenious methods that are allowing their AI ecosystem to keep growing at a good pace.
The two most effective are combining different types of chips in their hardware infrastructure and developing software techniques that ease the strain of training large AI models. In practice, AI development continues in China despite the US restrictions.
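The article does not detail which software techniques are in use, but one widely known example of this family is gradient accumulation: instead of processing one large batch per optimizer step, gradients from several small micro-batches are summed before a single weight update, trading extra compute time for lower peak memory on each GPU. A minimal pure-Python sketch (not any company's actual code, and with a toy one-parameter model standing in for a real network):

```python
# Hedged sketch of gradient accumulation on a toy model: minimizing
# 0.5 * (w*x - y)^2 over a batch, one parameter w, no framework needed.

def grad(w, x, y):
    # Gradient of the squared error 0.5*(w*x - y)^2 with respect to w.
    return (w * x - y) * x

def train_step(w, batch, lr=0.1, accum_steps=4):
    """One optimizer step: split the batch into micro-batches,
    accumulate gradients across them, then update w once."""
    micro_batches = [batch[i::accum_steps] for i in range(accum_steps)]
    g = 0.0
    for mb in micro_batches:
        for x, y in mb:
            g += grad(w, x, y)          # accumulate, don't update yet
    return w - lr * g / len(batch)      # single update with averaged gradient

# The result is mathematically the same as one large-batch step,
# but each micro-batch needs far less memory at once.
batch = [(1.0, 1.0)] * 4
w_accum = train_step(0.0, batch, accum_steps=4)
w_full = train_step(0.0, batch, accum_steps=1)
print(w_accum, w_full)
```

The key property, visible in the last two lines, is that the accumulated update matches the full-batch update exactly; the memory savings come from never materializing activations for the whole batch at once (which this scalar toy cannot show, but real frameworks exploit).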