Why China's AI startups stand no chance in the ChatGPT race

By Yin Ruizhi
Technology Specialist

AI is all the rage at the moment, but technology expert Yin Ruizhi warns that China's AI-related startups may not stand a chance in the current environment, as big tech platforms swallow up smaller players and industry know-how remains firmly in the hands of sector leaders who would rather work with the tech giants.
A man takes a picture of robots during the World Artificial Intelligence Conference in Shanghai on 7 July 2023. (Wang Zhao/AFP)

Given the high cost of training, the recent wave of popularity generated by OpenAI does not make it any more feasible for startups to build a large language model (LLM) from scratch.

According to a report by research company TrendForce, when OpenAI was training ChatGPT's predecessor GPT-3, it used approximately 20,000 Nvidia graphics cards. With each Nvidia A100 card costing around US$10,000, this would mean a total cost of US$200 million.

Industry insiders estimate that ChatGPT would require more than 30,000 graphics cards. This is a daunting scale even for OpenAI itself, with CEO Sam Altman commenting that OpenAI might need to try to raise US$100 billion of funding in the next few years. Financing on this scale is something that startups relying on early-stage funding cannot even begin to imagine.
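
As a rough back-of-envelope check of these hardware figures, the short Python sketch below multiplies the card counts cited above by the approximate A100 unit price. The ChatGPT figure is an extrapolation under the same unit-price assumption, not a number given in this article.

# Back-of-envelope check of the training hardware costs cited above.
# Card counts and the unit price are the estimates quoted in this article, not official figures.
A100_UNIT_PRICE_USD = 10_000       # approximate price of one Nvidia A100 card
GPT3_CARDS = 20_000                # cards reportedly used to train GPT-3
CHATGPT_CARDS = 30_000             # insiders' estimate for ChatGPT

gpt3_cost = GPT3_CARDS * A100_UNIT_PRICE_USD         # US$200 million, matching the figure above
chatgpt_cost = CHATGPT_CARDS * A100_UNIT_PRICE_USD   # ~US$300 million under the same assumption

print(f"GPT-3 hardware: ~US${gpt3_cost / 1e6:.0f} million")
print(f"ChatGPT hardware: ~US${chatgpt_cost / 1e6:.0f} million")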

So, though the recent wave of AI-related startups seems to be all the rage in China, in truth not many of these new firms are truly centred around building LLMs. Startups can generally be separated into two categories based on their target customers: those that leverage LLMs to provide services directly to consumers, and those that leverage LLMs to provide services to businesses.

While the media constantly hypes up this wave of AI-driven entrepreneurial opportunities, insiders are adopting a very cautious attitude towards these two business tracks, especially in China.

In China, large online platforms have a history of taking over small and medium-sized third-party applications.

Consumer-facing applications: taken over by large platforms

The first entrepreneurial route that industry insiders advise approaching with caution is consumer-facing commercial applications.

In China, large online platforms have a history of taking over small and medium-sized third-party applications. For example, during the early days of short-video e-commerce, Kuaishou leveraged the third-party Mockuai e-commerce platform to explore and develop the use of short videos for e-commerce, even investing in Mockuai. Once the short-video e-commerce model matured, Kuaishou immediately entered the market on its own and built its own e-commerce platform. Similar episodes have played out with Tencent's WeChat app and Douyin's local life services platform.

A response by ChatGPT, an AI chatbot developed by OpenAI, is seen on its website in this illustration picture taken on 9 February 2023. (Florence Lo/Reuters)

So, amid this wave of AI-centric entrepreneurship, we have no reason to believe that large platforms would stay out of the fray in the long run, allowing new companies to grow and freely explore the consumer-facing services market.

Large platforms house the three essential pillars of models, computing power and data, which gives them an ever-greater capacity to undermine and absorb the core businesses of startups.

In the short run, startups are allowed to pursue new LLM-based AI applications on existing major online platforms because the outlook is not yet clear to the platforms themselves. In the long term, however, it is hard to define and defend territory built on these platforms, and startups will hardly stand a chance.

Decreasing costs of developing applications

Currently, entrepreneurs in China and the US are mainly targeting the business-facing LLM market, which can be inferred from one detail: OpenAI's website visits decline sharply during weekends and holidays, and rise on normal work days. For business-facing applications, cost is always a key factor; unlike consumer-facing applications, they cannot rely on low unit costs that let a company keep growing as long as user numbers rise sufficiently.

For entrepreneurs building industry-specific vertical applications on top of external LLMs, the good news is that the cost of developing such LLM-based applications will go down. Alibaba Cloud has said it hopes the future cost for businesses to train a model will be one-tenth, or even one-hundredth, of current costs, and that small and medium enterprises will be able to draw on the capabilities and services of an LLM through its cloud service platform. Three months after launch, the cost of Baidu's LLM had fallen to 10% of its launch-time level, with Baidu Cloud adding that "price should not be a factor hindering people from using or embracing the use of LLMs".

...the most valuable information relating to this know-how may not be found online, but in companies' private databases, or even in the minds of a select few experts.

Business-facing applications: data thresholds and limits of LLM 'intelligence'

But the problem is that, cost aside, generic LLMs are unable to optimally meet the demands of different industries. The biggest issue with generic LLMs is that their "intelligence", or effectiveness, is determined by the data used in training, which sets the upper limit both of the models' "intelligence" and of the commercial know-how they can capture.

Visitors take pictures of robot arms at the 2023 World Robot Conference in Beijing on 16 August 2023. (Wang Zhao/AFP)

The core issue is that each industry has its own unique know-how, and the most valuable information relating to this know-how may not be found online, but in companies' private databases, or even in the minds of a select few experts. Even if large platforms like OpenAI, Google, Huawei and Baidu spend a fortune to improve the abilities of their LLMs, it would be difficult for them to overcome this data threshold.

To elaborate, generic LLMs are not directly applicable in government, urban, professional or corporate settings. Beyond their lack of professional depth, generic LLMs also bring other pain points, such as data security risks, no guarantee of data integrity, and a lack of cost control.

...industry-specific applications with a higher probability of success would be created by industry leaders with access to sufficient resources and knowledge, working directly with major platforms.

No space for startups?

Business-facing startup founders who began with a technical advantage in AI may find that they rose early only to arrive late at the market. This is because industry-specific applications with a higher probability of success would be created by industry leaders with access to sufficient resources and knowledge, working directly with major platforms.

This is why, when Baidu and Alibaba raced to release generic LLMs, Tencent instead chose to lead the charge in releasing an industry-specific LLM, competing with Baidu and Alibaba for business-facing clients.

The current wave of AI-related entrepreneurship might seem to be in full swing, but in truth it has been a chilling prospect for startup entrepreneurs right from the start.
