When it comes to AI, what comes to mind first is often AlphaGo sweeping the Go masters, the astonishing agility of Boston Dynamics' robot dog, or the eye-catching debut of Tesla's humanoid robot.
These stories grab public attention, but what is the real AI industry like? What role does AI computing power play in social and economic development? What are the trends in AI, and where is it actually being applied?
Recently, the 2021 AI Computing Conference (AICC 2021) was successfully held. Partners from across the industry discussed the hot topics in today's AI industry in depth: how AI can enable technological innovation, social governance and industrial upgrading under the new pattern of the digital economy, and how to accelerate the construction of a digital China and a smart society.
AICC 2021, these AI hot topics you must know
At the conference, AI companies showcased more than 600 of their latest AI products and application results, and the Evaluation Report on the Development of AI Computing Power in China 2021-2022 was released, interpreting the current development and industry penetration of AI computing power in China. Big Data Online also had the opportunity for in-depth exchanges with six representatives of the AI industry on the hot topics of the moment.
The ideal and reality of intelligent assistants
Smart assistants have become increasingly popular and are now standard on all kinds of phones, smart speakers and even cars. Yet there is still a gap between the experience of using them and people's expectations. How should we view this situation?
Wan Yulong, chief architect of OPPO's Xiaobu Assistant, believes users judge an intelligent assistant by the gap between their expectations and its actual performance: the larger the gap, the greater the disappointment. In certain vertical scenarios the assistant is already very close to user expectations, but across broader scenarios there is still much room for improvement; Xiaobu Assistant's goal is to become an all-round assistant for every scenario.
Since its launch in December 2018, Xiaobu Assistant has become the built-in smart assistant on OPPO, OnePlus and Realme smartphones and IoT devices. In three years of development it has been activated on nearly 250 million devices, and in February 2021 it became the first mobile voice assistant in China to exceed 100 million monthly active users.
Wan Yulong said Xiaobu Assistant is positioned to provide cross-terminal, full-scenario, multimodal intelligent interaction across OPPO's entire line of smart hardware, which is far more difficult and challenging than an assistant for a single professional scenario. "Optimizing full-scenario interaction places higher demands on algorithm design and data diversity. At the algorithm level, to address the high cost of labeling and the limited diversity of data, we will gradually move from supervised to unsupervised training schemes; at the computing level, OPPO is also actively cooperating with chip and server manufacturers at home and abroad to keep improving AI computing power in support of model training and the performance of new multimodal algorithms."
At present, Xiaobu Assistant mainly targets mobile phones, watches and other wearables, the smart home and smart travel. Looking ahead, it will continue to develop toward virtual digital humans, providing each user with personalized, tailor-made assistant services.
Everything can be computed
IDC predicts that global AI market spending will reach US$85 billion in 2021 and grow to US$200 billion by 2025, a five-year compound annual growth rate (CAGR) of about 24.5%. By 2025, about 8% of the world's AI-related spending will come from the Chinese market, ranking third among the nine regions tracked worldwide.
A huge market means huge opportunities. Many new chip companies have been founded to pursue breakthroughs in the computing power market, and Tianshu Zhixin is a typical Chinese representative among them.
Zou Ju, vice president of products at Shanghai Tianshu Zhixin Semiconductor Co., Ltd., believes the company's biggest advantages as an AI chip maker lie in three areas:
First, no historical baggage, so developers can travel light. When designing its chip architecture and software, Tianshu Zhixin can discard legacy-facing components, substantially improving the quality of service customers pay for, down to the performance delivered per watt.
Second, the team is made up of technical experts with many years of experience in GPUs and computing. With that deep industry experience, it can make the most of the transistors in each functional unit of the chip.
Third, a better understanding of the Chinese market's needs. From applications to software to chip design, Tianshu Zhixin has its own distinctive ideas. The future is an era in which everything can be computed, and that makes the GPU all the more promising.
Zou also acknowledges an undeniable fact about the GPU market: the giants have worked on this track for many years, while domestic chip companies have only just found their footing in high-end chip design talent, IP, architecture innovation, application iteration and other respects, and still need sustained effort.
As for how to close the technology gap with the giants, Zou said the key is to keep learning from market demand, iterate the technology continuously in the market, and refine it through market feedback. This cannot be achieved in a day or two; it takes steady, down-to-earth work.
Finally, Zou shared his view of where AI computing power is headed. As AI lands in more and more scenarios, demand for computing power keeps growing. The most common industry response is parallelization, but ever-larger parallel structures bring problems of their own. Tianshu Zhixin is committed to keeping large parallel systems balanced as they scale, so that chip energy efficiency remains optimal as computing power continues to grow. At the same time, it is working with industry partners to gradually establish evaluation standards for general-purpose computing and to keep contributing to the industry.
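Zou's point, that balance rather than raw chip count dominates at scale, can be made concrete with a toy scaling model. The cost model and all numbers below are illustrative assumptions for exposition, not any vendor's actual figures: an Amdahl-style serial fraction plus a synchronization cost that grows with cluster size.

```python
# Illustrative scaling model (assumed numbers, not real measurements):
# adding chips shrinks the parallel portion of the work, but a serial
# fraction and growing communication overhead erode per-chip efficiency.

def effective_speedup(n_chips, serial_fraction=0.05, comm_overhead=0.001):
    """Amdahl's law plus a communication cost that grows with cluster size."""
    parallel_time = (1 - serial_fraction) / n_chips
    comm_time = comm_overhead * (n_chips - 1)  # sync cost rises as we scale
    return 1.0 / (serial_fraction + parallel_time + comm_time)

for n in (1, 8, 64, 512):
    s = effective_speedup(n)
    print(f"{n:4d} chips -> speedup {s:6.2f}, per-chip efficiency {s / n:.1%}")
```

Past a certain size the per-chip efficiency collapses, which is exactly the balance problem the text describes.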
What is software-defined computing power worth?
The most influential data center technology of the past decade has been "software-defined" everything.
In the virtualization era, companies represented by VMware made data center computing resources far more efficient by virtualizing server CPUs; software-defined networking, software-defined storage and other technologies followed. Now, with the spread of cloud computing technologies and concepts, software definition has become a core data center technology.
We have now entered the era of AI computing power. In AI chips, beyond giants such as NVIDIA, there are many kinds of AI chip companies, technologies and products, and user data centers face an era of heterogeneous computing. With GPUs and other AI chips multiplying, making AI computing power serve people more efficiently, quickly and flexibly is the original motivation for software-defined computing power.
On the software-defined computing track, a representative Chinese innovator is committed to solving the challenge of using AI computing power efficiently.
Industry data shows that GPU utilization in user data centers currently runs at only 10%-30%, leaving huge room for improvement. Zhang Zengjin, technical director of Tencent Technology, sees three major problems with AI computing power today: first, many businesses' demand for GPU cards is highly elastic, and matching computing power to business departments' needs is a major industry problem; second, how to integrate GPU capabilities into existing PaaS platforms and cloud management platforms; third, the management, monitoring and lifecycle management of heterogeneous cards of different brands and models.
"The core value of software-defined computing power is to reduce costs and increase efficiency," said Zhang Zengjin.
So, in which data center scenarios can software-defined computing power shine? Zhang Zengjin sees four that are a very good fit.
First, fetching resources from afar. For example, a server without a GPU card may still need to run AI applications; it can call the resources of a GPU server remotely over the network. Likewise, in many training scenarios CPU and GPU resources are mismatched, and GPU resources must be called over the network.
Second, breaking the whole into parts. Some inference workloads do not need a full GPU's resources; the GPU can be sliced by percentage so that several business applications share one card, making full use of its capacity.
Third, pooling. Resources scattered across different GPU servers, and even fragmented resources on a single card, can be aggregated and then supplied to one AI application.
Finally, on-demand allocation. AI applications' resource needs tend to be elastic, so GPU resources must be adjusted dynamically; an application may need four cards in the morning and two in the afternoon. The adjustment has to happen in real time, by changing parameters alone, without shutting down or restarting the application.
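The slicing and live-resizing ideas above can be sketched with a toy allocator. Everything here is hypothetical: the class, the percentage-based slicing scheme and the job names are invented for illustration and do not reflect any real product's API.

```python
# Toy sketch of "software-defined" GPU pooling: cards are sliced by
# percentage so several jobs share one card, and a job's slice can be
# resized in place without restarting it.

class GPUPool:
    def __init__(self, num_cards):
        self.free = {card: 100 for card in range(num_cards)}  # percent free
        self.allocations = {}  # job -> (card, percent)

    def allocate(self, job, percent):
        """Place a fractional slice on the first card with enough room."""
        for card, free in self.free.items():
            if free >= percent:
                self.free[card] -= percent
                self.allocations[job] = (card, percent)
                return card
        raise RuntimeError("pool exhausted")

    def resize(self, job, new_percent):
        """Adjust a job's slice live -- no shutdown or restart needed."""
        card, old = self.allocations[job]
        if self.free[card] + old < new_percent:
            raise RuntimeError("not enough room on card")
        self.free[card] += old - new_percent
        self.allocations[job] = (card, new_percent)

pool = GPUPool(num_cards=2)
pool.allocate("inference-a", 30)   # two inference jobs share card 0
pool.allocate("inference-b", 50)
pool.allocate("training", 80)      # does not fit on card 0, spills to card 1
pool.resize("inference-a", 10)     # scale down at off-peak, in real time
print(pool.free)                   # remaining capacity per card
```

The same mechanism, pointed at remote cards over a network, covers the "fetching from afar" and pooling scenarios as well.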
AI and Smart Supply Chain
In the logistics world, one term is closely tied to AI: the intelligent supply chain.
In fact, whether for online e-commerce or offline retail, the core capability behind the scenes is the supply chain. When we praise JD Logistics for fast delivery and a good shopping experience, a strong supply chain is what makes it possible.
Today, AI is profoundly shaping supply chain development; deep data mining and AI self-learning both weigh heavily on decisions at every link of the chain. Dr. Zhuang Xiaotian, senior director of intelligent supply chain in JD Logistics' AI and Big Data Department, explained that AI is now widely embedded in three major links, supply chain design and planning, supply chain planning and management, and supply chain execution and operation, to guarantee end-to-end efficiency.
In supply chain design and planning, for example, logistics networks must be considered in terms of points, lines and areas, including business units' terminal stations and regional warehousing. Across JD Logistics' six major networks, including express delivery, freight and cold chain, the supply chain team uses operations research optimization algorithms at scale to find the best overall balance of cost and user experience within a complex network.
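As a much-simplified illustration of the kind of operations research involved (the sites, costs and demands below are invented, and real network design is vastly larger and uses serious solvers), a brute-force facility-location toy picks which warehouses to open to minimize fixed plus shipping costs:

```python
# Toy facility-location problem: open the subset of candidate warehouses
# that minimizes fixed costs plus shipping costs to every demand region.
from itertools import combinations

fixed_cost = {"North": 120, "East": 100, "South": 90}
ship = {  # shipping cost per unit from warehouse to demand region
    "North": {"r1": 4, "r2": 9, "r3": 8},
    "East":  {"r1": 6, "r2": 3, "r3": 7},
    "South": {"r1": 9, "r2": 8, "r3": 2},
}
demand = {"r1": 10, "r2": 12, "r3": 8}

def total_cost(open_sites):
    cost = sum(fixed_cost[w] for w in open_sites)
    for region, qty in demand.items():
        # each region is served by its cheapest open warehouse
        cost += qty * min(ship[w][region] for w in open_sites)
    return cost

best = min(
    (subset
     for r in range(1, len(fixed_cost) + 1)
     for subset in combinations(fixed_cost, r)),
    key=total_cost,
)
print(best, total_cost(best))
```

Enumerating subsets only works at toy scale; the point is that "point, line and area" decisions reduce to exactly this kind of cost-minimization structure.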
In supply chain planning and management, the core is inventory. Dr. Zhuang Xiaotian said the whole chain revolves around managing inventory better, resting on two capabilities: prediction and simulation. JD.com, for example, forecasts demand by category to learn where consumers are and what they want, and then positions inventory in the most appropriate places accordingly.
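A minimal sketch of that prediction-to-placement idea, using an invented demand forecast and a newsvendor-style cost trade-off (illustrative only, not JD's actual models): given a forecast distribution, choose the stocking level that best balances overstock against stockout.

```python
# Toy inventory decision: pick the stock level with the lowest expected
# cost, trading off leftover units against unmet demand.

# hypothetical forecast: probability of each daily demand level
forecast = {8: 0.1, 9: 0.2, 10: 0.4, 11: 0.2, 12: 0.1}
holding_cost = 1.0   # cost per unit left over
stockout_cost = 4.0  # cost per unit of unmet demand

def expected_cost(stock):
    cost = 0.0
    for demand, p in forecast.items():
        if stock >= demand:
            cost += p * holding_cost * (stock - demand)   # overstock
        else:
            cost += p * stockout_cost * (demand - stock)  # stockout
    return cost

best_stock = min(range(8, 13), key=expected_cost)
print(best_stock, round(expected_cost(best_stock), 2))
```

Because stockouts cost four times as much as holding a unit here, the optimum sits above the median forecast; better prediction narrows the forecast distribution and shrinks this safety margin.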
In supply chain execution and operation, the key words are packages and orders, and optimization here is what users feel most directly. Take JD as an example: after a user places an order on November 11, JD Logistics recommends to the picker the most suitable goods based on current warehouse storage, including storage locations and customer requirements, as well as the most suitable picking path.
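The picking-path recommendation can be illustrated with a nearest-neighbor walk over invented storage coordinates (a sketch only; real systems also account for aisle layout, order batching and congestion):

```python
# Toy picking route: visit each item location by repeatedly walking to
# the nearest remaining one, starting from the packing station.

items = {"sku1": (0, 4), "sku2": (3, 1), "sku3": (5, 5), "sku4": (1, 1)}
start = (0, 0)

def dist(a, b):
    # Manhattan distance suits grid-like warehouse aisles
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def picking_route(start, items):
    here, remaining, route = start, dict(items), []
    while remaining:
        sku = min(remaining, key=lambda s: dist(here, remaining[s]))
        here = remaining.pop(sku)
        route.append(sku)
    return route

print(picking_route(start, items))
```

Nearest-neighbor is a greedy heuristic, not optimal, but it already captures why storage locations and the walk between them drive picking efficiency.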
At the same time, Dr. Zhuang Xiaotian noted that the supply chain is an extremely complex scenario: it involves many participants along the chain, many spatial elements such as people, goods and places, and predictions that must be made in a timely way. And as business granularity grows ever finer, complexity keeps rising, and with it the demand for computing power.
"In essence, operations research solves a decision-making problem. Algorithms and techniques must be adapted to each scenario, and this depends heavily on computing power," said Dr. Zhuang Xiaotian.
Finally, Dr. Zhuang Xiaotian believes AI has broad application prospects in logistics and supply chains, but three important challenges remain: first, data, and how to bring together data from a whole industry, even society, so it can play a greater role; second, energy consumption, and how, under the dual-carbon strategy, to design better algorithms and architectures so computing resources are used as efficiently as possible and AI lands in industry in a more refined way; finally, industry insight, since only by going deep into an industry's problems and pain points can technology improve the efficiency of the whole industry.
Privacy computing and data value
In the digital era, how to realize the value of data has become a number one topic.
Letting data connect, flow, be shared and applied is the only way to realize its value and release its productivity. But the data breaches of recent years have made people realize how important data privacy is becoming. In general, more data flow and sharing means more security challenges, while stricter security measures reduce data's mobility and usability; data circulation and data security seem to be an irreconcilable "contradiction".
How can data flow, be shared and applied while security is guaranteed? On the technology side, privacy computing, which has risen rapidly in recent years, carries high hopes.
(Pictured: Insight Technology's digital intelligence federated all-in-one machine)
Insight Technology is a Chinese technology innovation company focused on privacy computing. Fang Dong, general manager of its Project Management and Delivery Center, said the company's mission is to release data value safely, with privacy computing as the core technology. Insight Technology is working with peers to build a data exchange network based on privacy computing, through which data can play a greater role securely and in compliance, mining the gold mine of "data elements".
Privacy computing is a family of technologies, largely built on cryptography, that performs analysis and computation on data while keeping the data itself from being disclosed. Compared with traditional ways of using data, its encryption mechanisms strengthen protection and reduce the risk of leakage.
Fang Dong said privacy computing currently has three main directions: cryptography-based techniques, represented by secure multi-party computation; techniques fusing artificial intelligence with privacy protection, represented by federated learning; and trusted-hardware-based techniques, represented by trusted execution environments.
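The cryptographic direction can be illustrated with the simplest building block of secure multi-party computation, additive secret sharing (a toy sketch, not a production protocol): each party splits its private value into random shares, so no single party ever sees another's raw number, yet the sum can still be reconstructed.

```python
# Toy additive secret sharing: three parties jointly compute the sum of
# their private values without revealing the values themselves.
import random

MOD = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value, n_parties):
    """Split a value into n random shares that sum to it modulo MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

# three banks each hold a private amount
secrets = [120, 250, 75]
all_shares = [share(s, 3) for s in secrets]

# each party sums the shares it received -- each partial sum alone
# looks like random noise and leaks nothing
partial_sums = [sum(col) % MOD for col in zip(*all_shares)]

# only combining the partial sums reveals the total
total = sum(partial_sums) % MOD
print(total)  # 445
```

Real protocols add authentication and support richer operations than addition, but "usable yet invisible" is already visible in this tiny example.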
"Federated learning essentially splits the model into pieces. The bank's part is handled by the bank; the part for the external data source connects to that source; and they are trained together. A popular analogy is that the model moves while the data does not, which keeps data secure. Beyond that, the data is usable but invisible: it can be used externally or internally, but never seen. These are the most basic ways to guarantee data security," Fang Dong said.
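Fang Dong's "the model moves, the data does not" analogy can be sketched with a minimal federated-averaging loop (illustrative only: the data below is invented, and real deployments add encryption and secure aggregation on top):

```python
# Minimal federated averaging: each party runs a gradient step of linear
# regression on its own private data; only the model parameters travel,
# and the server averages them each round.

# each party's private (x, y) pairs, drawn roughly from y = 2x + 1
party_data = [
    [(1.0, 3.1), (2.0, 5.0)],   # e.g. a bank's records
    [(3.0, 6.9), (4.0, 9.2)],   # an external data source
]

def local_step(w, b, data, lr=0.05):
    """One gradient step of mean-squared-error linear regression."""
    n = len(data)
    gw = sum(2 * (w * x + b - y) * x for x, y in data) / n
    gb = sum(2 * (w * x + b - y) for x, y in data) / n
    return w - lr * gw, b - lr * gb

w, b = 0.0, 0.0
for _ in range(500):  # each round: local training, then model averaging
    updates = [local_step(w, b, data) for data in party_data]
    w = sum(u[0] for u in updates) / len(updates)
    b = sum(u[1] for u in updates) / len(updates)

print(round(w, 2), round(b, 2))  # close to the true slope 2 and intercept 1
```

Only `w` and `b` ever cross party boundaries; the raw records stay put, which is exactly the "data is available but invisible" property.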
In addition, compared with other machine learning approaches, federated learning places higher demands on computing power, because it involves data exchange, data encryption, secure multi-party computation and other techniques.
As for the scenarios suited to privacy computing, Fang Dong currently sees three especially promising industries: finance, healthcare and government affairs. "Financial risk control, anti-fraud and similar scenarios are very well suited to privacy computing. In healthcare, connecting and integrating data from different hospitals through privacy computing, without exposing patients' privacy, would greatly benefit the development of medicine. In government affairs, everyone now talks about opening, sharing and trading data; with privacy computing, more valuable data can be opened and shared, instead of having to