New CP for Digital Transformation | Cloud Native + Edge Computing Practice Sharing


Recently, at the 2022 Smart Cloud Edge Open Source Summit (2022 Open Source AceCon), Du Dongming, chief solution expert at Lingque Cloud, was invited to give a talk titled "Cloud Edge Empowerment, Containers Along the Way: Practical Thinking on Cloud Native Edge Computing." He pointed out that as digital transformation continues to accelerate, cloud native has become an inevitable choice for enterprises, and that the deep integration of cloud native and edge computing can bring greater benefits to enterprises and help their business grow faster.
 

Cloud Native Has Become a Booster for Enterprise Digital Transformation


After long-term practice, we found that the essential answer lies in the disruptive effect of enterprise digital transformation. Digital transformation means fusing the enterprise's core value-creating capability with digital technology and digital channels. Traditional banks, for example, realize value by taking deposits, making loans, and providing financial services. In the past, when choosing a bank, we usually looked at how close a branch was; today, the bank's digital channels have become the core tool for realizing value. We also see digital transformation in more traditional industries: industrial enterprises, in order to improve quality and efficiency, cut emissions, and reduce costs, organically combine orders and planning, design and process, production and quality, and warehousing and logistics to form the industrial Internet and industrial big data, which is digital transformation in essence.


Once an enterprise undergoes digital transformation, its IT changes dramatically, gradually shifting from a support and cost department to a profit center, and the digital applications that enterprise IT develops and operates become the core element of realizing value and growing the business.


Against this background, we have seen many changes. First, the number of enterprise applications has increased significantly. IT used to operate and maintain only internal OA, ERP, and financial systems; now digital business has multiplied the number of applications IT must run geometrically. We have seen more than 8,000 services in the channel system of one city commercial bank, and the number of services at one joint-stock bank has reached an astonishing 100,000.


Second, the proportion of self-developed applications is increasing. IT systems bought off the shelf no longer fit today, because digital applications are closely tied to an enterprise's core business; that is where the competitive advantage lies, and it cannot simply be purchased. A Gartner report accordingly predicted that by 2020, 75% of enterprise applications would be built in-house rather than bought.


Third, enterprise applications must become more agile. Unlike earlier applications that served internal, steady-state user needs, digital business has to keep up with rapidly changing market demands. The old cycle of half a year of planning followed by half a year of procurement and implementation no longer works; the business may change at any time. In a recent conversation with a courier company, they frankly admitted that they update more than 3,000 business systems across 30 countries every Tuesday and Thursday.


Finally, the business architecture has changed. Traditional systems of record are turning into systems of engagement, which involve far more interaction and bring continuous growth in business scale. Architectural decoupling is imperative: concerns must be separated, and the traditional monolithic architecture split into a microservice architecture.


These phenomena, which we collectively call agile IT, are very different from traditional steady-state IT. Is the emergence of agile IT a good thing for IT departments? No, it is hugely disruptive. The traditional steady-state operation and maintenance system built on ITIL has become exhausted under this disruptive force. The industry needs technologies that can handle these agile demands, and these technologies are collectively called cloud native technologies.

Accompanying Growth of Containers and Edge Computing


 

Edge computing and containers seem to be two unrelated things; their only similarity is that both were born in 2013. Containers trace their birth to the open-sourcing of Docker and the launch of Docker Hub, while the concept of edge computing came from a report by the Pacific Northwest National Laboratory in the United States. Google Trends shows that over the past ten years, people have maintained sustained, high attention and enthusiasm for both technologies.


We used to think of containers and Kubernetes as data center technology, good at managing hundreds of nodes and scheduling complex microservice workloads, while edge computing usually lives in resource-scarce environments that seem outside K8s's comfort zone. In fact, this is a misunderstanding of K8s. Its excellent architecture makes it extremely flexible: it can manage core business in the data center and also prove itself in edge scenarios such as single-node and three-node deployments. The application support issues that must be addressed in a small-scale environment are no fewer than those in a large-scale one.


The convergence of edge computing and container technologies has been a gradual process, but convergence at scale occurred in 2021, which matches our market perception exactly: that is when we began to see large numbers of requirements for building container platforms on the edge side.


 

Gartner's reports express similar views. In the "Edge Computing Maturity Curve" report released by Gartner in 2020, container-related technical elements were not yet mentioned, but in the 2021 edition of the same report, the concept of "Edge PaaS" appeared for the first time to describe the application of container technology on the edge side. Moreover, as soon as the concept appeared, it was already at the peak of the curve's first stage, which shows that Edge PaaS had been quietly happening all along and only became clearly visible in 2021. Gartner pointed out that by running containers on the limited computing resources at the edge, new edge-native, highly distributed, and edge-aware applications become possible.

Best CP: Cloud Native and Edge Computing


Let's take a closer look at how containers fit into edge scenarios.


First, combining cloud native with the edge helps improve resource utilization. The salient feature of the edge side is that total resources are limited: usually one or three servers, and some endpoint control devices are nothing more than a small ARM box. These resources are already stretched just running edge applications and can hardly carry extra consumption at the platform layer. A container management platform differs from traditional virtual machine and hyper-converged architectures in that its platform-layer overhead is very low; under normal circumstances, 1 CPU core and 2 GB of memory can run a K8s setup, and switching to slimmed-down distributions such as MicroK8s or K3s reduces the footprint even further. Containers therefore add no extra resource pressure on the edge side and can instead optimize resource allocation.
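To make the footprint point concrete, here is a minimal sketch (not Lingque Cloud product code) of deploying a small edge workload with explicit CPU and memory requests through the official `kubernetes` Python client; the deployment name, image, and resource sizes are illustrative assumptions.

```python
# Hypothetical sketch: a lightweight edge workload with small resource requests,
# created through the official `kubernetes` Python client against a K3s/K8s cluster.
from kubernetes import client, config

def deploy_edge_workload():
    config.load_kube_config()  # or config.load_incluster_config() when run in-cluster

    container = client.V1Container(
        name="edge-gateway",                     # illustrative name
        image="nginx:1.25",                      # placeholder image
        resources=client.V1ResourceRequirements(
            requests={"cpu": "100m", "memory": "128Mi"},  # tiny footprint for edge nodes
            limits={"cpu": "250m", "memory": "256Mi"},
        ),
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="edge-gateway"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "edge-gateway"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "edge-gateway"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)

if __name__ == "__main__":
    deploy_edge_workload()
```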


Second, cloud native helps the edge side achieve unified management of infrastructure. Although small in scale, the edge still needs a platform to manage infrastructure and provide network, computing, and storage services, such as software-defined storage and software-defined networking. The container community already offers a large number of open source solutions for this, and commercially we have already delivered many full-stack cloud platforms that manage physical infrastructure.
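As one small illustration of this "unified view" idea, the sketch below queries an edge cluster for its compute inventory through the same Kubernetes API used for workloads. It assumes the `kubernetes` Python client and a reachable kubeconfig, and is not tied to any particular platform.

```python
# Hedged sketch: list each edge node's allocatable CPU, memory, and pod capacity
# through the Kubernetes API, giving a single view of the site's infrastructure.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

for node in core.list_node().items:
    alloc = node.status.allocatable or {}
    print(
        f"{node.metadata.name}: "
        f"cpu={alloc.get('cpu')}, memory={alloc.get('memory')}, pods={alloc.get('pods')}"
    )
```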


Third, cloud native can effectively reduce the IT cost of AI services at the enterprise edge. Edge scenarios and artificial intelligence are inseparable: customers usually need to run AI inference on the edge side, and sometimes even model training, which makes GPUs essential. On a container platform, GPU virtualization can split one physical GPU into multiple virtual GPUs assigned to multiple containers, with the application in each container appearing to have a GPU to itself. This greatly improves the flexibility of edge-side AI services while helping customers reduce costs.
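The snippet below sketches the consumer side of that idea: a pod requesting a GPU for an inference container via the Kubernetes Python client. The extended resource name depends on whichever device plugin or GPU virtualization layer is installed; "nvidia.com/gpu" is the common whole-GPU name, and vGPU products advertise their own names, so treat it as an assumption, as are the pod and image names.

```python
# Hedged sketch: an inference pod requesting one GPU through an extended resource.
# A vGPU device plugin would expose a different resource name and allow sharing.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="edge-inference"),             # illustrative name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="registry.example.com/edge-inference:latest",  # hypothetical image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"},   # resource name set by the device plugin
                ),
            )
        ],
    ),
)
core.create_namespaced_pod(namespace="default", body=pod)
```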


Fourth, container-centered cloud-native technology makes it easier to access device ports on the edge side. Edge computing usually connects to devices through serial ports, network ports, VGA, and so on, and Kubernetes Device Plugin technology allows containers to access these ports easily.
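As a rough illustration of two common patterns for device access, the sketch below requests an extended resource advertised by a device plugin (the resource name "example.com/serial" is purely an assumption and depends on the plugin actually deployed) and also mounts a serial device node directly via hostPath; image and pod names are hypothetical.

```python
# Hypothetical sketch: giving an edge container access to a serial device,
# either through a device-plugin resource or a direct hostPath device mount.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

container = client.V1Container(
    name="plc-collector",                                  # illustrative name
    image="registry.example.com/plc-collector:latest",     # hypothetical image
    resources=client.V1ResourceRequirements(
        limits={"example.com/serial": "1"},                # advertised by a device plugin (assumed name)
    ),
    volume_mounts=[client.V1VolumeMount(name="ttyusb0", mount_path="/dev/ttyUSB0")],
    security_context=client.V1SecurityContext(privileged=True),  # needed for raw device access here
)
pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="plc-collector"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[container],
        volumes=[
            client.V1Volume(
                name="ttyusb0",
                host_path=client.V1HostPathVolumeSource(path="/dev/ttyUSB0", type="CharDevice"),
            )
        ],
    ),
)
core.create_namespaced_pod(namespace="default", body=pod)
```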


Fifth, for development and operations staff, cloud-native technology makes rapid business deployment and iteration much easier, which is an extremely critical point. In edge environments, business sites are scattered across a wide area, which creates huge difficulties for deployment and subsequent operation and maintenance. Containers handle these scenarios with ease: Lingque Cloud's multi-cluster management technology lets us deploy to and upgrade hundreds or thousands of edge sites without leaving the office. For edge-side services, operations staff can set monitoring and alerting policies to learn about on-site incidents quickly, and when problems arise, developers can check application logs through the management platform or even log in to the container to debug.
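The sketch below illustrates the kind of day-2 operation described above: pulling recent logs for one application from every edge cluster. It uses plain kubeconfig contexts as a stand-in for a multi-cluster management platform (it is not Lingque Cloud's API), and the label selector and namespace are assumptions.

```python
# Hedged sketch: fetch the last 50 log lines of an application from each edge
# cluster, iterating over kubeconfig contexts (one context per cluster).
from kubernetes import client, config

APP_LABEL = "app=edge-gateway"   # hypothetical label selector
NAMESPACE = "default"

contexts, _active = config.list_kube_config_contexts()
for ctx in contexts:
    core = client.CoreV1Api(config.new_client_from_config(context=ctx["name"]))
    for pod in core.list_namespaced_pod(NAMESPACE, label_selector=APP_LABEL).items:
        logs = core.read_namespaced_pod_log(
            name=pod.metadata.name, namespace=NAMESPACE, tail_lines=50
        )
        print(f"--- {ctx['name']}/{pod.metadata.name} ---\n{logs}")
```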


Finally, cloud-edge collaboration can greatly improve business autonomy and availability. The network between the edge and the cloud is weak: bandwidth, latency, and even connectivity cannot be guaranteed, so edge autonomy is required. A container management platform running on the edge side is highly autonomous; its workload scheduling and high-availability guarantees are completely independent of the cloud environment, so even when the cloud-edge network is disconnected, the edge side keeps running normally.
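To make the autonomy point concrete, here is a small watchdog sketch that talks only to the edge cluster's local API server, so node health checks and local alerting keep working even if the link to the cloud is down. The kubeconfig location and the 30-second interval are assumptions, not part of any specific product.

```python
# Hedged sketch: a local watchdog on the edge side that checks node readiness
# through the edge cluster's own API server, independent of cloud connectivity.
import time
from kubernetes import client, config

config.load_kube_config()   # local kubeconfig on the edge node
core = client.CoreV1Api()

while True:
    not_ready = []
    for node in core.list_node().items:
        ready = next((c.status for c in node.status.conditions if c.type == "Ready"), "Unknown")
        if ready != "True":
            not_ready.append(node.metadata.name)
    if not_ready:
        # Local alerting or remediation can run here without any cloud link.
        print(f"edge nodes not ready: {not_ready}")
    time.sleep(30)
```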


Some customers may worry: is containerizing the business very complicated? Can we handle it ourselves? There is actually little to worry about. In edge scenarios, the number of applications deployed on the edge side is very limited, generally only four or five, and containerizing them is not difficult; in Lingque's experience, one application can usually be containerized in half a day to a week. Once the transformation is complete, the application can be deployed to hundreds or thousands of sites with one click, which also brings great flexibility for later operation and maintenance. Container transformation of edge services is therefore a very high-ROI investment, and the deep integration of the two can bring greater benefits to enterprises.
In the near future, Lingqueyun will officially release cloud-native edge computing solutions, so stay tuned!

Start your new cloud-edge collaboration experience now

If you have more questions about cloud-native edge computing, contact us to plan and discuss with Lingqueyun's senior engineers.

Related reading: The Next Stop of Smart Manufacturing: Cloud Native + Edge Computing Two-Wheel Drive

