Lecturer | Luo Hao (Senior Cloud Native Architect, Alibaba Cloud)
The development trajectory of serverless
The term Serverless first appeared in 2012, coined by the company Iron.io; literally, it means that no servers are required. It only became widely known in 2014, when AWS launched Lambda. Lambda opened a new era of cloud computing, and the major vendors followed suit: Microsoft, Google, and IBM all launched their own serverless products.
In China, Alibaba Cloud and Tencent Cloud launched their own serverless platforms in 2017, although at that time the term mostly referred to FaaS (Function as a Service). In 2018, more people began to come into contact with Serverless, largely through the cloud development platforms for Alipay and WeChat mini-programs. In 2019, other domestic vendors such as Baidu, Huawei, and ByteDance also began building serverless offerings, and today serverless has become a standard offering of the major cloud vendors.
Serverless is cloud computing 2.0
Why does every vendor want to do Serverless? Because Serverless is widely seen as cloud computing 2.0. As cloud computing has matured, serverless has become a technology trend, a concept, and the direction in which the cloud is developing.
There are two very famous papers in the field of cloud computing, published by UC Berkeley in 2009 and 2019. The 2009 paper predicted how cloud computing would develop: computing resources available on demand, support for elasticity, simplified operation and maintenance, and so on. These predictions have since been realized.
The second paper, published in February 2019, predicted that Serverless would be the direction of cloud computing for the next decade. It also gives a definition of Serverless: Serverless Computing consists of FaaS plus BaaS (Backend as a Service), and together they form a serverless software architecture. Its defining features are on-demand elasticity and pay-per-use, which is close to the CNCF definition: applications are decomposed and deployed to the cloud as microservices or functions, and users do not need to care about the underlying resources.
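To make the FaaS + BaaS split concrete, here is a minimal sketch in Python of a function as it might be written for an FaaS platform. The handler(event, context) signature follows the common convention of platforms such as AWS Lambda and Alibaba Cloud Function Compute, but details vary by vendor; save_to_object_storage is a hypothetical stand-in for a BaaS call, not a real SDK.

```python
import json


def save_to_object_storage(key: str, data: bytes) -> None:
    """Hypothetical stand-in for a BaaS call (e.g. an object-storage SDK).

    A real deployment would use the cloud vendor's SDK; here we write to
    the local /tmp directory so the sketch stays runnable.
    """
    with open(f"/tmp/{key}", "wb") as f:
        f.write(data)


def handler(event, context=None):
    """FaaS entry point: the platform invokes it once per request/event.

    The function keeps no server state; elasticity and billing are handled
    by the platform on a per-invocation basis.
    """
    order = json.loads(event) if isinstance(event, str) else event
    receipt = {"order_id": order.get("order_id"), "status": "accepted"}

    # Persist the result through a BaaS service instead of a self-managed server.
    save_to_object_storage(f"receipt-{receipt['order_id']}.json",
                           json.dumps(receipt).encode("utf-8"))
    return receipt


if __name__ == "__main__":
    # Local smoke test; on the platform the event would come from a trigger.
    print(handler({"order_id": "42"}))
```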
Serverless is an advanced stage of cloud native development
What does serverless have to do with cloud native? The emergence of serverless, like a step in human evolution, represents a liberation of productivity and greatly improves the efficiency with which customers use the cloud. Serverless adds another layer of encapsulation on top of container technology and is an advanced stage of cloud native.
Serverless emphasizes "No Server" from the user's point of view. The essence is not that servers are no longer needed, but that they are fully hosted by the cloud vendor, so users neither care about nor manage them. Users only need to deploy their business on the platform and focus on the business logic code; the platform scales elastically according to actual requests, and users no longer need to worry about whether resources are sufficient.
Core Values of Serverless
Going from physical machines to serverless is like how we travel by car. If we buy a private car, we have to take care of its condition and insurance, and we have to drive it ourselves. Moving to virtual machines and hosting the business on the cloud is like renting a car. With serverless, we no longer buy a car or care about its condition at all: we simply take a taxi from point A to point B, completely on demand and elastic.
In the abstract, there are three core values:
- The first is elasticity. In the e-commerce scenario just mentioned, for example, traffic is heavy and highly variable, and Serverless can bring up resources in time.
- The second is pay-as-you-go: we pay only for the resources we actually use and do not pay for idle resources.
- The third is simplified operation and maintenance, which saves users the trouble of managing resources.
We can look at the value of serverless more intuitively as a stack. The top layer is the business logic; below it sit the integrations with databases, storage, the microservice framework, and so on; below that we have to build a monitoring system, a logging system, disaster recovery, and high availability; and at the bottom layer we also have to maintain various IaaS resources such as virtual machine clusters. Serverless takes the resource layer and the observability layer off the user's hands: the platform is responsible for the underlying elastic resources, including all logging and monitoring, and users only need to care about business logic.
Serverless software architecture
As developers, we can deploy an image or a code package directly to the serverless computing platform, skipping the whole process of purchasing resources and setting up environments. After deployment, the backend can interact with storage and databases to form a complete serverless architecture. The business code can then be reached directly through a load balancer or HTTP, and the platform schedules and elastically scales according to user requests. The serverless platform supports load balancing to absorb all kinds of burst traffic, so users do not need to care about the backend resources.
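As a rough illustration of this access path, the snippet below calls a deployed service over HTTP. The endpoint URL is a made-up placeholder; which instance actually serves the request is decided entirely by the platform's load balancing and scaling.

```python
import json
import urllib.request

# Hypothetical endpoint assigned by the serverless platform after deployment.
ENDPOINT = "https://example-service.example.com/orders"

payload = json.dumps({"order_id": "42"}).encode("utf-8")
req = urllib.request.Request(
    ENDPOINT,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The platform's load balancer routes the request to whichever instance
# it has scaled out; the caller never sees the underlying resources.
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.status, resp.read().decode("utf-8"))
```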
The component architecture is a bit complicated, so I won't expand on it this time. What developers need to pay attention to is the green part of the diagram: the business code, the service framework, the tools being used, and the backend BaaS. The serverless platform takes care of all the infrastructure and handles message caching, traffic scheduling, disaster recovery, and high availability.
Another very important component is the Serverless Application Engine, which is essentially an encapsulation of Kubernetes. If an enterprise runs microservice workloads that need to be deployed to a Kubernetes cluster, and maintaining that cluster is a significant burden, this form can be used: by hosting the developed microservices or monolithic applications directly on the platform, you get the elastic scaling and pay-as-you-go value that Serverless brings.
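To give a sense of what "hosting an existing microservice" means in practice, here is a minimal HTTP service built with only the Python standard library. It is a generic sketch of the kind of monolith or microservice you might package into an image and hand over to such a platform; the port and routes are arbitrary choices, not requirements of any particular product.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer


class HealthAndHelloHandler(BaseHTTPRequestHandler):
    """A tiny web app: one health-check route and one business route."""

    def do_GET(self):
        if self.path == "/healthz":
            body = b"ok"  # the platform's health checks could probe this
        else:
            body = b"hello from a hosted microservice"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # The platform decides how many replicas of this process to run and
    # scales them with traffic; the application code itself stays unchanged.
    HTTPServer(("0.0.0.0", 8080), HealthAndHelloHandler).serve_forever()
```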
Implementation of Serverless
Serverless has many landing scenarios. Across industries, both backend services and REST APIs can be deployed on a serverless platform, and it is especially well suited to audio and video processing, lightweight ETL (low-threshold data analysis and processing), event-driven workloads, batch tasks, application hosting, and microservice containerization, among other scenarios.
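As one hedged example of the event-driven, lightweight ETL scenario, the sketch below shows a function that could be wired to an object-storage upload event: it reads CSV rows carried in the event, filters and reshapes them, and returns the result. The event shape and field names are invented for illustration; a real trigger would deliver a vendor-specific event object.

```python
import csv
import io


def etl_handler(event, context=None):
    """Lightweight ETL: triggered per uploaded file, no long-running cluster.

    `event["csv_body"]` is an invented field standing in for the file
    content that a real object-storage trigger would let us fetch.
    """
    rows = csv.DictReader(io.StringIO(event["csv_body"]))

    # Transform: keep completed orders and normalize the amount to cents.
    cleaned = [
        {"order_id": r["order_id"], "amount_cents": int(float(r["amount"]) * 100)}
        for r in rows
        if r.get("status") == "completed"
    ]
    # Load: a real function would write this to a database or data
    # warehouse (a BaaS); here we simply return it.
    return cleaned


if __name__ == "__main__":
    sample = "order_id,amount,status\n1,19.99,completed\n2,5.00,cancelled\n"
    print(etl_handler({"csv_body": sample}))
```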
There are many application cases on the serverless platform. For example, if you want to carry out a microservice or containerization transformation, reduce operation and maintenance complexity, and at the same time gain elastic scaling and convenient releases, you can deploy the service directly onto the Serverless Application Engine.
To share another case: many people watched the European Championship. In China, iQIYI Sports broadcast the event live, and the business behind it is deployed on the Serverless Application Engine platform.
For the iQIYI Sports team, the biggest pain point is resource elasticity. Live-streaming traffic for sports events is highly uncertain, so when traffic surges, the backend services must be scaled out in time; yet reserving resources for the peak carries the risk of misjudging the traffic and, to some extent, wastes resources.
The Serverless Application Engine therefore matches the customer's pain points very well: it solves the elastic scaling problem and improves resource utilization, while the built-in application monitoring greatly speeds up problem diagnosis.
Recommended reading: iQIYI Sports: Experiencing serverless extreme scale-out and scale-in, with resource utilization increased by 40%
The future of serverless
- Replacing Serverful at scale and becoming the default computing paradigm: Serverful will not disappear completely, but as the shortcomings of Serverless are overcome one by one, Serverless will gradually take a larger share of cloud computing and become the default computing paradigm of the cloud era.
- Embracing the entire container ecosystem: Serverless will embrace the container ecosystem more fully. Containers are currently a mainstream trend across the industry, and Serverless will integrate with them more deeply, for example through image deployment, image acceleration, and integration with Kubernetes capabilities.
- Accelerating the change in the operations relationship: Serverless will accelerate the transformation of operations work, and operations engineers will gradually shift from maintaining resources to maintaining the business.
- Stronger complex task orchestration, toolchain, and observability: Serverless will enhance its capabilities for complex task orchestration, toolchains, and observability. Because the serverless platform heavily encapsulates the underlying resources, it needs to expose many monitoring metrics to users so that those metrics can be used for business-level management and control.
We hope that serverless can really reduce the burden on everyone, make business development and maintenance easier, and bring greater value to the business.
For more content, follow the Serverless WeChat official account (ID: serverlessdevs), which brings together the most comprehensive serverless technical content and regularly hosts serverless events, live streams, and user best practices.