
This article is based on the transcript and video replay of "The Latest Practice of Serverless Application DevOps", a talk given by Sun Hua at the 2021 China DevOps Community Summit in Dalian.


The following is a transcript of the talk:

Hello everyone, my name is Sun Hua, and I am a serverless product expert at Amazon Cloud Technology. I'm glad to have the opportunity to share the latest practices of DevOps for serverless applications with you today. In the next 45 minutes, I would like to cover what serverless applications are, how to develop and deploy them, how to do CI/CD and monitoring, how to achieve safe deployments, and so on.

What is serverless?

At Amazon Cloud Technology, we believe serverless computing lets you run your code without having to manage servers. You provide the code and define the events that trigger it. For example, when a user visits your website, Amazon automatically provides the computing resources, runs your code, and returns your website's content to the user.

For Amazon, serverless computing means you don't need to deploy, install, or upgrade servers, and it automatically scales your application for you. For example, when the number of users of your mobile app grows rapidly, serverless automatically adds computing resources so that customers still get fast responses. Serverless applications are billed for the time your code actually runs, so resources are not wasted.

For common applications, the load always has peaks and valleys. For example, a game may see a tenfold difference in user traffic between day and night. If you provision servers for the peak, a large number of them will sit idle. Serverless is billed according to the actual running time of your code, so you can save a lot of cost. Serverless services also have high availability built in. The computing resources behind Amazon's serverless services are distributed across multiple Availability Zones; if one Availability Zone has a problem, the service keeps running in the other Availability Zones, which have enough redundant capacity to carry the additional load. Building such a highly available, redundant compute cluster yourself would be quite expensive, but as a user of a serverless service you are still billed, at millisecond granularity, only for the time your code actually runs.


Amazon's serverless computing service, Amazon Lambda, is seven years old this year, and a large number of users run it in all kinds of scenarios. The first is IT automation. Every service on Amazon provides APIs, and you can easily automate operations and maintenance through them. For example, suppose you have set up development and test environments on the cloud that sit idle after everyone goes home. We can define a scheduled rule that runs an Amazon Lambda function after work to shut down the servers in these environments, and another that starts them again before work in the morning. This saves cost while making sure everyone has enough resources during the day. In addition, Amazon Lambda includes a free tier of 1 million invocations per month, so a lot of this IT automation is essentially free to run.
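As a rough sketch of what such an automation function might look like (the tag name, region, and the assumption that a scheduled EventBridge rule triggers it are illustrative, not from the talk):

```javascript
// index.js - illustrative sketch: stop tagged dev/test EC2 instances after work hours.
// Assumes the function's IAM role allows ec2:DescribeInstances and ec2:StopInstances,
// and that a scheduled EventBridge (CloudWatch Events) rule invokes it each evening.
const AWS = require('aws-sdk'); // bundled in the Node.js Lambda runtime
const ec2 = new AWS.EC2();

exports.handler = async () => {
  // Find running instances tagged Environment=dev (illustrative tag).
  const { Reservations } = await ec2.describeInstances({
    Filters: [
      { Name: 'tag:Environment', Values: ['dev'] },
      { Name: 'instance-state-name', Values: ['running'] },
    ],
  }).promise();

  const instanceIds = Reservations
    .flatMap((r) => r.Instances)
    .map((i) => i.InstanceId);

  if (instanceIds.length > 0) {
    await ec2.stopInstances({ InstanceIds: instanceIds }).promise();
  }
  return { stopped: instanceIds };
};
```

A mirror-image function started by a morning schedule would call startInstances on the same filter.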

The next scenario is data processing. When a user uploads an image to Amazon S3 object storage and we want to generate a thumbnail, or recognize which objects are in the image, Amazon S3 can automatically invoke Amazon Lambda for processing as soon as the image is uploaded. Similarly, in Amazon IoT scenarios, the data collected by large fleets of devices can be written into a Kinesis data stream and decoded and transformed by Amazon Lambda, so users can quickly build real-time monitoring dashboards and alerting systems.
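A minimal sketch of such an S3-triggered handler might look like this (the actual thumbnail or recognition logic is elided; only reading the bucket and key from the event is shown):

```javascript
// Illustrative sketch: handler invoked by an Amazon S3 "object created" event.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

    // Download the uploaded image; thumbnail generation or image recognition
    // would happen here, with the result written to another bucket or prefix.
    const image = await s3.getObject({ Bucket: bucket, Key: key }).promise();
    console.log(`Received ${key} from ${bucket}, ${image.ContentLength} bytes`);
  }
};
```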

The third scenario is that once users are familiar with Amazon Lambda, they start using it to build the microservices that support their business, such as web applications or backend services for mobile applications.

The fourth scenario is machine learning, especially inference. Machine learning scientists and engineers typically don't want to manage servers, and serverless helps them quickly turn trained models into production-ready APIs.


To summarize the features mentioned above: first, serverless computing is convenient and fast. There is no need to deploy servers, install an operating system, or install the dependencies your application needs; you simply submit your code and configure the event source. Second is performance and cost. One of our customers provides an electronic invoice service; electronic invoices are PDF files that need to be converted into thumbnails for display to customers. This customer used to rely on an external service to generate the PDF thumbnails, which cost 70,000 to 80,000 yuan per month. They switched to triggering an Amazon Lambda function whenever a PDF file is uploaded to Amazon S3, and found that a conversion that used to take 10 seconds now takes only 2 seconds, while the monthly cost dropped from about 80,000 yuan to less than 100 yuan. This shows how Amazon Lambda can help customers reduce costs. The third point is security. With Amazon Lambda, the security of the server's operating system and runtime is managed by Amazon, and you are only responsible for the security of your own application and data.


Because of these features, over the past 7 years a large number of users have adopted the service, and their applications have been invoked trillions of times.

How to develop and deploy serverless applications?


Let's take a look at how serverless applications are developed. It is actually very simple and comes down to a few things: an event source, and a handler function that the event triggers; inside the function code you can then access any service you need, whether a backend or another API.


Development is also simple: write the code and deploy it to Amazon Lambda; when an event occurs, Amazon Lambda automatically triggers your code.


Here is a very simple example: the event passed to the Amazon Lambda handler contains the data from the event source, the function body runs its own logic, and finally returns a result.
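A minimal Node.js handler along those lines might look like this (the field names in the event are an illustrative assumption):

```javascript
// Minimal sketch of an Amazon Lambda handler: the event carries the data
// from the event source; the function runs its logic and returns a result.
exports.handler = async (event) => {
  const name = event.name || 'world';   // illustrative field from the event source
  const message = `Hello, ${name}!`;    // the function's own logic
  return { statusCode: 200, body: message };
};
```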


If you want to develop web applications, you can use API Gateway to expose an HTTP endpoint at the front, and inside your Amazon Lambda function you can directly use common web frameworks.


Here is an example of developing a serverless web application with Express.js: the Express.js application is wrapped with a serverless adapter so it can handle Lambda events. In this way you can keep using your favorite web framework to build serverless web applications.
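One common way to do this wrapping in Node.js is the serverless-http package (one of several wrappers; shown here only as an illustrative sketch):

```javascript
// Sketch: wrapping an Express.js app so it can run as an Amazon Lambda function
// behind API Gateway, using the serverless-http wrapper package.
const serverless = require('serverless-http');
const express = require('express');

const app = express();

app.get('/', (req, res) => {
  res.json({ message: 'Hello from Express on Lambda' });
});

// API Gateway events are translated into normal HTTP requests for Express.
module.exports.handler = serverless(app);
```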


How do we package web applications for Amazon Lambda? At re:Invent 2020, Amazon Lambda announced support for container images, so function code can be packaged and deployed as a container image. In the past, if you had too many dependent libraries, they might not fit within the 250 MB limit of an Amazon Lambda deployment package. Now you can use the same, familiar tools to build both containers and Amazon Lambda applications, with container images up to 10 GB, sub-second automatic scaling, built-in high availability, and integration with more than 200 event sources.


Here is an example of a Dockerfile for an Amazon Lambda function: it uses the Node.js 12 base image from the Amazon ECR public repository, copies the code and dependency manifest into the image, runs npm to install the dependencies, and at startup uses my handler function. Of course, you can also point it at your own handler file; that is no problem.
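A sketch of such a Dockerfile (the file names are illustrative):

```dockerfile
# Sketch of a Dockerfile for an Amazon Lambda function,
# based on the public Node.js 12 Lambda base image from Amazon ECR.
FROM public.ecr.aws/lambda/nodejs:12

# Copy the function code and dependency manifest into the Lambda task directory.
COPY app.js package.json ${LAMBDA_TASK_ROOT}/

# Install production dependencies inside the image.
RUN npm install --production

# At startup, invoke the "handler" export of app.js.
CMD [ "app.handler" ]
```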


The development experience stays conventional: you develop the way you always do and deploy the Amazon Lambda function as a container image. You push the image to the Amazon ECR image repository with docker push and then create the function; while the function is in the Pending state, Amazon Lambda pulls the image from Amazon ECR and optimizes it, and once the status becomes Active the function is ready to be invoked.
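As a rough sketch of that flow (the account ID, region, and names below are placeholders, not values from the talk):

```bash
# Build the image locally, push it to Amazon ECR, then create the function from it.
docker build -t my-function .

aws ecr create-repository --repository-name my-function
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

docker tag my-function:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-function:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-function:latest

# The new function stays Pending while Lambda pulls and optimizes the image.
aws lambda create-function \
  --function-name my-function \
  --package-type Image \
  --code ImageUri=123456789012.dkr.ecr.us-east-1.amazonaws.com/my-function:latest \
  --role arn:aws:iam::123456789012:role/my-function-role
```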


Amazon Lambda Adapter is our latest open source project. Its role is to run ordinary web applications on Amazon Lambda: with this tool you do not need to modify your code or introduce any third-party dependencies before packaging for serverless. It is a universal adapter between the Amazon Lambda event API and plain HTTP, which lets you rapidly build serverless web applications with any web framework and any programming language, and debug locally with mature development tools. The adapter itself is written in Rust, so it is safe and efficient. You just add it in your Dockerfile.


The first step is to use the Amazon Lambda Adapter in the Dockerfile: 1. copy the Amazon Lambda Adapter into the image; 2. use it as the container's entrypoint; 3. use CMD to run your web application (which by default listens on port 8080).
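Following those three steps, the Dockerfile additions might look roughly like this; the adapter image tag and the binary paths are assumptions for illustration only, so check the project's README for the exact current form:

```dockerfile
# Illustrative sketch only: run an ordinary Node.js web app behind the Amazon Lambda Adapter.
FROM public.ecr.aws/lambda/nodejs:12

# 1. Copy the Lambda Adapter binary from its public image (tag and path are assumptions).
COPY --from=public.ecr.aws/awsguru/aws-lambda-adapter:0.3.0 /opt/bootstrap /opt/bootstrap

# 2. Use the adapter as the container's entrypoint.
ENTRYPOINT [ "/opt/bootstrap" ]

# 3. Start the web application itself; by default the adapter forwards requests to port 8080.
COPY app.js package.json ${LAMBDA_TASK_ROOT}/
RUN npm install --production
CMD [ "node", "app.js" ]
```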

How to build a CI/CD pipeline for serverless applications?


Now let's look at actually deploying applications. Because serverless billing only kicks in when your code is actually invoked, you can afford to deploy many copies of an application: one per environment, and even one per developer. With a traditional architecture it is hard to imagine giving every developer an environment identical to production; on serverless you can deploy the same environment for everyone, and nothing stops you.


One practice we recommend on Amazon is to isolate different environments with multiple accounts. Why would you want to do that? Because some quotas are shared across the entire account, such as the default Amazon Lambda concurrency limit of 1,000 (which can of course be raised). If production and testing share the same account and you run a stress test, it may consume the whole concurrency quota and affect production. If the environments are split into different accounts, this cannot happen.


One more question: with so many environments, how do we handle deployment? On the cloud we strongly recommend the Serverless Application Model (SAM), an Amazon CloudFormation extension optimized for serverless. It adds new serverless resource types such as functions, APIs, and simple tables, supports everything Amazon CloudFormation supports, and is an open source project (Apache 2.0).


So what you describe in a few lines of template is turned into the actual resources, the functions themselves plus related things such as security controls.
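As a rough sketch (resource names and the code path are illustrative), a SAM template for a function behind an API can be just a handful of lines:

```yaml
# Illustrative AWS SAM template: a few lines expand via CloudFormation into
# the Lambda function, its execution role, and the API Gateway resources.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  HelloFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: app.handler
      Runtime: nodejs12.x
      Events:
        Api:
          Type: Api
          Properties:
            Path: /hello
            Method: get
```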


We just mentioned having one account per environment, and we can deploy in exactly that way. Each environment has its own permissions, so we can quickly verify and test the application in one environment before promoting it to the next, and finally release it. CI triggers SAM, the corresponding pipeline targets the corresponding environment, and the actual deployment is carried out by Amazon CloudFormation.
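In the CI job, the deployment step can be quite short; a sketch, where the environment names are illustrative and assumed to be defined in samconfig.toml:

```bash
# Build once, then deploy the same artifact to each environment/account.
sam build
sam deploy --config-env dev    # deploys via CloudFormation into the dev account
sam deploy --config-env prod   # promoted later by the pipeline into production
```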


Another new capability comes from SAM itself. Many customers find that creating a secure, cross-account CI/CD pipeline is not an easy job and takes a lot of work. Amazon SAM Pipelines provides pipeline templates for common CI/CD systems, reflecting Amazon's own experience, including Jenkins, GitLab CI/CD, and GitHub Actions. It can also bootstrap the two target environments and set up Amazon's least-privilege security rules for them: the pipeline user's Access Key and Secret Access Key only allow it to assume another role and nothing else, so even if the Access Key is leaked, whoever holds it can't do anything with it directly. With this fine-grained permission management, your team can quickly build a safe and efficient CI/CD pipeline.
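Getting started looks roughly like this (an interactive walkthrough; the details generated depend on the CI/CD system you choose):

```bash
# Bootstrap the pipeline IAM users/roles in each target account,
# then generate a pipeline configuration for your CI/CD system
# (Jenkins, GitLab CI/CD, GitHub Actions, ...).
sam pipeline bootstrap
sam pipeline init
```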

How to implement secure deployment?


I've talked about how to deploy; what about deploying securely? Here, the API points to an Amazon Lambda alias rather than to a specific version. When we publish a new release, say v12, the front end keeps accessing the same Amazon Lambda alias.


At the beginning of the deployment, we switch 10% of the traffic to the new version.


During this process we can wait, say 3 minutes, and monitor some metrics for alarms. If during this waiting window we find the program reporting errors above a certain percentage, or the latency seen by the front end rising, the traffic can be automatically switched back. If there is no problem, we continue shifting the rest of the traffic over.
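With SAM, this kind of gradual traffic shifting can be declared directly on the function. A sketch (the alarm name is illustrative, and the built-in presets use 5-minute rather than 3-minute waits):

```yaml
# Illustrative fragment for the Resources section of a SAM template:
# publish a new version on each deploy, send 10% of traffic to it first,
# and roll back automatically if the referenced alarm fires.
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler
      Runtime: nodejs12.x
      AutoPublishAlias: live
      DeploymentPreference:
        Type: Canary10Percent5Minutes
        Alarms:
          - !Ref MyFunctionErrorsAlarm
```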

How to monitor serverless applications?


Let's take a look at monitoring at a glance. It involves metrics, and the related service here is called Amazon Lambda Insights; below that there are also logging and distributed tracing capabilities.


Many enterprise customers also have operations tools they already use, and they want to monitor all environments with those tools, which used to be difficult. We launched Amazon Lambda Extensions for this. An extension runs alongside your Amazon Lambda function in the same execution environment; it can modify the startup parameters of the runtime process and set language-specific environment variables and wrapper scripts. The extension receives your monitoring metrics and ships them to the monitoring tools you are used to, and many partners now provide such extensions.
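As a small illustration of the wrapper-script mechanism (the exported variable name is made up for the example): a script shipped in the image or a layer and referenced by the AWS_LAMBDA_EXEC_WRAPPER environment variable runs before the language runtime starts and can adjust its startup:

```bash
#!/bin/bash
# Illustrative wrapper script: when AWS_LAMBDA_EXEC_WRAPPER points at this file,
# Lambda runs it instead of launching the runtime directly. It can export
# environment variables or add flags for a monitoring agent, then exec the
# original runtime command it received as arguments.
export MY_MONITORING_ENDPOINT="http://localhost:4318"
exec "$@"
```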


Extensions can do more than monitoring. For example, we can access AppConfig through an extension and turn on a switch as part of a deployment: application configuration changes are deployed at runtime, with controlled rollout, validation, and rollback.
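With the AWS AppConfig Lambda extension added to the function, the code can read a feature flag from the extension's local HTTP endpoint. A sketch, where the application, environment, configuration, and flag names are illustrative and the configuration profile is assumed to be JSON:

```javascript
// Sketch: read a feature flag from the AWS AppConfig Lambda extension,
// which caches configuration and serves it on a local HTTP endpoint.
const http = require('http');

function getConfig(path) {
  return new Promise((resolve, reject) => {
    http.get({ host: 'localhost', port: 2772, path }, (res) => {
      let body = '';
      res.on('data', (chunk) => (body += chunk));
      res.on('end', () => resolve(JSON.parse(body)));
    }).on('error', reject);
  });
}

exports.handler = async () => {
  // Illustrative application/environment/configuration names.
  const config = await getConfig(
    '/applications/my-app/environments/prod/configurations/feature-flags'
  );
  if (config.enableNewCheckout) {
    // new code path behind the switch
  }
  return config;
};
```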


Amazon CodeGuru Profiler collects data about how your running code uses the CPU, and it is designed to keep the profiler's impact on your application small: the agent samples only a small fraction of the time, so it can run continuously against the program in production. It can build a CPU flame graph of your application's performance and automatically give you common optimization recommendations.


One of Amazon's own teams improved CPU utilization by 325% through CodeGuru-guided optimization. Amazon Lambda also has a new feature: you can turn on CodeGuru profiling for your function directly with a switch on the console page.


The last example is Coca-Cola, an Amazon customer. Its self-service beverage machines can be found all over the world, but after the epidemic people did not want to touch them as much, so Coca-Cola wanted a machine that could be controlled entirely from a mobile phone, with no buttons. Roughly, it works like this: you place your cup, scan the code with your phone, an app opens, you select the brand of drink you like and the flavor, press the button, and the drink pours within 1 second. The request goes from the mobile phone to the cloud through the API, and the control signal is sent back down to the machine, with the whole round trip completed within 1 second. The entire solution was conceived, designed, developed, deployed, and promoted globally in 100 days.


So, to close, I hope that from this introduction you have learned how Amazon's serverless computing services can help you move fast, lower costs, and get stronger performance. If you have the opportunity to use serverless computing, join these companies around the world and let's use serverless together.

