Introduction: Overseas, the use of Serverless has expanded rapidly, the total running time of functions keeps growing, usage patterns have become more mature, and developer tools have become more open.
Serverless steps into the spotlight
Author | Wang Chen
This article is an interpretation of Datadog's latest Serverless report. Feel free to leave a comment for discussion.
Every new technology, as it emerges and evolves, attracts both fans and skeptics. The beauty of Serverless is that it frees customers from investing in infrastructure as much as possible: they only need to focus on their own business and let the technology generate more commercial value. At the same time, customers pay only for what they use, with no need to keep computing resources permanently resident.
"Half of AWS customers on Datadog use Lambda, and 80% of AWS container customers use Lambda."
Yes, this figure comes from a Datadog survey report published last year, and it objectively reflects the progress of Serverless in overseas markets. A year later, Datadog has released its second Serverless research report. Let's take a look at the latest progress of Serverless overseas; decision makers and users who have already invested in Serverless, or who are still on the sidelines, may find some useful references here.
Viewpoint 1: Lambda is invoked 3.5 times as often as two years ago, with functions running 900 hours per day
How do we define the depth of Serverless usage?
Since 2019, companies that were already using Lambda have greatly increased their usage. On average, by early 2021 these companies were invoking functions 3.5 times as often per day as they did two years earlier. In addition, within the same group of Lambda users, each company's functions ran an average of 900 hours per day.
Ordinary cloud servers are charged according to the rented configuration and the rental duration, where the configuration is priced by vCPU and memory.
Function compute is different: it is charged by the number of invocations and by the running time of the functions. The number of invocations and the running time are therefore the indicators that measure how deeply a customer uses Serverless. The report does not give the absolute number of invocations per day, but based on the 900 hours of running time per day we can make a rough estimate of a customer's Serverless spend.
Using Alibaba Cloud Function Compute's pricing as a reference:
Running an instance with 1 GB of computing power for 1 second costs 0.00003167 yuan. With a 1 GB memory specification and 900 hours of running time per day, the expected spend is about 102.6 yuan per day, or roughly 37,000 yuan per year. Add the consumption of other cloud products such as storage, networking, security, and databases, and this is already the cloud expenditure of a medium-sized enterprise. The cost of invocations, by contrast, is usually not large, especially for function applications related to AI modeling in Python; on Alibaba Cloud Function Compute, 1 million invocations per day costs 13.3 yuan.
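To make the arithmetic explicit, here is a minimal sketch of the duration-cost estimate above in Python. The unit price is the one quoted in the text; actual prices vary by vendor, region, and memory specification.

```python
# Back-of-the-envelope estimate of the duration cost described above.
# The unit price is the one quoted in the text; real prices vary by
# vendor, region, and memory specification.
PRICE_PER_GB_SECOND = 0.00003167   # yuan per GB-second
MEMORY_GB = 1                      # assumed instance memory
RUN_HOURS_PER_DAY = 900            # from the Datadog report

cost_per_day = PRICE_PER_GB_SECOND * MEMORY_GB * RUN_HOURS_PER_DAY * 3600
cost_per_year = cost_per_day * 365

print(f"duration cost per day : {cost_per_day:,.1f} yuan")   # ~102.6
print(f"duration cost per year: {cost_per_year:,.0f} yuan")  # ~37,000
```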
Viewpoint 2: Lambda's median execution time is 60 milliseconds, only half of what it was a year ago
Under event-driven architecture, execution time is a key production factor
Execution time is a concept specific to FaaS. Because FaaS is an event-driven architecture, computing resources are only called upon to run a function application, and billed, when it is actually triggered; the longer a function application runs, the higher the cost. Unlike function compute, ordinary cloud servers are billed for the rented servers. Although they also offer automatic elastic scaling, they are not event-driven, so the scaling rules are limited. Moreover, ordinary cloud servers are billed by the second, while FaaS products in the industry such as Lambda and Alibaba Cloud Function Compute already support billing by the millisecond. The finer the billing granularity, the more room there is to optimize compute costs.
From the data we can see that more and more AWS customers are following the official cost-optimization best practices to shorten function execution time, further optimizing compute costs and maximizing the cost advantage of Serverless.
In the figure below, the tail of the function execution time distribution is very long, which shows that Lambda supports not only short-lived operations but also compute-intensive use cases. Although the report does not show which compute-intensive business scenarios use Lambda, the cases promoted by domestic cloud vendors suggest it is mainly audio/video processing and AI modeling.
Viewpoint 3: Beyond AWS Lambda, Azure Functions and Google Cloud Functions are also growing rapidly
A hundred schools of thought contending is a necessary stage for the industry to mature
AWS Lambda was the earliest FaaS product, and Azure and Google followed closely with FaaS products of their own. Their growth may benefit from the maturing of the whole industry. A year ago only 20% of Azure customers used Azure Functions; a year later that figure has grown to 36%, and 25% of Google Cloud customers already use Cloud Functions.
In this part, the report cites the case of Vercel:
Vercel is a very useful website management and hosting tool. It can deploy websites and apps quickly without buying servers or domain names, installing and configuring Nginx, uploading website files, adding DNS records, or configuring HTTPS certificates; most importantly, personal use is free forever.
The applications Vercel hosts are loosely coupled. Clearly, Vercel's business model makes full use of the cost advantage of Serverless to minimize the server costs brought by free individual users, using function applications to handle requests such as server-side rendering and API routing. Over the past year, Vercel's monthly function invocations grew from 262 million to 7.4 billion, a 28-fold increase.
Vercel official website: https://vercel.com/
Viewpoint 4: AWS Step Functions is Lambda's best companion; each workflow contains 4 functions on average, and the number keeps rising month by month
Orchestration is expanding the boundaries of function applications
A complete piece of business logic is usually not handled by a single function application. It requires multiple function applications, and may even involve other computing units such as elastic compute or batch compute. Here, the orchestration capabilities of a workflow can arrange the computing tasks in sequential, branching, parallel, and other patterns, simplifying the coding work of tedious business wiring while visually monitoring the state of every step in the business process, killing two birds with one stone. AWS Step Functions provides exactly this capability.
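As a rough illustration of what such an orchestration looks like, here is a minimal Amazon States Language definition sketched as a Python dictionary. The state names and function ARNs are hypothetical, not taken from the report: two Lambda tasks run in sequence and two more fan out in parallel, touching four functions in total.

```python
import json

# Hypothetical state machine: two Lambda tasks run in sequence, then two more
# run in parallel. All state names and function ARNs are placeholders.
definition = {
    "StartAt": "ValidateOrder",
    "States": {
        "ValidateOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:validate-order",
            "Next": "ChargePayment",
        },
        "ChargePayment": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:charge-payment",
            "Next": "FanOut",
        },
        "FanOut": {
            "Type": "Parallel",
            "End": True,
            "Branches": [
                {
                    "StartAt": "SendEmail",
                    "States": {
                        "SendEmail": {
                            "Type": "Task",
                            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:send-email",
                            "End": True,
                        }
                    },
                },
                {
                    "StartAt": "UpdateInventory",
                    "States": {
                        "UpdateInventory": {
                            "Type": "Task",
                            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:update-inventory",
                            "End": True,
                        }
                    },
                },
            ],
        },
    },
}

# The JSON string below is what would be passed to Step Functions as the
# workflow definition when creating the state machine.
print(json.dumps(definition, indent=2))
```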
The report shows that each Step Functions workflow contains 4 Lambda functions on average, and the trend is increasing month by month, indicating that more and more customers are using workflows to handle increasingly complex business logic. Workflows that finish within 1 minute account for 40%, but some workflows run for longer than 1 hour or even more than a day; these long-running workflows are mainly AI modeling.
In this part, the report cites the case of Stedi, a company that provides structured messaging services for B2B transactions, such as marketing email delivery. Such business scenarios are event-triggered, need to be invoked within a short time, and target a large number of mailboxes, so Serverless plus workflows satisfies the customer's demands for cost savings and for development and operations efficiency very well.
Stedi official website: https://www.stedi.com/
Viewpoint 5: A quarter of AWS CloudFront customers use Lambda@Edge
The edge is becoming a new market for serverless
Lambda@Edge is a feature of Amazon CloudFront that allows customers to run code close to application users, improving performance and reducing latency. With Lambda@Edge, customers do not need to provision or manage infrastructure in multiple locations around the world; they pay only for the compute time used and incur no cost when the code is not running.
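As a rough sketch of how this looks in practice, the following is a minimal Lambda@Edge handler on a hypothetical CloudFront viewer-request trigger (Python runtime); the header it adds is purely illustrative.

```python
# Minimal sketch of a Lambda@Edge handler on a CloudFront viewer-request
# trigger (Python runtime). It reads the incoming request from the CloudFront
# event, adds an illustrative custom header, and returns the request so that
# CloudFront continues processing it. The header name is a placeholder.
def handler(event, context):
    # CloudFront wraps the HTTP request in the Lambda@Edge event record.
    request = event["Records"][0]["cf"]["request"]

    # Tag the request with a custom header before it is forwarded on.
    request["headers"]["x-edge-processed"] = [
        {"key": "X-Edge-Processed", "value": "true"}
    ]

    # Returning the (possibly modified) request lets it continue to the cache/origin.
    return request
```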
In effect, in edge scenarios the network delivers serverless computing power to customers alongside content, without having to call back to compute in the central cloud. This improves the experience of latency-sensitive edge services, such as video surveillance and intelligent analysis in IoT scenarios, or business monitoring and analysis tasks.
The report shows that a quarter of Amazon CloudFront customers use Lambda@Edge, and 67% of their functions execute in less than 20 milliseconds, indicating that these applications are very latency-sensitive. The more demand there is for this kind of edge business, the greater the potential of Serverless at the edge.
Viewpoint 6: More than half of customers use less than 80% of their reserved function instances
Cold start is an unavoidable problem under an event-driven architecture
Application startup is especially slow under Java/.NET frameworks, because Java has to initialize its virtual machine (JVM) and load a large number of classes into memory before it can execute user code. The industry has offered many ideas for optimizing cold starts, such as providing reserved function instances, or speeding up container image pulls through on-demand loading and more efficient storage and algorithms.
Essentially, reserved function instances are a quick and effective way to avoid cold starts, but they do not solve the cold-start problem at its root. The more resources are reserved, the more is wasted, which runs counter to the on-demand usage that Serverless advocates. This is why this year's report pays close attention to how well customers utilize their reserved instances.
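On AWS Lambda the equivalent knob is provisioned concurrency. A minimal boto3 sketch of reserving warm instances for a function alias might look like the following; the function name, alias, and instance count are placeholders.

```python
import boto3

# Hedged sketch: reserve 10 pre-warmed instances ("provisioned concurrency")
# for a published alias of a function. Function name, alias, and count are
# placeholders; reserving more capacity than traffic needs simply wastes money.
lambda_client = boto3.client("lambda")

lambda_client.put_provisioned_concurrency_config(
    FunctionName="order-api",            # hypothetical function name
    Qualifier="prod",                    # alias or published version to keep warm
    ProvisionedConcurrentExecutions=10,  # number of reserved instances
)
```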
The report shows that 57% of customers who use reserved instances use less than 80% of the reserved resources, and more than 30% of customers use less than 40%; the remaining customers, a bit more than 40%, have utilization between 80% and 100% and presumably still run into cold starts. Continuously tuning the reserved-instance configuration of a business therefore remains a challenge that vendors and customers have to face together, and related best practices will be all the more valuable as guidance; interested readers can take a look at the best practices for Function Compute.
Viewpoint 7: Open-source serverless frameworks are the main way function applications are deployed
The more finely applications are split, the harder deployment becomes
Under a Serverless architecture, manually deploying a few function applications may not be too complicated, but once an application grows to dozens or hundreds of functions, the difficulty of deployment multiplies. At that point deployment tools can be used to improve deployment efficiency, just as Kubernetes, which automates the deployment, scaling, and management of containerized applications, has become an indispensable tool for managing containers.
The report shows that more than 80% of customers use the Serverless Framework to deploy and manage function applications. Although the report does not give a reason, this is largely down to the Serverless Framework's ease of use, openness, and community. The report predicts that infrastructure-as-code deployment tools will play a bigger role in deploying serverless applications at scale. The three deployment tools developed by AWS, vanilla CloudFormation, AWS CDK, and AWS SAM, have usage rates of 19%, 18%, and 13% respectively. (Some customers use two or more tools at the same time, so the percentages add up to more than 100%.)
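As an illustration of the infrastructure-as-code approach, here is a minimal AWS CDK sketch in Python that defines a single Lambda function; the construct names and asset path are placeholders. (The Serverless Framework and SAM express the same thing declaratively in YAML.)

```python
from aws_cdk import App, Stack, aws_lambda as _lambda
from constructs import Construct


class HelloServerlessStack(Stack):
    """Declares one Lambda function; stack, handler, and asset names are placeholders."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        _lambda.Function(
            self, "HelloFunction",
            runtime=_lambda.Runtime.PYTHON_3_9,
            handler="index.handler",                 # index.py exposing a handler() function
            code=_lambda.Code.from_asset("lambda"),  # local directory holding the code
        )


app = App()
HelloServerlessStack(app, "HelloServerlessStack")
app.synth()  # `cdk deploy` turns the synthesized template into real resources
```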
Back in China, Alibaba Cloud, Baidu Cloud, Huawei Cloud, and Tencent Cloud all provide their own closed-source deployment tools. Tencent Cloud has cooperated with the Serverless Framework to build its Serverless Application Center, and Alibaba Cloud open-sourced Serverless Devs last year, covering the deployment, operations, and monitoring of function applications. In addition, Midway, another domestic open-source project that provides a Node.js development framework, has received 4k+ stars. As more developers get involved in Serverless, the domestic open-source tool ecosystem should become more active.
Viewpoint 8: Python is the most popular Lambda runtime, especially in large-scale environments
Serverless naturally supports development in multiple languages. So the question is: which programming language is the most popular?
The report shows that 58% of users use Python and 31% use Node.js, while Java, Go, .NET Core, and Ruby each account for less than 10%. Considering the characteristics of different vendors, however, Java's share on Alibaba Cloud may be higher, and .NET will have more customers on Azure.
Interestingly, in small Lambda runtime environments Node.js has a higher share than Python, but as function scale grows Python becomes more and more popular; in enterprise-scale organizations Python is used 4 times as often as Node.js, as shown below:
Lei Juan shared an analysis of the programming-language part of the report on the Alibaba Cloud intranet: large enterprises use Python heavily for big data, AI, and similar workloads, and they use Lambda in large volumes, so in terms of the number of Lambdas Python has an absolute advantage. Node.js applications do not need such large runtimes (multi-core CPUs and large memory) and usually run on small instances; in addition, they may come from individual Node.js developers, who usually choose a small Lambda environment.
In addition, the distribution of language versions, in descending order, is: Python 3.x, Node.js 12, Node.js 10, Python 2.7, Java 8, Go 1.x, .NET Core 2.1, .NET Core 3.1.
Summary
Overall, compared with last year, the use of Serverless overseas is expanding rapidly, the total running time of functions keeps growing, usage patterns are becoming more mature, and developer tools are more open. In China, Serverless is no longer limited to offline tasks or loosely coupled applications; many enterprise customers have already applied Serverless to core links of their production processes. For example, Century Lianhua has deployed core applications such as its trading system, membership system, inventory system, and back-end promotion module on Function Compute to reduce its investment in infrastructure, and Xianyu has begun a serverless transformation of a traditional monolithic application, tackling industry-wide problems such as code reuse between functions and unified upgrades of function dependencies. Application transformation takes time and a window of opportunity, but more and more enterprise customers will no doubt choose Serverless to free up their hands.
English report link: https://www.datadoghq.com/state-of-serverless/