Cloud computing has come a long way since it was first dismissed as a marketing gimmick; more than a decade on, it anchors a vast industrial ecosystem. Over the past ten years the IT world has been turned upside down: new ideas and technologies have emerged so quickly that developers struggle to keep up, and no one can be sure they are still at the forefront.

Perhaps that is exactly why re:Invent, the world's largest cloud computing conference, draws tens of thousands of developers to Las Vegas every year. This year marks its tenth edition. Over those ten years, re:Invent has anticipated the future again and again; some of its product launches have proved as prescient as the forecasts of authoritative research bodies, making the event a window onto the future for developers around the world.

So what has happened in the cloud over the past decade, and where is it heading? This article looks back at ten years of re:Invent, alongside the broader history of cloud computing, to trace how cloud technology has evolved.
1 Computing Service Era: The Upsurge of Cloud Migration (2006)
The birth of cloud computing did not happen overnight.
In the early days, most companies were weighed down by buying hardware and renting IDC data-center space to build their IT infrastructure, and Amazon was no exception. A well-known anecdote circulates in the industry about why Amazon began offering cloud services.

The story goes that Amazon had many project teams at the time, and every team needed server resources to test new products. Whenever servers were in short supply, team leads could only plead their case to CEO Jeff Bezos, who reportedly erupted at the technical organization: "You are the ones killing Amazon's creativity."

Around that time, Bezos came across the concept of "primitives" in a book and tried applying it to development: decompose computing and storage into the smallest, simplest components and let developers use and combine them freely. Amazon's internal creativity flourished as a result, and Bezos wondered: could these primitives be offered to outside developers as well?
In 2006, Amazon launched Amazon Web Services, offering enterprises IT infrastructure in the form of web services. With virtual machines and storage available as a service, users no longer had to build their own servers; they could draw on computing and storage capacity on demand, much like water and electricity. On March 14, 2006, Amazon Web Services released the Simple Storage Service (S3), and a new era of computing began. This era came to be called the "computing service era," and people vividly named the new resource-delivery model after the "cloud."
Microsoft followed with a community technology preview of Windows Azure at PDC 2008, and Google launched a preview of Google App Engine the same year. The separation of computing and storage became an established idea, major cloud vendors committed to providing infrastructure services to enterprises, and a stream of virtual machine and storage offerings set off a wave of cloud migration across the industry.
But migrated workloads were still only a rudimentary form of cloud computing; no one yet expected that the infrastructure of the entire IT world would be completely remade. Meanwhile, "information walls" persisted across the industry, cutting developers off from the most advanced cloud concepts and technologies and hindering the spread of innovation.
2 Cloud Native Era: From On Cloud to In Cloud (2010)
The computing-service era made life easier for businesses, but they soon found it was not enough. Moving an enterprise to the cloud is not just a matter of infrastructure and platform: the applications themselves, from architecture design and development methods to deployment and operations, need to be built around the characteristics of the cloud.

Against the grand vision of the "cloud," merely relocating servers was too simple, and for cloud providers, plain virtualization could no longer satisfy a rapidly growing customer base.

Thus cloud native emerged: an approach to application development and operations better suited to cloud computing architectures.

The term "cloud native" was first used by Paul Fremantle in a 2010 blog post, where he described applications written to share the behavior of the cloud itself: distributed, loosely coupled, self-service, continuously deployed, and continuously tested. If the computing-service era was "on cloud," cloud native is "in cloud." At the time, however, cloud native was little more than an idea, and from birth it remained thoroughly abstract to most developers.
Then, in 2012, a hardcore event stepped onto the stage of history: re:Invent. For the first time, the "information wall" between developers and cutting-edge cloud technology came tumbling down. More than 6,000 developers from around the world gathered in Las Vegas for a three-day agenda of over 100 sessions, exchanging the latest developments in cloud technology from morning to night.
It was at this conference that Amazon Cloud Technology launched Amazon Redshift, the industry's first cloud-native data warehouse, marking the moment cloud native went from concept to product, from idea to a technology stack that could actually be deployed.
That re:Invent not only made Las Vegas a paradise for developers; it also seemed to open the floodgates for the entire cloud-native industry, whose subsequent growth was nothing short of explosive.
In 2013, Netflix cloud architect Adrian Cockcroft presented Netflix's successful cloud-native deployment on Amazon Cloud Technology. In 2013-2014, Docker was released and Kubernetes was open-sourced. In 2015, the Cloud Native Computing Foundation (CNCF) was founded, and cloud-native concepts and practices gradually entered the mainstream. That same year, Matt Stine of Pivotal described cloud native as a set of best practices in his e-book Migrating to Cloud-Native Application Architectures: the twelve factors, microservices, agile infrastructure, API-based collaboration, and antifragility. Pivotal has since revised its definition several times, eventually settling on the formulation widely accepted today: DevOps + continuous delivery + microservices + containers.
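Of the practices Stine lists, the twelve-factor methodology is the most concrete. Its "config" factor, for instance, requires that deployment-specific settings live in the environment rather than in code, so the same build artifact runs unchanged in development and production. A minimal sketch of the idea in Python (the variable names and defaults are illustrative, not from any real service):

```python
import os

def load_config(env=os.environ):
    """Read twelve-factor style configuration from environment variables.

    Deploy-specific values (endpoints, credentials, feature flags) come
    from the environment; only safe local defaults are hard-coded.
    """
    return {
        "database_url": env.get("DATABASE_URL", "sqlite:///local.db"),
        "debug": env.get("DEBUG", "false").lower() == "true",
        "port": int(env.get("PORT", "8080")),
    }
```

The same code then serves every environment; only the environment variables differ between a developer's laptop and a production container.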
After Docker's rise, every major public cloud offered container-based PaaS services, and managing and orchestrating those containers became the urgent problem to solve. After a fierce, if bloodless, war among orchestration frameworks, Kubernetes emerged victorious as the de facto standard for container orchestration.
But it all began at re:Invent 2012. Developers around the world remember that conference as something of a prophet: it foresaw the success of cloud native, and the industry quickly proved it right.
3 The Data Explosion Era: From Container Cloud to Data Cloud (2014)
As cloud-native concepts and products matured, the industry's next challenge was no longer cloud adoption but the explosion of data.
With the arrival of the mobile Internet and the Internet of Everything, data volumes began to grow exponentially. In 2010, the total data generated by all human activity in a year first exceeded the zettabyte level; by 2020 the figure had reached 64 ZB. Enterprises moved quickly to exploit it: in early June 2013, Japan's Hitachi Group established the Hitachi Innovation Analysis Global Center, which collects vast data sets such as vehicle driving records, retail purchasing trends, patient medical data, mine-maintenance records, and commodity price movements to build a big-data analytics business.
Traditional computing and storage methods could no longer keep pace with data at this scale, and the data-product ecosystem innovated at an astonishing rate, expanding from traditional relational databases to all manner of non-relational databases and big-data cloud services.
At re:Invent 2014, Amazon Cloud Technology launched Amazon Aurora, the first database designed specifically for the cloud. Aurora is fully compatible with the popular MySQL and PostgreSQL relational databases while thoroughly separating compute from storage and scaling almost without limit. It combines the speed and availability of high-end commercial databases with the simplicity and cost-effectiveness of open-source ones, at roughly one-tenth the cost of a commercial-grade database. Since its 2014 release, Aurora has remained the fastest-growing service in Amazon Cloud Technology's history.
At subsequent re:Invents, developers around the world witnessed the power of combining big data with cloud computing: Amazon Cloud Technology has since released fifteen purpose-built database services, covering eight data models: relational, key-value, document, in-memory, graph, time-series, wide-column, and ledger.
The release of Amazon Aurora raised the curtain on the cloud-database era. In the years that followed, Amazon kept innovating in database products, pioneering serverless cloud-native databases: Amazon Aurora Serverless v1 arrived in 2018, followed by v2 at re:Invent 2020. Aurora Serverless v2 can scale a database workload from hundreds to hundreds of thousands of transactions within seconds. The successive releases of Amazon Aurora Global Database and Amazon DynamoDB Global Tables likewise signal that global synchronization has become a key direction for cloud-native database technology.
4 The AI Era: From "Cloud" + AI to Innovation Proving Ground (2016)
Growing computing power underpins artificial intelligence, and the explosion of data gives its algorithms something to work on. With all three elements in place, data, algorithms, and compute, AI applications have matured into a complete engineering stack, from hardware to underlying frameworks to training and deployment.
In 2016, after AlphaGo defeated the human Go champion Lee Sedol, artificial intelligence entered a period of rapid development. By 2017 the era of machine learning engineering had fully arrived, and machine learning moved from a peripheral application to a core competency for enterprises and organizations across industries.
At re:Invent 2017, Amazon Cloud Technology released Amazon SageMaker, a fully managed machine learning service for developers and data scientists. SageMaker gives developers a complete "central kitchen": they need only bring the "ingredients" (data) and start "cooking" (training models) directly, greatly improving the efficiency of developing, training, and deploying machine learning models and opening a new era of intelligence. At re:Invent 2019, Amazon Cloud Technology went further with Amazon SageMaker Studio, the first fully integrated ML development environment.
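The "prepare ingredients, then cook" workflow that SageMaker manages, ingest data, fit a model, then serve predictions behind an endpoint, can be sketched locally with a toy model. Everything below is an illustrative stand-in for the lifecycle, not the SageMaker API; a real SageMaker job would run these stages on managed infrastructure:

```python
# Illustrative stand-in for a managed train-then-deploy workflow.

def train(samples):
    """'Cook' a trivial model: average the y/x ratio over training pairs."""
    ratios = [y / x for x, y in samples]
    return {"slope": sum(ratios) / len(ratios)}

def deploy(model):
    """Return a 'prediction endpoint': a callable closed over the model."""
    def predict(x):
        return model["slope"] * x
    return predict

# "Ingredients": (x, y) pairs lying on the line y = 2x.
endpoint = deploy(train([(1, 2), (2, 4), (3, 6)]))
```

The value of a managed service is that the middle of this pipeline, provisioning training hardware, tracking experiments, hosting the endpoint, is handled for you; the developer's job reduces to supplying data and model code.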
The combination of cloud computing and AI is an inevitable trend. The SaaS nature of cloud products makes them a natural vehicle for carrying new technologies from concept to concrete application scenarios. And it is not only AI: blockchain, IoT, and quantum computing have all emerged in recent years with cloud computing as their carrier. The cloud is gradually becoming an important proving ground for innovative technologies.
5 Every Node Is a Starting Point
From the computing-service era to the cloud-native era, from the data-explosion era to the AI era: each era begins at a critical node and then keeps moving forward, becoming the foundation of the next. re:Invent has repeatedly appeared at exactly those nodes, serving as the vanguard that turns technical concepts into industrial-grade cloud products.
In computing and storage services, after the release of S3 came the Amazon Nitro architecture, launched in 2017. The Nitro system frees users from the constraints of virtualization, bringing cloud-server performance loss close to zero. In 2019, Graviton2, the second-generation in-house Arm-architecture processor, arrived, ushering in an era of large-scale Arm-based instances for enterprise applications with a 40% improvement in price-performance.
With the cloud-native era under way, Amazon Lambda, the first serverless computing product from a public cloud provider, was released at re:Invent 2014, setting off the serverless wave. Once again the release proved prescient: in 2019, Gartner named serverless the most promising direction for cloud computing.
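The serverless model Lambda popularized shows up most clearly in its handler contract: the platform owns the servers, and the developer supplies only a function invoked per event. A minimal sketch of the standard Python handler shape (the `event` payload and response fields follow the common API Gateway proxy convention; the `name` key is a hypothetical example):

```python
import json

def handler(event, context):
    """Entry point the Lambda runtime invokes for each event.

    'event' carries the trigger payload; 'context' provides runtime
    metadata (request id, remaining time) and is unused in this sketch.
    """
    name = event.get("name", "world")
    # API Gateway proxy responses: a dict with statusCode and a JSON body.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Because the handler is just a function, it can be unit-tested locally by calling `handler({"name": "dev"}, None)`, one practical reason serverless functions became popular with developers.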
In cloud databases, Amazon Aurora opened the cloud-database era. Aurora Serverless v2, released at re:Invent 2020, lets a cloud-native relational database scale automatically and elastically, much like microservices do, providing flexible support for the most demanding applications and workloads of today and tomorrow.
How will cloud computing develop next? What new technologies will combine with the cloud to push the great wheel of the era forward? The road ahead is unknown, but re:Invent is a beacon that lights the way. Through it, developers can gain a basic grasp of future trends, and companies get a chance to leapfrog competitors on the road of development.
The grand event that has led cloud computing for a decade will open a new chapter for the next decade in Las Vegas, Nevada. Amazon Cloud Technology will also stream the conference online starting November 30, Beijing time.
At this conference, Adam Selipsky will appear at re:Invent for the first time as the new CEO of Amazon Cloud Technology. The agenda spans five themes, product, industry, organization, community, and role, with 22 leaders from across Amazon Cloud Technology gathering to share sessions on technology, training and certification, machine learning, business, and more. Which new products and technologies will debut at this conference remains unknown; we can only look forward to the grand opening of Amazon Cloud Technology re:Invent on November 30, Beijing time.