Deployment is a critical part of software development. In agile development in particular, turning modified code into a running result quickly, efficiently, and smoothly has always been a topic of interest. The most common continuous integration and continuous deployment (hereinafter CI/CD) setup is Jenkins: with Jenkins plus a host server you can iterate on a project rapidly, which is genuinely convenient and remains a popular approach. In practice, however, things are not that simple. Development today is increasingly containerized and built around Kubernetes (K8s) clusters, where Jenkins can feel underpowered, let alone implementing a complete multi-environment CI/CD workflow on K8s. So how can we implement CI/CD well on top of K8s?
In this walkthrough we take Microsoft Azure Kubernetes Service (hereinafter AKS) as an example and use an ASP.NET Core project to work through the CI/CD operations.
Prepare your accounts
First, register an Azure account.
Second, you need a source-code repository. You can create one in Azure DevOps Repos, which comes with excellent Kanban boards and workflow tooling; it is recommended.
You can also use your own GitHub account. Since Microsoft acquired GitHub, its integration with Azure has been particularly smooth. If you already have a project on GitHub, you can build a CI pipeline for it directly; the finished result is shown in the figure, and the concrete setup and deployment steps follow below.
There are two main reasons to create a Pipeline in Azure:
1. To verify the code. For example, if you make a quick change without a compiler at hand, you would normally edit the file directly and commit it; CI then performs an initial check that the code is correct.
2. To build an image and push it to a registry, or even deploy it directly, as covered in the next article.
This truly achieves "change the code == preview the result".
GitHub Actions offers the same capability, and the idea is much the same, except that the CD (continuous deployment) side is less convenient to set up. For details, see the code of my Blog.Core project; I will not repeat it here.
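For comparison, a GitHub Actions workflow that builds and pushes the same kind of image might look roughly like this sketch (the Docker ID `mydockerid`, image name `aspnetcore-demo`, and the `DOCKER_PASSWORD` secret are placeholders for your own values):

```yaml
# .github/workflows/ci.yml - a minimal CI sketch for GitHub Actions
name: ci
on:
  push:
    branches: [ master ]   # run on every push to master

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    - name: Build and push image
      run: |
        docker build -t mydockerid/aspnetcore-demo:${{ github.sha }} .
        echo "${{ secrets.DOCKER_PASSWORD }}" | docker login -u mydockerid --password-stdin
        docker push mydockerid/aspnetcore-demo:${{ github.sha }}
```

The structure mirrors the Azure Pipeline built below: check out, build the image, authenticate, push.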
Connect to the Docker Registry service
Before creating a CI pipeline, it is recommended to configure a Docker Registry service connection, so that the image produced by the build can be pushed to an image registry for the subsequent CD operations.
Step 1-project settings
Click the Project settings operation link under the project:
In the new page that pops up, select the Service connections operation in the left navigation bar:
Click the New Service Connection button:
Step 2-Configure Docker Registry options
Search for docker keywords and select the Docker Registry option:
Select the Registry type, enter your Docker ID and password, give the connection a name, and click the Save button.
The service connection is now created; it will be used by the CI operations later.
The user name and password must be correct or an error will be reported; you can click Verify to check them:
Create a new pipeline
Step 1 - Configure the source code repository
Select New Pipeline. If your source code lives in Azure DevOps, choose the first option; since mine is on GitHub, I select GitHub. I generally use YAML.
Choose one of your own projects (note: GitHub may ask for a second-factor confirmation at this point):
Step 2 - Configure your pipeline
Since we are deploying with containers, click the Docker option (note how it differs from the second option):
Next configure the YAML file. The system creates one by default containing only the build step; you can remove the default task and pick a Docker assistant template on the right.
You can also click the settings link in the code editor on the left to bring up the editing panel directly.
Leave the other fields at their defaults: check the service connection created earlier, then enter your own container repository name. Note that the image name needs a prefix, which is your Docker ID. Click the Add button, and the YAML on the left is updated in sync.
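A pipeline configured this way ends up looking roughly like the following sketch (the service connection name `my-docker-connection` and the repository `mydockerid/aspnetcore-demo` are placeholders; substitute the connection and Docker ID you created above):

```yaml
# azure-pipelines.yml - build the ASP.NET Core image and push it to the registry
trigger:
- master                      # run CI on every push to master

pool:
  vmImage: 'ubuntu-latest'    # Linux agent; required for Linux containers

variables:
  tag: '$(Build.BuildId)'     # tag each image with the build number

steps:
- task: Docker@2
  displayName: Build and push image
  inputs:
    containerRegistry: 'my-docker-connection'   # the service connection created earlier
    repository: 'mydockerid/aspnetcore-demo'    # Docker ID prefix + image name
    command: 'buildAndPush'
    Dockerfile: '**/Dockerfile'
    tags: |
      $(tag)
      latest
```

Using `$(Build.BuildId)` as the tag makes every CI run produce a uniquely identifiable image, while `latest` always points at the most recent build.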
Step 3 - Run the pipeline
Click the Save and run button in the upper right corner, and a complete CI run is performed.
At the same time, the YAML file just generated for the pipeline is committed back to GitHub:
After the build completes, we can see the pushed image on Docker Hub:
With that, the continuous integration (hereinafter CI) part is done: the ASP.NET Core project has been compiled, packaged into an image, and pushed to the designated Docker image registry.
Now let's continue with the continuous deployment (hereinafter CD) of that artifact.
Add a Release pipeline
Step 1 - Create a new pipeline
In the left navigation bar, click the Releases option under Pipelines to create a new pipeline. If you have never created a Release pipeline before, the page looks like this by default:
In the template panel on the right, select the empty template:
Step 2 - Configure the Artifact
Hover over the Artifacts panel on the left and click to add an artifact. An editing panel opens on the right; click the Build option:
Choose the pipeline from the previous build and use the Latest version. At this point our CD pipeline is officially linked to the CI pipeline.
Step 3 - Enable automatic releases
Click the trigger icon in the upper right corner of the artifact to enable the continuous deployment trigger, so that every code commit triggers the CI build, which in turn immediately triggers the CD release; the whole process runs in one go.
With the artifact configured, we can now configure the tasks.
Hover over the Stage panel on the right; you will see three options:
①. Rename the stage;
②. Add a new stage;
③. Edit the stage's tasks.
Click the task link to configure the Agent job. Two things to note here:
1. The agent pool determines where the deployment runs; the default is sufficient for now, and you can switch to your own server later.
2. The agent specification is the server/OS configuration.
Note that the default here is the vs2019 specification, which is a Windows environment. If you leave it unchanged, Docker will hit a platform mismatch and fail to run, so choose a Linux specification instead.
Step 1 - Remove the old container
Click the plus sign to the right of the Agent job and filter for the Docker task template.
In the editing panel that appears, configure a command that removes the old container. Since we run the command directly, use Task version 0.*:
Select the container registry connection, configure the action, and add the command that removes the old container:
rm -f xxxx
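In YAML form, the task configured above would look roughly like this sketch (the field names follow the version 0.* Docker task from memory, and the connection name `my-docker-connection` and container name `aspnetcore-demo` are placeholders for your own values):

```yaml
# Release stage task (Docker task version 0.*): remove the previous container
# so its name and port are free for the new one.
- task: Docker@0
  displayName: Remove old container
  inputs:
    containerregistrytype: 'Container Registry'
    dockerRegistryConnection: 'my-docker-connection'  # connection created earlier
    action: 'Run a Docker command'
    customCommand: 'rm -f aspnetcore-demo'            # ignore errors if it does not exist yet
```

On the very first release there is no old container to remove, so the command may report an error; that is harmless for this step.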
Step 2 - Run the new container
Similar to the step that removes the old container, add a task that runs the new one. The overall flow is the same; the configuration is shown below:
This time the Task version is 1.*. Configure the Docker container settings: use the custom Docker Registry connection, set the image name (customizable; for example, I added a prefix), and finally specify the port.
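The run task, again expressed as a rough YAML sketch (version 1.* Docker task, field names from memory; image name, container name, connection name, and port mapping are all placeholders):

```yaml
# Release stage task (Docker task version 1.*): pull and run the freshly pushed image.
- task: Docker@1
  displayName: Run new container
  inputs:
    containerregistrytype: 'Container Registry'
    dockerRegistryEndpoint: 'my-docker-connection'
    command: 'Run an image'
    imageName: 'mydockerid/aspnetcore-demo:$(Build.BuildId)'  # image pushed by CI
    containerName: 'aspnetcore-demo'    # must match the name removed in step 1
    ports: '8081:80'                    # host port : container port
```

Keeping the container name identical to the one removed in step 1 is what makes the delete-then-run pair behave like an in-place upgrade.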
Click Save, and a simple continuous deployment pipeline is in place.
You can trigger it manually by creating a release and watch the detailed progress:
After a short wait the new container is running, but since we have not yet configured our own agent pool, there is no way to view the actual result.
How to view the result
There are currently two common ways to deploy a project in Azure:
1. Configure a custom agent pool backed by your own server; image builds and container runs then happen on that server.
2. Use Azure's K8s service directly: add a Deploy task and point it at the corresponding service connection and YAML file, so the image is pushed to the Azure container registry and run as Pod instances.
I have used both; with the second approach it goes roughly like this:
first build and push the image to the registry,
then deploy the image.
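For the second approach, the YAML file the Deploy task applies would be a standard Kubernetes manifest, roughly like this sketch (the names, the registry `myregistry.azurecr.io`, and the image tag are placeholders for your own values):

```yaml
# deployment.yml - a minimal Deployment plus Service for the pushed image
apiVersion: apps/v1
kind: Deployment
metadata:
  name: aspnetcore-demo
spec:
  replicas: 2                      # two Pod instances behind the Service
  selector:
    matchLabels:
      app: aspnetcore-demo
  template:
    metadata:
      labels:
        app: aspnetcore-demo
    spec:
      containers:
      - name: aspnetcore-demo
        image: myregistry.azurecr.io/aspnetcore-demo:latest  # image pushed by the CI stage
        ports:
        - containerPort: 80
---
apiVersion: v1
kind: Service
metadata:
  name: aspnetcore-demo
spec:
  type: LoadBalancer               # AKS provisions an Azure public IP for this
  selector:
    app: aspnetcore-demo
  ports:
  - port: 80
    targetPort: 80
```

With `type: LoadBalancer`, AKS assigns a public IP to the Service, which is how you can finally view the running result in a browser.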
This article used ASP.NET Core as an example to show how to implement continuous integration in Azure. The overall process is simple and convenient, and the documentation is particularly clear; once again, kudos to the Microsoft Docs.
We then continued with ASP.NET Core to show how to implement continuous deployment in Azure. Built on top of continuous integration, our code is carried through each step automatically, achieving automation in a real sense.
Microsoft Most Valuable Professional (MVP)
The Microsoft Most Valuable Professional award is a global award granted by Microsoft to third-party technology professionals. For 28 years, technology community leaders around the world have received it for sharing their expertise and experience in online and offline technical communities.
MVPs are a rigorously selected group of experts who represent the most skilled and insightful members of the community, passionate about helping others. They contribute through talks, forum Q&A, building websites, writing blogs, sharing videos, open-source projects, organizing conferences, and more, helping users in the Microsoft technology community make the most of Microsoft technologies.
For more details, please visit the official website: