Foreword
As of February this year, Amazon CodePipeline is available in the China (Beijing) Region operated by Sinnet and the China (Ningxia) Region operated by NWCD.
Amazon CodePipeline is a fully managed continuous delivery service that helps automate your release pipelines for fast and reliable application and infrastructure updates. Whenever there is a code change, Amazon CodePipeline automates the build, test, and deploy phases of the release process according to the release model you define.
Currently, Amazon CodePipeline in China has built-in support for Amazon CodeCommit, Amazon ECR and Amazon S3 as Source Providers. However, it does not support direct integration with source code management services such as GitHub, GitLab, and BitBucket.
GitHub is the world's largest source code hosting service, and integrating Amazon CodePipeline with GitHub mainly runs into the following two problems:
- Currently GitHub is not supported out of the box by Amazon CodePipeline.
- The network connection to GitHub is sometimes unstable, so connections to GitHub can fail.
This article introduces how to solve these two problems and integrate GitHub with Amazon CodePipeline in the Amazon Cloud Technology China regions.
Question 1: How to use GitHub as the Source Provider of Amazon CodePipeline
We can use Amazon API Gateway to expose an external interface that integrates with GitHub webhooks, and have API Gateway forward GitHub webhook events to an Amazon Lambda Function. The Lambda Function fetches the source code of the GitHub repository and uploads it to an S3 Bucket. By creating a CodePipeline project with the Amazon S3 Bucket as the Source Provider, GitHub effectively becomes the Source Provider of CodePipeline.
The workflow is as follows:
- Users commit code to the GitHub Repository.
- GitHub generates new webhook events based on user behavior and sends them to API Gateway.
- API Gateway forwards the webhook event to Amazon Lambda Function.
- The Lambda Function calls the GitHub repository URL to get the ZIP package of the source code.
- The Lambda Function uploads the ZIP package to the S3 Bucket.
- Amazon CodePipeline is triggered when a file is uploaded or updated in the Amazon S3 Bucket.
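For reference, the Lambda Function created in Step 2 below relies on only two fields of the push event payload, plus the X-Hub-Signature-256 header used for signature validation. A hypothetical, heavily trimmed payload (real payloads contain many more fields) can be saved for later testing:
cat <<'EOF' > sample-push-event.json
{
  "ref": "refs/heads/main",
  "repository": { "full_name": "my-org/my-repo" }
}
EOF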
Step 1: Create an Amazon S3 Bucket
Create an Amazon S3 Bucket through the S3 console, making sure to select "Enable Bucket Versioning".
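If you prefer the command line, roughly the same can be done as follows (the bucket name is a placeholder and the region is assumed to be China (Beijing), cn-north-1); note that bucket versioning is required for S3 to be used as a CodePipeline source:
# Create the bucket (placeholder name, China (Beijing) region assumed)
aws s3api create-bucket --bucket <your-source-bucket> --region cn-north-1 \
    --create-bucket-configuration LocationConstraint=cn-north-1
# Enable bucket versioning, required when using S3 as a CodePipeline source
aws s3api put-bucket-versioning --bucket <your-source-bucket> \
    --versioning-configuration Status=Enabled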
Step 2: Create an Amazon Lambda Function
Taking Node.js as an example, create an index.js file that mainly completes the following tasks:
- Verify the signature information in the webhook event
- Get the GitHub repository from the webhook event
- Upload the obtained source code to S3 Bucket
The source code example is given below:
const AWS = require('aws-sdk');
const axios = require('axios');
const s3 = new AWS.S3();
const crypto = require('crypto');

exports.handler = async (event) => {
    try {
        console.log(`Incoming event: ${JSON.stringify(event)}`);
        const eventBody = JSON.parse(event.body);
        const normalizedHeaders = normalizeObject(event.headers);
        // Validate message signature
        if (!(checkSignature(process.env.GITHUB_SECRET, normalizedHeaders['x-hub-signature-256'], event.body))) {
            console.log('Invalid webhook message signature');
            return responseToApiGw(401, 'Signature is not valid');
        }
        console.log('Signature validated successfully');
        const repoConfig = {
            repoFullName: eventBody.repository.full_name,
            // e.g. "refs/heads/main" -> "main" (also works for branch names containing "/")
            branch: eventBody.ref.replace('refs/heads/', ''),
        };
        // Download the repository package from the GitHub server
        const file = await downloadFile(repoConfig);
        // Upload the repository package to the S3 bucket
        const s3Upload = await s3.upload({
            Bucket: process.env.S3BUCKET,
            ServerSideEncryption: 'AES256',
            Key: `${repoConfig.repoFullName}/${repoConfig.branch}.zip`,
            Body: file
        }).promise();
        console.log(s3Upload);
        console.log('Exiting successfully');
        return responseToApiGw(200, 'success');
    }
    catch (err) {
        console.log('Exiting with error', err);
        return responseToApiGw(500, 'Some weird thing happened');
    }
};

/**
 * Convert an object's keys to lowercase
 * @param {object} inputObject - the object whose keys are converted to lowercase
 * @returns {object} - a new object with all keys in lowercase
 */
function normalizeObject(inputObject) {
    console.log('info', '>>> normalizeObject()');
    const requestKeys = Object.keys(inputObject);
    let outputObject = {};
    for (let i = 0; i < requestKeys.length; i++) {
        outputObject[requestKeys[i].toLowerCase()] = inputObject[requestKeys[i]];
    }
    console.log('info', '<<< normalizeObject()');
    return outputObject;
}

/**
 * Download the repository content as a zip file
 * @param {object} repoConfig - an object containing the config for the repository
 * @returns {stream} - a stream containing the repository zip file
 */
async function downloadFile(repoConfig) {
    console.log('info', '>>> downloadFile()');
    const params = {
        method: 'get',
        url: `https://github.com/${repoConfig.repoFullName}/archive/refs/heads/${repoConfig.branch}.zip`,
        responseType: 'stream'
    };
    try {
        const resp = await axios.request(params);
        console.log('info', '<<< downloadFile()');
        return resp.data;
    }
    catch (err) {
        console.log('error', err);
        throw new Error(err);
    }
}

/**
 * Check the GitHub webhook signature
 * @param {string} signingSecret - the signing secret configured for the GitHub webhook
 * @param {string} signature - the signature GitHub applied to the message (x-hub-signature-256 header)
 * @param {string} body - the raw message body
 * @returns {boolean} - true if the signature is valid, false otherwise
 */
function checkSignature(signingSecret, signature, body) {
    console.log('info', '>>> checkSignature()');
    // Reject requests that do not carry a signature header at all
    if (!signature) {
        console.log('info', '<<< checkSignature()');
        return false;
    }
    const hash = crypto.createHmac('sha256', signingSecret).update(body).digest('hex');
    const signatureHash = signature.split('=');
    if (signatureHash[1] === hash) {
        console.log('info', '<<< checkSignature()');
        return true;
    }
    console.log('info', '<<< checkSignature()');
    return false;
}

/**
 * Generate a response for API Gateway
 * @param {number} statusCode - HTTP status code to return
 * @param {string} detail - the message detail to return
 * @returns {object} - the formatted response object
 */
function responseToApiGw(statusCode, detail) {
    if (!statusCode) {
        throw new TypeError('responseToApiGw() expects at least argument statusCode');
    }
    if (statusCode !== 200 && !detail) {
        throw new TypeError('responseToApiGw() expects at least arguments statusCode and detail');
    }
    let body = {};
    if (statusCode === 200 && detail) {
        body = {
            statusCode: statusCode,
            message: detail
        };
    } else if (statusCode === 200 && !detail) {
        body = {
            statusCode: statusCode
        };
    } else {
        body = {
            statusCode: statusCode,
            fault: detail
        };
    }
    let response = {
        statusCode: statusCode,
        body: JSON.stringify(body),
        headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': 'POST, GET',
            'Access-Control-Allow-Headers': 'Origin, X-Requested-With, Content-Type, Accept'
        }
    };
    return response;
}
The Lambda Function uses two environment variables:
- GITHUB_SECRET: the "Secret" set when creating the GitHub webhook
- S3BUCKET: the S3 Bucket used to save the source code
Because axios is used, this dependency needs to be downloaded locally in advance (crypto is a built-in Node.js module and aws-sdk is included in the Lambda Node.js runtime, so neither needs to be installed separately). Execute the following command:
npm install axios
You will get a file structure containing index.js alongside a node_modules directory with the downloaded dependencies.
Package these files into a ZIP package, then create an Amazon Lambda Function through the Amazon Lambda Console.
The main configuration is as follows:
- Function name: codepipeline-github-integration
- Runtime: Node.js 14.x
- Environment variables: create the two variables GITHUB_SECRET and S3BUCKET
- Code source: upload the ZIP package generated in the previous step to this Amazon Lambda Function
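For reference, the packaging and function creation can also be sketched from the command line as follows (the IAM role ARN, webhook secret, and bucket name are hypothetical placeholders; the role must allow writing to the S3 Bucket and to CloudWatch Logs):
# Package the function code together with its dependencies
zip -r function.zip index.js node_modules
# Create the Lambda Function (role ARN, secret and bucket name are placeholders)
aws lambda create-function \
    --function-name codepipeline-github-integration \
    --runtime nodejs14.x \
    --handler index.handler \
    --role arn:aws-cn:iam::111122223333:role/codepipeline-github-lambda-role \
    --zip-file fileb://function.zip \
    --environment "Variables={GITHUB_SECRET=<your-webhook-secret>,S3BUCKET=<your-source-bucket>}"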
Step 3: Create an API Gateway
Create an API Gateway through the Amazon API Gateway Console to provide an external Rest interface for receiving GitHub webhook event requests and forwarding them to the Amazon Lambda Function created above.
The main configuration is as follows:
- Method: POST
- Integration Request -> Integration type: Lambda Function
- Integration Request -> Use Lambda Proxy integration: Checked
- Integration Request -> Lambda Function: codepipeline-github-integration
After the creation is complete, click "Actions" -> "Deploy API" to deploy the created Resources to a Stage.
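For reference, a rough CLI equivalent of this step might look like the following sketch (the API name, region cn-north-1, account ID, and stage name are hypothetical placeholders):
API_ID=$(aws apigateway create-rest-api --name github-webhook --query 'id' --output text)
ROOT_ID=$(aws apigateway get-resources --rest-api-id "$API_ID" --query 'items[0].id' --output text)
# POST method on the root resource, proxied to the Lambda Function (Lambda proxy integration)
aws apigateway put-method --rest-api-id "$API_ID" --resource-id "$ROOT_ID" \
    --http-method POST --authorization-type NONE
aws apigateway put-integration --rest-api-id "$API_ID" --resource-id "$ROOT_ID" \
    --http-method POST --type AWS_PROXY --integration-http-method POST \
    --uri "arn:aws-cn:apigateway:cn-north-1:lambda:path/2015-03-31/functions/arn:aws-cn:lambda:cn-north-1:111122223333:function:codepipeline-github-integration/invocations"
# Allow API Gateway to invoke the Lambda Function
aws lambda add-permission --function-name codepipeline-github-integration \
    --statement-id apigateway-invoke --action lambda:InvokeFunction \
    --principal apigateway.amazonaws.com
# Deploy the API to a stage
aws apigateway create-deployment --rest-api-id "$API_ID" --stage-name prod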
Step 4: Create a GitHub webhook
Go to the GitHub repository we need to integrate, select "Settings" -> "Webhooks" -> "Add webhook"
Enter the main parameters:
- Payload URL: the Invoke URL of the API created in Step 3, which can be obtained through "API Gateway console" -> "API" -> "Stages" -> "Invoke URL"
- Content type: application/json
- Secret: a password for the webhook, which must be consistent with the GITHUB_SECRET environment variable of the Lambda Function above
After the creation is complete, we can push a commit to the repository and check whether a .zip file has been uploaded to the Amazon S3 Bucket. If so, the integration is working.
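If you want to test the setup without pushing a real commit, you can also simulate a minimal push event yourself. The sketch below assumes a public repository, the Invoke URL and Secret from the previous steps as placeholders, and that the repository and branch in the body actually exist:
SECRET='<your-webhook-secret>'
URL='https://<api-id>.execute-api.cn-north-1.amazonaws.com.cn/<stage>'
BODY='{"ref":"refs/heads/main","repository":{"full_name":"my-org/my-repo"}}'
# Sign the body the same way GitHub does: HMAC-SHA256 with the webhook secret
SIG="sha256=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$SECRET" | awk '{print $NF}')"
curl -X POST "$URL" \
    -H 'Content-Type: application/json' \
    -H "X-Hub-Signature-256: $SIG" \
    -d "$BODY"
# Verify that the Lambda Function uploaded the source archive to the bucket
aws s3 ls "s3://<your-source-bucket>/my-org/my-repo/"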
Step 5: Create a CodePipeline project
Create a CodePipeline project through the Amazon CodePipeline Console. Select Amazon S3 as the Source Provider in the "Add source stage".
Main configuration:
- Source provider: Amazon S3
- Bucket: the S3 Bucket created in Step 1 to save the source code
- S3 object key: the key under which the Lambda Function saves the source code, i.e. <repository full name>/<branch>.zip (for example my-org/my-repo/main.zip)
At this point, the work of making GitHub a Source Provider for Amazon CodePipeline is complete.
Question 2: Sometimes the Beijing/Ningxia region fails to connect to GitHub due to network reasons
At present, some network problems may be encountered when connecting to GitHub from the Beijing/Ningxia region, causing the Lambda Function created in the above steps to fail to connect to GitHub. The following is a solution to such problems.
Create a NAT instance in your VPC. The NAT instance can reach the source code hosting server (such as the GitHub git server) through the enterprise's private line or a registered VPN service. Then add the IP CIDRs of the source code hosting server to the VPC routing table, so that network traffic destined for these CIDRs is forwarded to the NAT instance, which requests the source code server over a stable and compliant connection.
With a custom NAT instance, the network request routing is shown in the following figure.
The community's Simple NAT project (https://github.com/zxkane/snat/tree/main/example), implemented as infrastructure as code (Amazon CDK), can help you quickly create such a NAT instance in your VPC; you can refer to it.
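As a rough illustration of the routing change (the instance ID, route table ID, and CIDR below are placeholders; GitHub publishes its current address ranges at https://api.github.com/meta):
# Disable source/destination checking so the instance can forward traffic
aws ec2 modify-instance-attribute --instance-id <nat-instance-id> --no-source-dest-check
# Send traffic destined for GitHub's address range (example CIDR) to the NAT instance
aws ec2 create-route --route-table-id <route-table-id> \
    --destination-cidr-block 140.82.112.0/20 \
    --instance-id <nat-instance-id>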
Summary
Through the two techniques described above, a stable and seamless integration of Amazon CodePipeline and GitHub can be achieved in the Beijing/Ningxia regions. If you need to integrate with other source code management services such as GitLab, Bitbucket, or even Gitee, you can also refer to the techniques described in this article.
Author of this article
Deng Mingtong
Amazon Cloud Technology Innovation Solution Architect
He has worked at NEC, Lucent, Amazon e-commerce, and other companies, has more than 15 years of experience in software development and architecture design, and has rich experience in microservices, containers, DevOps, and other fields.