Abstract: a brief introduction to multivariate linear regression, together with two practical tips for the gradient descent procedure.

This article is shared from the HUAWEI CLOUD Community "[Follow Mi to Machine Learning!] Multivariate Linear Regression (1)", original author: Skytier.

1 Multi-dimensional features


Now we have four feature variables, for example the size of the house (in square meters), the number of floors, the number of bedrooms, and the age of the house, so the hypothesis becomes:

h_θ(x) = θ_0 + θ_1x_1 + θ_2x_2 + θ_3x_3 + θ_4x_4

Concretely, this hypothesis predicts the price of a house in units of 10,000 yuan: the base price might be 800,000 yuan, plus 1,000 yuan for every additional square meter; the price then keeps rising with the number of floors and with the number of bedrooms, but falls as the house gets older. So, doesn't this example make sense?
More generally, with n feature variables and the convention x_0 = 1, the hypothesis can be written as

h_θ(x) = θ_0x_0 + θ_1x_1 + θ_2x_2 + ... + θ_nx_n = θ^T x

where θ and x are both (n+1)-dimensional vectors.
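As a small numerical illustration (the values of θ_0 and θ_1 follow the 800,000 yuan base price and 1,000 yuan per square meter from the example above; the remaining coefficients are made up), the hypothesis is just a dot product:

import numpy as np

theta = np.array([80.0, 0.1, 2.0, 1.0, -0.5])   # [theta_0, ..., theta_4], price in units of 10,000 yuan
x = np.array([1.0, 100.0, 2.0, 3.0, 10.0])      # x_0 = 1, then: 100 m², 2 floors, 3 bedrooms, 10 years old
price = theta @ x                               # h_θ(x) = θ^T x
print(price)                                    # predicted price in units of 10,000 yuan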

2 Multivariable gradient descent

Well, the form of the multivariate linear regression hypothesis is now in place; in other words, the ingredients are ready, and the next step is to start cooking!
As in the single-variable case, we build a cost function equal to the sum of the squared modeling errors,

J(θ_0, θ_1, ..., θ_n) = (1/(2m)) · Σ_{i=1..m} ( h_θ(x^(i)) − y^(i) )²

and look for the parameters that minimize it. Gradient descent keeps the same shape as before:

Repeat {
    θ_j := θ_j − α · ∂J(θ)/∂θ_j
}   (simultaneously updating θ_j for j = 0, 1, ..., n)

Taking the partial derivative of J(θ) with respect to each θ_j and substituting it in, the update rule becomes:
θ_j := θ_j − α · (1/m) · Σ_{i=1..m} ( h_θ(x^(i)) − y^(i) ) · x_j^(i)   (simultaneously update θ_j for j = 0, 1, ..., n)
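A minimal batch gradient descent sketch of this update rule (my own code, using numpy matrices so that * is matrix multiplication, the same convention as the computeCost function later in this article):

import numpy as np

def gradientDescent(X, y, theta, alpha, iters):
    # X: m x (n+1) design matrix (numpy matrix, first column all ones)
    # y: m x 1 target vector, theta: 1 x (n+1) row vector, alpha: learning rate
    m = len(X)
    cost_history = np.zeros(iters)
    for k in range(iters):
        error = (X * theta.T) - y                               # m x 1 vector of h_θ(x) - y
        cost_history[k] = np.sum(np.power(error, 2)) / (2 * m)  # J(θ) before this update
        theta = theta - alpha * (error.T * X) / m               # simultaneous update of every θ_j
    return theta, cost_history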

Code example: calculating the cost function

In vectorized form, with X the m × (n+1) design matrix, y the m × 1 target vector, and θ a 1 × (n+1) row vector, the cost is

J(θ) = (1/(2m)) · (Xθ^T − y)^T (Xθ^T − y)

Python code:

import numpy as np

def computeCost(X, y, theta):
    # X: m x (n+1) design matrix with a leading column of ones (numpy matrix)
    # y: m x 1 vector of targets, theta: 1 x (n+1) row vector of parameters
    inner = np.power(((X * theta.T) - y), 2)   # squared error of every training example
    return np.sum(inner) / (2 * len(X))        # J(θ) = (1/(2m)) * sum of squared errors
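A quick usage sketch with made-up numbers (purely illustrative; X must be a numpy matrix with a leading column of ones so that * performs matrix multiplication):

X = np.matrix([[1.0, 2104.0, 3.0],
               [1.0, 1416.0, 2.0]])       # two examples, two raw features each, plus the 1s column
y = np.matrix([[400.0], [232.0]])         # target prices
theta = np.matrix([[0.0, 0.0, 0.0]])      # 1 x (n+1) row vector of parameters
print(computeCost(X, y, theta))           # cost of the all-zero hypothesis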

3 Gradient descent in practice: feature scaling

Next, let's look at some practical techniques for making gradient descent work well. The first is called feature scaling. The idea is that when the features have very different ranges (for example, the size of the house in square meters versus the number of bedrooms), the contours of the cost function become very elongated and gradient descent can take a long time to converge. Scaling the features so that they all lie in roughly the same range, say approximately −1 ≤ x_j ≤ 1, makes the contours much rounder and lets gradient descent converge much faster. A common way to do this is mean normalization: replace each feature with

x_j := (x_j − μ_j) / s_j

where μ_j is the mean of feature j over the training set and s_j is its range (max − min) or its standard deviation.
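A minimal sketch of mean normalization (my own helper, not code from the article); it should be applied to the raw feature columns before the column of ones is added:

import numpy as np

def featureNormalize(X):
    # scale every feature (column) of X to a comparable range:
    # x_j := (x_j - mu_j) / s_j, with mu_j the column mean and s_j the column standard deviation
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    X_norm = (X - mu) / sigma
    return X_norm, mu, sigma    # keep mu and sigma so that new inputs can be scaled the same way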

4 Gradient descent in practice: learning rate

The second practical issue is the learning rate α in the update rule θ_j := θ_j − α · ∂J(θ)/∂θ_j: how do we check that gradient descent is working correctly, and how do we choose α?

The number of iterations gradient descent needs in order to converge varies from model to model and cannot be predicted in advance. What we can do is plot the cost function against the number of iterations and observe at which point the curve flattens out, i.e. when the algorithm has converged.

There are also automatic convergence tests, for example declaring convergence once the decrease in the cost function within one iteration falls below some threshold (say 0.001). However, choosing an appropriate threshold is quite difficult, so to check whether gradient descent has converged it is usually better to look at the plot rather than rely on an automatic test.
To summarize: if the learning rate α is too small, gradient descent converges very slowly; if α is too large, J(θ) may not decrease on every iteration and may even fail to converge. In practice it usually works well to try a range of values such as ..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, ..., plot J(θ) against the number of iterations for each, and pick the largest α that still makes J(θ) decrease rapidly.
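A sketch of how this comparison can be done in code (reusing the hypothetical gradientDescent function and the X, y matrices sketched earlier): run gradient descent for each candidate α and plot J(θ) against the number of iterations:

import numpy as np
import matplotlib.pyplot as plt

for alpha in [0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0]:
    theta0 = np.matrix(np.zeros((1, X.shape[1])))             # start from all-zero parameters
    _, cost_history = gradientDescent(X, y, theta0, alpha, 400)
    plt.plot(cost_history, label="alpha = %g" % alpha)        # J(θ) versus iteration number

plt.xlabel("number of iterations")
plt.ylabel("J(theta)")
plt.legend()
plt.show()   # a curve that flattens out has converged; a rising curve means alpha is too large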

Click "Follow" to be the first to learn about HUAWEI CLOUD's latest technologies~

