
Linear Regression: A Python Implementation

Posted: 2019-09-02 19:45:56


import numpy as np

def computer_error_for_give_point(w, b, points):
    # Compute the squared error between each observed y and the predicted
    # value w * x + b, accumulate it, and return the mean error.
    loss = 0
    for i in range(len(points)):
        x = points[i, 0]
        y = points[i, 1]
        loss += ((w * x + b) - y) ** 2
    return loss / float(len(points))
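
For reference, the same mean squared error can be written as a single vectorized NumPy expression. This is a minimal sketch, not part of the original post; it assumes points is already an np.array of shape (N, 2):

def compute_error_vectorized(w, b, points):
    # points[:, 0] holds the x values, points[:, 1] the observed y values
    predictions = w * points[:, 0] + b
    return np.mean((predictions - points[:, 1]) ** 2)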

# The function below performs only a single gradient update of w and b;
# the iterative loop further down can then apply it many times.

def get_gradient(w_current, b_current, points, LearningRate):
    N = len(points)
    w_gradient = 0
    b_gradient = 0
    for i in range(N):
        x = points[i, 0]
        y = points[i, 1]
        w_gradient += 2 / N * ((w_current * x + b_current) - y) * x
        b_gradient += 2 / N * ((w_current * x + b_current) - y)
    new_w = w_current - LearningRate * w_gradient
    new_b = b_current - LearningRate * b_gradient
    return new_w, new_b  # returned as a tuple
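
The update above comes from differentiating the mean squared loss with respect to w and b, which yields exactly the per-sample terms accumulated in the loop:

\frac{\partial L}{\partial w} = \frac{2}{N} \sum_{i=1}^{N} \big((w x_i + b) - y_i\big)\, x_i,
\qquad
\frac{\partial L}{\partial b} = \frac{2}{N} \sum_{i=1}^{N} \big((w x_i + b) - y_i\big)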

def gradeient_descent_run(w, b, points, learn_rate, iteration):
    points = np.array(points)
    LearnRate = learn_rate
    for i in range(iteration):
        w, b = get_gradient(w, b, points, LearnRate)
    return w, b
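
The training loop above never reports the loss. A small variant, sketched here and not part of the original post, can call computer_error_for_give_point every few iterations to confirm that the loss is actually decreasing:

def gradient_descent_run_logged(w, b, points, learn_rate, iteration):
    points = np.array(points)
    for i in range(iteration):
        w, b = get_gradient(w, b, points, learn_rate)
        if i % 10 == 0:  # log every 10 iterations
            print("iter %d: loss = %f" % (i, computer_error_for_give_point(w, b, points)))
    return w, b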

if __name__ == "__main__":
    initialize_w = 0
    initialize_b = 0
    points = [[10, 10], [9, 9], [8, 8], [7, 7], [6, 6], [5, 5], [4, 4], [3, 3], [2, 2], [1, 1]]
    w, b = gradeient_descent_run(initialize_w, initialize_b, points, 0.005, 100)
    print(w)
    print(b)
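
Since every sample point lies exactly on the line y = x, the true parameters are w = 1 and b = 0, so the printed values should move toward 1 and 0 as the iteration count grows. The final loss can be checked with the error function defined at the top (note that it indexes points as a 2-D array, so pass an np.array):

print(computer_error_for_give_point(w, b, np.array(points)))  # shrinks toward 0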


Original post: https://www.cnblogs.com/Salted-fish-turn-over/p/11448221.html
