
Build Your Own Neural Network

Date: 2019-06-07 19:18:11


The script below builds a tiny two-layer network from scratch with NumPy: three inputs feed a 4-unit hidden layer, which feeds a single sigmoid output, and both weight matrices are trained by hand-written backpropagation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # expects x to already be a sigmoid output: sigma'(z) = sigma(z) * (1 - sigma(z))
    return x * (1.0 - x)

class NeuralNetwork:
    def __init__(self, x, y):
        self.input = x
        self.weights1 = np.random.rand(self.input.shape[1], 4)  # input -> hidden (4 units)
        self.weights2 = np.random.rand(4, 1)                    # hidden -> output
        self.y = y
        self.output = np.zeros(self.y.shape)

    def feedforward(self):
        self.layer1 = sigmoid(np.dot(self.input, self.weights1))
        self.output = sigmoid(np.dot(self.layer1, self.weights2))

    def backprop(self):
        # application of the chain rule to find the derivative of the loss function
        # with respect to weights2 and weights1
        d_weights2 = np.dot(self.layer1.T, (2 * (self.y - self.output) * sigmoid_derivative(self.output)))
        d_weights1 = np.dot(self.input.T, (np.dot(2 * (self.y - self.output) * sigmoid_derivative(self.output), self.weights2.T) * sigmoid_derivative(self.layer1)))

        # update the weights with the derivative (slope) of the loss function
        self.weights1 += d_weights1
        self.weights2 += d_weights2

if __name__ == "__main__":
    X = np.array([[0, 0, 1],
                  [0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 1]])
    y = np.array([[0], [1], [1], [0]])
    nn = NeuralNetwork(X, y)

    for i in range(1500):
        nn.feedforward()
        nn.backprop()

    print(nn.output)
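The chain-rule comment in backprop expands as follows; this derivation is my reading of the code, not text from the original post. Writing the forward pass and the sum-of-squares loss as

$$a_1 = \sigma(X W_1), \qquad \hat{y} = \sigma(a_1 W_2), \qquad \mathcal{L} = \sum (y - \hat{y})^2,$$

the chain rule gives (with $\odot$ the elementwise product, and using $\sigma'(z) = \sigma(z)\,(1 - \sigma(z))$):

$$\frac{\partial \mathcal{L}}{\partial W_2} = -\, a_1^{\top} \big[\, 2 (y - \hat{y}) \odot \hat{y} (1 - \hat{y}) \,\big],$$

$$\frac{\partial \mathcal{L}}{\partial W_1} = -\, X^{\top} \Big( \big[\, 2 (y - \hat{y}) \odot \hat{y} (1 - \hat{y}) \,\big] W_2^{\top} \odot a_1 (1 - a_1) \Big).$$

d_weights2 and d_weights1 are exactly the bracketed quantities, i.e. the negative gradients, which is why the update adds them: self.weights += d_weights is plain gradient descent with an implicit learning rate of 1.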

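To check that training converges, one can log the loss every few hundred iterations instead of only printing the final output. A minimal sketch, assuming the script above has already defined NeuralNetwork, X, and y (the logging is mine, not part of the original post):

nn = NeuralNetwork(X, y)
for i in range(1500):
    nn.feedforward()
    nn.backprop()
    if i % 300 == 0:
        # the sum-of-squares loss that the backprop gradients are derived from
        loss = np.sum((y - nn.output) ** 2)
        print(f"iteration {i}: loss = {loss:.4f}")

print(nn.output)  # should end up close to the targets [[0], [1], [1], [0]]

With 1500 iterations the four outputs typically land near 0 or 1, matching y.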

Original source: https://www.cnblogs.com/gshang/p/10988735.html
