
TFboy养成记: Multi-Layer Perceptron (MLP)



The content here is summarized from Mofan (莫烦)'s video tutorials.

The multi-layer perceptron code here builds a simple three-layer neural network: an input layer, a hidden layer, and an output layer. The goal of the code is to fit a quadratic curve. To keep the data looking natural, Gaussian noise with mean 0 and stddev 0.05 is added.
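
For reference, this is how the training data is generated in the full code below: 300 points on [-1, 1] are pushed through y = x^2 - 0.5 and then perturbed with that noise.

import numpy as np

x_data = np.linspace(-1, 1, 300)[:, np.newaxis]  # 300 samples, shape (300, 1)
noise = np.random.normal(0, 0.05, x_data.shape)  # mean 0, stddev 0.05
y_data = np.square(x_data) - 0.5 + noise         # quadratic target plus noise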

The code for adding a layer:

def addLayer(inputs, inSize, outSize, activ_func=None):
    # inputs is the layer input; inSize and outSize are the input and output sizes.
    # activ_func is the activation function; it defaults to None because the
    # output layer has no activation.
    with tf.name_scope(name="layer"):
        with tf.name_scope("weights"):
            Weights = tf.Variable(tf.random_normal([inSize, outSize]), name="W")
        bias = tf.Variable(tf.zeros([1, outSize]), name="bias")
        W_plus_b = tf.matmul(inputs, Weights) + bias
        if activ_func is None:
            return W_plus_b
        else:
            return activ_func(W_plus_b)
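
A quick shape check may help here (a minimal NumPy sketch of my own, not from the original post): for a batch of N samples, inputs is [N, inSize] and Weights is [inSize, outSize], so the matmul yields [N, outSize], and the [1, outSize] bias broadcasts across the batch.

import numpy as np

inputs = np.zeros((4, 1))    # a hypothetical batch of 4 samples, inSize = 1
Weights = np.zeros((1, 10))  # inSize x outSize
bias = np.zeros((1, 10))     # broadcasts across the 4 rows
print((inputs @ Weights + bias).shape)  # (4, 10)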

The inputs:

with tf.name_scope(name="inputs"):  # name_scope is mainly so the graph groups nicely in TensorBoard
    xs = tf.placeholder(tf.float32, [None, 1], name="x_input")  # note: None, not -1
    ys = tf.placeholder(tf.float32, [None, 1], name="y_input")
l1 = addLayer(xs, 1, 10, activ_func=tf.nn.relu)
y_pre = addLayer(l1, 10, 1, activ_func=None)
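
About the "None, not -1" comment: in tf.placeholder, None marks a dimension of unknown size (here, the batch size), while -1 is the wildcard used by np.reshape / tf.reshape; the two are not interchangeable. A small illustration (the array name is made up):

import numpy as np

batch = np.random.rand(50, 1)  # any row count fits a [None, 1] placeholder
# sess.run(y_pre, feed_dict={xs: batch}) accepts 50 rows or 300 alike
flat = batch.reshape(-1)       # -1 belongs to reshape, not to placeholder shapes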

The rest of the code:

A few things worth noting (see the comments):

with tf.name_scope("loss"):
    loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - y_pre),
                                        reduction_indices=[1]))
    # reduction_indices=[1] works like NumPy's axis argument: it picks which
    # dimension (rows vs. columns) to sum over. reduce_sum matters mainly for
    # matrices; for a plain vector it could be omitted.
with tf.name_scope("train"):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

# In later versions, initialize_all_variables() is deprecated in favor of
# tf.global_variables_initializer().
init = tf.initialize_all_variables()
# This step is important: whenever the graph contains Variables, you must run
# sess.run(init) before training.
with tf.Session() as sess:
    # The FileWriter must be created inside the session block, where sess exists.
    writer = tf.summary.FileWriter("logs/", sess.graph)
    sess.run(init)
    for i in range(1000):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        if i % 50 == 0:
            # Whenever an op depends on a placeholder, remember to pass feed_dict.
            print(sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
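
The reduction_indices comment can be made concrete with a small NumPy sketch (mine, not the original author's; the error values are invented for illustration):

import numpy as np

err = np.array([[0.1], [0.2], [0.3]])        # plays the role of ys - y_pre, shape (3, 1)
per_sample = np.sum(np.square(err), axis=1)  # axis=1 ~ reduction_indices=[1], shape (3,)
loss = np.mean(per_sample)                   # average over the batch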

The full code:

# -*- coding: utf-8 -*-
"""
Created on Tue Jun 13 15:41:23 2017

@author: Jarvis
"""

import tensorflow as tf
import numpy as np

def addLayer(inputs, inSize, outSize, activ_func=None):
    with tf.name_scope(name="layer"):
        with tf.name_scope("weights"):
            Weights = tf.Variable(tf.random_normal([inSize, outSize]), name="W")
        bias = tf.Variable(tf.zeros([1, outSize]), name="bias")
        W_plus_b = tf.matmul(inputs, Weights) + bias
        if activ_func is None:
            return W_plus_b
        else:
            return activ_func(W_plus_b)

x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
noise = np.random.normal(0, 0.05, x_data.shape)
y_data = np.square(x_data) - 0.5 + noise

with tf.name_scope(name="inputs"):
    xs = tf.placeholder(tf.float32, [None, 1], name="x_input")  # note: None, not -1
    ys = tf.placeholder(tf.float32, [None, 1], name="y_input")
l1 = addLayer(xs, 1, 10, activ_func=tf.nn.relu)
y_pre = addLayer(l1, 10, 1, activ_func=None)
with tf.name_scope("loss"):
    loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - y_pre),
                                        reduction_indices=[1]))
with tf.name_scope("train"):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

init = tf.initialize_all_variables()
with tf.Session() as sess:
    writer = tf.summary.FileWriter("logs/", sess.graph)  # created here so sess exists
    sess.run(init)
    for i in range(1000):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        if i % 50 == 0:
            print(sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
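
One follow-up the name_scope comments imply but the post does not show: once the FileWriter has written the graph to logs/, you can inspect it by running tensorboard --logdir logs/ and opening the address it prints (http://localhost:6006 by default).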

 


Original post: http://www.cnblogs.com/silence-tommy/p/7039702.html
