
[theano] Getting started: a simple training example

Posted: 2015-01-17 20:47:21


This example is relatively simple and can serve as a template for writing training code: it fits a logistic regression classifier to a randomly generated dataset using plain gradient descent.

#!/usr/bin/env python
# coding=utf-8
import numpy
import theano
import theano.tensor as T
rng = numpy.random

N = 400
feats = 784
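# Generate a toy training set D: N random feature vectors (N x feats)
# paired with N random 0/1 class labels.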
D = (rng.randn(N, feats), rng.randint(size=N, low=0, high=2))
training_steps = 10000
# Declare Theano symbolic variables
x = T.matrix("x")
y = T.vector("y")
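# w and b are shared variables: their values persist across function calls
# and are updated in place by the `updates` rule of `train` below.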
w = theano.shared(rng.randn(feats), name="w")
b = theano.shared(0., name="b")
print "Initial model:"
print w.get_value(), b.get_value()

# Construct Theano expression graph
p_1 = 1 / (1 + T.exp(-T.dot(x, w) - b))   # Probability that target = 1
prediction = p_1 > 0.5                    # The prediction thresholded
xent = -y * T.log(p_1) - (1-y) * T.log(1-p_1) # Cross-entropy loss function
cost = xent.mean() + 0.01 * (w ** 2).sum()    # The cost to minimize (cross-entropy plus L2 penalty)
gw, gb = T.grad(cost, [w, b])                 # Compute the gradient of the cost
                                              # w.r.t. the weight vector w and bias b
# Compile
train = theano.function(
    inputs=[x, y],
    outputs=[prediction, xent],
    updates=((w, w - 0.1 * gw), (b, b - 0.1 * gb))
)
predict = theano.function(inputs=[x], outputs=prediction)

# Train
for i in range(training_steps):
    pred, err = train(D[0], D[1])

print("Final model:")
print(w.get_value(), b.get_value())
print("target values for D:", D[1])
print("prediction on D:", predict(D[0]))


Original source: http://www.cnblogs.com/taokongcn/p/4231008.html
