Dataset Introduction
There are 506 samples in total, split into 404 training samples and 102 test samples.
Each sample has 13 different features (for example, per-capita crime rate and average number of rooms per dwelling).
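As a quick sanity check, the shapes can be inspected right after loading; the following minimal sketch (assuming TensorFlow's bundled Keras datasets are available) prints the train/test split and the feature count:

from tensorflow import keras

(train_data, train_labels), (test_data, test_labels) = keras.datasets.boston_housing.load_data()
print(train_data.shape)   # (404, 13): 404 training samples, 13 features each
print(test_data.shape)    # (102, 13): 102 test samples
print(train_labels[:5])   # labels are house prices in thousands of dollars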
Tips
Notes
If the dataset fails to download, you can get it from my GitHub: https://github.com/MartinLwx/ML-DL
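Since keras.datasets.boston_housing.load_data() caches the dataset under ~/.keras/datasets, one way to work offline is to copy a manually downloaded boston_housing.npz into that directory. This is only a sketch and assumes the file is the original archive (Keras verifies a file hash and may re-download otherwise):

import os
import shutil
from tensorflow import keras

local_file = "boston_housing.npz"  # hypothetical path to the manually downloaded archive
cache_dir = os.path.expanduser("~/.keras/datasets")
os.makedirs(cache_dir, exist_ok=True)
shutil.copy(local_file, os.path.join(cache_dir, "boston_housing.npz"))

# load_data() now finds the cached file and skips the download
(train_data, train_labels), (test_data, test_labels) = keras.datasets.boston_housing.load_data()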
Code
from __future__ import absolute_import, division, print_function
import tensorflow as tf
from tensorflow import keras
import numpy as np
boston_housing = keras.datasets.boston_housing
(train_data, train_labels), (test_data, test_labels) = boston_housing.load_data()
# Shuffle the training set
order = np.argsort(np.random.random(train_labels.shape))
train_data = train_data[order]
train_labels = train_labels[order]
# Do not use the test set when computing the mean and standard deviation
mean = train_data.mean(axis=0)
std = train_data.std(axis=0)
train_data = (train_data - mean) / std
test_data = (test_data - mean) / std
# Wrap model construction in a function because early stopping is used later,
# so a fresh model can be built for each training run
def build_model():
    model = keras.Sequential([
        keras.layers.Dense(64, activation=tf.nn.relu,
                           input_shape=(train_data.shape[1],)),
        keras.layers.Dense(64, activation=tf.nn.relu),
        keras.layers.Dense(1)
    ])
    optimizer = tf.train.RMSPropOptimizer(0.001)  # TF 1.x API; in TF 2.x use keras.optimizers.RMSprop(0.001)
    model.compile(loss='mse',
                  optimizer=optimizer,
                  metrics=['mae'])
    return model
EPOCHS = 500  # number of training epochs (the referenced tutorial uses 500)
model = build_model()
history = model.fit(train_data, train_labels, epochs=EPOCHS,
                    validation_split=0.2, verbose=0)
# evaluate() returns the loss and the MAE (mean absolute error)
model.evaluate(test_data, test_labels)  # e.g. [16.7056874293907, 2.5310279341305004]
model = build_model()
# The patience parameter is the number of epochs to wait for improvement before stopping
early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=20)
history = model.fit(train_data, train_labels, epochs=EPOCHS,
                    validation_split=0.2, verbose=0,
                    callbacks=[early_stop])
model.evaluate(test_data, test_labels)  # e.g. [21.388992309570313, 2.9450648532194248]
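To compare the two runs, the per-epoch metrics recorded in the history object can be plotted. This is a sketch assuming matplotlib is installed; note that the metric key name differs between TensorFlow versions ('mean_absolute_error' in older releases, 'mae' in newer ones):

import matplotlib.pyplot as plt

# history.history is a dict of per-epoch metrics recorded by model.fit()
mae_key = 'mean_absolute_error' if 'mean_absolute_error' in history.history else 'mae'
plt.plot(history.history[mae_key], label='Train MAE')
plt.plot(history.history['val_' + mae_key], label='Validation MAE')
plt.xlabel('Epoch')
plt.ylabel('Mean Absolute Error [$1000s]')
plt.legend()
plt.show()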
References
https://www.tensorflow.org/tutorials/keras/basic_regression?hl=zh-cn
Original post: https://www.cnblogs.com/MartinLwx/p/10078340.html