
Improving DNNs Hyperparameter tuning-Regularization and Optimization(week2)Regularization

Posted: 2020-06-09 18:29:42


Regularization

Deep learning models have so much flexibility and capacity that overfitting can be a serious problem if the training dataset is not big enough. The model may do well on the training set, but the learned network doesn't generalize to new examples it has never seen!

You will learn to: Use regularization in your deep learning models.
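To make the idea concrete before the assignment itself: one common form, L2 regularization, adds a penalty proportional to the sum of squared weights to the cost. The sketch below is a generic illustration with made-up parameter values, not the assignment's `compute_cost_with_regularization` code; the helper name `l2_penalty` and the sample arrays are my own choices.

```python
import numpy as np

def l2_penalty(parameters, lambd, m):
    # Sum of squared weight entries across all layers, scaled by lambda / (2m).
    # `parameters` maps names like "W1", "b1" to numpy arrays; only the W
    # matrices are penalized, biases are left out (the usual convention).
    squared = sum(np.sum(np.square(W))
                  for name, W in parameters.items()
                  if name.startswith("W"))
    return (lambd / (2 * m)) * squared

# Tiny made-up network parameters for illustration only.
params = {"W1": np.array([[1.0, -2.0], [0.5, 0.0]]),
          "b1": np.zeros((2, 1))}
penalty = l2_penalty(params, lambd=0.7, m=5)
print(penalty)
```

This term is simply added to the cross-entropy cost; during backpropagation each `dW` then picks up an extra `(lambd / m) * W` term.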

Let's first import the packages you are going to use.

# import packages
import numpy as np
import matplotlib.pyplot as plt
from reg_utils import sigmoid, relu, plot_decision_boundary, initialize_parameters, load_2D_dataset, predict_dec
from reg_utils import compute_cost, predict, forward_propagation, backward_propagation, update_parameters
import sklearn
import sklearn.datasets
import scipy.io
from testCases import *

%matplotlib inline
plt.rcParams['figure.figsize'] = (7.0, 4.0) # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'
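The other technique this assignment covers is dropout. As a self-contained sketch of the inverted-dropout mechanic (the `keep_prob` value, seed, and array shape here are illustrative choices, not the assignment's), each activation is kept with probability `keep_prob` and the survivors are rescaled so the expected activation is unchanged:

```python
import numpy as np

np.random.seed(1)
keep_prob = 0.8                               # assumed keep probability
A = np.random.randn(3, 4)                     # activations of some hidden layer
D = np.random.rand(*A.shape) < keep_prob      # boolean dropout mask
A_dropped = (A * D) / keep_prob               # zero dropped units, rescale the rest
```

Dividing by `keep_prob` is what makes this "inverted" dropout: no rescaling is needed at test time, where dropout is simply turned off.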


Original source: https://www.cnblogs.com/douzujun/p/13074312.html
