
Deep Learning: Building Models

Posted: 2018-10-23 20:56:03


  1. Standard model
    from keras.utils import plot_model
    from keras.models import Model
    from keras.layers import Input
    from keras.layers import Dense

    # input layer
    visible = Input(shape=(10,))
    # three fully connected hidden layers
    hidden1 = Dense(10, activation='relu')(visible)
    hidden2 = Dense(20, activation='relu')(hidden1)
    hidden3 = Dense(10, activation='relu')(hidden2)
    # binary classification output
    output = Dense(1, activation='sigmoid')(hidden3)
    model = Model(inputs=visible, outputs=output)
    # summarize layers
    print(model.summary())
    # plot graph
    plot_model(model, to_file='multilayer_perceptron_graph.png')

    (Figure: multilayer_perceptron_graph.png, the architecture plot written by plot_model.)
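Since model.summary() reports per-layer parameter counts, a quick way to sanity-check the MLP above is to recompute them by hand: a Dense layer with n inputs and m units holds (n + 1) * m parameters (the +1 is the bias). A minimal pure-Python sketch, no Keras required:

```python
def dense_params(n_in, n_out):
    # weights (n_in * n_out) plus one bias per unit
    return (n_in + 1) * n_out

# layer widths of the MLP above: 10 -> 10 -> 20 -> 10 -> 1
sizes = [10, 10, 20, 10, 1]
per_layer = [dense_params(a, b) for a, b in zip(sizes, sizes[1:])]
total = sum(per_layer)
print(per_layer, total)  # [110, 220, 210, 11] 551
```

The 551 total should match the "Total params" line that model.summary() prints for this model.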

    2. Shared feature extractor model
    from keras.utils import plot_model
    from keras.models import Model
    from keras.layers import Input
    from keras.layers import Dense
    from keras.layers import LSTM
    from keras.layers import concatenate

    # input layer: sequences of length 100 with one feature
    visible = Input(shape=(100, 1))
    # shared feature extraction layer
    extract1 = LSTM(10)(visible)
    # first interpretation branch
    interp1 = Dense(10, activation='relu')(extract1)
    # second, deeper interpretation branch on the same features
    interp11 = Dense(10, activation='relu')(extract1)
    interp12 = Dense(20, activation='relu')(interp11)
    interp13 = Dense(10, activation='relu')(interp12)
    # merge the two branches
    merge = concatenate([interp1, interp13])
    output = Dense(1, activation='sigmoid')(merge)
    model = Model(inputs=visible, outputs=output)
    # summarize layers
    print(model.summary())
    # plot graph
    plot_model(model, to_file='shared_feature_extractor.png')

    (Figure: shared_feature_extractor.png, the architecture plot written by plot_model.)
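The same hand-count works for this branched model. An LSTM with n input features and u units holds 4 * (n + u + 1) * u parameters (four gates, each with input weights, recurrent weights, and a bias), and concatenate simply joins the branch outputs along the last axis, so the merged width is the sum of the branch widths. A pure-Python sketch:

```python
def lstm_params(n_in, units):
    # four gates, each (n_in + units + 1) * units parameters
    return 4 * (n_in + units + 1) * units

def dense_params(n_in, n_out):
    return (n_in + 1) * n_out

extract = lstm_params(1, 10)            # LSTM(10) on input shape (100, 1)
branch_a = dense_params(10, 10)         # interp1
branch_b = (dense_params(10, 10)        # interp11
            + dense_params(10, 20)      # interp12
            + dense_params(20, 10))     # interp13
merged_width = 10 + 10                  # concatenate([interp1, interp13])
head = dense_params(merged_width, 1)    # final sigmoid layer
total = extract + branch_a + branch_b + head
print(extract, merged_width, total)     # 480 20 1151
```

Again, 1151 should agree with the "Total params" reported by model.summary().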

    3. Multi-output model
    from keras.utils import plot_model
    from keras.models import Model
    from keras.layers import Input
    from keras.layers import Dense
    from keras.layers import LSTM
    from keras.layers import TimeDistributed

    # input layer
    visible = Input(shape=(100, 1))
    # feature extraction
    extract = LSTM(10, return_sequences=True)(visible)
    # classification output
    class11 = LSTM(10)(extract)
    class12 = Dense(10, activation='relu')(class11)
    output1 = Dense(1, activation='sigmoid')(class12)
    # sequence output
    output2 = TimeDistributed(Dense(1, activation='linear'))(extract)
    # output
    model = Model(inputs=visible, outputs=[output1, output2])
    # summarize layers
    print(model.summary())
    # plot graph
    plot_model(model, to_file='multiple_outputs.png')

    (Figure: multiple_outputs.png, the architecture plot written by plot_model.)
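When a Model has multiple outputs, compile() accepts one loss per output and an optional loss_weights list, and the total loss Keras optimizes is the weighted sum of the per-output losses. A small pure-Python sketch of that combination (the loss values and the weights 1.0 and 0.2 are illustrative, not from the original post):

```python
def combined_loss(losses, loss_weights):
    # Keras-style total: each output's loss scaled by its weight, then summed
    return sum(w * l for l, w in zip(losses, loss_weights))

# e.g. binary cross-entropy on output1, MSE on output2 (values illustrative)
total = combined_loss([0.69, 0.50], [1.0, 0.2])
print(round(total, 2))  # 0.79
```

With equal weights (the default), the sequence output would contribute as much to training as the classification output; loss_weights is how you rebalance that.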


    Original article: http://blog.51cto.com/12597095/2308050
