
21天实战Caffe (5): Layer


Layer is the basic computational unit of Caffe. Every Layer has at least one input Blob (bottom Blob) and one output Blob (top Blob), and some Layers also carry weights and bias terms. A Layer computes in two directions: forward propagation (Forward) and backward propagation (Backward). The forward pass applies some transformation to the input Blobs (Layers with weights and biases apply those parameters as part of the transformation) and produces the output Blobs; the backward pass takes the diff of the output Blobs and computes the diff of the input Blobs, and Layers with weights and biases may also compute the diffs of their weight and bias Blobs.
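To make this concrete, here is a minimal sketch of a single layer written in prototxt, the text form of the LayerParameter message listed below. The layer name, blob names, and num_output value are invented for illustration and do not come from any particular network:

layer {
  name: "fc1"              # illustrative layer name
  type: "InnerProduct"     # a layer type that carries weights and a bias
  bottom: "data"           # input Blob read during Forward
  top: "fc1"               # output Blob written during Forward; its diff is consumed during Backward
  inner_product_param {
    num_output: 10         # example size only
  }
}

During Forward this layer reads the Blob named by bottom and writes its result to the Blob named by top; during Backward it consumes the top Blob's diff and produces the bottom Blob's diff together with the diffs of its weight and bias Blobs.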

Data structure description

// NOTE
// Update the next available ID when you add a new LayerParameter field.
//
// LayerParameter next available layer-specific ID: 147 (last added: recurrent_param)
message LayerParameter {
  optional string name = 1; // the layer name
  optional string type = 2; // the layer type
  repeated string bottom = 3; // the name of each bottom blob
  repeated string top = 4; // the name of each top blob

  // The train / test phase for computation (TRAIN or TEST).
  optional Phase phase = 10;

  // The amount of weight to assign each top blob in the objective.
  // Each layer assigns a default value, usually of either 0 or 1,
  // to each top blob: 0 means the top blob does not contribute to the loss,
  // 1 means it does.
  repeated float loss_weight = 5;

  // Specifies training parameters (multipliers on global learning constants,
  // and the name and other settings used for weight sharing).
  repeated ParamSpec param = 6;  // per-parameter training settings (see the example after the message definition)

  // The blobs containing the numeric parameters of the layer.
  repeated BlobProto blobs = 7;  // the Blobs holding this layer's numeric parameters (weights, biases)

  // Specifies whether to backpropagate to each bottom. If unspecified,
  // Caffe will automatically infer whether each input needs backpropagation
  // to compute parameter gradients. If set to true for some inputs,
  // backpropagation to those inputs is forced; if set false for some inputs,
  // backpropagation to those inputs is skipped.
  //
  // The size must be either 0 or equal to the number of bottoms.
  repeated bool propagate_down = 11;  // whether to backpropagate to each bottom Blob; if given, one entry per bottom Blob

  // Rules controlling whether and when a layer is included in the network,
  // based on the current NetState.  You may specify a non-zero number of rules
  // to include OR exclude, but not both.  If no include or exclude rules are
  // specified, the layer is always included.  If the current NetState meets
  // ANY (i.e., one or more) of the specified rules, the layer is
  // included/excluded.
  repeated NetStateRule include = 8;
  repeated NetStateRule exclude = 9;

  // Parameters for data pre-processing.
  optional TransformationParameter transform_param = 100;

  // Parameters shared by loss layers.
  optional LossParameter loss_param = 101;

  // Layer type-specific parameters.
  //
  // Note: certain layers may have more than one computational engine
  // for their implementation. These layers include an Engine type and
  // engine parameter for selecting the implementation.
  // The default for the engine is set by the ENGINE switch at compile-time.

  optional AccuracyParameter accuracy_param = 102;
  optional ArgMaxParameter argmax_param = 103;
  optional BatchNormParameter batch_norm_param = 139;
  optional BiasParameter bias_param = 141;
  optional ConcatParameter concat_param = 104;
  optional ContrastiveLossParameter contrastive_loss_param = 105;
  optional ConvolutionParameter convolution_param = 106;
  optional CropParameter crop_param = 144;
  optional DataParameter data_param = 107;
  optional DropoutParameter dropout_param = 108;
  optional DummyDataParameter dummy_data_param = 109;
  optional EltwiseParameter eltwise_param = 110;
  optional ELUParameter elu_param = 140;
  optional EmbedParameter embed_param = 137;
  optional ExpParameter exp_param = 111;
  optional FlattenParameter flatten_param = 135;
  optional HDF5DataParameter hdf5_data_param = 112;
  optional HDF5OutputParameter hdf5_output_param = 113;
  optional HingeLossParameter hinge_loss_param = 114;
  optional ImageDataParameter image_data_param = 115;
  optional InfogainLossParameter infogain_loss_param = 116;
  optional InnerProductParameter inner_product_param = 117;
  optional InputParameter input_param = 143;
  optional LogParameter log_param = 134;
  optional LRNParameter lrn_param = 118;
  optional MemoryDataParameter memory_data_param = 119;
  optional MVNParameter mvn_param = 120;
  optional ParameterParameter parameter_param = 145;
  optional PoolingParameter pooling_param = 121;
  optional PowerParameter power_param = 122;
  optional PReLUParameter prelu_param = 131;
  optional PythonParameter python_param = 130;
  optional RecurrentParameter recurrent_param = 146;
  optional ReductionParameter reduction_param = 136;
  optional ReLUParameter relu_param = 123;
  optional ReshapeParameter reshape_param = 133;
  optional ScaleParameter scale_param = 142;
  optional SigmoidParameter sigmoid_param = 124;
  optional SoftmaxParameter softmax_param = 125;
  optional SPPParameter spp_param = 132;
  optional SliceParameter slice_param = 126;
  optional TanHParameter tanh_param = 127;
  optional ThresholdParameter threshold_param = 128;
  optional TileParameter tile_param = 138;
  optional WindowDataParameter window_data_param = 129;
}
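As a rough illustration of how these fields are used in practice, here is a sketch of a fragment of a prototxt network definition. The layer and blob names, file paths, and numeric values are all invented for this example and are not taken from any real model:

# A data layer that exists only in the TRAIN phase (a NetStateRule include rule),
# with data pre-processing controlled by transform_param.
layer {
  name: "train_data"                # illustrative name
  type: "Data"
  top: "data"
  top: "label"
  include { phase: TRAIN }          # drop this layer when the net is in TEST phase
  transform_param {
    mirror: true                    # random horizontal flips
    crop_size: 227                  # random 227x227 crops
    mean_file: "mean.binaryproto"   # hypothetical path
  }
  data_param {
    source: "train_lmdb"            # hypothetical path
    batch_size: 64
    backend: LMDB
  }
}

# A convolution layer: the ParamSpec entries give per-blob learning-rate and
# weight-decay multipliers (first entry for the weights, second for the bias).
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param { lr_mult: 1 decay_mult: 1 }   # weight blob
  param { lr_mult: 2 decay_mult: 0 }   # bias blob
  convolution_param {
    num_output: 32
    kernel_size: 3
    stride: 1
    weight_filler { type: "xavier" }
    engine: CUDNN                      # select one of the layer's computational engines
  }
}

# A loss layer: loss_weight scales this top blob's contribution to the objective.
# (Schematic only; a real network would have more layers before the loss.)
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "conv1"
  bottom: "label"
  top: "loss"
  loss_weight: 1
}

The include, transform_param, param, *_param, and loss_weight entries above correspond directly to the LayerParameter fields listed in the message definition.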

 


Source: http://www.cnblogs.com/mengmengmiaomiao/p/7745149.html
