Resource Introduction
A MATLAB-based neural-network dropout layer.
代碼片段和文件信息
function [y,mask] = vl_nndropout(x,varargin)
%VL_NNDROPOUT CNN dropout.
%   [Y,MASK] = VL_NNDROPOUT(X) applies dropout to the data X. MASK
%   is the randomly sampled dropout mask. Both Y and MASK have the
%   same size as X.
%
%   VL_NNDROPOUT(X, 'rate', R) sets the dropout rate to R. Rate is defined
%   as the probability of a variable *not* being zeroed (i.e. it is the
%   expected value of MASK).
%
%   [DZDX] = VL_NNDROPOUT(X, DZDY, 'mask', MASK) computes the
%   derivative of the block projected onto DZDY. Note that MASK must
%   be specified in order to compute the derivative consistently with
%   the MASK randomly sampled in the forward pass. DZDX and DZDY have
%   the same dimensions as X and Y respectively.
%
%   Note that in the original paper on dropout, at test time the
%   network weights for the dropout layers are scaled down to
%   compensate for having all the neurons active. In this
%   implementation the dropout function itself already does this
%   compensation during training, so at test time no alterations are
%   required.
% Copyright (C) 2014-16 Andrea Vedaldi and Karel Lenc.
% All rights reserved.
%
% This file is part of the VLFeat library and is made available under
% the terms of the BSD license (see the COPYING file).
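
The snippet above is truncated and the function body is missing. As a rough illustration of the behaviour the header describes (forward mask sampling, inverted-dropout scaling during training, and a backward pass that reuses the stored mask), here is a minimal sketch written for this page. The name dropout_sketch and its argument order are invented for illustration, and rate follows the header's convention of being the keep probability; this is not the original MatConvNet implementation.

function [y, mask] = dropout_sketch(x, dzdy, mask, rate)
%DROPOUT_SKETCH Minimal inverted-dropout sketch (illustrative only).
%   Forward:  [Y, MASK] = DROPOUT_SKETCH(X, [], [], RATE)
%   Backward: DZDX      = DROPOUT_SKETCH(X, DZDY, MASK, RATE)
if nargin < 2, dzdy = [] ; end
if nargin < 3, mask = [] ; end
if nargin < 4, rate = 0.5 ; end   % probability of *keeping* a variable

if isempty(mask)
  % Sample a mask the same size as X. Dividing by RATE folds the
  % test-time compensation into training (inverted dropout), so the
  % layer needs no rescaling at test time.
  mask = cast(rand(size(x)) < rate, 'like', x) / rate ;
end

if isempty(dzdy)
  y = mask .* x ;      % forward pass
else
  y = mask .* dzdy ;   % backward: project DZDY through the same mask
end

For example, during training: [y, mask] = dropout_sketch(x, [], [], 0.5); then on the backward pass: dzdx = dropout_sketch(x, dzdy, mask, 0.5);.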