- Size: 5KB · File type: .7z · Coins: 1 · Downloads: 0 · Published: 2021-05-28
- Language: Python
- Tags: iris, Python, classification, prediction
Resource Description
This example includes a two-layer BP neural network template program (can be called directly; the number of hidden-layer neurons and the learning rate are configurable, a decay curve can be plotted, and it is suitable for simple pattern recognition and prediction), a calling example (including simple data preprocessing such as normalization; test accuracy is 98.3%), plus the preprocessed and raw iris datasets. You are welcome to download it.
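The normalization mentioned above is not shown in the snippet below; a minimal sketch, assuming plain min-max scaling per feature (the helper name `min_max_normalize` is hypothetical, and the archive's actual routine may differ):

```python
import numpy as np

def min_max_normalize(X):
    """Scale each feature (each row of X) to the range [0, 1]. Hypothetical helper."""
    x_min = X.min(axis=1, keepdims=True)
    x_max = X.max(axis=1, keepdims=True)
    return (X - x_min) / (x_max - x_min)

# Example: 4 iris features (rows) x 3 samples (columns), matching the
# (input size, number of examples) layout the network below expects.
X = np.array([[5.1, 7.0, 6.3],
              [3.5, 3.2, 3.3],
              [1.4, 4.7, 6.0],
              [0.2, 1.4, 2.5]])
X_norm = min_max_normalize(X)
```

After scaling, every row spans exactly [0, 1], which keeps all four iris features on a comparable scale before training.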
Code Snippet and File Info
#!/usr/bin/env python
# _*_ coding:utf-8 _*_
#
# @Version : 1.0
# @Time    : 2018/6/5
# @Author  : 圈圈烴
# @File    : Forward_NeuralNetwork
import numpy as np
import matplotlib.pyplot as plt
from planar_utils import plot_decision_boundary, load_planar_dataset, load_extra_datasets
def sigmoid(x):
    """
    Compute the sigmoid of x
    :param x: A scalar or numpy array of any size
    :return: sigmoid(x)
    """
    s = 1 / (1 + np.exp(-x))
    return s
def layer_size(X, Y):
    """
    :param X: input dataset of shape (input size, number of examples)
    :param Y: labels of shape (output size, number of examples)
    :return:
    n_x: the size of the input layer
    n_y: the size of the output layer
    """
    n_x = X.shape[0]
    n_y = Y.shape[0]
    return (n_x, n_y)
def initialize_parameters(n_x, n_h, n_y):
    """
    initialize_parameters
    :param n_x: size of the input layer
    :param n_h: size of the hidden layer
    :param n_y: size of the output layer
    :return:
    W1: weight matrix of shape (n_h, n_x)
    b1: bias vector of shape (n_h, 1)
    W2: weight matrix of shape (n_y, n_h)
    b2: bias vector of shape (n_y, 1)
    """
    # np.random.seed(2)  # fix the seed for reproducible random initialization
    W1 = np.random.randn(n_h, n_x) * 0.01
    b1 = np.zeros((n_h, 1))
    W2 = np.random.randn(n_y, n_h) * 0.01
    b2 = np.zeros((n_y, 1))
    parameters = {
        'W1': W1,
        'b1': b1,
        'W2': W2,
        'b2': b2
    }
    return parameters
def forward_propagation(X, parameters):
    """
    forward_propagation
    :param X: input data of size (n_x, m)
    :param parameters: python dictionary containing your parameters (output of initialization function)
    :return:
    A2: The sigmoid output of the second activation
    cache: a dictionary containing "Z1", "A1", "Z2" and "A2"
    """
    W1 = parameters['W1']
    b1 = parameters['b1']
    W2 = parameters['W2']
    b2 = parameters['b2']
    Z1 = np.dot(W1, X) + b1
    A1 = np.tanh(Z1)            # layer 1 uses the tanh activation
    Z2 = np.dot(W2, A1) + b2
    A2 = sigmoid(Z2)            # layer 2 uses the sigmoid activation
    assert (A2.shape == (1, X.shape[1]))  # raise if A2's shape is not (1, X.shape[1])
    cache = {
        'Z1': Z1,
        'A1': A1,
        'Z2': Z2,
        'A2': A2
    }
    return A2, cache
def compute_cost(A2, Y, parameters):
    """
    compute cost
    :param A2: The sigmoid output of the second activation, of shape (1, number of examples)
    :param Y: "true" labels vector of shape (1, number of examples)
    :param parameters: python dictionary c
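The snippet is cut off inside compute_cost. For a sigmoid output layer like the one above, the standard choice is the binary cross-entropy cost; a minimal sketch under that assumption (the name `compute_cost_sketch` is hypothetical, and this is not necessarily the author's exact code):

```python
import numpy as np

def compute_cost_sketch(A2, Y):
    """Binary cross-entropy cost averaged over m examples (hypothetical completion).

    A2: sigmoid outputs of shape (1, m); Y: true labels of shape (1, m).
    """
    m = Y.shape[1]
    # Per-example log-likelihood of the correct label
    logprobs = Y * np.log(A2) + (1 - Y) * np.log(1 - A2)
    cost = -np.sum(logprobs) / m
    return float(np.squeeze(cost))
```

For example, with A2 = [[0.5, 0.5]] and Y = [[1, 0]] the cost is log 2 ≈ 0.693, the value expected from an uninformative classifier on balanced labels.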