Resource Description
Nonlinear pattern recognition with a MATLAB neural network

Code Snippet and File Information
%---------------------------------------------%
% Studio offering custom MATLAB simulation work %
% Details: http://cn.mikecrm.com/5k6v1DP       %
%---------------------------------------------%
clear all
close all
clc
rand('seed',2);
nSampDim=2;  % samples are two-dimensional
t1=rand(1000,1).*pi;
t2=rand(1000,1).*pi+pi;
r=3*rand(1000,1)+5;
x1=r.*cos(t1);
y1=r.*sin(t1)-1;
x2=r.*cos(t2)+5;
y2=r.*sin(t2)+4;
%p1=[x1 y1]*[cos(pi/50) sin(pi/50);-sin(pi/50) cos(pi/50)];
%p2=[x2 y2]*[cos(pi/50) sin(pi/50);-sin(pi/50) cos(pi/50)];
%A1=p1(:,1);  % x coordinates after rotation
%A2=p1(:,2);  % y coordinates after rotation
%C1=p2(:,1);
%C2=p2(:,2);
A=[x1 y1];
C=[x2 y2];
% take 750 samples each from A and C for training
train_num_A=750;
train_num_C=750;
nTrainNum=train_num_A+train_num_C;
NUM_A=1000;
NUM_C=1000;
% randomly split A into training and test data
r=randperm(NUM_A);  % shuffle the indices of A
traind(1:train_num_A,:)=A(r(1:train_num_A),:);  % 750 samples of A for training
testd(1:NUM_A-train_num_A,:)=A(r(train_num_A+1:NUM_A),:);  % remaining 250 samples of A for testing
% randomly split C into training and test data
r=randperm(NUM_C);  % shuffle the indices of C
traind(train_num_A+1:train_num_A+train_num_C,:)=C(r(1:train_num_C),:);
testd(NUM_A-train_num_A+1:NUM_A+NUM_C-train_num_A-train_num_C,:)=C(r(train_num_C+1:NUM_C),:);
% assign labels
train1=zeros(1,train_num_A+train_num_C);
train1(1:train_num_A)=1;
test1=zeros(1,NUM_A+NUM_C-train_num_A-train_num_C);
test1(1:NUM_A-train_num_A)=1;
%% build the network
net.nIn=2;  % two input nodes
net.nHidden=10;  % ten hidden nodes
net.nOut=1;  % one output node
w=2*(rand(net.nHidden,net.nIn)-1/2);
b=2*(rand(net.nHidden,1)-1/2);
net.w1=[w b];
W=2*(rand(net.nOut,net.nHidden)-1/2);
B=2*(rand(net.nOut,1)-1/2);
net.w2=[W B];
%% normalize the training data (disabled)
%mm=mean(traind);
% subtract the mean
%traind_s=zeros(nTrainNum,2);
%for i=1:2
%    traind_s(:,i)=traind(:,i)-mm(i);
%end
% divide by the standard deviation
%m1(1)=std(traind_s(:,1));
%m1(2)=std(traind_s(:,2));
%for i=1:2
%    traind_s(:,i)=traind_s(:,i)/m1(i);
%end
%% training
SampleInEx=[traind';ones(1,nTrainNum)];
expectedOut=train1;
eb=0.01;  % error tolerance
eta=0.1;  % learning rate
mc=0.1;  % momentum factor
maxiter=5000;  % maximum number of iterations
iteration=0;  % iteration counter
errRec=zeros(1,maxiter);
outRec=zeros(nTrainNum,maxiter);
NET=[];  % record of the network
% start iterating
for i=1:maxiter
    hid_input=net.w1*SampleInEx;  % hidden-layer input
    hid_out=sigmoid(hid_input);  % hidden-layer output
    ou_input1=[hid_out;ones(1,nTrainNum)];  % output-layer input
    ou_input2=net.w2*ou_input1;
    out_out=sigmoid(ou_input2);
    outRec(:,i)=out_out';
    err=expectedOut-out_out;  % error
    sse=sumsqr(err);
    errRec(i)=sse;  % save the error value
    fprintf('Iteration %d  error: %f\n',i,sse);
    iteration=iteration+1;
    % check for convergence
    if sse<=eb
        break;
    end
    % back-propagate the error
    % local gradient between the hidden and output layers
    DELTA=err.*sigmoid(ou_input2).*(1-sigmoid(ou_input2));
    % local gradient between the input and hidden layers
    delta=net.w2(:,1:end-1)'*DELTA.*sigmoid(hid_input).*(1-sigmoid(hid_input));
    % weight updates
    dWEX=DELTA*ou_input1';
    dwex=delta*SampleInEx';
    % apply the updates; after the first pass, include the momentum term
    if i==1
        net.w2=net.w2+eta*dWEX;
        net.w1=net.w1+eta*dwex;
    else
        net.w2=net.w2+(1-mc)*eta*dWEX+mc*dWEXOld;
        net.w1=net.w1+(1-mc)*eta*dwex+mc*dwexOld;
% (the listing is truncated here; the rest of the loop is not included)
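The helper `sigmoid.m` referenced by the loop is not shown in the snippet, and the momentum update is cut off. A minimal Python sketch of the standard logistic activation and the momentum-style weight update that the `net.w2` line applies (the function names and parameters here are illustrative, not from the listing; `sigmoid.m` is assumed to be the usual logistic function):

```python
import math

def sigmoid(x):
    """Logistic activation: 1 / (1 + e^-x), the usual content of sigmoid.m."""
    return 1.0 / (1.0 + math.exp(-x))

def momentum_update(w, grad, grad_old, eta=0.1, mc=0.1):
    """One scalar weight update with momentum, mirroring
    net.w2 = net.w2 + (1-mc)*eta*dWEX + mc*dWEXOld from the loop."""
    return w + (1.0 - mc) * eta * grad + mc * grad_old
```

With `eta=0.1` and `mc=0.1` as in the listing, each step blends 90% of the fresh gradient term with 10% of the previous step, which damps oscillation on this kind of interleaved two-class data.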
 Attribute      Size     Date       Time   Name
-----------  ---------  ---------- -----  ----
 Directory          0   2018-09-17 10:21  用matlab神經(jīng)網(wǎng)絡(luò)實(shí)現(xiàn)非線性識(shí)別\
 Directory          0   2018-09-17 10:24  用matlab神經(jīng)網(wǎng)絡(luò)實(shí)現(xiàn)非線性識(shí)別\非線性識(shí)別\
 File              55   2018-08-27 11:33  用matlab神經(jīng)網(wǎng)絡(luò)實(shí)現(xiàn)非線性識(shí)別\非線性識(shí)別\【源碼使用必讀】.url
 File            5110   2018-09-17 10:50  用matlab神經(jīng)網(wǎng)絡(luò)實(shí)現(xiàn)非線性識(shí)別\非線性識(shí)別\nonlinear.m
 File             467   2018-09-17 10:50  用matlab神經(jīng)網(wǎng)絡(luò)實(shí)現(xiàn)非線性識(shí)別\非線性識(shí)別\sigmoid.m
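The `randperm`-based split in the script (750 training and 250 test samples per 1000-sample class) can be sketched in plain Python; the helper name and seed are illustrative, not part of the original listing:

```python
import random

def split_class(samples, n_train, seed=2):
    """Shuffle one class and cut it into train/test sets, mirroring
    r = randperm(NUM); train = X(r(1:n_train), :) from the script."""
    rng = random.Random(seed)
    order = list(range(len(samples)))
    rng.shuffle(order)                       # MATLAB: r = randperm(NUM)
    train = [samples[i] for i in order[:n_train]]
    test = [samples[i] for i in order[n_train:]]
    return train, test

# usage: 1000 two-dimensional samples, 750 of them for training
pts = [(float(i), float(-i)) for i in range(1000)]
tr, te = split_class(pts, 750)
```

Shuffling a class before splitting keeps both sets representative of the whole half-ring rather than of one angular segment of it.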