Resource Description
A MATLAB implementation of partial least squares (PLS) with a worked example, suitable for beginners.
Code Snippet and File Information
%% Principal Component Analysis and Partial Least Squares
% Principal Component Analysis (PCA) and Partial Least Squares (PLS) are
% widely used tools. This code shows their relationship through the
% Nonlinear Iterative PArtial Least Squares (NIPALS) algorithm.
%% The Eigenvalue and Power Method
% The NIPALS algorithm can be derived from the Power method for solving the
% eigenvalue problem. Let x be the eigenvector of a square matrix A
% corresponding to the eigenvalue s:
%
% $$Ax=sx$$
%
% Multiplying both sides by A iteratively leads to
%
% $$A^kx=s^kx$$
%
% Now consider another vector y which can be represented as a linear
% combination of all eigenvectors:
%
% $$y=\sum_i^n b_ix_i=Xb$$
%
% where
%
% $$X=\left[x_1\ \cdots\ x_n \right]$$
%
% and
%
% $$b = \left[b_1\ \cdots\ b_n \right]^T$$
%
% Multiplying y by A gives
%
% $$Ay=AXb=XSb$$
%
% where S is a diagonal matrix consisting of all eigenvalues. Therefore, for
% a large enough k,
%
% $$A^ky=XS^kb\approx \alpha x_1$$
%
% That is, the iteration converges to the direction of x_1, which is the
% eigenvector corresponding to the eigenvalue with the maximum modulus.
% This leads to the following Power method for solving the eigenvalue problem.
A=randn(10,5);
% symmetric matrix to ensure real eigenvalues
B=A'*A;
% find the column which has the maximum norm
[dum,idx]=max(sum(A.*A));
x=A(:,idx);
% storage to judge convergence
x0=x-x;
% convergence tolerance
tol=1e-6;
% iterate until converged
while norm(x-x0)>tol
    % iteration to approach the eigenvector direction
    y=A'*x;
    % normalize the vector
    y=y/norm(y);
    % save previous x
    x0=x;
    % x is a product of eigenvalue and eigenvector
    x=A*y;
end
% y is the eigenvector corresponding to the largest eigenvalue
s=x'*x;
% compare it with those obtained with eig
[V,D]=eig(B);
[d,idx]=max(diag(D));
v=V(:,idx);
disp(d-s)
% v and y may differ in sign
disp(min(norm(v-y),norm(v+y)))
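For readers outside MATLAB, the same power iteration can be sketched in NumPy. This is an illustrative translation of the snippet above, not part of the original package:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 5))
B = A.T @ A  # symmetric, so its eigenvalues are real

# start from the column of A with the largest norm
idx = np.argmax((A * A).sum(axis=0))
x = A[:, idx].copy()
x0 = np.zeros_like(x)
tol = 1e-6
while np.linalg.norm(x - x0) > tol:
    y = A.T @ x
    y = y / np.linalg.norm(y)  # normalize the direction
    x0 = x
    x = A @ y                  # x carries the eigenvalue scale

s = x @ x                        # estimate of the largest eigenvalue of A'A
d = np.linalg.eigvalsh(B).max()  # reference value from a direct eigensolver
```

As in the MATLAB version, `s` approaches the largest eigenvalue of `A'A` and `y` its eigenvector, up to sign.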
%% The NIPALS Algorithm for PCA
% PCA is a dimension reduction technique based on the
% following decomposition:
%
% $$X=TP^T+E$$
%
% where X is the data matrix (m x n) to be analysed, T is the so-called
% score matrix (m x a), P the loading matrix (n x a) and E the residual.
% For a given tolerance of the residual, the number of principal components a
% can be much smaller than the original variable dimension n.
% The above power algorithm can be extended to get T and P by iteratively
% subtracting A (in this case X) by x*y' (in this case t*p') until the
% given tolerance is satisfied. This is the so-called NIPALS algorithm.
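The decomposition can be checked numerically. A minimal NumPy sketch (illustrative, not from the package) uses the SVD to produce scores and loadings with the same shapes as T and P:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 5))
# normalize columns to zero mean and unit variance
X = (A - A.mean(axis=0)) / A.std(axis=0, ddof=1)

# the SVD X = U*diag(sv)*V' yields the same factors PCA targets: X = T*P' + E
U, sv, Vt = np.linalg.svd(X, full_matrices=False)
a = 2                    # keep two principal components
T = U[:, :a] * sv[:a]    # score matrix   (m x a)
P = Vt[:a].T             # loading matrix (n x a)
E = X - T @ P.T          # residual
# fraction of variance captured by the first a components
explained = (sv[:a] ** 2).sum() / (sv ** 2).sum()
```

With `a = n` the residual `E` vanishes; for smaller `a` the squared norm of `E` equals the discarded singular values squared, which is the tolerance the NIPALS loop below tests.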
% The data matrix with normalization
A=randn(10,5);
meanx=mean(A);
stdx=std(A);
X=(A-meanx(ones(10,1),:))./stdx(ones(10,1),:);
B=X'*X;
% allocate T and P
T=zeros(10,5);
P=zeros(5);
% tol for convergence
tol=1e-6;
% tol for PC share of 95 percent
tol2=(1-0.95)*5*(10-1);
for k=1:5
    % find the column which has the maximum norm
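The uploaded snippet is cut off inside this for loop. The block below is a hedged NumPy sketch of how a standard NIPALS deflation loop typically continues; the variable names mirror the MATLAB code, but the exact remainder of the original file is not shown on this page:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((10, 5))
X = (A - A.mean(axis=0)) / A.std(axis=0, ddof=1)
X0 = X.copy()                    # keep a copy to check the reconstruction

m, n = X.shape
T = np.zeros((m, n))
P = np.zeros((n, n))
tol = 1e-6                       # convergence tolerance
tol2 = (1 - 0.95) * n * (m - 1)  # stop once 95% of the variance is captured
ncomp = 0
for k in range(n):
    # start from the column of the (deflated) X with the largest norm
    idx = np.argmax((X * X).sum(axis=0))
    t = X[:, idx].copy()
    t0 = np.zeros_like(t)
    while np.linalg.norm(t - t0) > tol:
        p = X.T @ t
        p = p / np.linalg.norm(p)  # unit-length loading vector
        t0 = t
        t = X @ p                  # score vector for this component
    T[:, k] = t
    P[:, k] = p
    X = X - np.outer(t, p)         # deflate: remove the captured component
    ncomp = k + 1
    if (X * X).sum() < tol2:       # residual variance below the threshold
        break
```

After the loop, `X` holds the residual `E`, so `X0 = T*P' + E` holds by construction, and `ncomp` components explain at least 95% of the variance (unless all `n` were needed).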
 Attributes      Size     Date    Time   Name
-----------  ---------  ---------- -----  ----
     File        8947  2014-07-11 16:56  pls\learningpcapls.m
     File        1327  2014-07-11 16:56  pls\license.txt
     File        4642  2014-08-01 17:13  pls\NIPLS_for_PLS.m
     File        3732  2014-07-24 16:28  pls\pls.m
     File        3543  2014-07-11 16:56  pls\plsnc.m
     File       23709  2014-07-11 16:56  pls\html\learningpcapls.html
     File        2766  2014-07-11 16:56  pls\html\learningpcapls_01.png
     File        2544  2014-07-11 16:56  pls\html\learningpcapls_eq14726.png
     File        1651  2014-07-11 16:56  pls\html\learningpcapls_eq1475.png
     File        1239  2014-07-11 16:56  pls\html\learningpcapls_eq1937.png
     File        2086  2014-07-11 16:56  pls\html\learningpcapls_eq2092.png
     File        4390  2014-07-11 16:56  pls\html\learningpcapls_eq29314.png
     File        1293  2014-07-11 16:56  pls\html\learningpcapls_eq38356.png
     File        1368  2014-07-11 16:56  pls\html\learningpcapls_eq48172.png
     File        2334  2014-07-11 16:56  pls\html\learningpcapls_eq7260.png
     File         850  2014-07-11 16:56  pls\html\learningpcapls_eq955.png