The Iris dataset: 150 samples, each with four measurements in cm (sepal length, sepal width, petal length, petal width) and a species label. The first 50 samples are Iris-setosa, the next 50 Iris-versicolor, and the last 50 Iris-virginica. The first few rows:

5.1 3.5 1.4 0.2 Iris-setosa
4.9 3.0 1.4 0.2 Iris-setosa
4.7 3.2 1.3 0.2 Iris-setosa
import numpy as np

class Perceptron(object):
    """
    eta: learning rate
    n_iter: number of training passes over the data
    w_: weight vector of the neuron (w_[0] is the step-function threshold)
    errors_: number of misclassifications recorded in each epoch
    """
    def __init__(self, eta, n_iter):
        self.eta = eta
        self.n_iter = n_iter

    def net_input(self, X):
        # z = w0*1 + w1*x1 + ... + wn*xn
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def predict(self, X):
        return np.where(self.net_input(X) >= 0.0, 1, -1)

    def fit(self, X, y):
        """
        Train the neuron on the samples. X holds the input vectors and
        y the correct class of each sample.
        X: shape [n_samples, n_features], e.g. X = [[1, 2, 3], [4, 5, 6]]
           gives n_samples = 2, n_features = 3
        y: e.g. [1, -1]
        """
        # Initialize the weight vector to zeros; the extra entry is w0,
        # the threshold of the step function mentioned above.
        self.w_ = np.zeros(1 + X.shape[1])
        self.errors_ = []

        for _ in range(self.n_iter):
            errors = 0
            # zip(X, y) pairs each sample with its label:
            # zip([[1,2,3],[4,5,6]], [1,-1]) -> ([1,2,3], 1), ([4,5,6], -1)
            for xi, target in zip(X, y):
                # update = learning rate * (y - y')
                update = self.eta * (target - self.predict(xi))
                # xi is a vector, so "update * xi" updates every weight
                # at once: w(j) += xi[j] * update
                self.w_[1:] += update * xi
                self.w_[0] += update
                errors += int(update != 0.0)
            self.errors_.append(errors)  # misclassifications per epoch


file = "D:/PyCharm_test_file/Jupyter_test/iris1.xlsx"
import pandas as pd
df = pd.read_excel(file, header=None)
# df.head(10)
# print(df)

import matplotlib.pyplot as plt

y = df.loc[0:99, 4].values
y = np.where(y == 'Iris-setosa', -1, 1)
# print(y)
X = df.iloc[0:100, [0, 2]].values
# print(X)
plt.scatter(X[:50, 0], X[:50, 1], color='red', marker='o', label='setosa')
plt.scatter(X[50:100, 0], X[50:100, 1], color='blue', marker='x', label='versicolor')
plt.rcParams['font.sans-serif'] = ['SimHei']
plt.xlabel('sepal length')
plt.ylabel('petal length')
plt.legend(loc='upper left')
plt.show()

from matplotlib.colors import ListedColormap

def plot_decision_regions(X, y, classifier, resolution=0.02):
    markers = ('s', 'x', 'o', 'v')
    colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
    cmap = ListedColormap(colors[:len(np.unique(y))])
    x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    # Expand the min/max ranges of x1 and x2 into two 2-D coordinate matrices
    xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                           np.arange(x2_min, x2_max, resolution))
    # Predict the class of every grid point (ravel flattens to a 1-D vector)
    Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
    # Reshape Z back into a 2-D array matching xx1, then draw the boundary
    # between the two predicted regions -- requires linearly separable data
    Z = Z.reshape(xx1.shape)
    plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap)
    plt.xlim(xx1.min(), xx1.max())
    plt.ylim(xx2.min(), xx2.max())
    for idx, cl in enumerate(np.unique(y)):
        plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8,
                    c=cmap(idx), marker=markers[idx], label=cl)


ppn = Perceptron(0.1, 10)
ppn.fit(X, y)
plot_decision_regions(X, y, ppn, resolution=0.02)
Classification with the adaptive linear neuron (Adaline)
class AdalineGD(object):
    def __init__(self, eta=0.1, n_iter=50):
        self.eta = eta
        self.n_iter = n_iter

    def fit(self, X, y):  # the biggest difference from the perceptron
        self.w_ = np.zeros(1 + X.shape[1])
        self.cost_ = []  # cost function: tracks improvement, should keep shrinking
        for i in range(self.n_iter):
            output = self.net_input(X)
            errors = y - output
            self.w_[1:] += self.eta * X.T.dot(errors)
            self.w_[0] += self.eta * errors.sum()
            cost = (errors ** 2).sum() / 2.0
            self.cost_.append(cost)
        return self

    def net_input(self, X):
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def activation(self, X):
        return self.net_input(X)

    def predict(self, X):
        return np.where(self.activation(X) >= 0, 1, -1)


# Run the algorithm. A lower learning rate makes each weight update more
# precise; more iterations make the optimized result more accurate.
ada = AdalineGD(eta=0.0001, n_iter=50)
ada.fit(X, y)
plot_decision_regions(X, y, classifier=ada)  # feed the inputs to the network and plot
plt.title('Adaline - Gradient descent')
plt.xlabel('sepal length')
plt.ylabel('petal length')
plt.legend(loc='upper left')
plt.show()

plt.plot(range(1, len(ada.cost_) + 1), ada.cost_, marker='o')  # check the improvement
plt.xlabel('Epochs')  # number of training passes over the data
plt.ylabel('sum-squared-error')  # value of the cost function per epoch
plt.show()
The perceptron's activation function is a step function: when the net input reaches the threshold the output is 1, otherwise -1.
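As a minimal sketch (function and variable names are illustrative, not from the course), the step activation can be written with `numpy.where`:

```python
import numpy as np

def step(z, threshold=0.0):
    # Step activation: 1 if the net input reaches the threshold, else -1
    return np.where(z >= threshold, 1, -1)

print(step(np.array([-0.5, 0.0, 0.7])))  # -> [-1  1  1]
```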
The adaptive linear neuron's activation function instead passes the weighted sum of the inputs straight through: w0 + x1*w1 + x2*w2 + .... It compares the computed result with the given true result and, whenever they differ, adjusts the parameters dynamically based on that error: gradient descent.
Gradient descent:
The sum-of-squared-errors cost function is U-shaped in each weight. Taking its partial derivative with respect to w gives the slope of the tangent: when the slope is negative (positive), increase (decrease) the corresponding weight w.
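One gradient-descent update on the sum-of-squared-errors cost can be sketched as follows (the data and names here are made up for illustration):

```python
import numpy as np

# Toy data: 3 samples, 2 features
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0]])
y = np.array([1.0, -1.0, 1.0])

eta = 0.01                      # learning rate
w = np.zeros(1 + X.shape[1])    # w[0] is the bias/threshold term

output = X.dot(w[1:]) + w[0]    # linear activation
errors = y - output
# dJ/dw_j = -sum_i (y_i - output_i) * x_ij, so step against the gradient:
w[1:] += eta * X.T.dot(errors)
w[0] += eta * errors.sum()
cost = (errors ** 2).sum() / 2.0
print(cost)  # -> 1.5 for this toy data (errors are [1, -1, 1])
```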
Parsing and visualizing the data
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np

file = 'iris.data'  # path to a local copy of the Iris data (adjust as needed)
df = pd.read_csv(file, header=None)
y = df.loc[0:99, 4].values  # .loc is inclusive: rows 0-99 are the first 100 samples
y = np.where(y == 'Iris-setosa', -1, 1)
X = df.iloc[0:100, [0, 2]].values
plt.scatter(X[:50, 0], X[:50, 1], color='red', marker='o', label='setosa')
plt.scatter(X[50:100, 0], X[50:100, 1], color='blue', marker='x', label='versicolor')
plt.xlabel('sepal length')
plt.ylabel('petal length')
plt.legend(loc='upper left')
plt.show()
Classifying the data (feed the inputs to the trained network and plot the predictions)
from matplotlib.colors import ListedColormap

def plot_decision_regions(X, y, classifier, resolution=0.02):
    markers = ('s', 'x', 'o', 'v')
    colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
    cmap = ListedColormap(colors[:len(np.unique(y))])
    x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    # Expand the min/max ranges of x1 and x2 into two 2-D coordinate matrices
    xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                           np.arange(x2_min, x2_max, resolution))
    # Predict the class of every grid point (ravel flattens to a 1-D vector)
    Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
    # Reshape Z back into a 2-D array matching xx1, then draw the boundary
    # between the two predicted regions -- requires linearly separable data
    Z = Z.reshape(xx1.shape)
    plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap)
    plt.xlim(xx1.min(), xx1.max())
    plt.ylim(xx2.min(), xx2.max())
    for idx, cl in enumerate(np.unique(y)):
        plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8,
                    c=cmap(idx), marker=markers[idx], label=cl)
The perceptron classification algorithm
import numpy as np

class Perceptron(object):
    # eta: learning rate; n_iter: number of passes over the weight-training data
    def __init__(self, eta=0.01, n_iter=10):
        self.eta = eta
        self.n_iter = n_iter

    def fit(self, X, y):
        """
        Train the neuron on the input samples.
        X: shape [n_samples, n_features]; n_features is the number of
        input signals the neuron receives.
        """
        # Initialize the weight vector w_ to zeros; the extra entry is w0,
        # the threshold of the activation function.
        self.w_ = np.zeros(1 + X.shape[1])
        self.errors_ = []  # records the neuron's misclassification count per epoch

        for _ in range(self.n_iter):
            errors = 0
            for xi, target in zip(X, y):
                # Weight update
                update = self.eta * (target - self.predict(xi))
                self.w_[1:] += update * xi
                # Threshold update
                self.w_[0] += update
                errors += int(update != 0.0)
            self.errors_.append(errors)

    def net_input(self, X):  # dot product of the input signals X and weights w
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def predict(self, X):  # predict the class
        return np.where(self.net_input(X) >= 0.0, 1, -1)
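The update rule above can be exercised end-to-end on a tiny made-up linearly separable dataset; this sketch inlines the training loop so it is self-contained (all data and names are illustrative):

```python
import numpy as np

# Tiny linearly separable dataset: the class is decided by the first feature
X = np.array([[2.0, 1.0], [3.0, 0.5], [-2.0, 1.0], [-3.0, 0.5]])
y = np.array([1, 1, -1, -1])

eta, n_iter = 0.1, 10
w = np.zeros(1 + X.shape[1])  # w[0] is the threshold term

for _ in range(n_iter):
    for xi, target in zip(X, y):
        # Step-function prediction, then the perceptron update rule
        pred = 1 if xi.dot(w[1:]) + w[0] >= 0.0 else -1
        update = eta * (target - pred)
        w[1:] += update * xi
        w[0] += update

preds = np.where(X.dot(w[1:]) + w[0] >= 0.0, 1, -1)
print(preds)  # -> [ 1  1 -1 -1], matching y after convergence
```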
The perceptron algorithm requires the data to be linearly separable.
Algorithms used in machine learning to classify data: 1. the perceptron; 2. the adaptive linear neuron.
Activation function (also called the unit step function).
In essence, machine learning models how a human neuron processes information. According to neuroscience, a neuron can be viewed as a simple logic gate with binary output.
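For instance, a neuron with hand-picked weights behaves exactly like a binary logic gate; the weights and threshold below are chosen purely for illustration and implement an AND gate:

```python
import numpy as np

# A two-input neuron with fixed weights acts as an AND gate:
# it fires (outputs 1) only when both inputs are 1.
w = np.array([1.0, 1.0])
threshold = 1.5

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
outputs = np.where(inputs.dot(w) >= threshold, 1, 0)
print(outputs)  # -> [0 0 0 1]
```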
The adaptive linear neuron
Mathematical representation of a neuron
The activation function
Vector transpose and dot-product notation
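The net input z = wᵀx is exactly a dot product in NumPy (the numbers here are arbitrary examples):

```python
import numpy as np

w = np.array([0.5, -1.0, 2.0])   # weight vector
x = np.array([1.0, 2.0, 3.0])    # input vector

# z = w^T x = 0.5*1 + (-1)*2 + 2*3 = 4.5
z = np.dot(w, x)
print(z)  # -> 4.5
```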
Course outline
Summary of the algorithm's steps
Steps of the perceptron data-classification algorithm
True feedback circuit
Updating the weights