I am trying out a simple logistic regression example with sklearn.linear_model.LogisticRegression. Here is the code:

```python
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.linear_model import LogisticRegression
from sklearn import metrics

# some randomly generated data with two well differentiated groups
x1 = np.random.normal(loc=15, scale=2, size=(30, 1))
y1 = np.random.normal(loc=10, scale=2, size=(30, 1))
x2 = np.random.normal(loc=25, scale=2, size=(30, 1))
y2 = np.random.normal(loc=20, scale=2, size=(30, 1))

data1 = np.concatenate([x1, y1, np.zeros(shape=(30, 1))], axis=1)
data2 = np.concatenate([x2, y2, np.ones(shape=(30, 1))], axis=1)
dfa = pd.DataFrame(data=data1, columns=["F1", "F2", "group"])
dfb = pd.DataFrame(data=data2, columns=["F1", "F2", "group"])
df = pd.concat([dfa, dfb], ignore_index=True)

# the actual fitting
# note: ("group",) must be a tuple; ("group") is just the string "group"
features = [item for item in df.columns if item not in ("group",)]
logreg = LogisticRegression(verbose=1)
logreg.fit(df[features], df.group)

# plotting and checking the result
theta = logreg.coef_[0, :]   # coefficients
y0 = logreg.intercept_       # intercept
print("Theta =", theta)
print("Intercept =", y0)

xdb = np.arange(0, 30, 0.2)              # dummy x vector for decision boundary
ydb = -(y0 + theta[0] * xdb) / theta[1]  # decision boundary y values

fig = plt.figure()
ax = fig.add_subplot(111)
colors = {0: "red", 1: "blue"}
for i, group in df.groupby("group"):
    # matplotlib keyword arguments are lowercase (markerfacecolor, not MarkerFaceColor)
    plt.plot(group["F1"], group["F2"], marker="o", linestyle="",
             markerfacecolor=colors[i], markeredgecolor=colors[i])
plt.plot(xdb, ydb, linestyle="--", color="b")
```

Shockingly, the resulting plot looks like this: (plot not reproduced here)

Indeed, the accuracy can be computed as:

```python
predictions = logreg.predict(df[features])
metrics.accuracy_score(predictions, df["group"])
```

and the result is 0.966...

I must be doing something wrong, I just can't figure out what. Any help is much appreciated!
1 Answer

慕慕森
This is due to regularization. The best-fitting line would have an intercept of about -16, but because of regularization the model cannot reach that value.
Logistic regression minimizes a loss function that combines the classification error with the size of the weights. When we increase the model's C value, more emphasis is put on reducing the error (and hence on finding a better decision boundary) and less on keeping the weights small. The result is a proper decision boundary.
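To make that trade-off concrete, the L2-penalized objective that scikit-learn's LogisticRegression minimizes can be sketched (with labels $y_i \in \{-1, 1\}$) as:

$$\min_{w,\, b} \;\; \frac{1}{2} w^\top w \;+\; C \sum_{i=1}^{n} \log\!\left(1 + \exp\!\bigl(-y_i\,(x_i^\top w + b)\bigr)\right)$$

The first term penalizes large weights; the second measures how badly the model fits the data. A larger C scales up the data-fit term, so the penalty matters relatively less and the weights (and hence the decision boundary) can move where the data wants them.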
Although regularization is very important in most real-world scenarios, in some cases, like this one, it is important to weaken it or not use it at all.
Make the following change:

```python
logreg = LogisticRegression(verbose=1, C=100)
```
The output then looks like this: (plot not reproduced here)
Read up on regularization to understand this better.
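Here is a minimal, self-contained sketch of the effect (synthetic clusters mirroring the question's data; the seed and cluster parameters are arbitrary choices, not from the original post). With a larger C the fitted coefficients and intercept are free to grow, moving the boundary between the clusters:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Two well-separated clusters, as in the question's example.
x1 = rng.normal(loc=15, scale=2, size=(30, 1))
y1 = rng.normal(loc=10, scale=2, size=(30, 1))
x2 = rng.normal(loc=25, scale=2, size=(30, 1))
y2 = rng.normal(loc=20, scale=2, size=(30, 1))
X = np.vstack([np.hstack([x1, y1]), np.hstack([x2, y2])])
y = np.concatenate([np.zeros(30), np.ones(30)])

# Default: C=1.0, i.e. fairly strong L2 regularization.
weak = LogisticRegression().fit(X, y)
# Larger C: the loss is dominated by the data-fit term.
strong = LogisticRegression(C=100).fit(X, y)

print("C=1   intercept:", weak.intercept_[0],   "coef:", weak.coef_[0])
print("C=100 intercept:", strong.intercept_[0], "coef:", strong.coef_[0])
print("C=1   accuracy:", weak.score(X, y))
print("C=100 accuracy:", strong.score(X, y))
```

Running this, the C=100 model ends up with a noticeably larger intercept and coefficients than the default model, which is exactly why its decision boundary can sit properly between the two groups.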