Statistical Learning Methods (Chapter 1), Li Hang: Least-Squares Fitting of a Sine Function, with Regularization

1. Fitting a curve with the least-squares method

"用目標函數y=sin2πx, 加上一個正態分佈的噪音干擾,用多項式去擬合"
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import leastsq  # least squares

def real_f(x):  # target function
    return np.sin(2*np.pi*x)
def fit_f(p, x):  # polynomial model
    f = np.poly1d(p)  # polynomial with coefficients p (highest degree first)
    return f(x)
def residuals_f(p, x, y):  # residuals; y is the observed (noisy) value
    return fit_f(p, x) - y

x = np.linspace(0, 1, 10)  # 10 noisy sample points
x_points = np.linspace(0, 1, 1000)  # 1000 points for plotting the true curve
y_ = real_f(x)
y = [np.random.normal(0, 0.1) + y1 for y1 in y_]  # add Gaussian noise with mean 0, standard deviation 0.1
# print(y)

# fitting routine
def f(M=0):  # M is the polynomial degree
    p_init = np.random.rand(M+1)  # random initial guess: M+1 coefficients for a degree-M polynomial
    p_lsq = leastsq(residuals_f, p_init, args=(x, y))  # arguments: residual function, initial parameters, data points
    print('Fitting Parameters', p_lsq[0])

    plt.plot(x_points, real_f(x_points), label='real')  # true curve
    plt.plot(x_points, fit_f(p_lsq[0], x_points), label='fitted')  # fitted curve
    plt.plot(x, y, 'bo', label='noise')  # noisy sample points ('bo' = blue circles)
    plt.legend()  # add legend
    plt.show()
    return p_lsq

p_lsq_0 = f(M=0)
p_lsq_1 = f(M=1)
p_lsq_3 = f(M=3)
p_lsq_9 = f(M=9)

M=0:

Fitting Parameters [0.08448856]

M=1:

Fitting Parameters [-1.30154406  0.7352606 ]

M=3:

Fitting Parameters [ 22.44579609 -33.8218247   11.62629268  -0.06332948]

M=9:

Fitting Parameters [ 1.82375584e+02  2.43791693e+03 -1.09575769e+04  1.79559584e+04
 -1.51086907e+04  7.17416839e+03 -1.93675226e+03  2.60956104e+02
 -8.20740890e+00  7.11734480e-02]

2. Regularization

As the plots above show, the M=9 polynomial overfits the noisy data, so a regularization term (regularizer) is introduced to reduce the overfitting.
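
The overfitting can also be checked numerically. The sketch below is only a quick illustration (it reuses x, y, x_points, real_f, fit_f and the p_lsq_3 / p_lsq_9 results returned by the code above; the exact numbers depend on the random noise and the initial guess): the M=9 fit typically has an almost-zero error on the 10 training points but a much larger error against the true curve on the dense grid.

# Minimal sketch: training error vs. error against the true sine curve
# (reuses x, y, x_points, real_f, fit_f, p_lsq_3, p_lsq_9 from above)
def rmse(a, b):
    return np.sqrt(np.mean(np.square(np.asarray(a) - np.asarray(b))))

for name, p in [('M=3', p_lsq_3[0]), ('M=9', p_lsq_9[0])]:
    print(name,
          'train RMSE:', rmse(fit_f(p, x), y),
          'true-curve RMSE:', rmse(fit_f(p, x_points), real_f(x_points)))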

In a regression problem the loss function is the squared loss, and the regularization term can be either the L2 norm or the L1 norm of the parameter vector; the penalized objective is the sum of squared residuals plus one of the terms below. (Since leastsq squares every residual entry, the code appends the square root of the penalty term to the residual vector.)

  • L1: regularization * abs(p) (a sketch of this option follows the list)

  • L2: 0.5 * regularization * np.square(p)
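
The code below implements only the L2 option. For the L1 option, a penalized residual could look like the following minimal sketch (not part of the original code; residuals_f_l1 is an illustrative name, and the non-smooth absolute value can make leastsq converge poorly near zero coefficients):

# Minimal sketch of an L1-penalized residual for leastsq (not used below)
lamda = 0.0001
def residuals_f_l1(p, x, y):
    ret = fit_f(p, x) - y
    # leastsq minimizes the sum of squared residuals, so appending
    # sqrt(lamda*|p|) adds lamda*|p| to the objective
    return np.append(ret, np.sqrt(lamda*np.abs(p)))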

"用目標函數y=sin2πx, 加上一個正態分佈的噪音干擾,用多項式去擬合"
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import leastsq  # least squares

def real_f(x):  # target function
    return np.sin(2*np.pi*x)
def fit_f(p, x):  # polynomial model
    f = np.poly1d(p)  # polynomial with coefficients p (highest degree first)
    return f(x)
def residuals_f(p, x, y):  # residuals; y is the observed (noisy) value
    return fit_f(p, x) - y

'Use L2 regularization'
lamda = 0.0001
def residuals_f_regularization(p, x, y):
    ret = fit_f(p, x) - y
    # leastsq squares each residual entry, so append sqrt(0.5*lamda*p**2);
    # the appended terms then contribute the L2 penalty 0.5*lamda*p**2 to the objective
    return np.append(ret, np.sqrt(0.5*lamda*np.square(p)))

x = np.linspace(0, 1, 10)  # 10 noisy sample points
x_points = np.linspace(0, 1, 1000)  # 1000 points for plotting the true curve
y_ = real_f(x)
y = [np.random.normal(0, 0.1) + y1 for y1 in y_]  # add Gaussian noise with mean 0, standard deviation 0.1
# print(y)

# fitting routine
def f(M=0):  # M is the polynomial degree
    p_init = np.random.rand(M+1)  # random initial guess: M+1 coefficients for a degree-M polynomial
    p_lsq = leastsq(residuals_f, p_init, args=(x, y))  # arguments: residual function, initial parameters, data points
    p_lsq_regularization = leastsq(residuals_f_regularization, p_init, args=(x, y))
    print('Fitting Parameters', p_lsq[0])

    plt.plot(x_points, real_f(x_points), label='real')  # true curve
    plt.plot(x_points, fit_f(p_lsq[0], x_points), label='fitted')  # fitted curve (no regularization)
    plt.plot(x_points, fit_f(p_lsq_regularization[0], x_points), label='regularization')  # fitted curve with L2 regularization
    plt.plot(x, y, 'bo', label='noise')  # noisy sample points ('bo' = blue circles)
    plt.legend()  # add legend
    plt.show()
    return p_lsq

p_lsq_9 = f(M=9)

M=9:

Fitting Parameters [ 4.76887118e+04 -2.13138573e+05  3.99285066e+05 -4.07159339e+05
  2.45507544e+05 -8.89206003e+04  1.87421262e+04 -2.10846132e+03
  1.03908481e+02 -2.43222631e-01]
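
How strongly the coefficients are shrunk depends on lamda. The sketch below is only an illustration (res_reg is a hypothetical helper and the chosen lamda values are arbitrary; the exact numbers depend on the random noise): it refits the degree-9 polynomial with a few penalty strengths and compares the largest coefficient magnitude.

# Minimal sketch: effect of the penalty strength on the M=9 coefficients
# (reuses x, y, fit_f and leastsq from above; res_reg is an illustrative helper)
for lam in [0.0, 0.0001, 0.01]:
    def res_reg(p, x, y, lam=lam):
        return np.append(fit_f(p, x) - y, np.sqrt(0.5*lam*np.square(p)))
    p0 = np.random.rand(10)  # 10 coefficients for a degree-9 polynomial
    p_fit = leastsq(res_reg, p0, args=(x, y))[0]
    print('lamda =', lam, 'max |coefficient| =', np.max(np.abs(p_fit)))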
