TensorFlow Learning 1: Linear Regression

1. Code:

import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf

# generate 1,000 training samples around the target line y = 0.1*x + 0.3 with Gaussian noise
x = np.random.normal(0.0, 0.55, 1000)
y = x * 0.1 + 0.3 + np.random.normal(0.0, 0.03, 1000)

# uncomment to visualize the raw data
# plt.scatter(x, y, c='r')
# plt.show()

# model parameters: W initialized uniformly in [-1, 0.1], b initialized to 0
W = tf.Variable(tf.random_uniform([1], -1, 0.1), name="W")
b = tf.Variable(tf.zeros([1]), name="b")

# compute the predicted values
yy = W * x + b

# loss function: mean squared error
loss = tf.reduce_mean(tf.square(yy - y), name="loss")
# optimize the parameters with gradient descent (learning rate 0.5)
train = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

# initialize the variables and start a session
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)

print("W=",sess.run(W),"b=",sess.run(b),"loss=",sess.run(loss))

# run 20 gradient-descent steps, printing the parameters after each step
for step in range(0, 20):
    sess.run(train)
    print("W=", sess.run(W), "b=", sess.run(b), "loss=", sess.run(loss))

2. Test:

WARNING:tensorflow:From F:\python\shi_jue\venv\lib\site-packages\tensorflow\python\framework\op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
2019-05-13 20:50:58.173000: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
W= [-0.3000741] b= [0.] loss= 0.13893381
W= [-0.17857648] b= [0.29861674] loss= 0.024517918
W= [-0.09336426] b= [0.29904932] loss= 0.012227297
W= [-0.03434484] b= [0.29935268] loss= 0.0063312617
W= [0.00653303] b= [0.29956278] loss= 0.0035028234
W= [0.03484574] b= [0.29970834] loss= 0.0021459693
W= [0.05445562] b= [0.29980913] loss= 0.0014950612
W= [0.06803775] b= [0.29987893] loss= 0.0011828084
W= [0.07744497] b= [0.2999273] loss= 0.0010330149
W= [0.08396057] b= [0.29996076] loss= 0.0009611564
W= [0.08847339] b= [0.29998398] loss= 0.00092668424
W= [0.09159904] b= [0.30000004] loss= 0.00091014756
W= [0.09376393] b= [0.30001116] loss= 0.00090221455
W= [0.09526336] b= [0.30001888] loss= 0.0008984089
W= [0.0963019] b= [0.3000242] loss= 0.0008965833
W= [0.09702121] b= [0.3000279] loss= 0.0008957074
W= [0.09751941] b= [0.30003047] loss= 0.0008952874
W= [0.09786448] b= [0.30003226] loss= 0.0008950857
W= [0.09810347] b= [0.30003348] loss= 0.0008949891
W= [0.098269] b= [0.3000343] loss= 0.0008949428
W= [0.09838365] b= [0.3000349] loss= 0.0008949205
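
After about 20 steps W settles near 0.1 and b near 0.3, i.e. the line used to generate the data. To check the fit visually, a small sketch that can be appended to the script above (it reuses sess, W, b, x and y from the training code; the labels are just for illustration):

# scatter the raw data and overlay the fitted line y = W*x + b
plt.scatter(x, y, c='r', label='data')
plt.plot(x, sess.run(W) * x + sess.run(b), c='b', label='fitted line')
plt.legend()
plt.show()

# optional cross-check: NumPy's closed-form least-squares fit
# should give almost the same slope and intercept
print(np.polyfit(x, y, 1))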
