TensorFlow 2.0 Notes 14: Saving and Loading Models (Very Important), plus a Custom Network on CIFAR-10!


1. Saving and Loading Models (Very Important)!

1.1. The three save modes

1.2. Mode 1: save/load weights

1.2.1. Hands-on
  • Code
import tensorflow as tf
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'


def preprocess(x, y):
    """
    x is a single image, not a batch
    """
    x = tf.cast(x, dtype=tf.float32) / 255.
    x = tf.reshape(x, [28*28])
    y = tf.cast(y, dtype=tf.int32)
    y = tf.one_hot(y, depth=10)
    return x, y


batchsz = 256*2
(x, y), (x_val, y_val) = datasets.mnist.load_data()
print('datasets:', x.shape, y.shape, x.min(), x.max())

db = tf.data.Dataset.from_tensor_slices((x, y))
db = db.map(preprocess).shuffle(60000).batch(batchsz)
ds_val = tf.data.Dataset.from_tensor_slices((x_val, y_val))
ds_val = ds_val.map(preprocess).batch(batchsz)

sample = next(iter(db))
print(sample[0].shape, sample[1].shape)

network = Sequential([layers.Dense(256, activation='relu'),
                      layers.Dense(128, activation='relu'),
                      layers.Dense(64, activation='relu'),
                      layers.Dense(32, activation='relu'),
                      layers.Dense(10)])
network.build(input_shape=(None, 28 * 28))
network.summary()

network.compile(optimizer=optimizers.Adam(learning_rate=0.01),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])

network.fit(db, epochs=4, validation_data=ds_val, validation_freq=2)

network.evaluate(ds_val)

network.save_weights('weight.ckpt')
print('saved weights')
del network

# The network must be re-created with exactly the same architecture as above.
network = Sequential([layers.Dense(256, activation='relu'),
                      layers.Dense(128, activation='relu'),
                      layers.Dense(64, activation='relu'),
                      layers.Dense(32, activation='relu'),
                      layers.Dense(10)])
network.compile(optimizer=optimizers.Adam(learning_rate=0.01),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])

network.load_weights('weight.ckpt')
print('loaded weights!')
network.evaluate(ds_val)

  • Output (progress-bar lines condensed to the per-epoch summaries):
C:\Anaconda3\envs\tf2\python.exe E:/Codes/MyCodes/TF2/save_load_weight.py
datasets: (60000, 28, 28) (60000,) 0 255
(512, 784) (512, 10)
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                multiple                  200960    
_________________________________________________________________
dense_1 (Dense)              multiple                  32896     
_________________________________________________________________
dense_2 (Dense)              multiple                  8256      
_________________________________________________________________
dense_3 (Dense)              multiple                  2080      
_________________________________________________________________
dense_4 (Dense)              multiple                  330       
=================================================================
Total params: 244,522
Trainable params: 244,522
Non-trainable params: 0
_________________________________________________________________
Epoch 1/4

118/118 [==============================] - 3s 25ms/step - loss: 0.3898 - accuracy: 0.7196
Epoch 2/4

118/118 [==============================] - 2s 20ms/step - loss: 0.1219 - accuracy: 0.9591 - val_loss: 0.1277 - val_accuracy: 0.9621
Epoch 3/4

118/118 [==============================] - 2s 16ms/step - loss: 0.0929 - accuracy: 0.9720
Epoch 4/4

118/118 [==============================] - 2s 19ms/step - loss: 0.0737 - accuracy: 0.9759 - val_loss: 0.1082 - val_accuracy: 0.9693

WARNING: Logging before flag parsing goes to stderr.
20/20 [==============================] - 0s 11ms/step - loss: 0.1082 - accuracy: 0.9693
W0505 11:09:55.762711  3344 network.py:1410] This model was compiled with a Keras optimizer (<tensorflow.python.keras.optimizers.Adam object at 0x000002A2F9F35D30>) but is being saved in TensorFlow format with `save_weights`. The model's weights will be saved, but unlike with TensorFlow optimizers in the TensorFlow format the optimizer's state will not be saved.

Consider using a TensorFlow optimizer from `tf.train`.
saved weights
loaded weights!

20/20 [==============================] - 0s 23ms/step - loss: 0.1082 - accuracy: 0.9647

Process finished with exit code 0

Note: the final evaluation accuracy is 0.9647, slightly different from the 0.9693 measured before saving. A network's state consists of more than its w and b parameters; other factors (for example, an unset random seed and the optimizer's internal state, which the warning above says is not saved) also influence the result. If you want to restore the model's state completely, use the approach below, which saves the entire model.
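As the log warning above points out, save_weights does not capture the optimizer's state. A minimal sketch of one way to capture both, using tf.train.Checkpoint (the layer sizes and checkpoint path here are illustrative, not from this post):

```python
import numpy as np
import tensorflow as tf

# tf.train.Checkpoint tracks every variable reachable from its arguments,
# including the optimizer's slot variables, unlike save_weights() alone.
model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
model.build(input_shape=(None, 784))
opt = tf.keras.optimizers.Adam(learning_rate=0.01)

ckpt = tf.train.Checkpoint(model=model, optimizer=opt)
path = ckpt.save('./ckpts/demo')  # returns a path like './ckpts/demo-1'

# Restore into freshly created objects with the same structure.
model2 = tf.keras.Sequential([tf.keras.layers.Dense(10)])
model2.build(input_shape=(None, 784))
opt2 = tf.keras.optimizers.Adam(learning_rate=0.01)
tf.train.Checkpoint(model=model2, optimizer=opt2).restore(path)
```

Because the restored objects mirror the saved ones, the reloaded weights match exactly; training could resume without the optimizer starting from scratch.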

1.3. Mode 2: save/load the entire model

1.3.1. Hands-on
import tensorflow as tf
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

def preprocess(x, y):
    """
    x is a single image, not a batch
    """
    x = tf.cast(x, dtype=tf.float32) / 255.
    x = tf.reshape(x, [28 * 28])
    y = tf.cast(y, dtype=tf.int32)
    y = tf.one_hot(y, depth=10)
    return x, y


batchsz = 128
(x, y), (x_val, y_val) = datasets.mnist.load_data()
print('datasets:', x.shape, y.shape, x.min(), x.max())

db = tf.data.Dataset.from_tensor_slices((x, y))
db = db.map(preprocess).shuffle(60000).batch(batchsz)
ds_val = tf.data.Dataset.from_tensor_slices((x_val, y_val))
ds_val = ds_val.map(preprocess).batch(batchsz)

sample = next(iter(db))
print(sample[0].shape, sample[1].shape)

network = Sequential([layers.Dense(256, activation='relu'),
                      layers.Dense(128, activation='relu'),
                      layers.Dense(64, activation='relu'),
                      layers.Dense(32, activation='relu'),
                      layers.Dense(10)])
network.build(input_shape=(None, 28 * 28))
network.summary()

network.compile(optimizer=optimizers.Adam(learning_rate=0.01),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])

network.fit(db, epochs=3, validation_data=ds_val, validation_freq=2)

network.evaluate(ds_val)

os.makedirs('./savemodel', exist_ok=True)  # save() fails if the directory is missing
network.save('./savemodel/model.h5')
print('saved total model.')
del network

print('load model from file')
network = tf.keras.models.load_model('./savemodel/model.h5')
network.compile(optimizer=optimizers.Adam(learning_rate=0.01),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])
x_val = tf.cast(x_val, dtype=tf.float32) / 255.
x_val = tf.reshape(x_val, [-1, 28 * 28])
y_val = tf.cast(y_val, dtype=tf.int32)
y_val = tf.one_hot(y_val, depth=10)
ds_val = tf.data.Dataset.from_tensor_slices((x_val, y_val)).batch(128)
network.evaluate(ds_val)

  • Output (progress-bar lines condensed to the per-epoch summaries):
C:\Anaconda3\envs\tf2\python.exe E:/Codes/MyCodes/TF2/save_load_model.py
datasets: (60000, 28, 28) (60000,) 0 255
(128, 784) (128, 10)
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                multiple                  200960    
_________________________________________________________________
dense_1 (Dense)              multiple                  32896     
_________________________________________________________________
dense_2 (Dense)              multiple                  8256      
_________________________________________________________________
dense_3 (Dense)              multiple                  2080      
_________________________________________________________________
dense_4 (Dense)              multiple                  330       
=================================================================
Total params: 244,522
Trainable params: 244,522
Non-trainable params: 0
_________________________________________________________________
Epoch 1/3

469/469 [==============================] - 4s 8ms/step - loss: 0.2689 - accuracy: 0.8525
Epoch 2/3

469/469 [==============================] - 4s 8ms/step - loss: 0.1376 - accuracy: 0.9584 - val_loss: 0.1267 - val_accuracy: 0.9669
Epoch 3/3

469/469 [==============================] - 3s 7ms/step - loss: 0.1079 - accuracy: 0.9707

79/79 [==============================] - 0s 3ms/step - loss: 0.1427 - accuracy: 0.9615
saved total model.
load model from file
WARNING: Logging before flag parsing goes to stderr.
W0505 12:25:20.992213  1504 hdf5_format.py:266] Sequential models without an `input_shape` passed to the first layer cannot reload their optimizer state. As a result, your model is starting with a freshly initialized optimizer.

79/79 [==============================] - 0s 5ms/step - loss: 0.1427 - accuracy: 0.9562

Process finished with exit code 0

1.4. Mode 3: saved_model
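This mode refers to TensorFlow's SavedModel format: a language-neutral export (a directory, not a single file) that production environments such as TF Serving or a C++ runtime can load without the original Python code. A minimal sketch, with an illustrative export path and a hypothetical `Exporter` module:

```python
import tensorflow as tf

class Exporter(tf.Module):
    """A tiny trackable module: one linear map, exported with a fixed signature."""
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.random.normal([784, 10]))

    @tf.function(input_signature=[tf.TensorSpec([None, 784], tf.float32)])
    def predict(self, x):
        return x @ self.w

module = Exporter()
tf.saved_model.save(module, './saved_model_demo')   # writes a SavedModel directory

# The restored object is not the original class, but its traced
# tf.function methods are callable with matching inputs.
restored = tf.saved_model.load('./saved_model_demo')
out = restored.predict(tf.zeros([1, 784]))
```

A Keras model can be exported the same way with `tf.saved_model.save(network, path)`; the fixed `input_signature` above is what lets a non-Python consumer call the graph.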

2. Implementing a Custom Network on CIFAR-10

  • For an introduction to CIFAR-10, see the earlier post: dataset loading and testing (tensors), hands-on!
  • This dataset is far harder than MNIST: 32*32 color images that are fairly blurry. At the time of writing the best results were around 80%, and a simple network like the one below reaches only about 60%, partly because the images themselves are blurry.
  • Below we use a custom layer, MyDense.
  • Note a difference between CIFAR-10 and MNIST here: the labels cannot be fed to one_hot directly because they carry an extra axis; tf.squeeze() squeezes that dimension out.
  • tf.squeeze() removes axes of size 1 from a tensor.
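The squeeze-then-one_hot step can be shown with a few made-up labels (the values here are illustrative, not from the dataset):

```python
import numpy as np
import tensorflow as tf

# CIFAR-10 labels arrive with shape [N, 1]; calling one_hot on them
# directly would yield [N, 1, 10]. Squeeze the size-1 axis first.
y = np.array([[3], [1], [0]])      # shape (3, 1), like datasets.cifar10 labels
y_sq = tf.squeeze(y)               # shape (3,)
y_oh = tf.one_hot(y_sq, depth=10)  # shape (3, 10)
```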

2.1. Test code 1:

import tensorflow as tf
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics
from tensorflow import keras
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'


def preprocess(x, y):
    # [0~255] => [0~1]
    x = tf.cast(x, dtype=tf.float32) / 255.
    y = tf.cast(y, dtype=tf.int32)
    return x,y


batchsz = 256
# [50k, 32, 32, 3], [10k, 1]
(x, y), (x_val, y_val) = datasets.cifar10.load_data()
y = tf.squeeze(y)
y_val = tf.squeeze(y_val)
y = tf.one_hot(y, depth=10) # [50k, 10]
y_val = tf.one_hot(y_val, depth=10) # [10k, 10]
print('datasets:', x.shape, y.shape, x_val.shape, y_val.shape, x.min(), x.max())

train_db = tf.data.Dataset.from_tensor_slices((x,y))
train_db = train_db.map(preprocess).shuffle(10000).batch(batchsz)
test_db = tf.data.Dataset.from_tensor_slices((x_val, y_val))
test_db = test_db.map(preprocess).batch(batchsz)

sample = next(iter(train_db))
print("batch: ", sample[0].shape, sample[1].shape)


# Below we build the network without the standard layers.Dense, using our own class.
# To plug into the standard Keras machinery, a custom layer subclasses layers.Layer.
# MyDense is a simple linear layer that only needs the input and output dimensions.
class MyDense(layers.Layer):
    # A stand-in for the standard layers.Dense()
    def __init__(self, inp_dim, outp_dim):
        super(MyDense, self).__init__()

        self.kernel = self.add_weight('w', [inp_dim, outp_dim])
        # self.bias = self.add_weight('b', [outp_dim])  # deliberately omitted in this custom layer

    def call(self, inputs, training=None):
        x = inputs @ self.kernel
        return x


# With the custom layer in place, we now build a custom network of 5 layers.
# MyNetwork composes MyDense layers; it could mix in other layers as well.
class MyNetwork(keras.Model):
    
    def __init__(self):
        super(MyNetwork, self).__init__()
        self.fc1 = MyDense(32*32*3, 256)   # the five custom layers
        self.fc2 = MyDense(256, 128)
        self.fc3 = MyDense(128, 64)
        self.fc4 = MyDense(64, 32)
        self.fc5 = MyDense(32, 10)

    def call(self, inputs, training=None):
        """
        :param inputs: [b, 32, 32, 3]
        :param training:
        :return:
        """
        x = tf.reshape(inputs, [-1, 32*32*3])
        # [b, 32*32*3] => [b, 256]
        x = self.fc1(x)      # __call__ dispatches to call()
        x = tf.nn.relu(x)    # the activation could instead be moved inside MyDense
        # [b, 256] => [b, 128]
        x = self.fc2(x)
        x = tf.nn.relu(x)
        # [b, 128] => [b, 64]
        x = self.fc3(x)
        x = tf.nn.relu(x)
        # [b, 64] => [b, 32]
        x = self.fc4(x)
        x = tf.nn.relu(x)
        # [b, 32] => [b, 10]
        x = self.fc5(x)

        return x

# Instantiate the network; no weights exist yet at this point.
network = MyNetwork()
# With the network object in hand, assemble (compile) it.
network.compile(optimizer=optimizers.Adam(learning_rate=1e-3),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])

network.fit(train_db, epochs=5, validation_data=test_db, validation_freq=1)

2.2. Test result 1 (progress-bar lines condensed):

C:\Anaconda3\envs\tf2\python.exe E:/Codes/MyCodes/TF2/keras_train.py
datasets: (50000, 32, 32, 3) (50000, 10) (10000, 32, 32, 3) (10000, 10) 0 255
batch:  (256, 32, 32, 3) (256, 10)
Epoch 1/5

  1/196 [..............................] - ETA: 3:06 - loss: 2.4048 - accuracy: 0.0586
 89/196 [============>.................] - ETA: 2s - loss: 2.1101 - accuracy: 0.1672
 93/196 [=============>................] - ETA: 2s - loss: 2.1024 - accuracy: 0.1696
 97/196 [=============>................] - ETA: 2s - loss: 2.0972 - accuracy: 0.1719
100/196 [==============>...............] - ETA: 2s - loss: 2.0913 - accuracy: 0.1736
103/196 [==============>...............] - ETA: 2s - loss: 2.0849 - accuracy: 0.1752
107/196 [===============>..............] - ETA: 2s - loss: 2.0786 - accuracy: 0.1773
111/196 [===============>..............] - ETA: 2s - loss: 2.0735 - accuracy: 0.1794
114/196 [================>.............] - ETA: 2s - loss: 2.0701 - accuracy: 0.1810
117/196 [================>.............] - ETA: 1s - loss: 2.0663 - accuracy: 0.1825
121/196 [=================>............] - ETA: 1s - loss: 2.0622 - accuracy: 0.1844
125/196 [==================>...........] - ETA: 1s - loss: 2.0579 - accuracy: 0.1862
129/196 [==================>...........] - ETA: 1s - loss: 2.0537 - accuracy: 0.1880
132/196 [===================>..........] - ETA: 1s - loss: 2.0510 - accuracy: 0.1893
136/196 [===================>..........] - ETA: 1s - loss: 2.0456 - accuracy: 0.1910
140/196 [====================>.........] - ETA: 1s - loss: 2.0405 - accuracy: 0.1927
144/196 [=====================>........] - ETA: 1s - loss: 2.0360 - accuracy: 0.1944
147/196 [=====================>........] - ETA: 1s - loss: 2.0318 - accuracy: 0.1956
151/196 [======================>.......] - ETA: 1s - loss: 2.0282 - accuracy: 0.1972
154/196 [======================>.......] - ETA: 0s - loss: 2.0248 - accuracy: 0.1983
158/196 [=======================>......] - ETA: 0s - loss: 2.0219 - accuracy: 0.1999
163/196 [=======================>......] - ETA: 0s - loss: 2.0177 - accuracy: 0.2017
168/196 [========================>.....] - ETA: 0s - loss: 2.0130 - accuracy: 0.2035
173/196 [=========================>....] - ETA: 0s - loss: 2.0087 - accuracy: 0.2052
177/196 [==========================>...] - ETA: 0s - loss: 2.0060 - accuracy: 0.2065
182/196 [==========================>...] - ETA: 0s - loss: 2.0020 - accuracy: 0.2082
187/196 [===========================>..] - ETA: 0s - loss: 1.9977 - accuracy: 0.2098
192/196 [============================>.] - ETA: 0s - loss: 1.9933 - accuracy: 0.2114
196/196 [==============================] - 4s 23ms/step - loss: 1.9902 - accuracy: 0.2130 - val_loss: 1.8445 - val_accuracy: 0.3282
Epoch 2/5

  1/196 [..............................] - ETA: 42s - loss: 1.8504 - accuracy: 0.3320
  5/196 [..............................] - ETA: 10s - loss: 1.7972 - accuracy: 0.3436
  8/196 [>.............................] - ETA: 7s - loss: 1.8112 - accuracy: 0.3476 
 12/196 [>.............................] - ETA: 6s - loss: 1.8053 - accuracy: 0.3506
 15/196 [=>............................] - ETA: 5s - loss: 1.8075 - accuracy: 0.3518
 18/196 [=>............................] - ETA: 4s - loss: 1.8062 - accuracy: 0.3532
 21/196 [==>...........................] - ETA: 4s - loss: 1.8019 - accuracy: 0.3538
 24/196 [==>...........................] - ETA: 4s - loss: 1.8037 - accuracy: 0.3540
 27/196 [===>..........................] - ETA: 4s - loss: 1.8008 - accuracy: 0.3542
 31/196 [===>..........................] - ETA: 3s - loss: 1.8032 - accuracy: 0.3543
 35/196 [====>.........................] - ETA: 3s - loss: 1.8025 - accuracy: 0.3545
 38/196 [====>.........................] - ETA: 3s - loss: 1.8008 - accuracy: 0.3546
 42/196 [=====>........................] - ETA: 3s - loss: 1.7954 - accuracy: 0.3550
 46/196 [======>.......................] - ETA: 3s - loss: 1.7945 - accuracy: 0.3556
 50/196 [======>.......................] - ETA: 3s - loss: 1.7945 - accuracy: 0.3561
 53/196 [=======>......................] - ETA: 2s - loss: 1.7956 - accuracy: 0.3565
 56/196 [=======>......................] - ETA: 2s - loss: 1.7915 - accuracy: 0.3568
 60/196 [========>.....................] - ETA: 2s - loss: 1.7898 - accuracy: 0.3572
 64/196 [========>.....................] - ETA: 2s - loss: 1.7871 - accuracy: 0.3575
 68/196 [=========>....................] - ETA: 2s - loss: 1.7879 - accuracy: 0.3579
 71/196 [=========>....................] - ETA: 2s - loss: 1.7844 - accuracy: 0.3582
 75/196 [==========>...................] - ETA: 2s - loss: 1.7848 - accuracy: 0.3586
 79/196 [===========>..................] - ETA: 2s - loss: 1.7831 - accuracy: 0.3590
 83/196 [===========>..................] - ETA: 2s - loss: 1.7816 - accuracy: 0.3593
 86/196 [============>.................] - ETA: 2s - loss: 1.7825 - accuracy: 0.3595
 90/196 [============>.................] - ETA: 2s - loss: 1.7817 - accuracy: 0.3597
 94/196 [=============>................] - ETA: 1s - loss: 1.7823 - accuracy: 0.3600
 98/196 [==============>...............] - ETA: 1s - loss: 1.7828 - accuracy: 0.3602
101/196 [==============>...............] - ETA: 1s - loss: 1.7818 - accuracy: 0.3603
105/196 [===============>..............] - ETA: 1s - loss: 1.7798 - accuracy: 0.3604
109/196 [===============>..............] - ETA: 1s - loss: 1.7801 - accuracy: 0.3606
113/196 [================>.............] - ETA: 1s - loss: 1.7793 - accuracy: 0.3607
116/196 [================>.............] - ETA: 1s - loss: 1.7783 - accuracy: 0.3608
119/196 [=================>............] - ETA: 1s - loss: 1.7786 - accuracy: 0.3608
123/196 [=================>............] - ETA: 1s - loss: 1.7781 - accuracy: 0.3609
127/196 [==================>...........] - ETA: 1s - loss: 1.7778 - accuracy: 0.3610
131/196 [===================>..........] - ETA: 1s - loss: 1.7780 - accuracy: 0.3611
135/196 [===================>..........] - ETA: 1s - loss: 1.7763 - accuracy: 0.3612
139/196 [====================>.........] - ETA: 1s - loss: 1.7742 - accuracy: 0.3613
143/196 [====================>.........] - ETA: 0s - loss: 1.7719 - accuracy: 0.3615
147/196 [=====================>........] - ETA: 0s - loss: 1.7694 - accuracy: 0.3616
150/196 [=====================>........] - ETA: 0s - loss: 1.7686 - accuracy: 0.3617
153/196 [======================>.......] - ETA: 0s - loss: 1.7669 - accuracy: 0.3619
157/196 [=======================>......] - ETA: 0s - loss: 1.7673 - accuracy: 0.3620
162/196 [=======================>......] - ETA: 0s - loss: 1.7661 - accuracy: 0.3622
167/196 [========================>.....] - ETA: 0s - loss: 1.7655 - accuracy: 0.3624
172/196 [=========================>....] - ETA: 0s - loss: 1.7628 - accuracy: 0.3626
177/196 [==========================>...] - ETA: 0s - loss: 1.7624 - accuracy: 0.3627
181/196 [==========================>...] - ETA: 0s - loss: 1.7611 - accuracy: 0.3629
185/196 [===========================>..] - ETA: 0s - loss: 1.7597 - accuracy: 0.3630
190/196 [============================>.] - ETA: 0s - loss: 1.7593 - accuracy: 0.3632
194/196 [============================>.] - ETA: 0s - loss: 1.7589 - accuracy: 0.3633
196/196 [==============================] - 4s 18ms/step - loss: 1.7583 - accuracy: 0.3634 - val_loss: 1.7307 - val_accuracy: 0.3812
Epoch 3/5

  1/196 [..............................] - ETA: 42s - loss: 1.7493 - accuracy: 0.3945
  5/196 [..............................] - ETA: 10s - loss: 1.6978 - accuracy: 0.3869
  9/196 [>.............................] - ETA: 7s - loss: 1.7038 - accuracy: 0.3875 
 13/196 [>.............................] - ETA: 5s - loss: 1.7084 - accuracy: 0.3870
 17/196 [=>............................] - ETA: 4s - loss: 1.6945 - accuracy: 0.3876
 21/196 [==>...........................] - ETA: 4s - loss: 1.6928 - accuracy: 0.3880
 24/196 [==>...........................] - ETA: 4s - loss: 1.6969 - accuracy: 0.3874
 28/196 [===>..........................] - ETA: 3s - loss: 1.6944 - accuracy: 0.3869
 32/196 [===>..........................] - ETA: 3s - loss: 1.6959 - accuracy: 0.3868
 36/196 [====>.........................] - ETA: 3s - loss: 1.6983 - accuracy: 0.3868
 40/196 [=====>........................] - ETA: 3s - loss: 1.6967 - accuracy: 0.3870
 43/196 [=====>........................] - ETA: 3s - loss: 1.6949 - accuracy: 0.3872
 47/196 [======>.......................] - ETA: 3s - loss: 1.6951 - accuracy: 0.3876
 50/196 [======>.......................] - ETA: 2s - loss: 1.6945 - accuracy: 0.3878
 54/196 [=======>......................] - ETA: 2s - loss: 1.6937 - accuracy: 0.3880
 58/196 [=======>......................] - ETA: 2s - loss: 1.6902 - accuracy: 0.3881
 61/196 [========>.....................] - ETA: 2s - loss: 1.6904 - accuracy: 0.3883
 65/196 [========>.....................] - ETA: 2s - loss: 1.6902 - accuracy: 0.3884
 69/196 [=========>....................] - ETA: 2s - loss: 1.6911 - accuracy: 0.3885
 73/196 [==========>...................] - ETA: 2s - loss: 1.6901 - accuracy: 0.3887
 76/196 [==========>...................] - ETA: 2s - loss: 1.6905 - accuracy: 0.3888
 79/196 [===========>..................] - ETA: 2s - loss: 1.6884 - accuracy: 0.3889
 83/196 [===========>..................] - ETA: 2s - loss: 1.6875 - accuracy: 0.3891
 87/196 [============>.................] - ETA: 2s - loss: 1.6901 - accuracy: 0.3892
 91/196 [============>.................] - ETA: 1s - loss: 1.6891 - accuracy: 0.3892
 95/196 [=============>................] - ETA: 1s - loss: 1.6910 - accuracy: 0.3893
 99/196 [==============>...............] - ETA: 1s - loss: 1.6908 - accuracy: 0.3894
103/196 [==============>...............] - ETA: 1s - loss: 1.6880 - accuracy: 0.3895
106/196 [===============>..............] - ETA: 1s - loss: 1.6857 - accuracy: 0.3895
110/196 [===============>..............] - ETA: 1s - loss: 1.6855 - accuracy: 0.3896
114/196 [================>.............] - ETA: 1s - loss: 1.6847 - accuracy: 0.3897
118/196 [=================>............] - ETA: 1s - loss: 1.6848 - accuracy: 0.3899
122/196 [=================>............] - ETA: 1s - loss: 1.6841 - accuracy: 0.3900
126/196 [==================>...........] - ETA: 1s - loss: 1.6851 - accuracy: 0.3901
130/196 [==================>...........] - ETA: 1s - loss: 1.6851 - accuracy: 0.3902
134/196 [===================>..........] - ETA: 1s - loss: 1.6859 - accuracy: 0.3903
137/196 [===================>..........] - ETA: 1s - loss: 1.6833 - accuracy: 0.3904
140/196 [====================>.........] - ETA: 1s - loss: 1.6834 - accuracy: 0.3905
144/196 [=====================>........] - ETA: 0s - loss: 1.6805 - accuracy: 0.3906
148/196 [=====================>........] - ETA: 0s - loss: 1.6789 - accuracy: 0.3908
152/196 [======================>.......] - ETA: 0s - loss: 1.6782 - accuracy: 0.3910
156/196 [======================>.......] - ETA: 0s - loss: 1.6775 - accuracy: 0.3912
161/196 [=======================>......] - ETA: 0s - loss: 1.6768 - accuracy: 0.3914
166/196 [========================>.....] - ETA: 0s - loss: 1.6767 - accuracy: 0.3916
170/196 [=========================>....] - ETA: 0s - loss: 1.6755 - accuracy: 0.3918
175/196 [=========================>....] - ETA: 0s - loss: 1.6758 - accuracy: 0.3920
180/196 [==========================>...] - ETA: 0s - loss: 1.6746 - accuracy: 0.3922
185/196 [===========================>..] - ETA: 0s - loss: 1.6744 - accuracy: 0.3924
190/196 [============================>.] - ETA: 0s - loss: 1.6749 - accuracy: 0.3926
194/196 [============================>.] - ETA: 0s - loss: 1.6747 - accuracy: 0.3927
196/196 [==============================] - 4s 18ms/step - loss: 1.6746 - accuracy: 0.3928 - val_loss: 1.6366 - val_accuracy: 0.4122
Epoch 4/5

  1/196 [..............................] - ETA: 42s - loss: 1.6521 - accuracy: 0.3867
  5/196 [..............................] - ETA: 10s - loss: 1.6446 - accuracy: 0.3843
  9/196 [>.............................] - ETA: 7s - loss: 1.6475 - accuracy: 0.3900 
 13/196 [>.............................] - ETA: 5s - loss: 1.6477 - accuracy: 0.3913
 17/196 [=>............................] - ETA: 5s - loss: 1.6350 - accuracy: 0.3940
 21/196 [==>...........................] - ETA: 4s - loss: 1.6333 - accuracy: 0.3958
 25/196 [==>...........................] - ETA: 4s - loss: 1.6305 - accuracy: 0.3966
 29/196 [===>..........................] - ETA: 3s - loss: 1.6306 - accuracy: 0.3976
 33/196 [====>.........................] - ETA: 3s - loss: 1.6288 - accuracy: 0.3987
 36/196 [====>.........................] - ETA: 3s - loss: 1.6258 - accuracy: 0.3996
 40/196 [=====>........................] - ETA: 3s - loss: 1.6237 - accuracy: 0.4008
 44/196 [=====>........................] - ETA: 3s - loss: 1.6191 - accuracy: 0.4019
 48/196 [======>.......................] - ETA: 3s - loss: 1.6192 - accuracy: 0.4028
 52/196 [======>.......................] - ETA: 2s - loss: 1.6204 - accuracy: 0.4035
 56/196 [=======>......................] - ETA: 2s - loss: 1.6190 - accuracy: 0.4041
 60/196 [========>.....................] - ETA: 2s - loss: 1.6207 - accuracy: 0.4046
 63/196 [========>.....................] - ETA: 2s - loss: 1.6198 - accuracy: 0.4050
 67/196 [=========>....................] - ETA: 2s - loss: 1.6216 - accuracy: 0.4054
 71/196 [=========>....................] - ETA: 2s - loss: 1.6206 - accuracy: 0.4058
 74/196 [==========>...................] - ETA: 2s - loss: 1.6231 - accuracy: 0.4061
 77/196 [==========>...................] - ETA: 2s - loss: 1.6236 - accuracy: 0.4064
 80/196 [===========>..................] - ETA: 2s - loss: 1.6221 - accuracy: 0.4067
 84/196 [===========>..................] - ETA: 2s - loss: 1.6215 - accuracy: 0.4071
 87/196 [============>.................] - ETA: 2s - loss: 1.6233 - accuracy: 0.4073
 90/196 [============>.................] - ETA: 1s - loss: 1.6222 - accuracy: 0.4075
 93/196 [=============>................] - ETA: 1s - loss: 1.6221 - accuracy: 0.4078
 97/196 [=============>................] - ETA: 1s - loss: 1.6242 - accuracy: 0.4080
100/196 [==============>...............] - ETA: 1s - loss: 1.6229 - accuracy: 0.4082
104/196 [==============>...............] - ETA: 1s - loss: 1.6202 - accuracy: 0.4085
108/196 [===============>..............] - ETA: 1s - loss: 1.6198 - accuracy: 0.4087
112/196 [================>.............] - ETA: 1s - loss: 1.6178 - accuracy: 0.4090
115/196 [================>.............] - ETA: 1s - loss: 1.6172 - accuracy: 0.4092
119/196 [=================>............] - ETA: 1s - loss: 1.6168 - accuracy: 0.4095
122/196 [=================>............] - ETA: 1s - loss: 1.6167 - accuracy: 0.4097
126/196 [==================>...........] - ETA: 1s - loss: 1.6177 - accuracy: 0.4099
130/196 [==================>...........] - ETA: 1s - loss: 1.6179 - accuracy: 0.4102
134/196 [===================>..........] - ETA: 1s - loss: 1.6187 - accuracy: 0.4104
138/196 [====================>.........] - ETA: 1s - loss: 1.6160 - accuracy: 0.4106
142/196 [====================>.........] - ETA: 0s - loss: 1.6157 - accuracy: 0.4108
146/196 [=====================>........] - ETA: 0s - loss: 1.6139 - accuracy: 0.4111
150/196 [=====================>........] - ETA: 0s - loss: 1.6132 - accuracy: 0.4113
154/196 [======================>.......] - ETA: 0s - loss: 1.6113 - accuracy: 0.4116
158/196 [=======================>......] - ETA: 0s - loss: 1.6119 - accuracy: 0.4118
162/196 [=======================>......] - ETA: 0s - loss: 1.6105 - accuracy: 0.4120
167/196 [========================>.....] - ETA: 0s - loss: 1.6105 - accuracy: 0.4123
172/196 [=========================>....] - ETA: 0s - loss: 1.6091 - accuracy: 0.4126
177/196 [==========================>...] - ETA: 0s - loss: 1.6105 - accuracy: 0.4129
182/196 [==========================>...] - ETA: 0s - loss: 1.6099 - accuracy: 0.4131
187/196 [===========================>..] - ETA: 0s - loss: 1.6091 - accuracy: 0.4134
192/196 [============================>.] - ETA: 0s - loss: 1.6084 - accuracy: 0.4136
196/196 [==============================] - 4s 18ms/step - loss: 1.6081 - accuracy: 0.4138 - val_loss: 1.5944 - val_accuracy: 0.4300
Epoch 5/5

  1/196 [..............................] - ETA: 42s - loss: 1.6218 - accuracy: 0.4141
  5/196 [..............................] - ETA: 10s - loss: 1.5962 - accuracy: 0.4071
  9/196 [>.............................] - ETA: 7s - loss: 1.5895 - accuracy: 0.4151 
 13/196 [>.............................] - ETA: 5s - loss: 1.5934 - accuracy: 0.4173
 17/196 [=>............................] - ETA: 4s - loss: 1.5799 - accuracy: 0.4197
 20/196 [==>...........................] - ETA: 4s - loss: 1.5768 - accuracy: 0.4210
 24/196 [==>...........................] - ETA: 4s - loss: 1.5875 - accuracy: 0.4218
 28/196 [===>..........................] - ETA: 3s - loss: 1.5840 - accuracy: 0.4225
 31/196 [===>..........................] - ETA: 3s - loss: 1.5855 - accuracy: 0.4230
 35/196 [====>.........................] - ETA: 3s - loss: 1.5835 - accuracy: 0.4239
 39/196 [====>.........................] - ETA: 3s - loss: 1.5827 - accuracy: 0.4246
 43/196 [=====>........................] - ETA: 3s - loss: 1.5817 - accuracy: 0.4251
 47/196 [======>.......................] - ETA: 3s - loss: 1.5802 - accuracy: 0.4257
 51/196 [======>.......................] - ETA: 2s - loss: 1.5801 - accuracy: 0.4262
 55/196 [=======>......................] - ETA: 2s - loss: 1.5796 - accuracy: 0.4265
 59/196 [========>.....................] - ETA: 2s - loss: 1.5763 - accuracy: 0.4268
 62/196 [========>.....................] - ETA: 2s - loss: 1.5774 - accuracy: 0.4270
 66/196 [=========>....................] - ETA: 2s - loss: 1.5774 - accuracy: 0.4273
 70/196 [=========>....................] - ETA: 2s - loss: 1.5769 - accuracy: 0.4276
 74/196 [==========>...................] - ETA: 2s - loss: 1.5789 - accuracy: 0.4279
 78/196 [==========>...................] - ETA: 2s - loss: 1.5779 - accuracy: 0.4281
 82/196 [===========>..................] - ETA: 2s - loss: 1.5767 - accuracy: 0.4284
 85/196 [============>.................] - ETA: 2s - loss: 1.5789 - accuracy: 0.4286
 89/196 [============>.................] - ETA: 1s - loss: 1.5794 - accuracy: 0.4288
 92/196 [=============>................] - ETA: 1s - loss: 1.5775 - accuracy: 0.4289
 95/196 [=============>................] - ETA: 1s - loss: 1.5802 - accuracy: 0.4290
 99/196 [==============>...............] - ETA: 1s - loss: 1.5793 - accuracy: 0.4291
103/196 [==============>...............] - ETA: 1s - loss: 1.5768 - accuracy: 0.4292
107/196 [===============>..............] - ETA: 1s - loss: 1.5755 - accuracy: 0.4294
111/196 [===============>..............] - ETA: 1s - loss: 1.5749 - accuracy: 0.4295
115/196 [================>.............] - ETA: 1s - loss: 1.5728 - accuracy: 0.4297
119/196 [=================>............] - ETA: 1s - loss: 1.5723 - accuracy: 0.4298
122/196 [=================>............] - ETA: 1s - loss: 1.5720 - accuracy: 0.4299
126/196 [==================>...........] - ETA: 1s - loss: 1.5728 - accuracy: 0.4301
130/196 [==================>...........] - ETA: 1s - loss: 1.5732 - accuracy: 0.4302
134/196 [===================>..........] - ETA: 1s - loss: 1.5746 - accuracy: 0.4303
138/196 [====================>.........] - ETA: 1s - loss: 1.5723 - accuracy: 0.4304
142/196 [====================>.........] - ETA: 0s - loss: 1.5717 - accuracy: 0.4305
146/196 [=====================>........] - ETA: 0s - loss: 1.5702 - accuracy: 0.4307
150/196 [=====================>........] - ETA: 0s - loss: 1.5696 - accuracy: 0.4308
154/196 [======================>.......] - ETA: 0s - loss: 1.5676 - accuracy: 0.4310
158/196 [=======================>......] - ETA: 0s - loss: 1.5684 - accuracy: 0.4311
163/196 [=======================>......] - ETA: 0s - loss: 1.5670 - accuracy: 0.4313
168/196 [========================>.....] - ETA: 0s - loss: 1.5664 - accuracy: 0.4315
172/196 [=========================>....] - ETA: 0s - loss: 1.5663 - accuracy: 0.4317
177/196 [==========================>...] - ETA: 0s - loss: 1.5679 - accuracy: 0.4318
182/196 [==========================>...] - ETA: 0s - loss: 1.5675 - accuracy: 0.4320
187/196 [===========================>..] - ETA: 0s - loss: 1.5673 - accuracy: 0.4322
192/196 [============================>.] - ETA: 0s - loss: 1.5663 - accuracy: 0.4323
196/196 [==============================] - 4s 18ms/step - loss: 1.5662 - accuracy: 0.4325 - val_loss: 1.5695 - val_accuracy: 0.4419

Process finished with exit code 0

2.3、Test Code 2:

  • To improve accuracy, make the following adjustments!
  1. Normalize the input data to [-1, 1]. Scaling to [0, 1] is actually not ideal here; mapping [0~255] => [-1~1] is probably the range best suited to optimizing a neural network.
  2. Increase the parameter count; training results may improve.
  3. Raise the epochs to 100, and run this test on a server!
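The mapping in point 1 can be checked in isolation. A minimal NumPy sketch (the `normalize` helper name is ours, but the formula is the same one used in `preprocess()` below):

```python
import numpy as np

def normalize(x):
    # Map pixel values from [0, 255] to [-1, 1]
    return 2 * np.asarray(x, dtype=np.float32) / 255. - 1.

# The endpoints 0 and 255 map exactly to -1 and 1, with 127.5 at the center 0.
print(normalize([0, 127.5, 255]))
```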
import tensorflow as tf
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics
from tensorflow import keras
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

def preprocess(x, y):
    # Scaling to [0, 1] is actually not ideal here; [0~255] => [-1~1] is probably the best-suited range.
    x = 2 * tf.cast(x, dtype=tf.float32) / 255. - 1.
    y = tf.cast(y, dtype=tf.int32)
    return x, y


batchsz = 256
# [50k, 32, 32, 3], [10k, 1]
(x, y), (x_val, y_val) = datasets.cifar10.load_data()
y = tf.squeeze(y)
y_val = tf.squeeze(y_val)
y = tf.one_hot(y, depth=10) # [50k, 10]
y_val = tf.one_hot(y_val, depth=10) # [10k, 10]
print('datasets:', x.shape, y.shape, x_val.shape, y_val.shape, x.min(), x.max())

train_db = tf.data.Dataset.from_tensor_slices((x,y))
train_db = train_db.map(preprocess).shuffle(10000).batch(batchsz)
test_db = tf.data.Dataset.from_tensor_slices((x_val, y_val))
test_db = test_db.map(preprocess).batch(batchsz)

sample = next(iter(train_db))
print("batch: ", sample[0].shape, sample[1].shape)
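The `tf.squeeze` before `tf.one_hot` above matters: `cifar10.load_data()` returns labels of shape [N, 1], and one-hot encoding that shape directly would yield [N, 1, 10] rather than the [N, 10] the loss expects. A small NumPy sketch of the same shape logic (using `np.eye` as a stand-in for one-hot):

```python
import numpy as np

# CIFAR10 labels arrive with shape [N, 1]
y = np.array([[3], [1]])

# Squeeze to [N] first, then one-hot to [N, 10]
y_flat = np.squeeze(y)            # shape (2,)
one_hot = np.eye(10)[y_flat]      # shape (2, 10)
print(one_hot.shape)              # prints (2, 10)

# Without the squeeze, the result picks up an extra axis: (2, 1, 10)
print(np.eye(10)[y].shape)        # prints (2, 1, 10)
```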


# Build the network without the standard layers.Dense; use our own class instead.
# To reuse the standard Keras machinery, subclass the base class layers.Layer.
# This is our custom minimal linear layer; it only needs the input and output dimensions.
class MyDense(layers.Layer):
    # a replacement for the standard layers.Dense()
    def __init__(self, inp_dim, outp_dim):
        super(MyDense, self).__init__()

        # add_variable is deprecated; add_weight is the current API
        self.kernel = self.add_weight('w', [inp_dim, outp_dim])
        # self.bias = self.add_weight('b', [outp_dim])  # the bias is deliberately omitted in this custom layer

    def call(self, inputs, training=None):
        x = inputs @ self.kernel
        return x


# With the custom layer implemented, we now implement a custom network containing 5 layers.
# MyNetwork uses the MyDense layer here, but it could also combine other layers into one network.
class MyNetwork(keras.Model):

    def __init__(self):
        super(MyNetwork, self).__init__()
        self.fc1 = MyDense(32*32*3, 256)   # the 5 newly created layers
        self.fc2 = MyDense(256, 256)
        self.fc3 = MyDense(256, 256)
        self.fc4 = MyDense(256, 256)
        self.fc5 = MyDense(256, 10)
        # ============= the parameter count is increased here; training results may improve.

    def call(self, inputs, training=None):
        """

        :param inputs: [b, 32, 32, 3]
        :param training:
        :return:
        """
        x = tf.reshape(inputs, [-1, 32*32*3])
        # [b, 32*32*3] => [b, 256]
        x = self.fc1(x)      # fc1(x) invokes __call__, which dispatches to call()
        x = tf.nn.relu(x)    # the activation could also be written inside MyDense
        # [b, 256] => [b, 256]
        x = self.fc2(x)
        x = tf.nn.relu(x)
        # [b, 256] => [b, 256]
        x = self.fc3(x)
        x = tf.nn.relu(x)
        # [b, 256] => [b, 256]
        x = self.fc4(x)
        x = tf.nn.relu(x)
        # [b, 256] => [b, 10]
        x = self.fc5(x)

        return x

# Create a network instance; at this point no parameters have been built yet.
network = MyNetwork()
# With the network created, assemble (compile) it.
network.compile(optimizer=optimizers.Adam(lr=1e-3),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])

network.fit(train_db, epochs=100, validation_data=test_db, validation_freq=1)

network.evaluate(test_db)

network.save_weights('ckpt/weights.ckpt')   # the file name and suffix are arbitrary
del network                                 # delete the network entirely
print('saved to ckpt/weights.ckpt')


# Recreate the network: since only the weights were saved, the architecture must be rebuilt first.
network = MyNetwork()
network.compile(optimizer=optimizers.Adam(lr=1e-3),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])
network.load_weights('ckpt/weights.ckpt')
print('loaded weights from file.')
# Run the test again on the same dataset.
network.evaluate(test_db)

2.4、Test Results 2:

ssh://[email protected]:22/home/zhangkf/anaconda3/envs/tf2gpu/bin/python -u /home/zhangkf/tf1/TF2/keras_train.py
datasets: (50000, 32, 32, 3) (50000, 10) (10000, 32, 32, 3) (10000, 10) 0 255
batch:  (256, 32, 32, 3) (256, 10)
Epoch 1/100
196/196 [==============================] - 7s 37ms/step - loss: 1.7186 - accuracy: 0.3382 - val_loss: 1.5680 - val_accuracy: 0.4423
Epoch 2/100
196/196 [==============================] - 6s 32ms/step - loss: 1.4942 - accuracy: 0.4604 - val_loss: 1.4738 - val_accuracy: 0.4846
Epoch 3/100
196/196 [==============================] - 7s 33ms/step - loss: 1.3767 - accuracy: 0.5105 - val_loss: 1.4411 - val_accuracy: 0.4921
Epoch 4/100
196/196 [==============================] - 7s 34ms/step - loss: 1.2899 - accuracy: 0.5433 - val_loss: 1.4346 - val_accuracy: 0.4982
Epoch 5/100
196/196 [==============================] - 7s 35ms/step - loss: 1.2098 - accuracy: 0.5756 - val_loss: 1.4449 - val_accuracy: 0.5011
Epoch 6/100
196/196 [==============================] - 7s 34ms/step - loss: 1.1403 - accuracy: 0.5974 - val_loss: 1.4457 - val_accuracy: 0.5029
Epoch 7/100
196/196 [==============================] - 6s 33ms/step - loss: 1.0750 - accuracy: 0.6230 - val_loss: 1.4638 - val_accuracy: 0.5016
Epoch 8/100
196/196 [==============================] - 6s 32ms/step - loss: 1.0248 - accuracy: 0.6418 - val_loss: 1.4997 - val_accuracy: 0.5004
Epoch 9/100
196/196 [==============================] - 7s 34ms/step - loss: 0.9679 - accuracy: 0.6619 - val_loss: 1.5376 - val_accuracy: 0.4943
Epoch 10/100
196/196 [==============================] - 7s 34ms/step - loss: 0.9331 - accuracy: 0.6737 - val_loss: 1.6079 - val_accuracy: 0.4900
Epoch 11/100
196/196 [==============================] - 6s 30ms/step - loss: 0.8987 - accuracy: 0.6835 - val_loss: 1.6212 - val_accuracy: 0.4983
Epoch 12/100
196/196 [==============================] - 6s 30ms/step - loss: 0.8581 - accuracy: 0.6953 - val_loss: 1.6736 - val_accuracy: 0.4973
Epoch 13/100
196/196 [==============================] - 6s 29ms/step - loss: 0.8204 - accuracy: 0.7089 - val_loss: 1.7515 - val_accuracy: 0.4917
Epoch 14/100
196/196 [==============================] - 7s 34ms/step - loss: 0.7912 - accuracy: 0.7218 - val_loss: 1.7996 - val_accuracy: 0.4867
Epoch 15/100
196/196 [==============================] - 7s 36ms/step - loss: 0.7640 - accuracy: 0.7257 - val_loss: 1.8401 - val_accuracy: 0.4897
Epoch 16/100
196/196 [==============================] - 7s 34ms/step - loss: 0.7333 - accuracy: 0.7365 - val_loss: 1.8550 - val_accuracy: 0.4910
Epoch 17/100
196/196 [==============================] - 7s 35ms/step - loss: 0.7097 - accuracy: 0.7502 - val_loss: 1.9408 - val_accuracy: 0.4865
Epoch 18/100
196/196 [==============================] - 6s 31ms/step - loss: 0.6825 - accuracy: 0.7575 - val_loss: 1.9862 - val_accuracy: 0.4890
Epoch 19/100
196/196 [==============================] - 7s 35ms/step - loss: 0.6543 - accuracy: 0.7672 - val_loss: 2.0883 - val_accuracy: 0.4808
Epoch 20/100
196/196 [==============================] - 6s 29ms/step - loss: 0.6261 - accuracy: 0.7760 - val_loss: 2.1840 - val_accuracy: 0.4740
Epoch 21/100
196/196 [==============================] - 6s 29ms/step - loss: 0.6076 - accuracy: 0.7799 - val_loss: 2.2328 - val_accuracy: 0.4754
Epoch 22/100
196/196 [==============================] - 6s 30ms/step - loss: 0.5753 - accuracy: 0.7978 - val_loss: 2.3189 - val_accuracy: 0.4714
Epoch 23/100
196/196 [==============================] - 7s 35ms/step - loss: 0.5642 - accuracy: 0.7972 - val_loss: 2.3504 - val_accuracy: 0.4723
Epoch 24/100
196/196 [==============================] - 7s 35ms/step - loss: 0.5517 - accuracy: 0.8000 - val_loss: 2.4260 - val_accuracy: 0.4656
Epoch 25/100
196/196 [==============================] - 7s 33ms/step - loss: 0.5353 - accuracy: 0.8124 - val_loss: 2.4914 - val_accuracy: 0.4660
Epoch 26/100
196/196 [==============================] - 7s 34ms/step - loss: 0.5298 - accuracy: 0.8110 - val_loss: 2.5469 - val_accuracy: 0.4630
Epoch 27/100
196/196 [==============================] - 7s 34ms/step - loss: 0.5117 - accuracy: 0.8149 - val_loss: 2.4893 - val_accuracy: 0.4755
Epoch 28/100
196/196 [==============================] - 7s 36ms/step - loss: 0.5030 - accuracy: 0.8210 - val_loss: 2.4983 - val_accuracy: 0.4830
Epoch 29/100
196/196 [==============================] - 7s 34ms/step - loss: 0.4778 - accuracy: 0.8295 - val_loss: 2.5940 - val_accuracy: 0.4723
Epoch 30/100
196/196 [==============================] - 7s 35ms/step - loss: 0.4644 - accuracy: 0.8314 - val_loss: 2.7092 - val_accuracy: 0.4788
Epoch 31/100
196/196 [==============================] - 6s 33ms/step - loss: 0.4422 - accuracy: 0.8427 - val_loss: 2.7854 - val_accuracy: 0.4819
Epoch 32/100
196/196 [==============================] - 7s 35ms/step - loss: 0.4319 - accuracy: 0.8434 - val_loss: 2.7766 - val_accuracy: 0.4751
Epoch 33/100
196/196 [==============================] - 7s 34ms/step - loss: 0.4172 - accuracy: 0.8508 - val_loss: 2.8281 - val_accuracy: 0.4791
Epoch 34/100
196/196 [==============================] - 7s 34ms/step - loss: 0.3999 - accuracy: 0.8584 - val_loss: 2.9011 - val_accuracy: 0.4781
Epoch 35/100
196/196 [==============================] - 6s 33ms/step - loss: 0.3896 - accuracy: 0.8667 - val_loss: 2.9909 - val_accuracy: 0.4802
Epoch 36/100
196/196 [==============================] - 6s 30ms/step - loss: 0.3858 - accuracy: 0.8614 - val_loss: 3.1531 - val_accuracy: 0.4738
Epoch 37/100
196/196 [==============================] - 6s 33ms/step - loss: 0.3799 - accuracy: 0.8654 - val_loss: 3.1965 - val_accuracy: 0.4758
Epoch 38/100
196/196 [==============================] - 7s 34ms/step - loss: 0.3690 - accuracy: 0.8687 - val_loss: 3.2377 - val_accuracy: 0.4766
Epoch 39/100
196/196 [==============================] - 5s 27ms/step - loss: 0.3681 - accuracy: 0.8698 - val_loss: 3.2637 - val_accuracy: 0.4778
Epoch 40/100
196/196 [==============================] - 6s 29ms/step - loss: 0.3670 - accuracy: 0.8660 - val_loss: 3.1468 - val_accuracy: 0.4827
Epoch 41/100
196/196 [==============================] - 7s 34ms/step - loss: 0.3523 - accuracy: 0.8731 - val_loss: 3.0965 - val_accuracy: 0.4896
Epoch 42/100
196/196 [==============================] - 6s 32ms/step - loss: 0.3431 - accuracy: 0.8792 - val_loss: 3.2617 - val_accuracy: 0.4880
Epoch 43/100
196/196 [==============================] - 7s 34ms/step - loss: 0.3245 - accuracy: 0.8869 - val_loss: 3.2619 - val_accuracy: 0.4857
Epoch 44/100
196/196 [==============================] - 7s 33ms/step - loss: 0.3237 - accuracy: 0.8879 - val_loss: 3.3753 - val_accuracy: 0.4814
Epoch 45/100
196/196 [==============================] - 6s 33ms/step - loss: 0.3092 - accuracy: 0.8912 - val_loss: 3.5018 - val_accuracy: 0.4854
Epoch 46/100
196/196 [==============================] - 7s 33ms/step - loss: 0.3046 - accuracy: 0.8942 - val_loss: 3.4471 - val_accuracy: 0.4888
Epoch 47/100
196/196 [==============================] - 6s 33ms/step - loss: 0.2889 - accuracy: 0.8992 - val_loss: 3.5435 - val_accuracy: 0.4825
Epoch 48/100
196/196 [==============================] - 7s 34ms/step - loss: 0.2820 - accuracy: 0.9013 - val_loss: 3.5380 - val_accuracy: 0.4820
Epoch 49/100
196/196 [==============================] - 7s 34ms/step - loss: 0.2727 - accuracy: 0.9048 - val_loss: 3.6659 - val_accuracy: 0.4790
Epoch 50/100
196/196 [==============================] - 7s 34ms/step - loss: 0.2780 - accuracy: 0.9020 - val_loss: 3.6552 - val_accuracy: 0.4823
Epoch 51/100
196/196 [==============================] - 6s 33ms/step - loss: 0.2747 - accuracy: 0.9022 - val_loss: 3.6271 - val_accuracy: 0.4781
Epoch 52/100
196/196 [==============================] - 6s 29ms/step - loss: 0.2579 - accuracy: 0.9069 - val_loss: 3.8432 - val_accuracy: 0.4793
Epoch 53/100
196/196 [==============================] - 6s 29ms/step - loss: 0.2560 - accuracy: 0.9111 - val_loss: 3.7360 - val_accuracy: 0.4757
Epoch 54/100
196/196 [==============================] - 6s 29ms/step - loss: 0.2561 - accuracy: 0.9110 - val_loss: 3.8931 - val_accuracy: 0.4768
Epoch 55/100
196/196 [==============================] - 6s 33ms/step - loss: 0.2502 - accuracy: 0.9114 - val_loss: 3.9386 - val_accuracy: 0.4748
Epoch 56/100
196/196 [==============================] - 7s 35ms/step - loss: 0.2295 - accuracy: 0.9196 - val_loss: 3.9047 - val_accuracy: 0.4764
Epoch 57/100
196/196 [==============================] - 6s 33ms/step - loss: 0.2322 - accuracy: 0.9190 - val_loss: 4.0256 - val_accuracy: 0.4743
Epoch 58/100
196/196 [==============================] - 7s 35ms/step - loss: 0.2333 - accuracy: 0.9194 - val_loss: 4.1367 - val_accuracy: 0.4711
Epoch 59/100
196/196 [==============================] - 6s 33ms/step - loss: 0.2277 - accuracy: 0.9190 - val_loss: 4.0834 - val_accuracy: 0.4761
Epoch 60/100
196/196 [==============================] - 6s 32ms/step - loss: 0.2214 - accuracy: 0.9239 - val_loss: 4.1716 - val_accuracy: 0.4737
Epoch 61/100
196/196 [==============================] - 6s 33ms/step - loss: 0.2207 - accuracy: 0.9209 - val_loss: 4.1157 - val_accuracy: 0.4721
Epoch 62/100
196/196 [==============================] - 7s 34ms/step - loss: 0.2295 - accuracy: 0.9211 - val_loss: 4.3243 - val_accuracy: 0.4712
Epoch 63/100
196/196 [==============================] - 7s 34ms/step - loss: 0.2172 - accuracy: 0.9217 - val_loss: 4.3557 - val_accuracy: 0.4717
Epoch 64/100
196/196 [==============================] - 6s 30ms/step - loss: 0.2082 - accuracy: 0.9245 - val_loss: 4.4790 - val_accuracy: 0.4745
Epoch 65/100
196/196 [==============================] - 6s 31ms/step - loss: 0.2135 - accuracy: 0.9239 - val_loss: 4.3580 - val_accuracy: 0.4727
Epoch 66/100
196/196 [==============================] - 6s 29ms/step - loss: 0.2251 - accuracy: 0.9204 - val_loss: 4.4125 - val_accuracy: 0.4690
Epoch 67/100
196/196 [==============================] - 6s 30ms/step - loss: 0.2125 - accuracy: 0.9247 - val_loss: 4.4460 - val_accuracy: 0.4690
Epoch 68/100
196/196 [==============================] - 7s 33ms/step - loss: 0.2121 - accuracy: 0.9243 - val_loss: 4.2774 - val_accuracy: 0.4720
Epoch 69/100
196/196 [==============================] - 6s 33ms/step - loss: 0.2046 - accuracy: 0.9256 - val_loss: 4.2823 - val_accuracy: 0.4743
Epoch 70/100
196/196 [==============================] - 5s 27ms/step - loss: 0.1975 - accuracy: 0.9304 - val_loss: 4.2939 - val_accuracy: 0.4801
Epoch 71/100
196/196 [==============================] - 5s 27ms/step - loss: 0.1881 - accuracy: 0.9361 - val_loss: 4.4338 - val_accuracy: 0.4775
Epoch 72/100
196/196 [==============================] - 5s 28ms/step - loss: 0.1862 - accuracy: 0.9347 - val_loss: 4.4342 - val_accuracy: 0.4761
Epoch 73/100
196/196 [==============================] - 6s 31ms/step - loss: 0.1810 - accuracy: 0.9371 - val_loss: 4.4888 - val_accuracy: 0.4732
Epoch 74/100
196/196 [==============================] - 7s 36ms/step - loss: 0.1889 - accuracy: 0.9347 - val_loss: 4.5377 - val_accuracy: 0.4729
Epoch 75/100
196/196 [==============================] - 6s 33ms/step - loss: 0.1821 - accuracy: 0.9364 - val_loss: 4.5335 - val_accuracy: 0.4786
Epoch 76/100
196/196 [==============================] - 6s 31ms/step - loss: 0.1816 - accuracy: 0.9371 - val_loss: 4.5314 - val_accuracy: 0.4739
Epoch 77/100
196/196 [==============================] - 6s 30ms/step - loss: 0.1886 - accuracy: 0.9356 - val_loss: 4.6643 - val_accuracy: 0.4716
Epoch 78/100
196/196 [==============================] - 6s 30ms/step - loss: 0.1808 - accuracy: 0.9353 - val_loss: 4.5243 - val_accuracy: 0.4714
Epoch 79/100
196/196 [==============================] - 6s 32ms/step - loss: 0.1640 - accuracy: 0.9425 - val_loss: 4.5498 - val_accuracy: 0.4781
Epoch 80/100
196/196 [==============================] - 6s 32ms/step - loss: 0.1609 - accuracy: 0.9413 - val_loss: 4.7355 - val_accuracy: 0.4760
Epoch 81/100
196/196 [==============================] - 6s 29ms/step - loss: 0.1657 - accuracy: 0.9419 - val_loss: 4.7074 - val_accuracy: 0.4787
Epoch 82/100
196/196 [==============================] - 5s 28ms/step - loss: 0.1630 - accuracy: 0.9424 - val_loss: 4.7073 - val_accuracy: 0.4812
Epoch 83/100
196/196 [==============================] - 6s 29ms/step - loss: 0.1609 - accuracy: 0.9427 - val_loss: 4.7601 - val_accuracy: 0.4861
Epoch 84/100
196/196 [==============================] - 6s 31ms/step - loss: 0.1487 - accuracy: 0.9469 - val_loss: 4.8887 - val_accuracy: 0.4781
Epoch 85/100
196/196 [==============================] - 7s 34ms/step - loss: 0.1485 - accuracy: 0.9477 - val_loss: 4.7442 - val_accuracy: 0.4807
Epoch 86/100
196/196 [==============================] - 6s 33ms/step - loss: 0.1476 - accuracy: 0.9490 - val_loss: 4.9249 - val_accuracy: 0.4870
Epoch 87/100
196/196 [==============================] - 6s 32ms/step - loss: 0.1626 - accuracy: 0.9427 - val_loss: 4.9209 - val_accuracy: 0.4828
Epoch 88/100
196/196 [==============================] - 6s 33ms/step - loss: 0.1679 - accuracy: 0.9396 - val_loss: 4.8001 - val_accuracy: 0.4829
Epoch 89/100
196/196 [==============================] - 6s 33ms/step - loss: 0.1422 - accuracy: 0.9505 - val_loss: 4.8969 - val_accuracy: 0.4801
Epoch 90/100
196/196 [==============================] - 6s 30ms/step - loss: 0.1458 - accuracy: 0.9501 - val_loss: 5.1439 - val_accuracy: 0.4799
Epoch 91/100
196/196 [==============================] - 6s 29ms/step - loss: 0.1445 - accuracy: 0.9490 - val_loss: 5.0177 - val_accuracy: 0.4748
Epoch 92/100
196/196 [==============================] - 5s 25ms/step - loss: 0.1524 - accuracy: 0.9491 - val_loss: 5.0628 - val_accuracy: 0.4838
Epoch 93/100
196/196 [==============================] - 6s 31ms/step - loss: 0.1390 - accuracy: 0.9512 - val_loss: 5.0847 - val_accuracy: 0.4812
Epoch 94/100
196/196 [==============================] - 7s 34ms/step - loss: 0.1459 - accuracy: 0.9461 - val_loss: 5.0484 - val_accuracy: 0.4802
Epoch 95/100
196/196 [==============================] - 6s 33ms/step - loss: 0.1379 - accuracy: 0.9537 - val_loss: 5.1362 - val_accuracy: 0.4747
Epoch 96/100
196/196 [==============================] - 6s 29ms/step - loss: 0.1519 - accuracy: 0.9484 - val_loss: 5.1338 - val_accuracy: 0.4810
Epoch 97/100
196/196 [==============================] - 5s 28ms/step - loss: 0.1464 - accuracy: 0.9505 - val_loss: 5.1788 - val_accuracy: 0.4816
Epoch 98/100
196/196 [==============================] - 6s 29ms/step - loss: 0.1342 - accuracy: 0.9531 - val_loss: 5.0979 - val_accuracy: 0.4797
Epoch 99/100
196/196 [==============================] - 6s 33ms/step - loss: 0.1370 - accuracy: 0.9508 - val_loss: 5.1631 - val_accuracy: 0.4815
Epoch 100/100
196/196 [==============================] - 7s 34ms/step - loss: 0.1298 - accuracy: 0.9559 - val_loss: 5.1771 - val_accuracy: 0.4786
40/40 [==============================] - 1s 24ms/step - loss: 5.1771 - accuracy: 0.4786

saved to ckpt/weights.ckpt
loaded weights from file.
40/40 [==============================] - 1s 29ms/step - loss: 5.1771 - accuracy: 0.4788

Process finished with exit code 0

Note: training accuracy climbs to about 95% while test accuracy stalls around 48%, and the validation loss rises steadily even as the training loss keeps falling — the model is clearly overfitting.
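One common remedy for this pattern is early stopping: halt training once the validation loss has stopped improving for a number of epochs (in Keras this is built in as `tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)`, passed via `model.fit(..., callbacks=[...])`). As a minimal sketch of the underlying logic, independent of any framework (the function name `early_stop_epoch` is just illustrative):

```python
def early_stop_epoch(val_losses, patience=5):
    """Return the 0-indexed epoch at which training would stop.

    Tracks the best (lowest) validation loss seen so far; once
    `patience` epochs pass without a new best, training halts.
    """
    best = float('inf')
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch  # no improvement for `patience` epochs
    return len(val_losses) - 1  # ran out of epochs without triggering


# Example: val_loss improves at epoch 1, then rises — as in the log above.
print(early_stop_epoch([3.0, 2.5, 2.6, 2.7, 2.8, 2.9, 3.0], patience=3))
```

Applied to the run above, where the validation loss bottoms out early and then climbs from roughly 2.5 to 5.2, early stopping would have ended training tens of epochs sooner and kept the checkpoint with the best validation accuracy.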
