The Adaptively Parametric ReLU (APReLU) is a dynamic activation function that does not treat all inputs "the same way." The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.
In the previous post, Tuning Log 18 (调参记录18), every ReLU in the deep residual network (ResNet) was replaced with the Adaptively Parametric ReLU (APReLU).
Because the input and output feature maps of APReLU have exactly the same size, APReLU can be embedded anywhere in a neural network.
In this post, APReLU is placed after the second convolutional layer of each residual block. The resulting structure is very similar to a Squeeze-and-Excitation Network; the difference is that APReLU additionally includes a nonlinear transformation.
At the same time, the number of training epochs was reduced from 5000 to 500. Training for 5000 epochs is simply more time than I can afford.
The principle of the APReLU activation function is illustrated in the figure below:
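In brief, and consistent with the aprelu() implementation in the code below: the input feature map x is split into a positive part max(x, 0) and a negative part min(x, 0); both parts are global-average-pooled, concatenated, and passed through a small fully connected network (Dense → BN → ReLU → Dense → BN → sigmoid) to produce a per-channel coefficient vector α in (0, 1); the output is then

    y = max(x, 0) + α ⊙ min(x, 0),

where ⊙ denotes channel-wise multiplication. In other words, APReLU behaves like a ReLU whose negative-half slope is estimated from, and adapted to, each individual input sample.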
The complete code is as follows:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis,
IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler

K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test - np.mean(x_train)
x_train = x_train - np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate, multiply 0.1 every 150 epochs
def scheduler(epoch):
    if epoch % 150 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels//16, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1, 1, channels))(scales)
    # apply a parametric relu
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False, downsample_strides=2):
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    for i in range(nb_blocks):
        identity = residual
        if not downsample:
            downsample_strides = 1
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides),
                          padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1, 1), strides=(2, 2))(identity)
        # Zero_padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        residual = keras.layers.add([residual, identity])
    return residual

# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 32, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (deg 0 to 180)
    rotation_range=30,
    # Range for random zoom
    zoom_range=0.2,
    # shear angle in counter-clockwise direction in degrees
    shear_range=30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)

# fit the model on the batches generated by datagen.flow()
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=500,
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])
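A note on reuse: because aprelu() is written with the Keras functional API and leaves the feature-map size unchanged, it can be dropped in wherever Activation('relu') would otherwise be applied to a 4-D feature map, provided the map has at least 16 channels (due to the channels//16 bottleneck in the scaling sub-network). Below is a minimal sketch, reusing the imports and the aprelu() function from the script above; the tensor names are illustrative only, not part of the original script.

# Hypothetical example: APReLU as a drop-in replacement for Activation('relu')
demo_inputs = Input(shape=(32, 32, 3))
demo = Conv2D(32, 3, padding='same', kernel_initializer='he_normal',
              kernel_regularizer=l2(1e-4))(demo_inputs)
demo = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(demo)
demo = aprelu(demo)  # instead of demo = Activation('relu')(demo)
demo_model = Model(inputs=demo_inputs, outputs=demo)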
The experimental results are as follows:
Using TensorFlow backend.
x_train shape: (50000, 32, 32, 3)
50000 train samples
10000 test samples
Epoch 1/500 - 70s 140ms/step - loss: 2.5742 - acc: 0.4145 - val_loss: 2.1224 - val_acc: 0.5538
Epoch 2/500 - 52s 104ms/step - loss: 2.0566 - acc: 0.5631 - val_loss: 1.7678 - val_acc: 0.6502
Epoch 3/500 - 52s 104ms/step - loss: 1.7697 - acc: 0.6317 - val_loss: 1.5134 - val_acc: 0.7114
Epoch 4/500 - 52s 103ms/step - loss: 1.5790 - acc: 0.6694 - val_loss: 1.3269 - val_acc: 0.7508
Epoch 5/500 - 52s 104ms/step - loss: 1.4270 - acc: 0.6971 - val_loss: 1.2040 - val_acc: 0.7703
Epoch 6/500 - 52s 104ms/step - loss: 1.3109 - acc: 0.7165 - val_loss: 1.1187 - val_acc: 0.7809
Epoch 7/500 - 52s 104ms/step - loss: 1.2249 - acc: 0.7302 - val_loss: 1.0393 - val_acc: 0.7919
Epoch 8/500 - 52s 103ms/step - loss: 1.1457 - acc: 0.7482 - val_loss: 0.9639 - val_acc: 0.8084
Epoch 9/500 - 52s 104ms/step - loss: 1.0931 - acc: 0.7555 - val_loss: 0.9324 - val_acc: 0.8130
Epoch 10/500 - 52s 104ms/step - loss: 1.0418 - acc: 0.7693 - val_loss: 0.9043 - val_acc: 0.8138
...
Epoch 149/500 - 52s 103ms/step - loss: 0.7096 - acc: 0.8728 - val_loss: 0.6855 - val_acc: 0.8816
Epoch 150/500 - 52s 103ms/step - loss: 0.7131 - acc: 0.8730 - val_loss: 0.6872 - val_acc: 0.8838
Epoch 151/500 - lr changed to 0.010000000149011612 - 52s 103ms/step - loss: 0.5994 - acc: 0.9120 - val_loss: 0.5846 - val_acc: 0.9190
Epoch 152/500 - 51s 103ms/step - loss: 0.5434 - acc: 0.9286 - val_loss: 0.5670 - val_acc: 0.9214
Epoch 153/500 - 52s 103ms/step - loss: 0.5249 - acc: 0.9328 - val_loss: 0.5552 - val_acc: 0.9216
...
Epoch 299/500 - 52s 103ms/step - loss: 0.2623 - acc: 0.9619 - val_loss: 0.3894 - val_acc: 0.9249
Epoch 300/500 - 52s 103ms/step - loss: 0.2709 - acc: 0.9589 - val_loss: 0.3775 - val_acc: 0.9304
Epoch 301/500 - lr changed to 0.0009999999776482583 - 52s 103ms/step - loss: 0.2329 - acc: 0.9743 - val_loss: 0.3555 - val_acc: 0.9362
Epoch 302/500 - 52s 103ms/step - loss: 0.2176 - acc: 0.9795 - val_loss: 0.3499 - val_acc: 0.9380
Epoch 303/500 - 52s 103ms/step - loss: 0.2121 - acc: 0.9810 - val_loss: 0.3486 - val_acc: 0.9394
...
Epoch 449/500 - 51s 102ms/step - loss: 0.1446 - acc: 0.9936 - val_loss: 0.3507 - val_acc: 0.9403
Epoch 450/500 - 51s 103ms/step - loss: 0.1442 - acc: 0.9935 - val_loss: 0.3464 - val_acc: 0.9401
Epoch 451/500 - lr changed to 9.999999310821295e-05 - 51s 102ms/step - loss: 0.1437 - acc: 0.9936 - val_loss: 0.3447 - val_acc: 0.9410
Epoch 452/500 - 51s 102ms/step - loss: 0.1435 - acc: 0.9938 - val_loss: 0.3450 - val_acc: 0.9408
...
Epoch 498/500 - 51s 102ms/step - loss: 0.1396 - acc: 0.9951 - val_loss: 0.3440 - val_acc: 0.9415
Epoch 499/500 - 51s 102ms/step - loss: 0.1396 - acc: 0.9950 - val_loss: 0.3444 - val_acc: 0.9414
Epoch 500/500 - 51s 102ms/step - loss: 0.1386 - acc: 0.9951 - val_loss: 0.3449 - val_acc: 0.9417
Train loss: 0.128365815192461
Train accuracy: 0.9987200012207031
Test loss: 0.3449158281087875
Test accuracy: 0.941700000166893
Compared with the 94.28% test accuracy of Tuning Log 18, the test accuracy this time (94.17%) is slightly lower.
However, it is worth pointing out that this run was trained for only 500 epochs, whereas Tuning Log 18 was trained for 5000 epochs. Looking at the loss values, the loss actually drops faster this time.
Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458
https://ieeexplore.ieee.org/d...