This article introduces a dynamic ReLU activation function proposed by a team at Harbin Institute of Technology: the adaptively parametric ReLU (APReLU). Originally applied to fault diagnosis based on one-dimensional vibration signals, it allows each sample to have its own set of ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.
Building on parameter-tuning record 9, this article adds shear_range = 30 to the data-augmentation settings and tests the Adaptively Parametric ReLU (APReLU) activation function on the CIFAR-10 image dataset.
The usage of Keras's ImageDataGenerator is explained at the following URL:
https://fairyonice.github.io/Learn-about-ImageDataGenerator.html
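To build intuition for what shear_range = 30 does before running the full script, here is a minimal NumPy sketch of a horizontal shear applied to pixel coordinates. The helper name shear_points is made up for illustration, and the matrix is a simplified shear, not the exact affine matrix ImageDataGenerator uses internally:

```python
import numpy as np

def shear_points(points, angle_deg):
    """Apply a simplified horizontal shear to (x, y) coordinates.

    Each point's x-coordinate shifts in proportion to its y-coordinate.
    This is a simplification for intuition only; Keras's
    ImageDataGenerator composes a slightly different affine matrix.
    """
    theta = np.deg2rad(angle_deg)
    shear = np.array([[1.0, np.tan(theta)],
                      [0.0, 1.0]])
    return points @ shear.T

# Corners of a unit square: the top edge (y = 1) slides sideways
# by tan(30 deg) ~ 0.577 while the bottom edge (y = 0) stays put.
corners = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(shear_points(corners, 30.0))
```

Larger shear angles therefore distort the image more aggressively, which is why 30 degrees is a fairly strong augmentation setting.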
The basic principle of the adaptively parametric ReLU activation function is shown in the figure below:
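The principle in the figure can be sketched in plain NumPy before reading the full Keras code. In this sketch the per-channel scaling coefficients are passed in as fixed values; in the real APReLU they come from a small trainable sub-network, so every sample gets its own coefficients. The function name aprelu_forward is made up for illustration:

```python
import numpy as np

def aprelu_forward(x, scales):
    """APReLU forward pass for one sample.

    x: feature map of shape (H, W, C)
    scales: per-channel slopes for the negative part, shape (C,),
            each in (0, 1). In the actual model these are produced
            by a learned sub-network from the input itself.
    """
    pos = np.maximum(x, 0.0)   # positive part, identical to plain ReLU
    neg = np.minimum(x, 0.0)   # negative part
    return pos + scales * neg  # negative slope scaled per channel

# A 1x1 "image" with two channels: the negative channel is scaled
# by 0.25, the positive channel passes through unchanged.
x = np.array([[[-2.0, 1.0]]])
print(aprelu_forward(x, np.array([0.25, 0.5])))
```

With scales fixed at zero this reduces to ReLU, and with scales fixed at a shared constant it reduces to leaky ReLU; the learned, input-dependent scales are what make APReLU "dynamic".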
Adaptively parametric ReLU: the Keras code for this dynamic ReLU activation function is as follows:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong,
Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier
Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics,
2020, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler

K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test - np.mean(x_train)
x_train = x_train - np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply by 0.1 every 300 epochs
def scheduler(epoch):
    if epoch % 300 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # Get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # Get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # Get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # Get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # Define a small network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal',
                   kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal',
                   kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1, 1, channels))(scales)
    # Apply a parametric ReLU
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual block
def residual_block(incoming, nb_blocks, out_channels, downsample=False, downsample_strides=2):
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    for i in range(nb_blocks):
        identity = residual
        if not downsample:
            downsample_strides = 1
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides),
                          padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1, 1), strides=(2, 2))(identity)
        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        residual = keras.layers.add([residual, identity])
    return residual

# Define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal',
             kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal',
                kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# Data augmentation
datagen = ImageDataGenerator(
    # Randomly rotate images in the range 0 to 30 degrees
    rotation_range=30,
    # Shear angle in counter-clockwise direction, in degrees
    shear_range=30,
    # Randomly flip images horizontally
    horizontal_flip=True,
    # Randomly shift images horizontally
    width_shift_range=0.125,
    # Randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# Fit the model on the batches generated by datagen.flow()
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000,
                    verbose=1, callbacks=[reduce_lr], workers=4)

# Get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])

The experimental results are as follows:
x_train shape: (50000, 32, 32, 3) 50000 train samples 10000 test samples Epoch 1/1000 113s 225ms/step - loss: 3.2549 - acc: 0.4158 - val_loss: 2.7729 - val_acc: 0.5394 Epoch 2/1000 68s 137ms/step - loss: 2.6403 - acc: 0.5484 - val_loss: 2.3416 - val_acc: 0.6117 Epoch 3/1000 69s 138ms/step - loss: 2.2763 - acc: 0.6049 - val_loss: 2.0151 - val_acc: 0.6705 Epoch 4/1000 69s 137ms/step - loss: 2.0062 - acc: 0.6393 - val_loss: 1.8055 - val_acc: 0.6907 Epoch 5/1000 69s 137ms/step - loss: 1.7997 - acc: 0.6673 - val_loss: 1.6339 - val_acc: 0.7058 Epoch 6/1000 69s 138ms/step - loss: 1.6338 - acc: 0.6849 - val_loss: 1.4391 - val_acc: 0.7345 Epoch 7/1000 69s 138ms/step - loss: 1.4911 - acc: 0.7032 - val_loss: 1.3495 - val_acc: 0.7435 Epoch 8/1000 69s 138ms/step - loss: 1.3733 - acc: 0.7196 - val_loss: 1.2311 - val_acc: 0.7668 Epoch 9/1000 68s 137ms/step - loss: 1.2893 - acc: 0.7308 - val_loss: 1.1543 - val_acc: 0.7741 Epoch 10/1000 68s 137ms/step - loss: 1.2164 - acc: 0.7402 - val_loss: 1.0974 - val_acc: 0.7761 Epoch 11/1000 69s 137ms/step - loss: 1.1580 - acc: 0.7470 - val_loss: 1.0477 - val_acc: 0.7835 Epoch 12/1000 69s 137ms/step - loss: 1.1127 - acc: 0.7519 - val_loss: 1.0269 - val_acc: 0.7813 Epoch 13/1000 69s 138ms/step - loss: 1.0713 - acc: 0.7598 - val_loss: 0.9656 - val_acc: 0.7996 Epoch 14/1000 68s 136ms/step - loss: 1.0369 - acc: 0.7664 - val_loss: 0.9576 - val_acc: 0.7929 Epoch 15/1000 68s 135ms/step - loss: 1.0158 - acc: 0.7677 - val_loss: 0.9189 - val_acc: 0.8064 Epoch 16/1000 68s 135ms/step - loss: 0.9948 - acc: 0.7733 - val_loss: 0.9198 - val_acc: 0.8022 Epoch 17/1000 68s 136ms/step - loss: 0.9720 - acc: 0.7775 - val_loss: 0.9267 - val_acc: 0.7954 Epoch 18/1000 68s 135ms/step - loss: 0.9548 - acc: 0.7813 - val_loss: 0.8897 - val_acc: 0.8043 Epoch 19/1000 68s 135ms/step - loss: 0.9446 - acc: 0.7847 - val_loss: 0.8642 - val_acc: 0.8104 Epoch 20/1000 68s 135ms/step - loss: 0.9290 - acc: 0.7873 - val_loss: 0.8666 - val_acc: 0.8119 Epoch 21/1000 68s 135ms/step - 
loss: 0.9131 - acc: 0.7913 - val_loss: 0.8433 - val_acc: 0.8202 Epoch 22/1000 68s 135ms/step - loss: 0.9099 - acc: 0.7912 - val_loss: 0.8735 - val_acc: 0.8077 Epoch 23/1000 67s 135ms/step - loss: 0.9000 - acc: 0.7956 - val_loss: 0.8418 - val_acc: 0.8150 Epoch 24/1000 68s 135ms/step - loss: 0.8962 - acc: 0.7966 - val_loss: 0.8452 - val_acc: 0.8181 Epoch 25/1000 68s 135ms/step - loss: 0.8874 - acc: 0.7994 - val_loss: 0.8209 - val_acc: 0.8242 Epoch 26/1000 68s 136ms/step - loss: 0.8810 - acc: 0.8021 - val_loss: 0.8378 - val_acc: 0.8202 Epoch 27/1000 68s 135ms/step - loss: 0.8764 - acc: 0.8026 - val_loss: 0.8474 - val_acc: 0.8173 Epoch 28/1000 67s 135ms/step - loss: 0.8706 - acc: 0.8040 - val_loss: 0.8239 - val_acc: 0.8230 Epoch 29/1000 68s 135ms/step - loss: 0.8655 - acc: 0.8075 - val_loss: 0.8163 - val_acc: 0.8244 Epoch 30/1000 68s 135ms/step - loss: 0.8600 - acc: 0.8074 - val_loss: 0.8065 - val_acc: 0.8288 Epoch 31/1000 68s 135ms/step - loss: 0.8544 - acc: 0.8113 - val_loss: 0.8080 - val_acc: 0.8306 Epoch 32/1000 68s 135ms/step - loss: 0.8510 - acc: 0.8121 - val_loss: 0.8152 - val_acc: 0.8304 Epoch 33/1000 68s 135ms/step - loss: 0.8464 - acc: 0.8142 - val_loss: 0.7827 - val_acc: 0.8387 Epoch 34/1000 68s 135ms/step - loss: 0.8429 - acc: 0.8166 - val_loss: 0.7738 - val_acc: 0.8453 Epoch 35/1000 68s 135ms/step - loss: 0.8366 - acc: 0.8160 - val_loss: 0.7855 - val_acc: 0.8388 Epoch 36/1000 68s 135ms/step - loss: 0.8352 - acc: 0.8191 - val_loss: 0.7651 - val_acc: 0.8468 Epoch 37/1000 68s 135ms/step - loss: 0.8292 - acc: 0.8212 - val_loss: 0.7620 - val_acc: 0.8470 Epoch 38/1000 68s 135ms/step - loss: 0.8319 - acc: 0.8208 - val_loss: 0.7890 - val_acc: 0.8376 Epoch 39/1000 68s 136ms/step - loss: 0.8239 - acc: 0.8256 - val_loss: 0.7870 - val_acc: 0.8370 Epoch 40/1000 68s 135ms/step - loss: 0.8266 - acc: 0.8216 - val_loss: 0.7975 - val_acc: 0.8331 Epoch 41/1000 68s 135ms/step - loss: 0.8209 - acc: 0.8239 - val_loss: 0.7982 - val_acc: 0.8334 Epoch 42/1000 68s 135ms/step - 
loss: 0.8135 - acc: 0.8276 - val_loss: 0.7722 - val_acc: 0.8427 Epoch 43/1000 68s 135ms/step - loss: 0.8115 - acc: 0.8280 - val_loss: 0.7658 - val_acc: 0.8430 Epoch 44/1000 67s 135ms/step - loss: 0.8166 - acc: 0.8259 - val_loss: 0.7388 - val_acc: 0.8559 Epoch 45/1000 67s 135ms/step - loss: 0.8108 - acc: 0.8293 - val_loss: 0.7728 - val_acc: 0.8436 Epoch 46/1000 68s 135ms/step - loss: 0.8046 - acc: 0.8303 - val_loss: 0.7684 - val_acc: 0.8434 Epoch 47/1000 68s 136ms/step - loss: 0.8055 - acc: 0.8322 - val_loss: 0.7478 - val_acc: 0.8511 Epoch 48/1000 68s 135ms/step - loss: 0.8100 - acc: 0.8290 - val_loss: 0.7644 - val_acc: 0.8445 Epoch 49/1000 68s 135ms/step - loss: 0.8027 - acc: 0.8325 - val_loss: 0.7449 - val_acc: 0.8545 Epoch 50/1000 67s 135ms/step - loss: 0.8052 - acc: 0.8299 - val_loss: 0.7941 - val_acc: 0.8377 Epoch 51/1000 68s 135ms/step - loss: 0.7969 - acc: 0.8339 - val_loss: 0.7617 - val_acc: 0.8481 Epoch 52/1000 68s 135ms/step - loss: 0.7989 - acc: 0.8335 - val_loss: 0.7559 - val_acc: 0.8550 Epoch 53/1000 68s 136ms/step - loss: 0.7927 - acc: 0.8353 - val_loss: 0.7482 - val_acc: 0.8536 Epoch 54/1000 68s 135ms/step - loss: 0.7931 - acc: 0.8365 - val_loss: 0.7405 - val_acc: 0.8570 Epoch 55/1000 68s 135ms/step - loss: 0.7933 - acc: 0.8372 - val_loss: 0.7541 - val_acc: 0.8535 Epoch 56/1000 68s 135ms/step - loss: 0.7887 - acc: 0.8389 - val_loss: 0.7805 - val_acc: 0.8436 Epoch 57/1000 68s 135ms/step - loss: 0.7877 - acc: 0.8385 - val_loss: 0.7304 - val_acc: 0.8617 Epoch 58/1000 68s 135ms/step - loss: 0.7836 - acc: 0.8404 - val_loss: 0.7630 - val_acc: 0.8480 Epoch 59/1000 68s 135ms/step - loss: 0.7859 - acc: 0.8394 - val_loss: 0.7369 - val_acc: 0.8568 Epoch 60/1000 68s 135ms/step - loss: 0.7864 - acc: 0.8376 - val_loss: 0.7606 - val_acc: 0.8492 Epoch 61/1000 68s 135ms/step - loss: 0.7827 - acc: 0.8401 - val_loss: 0.7497 - val_acc: 0.8524 Epoch 62/1000 68s 135ms/step - loss: 0.7804 - acc: 0.8427 - val_loss: 0.7526 - val_acc: 0.8559 Epoch 63/1000 68s 135ms/step - 
loss: 0.7766 - acc: 0.8435 - val_loss: 0.7448 - val_acc: 0.8586 Epoch 64/1000 68s 135ms/step - loss: 0.7792 - acc: 0.8419 - val_loss: 0.7605 - val_acc: 0.8511 Epoch 65/1000 68s 135ms/step - loss: 0.7790 - acc: 0.8435 - val_loss: 0.7330 - val_acc: 0.8551 Epoch 66/1000 68s 135ms/step - loss: 0.7748 - acc: 0.8435 - val_loss: 0.7528 - val_acc: 0.8543 Epoch 67/1000 68s 135ms/step - loss: 0.7733 - acc: 0.8452 - val_loss: 0.7330 - val_acc: 0.8585 Epoch 68/1000 68s 135ms/step - loss: 0.7759 - acc: 0.8438 - val_loss: 0.7497 - val_acc: 0.8520 Epoch 69/1000 68s 135ms/step - loss: 0.7680 - acc: 0.8466 - val_loss: 0.7422 - val_acc: 0.8606 Epoch 70/1000 68s 135ms/step - loss: 0.7662 - acc: 0.8473 - val_loss: 0.7185 - val_acc: 0.8633 Epoch 71/1000 68s 135ms/step - loss: 0.7658 - acc: 0.8467 - val_loss: 0.7170 - val_acc: 0.8657 Epoch 72/1000 68s 135ms/step - loss: 0.7681 - acc: 0.8464 - val_loss: 0.7325 - val_acc: 0.8600 Epoch 73/1000 68s 135ms/step - loss: 0.7658 - acc: 0.8477 - val_loss: 0.7109 - val_acc: 0.8662 Epoch 74/1000 68s 135ms/step - loss: 0.7616 - acc: 0.8499 - val_loss: 0.7028 - val_acc: 0.8733 Epoch 75/1000 68s 135ms/step - loss: 0.7621 - acc: 0.8482 - val_loss: 0.7178 - val_acc: 0.8639 Epoch 76/1000 68s 135ms/step - loss: 0.7606 - acc: 0.8496 - val_loss: 0.7096 - val_acc: 0.8674 Epoch 77/1000 68s 135ms/step - loss: 0.7590 - acc: 0.8500 - val_loss: 0.7340 - val_acc: 0.8598 Epoch 78/1000 68s 135ms/step - loss: 0.7639 - acc: 0.8475 - val_loss: 0.7212 - val_acc: 0.8655 Epoch 79/1000 68s 135ms/step - loss: 0.7613 - acc: 0.8477 - val_loss: 0.7171 - val_acc: 0.8702 Epoch 80/1000 67s 135ms/step - loss: 0.7562 - acc: 0.8518 - val_loss: 0.7336 - val_acc: 0.8594 Epoch 81/1000 68s 136ms/step - loss: 0.7532 - acc: 0.8515 - val_loss: 0.7229 - val_acc: 0.8607 Epoch 82/1000 68s 135ms/step - loss: 0.7511 - acc: 0.8541 - val_loss: 0.7062 - val_acc: 0.8688 Epoch 83/1000 68s 135ms/step - loss: 0.7510 - acc: 0.8530 - val_loss: 0.6977 - val_acc: 0.8746 Epoch 84/1000 68s 135ms/step - 
loss: 0.7562 - acc: 0.8524 - val_loss: 0.7319 - val_acc: 0.8595 Epoch 85/1000 67s 135ms/step - loss: 0.7527 - acc: 0.8530 - val_loss: 0.7161 - val_acc: 0.8660 Epoch 86/1000 67s 135ms/step - loss: 0.7523 - acc: 0.8524 - val_loss: 0.7244 - val_acc: 0.8654 Epoch 87/1000 67s 135ms/step - loss: 0.7505 - acc: 0.8532 - val_loss: 0.7192 - val_acc: 0.8636 Epoch 88/1000 68s 135ms/step - loss: 0.7528 - acc: 0.8516 - val_loss: 0.7316 - val_acc: 0.8645 Epoch 89/1000 68s 135ms/step - loss: 0.7480 - acc: 0.8557 - val_loss: 0.7289 - val_acc: 0.8638 Epoch 90/1000 68s 135ms/step - loss: 0.7435 - acc: 0.8550 - val_loss: 0.7020 - val_acc: 0.8763 Epoch 91/1000 68s 135ms/step - loss: 0.7466 - acc: 0.8563 - val_loss: 0.6977 - val_acc: 0.8750 Epoch 92/1000 68s 135ms/step - loss: 0.7438 - acc: 0.8561 - val_loss: 0.7171 - val_acc: 0.8643 Epoch 93/1000 67s 135ms/step - loss: 0.7438 - acc: 0.8564 - val_loss: 0.7189 - val_acc: 0.8687 Epoch 94/1000 68s 135ms/step - loss: 0.7442 - acc: 0.8566 - val_loss: 0.7072 - val_acc: 0.8685 Epoch 95/1000 68s 135ms/step - loss: 0.7468 - acc: 0.8569 - val_loss: 0.7547 - val_acc: 0.8560 Epoch 96/1000 68s 135ms/step - loss: 0.7468 - acc: 0.8547 - val_loss: 0.7080 - val_acc: 0.8699 Epoch 97/1000 68s 135ms/step - loss: 0.7455 - acc: 0.8559 - val_loss: 0.7020 - val_acc: 0.8711 Epoch 98/1000 68s 135ms/step - loss: 0.7427 - acc: 0.8544 - val_loss: 0.7352 - val_acc: 0.8610 Epoch 99/1000 68s 136ms/step - loss: 0.7424 - acc: 0.8567 - val_loss: 0.7480 - val_acc: 0.8583 Epoch 100/1000 68s 135ms/step - loss: 0.7397 - acc: 0.8579 - val_loss: 0.7151 - val_acc: 0.8650 Epoch 101/1000 68s 135ms/step - loss: 0.7447 - acc: 0.8568 - val_loss: 0.7235 - val_acc: 0.8659 Epoch 102/1000 68s 135ms/step - loss: 0.7367 - acc: 0.8598 - val_loss: 0.7229 - val_acc: 0.8623 Epoch 103/1000 67s 135ms/step - loss: 0.7371 - acc: 0.8586 - val_loss: 0.6899 - val_acc: 0.8769 Epoch 104/1000 68s 135ms/step - loss: 0.7401 - acc: 0.8567 - val_loss: 0.7273 - val_acc: 0.8616 Epoch 105/1000 68s 135ms/step 
- loss: 0.7382 - acc: 0.8578 - val_loss: 0.7089 - val_acc: 0.8682 Epoch 106/1000 68s 135ms/step - loss: 0.7386 - acc: 0.8580 - val_loss: 0.7158 - val_acc: 0.8659 Epoch 107/1000 67s 135ms/step - loss: 0.7361 - acc: 0.8584 - val_loss: 0.7147 - val_acc: 0.8701 Epoch 108/1000 67s 135ms/step - loss: 0.7408 - acc: 0.8580 - val_loss: 0.7083 - val_acc: 0.8686 Epoch 109/1000 68s 135ms/step - loss: 0.7362 - acc: 0.8599 - val_loss: 0.7096 - val_acc: 0.8703 Epoch 110/1000 67s 135ms/step - loss: 0.7335 - acc: 0.8600 - val_loss: 0.7148 - val_acc: 0.8683 Epoch 111/1000 67s 135ms/step - loss: 0.7334 - acc: 0.8626 - val_loss: 0.7050 - val_acc: 0.8741 Epoch 112/1000 68s 135ms/step - loss: 0.7360 - acc: 0.8586 - val_loss: 0.7150 - val_acc: 0.8682 Epoch 113/1000 68s 136ms/step - loss: 0.7371 - acc: 0.8583 - val_loss: 0.7447 - val_acc: 0.8583 Epoch 114/1000 68s 135ms/step - loss: 0.7352 - acc: 0.8599 - val_loss: 0.6937 - val_acc: 0.8755 Epoch 115/1000 68s 135ms/step - loss: 0.7314 - acc: 0.8604 - val_loss: 0.7140 - val_acc: 0.8684 Epoch 116/1000 68s 135ms/step - loss: 0.7333 - acc: 0.8607 - val_loss: 0.7305 - val_acc: 0.8686 Epoch 117/1000 68s 135ms/step - loss: 0.7277 - acc: 0.8617 - val_loss: 0.7002 - val_acc: 0.8719 Epoch 118/1000 68s 135ms/step - loss: 0.7356 - acc: 0.8580 - val_loss: 0.6926 - val_acc: 0.8763 Epoch 119/1000 68s 135ms/step - loss: 0.7244 - acc: 0.8642 - val_loss: 0.7079 - val_acc: 0.8669 Epoch 120/1000 68s 136ms/step - loss: 0.7302 - acc: 0.8613 - val_loss: 0.7113 - val_acc: 0.8695 Epoch 121/1000 68s 135ms/step - loss: 0.7340 - acc: 0.8608 - val_loss: 0.7415 - val_acc: 0.8554 Epoch 122/1000 68s 135ms/step - loss: 0.7304 - acc: 0.8608 - val_loss: 0.6978 - val_acc: 0.8760 Epoch 123/1000 68s 135ms/step - loss: 0.7263 - acc: 0.8630 - val_loss: 0.6974 - val_acc: 0.8734 Epoch 124/1000 68s 135ms/step - loss: 0.7261 - acc: 0.8625 - val_loss: 0.7109 - val_acc: 0.8715 Epoch 125/1000 67s 135ms/step - loss: 0.7313 - acc: 0.8623 - val_loss: 0.6946 - val_acc: 0.8745 Epoch 
126/1000 67s 135ms/step - loss: 0.7277 - acc: 0.8620 - val_loss: 0.7178 - val_acc: 0.8685 Epoch 127/1000 68s 135ms/step - loss: 0.7231 - acc: 0.8653 - val_loss: 0.6999 - val_acc: 0.8762 Epoch 128/1000 68s 135ms/step - loss: 0.7252 - acc: 0.8635 - val_loss: 0.7009 - val_acc: 0.8718 Epoch 129/1000 68s 135ms/step - loss: 0.7284 - acc: 0.8626 - val_loss: 0.7148 - val_acc: 0.8682 Epoch 130/1000 68s 135ms/step - loss: 0.7236 - acc: 0.8646 - val_loss: 0.6945 - val_acc: 0.8746 Epoch 131/1000 68s 135ms/step - loss: 0.7203 - acc: 0.8653 - val_loss: 0.7002 - val_acc: 0.8705 Epoch 132/1000 68s 135ms/step - loss: 0.7248 - acc: 0.8626 - val_loss: 0.7097 - val_acc: 0.8718 Epoch 133/1000 67s 135ms/step - loss: 0.7190 - acc: 0.8660 - val_loss: 0.6993 - val_acc: 0.8722 Epoch 134/1000 68s 136ms/step - loss: 0.7206 - acc: 0.8645 - val_loss: 0.7042 - val_acc: 0.8763 Epoch 135/1000 68s 135ms/step - loss: 0.7248 - acc: 0.8637 - val_loss: 0.6742 - val_acc: 0.8844 Epoch 136/1000 68s 135ms/step - loss: 0.7181 - acc: 0.8650 - val_loss: 0.6972 - val_acc: 0.8721 Epoch 137/1000 67s 135ms/step - loss: 0.7170 - acc: 0.8667 - val_loss: 0.7270 - val_acc: 0.8642 Epoch 138/1000 68s 135ms/step - loss: 0.7209 - acc: 0.8649 - val_loss: 0.7107 - val_acc: 0.8687 Epoch 139/1000 68s 136ms/step - loss: 0.7195 - acc: 0.8652 - val_loss: 0.6993 - val_acc: 0.8752 Epoch 140/1000 68s 135ms/step - loss: 0.7229 - acc: 0.8647 - val_loss: 0.6949 - val_acc: 0.8800 Epoch 141/1000 67s 135ms/step - loss: 0.7154 - acc: 0.8674 - val_loss: 0.6828 - val_acc: 0.8780 Epoch 142/1000 67s 135ms/step - loss: 0.7146 - acc: 0.8675 - val_loss: 0.6799 - val_acc: 0.8818 Epoch 143/1000 68s 135ms/step - loss: 0.7131 - acc: 0.8679 - val_loss: 0.7237 - val_acc: 0.8655 Epoch 144/1000 68s 135ms/step - loss: 0.7167 - acc: 0.8662 - val_loss: 0.7140 - val_acc: 0.8696 Epoch 145/1000 68s 136ms/step - loss: 0.7131 - acc: 0.8677 - val_loss: 0.7086 - val_acc: 0.8696 Epoch 146/1000 67s 135ms/step - loss: 0.7184 - acc: 0.8665 - val_loss: 0.7058 - 
val_acc: 0.8729 Epoch 147/1000 68s 135ms/step - loss: 0.7179 - acc: 0.8654 - val_loss: 0.7021 - val_acc: 0.8741 Epoch 148/1000 67s 135ms/step - loss: 0.7176 - acc: 0.8671 - val_loss: 0.6892 - val_acc: 0.8795 Epoch 149/1000 68s 135ms/step - loss: 0.7123 - acc: 0.8685 - val_loss: 0.7027 - val_acc: 0.8700 Epoch 150/1000 68s 136ms/step - loss: 0.7146 - acc: 0.8671 - val_loss: 0.6926 - val_acc: 0.8755 Epoch 151/1000 68s 135ms/step - loss: 0.7122 - acc: 0.8651 - val_loss: 0.7179 - val_acc: 0.8685 Epoch 152/1000 68s 136ms/step - loss: 0.7149 - acc: 0.8675 - val_loss: 0.7136 - val_acc: 0.8690 Epoch 153/1000 68s 135ms/step - loss: 0.7141 - acc: 0.8669 - val_loss: 0.7193 - val_acc: 0.8672 Epoch 154/1000 68s 136ms/step - loss: 0.7084 - acc: 0.8684 - val_loss: 0.6779 - val_acc: 0.8826 Epoch 155/1000 67s 135ms/step - loss: 0.7143 - acc: 0.8671 - val_loss: 0.7092 - val_acc: 0.8685 Epoch 156/1000 68s 136ms/step - loss: 0.7118 - acc: 0.8674 - val_loss: 0.7010 - val_acc: 0.8732 Epoch 157/1000 69s 138ms/step - loss: 0.7126 - acc: 0.8677 - val_loss: 0.6918 - val_acc: 0.8766 Epoch 158/1000 68s 137ms/step - loss: 0.7064 - acc: 0.8701 - val_loss: 0.7253 - val_acc: 0.8636 Epoch 159/1000 68s 137ms/step - loss: 0.7107 - acc: 0.8674 - val_loss: 0.7008 - val_acc: 0.8745 Epoch 160/1000 68s 137ms/step - loss: 0.7097 - acc: 0.8698 - val_loss: 0.6922 - val_acc: 0.8771 Epoch 161/1000 68s 137ms/step - loss: 0.7091 - acc: 0.8675 - val_loss: 0.6786 - val_acc: 0.8813 Epoch 162/1000 69s 138ms/step - loss: 0.7117 - acc: 0.8680 - val_loss: 0.7017 - val_acc: 0.8740 Epoch 163/1000 69s 137ms/step - loss: 0.7110 - acc: 0.8681 - val_loss: 0.6862 - val_acc: 0.8800 Epoch 164/1000 68s 137ms/step - loss: 0.7099 - acc: 0.8693 - val_loss: 0.7053 - val_acc: 0.8709 Epoch 165/1000 69s 138ms/step - loss: 0.7104 - acc: 0.8694 - val_loss: 0.6846 - val_acc: 0.8828 Epoch 166/1000 68s 136ms/step - loss: 0.7078 - acc: 0.8715 - val_loss: 0.6968 - val_acc: 0.8749 Epoch 167/1000 68s 136ms/step - loss: 0.7076 - acc: 0.8719 - 
val_loss: 0.6872 - val_acc: 0.8782 Epoch 168/1000 68s 136ms/step - loss: 0.7099 - acc: 0.8679 - val_loss: 0.6928 - val_acc: 0.8755 Epoch 169/1000 68s 136ms/step - loss: 0.7101 - acc: 0.8678 - val_loss: 0.6947 - val_acc: 0.8786 Epoch 170/1000 68s 137ms/step - loss: 0.7097 - acc: 0.8717 - val_loss: 0.6886 - val_acc: 0.8789 Epoch 171/1000 69s 137ms/step - loss: 0.7070 - acc: 0.8702 - val_loss: 0.6878 - val_acc: 0.8793 Epoch 172/1000 69s 137ms/step - loss: 0.7117 - acc: 0.8679 - val_loss: 0.6783 - val_acc: 0.8836 Epoch 173/1000 68s 137ms/step - loss: 0.7102 - acc: 0.8687 - val_loss: 0.6709 - val_acc: 0.8865 Epoch 174/1000 69s 137ms/step - loss: 0.7038 - acc: 0.8717 - val_loss: 0.6839 - val_acc: 0.8804 Epoch 175/1000 68s 137ms/step - loss: 0.7062 - acc: 0.8713 - val_loss: 0.6934 - val_acc: 0.8780 Epoch 176/1000 68s 137ms/step - loss: 0.7092 - acc: 0.8684 - val_loss: 0.7045 - val_acc: 0.8737 Epoch 177/1000 68s 137ms/step - loss: 0.7048 - acc: 0.8703 - val_loss: 0.6935 - val_acc: 0.8764 Epoch 178/1000 68s 137ms/step - loss: 0.7056 - acc: 0.8713 - val_loss: 0.6825 - val_acc: 0.8800 Epoch 179/1000 69s 137ms/step - loss: 0.7027 - acc: 0.8722 - val_loss: 0.6860 - val_acc: 0.8812 Epoch 180/1000 67s 135ms/step - loss: 0.7056 - acc: 0.8699 - val_loss: 0.6882 - val_acc: 0.8762 Epoch 181/1000 67s 135ms/step - loss: 0.6974 - acc: 0.8745 - val_loss: 0.7030 - val_acc: 0.8704 Epoch 182/1000 67s 135ms/step - loss: 0.7028 - acc: 0.8714 - val_loss: 0.6754 - val_acc: 0.8860 Epoch 183/1000 67s 135ms/step - loss: 0.7022 - acc: 0.8715 - val_loss: 0.6635 - val_acc: 0.8842 Epoch 184/1000 68s 136ms/step - loss: 0.7034 - acc: 0.8704 - val_loss: 0.6905 - val_acc: 0.8762 Epoch 185/1000 67s 135ms/step - loss: 0.7058 - acc: 0.8709 - val_loss: 0.7066 - val_acc: 0.8740 Epoch 186/1000 67s 135ms/step - loss: 0.7016 - acc: 0.8726 - val_loss: 0.6842 - val_acc: 0.8784 Epoch 187/1000 67s 135ms/step - loss: 0.6999 - acc: 0.8719 - val_loss: 0.7051 - val_acc: 0.8731 Epoch 188/1000 67s 135ms/step - loss: 0.7026 
- acc: 0.8710 - val_loss: 0.6811 - val_acc: 0.8811 Epoch 189/1000 68s 135ms/step - loss: 0.7040 - acc: 0.8711 - val_loss: 0.6794 - val_acc: 0.8786 Epoch 190/1000 67s 135ms/step - loss: 0.7004 - acc: 0.8728 - val_loss: 0.6594 - val_acc: 0.8916 Epoch 191/1000 68s 136ms/step - loss: 0.6982 - acc: 0.8747 - val_loss: 0.6616 - val_acc: 0.8850 Epoch 192/1000 68s 135ms/step - loss: 0.7036 - acc: 0.8718 - val_loss: 0.6959 - val_acc: 0.8730 Epoch 193/1000 67s 135ms/step - loss: 0.7017 - acc: 0.8708 - val_loss: 0.6671 - val_acc: 0.8862 Epoch 194/1000 67s 135ms/step - loss: 0.6982 - acc: 0.8738 - val_loss: 0.6885 - val_acc: 0.8790 Epoch 195/1000 68s 136ms/step - loss: 0.6996 - acc: 0.8714 - val_loss: 0.6892 - val_acc: 0.8770 Epoch 196/1000 68s 136ms/step - loss: 0.7026 - acc: 0.8706 - val_loss: 0.6824 - val_acc: 0.8792 Epoch 197/1000 68s 136ms/step - loss: 0.7061 - acc: 0.8695 - val_loss: 0.6893 - val_acc: 0.8793 Epoch 198/1000 68s 135ms/step - loss: 0.7023 - acc: 0.8714 - val_loss: 0.6797 - val_acc: 0.8819 Epoch 199/1000 67s 135ms/step - loss: 0.7021 - acc: 0.8726 - val_loss: 0.6969 - val_acc: 0.8754 Epoch 200/1000 68s 136ms/step - loss: 0.7023 - acc: 0.8711 - val_loss: 0.6922 - val_acc: 0.8758 Epoch 201/1000 68s 135ms/step - loss: 0.7050 - acc: 0.8705 - val_loss: 0.6879 - val_acc: 0.8792 Epoch 202/1000 68s 135ms/step - loss: 0.7012 - acc: 0.8713 - val_loss: 0.6756 - val_acc: 0.8845 Epoch 203/1000 68s 136ms/step - loss: 0.7021 - acc: 0.8726 - val_loss: 0.6542 - val_acc: 0.8904 Epoch 204/1000 68s 136ms/step - loss: 0.6981 - acc: 0.8741 - val_loss: 0.7060 - val_acc: 0.8739 Epoch 205/1000 68s 135ms/step - loss: 0.7008 - acc: 0.8718 - val_loss: 0.6938 - val_acc: 0.8741 Epoch 206/1000 68s 136ms/step - loss: 0.6974 - acc: 0.8725 - val_loss: 0.6786 - val_acc: 0.8833 Epoch 207/1000 67s 135ms/step - loss: 0.6938 - acc: 0.8739 - val_loss: 0.6928 - val_acc: 0.8750 Epoch 208/1000 68s 135ms/step - loss: 0.7075 - acc: 0.8690 - val_loss: 0.6770 - val_acc: 0.8806 Epoch 209/1000 68s 
136ms/step - loss: 0.6978 - acc: 0.8723 - val_loss: 0.6913 - val_acc: 0.8812 Epoch 210/1000 67s 135ms/step - loss: 0.6974 - acc: 0.8727 - val_loss: 0.6764 - val_acc: 0.8827 ... Epoch 291/1000 69s 138ms/step - loss: 0.6827 - acc: 0.8805 - val_loss: 0.6700 - val_acc: 0.8877 Epoch 292/1000 69s 137ms/step - loss: 0.6875 - acc: 0.8770 - val_loss: 0.6843 - val_acc: 0.8802 Epoch 293/1000 69s 138ms/step - loss: 0.6861 - acc: 0.8795 - val_loss: 0.6889 - val_acc: 0.8812 Epoch 294/1000 68s 137ms/step - loss: 0.6896 - acc: 0.8759 - val_loss: 0.6688 - val_acc: 0.8874 Epoch 295/1000 69s 138ms/step - loss: 0.6792 - acc: 0.8805 - val_loss: 0.6813 - val_acc: 0.8802 Epoch 296/1000 69s 138ms/step - loss: 0.6946 - acc: 0.8733 - val_loss: 0.6697 - val_acc: 0.8858 Epoch 297/1000 69s 138ms/step - loss: 0.6887 - acc: 0.8755 - val_loss: 0.6707 - val_acc: 0.8848 Epoch 298/1000 69s 138ms/step - loss: 0.6875 - acc: 0.8765 - val_loss: 0.7025 - val_acc: 0.8718 Epoch 299/1000 69s 137ms/step - loss: 0.6853 - acc: 0.8789 - val_loss: 0.6842 - val_acc: 0.8805 Epoch 300/1000 69s 138ms/step - loss: 0.6806 - acc: 0.8809 - val_loss: 0.6948 - val_acc: 0.8809 Epoch 301/1000 lr changed to 0.010000000149011612 69s 138ms/step - loss: 0.5763 - acc: 0.9142 - val_loss: 0.5780 - val_acc: 0.9169 Epoch 302/1000 69s 138ms/step - loss: 0.5127 - acc: 0.9355 - val_loss: 0.5618 - val_acc: 0.9209 Epoch 303/1000 68s 137ms/step - loss: 0.4950 - acc: 0.9401 - val_loss: 0.5561 - val_acc: 0.9223 Epoch 304/1000 68s 137ms/step - loss: 0.4744 - acc: 0.9449 - val_loss: 0.5485 - val_acc: 0.9229 Epoch 305/1000 68s 137ms/step - loss: 0.4602 - acc: 0.9489 - val_loss: 0.5469 - val_acc: 0.9206 Epoch 306/1000 69s 137ms/step - loss: 0.4533 - acc: 0.9479 - val_loss: 0.5368 - val_acc: 0.9209 Epoch 307/1000 69s 137ms/step - loss: 0.4463 - acc: 0.9498 - val_loss: 0.5294 - val_acc: 0.9230 Epoch 308/1000 69s 137ms/step - loss: 0.4371 - acc: 0.9508 - val_loss: 0.5304 - val_acc: 0.9228 Epoch 309/1000 69s 137ms/step - loss: 0.4276 - acc: 0.9515 
- val_loss: 0.5217 - val_acc: 0.9236 Epoch 310/1000 68s 136ms/step - loss: 0.4185 - acc: 0.9542 - val_loss: 0.5202 - val_acc: 0.9235 Epoch 311/1000 69s 138ms/step - loss: 0.4079 - acc: 0.9563 - val_loss: 0.5213 - val_acc: 0.9224 Epoch 312/1000 69s 137ms/step - loss: 0.4028 - acc: 0.9559 - val_loss: 0.5149 - val_acc: 0.9241 Epoch 313/1000 68s 136ms/step - loss: 0.3940 - acc: 0.9582 - val_loss: 0.5182 - val_acc: 0.9229 Epoch 314/1000 69s 138ms/step - loss: 0.3913 - acc: 0.9584 - val_loss: 0.5063 - val_acc: 0.9222 Epoch 315/1000 69s 138ms/step - loss: 0.3815 - acc: 0.9599 - val_loss: 0.5065 - val_acc: 0.9242 Epoch 316/1000 69s 138ms/step - loss: 0.3779 - acc: 0.9596 - val_loss: 0.5105 - val_acc: 0.9197 Epoch 317/1000 69s 138ms/step - loss: 0.3734 - acc: 0.9607 - val_loss: 0.4951 - val_acc: 0.9242 Epoch 318/1000 69s 138ms/step - loss: 0.3668 - acc: 0.9608 - val_loss: 0.4984 - val_acc: 0.9226 Epoch 319/1000 68s 137ms/step - loss: 0.3600 - acc: 0.9628 - val_loss: 0.5003 - val_acc: 0.9195 Epoch 320/1000 68s 137ms/step - loss: 0.3562 - acc: 0.9622 - val_loss: 0.4927 - val_acc: 0.9206 Epoch 321/1000 69s 138ms/step - loss: 0.3551 - acc: 0.9619 - val_loss: 0.4883 - val_acc: 0.9233 Epoch 322/1000 69s 138ms/step - loss: 0.3467 - acc: 0.9635 - val_loss: 0.4820 - val_acc: 0.9247 Epoch 323/1000 69s 138ms/step - loss: 0.3468 - acc: 0.9621 - val_loss: 0.4795 - val_acc: 0.9225 Epoch 324/1000 68s 136ms/step - loss: 0.3386 - acc: 0.9651 - val_loss: 0.4927 - val_acc: 0.9205 Epoch 325/1000 68s 135ms/step - loss: 0.3368 - acc: 0.9644 - val_loss: 0.4823 - val_acc: 0.9205 Epoch 326/1000 68s 136ms/step - loss: 0.3284 - acc: 0.9667 - val_loss: 0.4691 - val_acc: 0.9236 Epoch 327/1000 69s 138ms/step - loss: 0.3255 - acc: 0.9658 - val_loss: 0.4734 - val_acc: 0.9252 Epoch 328/1000 68s 136ms/step - loss: 0.3255 - acc: 0.9648 - val_loss: 0.4795 - val_acc: 0.9230 Epoch 329/1000 68s 136ms/step - loss: 0.3257 - acc: 0.9638 - val_loss: 0.4681 - val_acc: 0.9223 Epoch 330/1000 68s 136ms/step - loss: 
0.3181 - acc: 0.9648 - val_loss: 0.4670 - val_acc: 0.9215 Epoch 331/1000 68s 136ms/step - loss: 0.3138 - acc: 0.9660 - val_loss: 0.4821 - val_acc: 0.9185 Epoch 332/1000 68s 136ms/step - loss: 0.3140 - acc: 0.9648 - val_loss: 0.4727 - val_acc: 0.9202 Epoch 333/1000 69s 137ms/step - loss: 0.3102 - acc: 0.9663 - val_loss: 0.4632 - val_acc: 0.9231 Epoch 334/1000 68s 137ms/step - loss: 0.3085 - acc: 0.9663 - val_loss: 0.4611 - val_acc: 0.9240 Epoch 335/1000 68s 137ms/step - loss: 0.3019 - acc: 0.9679 - val_loss: 0.4614 - val_acc: 0.9238 Epoch 336/1000 69s 138ms/step - loss: 0.3046 - acc: 0.9654 - val_loss: 0.4635 - val_acc: 0.9202 Epoch 337/1000 68s 137ms/step - loss: 0.3015 - acc: 0.9660 - val_loss: 0.4599 - val_acc: 0.9228 Epoch 338/1000 69s 137ms/step - loss: 0.2992 - acc: 0.9662 - val_loss: 0.4577 - val_acc: 0.9207 Epoch 339/1000 69s 138ms/step - loss: 0.2942 - acc: 0.9669 - val_loss: 0.4702 - val_acc: 0.9172 Epoch 340/1000 69s 137ms/step - loss: 0.2924 - acc: 0.9675 - val_loss: 0.4545 - val_acc: 0.9211 ... 
Epoch 597/1000 68s 135ms/step - loss: 0.2366 - acc: 0.9703 - val_loss: 0.4557 - val_acc: 0.9103
Epoch 598/1000 68s 135ms/step - loss: 0.2399 - acc: 0.9697 - val_loss: 0.4449 - val_acc: 0.9117
Epoch 599/1000 67s 135ms/step - loss: 0.2397 - acc: 0.9689 - val_loss: 0.4359 - val_acc: 0.9147
Epoch 600/1000 68s 136ms/step - loss: 0.2341 - acc: 0.9717 - val_loss: 0.4224 - val_acc: 0.9169
Epoch 601/1000
lr changed to 0.0009999999776482583
68s 136ms/step - loss: 0.2082 - acc: 0.9813 - val_loss: 0.3916 - val_acc: 0.9268
Epoch 602/1000 68s 136ms/step - loss: 0.1952 - acc: 0.9865 - val_loss: 0.3854 - val_acc: 0.9281
Epoch 603/1000 68s 136ms/step - loss: 0.1878 - acc: 0.9881 - val_loss: 0.3852 - val_acc: 0.9299
Epoch 604/1000 68s 136ms/step - loss: 0.1846 - acc: 0.9899 - val_loss: 0.3842 - val_acc: 0.9298
Epoch 605/1000 68s 135ms/step - loss: 0.1826 - acc: 0.9909 - val_loss: 0.3829 - val_acc: 0.9326
Epoch 606/1000 68s 136ms/step - loss: 0.1808 - acc: 0.9912 - val_loss: 0.3838 - val_acc: 0.9305
Epoch 607/1000 68s 136ms/step - loss: 0.1771 - acc: 0.9927 - val_loss: 0.3851 - val_acc: 0.9303
Epoch 608/1000 68s 136ms/step - loss: 0.1768 - acc: 0.9922 - val_loss: 0.3898 - val_acc: 0.9304
Epoch 609/1000 68s 135ms/step - loss: 0.1758 - acc: 0.9926 - val_loss: 0.3878 - val_acc: 0.9309
Epoch 610/1000 68s 136ms/step - loss: 0.1739 - acc: 0.9931 - val_loss: 0.3887 - val_acc: 0.9294
Epoch 611/1000 68s 136ms/step - loss: 0.1731 - acc: 0.9934 - val_loss: 0.3874 - val_acc: 0.9311
Epoch 612/1000 68s 136ms/step - loss: 0.1725 - acc: 0.9935 - val_loss: 0.3898 - val_acc: 0.9297
Epoch 613/1000 68s 135ms/step - loss: 0.1717 - acc: 0.9937 - val_loss: 0.3900 - val_acc: 0.9298
Epoch 614/1000 68s 136ms/step - loss: 0.1705 - acc: 0.9937 - val_loss: 0.3912 - val_acc: 0.9299
Epoch 615/1000 68s 136ms/step - loss: 0.1709 - acc: 0.9934 - val_loss: 0.3898 - val_acc: 0.9307
Epoch 616/1000 68s 136ms/step - loss: 0.1686 - acc: 0.9948 - val_loss: 0.3905 - val_acc: 0.9311
Epoch 617/1000 68s 136ms/step - loss: 0.1695 - acc: 0.9942 - val_loss: 0.3948 - val_acc: 0.9303
Epoch 618/1000 68s 136ms/step - loss: 0.1688 - acc: 0.9941 - val_loss: 0.3936 - val_acc: 0.9298
Epoch 619/1000 68s 136ms/step - loss: 0.1679 - acc: 0.9945 - val_loss: 0.3950 - val_acc: 0.9290
Epoch 620/1000 68s 136ms/step - loss: 0.1675 - acc: 0.9941 - val_loss: 0.3940 - val_acc: 0.9300
Epoch 621/1000 68s 136ms/step - loss: 0.1651 - acc: 0.9949 - val_loss: 0.3956 - val_acc: 0.9309
Epoch 622/1000 68s 136ms/step - loss: 0.1653 - acc: 0.9951 - val_loss: 0.3950 - val_acc: 0.9306
Epoch 623/1000 68s 136ms/step - loss: 0.1656 - acc: 0.9946 - val_loss: 0.3947 - val_acc: 0.9306
Epoch 624/1000 68s 136ms/step - loss: 0.1644 - acc: 0.9949 - val_loss: 0.3946 - val_acc: 0.9304
Epoch 625/1000 68s 136ms/step - loss: 0.1636 - acc: 0.9951 - val_loss: 0.3944 - val_acc: 0.9296
Epoch 626/1000 68s 136ms/step - loss: 0.1630 - acc: 0.9951 - val_loss: 0.3937 - val_acc: 0.9295
Epoch 627/1000 68s 136ms/step - loss: 0.1630 - acc: 0.9953 - val_loss: 0.3959 - val_acc: 0.9296
Epoch 628/1000 68s 136ms/step - loss: 0.1627 - acc: 0.9954 - val_loss: 0.3939 - val_acc: 0.9289
Epoch 629/1000 68s 136ms/step - loss: 0.1630 - acc: 0.9947 - val_loss: 0.3937 - val_acc: 0.9303
Epoch 630/1000 68s 135ms/step - loss: 0.1614 - acc: 0.9958 - val_loss: 0.3909 - val_acc: 0.9316
Epoch 631/1000 68s 137ms/step - loss: 0.1624 - acc: 0.9950 - val_loss: 0.3922 - val_acc: 0.9310
Epoch 632/1000 68s 135ms/step - loss: 0.1611 - acc: 0.9954 - val_loss: 0.3907 - val_acc: 0.9313
Epoch 633/1000 68s 136ms/step - loss: 0.1599 - acc: 0.9955 - val_loss: 0.3893 - val_acc: 0.9295
Epoch 634/1000 68s 136ms/step - loss: 0.1600 - acc: 0.9954 - val_loss: 0.3886 - val_acc: 0.9308
Epoch 635/1000 68s 136ms/step - loss: 0.1593 - acc: 0.9953 - val_loss: 0.3926 - val_acc: 0.9297
Epoch 636/1000 68s 136ms/step - loss: 0.1594 - acc: 0.9950 - val_loss: 0.3945 - val_acc: 0.9289
Epoch 637/1000 68s 136ms/step - loss: 0.1595 - acc: 0.9955 - val_loss: 0.3937 - val_acc: 0.9306
Epoch 638/1000 68s 136ms/step - loss: 0.1591 - acc: 0.9958 - val_loss: 0.3882 - val_acc: 0.9306
Epoch 639/1000 68s 135ms/step - loss: 0.1586 - acc: 0.9959 - val_loss: 0.3893 - val_acc: 0.9309
Epoch 640/1000 68s 136ms/step - loss: 0.1588 - acc: 0.9956 - val_loss: 0.3935 - val_acc: 0.9300
Epoch 641/1000 68s 135ms/step - loss: 0.1571 - acc: 0.9960 - val_loss: 0.3917 - val_acc: 0.9298
Epoch 642/1000 68s 136ms/step - loss: 0.1576 - acc: 0.9956 - val_loss: 0.3945 - val_acc: 0.9284
Epoch 643/1000 68s 136ms/step - loss: 0.1570 - acc: 0.9961 - val_loss: 0.3899 - val_acc: 0.9309
Epoch 644/1000 68s 136ms/step - loss: 0.1565 - acc: 0.9962 - val_loss: 0.3918 - val_acc: 0.9307
Epoch 645/1000 68s 136ms/step - loss: 0.1563 - acc: 0.9956 - val_loss: 0.3940 - val_acc: 0.9307
Epoch 646/1000 68s 136ms/step - loss: 0.1563 - acc: 0.9956 - val_loss: 0.3895 - val_acc: 0.9322
Epoch 647/1000 68s 136ms/step - loss: 0.1555 - acc: 0.9963 - val_loss: 0.3903 - val_acc: 0.9302
Epoch 648/1000 68s 135ms/step - loss: 0.1556 - acc: 0.9958 - val_loss: 0.3926 - val_acc: 0.9307
Epoch 649/1000 68s 135ms/step - loss: 0.1542 - acc: 0.9962 - val_loss: 0.3904 - val_acc: 0.9308
Epoch 650/1000 68s 136ms/step - loss: 0.1552 - acc: 0.9959 - val_loss: 0.3934 - val_acc: 0.9295
Epoch 651/1000 68s 136ms/step - loss: 0.1548 - acc: 0.9959 - val_loss: 0.3921 - val_acc: 0.9307
Epoch 652/1000 68s 136ms/step - loss: 0.1537 - acc: 0.9964 - val_loss: 0.3973 - val_acc: 0.9293
Epoch 653/1000 68s 136ms/step - loss: 0.1540 - acc: 0.9958 - val_loss: 0.3950 - val_acc: 0.9287
Epoch 654/1000 68s 136ms/step - loss: 0.1523 - acc: 0.9965 - val_loss: 0.3956 - val_acc: 0.9296
Epoch 655/1000 68s 137ms/step - loss: 0.1532 - acc: 0.9964 - val_loss: 0.3991 - val_acc: 0.9292
Epoch 656/1000 68s 136ms/step - loss: 0.1538 - acc: 0.9957 - val_loss: 0.3995 - val_acc: 0.9296
Epoch 657/1000 68s 136ms/step - loss: 0.1520 - acc: 0.9966 - val_loss: 0.3988 - val_acc: 0.9310
Epoch 658/1000 68s 136ms/step - loss: 0.1532 - acc: 0.9959 - val_loss: 0.3961 - val_acc: 0.9307
Epoch 659/1000 68s 136ms/step - loss: 0.1526 - acc: 0.9958 - val_loss: 0.3948 - val_acc: 0.9306
Epoch 660/1000 68s 136ms/step - loss: 0.1512 - acc: 0.9965 - val_loss: 0.3947 - val_acc: 0.9309
Epoch 661/1000 68s 136ms/step - loss: 0.1519 - acc: 0.9962 - val_loss: 0.3959 - val_acc: 0.9315
Epoch 662/1000 68s 136ms/step - loss: 0.1510 - acc: 0.9963 - val_loss: 0.3962 - val_acc: 0.9312
Epoch 663/1000 68s 136ms/step - loss: 0.1517 - acc: 0.9960 - val_loss: 0.3939 - val_acc: 0.9304
Epoch 664/1000 68s 135ms/step - loss: 0.1494 - acc: 0.9964 - val_loss: 0.3928 - val_acc: 0.9309
Epoch 665/1000 68s 135ms/step - loss: 0.1492 - acc: 0.9966 - val_loss: 0.3900 - val_acc: 0.9320
Epoch 666/1000 68s 136ms/step - loss: 0.1493 - acc: 0.9963 - val_loss: 0.3907 - val_acc: 0.9312
Epoch 667/1000 68s 136ms/step - loss: 0.1491 - acc: 0.9967 - val_loss: 0.3930 - val_acc: 0.9309
Epoch 668/1000 68s 136ms/step - loss: 0.1494 - acc: 0.9960 - val_loss: 0.3923 - val_acc: 0.9301
Epoch 669/1000 68s 136ms/step - loss: 0.1485 - acc: 0.9966 - val_loss: 0.3941 - val_acc: 0.9308
Epoch 670/1000 68s 135ms/step - loss: 0.1486 - acc: 0.9963 - val_loss: 0.3927 - val_acc: 0.9314
...
Epoch 879/1000 68s 137ms/step - loss: 0.1065 - acc: 0.9977 - val_loss: 0.3767 - val_acc: 0.9280
Epoch 880/1000 69s 138ms/step - loss: 0.1055 - acc: 0.9979 - val_loss: 0.3741 - val_acc: 0.9285
Epoch 881/1000 68s 137ms/step - loss: 0.1056 - acc: 0.9979 - val_loss: 0.3716 - val_acc: 0.9290
Epoch 882/1000 69s 138ms/step - loss: 0.1061 - acc: 0.9977 - val_loss: 0.3736 - val_acc: 0.9295
Epoch 883/1000 69s 138ms/step - loss: 0.1066 - acc: 0.9976 - val_loss: 0.3745 - val_acc: 0.9307
Epoch 884/1000 69s 137ms/step - loss: 0.1059 - acc: 0.9975 - val_loss: 0.3702 - val_acc: 0.9302
Epoch 885/1000 69s 138ms/step - loss: 0.1051 - acc: 0.9979 - val_loss: 0.3656 - val_acc: 0.9311
Epoch 886/1000 68s 137ms/step - loss: 0.1051 - acc: 0.9978 - val_loss: 0.3677 - val_acc: 0.9305
Epoch 887/1000 68s 137ms/step - loss: 0.1062 - acc: 0.9974 - val_loss: 0.3636 - val_acc: 0.9315
Epoch 888/1000 69s 137ms/step - loss: 0.1052 - acc: 0.9977 - val_loss: 0.3710 - val_acc: 0.9295
Epoch 889/1000 68s 137ms/step - loss: 0.1046 - acc: 0.9979 - val_loss: 0.3642 - val_acc: 0.9318
Epoch 890/1000 69s 138ms/step - loss: 0.1051 - acc: 0.9975 - val_loss: 0.3673 - val_acc: 0.9306
Epoch 891/1000 69s 138ms/step - loss: 0.1045 - acc: 0.9978 - val_loss: 0.3681 - val_acc: 0.9299
Epoch 892/1000 68s 137ms/step - loss: 0.1043 - acc: 0.9979 - val_loss: 0.3659 - val_acc: 0.9320
Epoch 893/1000 69s 137ms/step - loss: 0.1040 - acc: 0.9979 - val_loss: 0.3627 - val_acc: 0.9326
Epoch 894/1000 69s 138ms/step - loss: 0.1041 - acc: 0.9976 - val_loss: 0.3698 - val_acc: 0.9301
Epoch 895/1000 68s 137ms/step - loss: 0.1039 - acc: 0.9978 - val_loss: 0.3659 - val_acc: 0.9321
Epoch 896/1000 69s 137ms/step - loss: 0.1040 - acc: 0.9978 - val_loss: 0.3718 - val_acc: 0.9300
Epoch 897/1000 68s 137ms/step - loss: 0.1039 - acc: 0.9977 - val_loss: 0.3728 - val_acc: 0.9311
Epoch 898/1000 68s 137ms/step - loss: 0.1044 - acc: 0.9973 - val_loss: 0.3743 - val_acc: 0.9313
Epoch 899/1000 69s 137ms/step - loss: 0.1036 - acc: 0.9976 - val_loss: 0.3675 - val_acc: 0.9312
Epoch 900/1000 69s 138ms/step - loss: 0.1030 - acc: 0.9979 - val_loss: 0.3730 - val_acc: 0.9313
Epoch 901/1000
lr changed to 9.999999310821295e-05
69s 138ms/step - loss: 0.1023 - acc: 0.9982 - val_loss: 0.3709 - val_acc: 0.9310
Epoch 902/1000 69s 137ms/step - loss: 0.1025 - acc: 0.9979 - val_loss: 0.3690 - val_acc: 0.9311
Epoch 903/1000 68s 137ms/step - loss: 0.1024 - acc: 0.9980 - val_loss: 0.3679 - val_acc: 0.9311
Epoch 904/1000 69s 137ms/step - loss: 0.1020 - acc: 0.9982 - val_loss: 0.3673 - val_acc: 0.9315
Epoch 905/1000 69s 138ms/step - loss: 0.1027 - acc: 0.9979 - val_loss: 0.3672 - val_acc: 0.9310
Epoch 906/1000 69s 138ms/step - loss: 0.1015 - acc: 0.9984 - val_loss: 0.3678 - val_acc: 0.9304
Epoch 907/1000 69s 138ms/step - loss: 0.1016 - acc: 0.9984 - val_loss: 0.3673 - val_acc: 0.9302
Epoch 908/1000 69s 138ms/step - loss: 0.1031 - acc: 0.9977 - val_loss: 0.3667 - val_acc: 0.9307
Epoch 909/1000 69s 139ms/step - loss: 0.1019 - acc: 0.9983 - val_loss: 0.3672 - val_acc: 0.9317
Epoch 910/1000 69s 137ms/step - loss: 0.1018 - acc: 0.9983 - val_loss: 0.3671 - val_acc: 0.9313
Epoch 911/1000 69s 137ms/step - loss: 0.1018 - acc: 0.9982 - val_loss: 0.3669 - val_acc: 0.9309
Epoch 912/1000 69s 137ms/step - loss: 0.1014 - acc: 0.9986 - val_loss: 0.3677 - val_acc: 0.9303
Epoch 913/1000 68s 137ms/step - loss: 0.1015 - acc: 0.9982 - val_loss: 0.3666 - val_acc: 0.9303
Epoch 914/1000 69s 138ms/step - loss: 0.1015 - acc: 0.9984 - val_loss: 0.3659 - val_acc: 0.9309
Epoch 915/1000 69s 138ms/step - loss: 0.1013 - acc: 0.9983 - val_loss: 0.3651 - val_acc: 0.9318
Epoch 916/1000 69s 138ms/step - loss: 0.1014 - acc: 0.9983 - val_loss: 0.3652 - val_acc: 0.9322
Epoch 917/1000 69s 137ms/step - loss: 0.1010 - acc: 0.9984 - val_loss: 0.3648 - val_acc: 0.9322
Epoch 918/1000 68s 137ms/step - loss: 0.1016 - acc: 0.9981 - val_loss: 0.3644 - val_acc: 0.9324
Epoch 919/1000 69s 138ms/step - loss: 0.1013 - acc: 0.9983 - val_loss: 0.3635 - val_acc: 0.9319
...
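The "lr changed to …" lines in the log above come from the LearningRateScheduler callback in the script: the learning rate is multiplied by 0.1 every 300 epochs (the drop appears at displayed epochs 601 and 901 because Keras prints 1-based epoch numbers while the callback sees 0-based indices). A minimal standalone sketch of the same step-decay rule, assuming the script's initial rate of 0.1:

```python
def step_decay(epoch, initial_lr=0.1, drop=0.1, every=300):
    """Return the learning rate for a 0-based epoch index: the rate is
    multiplied by `drop` once per `every` completed epochs, matching the
    scheduler in the training script above (hypothetical helper name)."""
    return initial_lr * drop ** (epoch // every)

# Epochs 0-299 train at 0.1, epochs 300-599 at ~0.01, epochs 600-899 at ~0.001.
```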
Epoch 981/1000 69s 137ms/step - loss: 0.0992 - acc: 0.9987 - val_loss: 0.3619 - val_acc: 0.9324
Epoch 982/1000 68s 137ms/step - loss: 0.0994 - acc: 0.9986 - val_loss: 0.3619 - val_acc: 0.9332
Epoch 983/1000 69s 137ms/step - loss: 0.0995 - acc: 0.9986 - val_loss: 0.3617 - val_acc: 0.9329
Epoch 984/1000 68s 137ms/step - loss: 0.0991 - acc: 0.9987 - val_loss: 0.3622 - val_acc: 0.9328
Epoch 985/1000 68s 137ms/step - loss: 0.0991 - acc: 0.9987 - val_loss: 0.3628 - val_acc: 0.9322
Epoch 986/1000 68s 137ms/step - loss: 0.0993 - acc: 0.9987 - val_loss: 0.3625 - val_acc: 0.9319
Epoch 987/1000 68s 137ms/step - loss: 0.0995 - acc: 0.9986 - val_loss: 0.3629 - val_acc: 0.9317
Epoch 988/1000 69s 137ms/step - loss: 0.0993 - acc: 0.9985 - val_loss: 0.3628 - val_acc: 0.9319
Epoch 989/1000 69s 137ms/step - loss: 0.0997 - acc: 0.9984 - val_loss: 0.3624 - val_acc: 0.9322
Epoch 990/1000 69s 138ms/step - loss: 0.0993 - acc: 0.9986 - val_loss: 0.3622 - val_acc: 0.9323
Epoch 991/1000 68s 137ms/step - loss: 0.0993 - acc: 0.9986 - val_loss: 0.3625 - val_acc: 0.9327
Epoch 992/1000 69s 137ms/step - loss: 0.0993 - acc: 0.9988 - val_loss: 0.3630 - val_acc: 0.9325
Epoch 993/1000 68s 137ms/step - loss: 0.0992 - acc: 0.9984 - val_loss: 0.3634 - val_acc: 0.9320
Epoch 994/1000 69s 138ms/step - loss: 0.0991 - acc: 0.9988 - val_loss: 0.3627 - val_acc: 0.9328
Epoch 995/1000 69s 138ms/step - loss: 0.0989 - acc: 0.9989 - val_loss: 0.3637 - val_acc: 0.9321
Epoch 996/1000 69s 138ms/step - loss: 0.0994 - acc: 0.9986 - val_loss: 0.3623 - val_acc: 0.9319
Epoch 997/1000 69s 138ms/step - loss: 0.0987 - acc: 0.9987 - val_loss: 0.3622 - val_acc: 0.9322
Epoch 998/1000 69s 138ms/step - loss: 0.0989 - acc: 0.9988 - val_loss: 0.3621 - val_acc: 0.9325
Epoch 999/1000 69s 138ms/step - loss: 0.0993 - acc: 0.9984 - val_loss: 0.3615 - val_acc: 0.9326
Epoch 1000/1000 69s 138ms/step - loss: 0.0986 - acc: 0.9988 - val_loss: 0.3614 - val_acc: 0.9323
Train loss: 0.09943642792105675
Train accuracy: 0.9982600016593933
Test loss: 0.3614072059094906
Test accuracy: 0.9322999995946885

With shear_range = 30 added to the data augmentation, the accuracy actually dropped compared with tuning record 9.
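For reference, ImageDataGenerator's shear_range is a shear intensity in degrees: each training image is sheared by a random angle drawn from [-shear_range, shear_range]. The sketch below builds the corresponding shear matrix with NumPy only (the helper name is hypothetical; the matrix form follows the convention used by keras-preprocessing's affine transforms, stated here as an assumption rather than quoted from the library):

```python
import numpy as np

def shear_matrix(shear_deg):
    """3x3 affine shear matrix for an angle given in degrees, in the style
    of keras-preprocessing (hypothetical standalone helper)."""
    s = np.deg2rad(shear_deg)
    return np.array([[1.0, -np.sin(s), 0.0],
                     [0.0,  np.cos(s), 0.0],
                     [0.0,  0.0,       1.0]])

# shear_range = 30 corresponds to a random angle per image, e.g.:
rng = np.random.default_rng(0)
m = shear_matrix(rng.uniform(-30.0, 30.0))
```

Note that the matrix is not a pure rotation (its determinant is cos(shear), so a 30-degree shear also compresses the image), which may explain why such a strong setting hurt rather than helped on CIFAR-10.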
Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020
https://ieeexplore.ieee.org/document/8998530
--- From Tencent Cloud community user 7368967