Updated 29/Nov/2021 by Yoshihisa Nitta  

Further Training of a Cycle Generative Adversarial Network for the VidTIMIT dataset with TensorFlow 2 on Google Colab

This notebook assumes that you have already executed CycleGAN_VidTIMIT_Train.ipynb; it loads the saved model and trains it further.

VidTIMIT データセットに対して Cycle Generative Adversarial Network をGoogle Colab 上の Tensorflow 2 でさらに学習させる

既に CycleGAN_VidTIMIT_Train.ipynb を実行していることを前提とし、さらに学習を進める。

In [ ]:
MAX_EPOCHS = 50     # Number of additional epochs per run; change this value and re-run this notebook to keep training

save_path = '/content/drive/MyDrive/ColabRun/CycleGAN_VidTIMIT01'
VERBOSE = False
In [ ]:
#! pip install tensorflow==2.7.0
In [ ]:
! pip install tensorflow_addons
Collecting tensorflow_addons
  Downloading tensorflow_addons-0.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)
     |████████████████████████████████| 1.1 MB 15.8 MB/s 
Requirement already satisfied: typeguard>=2.7 in /usr/local/lib/python3.7/dist-packages (from tensorflow_addons) (2.7.1)
Installing collected packages: tensorflow-addons
Successfully installed tensorflow-addons-0.15.0
In [ ]:
%tensorflow_version 2.x

import tensorflow as tf
print(tf.__version__)
2.7.0
In [ ]:
import numpy as np

np.random.seed(2022)

Check the Google Colab runtime environment

Google Colab 実行環境を調べる

In [ ]:
! nvidia-smi
if VERBOSE:
    ! cat /proc/cpuinfo
    ! cat /etc/issue
    ! free -h
Tue Dec  7 21:31:29 2021       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 495.44       Driver Version: 460.32.03    CUDA Version: 11.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla V100-SXM2...  Off  | 00000000:00:04.0 Off |                    0 |
| N/A   42C    P0    26W / 300W |      0MiB / 16160MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

Mount Google Drive from Google Colab

Google Colab から GoogleDrive をマウントする

In [ ]:
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
In [ ]:
if VERBOSE:
    ! ls /content/drive

Download the source file from Google Drive or nw.tsuda.ac.jp

By default, download the file from Google Drive with gdown. Fall back to nw.tsuda.ac.jp only if a change in Google Drive's specifications prevents the gdown download.

Google Drive または nw.tsuda.ac.jp からファイルをダウンロードする

基本的に、Google Drive から gdown してください。 Google Drive の仕様が変わってダウンロードができない場合にのみ、nw.tsuda.ac.jp からダウンロードしてください。

In [ ]:
# Download source file
nw_path = './nw'
! rm -rf {nw_path}
! mkdir -p {nw_path}

if True:   # from Google Drive
    url_model =  'https://drive.google.com/uc?id=1aNvpPDNeDWYQFu_PA1kOtFlzcO5seHky'
    ! (cd {nw_path}; gdown {url_model})
else:      # from nw.tsuda.ac.jp
    URL_NW = 'https://nw.tsuda.ac.jp/lec/GoogleColab/pub'
    url_model = f'{URL_NW}/models/CycleGAN.py'
    ! wget -nd {url_model} -P {nw_path}
Downloading...
From: https://drive.google.com/uc?id=1aNvpPDNeDWYQFu_PA1kOtFlzcO5seHky
To: /content/nw/CycleGAN.py
100% 24.6k/24.6k [00:00<00:00, 19.7MB/s]
In [ ]:
if VERBOSE:
    ! cat {nw_path}/CycleGAN.py
In [ ]:
# Download zip files
VidTIMIT_site = 'https://zenodo.org/record/158963/files/'
VidTIMIT_fnames = [ 'fadg0', 'faks0']

Mirrored_files = [
    'https://drive.google.com/uc?id=1_Fv4p9MDNphMZMnLpEvtCtnwXgN8N5Cj', 
    'https://drive.google.com/uc?id=1Y8j7ThPVqB0gbx4hb9aMEp9Ptr9wFuoz'
]

data_dir = './datasets'
! rm -rf $data_dir
! mkdir -p $data_dir

for i, fname in enumerate(VidTIMIT_fnames):
    fzip = fname + '.zip'
    if False:   # download from the original VidTIMIT site (zenodo.org)
        url = VidTIMIT_site + fzip
        !wget {url}
    else:       # download from the Google Drive mirror
        url = Mirrored_files[i]
        !gdown {url}

    !unzip -q {fzip} -d {data_dir}
Downloading...
From: https://drive.google.com/uc?id=1_Fv4p9MDNphMZMnLpEvtCtnwXgN8N5Cj
To: /content/fadg0.zip
100% 81.6M/81.6M [00:01<00:00, 48.9MB/s]
Downloading...
From: https://drive.google.com/uc?id=1Y8j7ThPVqB0gbx4hb9aMEp9Ptr9wFuoz
To: /content/faks0.zip
100% 64.2M/64.2M [00:00<00:00, 66.4MB/s]

Make a DataGenerator from the VidTIMIT image files

VidTIMIT の画像ファイルから DataGenerator を作る

In [ ]:
IMAGE_SIZE = 128
In [ ]:
import os
import glob

imgA_paths = glob.glob(os.path.join(data_dir, VidTIMIT_fnames[0], 'video/*/[0-9]*'))
imgB_paths = glob.glob(os.path.join(data_dir, VidTIMIT_fnames[1], 'video/*/[0-9]*'))
In [ ]:
import numpy as np

validation_split = 0.05

nA, nB = len(imgA_paths), len(imgB_paths)
splitA = int(nA * (1 - validation_split))
splitB = int(nB * (1 - validation_split))

np.random.shuffle(imgA_paths)
np.random.shuffle(imgB_paths)

train_imgA_paths = imgA_paths[:splitA]
test_imgA_paths = imgA_paths[splitA:]
train_imgB_paths = imgB_paths[:splitB]
test_imgB_paths = imgB_paths[splitB:]
In [ ]:
# Image: [-1, 1] --> [0, 1]
def M1P1_ZeroP1(imgs):
    imgs = (imgs + 1) * 0.5
    return np.clip(imgs, 0, 1)

# Image: [0, 1] --> [-1, 1]
def ZeroP1_M1P1(imgs):
    return imgs * 2 - 1
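
As a minimal check of these two helpers, a round trip through both conversions should return the original values for inputs already inside [-1, 1]. The sketch below uses only NumPy and the functions defined above.

In [ ]:
# Round-trip check: [-1, 1] --> [0, 1] --> [-1, 1] is the identity
# for values that are already inside [-1, 1].
x = np.linspace(-1, 1, 5)
assert np.allclose(ZeroP1_M1P1(M1P1_ZeroP1(x)), x)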
In [ ]:
from nw.CycleGAN import PairDataset

pair_flow = PairDataset(train_imgA_paths, train_imgB_paths, target_size=(IMAGE_SIZE, IMAGE_SIZE))
test_pair_flow = PairDataset(test_imgA_paths, test_imgB_paths, target_size=(IMAGE_SIZE, IMAGE_SIZE))
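
Before training, it can be useful to look at one pair from the generator to confirm the array shape and value range. This sketch assumes that PairDataset supports slicing and returns image pairs scaled to [-1, 1], as the generation cell later in this notebook does with test_pair_flow[:5].

In [ ]:
# Inspect a single (A, B) pair from the training generator.
sample_pair = pair_flow[:1]
print(sample_pair.shape)                     # expected: (1, 2, 128, 128, 3)
print(sample_pair.min(), sample_pair.max())  # expected to lie within [-1, 1]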

Define the Neural Network Model

ニューラルネットワーク・モデルを定義する

In [ ]:
from nw.CycleGAN import CycleGAN

gan = CycleGAN.load(save_path)

print(gan.epoch)
750
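
Each run of this notebook resumes from the epoch count stored with the saved model, so the epoch numbers in the training log below continue where the previous run stopped. A small sketch of that arithmetic, using only gan.epoch and MAX_EPOCHS from above:

In [ ]:
# This run resumes at gan.epoch + 1 and trains MAX_EPOCHS additional epochs,
# e.g. 750 completed epochs + 50 further epochs --> epochs 751..800 below.
print(f'resuming after epoch {gan.epoch}, training up to epoch {gan.epoch + MAX_EPOCHS}')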

Train

訓練する

Further Training

さらに学習を進める

In [ ]:
gan.train(
    pair_flow,
    epochs = MAX_EPOCHS,
    batch_size=1,
    run_folder = save_path,
    print_step_interval = 1000,
    save_epoch_interval = 50
)
WARNING:tensorflow:5 out of the last 5 calls to <function Model.make_train_function.<locals>.train_function at 0x7f51e2711ef0> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.
Epoch 751/800 1000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.562 adv: 1.797 recon: 0.064 id: 0.062 time: 0:03:45.376853
Epoch 751/800 2000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.668 adv: 2.045 recon: 0.052 id: 0.053 time: 0:07:11.279277
Epoch 751/800 [D loss: 0.016 acc: 0.991][G loss: 2.667 adv: 1.883 recon: 0.065 id: 0.066 time: 0:09:14.769916
Epoch 752/800 1000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.549 adv: 1.919 recon: 0.052 id: 0.053 time: 0:12:39.378950
Epoch 752/800 2000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.490 adv: 1.876 recon: 0.051 id: 0.052 time: 0:16:06.596191
Epoch 752/800 [D loss: 0.016 acc: 0.991][G loss: 2.606 adv: 1.879 recon: 0.060 id: 0.062 time: 0:18:09.053658
Epoch 753/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.489 adv: 1.830 recon: 0.055 id: 0.055 time: 0:21:36.872877
Epoch 753/800 2000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.585 adv: 1.954 recon: 0.052 id: 0.053 time: 0:25:04.923162
Epoch 753/800 [D loss: 0.018 acc: 0.987][G loss: 2.689 adv: 1.871 recon: 0.068 id: 0.069 time: 0:27:07.927755
Epoch 754/800 1000/2595 [D loss: 0.024 acc: 1.000][G loss: 2.389 adv: 1.729 recon: 0.054 id: 0.057 time: 0:30:35.696196
Epoch 754/800 2000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.627 adv: 1.999 recon: 0.052 id: 0.053 time: 0:34:03.673666
Epoch 754/800 [D loss: 0.018 acc: 0.988][G loss: 2.736 adv: 1.872 recon: 0.072 id: 0.073 time: 0:36:06.782489
Epoch 755/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.457 adv: 1.824 recon: 0.052 id: 0.054 time: 0:39:33.589211
Epoch 755/800 2000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.661 adv: 1.825 recon: 0.070 id: 0.070 time: 0:42:59.800975
Epoch 755/800 [D loss: 0.016 acc: 0.989][G loss: 2.666 adv: 1.883 recon: 0.065 id: 0.066 time: 0:45:02.296048
Epoch 756/800 1000/2595 [D loss: 0.003 acc: 1.000][G loss: 3.045 adv: 2.021 recon: 0.086 id: 0.084 time: 0:48:29.336065
Epoch 756/800 2000/2595 [D loss: 0.019 acc: 0.993][G loss: 2.933 adv: 2.303 recon: 0.052 id: 0.054 time: 0:51:54.000817
Epoch 756/800 [D loss: 0.015 acc: 0.992][G loss: 2.632 adv: 1.888 recon: 0.062 id: 0.063 time: 0:53:55.981703
Epoch 757/800 1000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.493 adv: 1.856 recon: 0.053 id: 0.053 time: 0:57:19.457168
Epoch 757/800 2000/2595 [D loss: 0.026 acc: 0.996][G loss: 2.933 adv: 2.286 recon: 0.054 id: 0.054 time: 1:00:43.845591
Epoch 757/800 [D loss: 0.015 acc: 0.992][G loss: 2.652 adv: 1.886 recon: 0.064 id: 0.065 time: 1:02:44.764668
Epoch 758/800 1000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.443 adv: 1.798 recon: 0.054 id: 0.054 time: 1:06:10.172260
Epoch 758/800 2000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.503 adv: 1.823 recon: 0.057 id: 0.056 time: 1:09:33.064128
Epoch 758/800 [D loss: 0.015 acc: 0.992][G loss: 2.707 adv: 1.876 recon: 0.069 id: 0.070 time: 1:11:33.398421
Epoch 759/800 1000/2595 [D loss: 0.009 acc: 1.000][G loss: 2.514 adv: 1.877 recon: 0.053 id: 0.054 time: 1:14:55.505437
Epoch 759/800 2000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.434 adv: 1.805 recon: 0.052 id: 0.053 time: 1:18:20.099977
Epoch 759/800 [D loss: 0.013 acc: 0.995][G loss: 2.634 adv: 1.898 recon: 0.061 id: 0.063 time: 1:20:20.053619
Epoch 760/800 1000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.523 adv: 1.894 recon: 0.052 id: 0.053 time: 1:23:42.768353
Epoch 760/800 2000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.751 adv: 2.120 recon: 0.053 id: 0.053 time: 1:27:02.956354
Epoch 760/800 [D loss: 0.012 acc: 0.996][G loss: 2.618 adv: 1.899 recon: 0.060 id: 0.061 time: 1:29:07.404342
Epoch 761/800 1000/2595 [D loss: 0.026 acc: 1.000][G loss: 2.032 adv: 1.337 recon: 0.058 id: 0.058 time: 1:32:31.141564
Epoch 761/800 2000/2595 [D loss: 0.007 acc: 1.000][G loss: 3.079 adv: 2.438 recon: 0.053 id: 0.054 time: 1:35:52.124935
Epoch 761/800 [D loss: 0.014 acc: 0.994][G loss: 2.653 adv: 1.895 recon: 0.063 id: 0.064 time: 1:37:49.614719
Epoch 762/800 1000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.458 adv: 1.795 recon: 0.055 id: 0.055 time: 1:41:10.795260
Epoch 762/800 2000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.979 adv: 2.334 recon: 0.054 id: 0.054 time: 1:44:31.412914
Epoch 762/800 [D loss: 0.016 acc: 0.993][G loss: 2.640 adv: 1.866 recon: 0.064 id: 0.065 time: 1:46:30.827094
Epoch 763/800 1000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.523 adv: 1.869 recon: 0.054 id: 0.055 time: 1:49:51.251626
Epoch 763/800 2000/2595 [D loss: 0.011 acc: 1.000][G loss: 2.563 adv: 1.923 recon: 0.053 id: 0.054 time: 1:53:14.507776
Epoch 763/800 [D loss: 0.016 acc: 0.994][G loss: 2.597 adv: 1.880 recon: 0.060 id: 0.060 time: 1:55:15.796646
Epoch 764/800 1000/2595 [D loss: 0.014 acc: 1.000][G loss: 2.567 adv: 1.789 recon: 0.065 id: 0.063 time: 1:58:39.932721
Epoch 764/800 2000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.583 adv: 1.951 recon: 0.052 id: 0.054 time: 2:02:03.744003
Epoch 764/800 [D loss: 0.016 acc: 0.993][G loss: 2.598 adv: 1.865 recon: 0.061 id: 0.062 time: 2:04:04.843428
Epoch 765/800 1000/2595 [D loss: 0.010 acc: 1.000][G loss: 2.475 adv: 1.806 recon: 0.055 id: 0.057 time: 2:07:27.813554
Epoch 765/800 2000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.529 adv: 1.895 recon: 0.053 id: 0.054 time: 2:10:51.774763
Epoch 765/800 [D loss: 0.016 acc: 0.993][G loss: 2.593 adv: 1.874 recon: 0.060 id: 0.061 time: 2:12:51.173453
Epoch 766/800 1000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.964 adv: 1.964 recon: 0.085 id: 0.074 time: 2:16:16.080675
Epoch 766/800 2000/2595 [D loss: 0.013 acc: 1.000][G loss: 2.208 adv: 1.575 recon: 0.053 id: 0.054 time: 2:19:39.483593
Epoch 766/800 [D loss: 0.016 acc: 0.992][G loss: 2.622 adv: 1.881 recon: 0.062 id: 0.062 time: 2:21:40.924391
Epoch 767/800 1000/2595 [D loss: 0.012 acc: 1.000][G loss: 2.303 adv: 1.678 recon: 0.052 id: 0.053 time: 2:25:04.501841
Epoch 767/800 2000/2595 [D loss: 0.009 acc: 1.000][G loss: 2.700 adv: 2.048 recon: 0.054 id: 0.055 time: 2:28:29.564175
Epoch 767/800 [D loss: 0.016 acc: 0.991][G loss: 2.598 adv: 1.891 recon: 0.059 id: 0.060 time: 2:30:30.458695
Epoch 768/800 1000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.554 adv: 1.790 recon: 0.064 id: 0.062 time: 2:33:53.346604
Epoch 768/800 2000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.481 adv: 1.845 recon: 0.053 id: 0.053 time: 2:37:17.630176
Epoch 768/800 [D loss: 0.014 acc: 0.993][G loss: 2.621 adv: 1.889 recon: 0.061 id: 0.062 time: 2:39:19.752090
Epoch 769/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.579 adv: 1.948 recon: 0.052 id: 0.054 time: 2:42:45.501633
Epoch 769/800 2000/2595 [D loss: 0.012 acc: 1.000][G loss: 2.507 adv: 1.876 recon: 0.052 id: 0.053 time: 2:46:10.892446
Epoch 769/800 [D loss: 0.014 acc: 0.993][G loss: 2.613 adv: 1.899 recon: 0.059 id: 0.060 time: 2:48:11.408266
Epoch 770/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.627 adv: 1.965 recon: 0.055 id: 0.056 time: 2:51:35.452986
Epoch 770/800 2000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.534 adv: 1.910 recon: 0.052 id: 0.053 time: 2:54:58.366734
Epoch 770/800 [D loss: 0.013 acc: 0.994][G loss: 2.596 adv: 1.895 recon: 0.058 id: 0.059 time: 2:56:58.794319
Epoch 771/800 1000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.486 adv: 1.870 recon: 0.051 id: 0.053 time: 3:00:24.728076
Epoch 771/800 2000/2595 [D loss: 0.013 acc: 1.000][G loss: 2.452 adv: 1.808 recon: 0.054 id: 0.054 time: 3:03:50.028073
Epoch 771/800 [D loss: 0.015 acc: 0.992][G loss: 2.580 adv: 1.883 recon: 0.058 id: 0.059 time: 3:05:50.436768
Epoch 772/800 1000/2595 [D loss: 0.015 acc: 1.000][G loss: 2.553 adv: 1.924 recon: 0.052 id: 0.053 time: 3:09:16.921557
Epoch 772/800 2000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.833 adv: 2.168 recon: 0.055 id: 0.055 time: 3:12:43.375117
Epoch 772/800 [D loss: 0.016 acc: 0.990][G loss: 2.552 adv: 1.876 recon: 0.056 id: 0.057 time: 3:14:47.243194
Epoch 773/800 1000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.249 adv: 1.621 recon: 0.052 id: 0.053 time: 3:18:15.907985
Epoch 773/800 2000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.750 adv: 2.118 recon: 0.052 id: 0.053 time: 3:21:45.056431
Epoch 773/800 [D loss: 0.016 acc: 0.991][G loss: 2.570 adv: 1.875 recon: 0.058 id: 0.059 time: 3:23:49.423237
Epoch 774/800 1000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.407 adv: 1.788 recon: 0.051 id: 0.053 time: 3:27:20.808183
Epoch 774/800 2000/2595 [D loss: 0.013 acc: 1.000][G loss: 2.881 adv: 2.238 recon: 0.054 id: 0.054 time: 3:30:49.095005
Epoch 774/800 [D loss: 0.014 acc: 0.994][G loss: 2.551 adv: 1.893 recon: 0.055 id: 0.056 time: 3:32:50.559595
Epoch 775/800 1000/2595 [D loss: 0.011 acc: 1.000][G loss: 2.356 adv: 1.741 recon: 0.051 id: 0.052 time: 3:36:10.834600
Epoch 775/800 2000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.617 adv: 1.994 recon: 0.052 id: 0.053 time: 3:39:32.289099
Epoch 775/800 [D loss: 0.015 acc: 0.993][G loss: 2.620 adv: 1.893 recon: 0.061 id: 0.061 time: 3:41:30.837885
Epoch 776/800 1000/2595 [D loss: 0.012 acc: 1.000][G loss: 2.405 adv: 1.777 recon: 0.052 id: 0.054 time: 3:44:51.148941
Epoch 776/800 2000/2595 [D loss: 0.009 acc: 1.000][G loss: 2.478 adv: 1.821 recon: 0.055 id: 0.055 time: 3:48:10.887091
Epoch 776/800 [D loss: 0.015 acc: 0.992][G loss: 2.750 adv: 1.891 recon: 0.071 id: 0.072 time: 3:50:09.407338
Epoch 777/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.583 adv: 1.938 recon: 0.054 id: 0.054 time: 3:53:26.726058
Epoch 777/800 2000/2595 [D loss: 0.011 acc: 1.000][G loss: 2.466 adv: 1.843 recon: 0.052 id: 0.053 time: 3:56:43.882600
Epoch 777/800 [D loss: 0.014 acc: 0.993][G loss: 2.659 adv: 1.888 recon: 0.064 id: 0.065 time: 3:58:43.600231
Epoch 778/800 1000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.315 adv: 1.699 recon: 0.051 id: 0.053 time: 4:02:03.378352
Epoch 778/800 2000/2595 [D loss: 0.002 acc: 1.000][G loss: 2.680 adv: 1.931 recon: 0.062 id: 0.062 time: 4:05:20.302035
Epoch 778/800 [D loss: 0.016 acc: 0.990][G loss: 2.709 adv: 1.887 recon: 0.069 id: 0.069 time: 4:07:17.966259
Epoch 779/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.487 adv: 1.859 recon: 0.052 id: 0.054 time: 4:10:34.595400
Epoch 779/800 2000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.553 adv: 1.914 recon: 0.053 id: 0.054 time: 4:13:50.787373
Epoch 779/800 [D loss: 0.017 acc: 0.990][G loss: 2.663 adv: 1.885 recon: 0.065 id: 0.066 time: 4:15:50.306303
Epoch 780/800 1000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.388 adv: 1.770 recon: 0.051 id: 0.052 time: 4:19:11.058702
Epoch 780/800 2000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.857 adv: 2.228 recon: 0.052 id: 0.053 time: 4:22:34.082817
Epoch 780/800 [D loss: 0.014 acc: 0.993][G loss: 2.657 adv: 1.892 recon: 0.064 id: 0.064 time: 4:24:32.868915
Epoch 781/800 1000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.446 adv: 1.799 recon: 0.054 id: 0.054 time: 4:27:48.664137
Epoch 781/800 2000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.835 adv: 2.200 recon: 0.053 id: 0.054 time: 4:31:03.036758
Epoch 781/800 [D loss: 0.015 acc: 0.991][G loss: 2.690 adv: 1.873 recon: 0.068 id: 0.069 time: 4:33:00.006575
Epoch 782/800 1000/2595 [D loss: 0.009 acc: 1.000][G loss: 2.358 adv: 1.736 recon: 0.052 id: 0.052 time: 4:36:18.617045
Epoch 782/800 2000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.442 adv: 1.830 recon: 0.051 id: 0.052 time: 4:39:39.810535
Epoch 782/800 [D loss: 0.017 acc: 0.989][G loss: 2.635 adv: 1.868 recon: 0.064 id: 0.065 time: 4:41:40.378686
Epoch 783/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.653 adv: 1.905 recon: 0.063 id: 0.060 time: 4:44:59.528840
Epoch 783/800 2000/2595 [D loss: 0.002 acc: 1.000][G loss: 2.601 adv: 1.973 recon: 0.052 id: 0.053 time: 4:48:17.206198
Epoch 783/800 [D loss: 0.019 acc: 0.987][G loss: 2.642 adv: 1.862 recon: 0.065 id: 0.066 time: 4:50:13.660310
Epoch 784/800 1000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.420 adv: 1.672 recon: 0.063 id: 0.060 time: 4:53:30.505632
Epoch 784/800 2000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.454 adv: 1.837 recon: 0.051 id: 0.052 time: 4:56:47.992019
Epoch 784/800 [D loss: 0.017 acc: 0.988][G loss: 2.604 adv: 1.868 recon: 0.061 id: 0.063 time: 4:58:44.744891
Epoch 785/800 1000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.613 adv: 1.957 recon: 0.055 id: 0.055 time: 5:01:58.106185
Epoch 785/800 2000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.466 adv: 1.843 recon: 0.052 id: 0.053 time: 5:05:12.299642
Epoch 785/800 [D loss: 0.016 acc: 0.991][G loss: 2.624 adv: 1.868 recon: 0.063 id: 0.064 time: 5:07:07.084418
Epoch 786/800 1000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.396 adv: 1.732 recon: 0.055 id: 0.055 time: 5:10:23.412331
Epoch 786/800 2000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.707 adv: 2.057 recon: 0.054 id: 0.054 time: 5:13:38.393115
Epoch 786/800 [D loss: 0.017 acc: 0.989][G loss: 2.640 adv: 1.870 recon: 0.064 id: 0.066 time: 5:15:34.666717
Epoch 787/800 1000/2595 [D loss: 0.010 acc: 1.000][G loss: 2.723 adv: 2.041 recon: 0.057 id: 0.058 time: 5:18:47.798945
Epoch 787/800 2000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.545 adv: 1.914 recon: 0.052 id: 0.053 time: 5:22:04.257135
Epoch 787/800 [D loss: 0.016 acc: 0.991][G loss: 2.653 adv: 1.879 recon: 0.064 id: 0.067 time: 5:24:00.167488
Epoch 788/800 1000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.915 adv: 2.247 recon: 0.056 id: 0.056 time: 5:27:17.247531
Epoch 788/800 2000/2595 [D loss: 0.002 acc: 1.000][G loss: 2.456 adv: 1.822 recon: 0.053 id: 0.053 time: 5:30:34.359279
Epoch 788/800 [D loss: 0.014 acc: 0.995][G loss: 2.604 adv: 1.893 recon: 0.059 id: 0.061 time: 5:32:31.500816
Epoch 789/800 1000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.493 adv: 1.815 recon: 0.056 id: 0.057 time: 5:35:49.467603
Epoch 789/800 2000/2595 [D loss: 0.006 acc: 1.000][G loss: 2.682 adv: 2.053 recon: 0.052 id: 0.054 time: 5:39:04.605524
Epoch 789/800 [D loss: 0.014 acc: 0.993][G loss: 2.550 adv: 1.883 recon: 0.055 id: 0.056 time: 5:41:00.832057
Epoch 790/800 1000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.531 adv: 1.818 recon: 0.059 id: 0.059 time: 5:44:15.123158
Epoch 790/800 2000/2595 [D loss: 0.003 acc: 1.000][G loss: 2.748 adv: 2.110 recon: 0.053 id: 0.053 time: 5:47:30.702452
Epoch 790/800 [D loss: 0.016 acc: 0.993][G loss: 2.565 adv: 1.860 recon: 0.059 id: 0.059 time: 5:49:24.085746
Epoch 791/800 1000/2595 [D loss: 0.010 acc: 1.000][G loss: 2.476 adv: 1.830 recon: 0.054 id: 0.053 time: 5:52:34.914881
Epoch 791/800 2000/2595 [D loss: 0.010 acc: 1.000][G loss: 2.933 adv: 2.268 recon: 0.055 id: 0.055 time: 5:55:45.091080
Epoch 791/800 [D loss: 0.016 acc: 0.993][G loss: 2.650 adv: 1.877 recon: 0.064 id: 0.065 time: 5:57:38.134581
Epoch 792/800 1000/2595 [D loss: 0.023 acc: 1.000][G loss: 4.124 adv: 2.300 recon: 0.153 id: 0.147 time: 6:00:47.236963
Epoch 792/800 2000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.240 adv: 1.571 recon: 0.055 id: 0.058 time: 6:03:58.523786
Epoch 792/800 [D loss: 0.017 acc: 0.993][G loss: 2.689 adv: 1.864 recon: 0.068 id: 0.070 time: 6:05:50.622732
Epoch 793/800 1000/2595 [D loss: 0.033 acc: 0.997][G loss: 2.859 adv: 2.213 recon: 0.054 id: 0.055 time: 6:08:59.229409
Epoch 793/800 2000/2595 [D loss: 0.011 acc: 0.999][G loss: 2.351 adv: 1.738 recon: 0.051 id: 0.052 time: 6:12:07.415402
Epoch 793/800 [D loss: 0.018 acc: 0.990][G loss: 2.593 adv: 1.859 recon: 0.061 id: 0.062 time: 6:14:02.000365
Epoch 794/800 1000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.884 adv: 1.842 recon: 0.087 id: 0.088 time: 6:17:10.329166
Epoch 794/800 2000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.384 adv: 1.754 recon: 0.052 id: 0.053 time: 6:20:18.541098
Epoch 794/800 [D loss: 0.015 acc: 0.994][G loss: 2.611 adv: 1.886 recon: 0.060 id: 0.062 time: 6:22:10.910995
Epoch 795/800 1000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.491 adv: 1.854 recon: 0.053 id: 0.055 time: 6:25:19.401209
Epoch 795/800 2000/2595 [D loss: 0.005 acc: 1.000][G loss: 2.446 adv: 1.776 recon: 0.056 id: 0.056 time: 6:28:27.450823
Epoch 795/800 [D loss: 0.017 acc: 0.991][G loss: 2.570 adv: 1.857 recon: 0.059 id: 0.060 time: 6:30:19.590348
Epoch 796/800 1000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.528 adv: 1.873 recon: 0.055 id: 0.055 time: 6:33:26.288933
Epoch 796/800 2000/2595 [D loss: 0.008 acc: 1.000][G loss: 2.533 adv: 1.882 recon: 0.054 id: 0.055 time: 6:36:34.020393
Epoch 796/800 [D loss: 0.016 acc: 0.992][G loss: 2.641 adv: 1.878 recon: 0.063 id: 0.065 time: 6:38:28.290193
Epoch 797/800 1000/2595 [D loss: 0.024 acc: 1.000][G loss: 2.959 adv: 2.267 recon: 0.058 id: 0.058 time: 6:41:36.446982
Epoch 797/800 2000/2595 [D loss: 0.013 acc: 1.000][G loss: 1.871 adv: 1.254 recon: 0.051 id: 0.052 time: 6:44:44.134221
Epoch 797/800 [D loss: 0.018 acc: 0.989][G loss: 2.655 adv: 1.857 recon: 0.066 id: 0.067 time: 6:46:35.679371
Epoch 798/800 1000/2595 [D loss: 0.009 acc: 1.000][G loss: 2.363 adv: 1.740 recon: 0.052 id: 0.053 time: 6:49:43.845114
Epoch 798/800 2000/2595 [D loss: 0.009 acc: 1.000][G loss: 2.326 adv: 1.650 recon: 0.057 id: 0.056 time: 6:52:52.553835
Epoch 798/800 [D loss: 0.016 acc: 0.992][G loss: 2.652 adv: 1.870 recon: 0.065 id: 0.066 time: 6:54:44.702106
Epoch 799/800 1000/2595 [D loss: 0.006 acc: 1.000][G loss: 4.692 adv: 2.033 recon: 0.223 id: 0.216 time: 6:57:55.460577
Epoch 799/800 2000/2595 [D loss: 0.004 acc: 1.000][G loss: 2.459 adv: 1.826 recon: 0.052 id: 0.055 time: 7:01:04.633113
Epoch 799/800 [D loss: 0.015 acc: 0.993][G loss: 2.697 adv: 1.878 recon: 0.068 id: 0.070 time: 7:02:57.104427
Epoch 800/800 1000/2595 [D loss: 0.007 acc: 1.000][G loss: 2.444 adv: 1.828 recon: 0.051 id: 0.053 time: 7:06:07.336751
Epoch 800/800 2000/2595 [D loss: 0.010 acc: 1.000][G loss: 2.517 adv: 1.891 recon: 0.052 id: 0.053 time: 7:09:16.122167
Epoch 800/800 [D loss: 0.016 acc: 0.992][G loss: 2.663 adv: 1.869 recon: 0.066 id: 0.067 time: 7:11:08.156283
In [ ]:
! ls {save_path}/weights
combined-weights_100.h5  d_A-weights_600.h5   g_AB-weights_400.h5
combined-weights_150.h5  d_A-weights_650.h5   g_AB-weights_450.h5
combined-weights_1.h5	 d_A-weights_700.h5   g_AB-weights_500.h5
combined-weights_200.h5  d_A-weights_750.h5   g_AB-weights_50.h5
combined-weights_250.h5  d_A-weights_800.h5   g_AB-weights_550.h5
combined-weights_300.h5  d_A-weights.h5       g_AB-weights_5.h5
combined-weights_350.h5  d_B-weights_100.h5   g_AB-weights_600.h5
combined-weights_3.h5	 d_B-weights_150.h5   g_AB-weights_650.h5
combined-weights_400.h5  d_B-weights_1.h5     g_AB-weights_700.h5
combined-weights_450.h5  d_B-weights_200.h5   g_AB-weights_750.h5
combined-weights_500.h5  d_B-weights_250.h5   g_AB-weights_800.h5
combined-weights_50.h5	 d_B-weights_300.h5   g_AB-weights.h5
combined-weights_550.h5  d_B-weights_350.h5   g_BA-weights_100.h5
combined-weights_5.h5	 d_B-weights_3.h5     g_BA-weights_150.h5
combined-weights_600.h5  d_B-weights_400.h5   g_BA-weights_1.h5
combined-weights_650.h5  d_B-weights_450.h5   g_BA-weights_200.h5
combined-weights_700.h5  d_B-weights_500.h5   g_BA-weights_250.h5
combined-weights_750.h5  d_B-weights_50.h5    g_BA-weights_300.h5
combined-weights_800.h5  d_B-weights_550.h5   g_BA-weights_350.h5
combined-weights.h5	 d_B-weights_5.h5     g_BA-weights_3.h5
d_A-weights_100.h5	 d_B-weights_600.h5   g_BA-weights_400.h5
d_A-weights_150.h5	 d_B-weights_650.h5   g_BA-weights_450.h5
d_A-weights_1.h5	 d_B-weights_700.h5   g_BA-weights_500.h5
d_A-weights_200.h5	 d_B-weights_750.h5   g_BA-weights_50.h5
d_A-weights_250.h5	 d_B-weights_800.h5   g_BA-weights_550.h5
d_A-weights_300.h5	 d_B-weights.h5       g_BA-weights_5.h5
d_A-weights_350.h5	 g_AB-weights_100.h5  g_BA-weights_600.h5
d_A-weights_3.h5	 g_AB-weights_150.h5  g_BA-weights_650.h5
d_A-weights_400.h5	 g_AB-weights_1.h5    g_BA-weights_700.h5
d_A-weights_450.h5	 g_AB-weights_200.h5  g_BA-weights_750.h5
d_A-weights_500.h5	 g_AB-weights_250.h5  g_BA-weights_800.h5
d_A-weights_50.h5	 g_AB-weights_300.h5  g_BA-weights.h5
d_A-weights_550.h5	 g_AB-weights_350.h5
d_A-weights_5.h5	 g_AB-weights_3.h5
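
A numbered snapshot of every model is kept each save_epoch_interval epochs, alongside the latest weights. If you ever need to roll back to a particular snapshot, one possible approach is sketched below; it assumes the CycleGAN instance exposes its Keras sub-models under the same names used in the file names (g_AB, g_BA, d_A, d_B), which should be verified against nw/CycleGAN.py before use.

In [ ]:
# Hypothetical rollback to the epoch-750 snapshot.
# The attribute names g_AB, g_BA, d_A, d_B are assumptions; check nw/CycleGAN.py.
epoch_tag = 750
for name, model in [('g_AB', gan.g_AB), ('g_BA', gan.g_BA),
                    ('d_A', gan.d_A), ('d_B', gan.d_B)]:
    model.load_weights(f'{save_path}/weights/{name}-weights_{epoch_tag}.h5')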

Generate Images

画像を生成する

In [ ]:
# Display images
# 画像を表示する。
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np

def showImages(imgs, rows=-1, cols=-1, w=2, h=2):
    N = len(imgs)
    if rows < 0: rows = 1
    if cols < 0: cols = (N + rows -1) // rows
    fig, ax = plt.subplots(rows, cols, figsize=(w*cols, h*rows))
    idx = 0
    for row in range(rows):
        for col in range(cols) :
            if rows == 1 and cols == 1:
                axis = ax
            elif rows == 1:
                axis = ax[col]
            elif cols == 1:
                axis = ax[row]
            else:
                axis = ax[row][col]

            if idx < N:
                axis.imshow(imgs[idx])
            axis.axis('off')
            idx += 1
    plt.show()
In [ ]:
# Display generated and cycle images.
# 生成画像とサイクル画像を表示する。

test_pairs = test_pair_flow[:5]

test_imgsA = test_pairs[:,0]
test_imgsB = test_pairs[:,1]

imgsAB = gan.generate_image_from_A(test_imgsA)
imgsBA = gan.generate_image_from_B(test_imgsB)

print('A-->B-->A, ID')
showImages(M1P1_ZeroP1(imgsAB), 4)

print('B-->A-->B, ID')
showImages(M1P1_ZeroP1(imgsBA), 4)
A-->B-->A, ID
B-->A-->B, ID

Check the loss and accuracy of the training process.

学習過程のlossと精度を確認する

In [ ]:
# Display the graph of losses in training
%matplotlib inline

gan.showLoss()
loss AB
loss BA

Check the saved files

保存されているファイルを確認する

In [ ]:
! ls -lR {save_path}
/content/drive/MyDrive/ColabRun/CycleGAN_VidTIMIT01:
total 2090
drwx------ 2 root root   4096 Dec  7 08:30 BAK
-rw------- 1 root root  27155 Nov 30 02:00 params_100.pkl
-rw------- 1 root root  40955 Nov 30 08:35 params_150.pkl
-rw------- 1 root root    413 Nov 29 12:52 params_1.pkl
-rw------- 1 root root  55105 Nov 30 18:35 params_200.pkl
-rw------- 1 root root  69205 Dec  1 03:22 params_250.pkl
-rw------- 1 root root  83356 Dec  2 07:39 params_300.pkl
-rw------- 1 root root  97456 Dec  2 14:25 params_350.pkl
-rw------- 1 root root    895 Nov 29 13:08 params_3.pkl
-rw------- 1 root root 111606 Dec  2 21:14 params_400.pkl
-rw------- 1 root root 125756 Dec  3 09:36 params_450.pkl
-rw------- 1 root root 139906 Dec  3 18:25 params_500.pkl
-rw------- 1 root root  13355 Nov 29 19:32 params_50.pkl
-rw------- 1 root root 154056 Dec  5 01:54 params_550.pkl
-rw------- 1 root root   1416 Nov 29 13:25 params_5.pkl
-rw------- 1 root root 168156 Dec  5 10:27 params_600.pkl
-rw------- 1 root root 182306 Dec  6 06:11 params_650.pkl
-rw------- 1 root root 196406 Dec  6 13:00 params_700.pkl
-rw------- 1 root root 210556 Dec  6 19:44 params_750.pkl
-rw------- 1 root root 224706 Dec  8 04:43 params_800.pkl
-rw------- 1 root root 224706 Dec  8 04:43 params.pkl
drwx------ 2 root root   4096 Dec  8 04:43 weights

/content/drive/MyDrive/ColabRun/CycleGAN_VidTIMIT01/BAK:
total 70632
-rw------- 1 root root 17974272 Dec  7 07:01 combined-weights_800.h5
-rw------- 1 root root 17974272 Dec  7 07:01 combined-weights.h5
-rw------- 1 root root  2805136 Dec  7 07:01 d_A-weights_800.h5
-rw------- 1 root root  2805136 Dec  7 07:01 d_A-weights.h5
-rw------- 1 root root  2805136 Dec  7 07:01 d_B-weights_800.h5
-rw------- 1 root root  2805136 Dec  7 07:01 d_B-weights.h5
-rw------- 1 root root  6232880 Dec  7 07:01 g_AB-weights_800.h5
-rw------- 1 root root  6232880 Dec  7 07:01 g_AB-weights.h5
-rw------- 1 root root  6232880 Dec  7 07:01 g_BA-weights_800.h5
-rw------- 1 root root  6232880 Dec  7 07:01 g_BA-weights.h5
-rw------- 1 root root   224706 Dec  7 08:30 params.pkl

/content/drive/MyDrive/ColabRun/CycleGAN_VidTIMIT01/weights:
total 704120
-rw------- 1 root root 17974272 Nov 30 02:00 combined-weights_100.h5
-rw------- 1 root root 17974272 Nov 30 08:35 combined-weights_150.h5
-rw------- 1 root root 17974272 Nov 29 12:52 combined-weights_1.h5
-rw------- 1 root root 17974272 Nov 30 18:35 combined-weights_200.h5
-rw------- 1 root root 17974272 Dec  1 03:22 combined-weights_250.h5
-rw------- 1 root root 17974272 Dec  2 07:39 combined-weights_300.h5
-rw------- 1 root root 17974272 Dec  2 14:25 combined-weights_350.h5
-rw------- 1 root root 17974272 Nov 29 13:08 combined-weights_3.h5
-rw------- 1 root root 17974272 Dec  2 21:14 combined-weights_400.h5
-rw------- 1 root root 17974272 Dec  3 09:36 combined-weights_450.h5
-rw------- 1 root root 17974272 Dec  3 18:25 combined-weights_500.h5
-rw------- 1 root root 17974272 Nov 29 19:32 combined-weights_50.h5
-rw------- 1 root root 17974272 Dec  5 01:54 combined-weights_550.h5
-rw------- 1 root root 17974272 Nov 29 13:25 combined-weights_5.h5
-rw------- 1 root root 17974272 Dec  5 10:27 combined-weights_600.h5
-rw------- 1 root root 17974272 Dec  6 06:11 combined-weights_650.h5
-rw------- 1 root root 17974272 Dec  6 13:00 combined-weights_700.h5
-rw------- 1 root root 17974272 Dec  6 19:44 combined-weights_750.h5
-rw------- 1 root root 17974272 Dec  8 04:43 combined-weights_800.h5
-rw------- 1 root root 17974272 Dec  8 04:43 combined-weights.h5
-rw------- 1 root root  2805136 Nov 30 02:00 d_A-weights_100.h5
-rw------- 1 root root  2805136 Nov 30 08:35 d_A-weights_150.h5
-rw------- 1 root root  2805136 Nov 29 12:52 d_A-weights_1.h5
-rw------- 1 root root  2805136 Nov 30 18:35 d_A-weights_200.h5
-rw------- 1 root root  2805136 Dec  1 03:22 d_A-weights_250.h5
-rw------- 1 root root  2805136 Dec  2 07:39 d_A-weights_300.h5
-rw------- 1 root root  2805136 Dec  2 14:25 d_A-weights_350.h5
-rw------- 1 root root  2805136 Nov 29 13:08 d_A-weights_3.h5
-rw------- 1 root root  2805136 Dec  2 21:14 d_A-weights_400.h5
-rw------- 1 root root  2805136 Dec  3 09:36 d_A-weights_450.h5
-rw------- 1 root root  2805136 Dec  3 18:25 d_A-weights_500.h5
-rw------- 1 root root  2805136 Nov 29 19:32 d_A-weights_50.h5
-rw------- 1 root root  2805136 Dec  5 01:54 d_A-weights_550.h5
-rw------- 1 root root  2805136 Nov 29 13:25 d_A-weights_5.h5
-rw------- 1 root root  2805136 Dec  5 10:27 d_A-weights_600.h5
-rw------- 1 root root  2805136 Dec  6 06:11 d_A-weights_650.h5
-rw------- 1 root root  2805136 Dec  6 13:00 d_A-weights_700.h5
-rw------- 1 root root  2805136 Dec  6 19:44 d_A-weights_750.h5
-rw------- 1 root root  2805136 Dec  8 04:43 d_A-weights_800.h5
-rw------- 1 root root  2805136 Dec  8 04:43 d_A-weights.h5
-rw------- 1 root root  2805136 Nov 30 02:00 d_B-weights_100.h5
-rw------- 1 root root  2805136 Nov 30 08:35 d_B-weights_150.h5
-rw------- 1 root root  2805136 Nov 29 12:52 d_B-weights_1.h5
-rw------- 1 root root  2805136 Nov 30 18:35 d_B-weights_200.h5
-rw------- 1 root root  2805136 Dec  1 03:22 d_B-weights_250.h5
-rw------- 1 root root  2805136 Dec  2 07:39 d_B-weights_300.h5
-rw------- 1 root root  2805136 Dec  2 14:25 d_B-weights_350.h5
-rw------- 1 root root  2805136 Nov 29 13:08 d_B-weights_3.h5
-rw------- 1 root root  2805136 Dec  2 21:14 d_B-weights_400.h5
-rw------- 1 root root  2805136 Dec  3 09:36 d_B-weights_450.h5
-rw------- 1 root root  2805136 Dec  3 18:25 d_B-weights_500.h5
-rw------- 1 root root  2805136 Nov 29 19:32 d_B-weights_50.h5
-rw------- 1 root root  2805136 Dec  5 01:54 d_B-weights_550.h5
-rw------- 1 root root  2805136 Nov 29 13:25 d_B-weights_5.h5
-rw------- 1 root root  2805136 Dec  5 10:27 d_B-weights_600.h5
-rw------- 1 root root  2805136 Dec  6 06:11 d_B-weights_650.h5
-rw------- 1 root root  2805136 Dec  6 13:00 d_B-weights_700.h5
-rw------- 1 root root  2805136 Dec  6 19:44 d_B-weights_750.h5
-rw------- 1 root root  2805136 Dec  8 04:43 d_B-weights_800.h5
-rw------- 1 root root  2805136 Dec  8 04:43 d_B-weights.h5
-rw------- 1 root root  6232880 Nov 30 02:00 g_AB-weights_100.h5
-rw------- 1 root root  6232880 Nov 30 08:35 g_AB-weights_150.h5
-rw------- 1 root root  6232880 Nov 29 12:52 g_AB-weights_1.h5
-rw------- 1 root root  6232880 Nov 30 18:35 g_AB-weights_200.h5
-rw------- 1 root root  6232880 Dec  1 03:22 g_AB-weights_250.h5
-rw------- 1 root root  6232880 Dec  2 07:39 g_AB-weights_300.h5
-rw------- 1 root root  6232880 Dec  2 14:25 g_AB-weights_350.h5
-rw------- 1 root root  6232880 Nov 29 13:08 g_AB-weights_3.h5
-rw------- 1 root root  6232880 Dec  2 21:14 g_AB-weights_400.h5
-rw------- 1 root root  6232880 Dec  3 09:36 g_AB-weights_450.h5
-rw------- 1 root root  6232880 Dec  3 18:25 g_AB-weights_500.h5
-rw------- 1 root root  6232880 Nov 29 19:32 g_AB-weights_50.h5
-rw------- 1 root root  6232880 Dec  5 01:54 g_AB-weights_550.h5
-rw------- 1 root root  6232880 Nov 29 13:25 g_AB-weights_5.h5
-rw------- 1 root root  6232880 Dec  5 10:27 g_AB-weights_600.h5
-rw------- 1 root root  6232880 Dec  6 06:11 g_AB-weights_650.h5
-rw------- 1 root root  6232880 Dec  6 13:00 g_AB-weights_700.h5
-rw------- 1 root root  6232880 Dec  6 19:44 g_AB-weights_750.h5
-rw------- 1 root root  6232880 Dec  8 04:43 g_AB-weights_800.h5
-rw------- 1 root root  6232880 Dec  8 04:43 g_AB-weights.h5
-rw------- 1 root root  6232880 Nov 30 02:00 g_BA-weights_100.h5
-rw------- 1 root root  6232880 Nov 30 08:35 g_BA-weights_150.h5
-rw------- 1 root root  6232880 Nov 29 12:52 g_BA-weights_1.h5
-rw------- 1 root root  6232880 Nov 30 18:35 g_BA-weights_200.h5
-rw------- 1 root root  6232880 Dec  1 03:22 g_BA-weights_250.h5
-rw------- 1 root root  6232880 Dec  2 07:39 g_BA-weights_300.h5
-rw------- 1 root root  6232880 Dec  2 14:25 g_BA-weights_350.h5
-rw------- 1 root root  6232880 Nov 29 13:08 g_BA-weights_3.h5
-rw------- 1 root root  6232880 Dec  2 21:14 g_BA-weights_400.h5
-rw------- 1 root root  6232880 Dec  3 09:36 g_BA-weights_450.h5
-rw------- 1 root root  6232880 Dec  3 18:25 g_BA-weights_500.h5
-rw------- 1 root root  6232880 Nov 29 19:32 g_BA-weights_50.h5
-rw------- 1 root root  6232880 Dec  5 01:54 g_BA-weights_550.h5
-rw------- 1 root root  6232880 Nov 29 13:25 g_BA-weights_5.h5
-rw------- 1 root root  6232880 Dec  5 10:27 g_BA-weights_600.h5
-rw------- 1 root root  6232880 Dec  6 06:11 g_BA-weights_650.h5
-rw------- 1 root root  6232880 Dec  6 13:00 g_BA-weights_700.h5
-rw------- 1 root root  6232880 Dec  6 19:44 g_BA-weights_750.h5
-rw------- 1 root root  6232880 Dec  8 04:43 g_BA-weights_800.h5
-rw------- 1 root root  6232880 Dec  8 04:43 g_BA-weights.h5
In [ ]: