Segmentation of Eucalyptus Areas in Eastern Mato Grosso do Sul

In this example, we build a model to segment eucalyptus plantations in Sentinel-2 imagery. We selected several areas for which eucalyptus polygons had been collected; these polygons were rasterized into mask images that are used together with the Sentinel-2 images to train the model.
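The polygon-to-mask step itself is not shown in this notebook (in practice `rasterio.features.rasterize` does it directly). As an illustration of what rasterization means, here is a minimal numpy-only sketch that marks each pixel whose center falls inside a polygon; `polygon_to_mask` and its ray-casting test are my own names and implementation, not the notebook's:

```python
import numpy as np

def polygon_to_mask(poly_xy, height, width):
    """Rasterize a polygon (list of (x, y) vertices in pixel coordinates)
    into a binary mask via an even-odd ray-casting test per pixel center."""
    poly = np.asarray(poly_xy, dtype=float)
    ys, xs = np.mgrid[0:height, 0:width]
    px = xs.ravel() + 0.5  # pixel centers avoid boundary ambiguity
    py = ys.ravel() + 0.5
    inside = np.zeros(px.shape, dtype=bool)
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        crosses = (y1 > py) != (y2 > py)  # edge spans the ray's y level
        with np.errstate(divide='ignore', invalid='ignore'):
            x_at = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
        inside ^= crosses & (px < x_at)   # toggle parity at each crossing
    return inside.reshape(height, width).astype(np.uint8)

# A 4x4 square polygon inside an 8x8 tile.
mask = polygon_to_mask([(2, 2), (6, 2), (6, 6), (2, 6)], 8, 8)
print(mask.sum())  # 16 pixels inside
```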

image.png

Ten Sentinel-2 bands were selected: B02, B03, B04, B05, B06, B07, B08, B8A, B11 and B12. Images from three different periods of the year were also selected.
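Since the arrays below store bands by position, it helps to keep a lookup from channel index to band name. The ordering here is an assumption, inferred from the band list above and from the RGB plot later in the notebook (which reads channels 2, 1, 0 as red, green, blue):

```python
# Assumed channel order, matching the band list above.
BANDS = ['B02', 'B03', 'B04', 'B05', 'B06', 'B07', 'B08', 'B8A', 'B11', 'B12']
BAND_INDEX = {name: i for i, name in enumerate(BANDS)}

# Channel indices for an RGB composite (B04 = red, B03 = green, B02 = blue).
rgb_indices = [BAND_INDEX[b] for b in ('B04', 'B03', 'B02')]
print(rgb_indices)  # [2, 1, 0]
```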

image.png

The dataset is already prepared, so we start by installing rasterio and importing some libraries:

In [ ]:
!pip install rasterio
Collecting rasterio
  Downloading rasterio-1.2.10-cp37-cp37m-manylinux1_x86_64.whl (19.3 MB)
... (dependency resolution output omitted) ...
Successfully installed affine-2.3.1 click-plugins-1.1.1 cligj-0.7.2 rasterio-1.2.10 snuggs-1.4.7
In [ ]:
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
In [ ]:
import rasterio
import os
import cv2
import matplotlib.pyplot as plt
import numpy as np
from sklearn.model_selection import train_test_split
from rasterio.merge import merge
from rasterio.windows import Window

We define the paths to the images and masks for the three periods of the year:

In [ ]:
path_img_dry = 'drive/My Drive/Datasets/ForestryView/data/2019_dry_img'
path_img_wet = 'drive/My Drive/Datasets/ForestryView/data/2019_wet_img'
path_mask_dry = 'drive/My Drive/Datasets/ForestryView/data/2019_dry_tgt'
path_mask_wet = 'drive/My Drive/Datasets/ForestryView/data/2019_wet_tgt'
path_img_int = 'drive/My Drive/Datasets/ForestryView/data/2019_int_img'
path_mask_int = 'drive/My Drive/Datasets/ForestryView/data/2019_int_tgt'
In [ ]:
X_wet = []
images_files_wet = os.listdir(path_img_wet)
for i in range(len(images_files_wet)):
  import_raster = os.path.join(path_img_wet, 'img_' + str(i+1) + '.tif')
  print(i)
  with rasterio.open(import_raster) as src:
    im = src.read()
  im = im.transpose([1, 2, 0])  # (bands, H, W) -> (H, W, bands)
  im = np.nan_to_num(im)        # replace NaN pixels with 0
  im = im[:, :, 0:10]           # keep the first 10 bands
  full_img = im.copy()
  print(full_img.shape)
  # Tiles are 334 or 335 pixels per side: trim the odd row/column,
  # then crop a 7-pixel margin to get a uniform 320x320.
  if full_img.shape[0] == 335:
    full_img = full_img[1:, :, :]
  if full_img.shape[1] == 335:
    full_img = full_img[:, 1:, :]
  full_img = full_img[7:-7, 7:-7, :]
  X_wet.append(full_img)
X_wet = np.array(X_wet)
print(X_wet.shape)



Y_wet = []
mask_files_wet = os.listdir(path_mask_wet)
for i in range(len(mask_files_wet)):
  import_raster = os.path.join(path_mask_wet, 'tgt_' + str(i+1) + '.tif')
  with rasterio.open(import_raster) as src:
    im = src.read()
  im = im.transpose([1, 2, 0])
  print(im.shape)
  # Same trimming as for the images, so image and mask stay aligned.
  if im.shape[0] == 335:
    im = im[1:, :, :]
  if im.shape[1] == 335:
    im = im[:, 1:, :]
  im = im[7:-7, 7:-7, :]
  Y_wet.append(im)
Y_wet = np.array(Y_wet)
print(Y_wet.shape)
0
(335, 335, 10)
1
(335, 335, 10)
... (indices 2-109 omitted: all image tiles are (335, 335, 10) except a few at (335, 334, 10) or (334, 335, 10)) ...
109
(335, 335, 10)
(110, 320, 320, 10)
(335, 335, 1)
... (mask shapes omitted: all (335, 335, 1) except a few at (335, 334, 1) or (334, 335, 1)) ...
(110, 320, 320, 1)
In [ ]:
X_dry = []
images_files_dry = [f for f in os.listdir(path_img_dry)]
for i in range(len(images_files_dry)):
  import_raster = os.path.join(path_img_dry,'img_' + str(i+1) + '.tif')
  print(i)
  with rasterio.open(import_raster) as src:
    im = src.read()
  im = im.transpose([1,2,0])
  im = np.nan_to_num(im)
  im = im[:,:,0:10]
  full_img = im.copy()
  print(full_img.shape)
  if (full_img.shape[0] == 335):
    full_img = full_img[1:,:,:]
  if (full_img.shape[1] == 335):
    full_img = full_img[:,1:,:]
  full_img = full_img[7:-7,7:-7,:]
  X_dry.append(full_img)
X_dry = np.array(X_dry)
print(X_dry.shape)



Y_dry = []
mask_files_dry = [f for f in os.listdir(path_mask_dry)]
for i in range(len(mask_files_dry)):
  import_raster = os.path.join(path_mask_dry,'tgt_' + str(i+1) + '.tif')
  with rasterio.open(import_raster) as src:
    im = src.read()
  im = im.transpose([1,2,0])
  print(im.shape)
  if (im.shape[0] == 335):
    im = im[1:,:,:]
  if (im.shape[1] == 335):
    im = im[:,1:,:]
  im = im[7:-7,7:-7,:]
  Y_dry.append(im)
Y_dry = np.array(Y_dry)
print(Y_dry.shape)
0
(335, 335, 10)
1
(335, 335, 10)
... (indices 2-109 omitted: all image tiles are (335, 335, 10) except a few at (335, 334, 10) or (334, 335, 10)) ...
109
(335, 335, 10)
(110, 320, 320, 10)
(335, 335, 1)
... (mask shapes omitted: all (335, 335, 1) except a few at (335, 334, 1) or (334, 335, 1)) ...
(110, 320, 320, 1)
In [ ]:
X_int = []
images_files_int = [f for f in os.listdir(path_img_int)]
for i in range(len(images_files_int)):
  import_raster = os.path.join(path_img_int,'img_' + str(i+1) + '.tif')
  print(i)
  with rasterio.open(import_raster) as src:
    im = src.read()
  im = im.transpose([1,2,0])
  im = np.nan_to_num(im)
  im = im[:,:,0:10]
  full_img = im.copy()
  print(full_img.shape)
  if (full_img.shape[0] == 335):
    full_img = full_img[1:,:,:]
  if (full_img.shape[1] == 335):
    full_img = full_img[:,1:,:]
  full_img = full_img[7:-7,7:-7,:]
  X_int.append(full_img)
X_int = np.array(X_int)
print(X_int.shape)



Y_int = []
mask_files_int = [f for f in os.listdir(path_mask_int)]
for i in range(len(mask_files_int)):
  import_raster = os.path.join(path_mask_int,'tgt_' + str(i+1) + '.tif')
  with rasterio.open(import_raster) as src:
    im = src.read()
  im = im.transpose([1,2,0])
  print(im.shape)
  if (im.shape[0] == 335):
    im = im[1:,:,:]
  if (im.shape[1] == 335):
    im = im[:,1:,:]
  im = im[7:-7,7:-7,:]
  Y_int.append(im)
Y_int = np.array(Y_int)
print(Y_int.shape)
0
(335, 335, 10)
1
(335, 335, 10)
... (indices 2-109 omitted: all image tiles are (335, 335, 10) except a few at (335, 334, 10) or (334, 335, 10)) ...
109
(335, 335, 10)
(110, 320, 320, 10)
(335, 335, 1)
... (mask shapes omitted: all (335, 335, 1) except a few at (335, 334, 1) or (334, 335, 1)) ...
(110, 320, 320, 1)
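The three loading cells repeat the same trimming logic: tiles arrive as 334 or 335 pixels per side, the odd row/column is dropped, then a 7-pixel margin is removed to reach a uniform 320x320. That logic, factored into a helper (the function name is mine, not from the notebook):

```python
import numpy as np

def crop_to_320(tile):
    """Normalize a (H, W, C) tile with H, W in {334, 335} to 320x320:
    drop the first row/column when a side is 335, then remove a
    7-pixel margin on every side (334 - 2*7 = 320)."""
    if tile.shape[0] == 335:
        tile = tile[1:, :, :]
    if tile.shape[1] == 335:
        tile = tile[:, 1:, :]
    return tile[7:-7, 7:-7, :]

a = crop_to_320(np.zeros((335, 335, 10)))
b = crop_to_320(np.zeros((335, 334, 10)))
print(a.shape, b.shape)  # (320, 320, 10) (320, 320, 10)
```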

After importing the data, we split each period into a training part and a test part, then concatenate the three periods into a single training set and a single test set. We do the same for the masks.

In [ ]:
X_wet_train = X_wet[0:90,:,:,:].copy()
Y_wet_train = Y_wet[0:90,:,:,:].copy()
X_dry_train = X_dry[0:90,:,:,:].copy()
Y_dry_train = Y_dry[0:90,:,:,:].copy()
X_int_train = X_int[0:90,:,:,:].copy()
Y_int_train = Y_int[0:90,:,:,:].copy()


# Note: 90:109 keeps 19 of the remaining 20 tiles per period;
# index 109 (the last tile) ends up in neither set.
X_wet_test = X_wet[90:109,:,:,:].copy()
Y_wet_test = Y_wet[90:109,:,:,:].copy()
X_dry_test = X_dry[90:109,:,:,:].copy()
Y_dry_test = Y_dry[90:109,:,:,:].copy()
X_int_test = X_int[90:109,:,:,:].copy()
Y_int_test = Y_int[90:109,:,:,:].copy()



x_train = np.concatenate((X_wet_train,X_dry_train,X_int_train))
y_train = np.concatenate((Y_wet_train,Y_dry_train,Y_int_train))
x_test = np.concatenate((X_wet_test,X_dry_test,X_int_test))
y_test = np.concatenate((Y_wet_test,Y_dry_test,Y_int_test))
In [ ]:
del X_wet, X_dry, X_int, Y_wet, Y_dry, Y_int
In [ ]:
print(np.unique(y_test, return_counts=True))
print(np.unique(y_train, return_counts=True))
(array([0., 1.], dtype=float32), array([4356074, 1480726]))
(array([0., 1.], dtype=float32), array([21748725,  5899275]))
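The counts above show the dataset is imbalanced: only about a quarter of the pixels belong to the eucalyptus class, which is one reason a Dice loss is used later instead of plain cross-entropy. Computing the fraction from the printed test-set counts:

```python
# Pixel counts printed above for y_test: class 0 (background), class 1 (eucalyptus).
background, eucalyptus = 4356074, 1480726
total = background + eucalyptus          # 57 * 320 * 320 = 5,836,800 pixels
frac = eucalyptus / total
print(f"eucalyptus pixel fraction: {frac:.3f}")  # 0.254
```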
In [ ]:
x_test.shape
Out[ ]:
(57, 320, 320, 10)
In [ ]:
y_test.shape
Out[ ]:
(57, 320, 320, 1)

Now let's plot an example in RGB:

Dataset link: https://drive.google.com/drive/folders/1S1Ng8mzspJWMqQY9F2jOpVFmrrKY-i-1?usp=drive_link

In [ ]:
# Channels 2, 1, 0 hold bands B04 (red), B03 (green) and B02 (blue);
# the factor of 4 is a simple brightness stretch for display.
R = x_test[8,:,:,2]*4
G = x_test[8,:,:,1]*4
B = x_test[8,:,:,0]*4

rgb = np.dstack((R,G,B))
plt.figure(figsize=[12,12])
plt.imshow(rgb)
plt.axis('off')
WARNING:matplotlib.image:Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
Out[ ]:
(-0.5, 319.5, 319.5, -0.5)
In [ ]:
i = 1
R = x_test[i,:,:,2]*4
G = x_test[i,:,:,1]*4
B = x_test[i,:,:,0]*4
rgb = np.dstack((R,G,B))
plt.figure(figsize=[20,20])
plt.subplot(121)
plt.imshow(rgb)
plt.title('RGB Image')
plt.axis('off')
plt.subplot(122)
plt.imshow(y_test[i,:,:,0])
plt.title('label mask')
plt.axis('off')
Out[ ]:
(-0.5, 319.5, 319.5, -0.5)

The next step is to import the Keras functions, set up data augmentation for the training set, define the architecture, and start training:

In [ ]:
from keras.models import Model
from keras.layers import Input, Conv2D, MaxPooling2D, UpSampling2D, concatenate, Concatenate, add, Conv2DTranspose, BatchNormalization, Dropout, Activation, Add, AveragePooling2D, Lambda, SeparableConv2D, GlobalAveragePooling2D, DepthwiseConv2D, ZeroPadding2D, LeakyReLU
#from tensorflow.keras.optimizers import Adam
#from tensorflow.keras.optimizers.legacy import Adam
from keras.optimizers import Adam
from keras.activations import relu
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from keras.losses import binary_crossentropy
from keras import backend as K
import tensorflow as tf
from keras.models import load_model
from tensorflow.keras.losses import Dice
In [ ]:
img_datagen = ImageDataGenerator(
    rotation_range=90,
    vertical_flip = True,
    horizontal_flip=True,
    zoom_range = 0.2)

mask_datagen = ImageDataGenerator(
    rotation_range=90,
    vertical_flip = True,
    horizontal_flip=True,
    zoom_range = 0.2)
In [ ]:
img_datagen.fit(x_train, augment=True,seed=1200)
mask_datagen.fit(y_train, augment=True,seed=1200)
/usr/local/lib/python3.10/dist-packages/keras/src/legacy/preprocessing/image.py:1495: UserWarning: Expected input to be images (as Numpy array) following the data format convention "channels_last" (channels on axis 3), i.e. expected either 1, 3 or 4 channels on axis 3. However, it was passed an array with shape (270, 320, 320, 10) (10 channels).
  warnings.warn(
In [ ]:
train_generator=img_datagen.flow(x_train,y_train,batch_size=6,seed=1200)
/usr/local/lib/python3.10/dist-packages/keras/src/legacy/preprocessing/image.py:619: UserWarning: NumpyArrayIterator is set to use the data format convention "channels_last" (channels on axis 3), i.e. expected either 1, 3, or 4 channels on axis 3. However, it was passed an array with shape (270, 320, 320, 10) (10 channels).
  warnings.warn(
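One caveat with this cell: `ImageDataGenerator.flow(x, y)` applies the random transforms to `x` only, while `y` is treated as plain labels and passed through unchanged, so augmented images and their masks can fall out of alignment (the unused `mask_datagen` suggests the intent was two flows with a shared seed, zipped together). The same idea in a self-contained numpy sketch (function name is mine): apply identical random flips/rotations to image and mask.

```python
import numpy as np

def augment_pair(img, mask, rng):
    """Apply the SAME random rotation/flips to an image and its mask,
    so the two stay pixel-aligned."""
    k = int(rng.integers(0, 4))        # number of 90-degree rotations
    img, mask = np.rot90(img, k), np.rot90(mask, k)
    if rng.random() < 0.5:             # horizontal flip
        img, mask = img[:, ::-1], mask[:, ::-1]
    if rng.random() < 0.5:             # vertical flip
        img, mask = img[::-1, :], mask[::-1, :]
    return img, mask

rng = np.random.default_rng(1200)
img = np.arange(16, dtype=float).reshape(4, 4, 1)
a, b = augment_pair(img, img.copy(), rng)
print(np.array_equal(a, b))  # True: identical inputs get identical transforms
```

With the Keras generators, the equivalent pattern is `zip(img_datagen.flow(x_train, seed=s), mask_datagen.flow(y_train, seed=s))` so both flows draw the same random transforms.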
In [ ]:
len(x_test)/4
Out[ ]:
14.25
In [ ]:
steps_per_epoch = len(x_train)//6
validation_steps = len(x_test)//4
In [ ]:
def conv_block(input_tensor, filters, strides, d_rates):
    x = Conv2D(filters[0], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[0])(input_tensor)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[1], kernel_size=3, strides=strides, kernel_initializer='he_uniform', padding='same', dilation_rate=d_rates[1])(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[2], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[2])(x)
    x = BatchNormalization()(x)

    shortcut = Conv2D(filters[2], kernel_size=1, kernel_initializer='he_uniform', strides=strides)(input_tensor)
    shortcut = BatchNormalization()(shortcut)

    x = add([x, shortcut])
    x = Activation('relu')(x)

    return x


def identity_block(input_tensor, filters, d_rates):
    x = Conv2D(filters[0], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[0])(input_tensor)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[1], kernel_size=3, kernel_initializer='he_uniform', padding='same', dilation_rate=d_rates[1])(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[2], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[2])(x)
    x = BatchNormalization()(x)

    x = add([x, input_tensor])
    x = Activation('relu')(x)

    return x

def one_side_pad(x):
    x = ZeroPadding2D((1, 1))(x)
    x = Lambda(lambda x: x[:, :-1, :-1, :])(x)
    return x
In [ ]:
droprate = 0.3
inputs = Input(shape=x_train.shape[1:])
conv_1 = Conv2D(32, (3, 3), strides=(1, 1), kernel_initializer='he_uniform', padding='same')(inputs)
conv_1 = BatchNormalization()(conv_1)
conv_1 = Activation("relu")(conv_1)
f1 = conv_1

conv_2 = Conv2D(64, (3, 3), strides=(2, 2), kernel_initializer='he_uniform', padding='same')(conv_1)
conv_2 = BatchNormalization()(conv_2)
conv_2 = Activation("relu")(conv_2)

conv_3 = Conv2D(64, (3, 3), strides=(1, 1), kernel_initializer='he_uniform', padding='same')(conv_2)
conv_3 = BatchNormalization()(conv_3)
conv_3 = Activation("relu")(conv_3)

f2 = conv_3


pool_1 = MaxPooling2D((2, 2), strides=(2, 2))(conv_3)

conv_block1 = conv_block(pool_1, filters=[64, 64, 128], strides=(1, 1), d_rates=[1, 1, 1])
identity_block1 = identity_block(conv_block1, filters=[64, 64, 128], d_rates=[1, 2, 1])
identity_block2 = identity_block(identity_block1, filters=[64, 64, 128], d_rates=[1, 3, 1])
f3 = identity_block2

conv_block2 = conv_block(identity_block2, filters=[128, 128, 256], strides=(2, 2), d_rates=[1, 1, 1])
identity_block3 = identity_block(conv_block2, filters=[128, 128, 256], d_rates=[1, 2, 1])
identity_block4 = identity_block(identity_block3, filters=[128, 128, 256], d_rates=[1, 3, 1])
identity_block5 = identity_block(identity_block4, filters=[128, 128, 256], d_rates=[1, 4, 1])
f4 = identity_block5


identity_block10 = conv_block(identity_block5, filters=[256, 256, 512], strides=(2, 2), d_rates=[1, 1, 1])
for i in range(25):
  identity_block10 = identity_block(identity_block10, filters=[256, 256, 512], d_rates=[1, 2, 1])

f5 = identity_block10

conv_block4 = conv_block(identity_block10, filters=[512, 512, 1024], strides=(2, 2), d_rates=[1, 1, 1])
identity_block11 = identity_block(conv_block4, filters=[512, 512, 1024], d_rates=[1, 4, 1])
identity_block12 = identity_block(identity_block11, filters=[512, 512, 1024], d_rates=[1, 4, 1])
f6 = identity_block12

o = f6

o = (BatchNormalization())(o)
o = Conv2D(1024, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(512, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)


o = Conv2DTranspose(512, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f5]))
o = (BatchNormalization())(o)
o = Conv2D(512, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(256, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)



o = Conv2DTranspose(256, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f4]))
o = (BatchNormalization())(o)
o = Conv2D(256, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(128, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)



o = Conv2DTranspose(128, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f3]))
o = (BatchNormalization())(o)
o = Conv2D(128, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(64, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)



o = Conv2DTranspose(64, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f2]))
o = (BatchNormalization())(o)
o = Conv2D(64, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(32, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)


o = Conv2DTranspose(32, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f1]))
o = (BatchNormalization())(o)
o = Conv2D(32, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(32, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)


o = Conv2D(1, (3, 3), padding='same', activation='sigmoid')(o)

model = Model(inputs=inputs, outputs=o)
model.compile(optimizer=Adam(learning_rate=1e-5), loss=Dice(), metrics=['accuracy'])
model.summary()
Model: "functional"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Layer (type)              ┃ Output Shape           ┃        Param # ┃ Connected to           ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━┩
│ input_layer (InputLayer)  │ (None, 320, 320, 10)   │              0 │ -                      │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d (Conv2D)           │ (None, 320, 320, 32)   │          2,912 │ input_layer[0][0]      │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization       │ (None, 320, 320, 32)   │            128 │ conv2d[0][0]           │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation (Activation)   │ (None, 320, 320, 32)   │              0 │ batch_normalization[0… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_1 (Conv2D)         │ (None, 160, 160, 64)   │         18,496 │ activation[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_1     │ (None, 160, 160, 64)   │            256 │ conv2d_1[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_1 (Activation) │ (None, 160, 160, 64)   │              0 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_2 (Conv2D)         │ (None, 160, 160, 64)   │         36,928 │ activation_1[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_2     │ (None, 160, 160, 64)   │            256 │ conv2d_2[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_2 (Activation) │ (None, 160, 160, 64)   │              0 │ batch_normalization_2… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ max_pooling2d             │ (None, 80, 80, 64)     │              0 │ activation_2[0][0]     │
│ (MaxPooling2D)            │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_3 (Conv2D)         │ (None, 80, 80, 64)     │          4,160 │ max_pooling2d[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_3     │ (None, 80, 80, 64)     │            256 │ conv2d_3[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_3 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_3… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_4 (Conv2D)         │ (None, 80, 80, 64)     │         36,928 │ activation_3[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_4     │ (None, 80, 80, 64)     │            256 │ conv2d_4[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_4 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_4… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_5 (Conv2D)         │ (None, 80, 80, 128)    │          8,320 │ activation_4[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_6 (Conv2D)         │ (None, 80, 80, 128)    │          8,320 │ max_pooling2d[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_5     │ (None, 80, 80, 128)    │            512 │ conv2d_5[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_6     │ (None, 80, 80, 128)    │            512 │ conv2d_6[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add (Add)                 │ (None, 80, 80, 128)    │              0 │ batch_normalization_5… │
│                           │                        │                │ batch_normalization_6… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_5 (Activation) │ (None, 80, 80, 128)    │              0 │ add[0][0]              │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_7 (Conv2D)         │ (None, 80, 80, 64)     │          8,256 │ activation_5[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_7     │ (None, 80, 80, 64)     │            256 │ conv2d_7[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_6 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_7… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_8 (Conv2D)         │ (None, 80, 80, 64)     │         36,928 │ activation_6[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_8     │ (None, 80, 80, 64)     │            256 │ conv2d_8[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_7 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_8… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_9 (Conv2D)         │ (None, 80, 80, 128)    │          8,320 │ activation_7[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_9     │ (None, 80, 80, 128)    │            512 │ conv2d_9[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_1 (Add)               │ (None, 80, 80, 128)    │              0 │ batch_normalization_9… │
│                           │                        │                │ activation_5[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_8 (Activation) │ (None, 80, 80, 128)    │              0 │ add_1[0][0]            │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_10 (Conv2D)        │ (None, 80, 80, 64)     │          8,256 │ activation_8[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_10    │ (None, 80, 80, 64)     │            256 │ conv2d_10[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_9 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_11 (Conv2D)        │ (None, 80, 80, 64)     │         36,928 │ activation_9[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_11    │ (None, 80, 80, 64)     │            256 │ conv2d_11[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_10             │ (None, 80, 80, 64)     │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_12 (Conv2D)        │ (None, 80, 80, 128)    │          8,320 │ activation_10[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_12    │ (None, 80, 80, 128)    │            512 │ conv2d_12[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_2 (Add)               │ (None, 80, 80, 128)    │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_8[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_11             │ (None, 80, 80, 128)    │              0 │ add_2[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_13 (Conv2D)        │ (None, 80, 80, 128)    │         16,512 │ activation_11[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_13    │ (None, 80, 80, 128)    │            512 │ conv2d_13[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_12             │ (None, 80, 80, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_14 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_12[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_14    │ (None, 40, 40, 128)    │            512 │ conv2d_14[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_13             │ (None, 40, 40, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_15 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_13[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_16 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_11[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_15    │ (None, 40, 40, 256)    │          1,024 │ conv2d_15[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_16    │ (None, 40, 40, 256)    │          1,024 │ conv2d_16[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_3 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_1… │
│                           │                        │                │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_14             │ (None, 40, 40, 256)    │              0 │ add_3[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_17 (Conv2D)        │ (None, 40, 40, 128)    │         32,896 │ activation_14[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_17    │ (None, 40, 40, 128)    │            512 │ conv2d_17[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_15             │ (None, 40, 40, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_18 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_15[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_18    │ (None, 40, 40, 128)    │            512 │ conv2d_18[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_16             │ (None, 40, 40, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_19 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_16[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_19    │ (None, 40, 40, 256)    │          1,024 │ conv2d_19[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_4 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_14[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_17             │ (None, 40, 40, 256)    │              0 │ add_4[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_20 (Conv2D)        │ (None, 40, 40, 128)    │         32,896 │ activation_17[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_20    │ (None, 40, 40, 128)    │            512 │ conv2d_20[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_18             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_21 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_18[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_21    │ (None, 40, 40, 128)    │            512 │ conv2d_21[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_19             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_22 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_19[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_22    │ (None, 40, 40, 256)    │          1,024 │ conv2d_22[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_5 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_2… │
│                           │                        │                │ activation_17[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_20             │ (None, 40, 40, 256)    │              0 │ add_5[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_23 (Conv2D)        │ (None, 40, 40, 128)    │         32,896 │ activation_20[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_23    │ (None, 40, 40, 128)    │            512 │ conv2d_23[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_21             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_24 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_21[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_24    │ (None, 40, 40, 128)    │            512 │ conv2d_24[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_22             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_25 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_22[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_25    │ (None, 40, 40, 256)    │          1,024 │ conv2d_25[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_6 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_2… │
│                           │                        │                │ activation_20[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_23             │ (None, 40, 40, 256)    │              0 │ add_6[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_26 (Conv2D)        │ (None, 40, 40, 256)    │         65,792 │ activation_23[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_26    │ (None, 40, 40, 256)    │          1,024 │ conv2d_26[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_24             │ (None, 40, 40, 256)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_27 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_24[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_27    │ (None, 20, 20, 256)    │          1,024 │ conv2d_27[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_25             │ (None, 20, 20, 256)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_28 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_25[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_29 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_23[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_28    │ (None, 20, 20, 512)    │          2,048 │ conv2d_28[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_29    │ (None, 20, 20, 512)    │          2,048 │ conv2d_29[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_7 (Add)               │ (None, 20, 20, 512)    │              0 │ batch_normalization_2… │
│                           │                        │                │ batch_normalization_2… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_26             │ (None, 20, 20, 512)    │              0 │ add_7[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_30 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_26[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_30    │ (None, 20, 20, 256)    │          1,024 │ conv2d_30[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_27             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_31 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_27[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_31    │ (None, 20, 20, 256)    │          1,024 │ conv2d_31[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_28             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_32 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_28[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_32    │ (None, 20, 20, 512)    │          2,048 │ conv2d_32[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_8 (Add)               │ (None, 20, 20, 512)    │              0 │ batch_normalization_3… │
│                           │                        │                │ activation_26[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_29             │ (None, 20, 20, 512)    │              0 │ add_8[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_33 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_29[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_33    │ (None, 20, 20, 256)    │          1,024 │ conv2d_33[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_30             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_34 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_30[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_34    │ (None, 20, 20, 256)    │          1,024 │ conv2d_34[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_31             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_35 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_31[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_35    │ (None, 20, 20, 512)    │          2,048 │ conv2d_35[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_9 (Add)               │ (None, 20, 20, 512)    │              0 │ batch_normalization_3… │
│                           │                        │                │ activation_29[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_32             │ (None, 20, 20, 512)    │              0 │ add_9[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_36 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_32[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_36    │ (None, 20, 20, 256)    │          1,024 │ conv2d_36[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_33             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_37 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_33[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_37    │ (None, 20, 20, 256)    │          1,024 │ conv2d_37[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_34             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_38 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_34[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_38    │ (None, 20, 20, 512)    │          2,048 │ conv2d_38[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_10 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_3… │
│                           │                        │                │ activation_32[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_35             │ (None, 20, 20, 512)    │              0 │ add_10[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_39 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_35[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_39    │ (None, 20, 20, 256)    │          1,024 │ conv2d_39[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_36             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_40 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_36[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_40    │ (None, 20, 20, 256)    │          1,024 │ conv2d_40[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_37             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_41 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_37[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_41    │ (None, 20, 20, 512)    │          2,048 │ conv2d_41[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_11 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_4… │
│                           │                        │                │ activation_35[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_38             │ (None, 20, 20, 512)    │              0 │ add_11[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_42 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_38[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_42    │ (None, 20, 20, 256)    │          1,024 │ conv2d_42[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_39             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_43 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_39[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_43    │ (None, 20, 20, 256)    │          1,024 │ conv2d_43[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_40             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_44 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_40[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_44    │ (None, 20, 20, 512)    │          2,048 │ conv2d_44[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_12 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_4… │
│                           │                        │                │ activation_38[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_41             │ (None, 20, 20, 512)    │              0 │ add_12[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_45 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_41[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_45    │ (None, 20, 20, 256)    │          1,024 │ conv2d_45[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_42             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_46 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_42[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_46    │ (None, 20, 20, 256)    │          1,024 │ conv2d_46[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_43             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_47 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_43[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_47    │ (None, 20, 20, 512)    │          2,048 │ conv2d_47[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_13 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_4… │
│                           │                        │                │ activation_41[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_44             │ (None, 20, 20, 512)    │              0 │ add_13[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_48 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_44[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_48    │ (None, 20, 20, 256)    │          1,024 │ conv2d_48[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_45             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_49 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_45[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_49    │ (None, 20, 20, 256)    │          1,024 │ conv2d_49[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_46             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_50 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_46[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_50    │ (None, 20, 20, 512)    │          2,048 │ conv2d_50[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_14 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_5… │
│                           │                        │                │ activation_44[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_47             │ (None, 20, 20, 512)    │              0 │ add_14[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_51 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_47[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_51    │ (None, 20, 20, 256)    │          1,024 │ conv2d_51[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_48             │ (None, 20, 20, 256)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_52 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_48[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_52    │ (None, 20, 20, 256)    │          1,024 │ conv2d_52[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_49             │ (None, 20, 20, 256)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_53 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_49[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_53    │ (None, 20, 20, 512)    │          2,048 │ conv2d_53[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_15 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_5… │
│                           │                        │                │ activation_47[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_50             │ (None, 20, 20, 512)    │              0 │ add_15[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_54 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_50[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_54    │ (None, 20, 20, 256)    │          1,024 │ conv2d_54[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_51             │ (None, 20, 20, 256)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_55 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_51[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_55    │ (None, 20, 20, 256)    │          1,024 │ conv2d_55[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_52             │ (None, 20, 20, 256)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_56 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_52[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_56    │ (None, 20, 20, 512)    │          2,048 │ conv2d_56[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_16 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_5… │
│                           │                        │                │ activation_50[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_53             │ (None, 20, 20, 512)    │              0 │ add_16[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_57 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_53[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_57    │ (None, 20, 20, 256)    │          1,024 │ conv2d_57[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_54             │ (None, 20, 20, 256)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_58 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_54[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_58    │ (None, 20, 20, 256)    │          1,024 │ conv2d_58[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_55             │ (None, 20, 20, 256)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_59 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_55[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_59    │ (None, 20, 20, 512)    │          2,048 │ conv2d_59[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_17 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_5… │
│                           │                        │                │ activation_53[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_56             │ (None, 20, 20, 512)    │              0 │ add_17[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_60 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_56[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_60    │ (None, 20, 20, 256)    │          1,024 │ conv2d_60[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_57             │ (None, 20, 20, 256)    │              0 │ batch_normalization_6… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_61 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_57[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_61    │ (None, 20, 20, 256)    │          1,024 │ conv2d_61[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_58             │ (None, 20, 20, 256)    │              0 │ batch_normalization_6… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_62 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_58[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_62    │ (None, 20, 20, 512)    │          2,048 │ conv2d_62[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_18 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_6… │
│                           │                        │                │ activation_56[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_59             │ (None, 20, 20, 512)    │              0 │ add_18[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_63 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_59[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_63    │ (None, 20, 20, 256)    │          1,024 │ conv2d_63[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_60             │ (None, 20, 20, 256)    │              0 │ batch_normalization_6… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_64 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_60[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_64    │ (None, 20, 20, 256)    │          1,024 │ conv2d_64[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_61             │ (None, 20, 20, 256)    │              0 │ batch_normalization_6… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_65 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_61[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_65    │ (None, 20, 20, 512)    │          2,048 │ conv2d_65[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_19 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_6… │
│                           │                        │                │ activation_59[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_62             │ (None, 20, 20, 512)    │              0 │ add_19[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_66 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_62[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_66    │ (None, 20, 20, 256)    │          1,024 │ conv2d_66[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_63             │ (None, 20, 20, 256)    │              0 │ batch_normalization_6… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_67 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_63[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_67    │ (None, 20, 20, 256)    │          1,024 │ conv2d_67[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_64             │ (None, 20, 20, 256)    │              0 │ batch_normalization_6… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_68 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_64[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_68    │ (None, 20, 20, 512)    │          2,048 │ conv2d_68[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_20 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_6… │
│                           │                        │                │ activation_62[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_65             │ (None, 20, 20, 512)    │              0 │ add_20[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_69 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_65[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_69    │ (None, 20, 20, 256)    │          1,024 │ conv2d_69[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_66             │ (None, 20, 20, 256)    │              0 │ batch_normalization_6… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_70 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_66[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_70    │ (None, 20, 20, 256)    │          1,024 │ conv2d_70[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_67             │ (None, 20, 20, 256)    │              0 │ batch_normalization_7… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_71 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_67[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_71    │ (None, 20, 20, 512)    │          2,048 │ conv2d_71[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_21 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_7… │
│                           │                        │                │ activation_65[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_68             │ (None, 20, 20, 512)    │              0 │ add_21[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_72 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_68[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_72    │ (None, 20, 20, 256)    │          1,024 │ conv2d_72[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_69             │ (None, 20, 20, 256)    │              0 │ batch_normalization_7… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_73 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_69[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_73    │ (None, 20, 20, 256)    │          1,024 │ conv2d_73[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_70             │ (None, 20, 20, 256)    │              0 │ batch_normalization_7… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_74 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_70[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_74    │ (None, 20, 20, 512)    │          2,048 │ conv2d_74[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_22 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_7… │
│                           │                        │                │ activation_68[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_71             │ (None, 20, 20, 512)    │              0 │ add_22[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_75 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_71[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_75    │ (None, 20, 20, 256)    │          1,024 │ conv2d_75[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_72             │ (None, 20, 20, 256)    │              0 │ batch_normalization_7… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_76 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_72[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_76    │ (None, 20, 20, 256)    │          1,024 │ conv2d_76[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_73             │ (None, 20, 20, 256)    │              0 │ batch_normalization_7… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_77 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_73[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_77    │ (None, 20, 20, 512)    │          2,048 │ conv2d_77[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_23 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_7… │
│                           │                        │                │ activation_71[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_74             │ (None, 20, 20, 512)    │              0 │ add_23[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_78 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_74[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_78    │ (None, 20, 20, 256)    │          1,024 │ conv2d_78[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_75             │ (None, 20, 20, 256)    │              0 │ batch_normalization_7… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_79 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_75[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_79    │ (None, 20, 20, 256)    │          1,024 │ conv2d_79[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_76             │ (None, 20, 20, 256)    │              0 │ batch_normalization_7… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_80 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_76[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_80    │ (None, 20, 20, 512)    │          2,048 │ conv2d_80[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_24 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_8… │
│                           │                        │                │ activation_74[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_77             │ (None, 20, 20, 512)    │              0 │ add_24[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_81 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_77[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_81    │ (None, 20, 20, 256)    │          1,024 │ conv2d_81[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_78             │ (None, 20, 20, 256)    │              0 │ batch_normalization_8… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_82 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_78[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_82    │ (None, 20, 20, 256)    │          1,024 │ conv2d_82[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_79             │ (None, 20, 20, 256)    │              0 │ batch_normalization_8… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_83 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_79[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_83    │ (None, 20, 20, 512)    │          2,048 │ conv2d_83[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_25 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_8… │
│                           │                        │                │ activation_77[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_80             │ (None, 20, 20, 512)    │              0 │ add_25[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_84 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_80[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_84    │ (None, 20, 20, 256)    │          1,024 │ conv2d_84[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_81             │ (None, 20, 20, 256)    │              0 │ batch_normalization_8… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_85 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_81[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_85    │ (None, 20, 20, 256)    │          1,024 │ conv2d_85[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_82             │ (None, 20, 20, 256)    │              0 │ batch_normalization_8… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_86 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_82[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_86    │ (None, 20, 20, 512)    │          2,048 │ conv2d_86[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_26 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_8… │
│                           │                        │                │ activation_80[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_83             │ (None, 20, 20, 512)    │              0 │ add_26[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_87 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_83[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_87    │ (None, 20, 20, 256)    │          1,024 │ conv2d_87[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_84             │ (None, 20, 20, 256)    │              0 │ batch_normalization_8… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_88 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_84[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_88    │ (None, 20, 20, 256)    │          1,024 │ conv2d_88[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_85             │ (None, 20, 20, 256)    │              0 │ batch_normalization_8… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_89 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_85[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_89    │ (None, 20, 20, 512)    │          2,048 │ conv2d_89[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_27 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_8… │
│                           │                        │                │ activation_83[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_86             │ (None, 20, 20, 512)    │              0 │ add_27[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_90 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_86[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_90    │ (None, 20, 20, 256)    │          1,024 │ conv2d_90[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_87             │ (None, 20, 20, 256)    │              0 │ batch_normalization_9… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_91 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_87[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_91    │ (None, 20, 20, 256)    │          1,024 │ conv2d_91[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_88             │ (None, 20, 20, 256)    │              0 │ batch_normalization_9… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_92 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_88[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_92    │ (None, 20, 20, 512)    │          2,048 │ conv2d_92[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_28 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_9… │
│                           │                        │                │ activation_86[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_89             │ (None, 20, 20, 512)    │              0 │ add_28[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_93 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_89[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_93    │ (None, 20, 20, 256)    │          1,024 │ conv2d_93[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_90             │ (None, 20, 20, 256)    │              0 │ batch_normalization_9… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_94 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_90[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_94    │ (None, 20, 20, 256)    │          1,024 │ conv2d_94[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_91             │ (None, 20, 20, 256)    │              0 │ batch_normalization_9… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_95 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_91[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_95    │ (None, 20, 20, 512)    │          2,048 │ conv2d_95[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_29 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_9… │
│                           │                        │                │ activation_89[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_92             │ (None, 20, 20, 512)    │              0 │ add_29[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_96 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_92[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_96    │ (None, 20, 20, 256)    │          1,024 │ conv2d_96[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_93             │ (None, 20, 20, 256)    │              0 │ batch_normalization_9… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_97 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_93[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_97    │ (None, 20, 20, 256)    │          1,024 │ conv2d_97[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_94             │ (None, 20, 20, 256)    │              0 │ batch_normalization_9… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_98 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_94[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_98    │ (None, 20, 20, 512)    │          2,048 │ conv2d_98[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_30 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_9… │
│                           │                        │                │ activation_92[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_95             │ (None, 20, 20, 512)    │              0 │ add_30[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_99 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_95[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_99    │ (None, 20, 20, 256)    │          1,024 │ conv2d_99[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_96             │ (None, 20, 20, 256)    │              0 │ batch_normalization_9… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_100 (Conv2D)       │ (None, 20, 20, 256)    │        590,080 │ activation_96[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_100   │ (None, 20, 20, 256)    │          1,024 │ conv2d_100[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_97             │ (None, 20, 20, 256)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_101 (Conv2D)       │ (None, 20, 20, 512)    │        131,584 │ activation_97[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_101   │ (None, 20, 20, 512)    │          2,048 │ conv2d_101[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_31 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_95[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_98             │ (None, 20, 20, 512)    │              0 │ add_31[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_102 (Conv2D)       │ (None, 20, 20, 256)    │        131,328 │ activation_98[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_102   │ (None, 20, 20, 256)    │          1,024 │ conv2d_102[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_99             │ (None, 20, 20, 256)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_103 (Conv2D)       │ (None, 20, 20, 256)    │        590,080 │ activation_99[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_103   │ (None, 20, 20, 256)    │          1,024 │ conv2d_103[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_100            │ (None, 20, 20, 256)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_104 (Conv2D)       │ (None, 20, 20, 512)    │        131,584 │ activation_100[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_104   │ (None, 20, 20, 512)    │          2,048 │ conv2d_104[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_32 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_98[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_101            │ (None, 20, 20, 512)    │              0 │ add_32[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_105 (Conv2D)       │ (None, 20, 20, 512)    │        262,656 │ activation_101[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_105   │ (None, 20, 20, 512)    │          2,048 │ conv2d_105[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_102            │ (None, 20, 20, 512)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_106 (Conv2D)       │ (None, 10, 10, 512)    │      2,359,808 │ activation_102[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_106   │ (None, 10, 10, 512)    │          2,048 │ conv2d_106[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_103            │ (None, 10, 10, 512)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_107 (Conv2D)       │ (None, 10, 10, 1024)   │        525,312 │ activation_103[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_108 (Conv2D)       │ (None, 10, 10, 1024)   │        525,312 │ activation_101[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_107   │ (None, 10, 10, 1024)   │          4,096 │ conv2d_107[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_108   │ (None, 10, 10, 1024)   │          4,096 │ conv2d_108[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_33 (Add)              │ (None, 10, 10, 1024)   │              0 │ batch_normalization_1… │
│                           │                        │                │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_104            │ (None, 10, 10, 1024)   │              0 │ add_33[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_109 (Conv2D)       │ (None, 10, 10, 512)    │        524,800 │ activation_104[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_109   │ (None, 10, 10, 512)    │          2,048 │ conv2d_109[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_105            │ (None, 10, 10, 512)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_110 (Conv2D)       │ (None, 10, 10, 512)    │      2,359,808 │ activation_105[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_110   │ (None, 10, 10, 512)    │          2,048 │ conv2d_110[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_106            │ (None, 10, 10, 512)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_111 (Conv2D)       │ (None, 10, 10, 1024)   │        525,312 │ activation_106[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_111   │ (None, 10, 10, 1024)   │          4,096 │ conv2d_111[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_34 (Add)              │ (None, 10, 10, 1024)   │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_104[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_107            │ (None, 10, 10, 1024)   │              0 │ add_34[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_112 (Conv2D)       │ (None, 10, 10, 512)    │        524,800 │ activation_107[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_112   │ (None, 10, 10, 512)    │          2,048 │ conv2d_112[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_108            │ (None, 10, 10, 512)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_113 (Conv2D)       │ (None, 10, 10, 512)    │      2,359,808 │ activation_108[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_113   │ (None, 10, 10, 512)    │          2,048 │ conv2d_113[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_109            │ (None, 10, 10, 512)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_114 (Conv2D)       │ (None, 10, 10, 1024)   │        525,312 │ activation_109[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_114   │ (None, 10, 10, 1024)   │          4,096 │ conv2d_114[0][0]       │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_35 (Add)              │ (None, 10, 10, 1024)   │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_107[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_110            │ (None, 10, 10, 1024)   │              0 │ add_35[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_115   │ (None, 10, 10, 1024)   │          4,096 │ activation_110[0][0]   │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_115 (Conv2D)       │ (None, 10, 10, 1024)   │      9,438,208 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_116 (Conv2D)       │ (None, 10, 10, 512)    │      4,719,104 │ conv2d_115[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout (Dropout)         │ (None, 10, 10, 512)    │              0 │ conv2d_116[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose          │ (None, 20, 20, 512)    │      1,049,088 │ dropout[0][0]          │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate (Concatenate) │ (None, 20, 20, 1024)   │              0 │ conv2d_transpose[0][0… │
│                           │                        │                │ activation_101[0][0]   │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_116   │ (None, 20, 20, 1024)   │          4,096 │ concatenate[0][0]      │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_117 (Conv2D)       │ (None, 20, 20, 512)    │      4,719,104 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_118 (Conv2D)       │ (None, 20, 20, 256)    │      1,179,904 │ conv2d_117[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_1 (Dropout)       │ (None, 20, 20, 256)    │              0 │ conv2d_118[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_1        │ (None, 40, 40, 256)    │        262,400 │ dropout_1[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_1             │ (None, 40, 40, 512)    │              0 │ conv2d_transpose_1[0]… │
│ (Concatenate)             │                        │                │ activation_23[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_117   │ (None, 40, 40, 512)    │          2,048 │ concatenate_1[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_119 (Conv2D)       │ (None, 40, 40, 256)    │      1,179,904 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_120 (Conv2D)       │ (None, 40, 40, 128)    │        295,040 │ conv2d_119[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_2 (Dropout)       │ (None, 40, 40, 128)    │              0 │ conv2d_120[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_2        │ (None, 80, 80, 128)    │         65,664 │ dropout_2[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_2             │ (None, 80, 80, 256)    │              0 │ conv2d_transpose_2[0]… │
│ (Concatenate)             │                        │                │ activation_11[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_118   │ (None, 80, 80, 256)    │          1,024 │ concatenate_2[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_121 (Conv2D)       │ (None, 80, 80, 128)    │        295,040 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_122 (Conv2D)       │ (None, 80, 80, 64)     │         73,792 │ conv2d_121[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_3 (Dropout)       │ (None, 80, 80, 64)     │              0 │ conv2d_122[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_3        │ (None, 160, 160, 64)   │         16,448 │ dropout_3[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_3             │ (None, 160, 160, 128)  │              0 │ conv2d_transpose_3[0]… │
│ (Concatenate)             │                        │                │ activation_2[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_119   │ (None, 160, 160, 128)  │            512 │ concatenate_3[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_123 (Conv2D)       │ (None, 160, 160, 64)   │         73,792 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_124 (Conv2D)       │ (None, 160, 160, 32)   │         18,464 │ conv2d_123[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_4 (Dropout)       │ (None, 160, 160, 32)   │              0 │ conv2d_124[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_4        │ (None, 320, 320, 32)   │          4,128 │ dropout_4[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_4             │ (None, 320, 320, 64)   │              0 │ conv2d_transpose_4[0]… │
│ (Concatenate)             │                        │                │ activation[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_120   │ (None, 320, 320, 64)   │            256 │ concatenate_4[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_125 (Conv2D)       │ (None, 320, 320, 32)   │         18,464 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_126 (Conv2D)       │ (None, 320, 320, 32)   │          9,248 │ conv2d_125[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_127 (Conv2D)       │ (None, 320, 320, 1)    │            289 │ conv2d_126[0][0]       │
└───────────────────────────┴────────────────────────┴────────────────┴────────────────────────┘
 Total params: 57,411,265 (219.01 MB)
 Trainable params: 57,329,921 (218.70 MB)
 Non-trainable params: 81,344 (317.75 KB)
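As a quick sanity check on the summary above, the parameter counts follow directly from the layer shapes. A minimal sketch (plain Python, no TensorFlow needed), assuming the decoder uses 2×2 transposed-convolution kernels and 3×3 convolution kernels, each with one bias per output channel:

```python
def conv_params(k, c_in, c_out):
    # k*k kernel weights per (input, output) channel pair, plus one bias per output channel
    return k * k * c_in * c_out + c_out

def batchnorm_params(c):
    # gamma and beta (trainable) plus moving mean and variance (non-trainable): 4 per channel
    return 4 * c

# Entries taken from the model summary above:
print(conv_params(2, 512, 512))    # conv2d_transpose  -> 1049088
print(conv_params(3, 1024, 512))   # conv2d_117        -> 4719104
print(conv_params(3, 512, 256))    # conv2d_118        -> 1179904
print(batchnorm_params(1024))      # batch_norm_116    -> 4096
print(conv_params(3, 32, 1))       # conv2d_127        -> 289
```

Note that the 4 per-channel batch-normalization parameters split into 2 trainable and 2 non-trainable, which is where the 81,344 non-trainable parameters in the totals come from.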
In [ ]:
history = model.fit(train_generator, steps_per_epoch=steps_per_epoch,
                    validation_steps=validation_steps,
                    epochs=300, validation_data=(x_test, y_test))
Epoch 1/300
33/33 [==============================] - 52s 756ms/step - loss: 0.6956 - accuracy: 0.4752 - val_loss: 0.6652 - val_accuracy: 0.2571
Epoch 2/300
33/33 [==============================] - 21s 583ms/step - loss: 0.6384 - accuracy: 0.4852 - val_loss: 0.6594 - val_accuracy: 0.2569
Epoch 3/300
33/33 [==============================] - 19s 587ms/step - loss: 0.6618 - accuracy: 0.4870 - val_loss: 0.6534 - val_accuracy: 0.2717
Epoch 4/300
33/33 [==============================] - 19s 590ms/step - loss: 0.6248 - accuracy: 0.5398 - val_loss: 0.6452 - val_accuracy: 0.3462
Epoch 5/300
33/33 [==============================] - 19s 596ms/step - loss: 0.6031 - accuracy: 0.5778 - val_loss: 0.6340 - val_accuracy: 0.4536
Epoch 6/300
33/33 [==============================] - 19s 592ms/step - loss: 0.5970 - accuracy: 0.6230 - val_loss: 0.6195 - val_accuracy: 0.6185
Epoch 7/300
33/33 [==============================] - 19s 590ms/step - loss: 0.5801 - accuracy: 0.6725 - val_loss: 0.5952 - val_accuracy: 0.7083
Epoch 8/300
33/33 [==============================] - 19s 586ms/step - loss: 0.5568 - accuracy: 0.7175 - val_loss: 0.5645 - val_accuracy: 0.7631
Epoch 9/300
33/33 [==============================] - 19s 589ms/step - loss: 0.5404 - accuracy: 0.7254 - val_loss: 0.5360 - val_accuracy: 0.7111
Epoch 10/300
33/33 [==============================] - 19s 589ms/step - loss: 0.5264 - accuracy: 0.7596 - val_loss: 0.5027 - val_accuracy: 0.7433
Epoch 11/300
33/33 [==============================] - 19s 584ms/step - loss: 0.5229 - accuracy: 0.7748 - val_loss: 0.4855 - val_accuracy: 0.7373
Epoch 12/300
33/33 [==============================] - 19s 591ms/step - loss: 0.5024 - accuracy: 0.7898 - val_loss: 0.4784 - val_accuracy: 0.7037
Epoch 13/300
33/33 [==============================] - 19s 597ms/step - loss: 0.4829 - accuracy: 0.7977 - val_loss: 0.4765 - val_accuracy: 0.6505
Epoch 14/300
33/33 [==============================] - 19s 582ms/step - loss: 0.4493 - accuracy: 0.8221 - val_loss: 0.4506 - val_accuracy: 0.6907
Epoch 15/300
33/33 [==============================] - 19s 590ms/step - loss: 0.4286 - accuracy: 0.8279 - val_loss: 0.3932 - val_accuracy: 0.7902
Epoch 16/300
33/33 [==============================] - 19s 582ms/step - loss: 0.4248 - accuracy: 0.8345 - val_loss: 0.3937 - val_accuracy: 0.7658
Epoch 17/300
33/33 [==============================] - 19s 592ms/step - loss: 0.4094 - accuracy: 0.8417 - val_loss: 0.3021 - val_accuracy: 0.8714
Epoch 18/300
33/33 [==============================] - 19s 585ms/step - loss: 0.4239 - accuracy: 0.8324 - val_loss: 0.3165 - val_accuracy: 0.8504
Epoch 19/300
33/33 [==============================] - 19s 591ms/step - loss: 0.4092 - accuracy: 0.8555 - val_loss: 0.2559 - val_accuracy: 0.9062
Epoch 20/300
33/33 [==============================] - 19s 588ms/step - loss: 0.4073 - accuracy: 0.8366 - val_loss: 0.2510 - val_accuracy: 0.9063
Epoch 21/300
33/33 [==============================] - 19s 580ms/step - loss: 0.4039 - accuracy: 0.8497 - val_loss: 0.2423 - val_accuracy: 0.9109
Epoch 22/300
33/33 [==============================] - 19s 591ms/step - loss: 0.3754 - accuracy: 0.8541 - val_loss: 0.2362 - val_accuracy: 0.9158
Epoch 23/300
33/33 [==============================] - 20s 622ms/step - loss: 0.3951 - accuracy: 0.8526 - val_loss: 0.2324 - val_accuracy: 0.9155
Epoch 24/300
33/33 [==============================] - 19s 597ms/step - loss: 0.3549 - accuracy: 0.8527 - val_loss: 0.2356 - val_accuracy: 0.9088
Epoch 25/300
33/33 [==============================] - 19s 585ms/step - loss: 0.3691 - accuracy: 0.8579 - val_loss: 0.2361 - val_accuracy: 0.9111
Epoch 26/300
33/33 [==============================] - 19s 593ms/step - loss: 0.3749 - accuracy: 0.8462 - val_loss: 0.2307 - val_accuracy: 0.9101
Epoch 27/300
33/33 [==============================] - 19s 590ms/step - loss: 0.3796 - accuracy: 0.8586 - val_loss: 0.2292 - val_accuracy: 0.9155
Epoch 28/300
33/33 [==============================] - 19s 586ms/step - loss: 0.3995 - accuracy: 0.8423 - val_loss: 0.2246 - val_accuracy: 0.9137
Epoch 29/300
33/33 [==============================] - 19s 595ms/step - loss: 0.3764 - accuracy: 0.8530 - val_loss: 0.2226 - val_accuracy: 0.9174
Epoch 30/300
33/33 [==============================] - 19s 590ms/step - loss: 0.3482 - accuracy: 0.8543 - val_loss: 0.2223 - val_accuracy: 0.9137
Epoch 31/300
33/33 [==============================] - 19s 583ms/step - loss: 0.3705 - accuracy: 0.8549 - val_loss: 0.2210 - val_accuracy: 0.9149
Epoch 32/300
33/33 [==============================] - 19s 588ms/step - loss: 0.3489 - accuracy: 0.8633 - val_loss: 0.2158 - val_accuracy: 0.9185
Epoch 33/300
33/33 [==============================] - 19s 581ms/step - loss: 0.3520 - accuracy: 0.8566 - val_loss: 0.2202 - val_accuracy: 0.9170
Epoch 34/300
33/33 [==============================] - 20s 607ms/step - loss: 0.3418 - accuracy: 0.8567 - val_loss: 0.2147 - val_accuracy: 0.9181
Epoch 35/300
33/33 [==============================] - 19s 594ms/step - loss: 0.3543 - accuracy: 0.8553 - val_loss: 0.2146 - val_accuracy: 0.9169
Epoch 36/300
33/33 [==============================] - 19s 592ms/step - loss: 0.3371 - accuracy: 0.8582 - val_loss: 0.2148 - val_accuracy: 0.9175
Epoch 37/300
33/33 [==============================] - 19s 590ms/step - loss: 0.3312 - accuracy: 0.8654 - val_loss: 0.2111 - val_accuracy: 0.9205
Epoch 38/300
33/33 [==============================] - 19s 585ms/step - loss: 0.3446 - accuracy: 0.8598 - val_loss: 0.2086 - val_accuracy: 0.9210
Epoch 39/300
33/33 [==============================] - 19s 596ms/step - loss: 0.3203 - accuracy: 0.8688 - val_loss: 0.2075 - val_accuracy: 0.9207
Epoch 40/300
33/33 [==============================] - 19s 579ms/step - loss: 0.3100 - accuracy: 0.8745 - val_loss: 0.2136 - val_accuracy: 0.9178
Epoch 41/300
33/33 [==============================] - 19s 599ms/step - loss: 0.3143 - accuracy: 0.8704 - val_loss: 0.2212 - val_accuracy: 0.9167
Epoch 42/300
33/33 [==============================] - 19s 589ms/step - loss: 0.3175 - accuracy: 0.8661 - val_loss: 0.2170 - val_accuracy: 0.9170
Epoch 43/300
33/33 [==============================] - 19s 586ms/step - loss: 0.2690 - accuracy: 0.8828 - val_loss: 0.2238 - val_accuracy: 0.9138
Epoch 44/300
33/33 [==============================] - 19s 595ms/step - loss: 0.3564 - accuracy: 0.8614 - val_loss: 0.2165 - val_accuracy: 0.9176
Epoch 45/300
33/33 [==============================] - 19s 584ms/step - loss: 0.3555 - accuracy: 0.8630 - val_loss: 0.2121 - val_accuracy: 0.9202
Epoch 46/300
33/33 [==============================] - 19s 596ms/step - loss: 0.3078 - accuracy: 0.8770 - val_loss: 0.2101 - val_accuracy: 0.9214
Epoch 47/300
33/33 [==============================] - 19s 589ms/step - loss: 0.3276 - accuracy: 0.8739 - val_loss: 0.2030 - val_accuracy: 0.9261
Epoch 48/300
33/33 [==============================] - 19s 585ms/step - loss: 0.3061 - accuracy: 0.8651 - val_loss: 0.2029 - val_accuracy: 0.9233
Epoch 49/300
33/33 [==============================] - 19s 586ms/step - loss: 0.3303 - accuracy: 0.8745 - val_loss: 0.2063 - val_accuracy: 0.9204
Epoch 50/300
33/33 [==============================] - 19s 579ms/step - loss: 0.3290 - accuracy: 0.8812 - val_loss: 0.2012 - val_accuracy: 0.9239
Epoch 51/300
33/33 [==============================] - 19s 588ms/step - loss: 0.3177 - accuracy: 0.8622 - val_loss: 0.1982 - val_accuracy: 0.9244
Epoch 52/300
33/33 [==============================] - 19s 589ms/step - loss: 0.2918 - accuracy: 0.8769 - val_loss: 0.2110 - val_accuracy: 0.9181
Epoch 53/300
33/33 [==============================] - 19s 586ms/step - loss: 0.3210 - accuracy: 0.8735 - val_loss: 0.2087 - val_accuracy: 0.9178
Epoch 54/300
33/33 [==============================] - 19s 587ms/step - loss: 0.3085 - accuracy: 0.8725 - val_loss: 0.2114 - val_accuracy: 0.9141
Epoch 55/300
33/33 [==============================] - 19s 585ms/step - loss: 0.2998 - accuracy: 0.8837 - val_loss: 0.2053 - val_accuracy: 0.9193
Epoch 56/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2865 - accuracy: 0.8859 - val_loss: 0.1951 - val_accuracy: 0.9248
Epoch 57/300
33/33 [==============================] - 19s 593ms/step - loss: 0.3005 - accuracy: 0.8724 - val_loss: 0.2031 - val_accuracy: 0.9219
Epoch 58/300
33/33 [==============================] - 19s 581ms/step - loss: 0.2797 - accuracy: 0.8852 - val_loss: 0.2066 - val_accuracy: 0.9234
Epoch 59/300
33/33 [==============================] - 19s 588ms/step - loss: 0.3248 - accuracy: 0.8710 - val_loss: 0.1973 - val_accuracy: 0.9265
Epoch 60/300
33/33 [==============================] - 19s 591ms/step - loss: 0.2844 - accuracy: 0.8789 - val_loss: 0.1956 - val_accuracy: 0.9273
Epoch 61/300
33/33 [==============================] - 19s 593ms/step - loss: 0.2839 - accuracy: 0.8899 - val_loss: 0.1976 - val_accuracy: 0.9253
Epoch 62/300
33/33 [==============================] - 19s 585ms/step - loss: 0.2805 - accuracy: 0.8877 - val_loss: 0.1931 - val_accuracy: 0.9295
Epoch 63/300
33/33 [==============================] - 19s 597ms/step - loss: 0.2686 - accuracy: 0.8865 - val_loss: 0.1947 - val_accuracy: 0.9306
Epoch 64/300
33/33 [==============================] - 19s 592ms/step - loss: 0.2671 - accuracy: 0.8919 - val_loss: 0.2021 - val_accuracy: 0.9243
Epoch 65/300
33/33 [==============================] - 19s 579ms/step - loss: 0.2661 - accuracy: 0.8875 - val_loss: 0.1954 - val_accuracy: 0.9291
Epoch 66/300
33/33 [==============================] - 19s 589ms/step - loss: 0.2712 - accuracy: 0.8911 - val_loss: 0.1915 - val_accuracy: 0.9340
Epoch 67/300
33/33 [==============================] - 19s 590ms/step - loss: 0.2948 - accuracy: 0.8844 - val_loss: 0.1865 - val_accuracy: 0.9325
Epoch 68/300
33/33 [==============================] - 19s 596ms/step - loss: 0.3171 - accuracy: 0.8727 - val_loss: 0.1919 - val_accuracy: 0.9281
Epoch 69/300
33/33 [==============================] - 19s 589ms/step - loss: 0.2740 - accuracy: 0.8843 - val_loss: 0.1979 - val_accuracy: 0.9236
Epoch 70/300
33/33 [==============================] - 19s 590ms/step - loss: 0.2711 - accuracy: 0.8888 - val_loss: 0.1953 - val_accuracy: 0.9241
Epoch 71/300
33/33 [==============================] - 20s 610ms/step - loss: 0.2481 - accuracy: 0.8952 - val_loss: 0.1990 - val_accuracy: 0.9234
Epoch 72/300
33/33 [==============================] - 20s 609ms/step - loss: 0.3054 - accuracy: 0.8805 - val_loss: 0.1963 - val_accuracy: 0.9255
Epoch 73/300
33/33 [==============================] - 20s 606ms/step - loss: 0.2679 - accuracy: 0.8868 - val_loss: 0.1947 - val_accuracy: 0.9279
Epoch 74/300
33/33 [==============================] - 20s 610ms/step - loss: 0.2707 - accuracy: 0.8909 - val_loss: 0.2160 - val_accuracy: 0.9076
Epoch 75/300
33/33 [==============================] - 19s 596ms/step - loss: 0.2924 - accuracy: 0.8886 - val_loss: 0.1875 - val_accuracy: 0.9313
Epoch 76/300
33/33 [==============================] - 20s 612ms/step - loss: 0.2713 - accuracy: 0.8898 - val_loss: 0.1988 - val_accuracy: 0.9219
Epoch 77/300
33/33 [==============================] - 19s 586ms/step - loss: 0.3028 - accuracy: 0.8887 - val_loss: 0.1889 - val_accuracy: 0.9326
Epoch 78/300
33/33 [==============================] - 19s 588ms/step - loss: 0.2611 - accuracy: 0.8821 - val_loss: 0.2016 - val_accuracy: 0.9194
Epoch 79/300
33/33 [==============================] - 19s 586ms/step - loss: 0.2901 - accuracy: 0.8933 - val_loss: 0.1824 - val_accuracy: 0.9373
Epoch 80/300
33/33 [==============================] - 20s 602ms/step - loss: 0.2303 - accuracy: 0.9030 - val_loss: 0.2076 - val_accuracy: 0.9144
Epoch 81/300
33/33 [==============================] - 19s 590ms/step - loss: 0.3098 - accuracy: 0.8723 - val_loss: 0.1914 - val_accuracy: 0.9276
Epoch 82/300
33/33 [==============================] - 19s 584ms/step - loss: 0.2476 - accuracy: 0.8933 - val_loss: 0.1882 - val_accuracy: 0.9327
Epoch 83/300
33/33 [==============================] - 19s 588ms/step - loss: 0.2902 - accuracy: 0.8937 - val_loss: 0.1903 - val_accuracy: 0.9328
Epoch 84/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2552 - accuracy: 0.8834 - val_loss: 0.1925 - val_accuracy: 0.9275
Epoch 85/300
33/33 [==============================] - 19s 595ms/step - loss: 0.2411 - accuracy: 0.9020 - val_loss: 0.2169 - val_accuracy: 0.9098
Epoch 86/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2781 - accuracy: 0.8942 - val_loss: 0.2059 - val_accuracy: 0.9167
Epoch 87/300
33/33 [==============================] - 19s 581ms/step - loss: 0.2465 - accuracy: 0.8934 - val_loss: 0.2217 - val_accuracy: 0.9062
Epoch 88/300
33/33 [==============================] - 19s 592ms/step - loss: 0.2653 - accuracy: 0.9014 - val_loss: 0.1903 - val_accuracy: 0.9334
Epoch 89/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2750 - accuracy: 0.8920 - val_loss: 0.2172 - val_accuracy: 0.9093
Epoch 90/300
33/33 [==============================] - 19s 582ms/step - loss: 0.2676 - accuracy: 0.8983 - val_loss: 0.2047 - val_accuracy: 0.9222
Epoch 91/300
33/33 [==============================] - 19s 600ms/step - loss: 0.2574 - accuracy: 0.8845 - val_loss: 0.1984 - val_accuracy: 0.9274
Epoch 92/300
33/33 [==============================] - 20s 602ms/step - loss: 0.2315 - accuracy: 0.9112 - val_loss: 0.2004 - val_accuracy: 0.9264
Epoch 93/300
33/33 [==============================] - 19s 595ms/step - loss: 0.2508 - accuracy: 0.8940 - val_loss: 0.2032 - val_accuracy: 0.9211
Epoch 94/300
33/33 [==============================] - 19s 584ms/step - loss: 0.2448 - accuracy: 0.8974 - val_loss: 0.1977 - val_accuracy: 0.9231
Epoch 95/300
33/33 [==============================] - 19s 588ms/step - loss: 0.2396 - accuracy: 0.8995 - val_loss: 0.1979 - val_accuracy: 0.9227
Epoch 96/300
33/33 [==============================] - 19s 590ms/step - loss: 0.2364 - accuracy: 0.9065 - val_loss: 0.2119 - val_accuracy: 0.9162
Epoch 97/300
33/33 [==============================] - 19s 596ms/step - loss: 0.2528 - accuracy: 0.9005 - val_loss: 0.1939 - val_accuracy: 0.9269
Epoch 98/300
33/33 [==============================] - 20s 601ms/step - loss: 0.2669 - accuracy: 0.8985 - val_loss: 0.1902 - val_accuracy: 0.9275
Epoch 99/300
33/33 [==============================] - 19s 589ms/step - loss: 0.3051 - accuracy: 0.8893 - val_loss: 0.1852 - val_accuracy: 0.9309
Epoch 100/300
33/33 [==============================] - 19s 586ms/step - loss: 0.2223 - accuracy: 0.9055 - val_loss: 0.1828 - val_accuracy: 0.9315
Epoch 101/300
33/33 [==============================] - 19s 592ms/step - loss: 0.2202 - accuracy: 0.9040 - val_loss: 0.1780 - val_accuracy: 0.9361
Epoch 102/300
33/33 [==============================] - 19s 596ms/step - loss: 0.2463 - accuracy: 0.9013 - val_loss: 0.1941 - val_accuracy: 0.9243
Epoch 103/300
33/33 [==============================] - 19s 595ms/step - loss: 0.2785 - accuracy: 0.8905 - val_loss: 0.1911 - val_accuracy: 0.9281
Epoch 104/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2295 - accuracy: 0.8990 - val_loss: 0.1945 - val_accuracy: 0.9265
Epoch 105/300
33/33 [==============================] - 19s 590ms/step - loss: 0.2411 - accuracy: 0.9056 - val_loss: 0.1997 - val_accuracy: 0.9240
Epoch 106/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2180 - accuracy: 0.9033 - val_loss: 0.1979 - val_accuracy: 0.9231
Epoch 107/300
33/33 [==============================] - 19s 585ms/step - loss: 0.2544 - accuracy: 0.8928 - val_loss: 0.1906 - val_accuracy: 0.9291
Epoch 108/300
33/33 [==============================] - 20s 602ms/step - loss: 0.2438 - accuracy: 0.9079 - val_loss: 0.1837 - val_accuracy: 0.9316
Epoch 109/300
33/33 [==============================] - 19s 582ms/step - loss: 0.2010 - accuracy: 0.9135 - val_loss: 0.1934 - val_accuracy: 0.9256
Epoch 110/300
33/33 [==============================] - 19s 596ms/step - loss: 0.2633 - accuracy: 0.8929 - val_loss: 0.1928 - val_accuracy: 0.9248
Epoch 111/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2309 - accuracy: 0.9023 - val_loss: 0.1983 - val_accuracy: 0.9218
Epoch 112/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2486 - accuracy: 0.9028 - val_loss: 0.1863 - val_accuracy: 0.9329
Epoch 113/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2244 - accuracy: 0.9059 - val_loss: 0.1960 - val_accuracy: 0.9294
Epoch 114/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2494 - accuracy: 0.9055 - val_loss: 0.1840 - val_accuracy: 0.9312
Epoch 115/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2698 - accuracy: 0.8991 - val_loss: 0.1853 - val_accuracy: 0.9344
Epoch 116/300
33/33 [==============================] - 19s 585ms/step - loss: 0.2292 - accuracy: 0.8970 - val_loss: 0.1838 - val_accuracy: 0.9335
Epoch 117/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2306 - accuracy: 0.8957 - val_loss: 0.1948 - val_accuracy: 0.9226
Epoch 118/300
33/33 [==============================] - 19s 596ms/step - loss: 0.3187 - accuracy: 0.8989 - val_loss: 0.1794 - val_accuracy: 0.9355
Epoch 119/300
33/33 [==============================] - 19s 583ms/step - loss: 0.2802 - accuracy: 0.8854 - val_loss: 0.1981 - val_accuracy: 0.9213
Epoch 120/300
33/33 [==============================] - 19s 598ms/step - loss: 0.2357 - accuracy: 0.9075 - val_loss: 0.1832 - val_accuracy: 0.9327
Epoch 121/300
33/33 [==============================] - 19s 586ms/step - loss: 0.2103 - accuracy: 0.9060 - val_loss: 0.1943 - val_accuracy: 0.9251
Epoch 122/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2596 - accuracy: 0.9027 - val_loss: 0.1771 - val_accuracy: 0.9369
Epoch 123/300
33/33 [==============================] - 19s 591ms/step - loss: 0.2334 - accuracy: 0.9028 - val_loss: 0.1809 - val_accuracy: 0.9340
Epoch 124/300
33/33 [==============================] - 19s 583ms/step - loss: 0.2030 - accuracy: 0.9144 - val_loss: 0.2023 - val_accuracy: 0.9189
Epoch 125/300
33/33 [==============================] - 19s 591ms/step - loss: 0.2357 - accuracy: 0.9062 - val_loss: 0.1889 - val_accuracy: 0.9282
Epoch 126/300
33/33 [==============================] - 19s 588ms/step - loss: 0.2598 - accuracy: 0.9014 - val_loss: 0.1846 - val_accuracy: 0.9301
Epoch 127/300
33/33 [==============================] - 19s 578ms/step - loss: 0.2284 - accuracy: 0.9049 - val_loss: 0.1825 - val_accuracy: 0.9325
Epoch 128/300
33/33 [==============================] - 19s 580ms/step - loss: 0.2336 - accuracy: 0.9042 - val_loss: 0.1817 - val_accuracy: 0.9361
Epoch 129/300
33/33 [==============================] - 19s 578ms/step - loss: 0.2360 - accuracy: 0.9005 - val_loss: 0.1843 - val_accuracy: 0.9343
Epoch 130/300
33/33 [==============================] - 19s 581ms/step - loss: 0.2477 - accuracy: 0.9003 - val_loss: 0.1930 - val_accuracy: 0.9266
Epoch 131/300
33/33 [==============================] - 19s 591ms/step - loss: 0.2321 - accuracy: 0.9050 - val_loss: 0.1847 - val_accuracy: 0.9324
Epoch 132/300
33/33 [==============================] - 20s 605ms/step - loss: 0.2244 - accuracy: 0.9099 - val_loss: 0.1845 - val_accuracy: 0.9311
Epoch 133/300
33/33 [==============================] - 20s 613ms/step - loss: 0.1863 - accuracy: 0.9182 - val_loss: 0.1932 - val_accuracy: 0.9239
Epoch 134/300
33/33 [==============================] - 19s 592ms/step - loss: 0.2368 - accuracy: 0.9032 - val_loss: 0.1817 - val_accuracy: 0.9329
Epoch 135/300
33/33 [==============================] - 21s 658ms/step - loss: 0.2529 - accuracy: 0.9036 - val_loss: 0.2059 - val_accuracy: 0.9142
Epoch 136/300
33/33 [==============================] - 21s 654ms/step - loss: 0.2510 - accuracy: 0.9049 - val_loss: 0.1845 - val_accuracy: 0.9302
Epoch 137/300
33/33 [==============================] - 20s 607ms/step - loss: 0.2211 - accuracy: 0.9034 - val_loss: 0.1917 - val_accuracy: 0.9240
Epoch 138/300
33/33 [==============================] - 20s 603ms/step - loss: 0.2019 - accuracy: 0.9178 - val_loss: 0.1782 - val_accuracy: 0.9340
Epoch 139/300
33/33 [==============================] - 19s 592ms/step - loss: 0.2222 - accuracy: 0.9075 - val_loss: 0.1886 - val_accuracy: 0.9248
Epoch 140/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2329 - accuracy: 0.9084 - val_loss: 0.2077 - val_accuracy: 0.9107
Epoch 141/300
33/33 [==============================] - 20s 608ms/step - loss: 0.2327 - accuracy: 0.9034 - val_loss: 0.1785 - val_accuracy: 0.9329
Epoch 142/300
33/33 [==============================] - 20s 604ms/step - loss: 0.2203 - accuracy: 0.9113 - val_loss: 0.1769 - val_accuracy: 0.9345
Epoch 143/300
33/33 [==============================] - 19s 598ms/step - loss: 0.2333 - accuracy: 0.9071 - val_loss: 0.1808 - val_accuracy: 0.9326
Epoch 144/300
33/33 [==============================] - 19s 585ms/step - loss: 0.2346 - accuracy: 0.9026 - val_loss: 0.1864 - val_accuracy: 0.9316
Epoch 145/300
33/33 [==============================] - 20s 601ms/step - loss: 0.2197 - accuracy: 0.9094 - val_loss: 0.1923 - val_accuracy: 0.9256
Epoch 146/300
33/33 [==============================] - 19s 584ms/step - loss: 0.2214 - accuracy: 0.9109 - val_loss: 0.1937 - val_accuracy: 0.9260
Epoch 147/300
33/33 [==============================] - 19s 595ms/step - loss: 0.2048 - accuracy: 0.9130 - val_loss: 0.1825 - val_accuracy: 0.9345
Epoch 148/300
33/33 [==============================] - 20s 605ms/step - loss: 0.2147 - accuracy: 0.9202 - val_loss: 0.1789 - val_accuracy: 0.9371
Epoch 149/300
33/33 [==============================] - 19s 589ms/step - loss: 0.1985 - accuracy: 0.9134 - val_loss: 0.1905 - val_accuracy: 0.9273
Epoch 150/300
33/33 [==============================] - 19s 597ms/step - loss: 0.2311 - accuracy: 0.9081 - val_loss: 0.1763 - val_accuracy: 0.9384
Epoch 151/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2270 - accuracy: 0.9106 - val_loss: 0.1900 - val_accuracy: 0.9285
Epoch 152/300
33/33 [==============================] - 19s 595ms/step - loss: 0.1937 - accuracy: 0.9078 - val_loss: 0.2189 - val_accuracy: 0.9058
Epoch 153/300
33/33 [==============================] - 20s 602ms/step - loss: 0.2413 - accuracy: 0.9107 - val_loss: 0.1786 - val_accuracy: 0.9387
Epoch 154/300
33/33 [==============================] - 19s 593ms/step - loss: 0.2438 - accuracy: 0.9111 - val_loss: 0.1705 - val_accuracy: 0.9406
Epoch 155/300
33/33 [==============================] - 19s 598ms/step - loss: 0.1950 - accuracy: 0.9089 - val_loss: 0.1730 - val_accuracy: 0.9364
Epoch 156/300
33/33 [==============================] - 19s 597ms/step - loss: 0.2260 - accuracy: 0.9127 - val_loss: 0.1817 - val_accuracy: 0.9301
Epoch 157/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2072 - accuracy: 0.9080 - val_loss: 0.1810 - val_accuracy: 0.9310
Epoch 158/300
33/33 [==============================] - 19s 597ms/step - loss: 0.2392 - accuracy: 0.9065 - val_loss: 0.1919 - val_accuracy: 0.9232
Epoch 159/300
33/33 [==============================] - 19s 593ms/step - loss: 0.2149 - accuracy: 0.9082 - val_loss: 0.1848 - val_accuracy: 0.9277
Epoch 160/300
33/33 [==============================] - 20s 606ms/step - loss: 0.2382 - accuracy: 0.9064 - val_loss: 0.1774 - val_accuracy: 0.9347
Epoch 161/300
33/33 [==============================] - 19s 589ms/step - loss: 0.2224 - accuracy: 0.9045 - val_loss: 0.1754 - val_accuracy: 0.9372
Epoch 162/300
33/33 [==============================] - 19s 598ms/step - loss: 0.2201 - accuracy: 0.9130 - val_loss: 0.1855 - val_accuracy: 0.9300
Epoch 163/300
33/33 [==============================] - 19s 587ms/step - loss: 0.2309 - accuracy: 0.9121 - val_loss: 0.1761 - val_accuracy: 0.9342
Epoch 164/300
33/33 [==============================] - 20s 602ms/step - loss: 0.2144 - accuracy: 0.9144 - val_loss: 0.1939 - val_accuracy: 0.9240
Epoch 165/300
33/33 [==============================] - 20s 601ms/step - loss: 0.2141 - accuracy: 0.9131 - val_loss: 0.1739 - val_accuracy: 0.9391
Epoch 166/300
33/33 [==============================] - 19s 599ms/step - loss: 0.1838 - accuracy: 0.9182 - val_loss: 0.1998 - val_accuracy: 0.9177
Epoch 167/300
33/33 [==============================] - 20s 611ms/step - loss: 0.2359 - accuracy: 0.9080 - val_loss: 0.1745 - val_accuracy: 0.9366
Epoch 168/300
33/33 [==============================] - 20s 612ms/step - loss: 0.2184 - accuracy: 0.9138 - val_loss: 0.1728 - val_accuracy: 0.9394
Epoch 169/300
33/33 [==============================] - 19s 598ms/step - loss: 0.2315 - accuracy: 0.9062 - val_loss: 0.1803 - val_accuracy: 0.9330
Epoch 170/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2364 - accuracy: 0.9126 - val_loss: 0.1741 - val_accuracy: 0.9379
Epoch 171/300
33/33 [==============================] - 19s 593ms/step - loss: 0.2378 - accuracy: 0.9104 - val_loss: 0.2084 - val_accuracy: 0.9111
Epoch 172/300
33/33 [==============================] - 20s 614ms/step - loss: 0.2243 - accuracy: 0.9102 - val_loss: 0.1678 - val_accuracy: 0.9403
Epoch 173/300
33/33 [==============================] - 20s 606ms/step - loss: 0.2038 - accuracy: 0.9116 - val_loss: 0.1772 - val_accuracy: 0.9340
Epoch 174/300
33/33 [==============================] - 20s 601ms/step - loss: 0.1995 - accuracy: 0.9182 - val_loss: 0.1751 - val_accuracy: 0.9374
Epoch 175/300
33/33 [==============================] - 19s 600ms/step - loss: 0.1876 - accuracy: 0.9204 - val_loss: 0.1672 - val_accuracy: 0.9415
Epoch 176/300
33/33 [==============================] - 19s 592ms/step - loss: 0.2121 - accuracy: 0.9098 - val_loss: 0.1742 - val_accuracy: 0.9362
Epoch 177/300
33/33 [==============================] - 19s 597ms/step - loss: 0.2216 - accuracy: 0.9111 - val_loss: 0.1764 - val_accuracy: 0.9351
Epoch 178/300
33/33 [==============================] - 19s 591ms/step - loss: 0.2258 - accuracy: 0.9122 - val_loss: 0.1830 - val_accuracy: 0.9315
Epoch 179/300
33/33 [==============================] - 19s 593ms/step - loss: 0.2181 - accuracy: 0.9099 - val_loss: 0.1908 - val_accuracy: 0.9254
Epoch 180/300
33/33 [==============================] - 19s 598ms/step - loss: 0.2014 - accuracy: 0.9152 - val_loss: 0.1781 - val_accuracy: 0.9333
Epoch 181/300
33/33 [==============================] - 19s 589ms/step - loss: 0.2065 - accuracy: 0.9128 - val_loss: 0.1805 - val_accuracy: 0.9318
Epoch 182/300
33/33 [==============================] - 19s 600ms/step - loss: 0.2194 - accuracy: 0.9178 - val_loss: 0.1929 - val_accuracy: 0.9224
Epoch 183/300
33/33 [==============================] - 19s 594ms/step - loss: 0.1941 - accuracy: 0.9209 - val_loss: 0.1813 - val_accuracy: 0.9295
Epoch 184/300
33/33 [==============================] - 19s 596ms/step - loss: 0.1908 - accuracy: 0.9165 - val_loss: 0.1836 - val_accuracy: 0.9298
Epoch 185/300
33/33 [==============================] - 20s 605ms/step - loss: 0.2638 - accuracy: 0.9092 - val_loss: 0.1712 - val_accuracy: 0.9375
Epoch 186/300
33/33 [==============================] - 19s 595ms/step - loss: 0.2226 - accuracy: 0.9131 - val_loss: 0.2355 - val_accuracy: 0.8904
Epoch 187/300
33/33 [==============================] - 20s 612ms/step - loss: 0.2075 - accuracy: 0.9137 - val_loss: 0.1826 - val_accuracy: 0.9293
Epoch 188/300
33/33 [==============================] - 19s 594ms/step - loss: 0.1813 - accuracy: 0.9209 - val_loss: 0.1950 - val_accuracy: 0.9196
Epoch 189/300
33/33 [==============================] - 19s 600ms/step - loss: 0.2023 - accuracy: 0.9152 - val_loss: 0.1856 - val_accuracy: 0.9273
Epoch 190/300
33/33 [==============================] - 19s 600ms/step - loss: 0.1714 - accuracy: 0.9294 - val_loss: 0.2025 - val_accuracy: 0.9149
Epoch 191/300
33/33 [==============================] - 19s 592ms/step - loss: 0.1992 - accuracy: 0.9091 - val_loss: 0.1799 - val_accuracy: 0.9326
Epoch 192/300
33/33 [==============================] - 20s 607ms/step - loss: 0.2104 - accuracy: 0.9256 - val_loss: 0.1722 - val_accuracy: 0.9377
Epoch 193/300
33/33 [==============================] - 19s 596ms/step - loss: 0.2168 - accuracy: 0.9135 - val_loss: 0.1791 - val_accuracy: 0.9316
Epoch 194/300
33/33 [==============================] - 20s 601ms/step - loss: 0.2107 - accuracy: 0.9221 - val_loss: 0.1794 - val_accuracy: 0.9314
Epoch 195/300
33/33 [==============================] - 19s 597ms/step - loss: 0.2287 - accuracy: 0.9133 - val_loss: 0.1703 - val_accuracy: 0.9384
Epoch 196/300
33/33 [==============================] - 19s 597ms/step - loss: 0.1997 - accuracy: 0.9164 - val_loss: 0.1949 - val_accuracy: 0.9219
Epoch 197/300
33/33 [==============================] - 21s 632ms/step - loss: 0.2172 - accuracy: 0.9203 - val_loss: 0.1747 - val_accuracy: 0.9352
Epoch 198/300
33/33 [==============================] - 20s 615ms/step - loss: 0.2013 - accuracy: 0.9163 - val_loss: 0.1881 - val_accuracy: 0.9262
Epoch 199/300
33/33 [==============================] - 20s 621ms/step - loss: 0.1675 - accuracy: 0.9210 - val_loss: 0.1750 - val_accuracy: 0.9368
Epoch 200/300
33/33 [==============================] - 20s 614ms/step - loss: 0.1924 - accuracy: 0.9215 - val_loss: 0.1855 - val_accuracy: 0.9297
Epoch 201/300
33/33 [==============================] - 20s 602ms/step - loss: 0.1799 - accuracy: 0.9210 - val_loss: 0.1730 - val_accuracy: 0.9378
Epoch 202/300
33/33 [==============================] - 20s 612ms/step - loss: 0.2082 - accuracy: 0.9190 - val_loss: 0.1878 - val_accuracy: 0.9260
Epoch 203/300
33/33 [==============================] - 19s 597ms/step - loss: 0.2069 - accuracy: 0.9147 - val_loss: 0.1811 - val_accuracy: 0.9311
Epoch 204/300
33/33 [==============================] - 20s 609ms/step - loss: 0.1874 - accuracy: 0.9214 - val_loss: 0.1884 - val_accuracy: 0.9268
Epoch 205/300
33/33 [==============================] - 20s 608ms/step - loss: 0.1733 - accuracy: 0.9243 - val_loss: 0.1711 - val_accuracy: 0.9384
Epoch 206/300
33/33 [==============================] - 20s 603ms/step - loss: 0.2447 - accuracy: 0.9163 - val_loss: 0.1757 - val_accuracy: 0.9356
Epoch 207/300
33/33 [==============================] - 19s 599ms/step - loss: 0.1893 - accuracy: 0.9173 - val_loss: 0.1824 - val_accuracy: 0.9292
Epoch 208/300
33/33 [==============================] - 20s 601ms/step - loss: 0.1814 - accuracy: 0.9268 - val_loss: 0.1712 - val_accuracy: 0.9366
Epoch 209/300
33/33 [==============================] - 19s 598ms/step - loss: 0.2198 - accuracy: 0.9103 - val_loss: 0.1771 - val_accuracy: 0.9325
Epoch 210/300
33/33 [==============================] - 20s 604ms/step - loss: 0.2279 - accuracy: 0.9145 - val_loss: 0.1783 - val_accuracy: 0.9330
Epoch 211/300
33/33 [==============================] - 19s 593ms/step - loss: 0.2222 - accuracy: 0.9124 - val_loss: 0.1846 - val_accuracy: 0.9288
Epoch 212/300
33/33 [==============================] - 20s 602ms/step - loss: 0.1972 - accuracy: 0.9188 - val_loss: 0.1894 - val_accuracy: 0.9247
Epoch 213/300
33/33 [==============================] - 19s 594ms/step - loss: 0.1881 - accuracy: 0.9204 - val_loss: 0.1621 - val_accuracy: 0.9440
Epoch 214/300
33/33 [==============================] - 20s 607ms/step - loss: 0.2057 - accuracy: 0.9167 - val_loss: 0.1745 - val_accuracy: 0.9352
Epoch 215/300
33/33 [==============================] - 19s 599ms/step - loss: 0.2026 - accuracy: 0.9136 - val_loss: 0.1702 - val_accuracy: 0.9381
Epoch 216/300
33/33 [==============================] - 19s 595ms/step - loss: 0.2006 - accuracy: 0.9229 - val_loss: 0.1774 - val_accuracy: 0.9331
Epoch 217/300
33/33 [==============================] - 20s 602ms/step - loss: 0.2250 - accuracy: 0.9140 - val_loss: 0.1821 - val_accuracy: 0.9290
Epoch 218/300
33/33 [==============================] - 19s 596ms/step - loss: 0.2001 - accuracy: 0.9270 - val_loss: 0.1690 - val_accuracy: 0.9380
Epoch 219/300
33/33 [==============================] - 20s 608ms/step - loss: 0.2181 - accuracy: 0.9117 - val_loss: 0.1769 - val_accuracy: 0.9340
Epoch 220/300
33/33 [==============================] - 19s 598ms/step - loss: 0.2228 - accuracy: 0.9158 - val_loss: 0.1788 - val_accuracy: 0.9324
Epoch 221/300
33/33 [==============================] - 19s 588ms/step - loss: 0.2081 - accuracy: 0.9122 - val_loss: 0.1797 - val_accuracy: 0.9302
Epoch 222/300
33/33 [==============================] - 19s 600ms/step - loss: 0.2007 - accuracy: 0.9313 - val_loss: 0.2007 - val_accuracy: 0.9167
Epoch 223/300
33/33 [==============================] - 19s 585ms/step - loss: 0.2110 - accuracy: 0.9060 - val_loss: 0.1866 - val_accuracy: 0.9255
Epoch 224/300
33/33 [==============================] - 19s 592ms/step - loss: 0.2106 - accuracy: 0.9196 - val_loss: 0.1994 - val_accuracy: 0.9160
Epoch 225/300
33/33 [==============================] - 20s 601ms/step - loss: 0.1838 - accuracy: 0.9205 - val_loss: 0.1975 - val_accuracy: 0.9176
Epoch 226/300
33/33 [==============================] - 19s 593ms/step - loss: 0.1966 - accuracy: 0.9259 - val_loss: 0.1892 - val_accuracy: 0.9238
Epoch 227/300
33/33 [==============================] - 20s 602ms/step - loss: 0.2112 - accuracy: 0.9116 - val_loss: 0.1819 - val_accuracy: 0.9283
Epoch 228/300
33/33 [==============================] - 19s 585ms/step - loss: 0.1792 - accuracy: 0.9215 - val_loss: 0.1803 - val_accuracy: 0.9288
Epoch 229/300
33/33 [==============================] - 19s 598ms/step - loss: 0.1860 - accuracy: 0.9163 - val_loss: 0.1752 - val_accuracy: 0.9338
Epoch 230/300
33/33 [==============================] - 19s 586ms/step - loss: 0.1663 - accuracy: 0.9273 - val_loss: 0.1689 - val_accuracy: 0.9403
Epoch 231/300
33/33 [==============================] - 20s 601ms/step - loss: 0.2231 - accuracy: 0.9168 - val_loss: 0.1640 - val_accuracy: 0.9420
Epoch 232/300
33/33 [==============================] - 19s 599ms/step - loss: 0.2056 - accuracy: 0.9214 - val_loss: 0.1813 - val_accuracy: 0.9297
Epoch 233/300
33/33 [==============================] - 19s 588ms/step - loss: 0.1918 - accuracy: 0.9219 - val_loss: 0.1700 - val_accuracy: 0.9382
Epoch 234/300
33/33 [==============================] - 19s 597ms/step - loss: 0.1546 - accuracy: 0.9306 - val_loss: 0.1886 - val_accuracy: 0.9249
Epoch 235/300
33/33 [==============================] - 20s 603ms/step - loss: 0.1842 - accuracy: 0.9205 - val_loss: 0.1698 - val_accuracy: 0.9375
Epoch 236/300
33/33 [==============================] - 20s 603ms/step - loss: 0.1833 - accuracy: 0.9249 - val_loss: 0.1704 - val_accuracy: 0.9363
Epoch 237/300
33/33 [==============================] - 19s 599ms/step - loss: 0.1810 - accuracy: 0.9264 - val_loss: 0.1789 - val_accuracy: 0.9303
Epoch 238/300
33/33 [==============================] - 19s 590ms/step - loss: 0.2049 - accuracy: 0.9167 - val_loss: 0.1687 - val_accuracy: 0.9389
Epoch 239/300
33/33 [==============================] - 20s 607ms/step - loss: 0.1824 - accuracy: 0.9223 - val_loss: 0.2050 - val_accuracy: 0.9114
Epoch 240/300
33/33 [==============================] - 19s 599ms/step - loss: 0.1901 - accuracy: 0.9189 - val_loss: 0.1832 - val_accuracy: 0.9275
Epoch 241/300
33/33 [==============================] - 19s 599ms/step - loss: 0.2098 - accuracy: 0.9144 - val_loss: 0.1708 - val_accuracy: 0.9357
Epoch 242/300
33/33 [==============================] - 20s 610ms/step - loss: 0.1785 - accuracy: 0.9247 - val_loss: 0.1802 - val_accuracy: 0.9297
Epoch 243/300
33/33 [==============================] - 19s 596ms/step - loss: 0.2101 - accuracy: 0.9202 - val_loss: 0.1565 - val_accuracy: 0.9459
Epoch 244/300
33/33 [==============================] - 19s 599ms/step - loss: 0.2109 - accuracy: 0.9085 - val_loss: 0.1764 - val_accuracy: 0.9321
Epoch 245/300
33/33 [==============================] - 19s 596ms/step - loss: 0.1705 - accuracy: 0.9257 - val_loss: 0.1702 - val_accuracy: 0.9384
Epoch 246/300
33/33 [==============================] - 19s 596ms/step - loss: 0.1699 - accuracy: 0.9306 - val_loss: 0.1811 - val_accuracy: 0.9307
Epoch 247/300
33/33 [==============================] - 19s 599ms/step - loss: 0.1795 - accuracy: 0.9262 - val_loss: 0.1668 - val_accuracy: 0.9414
Epoch 248/300
33/33 [==============================] - 20s 606ms/step - loss: 0.1746 - accuracy: 0.9230 - val_loss: 0.1732 - val_accuracy: 0.9392
Epoch 249/300
33/33 [==============================] - 19s 594ms/step - loss: 0.1630 - accuracy: 0.9290 - val_loss: 0.1780 - val_accuracy: 0.9351
Epoch 250/300
33/33 [==============================] - 19s 595ms/step - loss: 0.1827 - accuracy: 0.9229 - val_loss: 0.1824 - val_accuracy: 0.9360
Epoch 251/300
33/33 [==============================] - 19s 599ms/step - loss: 0.2100 - accuracy: 0.9200 - val_loss: 0.1819 - val_accuracy: 0.9318
Epoch 252/300
33/33 [==============================] - 19s 595ms/step - loss: 0.1932 - accuracy: 0.9175 - val_loss: 0.1978 - val_accuracy: 0.9187
Epoch 253/300
33/33 [==============================] - 19s 599ms/step - loss: 0.1770 - accuracy: 0.9228 - val_loss: 0.1738 - val_accuracy: 0.9290
Epoch 254/300
33/33 [==============================] - 19s 593ms/step - loss: 0.1990 - accuracy: 0.9229 - val_loss: 0.2624 - val_accuracy: 0.8608
Epoch 255/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2030 - accuracy: 0.9119 - val_loss: 0.1739 - val_accuracy: 0.9324
Epoch 256/300
33/33 [==============================] - 20s 611ms/step - loss: 0.2367 - accuracy: 0.9184 - val_loss: 0.2298 - val_accuracy: 0.8902
Epoch 257/300
33/33 [==============================] - 19s 597ms/step - loss: 0.1696 - accuracy: 0.9228 - val_loss: 0.1839 - val_accuracy: 0.9265
Epoch 258/300
33/33 [==============================] - 20s 602ms/step - loss: 0.1571 - accuracy: 0.9308 - val_loss: 0.1636 - val_accuracy: 0.9407
Epoch 259/300
33/33 [==============================] - 20s 615ms/step - loss: 0.1756 - accuracy: 0.9237 - val_loss: 0.1910 - val_accuracy: 0.9211
Epoch 260/300
33/33 [==============================] - 19s 596ms/step - loss: 0.1547 - accuracy: 0.9223 - val_loss: 0.2102 - val_accuracy: 0.9059
Epoch 261/300
33/33 [==============================] - 20s 605ms/step - loss: 0.1734 - accuracy: 0.9340 - val_loss: 0.1924 - val_accuracy: 0.9206
Epoch 262/300
33/33 [==============================] - 20s 604ms/step - loss: 0.1613 - accuracy: 0.9338 - val_loss: 0.1716 - val_accuracy: 0.9390
Epoch 263/300
33/33 [==============================] - 20s 617ms/step - loss: 0.1598 - accuracy: 0.9292 - val_loss: 0.1808 - val_accuracy: 0.9306
Epoch 264/300
33/33 [==============================] - 20s 618ms/step - loss: 0.1968 - accuracy: 0.9245 - val_loss: 0.1693 - val_accuracy: 0.9374
Epoch 265/300
33/33 [==============================] - 20s 606ms/step - loss: 0.1925 - accuracy: 0.9186 - val_loss: 0.1704 - val_accuracy: 0.9366
Epoch 266/300
33/33 [==============================] - 19s 594ms/step - loss: 0.1694 - accuracy: 0.9304 - val_loss: 0.1814 - val_accuracy: 0.9311
Epoch 267/300
33/33 [==============================] - 19s 589ms/step - loss: 0.1784 - accuracy: 0.9259 - val_loss: 0.1690 - val_accuracy: 0.9379
Epoch 268/300
33/33 [==============================] - 19s 592ms/step - loss: 0.1882 - accuracy: 0.9213 - val_loss: 0.1711 - val_accuracy: 0.9381
Epoch 269/300
33/33 [==============================] - 19s 584ms/step - loss: 0.2014 - accuracy: 0.9121 - val_loss: 0.1635 - val_accuracy: 0.9424
Epoch 270/300
33/33 [==============================] - 19s 597ms/step - loss: 0.1936 - accuracy: 0.9268 - val_loss: 0.1716 - val_accuracy: 0.9390
Epoch 271/300
33/33 [==============================] - 20s 619ms/step - loss: 0.1824 - accuracy: 0.9214 - val_loss: 0.1689 - val_accuracy: 0.9397
Epoch 272/300
33/33 [==============================] - 20s 617ms/step - loss: 0.1600 - accuracy: 0.9321 - val_loss: 0.1661 - val_accuracy: 0.9438
Epoch 273/300
33/33 [==============================] - 20s 607ms/step - loss: 0.1690 - accuracy: 0.9303 - val_loss: 0.1648 - val_accuracy: 0.9407
Epoch 274/300
33/33 [==============================] - 20s 607ms/step - loss: 0.2084 - accuracy: 0.9190 - val_loss: 0.1659 - val_accuracy: 0.9409
Epoch 275/300
33/33 [==============================] - 19s 594ms/step - loss: 0.1722 - accuracy: 0.9299 - val_loss: 0.1721 - val_accuracy: 0.9359
Epoch 276/300
33/33 [==============================] - 20s 607ms/step - loss: 0.1945 - accuracy: 0.9278 - val_loss: 0.1695 - val_accuracy: 0.9377
Epoch 277/300
33/33 [==============================] - 19s 583ms/step - loss: 0.1686 - accuracy: 0.9284 - val_loss: 0.1707 - val_accuracy: 0.9364
Epoch 278/300
33/33 [==============================] - 19s 582ms/step - loss: 0.1743 - accuracy: 0.9265 - val_loss: 0.1655 - val_accuracy: 0.9414
Epoch 279/300
33/33 [==============================] - 19s 587ms/step - loss: 0.1615 - accuracy: 0.9310 - val_loss: 0.1639 - val_accuracy: 0.9448
Epoch 280/300
33/33 [==============================] - 19s 585ms/step - loss: 0.1919 - accuracy: 0.9252 - val_loss: 0.1623 - val_accuracy: 0.9421
Epoch 281/300
33/33 [==============================] - 19s 588ms/step - loss: 0.1706 - accuracy: 0.9235 - val_loss: 0.1576 - val_accuracy: 0.9446
Epoch 282/300
33/33 [==============================] - 19s 594ms/step - loss: 0.2319 - accuracy: 0.9218 - val_loss: 0.1683 - val_accuracy: 0.9395
Epoch 283/300
33/33 [==============================] - 19s 587ms/step - loss: 0.1554 - accuracy: 0.9283 - val_loss: 0.1685 - val_accuracy: 0.9387
Epoch 284/300
33/33 [==============================] - 20s 605ms/step - loss: 0.1562 - accuracy: 0.9289 - val_loss: 0.1701 - val_accuracy: 0.9371
Epoch 285/300
33/33 [==============================] - 19s 586ms/step - loss: 0.1490 - accuracy: 0.9356 - val_loss: 0.1805 - val_accuracy: 0.9326
Epoch 286/300
33/33 [==============================] - 19s 586ms/step - loss: 0.1779 - accuracy: 0.9268 - val_loss: 0.1688 - val_accuracy: 0.9398
Epoch 287/300
33/33 [==============================] - 19s 580ms/step - loss: 0.1639 - accuracy: 0.9316 - val_loss: 0.1642 - val_accuracy: 0.9439
Epoch 288/300
33/33 [==============================] - 19s 591ms/step - loss: 0.1833 - accuracy: 0.9286 - val_loss: 0.1902 - val_accuracy: 0.9298
Epoch 289/300
33/33 [==============================] - 19s 585ms/step - loss: 0.1785 - accuracy: 0.9252 - val_loss: 0.1942 - val_accuracy: 0.9196
Epoch 290/300
33/33 [==============================] - 19s 580ms/step - loss: 0.1724 - accuracy: 0.9331 - val_loss: 0.1636 - val_accuracy: 0.9411
Epoch 291/300
33/33 [==============================] - 19s 589ms/step - loss: 0.1604 - accuracy: 0.9298 - val_loss: 0.1676 - val_accuracy: 0.9378
Epoch 292/300
33/33 [==============================] - 19s 581ms/step - loss: 0.1905 - accuracy: 0.9286 - val_loss: 0.1783 - val_accuracy: 0.9330
Epoch 293/300
33/33 [==============================] - 19s 585ms/step - loss: 0.1929 - accuracy: 0.9243 - val_loss: 0.1987 - val_accuracy: 0.9150
Epoch 294/300
33/33 [==============================] - 19s 587ms/step - loss: 0.1635 - accuracy: 0.9289 - val_loss: 0.1635 - val_accuracy: 0.9380
Epoch 295/300
33/33 [==============================] - 19s 580ms/step - loss: 0.1765 - accuracy: 0.9320 - val_loss: 0.1846 - val_accuracy: 0.9281
Epoch 296/300
33/33 [==============================] - 19s 584ms/step - loss: 0.1590 - accuracy: 0.9308 - val_loss: 0.1734 - val_accuracy: 0.9337
Epoch 297/300
33/33 [==============================] - 19s 581ms/step - loss: 0.1880 - accuracy: 0.9218 - val_loss: 0.1634 - val_accuracy: 0.9381
Epoch 298/300
33/33 [==============================] - 19s 584ms/step - loss: 0.1907 - accuracy: 0.9246 - val_loss: 0.1551 - val_accuracy: 0.9455
Epoch 299/300
33/33 [==============================] - 19s 585ms/step - loss: 0.1844 - accuracy: 0.9269 - val_loss: 0.1668 - val_accuracy: 0.9379
Epoch 300/300
33/33 [==============================] - 19s 589ms/step - loss: 0.1752 - accuracy: 0.9252 - val_loss: 0.1706 - val_accuracy: 0.9328

Once training is complete, we can save the architecture and trained weights.

In [ ]:
model_json = model.to_json()
with open("/content/drive/My Drive/Datasets/ForestryView/Model300.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("/content/drive/My Drive/Datasets/ForestryView/Model300.weights.h5")

Suppose we need to reload the architecture and weights:

In [ ]:
from keras.models import model_from_json
In [ ]:
json_file = open('/content/drive/My Drive/Datasets/ForestryView/Model300.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("/content/drive/My Drive/Datasets/ForestryView/Model300.weights.h5")

We can now run predict on the test data and compare the results visually:

In [ ]:
predict = loaded_model.predict(x_test)
In [ ]:
i = 2
R = x_test[i,:,:,2]*4
G = x_test[i,:,:,1]*4
B = x_test[i,:,:,0]*4
rgb = np.dstack((R,G,B))
plt.figure(figsize=[30,30])
plt.subplot(131)
plt.imshow(rgb)
plt.title('RGB Image')
plt.axis('off')
plt.subplot(132)
plt.imshow(np.round(y_test[i,:,:,0]))
plt.title('True Image')
plt.axis('off')
plt.subplot(133)
plt.imshow(np.round(predict[i,:,:,0]))
plt.title('Predict Image')
plt.axis('off')
Out[ ]:
(-0.5, 319.5, 319.5, -0.5)
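Beyond visual comparison, we can quantify how well a predicted mask matches the reference with intersection over union (IoU). Below is a minimal sketch on binary NumPy masks; the function is generic and not part of the original notebook:

```python
import numpy as np

def iou(y_true, y_pred):
    """Intersection over union of two binary masks."""
    y_true = np.round(y_true).astype(bool)
    y_pred = np.round(y_pred).astype(bool)
    inter = np.logical_and(y_true, y_pred).sum()
    union = np.logical_or(y_true, y_pred).sum()
    return inter / union if union > 0 else 1.0

# Tiny worked example: 3 px intersection over a 5 px union -> 0.6
a = np.array([[1, 1, 0], [1, 1, 0]])
b = np.array([[1, 1, 1], [0, 1, 0]])
print(iou(a, b))  # -> 0.6
```

For the patch plotted above, `iou(y_test[i,:,:,0], predict[i,:,:,0])` gives a per-patch score; averaging over all test patches summarizes segmentation quality better than pixel accuracy, which is inflated by the large non-eucalyptus background.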

Using the trained model to predict a complete orthomosaic¶

Let's now apply our eucalyptus segmentation model to this orthomosaic and generate the resulting map.

image.png

The first step is to configure the orthomosaic path and create one folder under /content to receive the image patches and another for the predictions.

In [ ]:
path_img_to_pred = "/content/drive/MyDrive/Datasets/ForestryView/TL.tif"
path_split = "/content/split_img"
if not os.path.isdir(path_split):
    os.mkdir(path_split)

path_exp = "/content/mask_predict"
if not os.path.isdir(path_exp):
    os.mkdir(path_exp)

We divide our orthomosaic into 320×320 patches, the same size as the images the model was trained on:

In [ ]:
src = rasterio.open(path_img_to_pred)
out_meta = src.meta.copy()
qtd = 0
for n in range(src.meta['width'] // 320):
    for m in range(src.meta['height'] // 320):
        x = n * 320
        y = m * 320
        window = Window(x, y, 320, 320)
        win_transform = src.window_transform(window)
        arr_win = src.read(window=window)
        # Keep only non-empty, full-size windows (skips partial tiles at the borders)
        if (arr_win.max() != 0) and (arr_win.shape[1] == 320) and (arr_win.shape[2] == 320):
            qtd = qtd + 1
            path_exp_img = os.path.join(path_split, 'img_' + str(qtd) + '.tif')
            out_meta.update({"driver": "GTiff", "height": 320, "width": 320,
                             "compress": 'lzw', "transform": win_transform})
            with rasterio.open(path_exp_img, 'w', **out_meta) as dst:
                for i, layer in enumerate(arr_win, start=1):
                    dst.write_band(i, layer)
            print('Create img: ' + str(qtd))
        del arr_win
Create img: 1
Create img: 2
Create img: 3
...
Create img: 566
Create img: 567
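The nested loops above visit every full 320×320 window whose origin is a multiple of 320; only non-empty, full-size windows are written, which is why 567 patches result rather than the full grid count. The index arithmetic can be sketched in isolation (hypothetical raster dimensions, not taken from TL.tif):

```python
def tile_origins(width, height, tile=320):
    """Top-left (x, y) origin of every full tile-sized window,
    mirroring the nested range(width // tile) loops above."""
    return [(n * tile, m * tile)
            for n in range(width // tile)
            for m in range(height // tile)]

# Hypothetical 1000 x 700 raster: 3 columns x 2 rows of full tiles.
origins = tile_origins(1000, 700)
print(len(origins))             # -> 6
print(origins[0], origins[-1])  # -> (0, 0) (640, 320)
```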

We then loop over the patches: open each one, run the prediction, and save the georeferenced mask.

In [ ]:
n = os.listdir(path_split)
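Note that `os.listdir` returns names in arbitrary order (visible in the printed output below). That is harmless here, since each mask is written next to its numbered source patch, but if a deterministic order is ever needed, we can sort on the numeric index embedded in the filenames (a small sketch assuming the `img_<n>.tif` naming from the splitting step):

```python
def numeric_key(name):
    """Extract the integer index from an 'img_<n>.tif' filename."""
    return int(name.split('_')[1].split('.')[0])

names = ['img_10.tif', 'img_2.tif', 'img_1.tif']
print(sorted(names, key=numeric_key))
# -> ['img_1.tif', 'img_2.tif', 'img_10.tif']
```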
In [ ]:
for path_img in n:
    path_full = os.path.join(path_split, path_img)
    with rasterio.open(path_full, 'r') as ds:
        im = ds.read()
        out_meta = ds.meta.copy()
    im = im.transpose([1, 2, 0])  # (bands, H, W) -> (H, W, bands)
    im = im[np.newaxis, :, :, :]  # add batch dimension
    predict = loaded_model.predict(im)
    predict = np.round(predict).astype(np.uint8)
    print(path_img.split('_')[1])
    path_exp_1 = os.path.join(path_exp, 'Pred_' + path_img.split('_')[1])
    out_meta.update({"driver": "GTiff", "dtype": rasterio.uint8,
                     "compress": 'lzw', "count": 1, "nodata": 0})
    with rasterio.open(path_exp_1, 'w', **out_meta) as dst:
        dst.write(predict[0, :, :, 0], indexes=1)
138.tif
362.tif
13.tif
...
34.tif
62.tif
258.tif
459.tif
81.tif
219.tif
436.tif
267.tif
375.tif
393.tif
519.tif
323.tif
462.tif
107.tif
531.tif
196.tif
140.tif
400.tif
102.tif
180.tif
332.tif
386.tif
471.tif
391.tif
517.tif
287.tif
476.tif
234.tif
562.tif
344.tif
169.tif
47.tif
221.tif
174.tif
533.tif
446.tif
536.tif
292.tif
50.tif
432.tif
392.tif
527.tif
175.tif
447.tif
547.tif
410.tif
324.tif
189.tif
209.tif
310.tif
335.tif
426.tif
492.tif
87.tif
63.tif
29.tif
530.tif
485.tif
480.tif
298.tif
132.tif
546.tif
224.tif
543.tif
361.tif
303.tif
466.tif
448.tif
99.tif
505.tif
119.tif
105.tif
422.tif
248.tif
266.tif
199.tif
318.tif
112.tif
86.tif
385.tif
262.tif
349.tif
284.tif
437.tif
90.tif
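The loop above casts the network output to uint8 when writing each mask. If the model's final layer produces a per-pixel probability map, it is common to binarise it first; a minimal sketch (the 0.5 threshold and the function name are assumptions, not values from this notebook):

```python
import numpy as np

def binarize(prob_map, threshold=0.5):
    """Turn a float probability map into a 0/1 uint8 mask."""
    return (prob_map >= threshold).astype(np.uint8)

probs = np.array([[0.1, 0.7], [0.5, 0.3]], dtype=np.float32)
print(binarize(probs))  # [[0 1]
                        #  [1 0]]
```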

After predicting the images, we will mosaic the resulting masks into a single .tif file:

In [ ]:
out_fp = r"/content/Pred_mosaic.tif"
In [ ]:
# List the predicted masks written by the loop above
images_files = [f for f in os.listdir(path_exp)]
print(images_files)
['Pred_352.tif', 'Pred_183.tif', 'Pred_96.tif', 'Pred_537.tif', 'Pred_66.tif', 'Pred_348.tif', ...] (output truncated)
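Note that `os.listdir` returns entries in arbitrary order and would also pick up any non-TIFF files in the folder. A defensive variant (a sketch, assuming the `Pred_<id>.tif` naming used above) filters and sorts the list by tile id:

```python
def list_pred_tifs(filenames):
    """Keep only Pred_*.tif entries, sorted by their numeric tile id."""
    tifs = [f for f in filenames if f.startswith('Pred_') and f.endswith('.tif')]
    return sorted(tifs, key=lambda f: int(f[len('Pred_'):-len('.tif')]))

print(list_pred_tifs(['Pred_10.tif', 'notes.txt', 'Pred_2.tif']))
# ['Pred_2.tif', 'Pred_10.tif']
```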
In [ ]:
# Open every predicted tile; the datasets must stay open until merge() consumes them
src_files_to_mosaic = []
for fp in images_files:
  src = rasterio.open(os.path.join(path_exp, fp))
  src_files_to_mosaic.append(src)

We use rasterio's merge function to join all the masks into a single mosaic and save the result:

In [ ]:
from rasterio.merge import merge  # needed if not already imported above

mosaic, out_trans = merge(src_files_to_mosaic)
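Conceptually, merge computes the union of all input bounds and pastes each tile at its georeferenced position. A simplified illustration of the idea using plain pixel offsets instead of geotransforms (an assumption for clarity, not how rasterio is implemented internally):

```python
import numpy as np

def simple_mosaic(tiles):
    """tiles: list of (row_off, col_off, 2-D array). Paste each into a union canvas."""
    height = max(r + t.shape[0] for r, c, t in tiles)
    width = max(c + t.shape[1] for r, c, t in tiles)
    canvas = np.zeros((height, width), dtype=tiles[0][2].dtype)
    for r, c, t in tiles:
        canvas[r:r + t.shape[0], c:c + t.shape[1]] = t
    return canvas

a = np.ones((2, 2), dtype=np.uint8)
m = simple_mosaic([(0, 0, a), (0, 2, 2 * a)])
print(m.shape)  # (2, 4)
```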
In [ ]:
out_meta.update({"driver": "GTiff",
                 "height": mosaic.shape[1],
                 "width": mosaic.shape[2],
                 "transform": out_trans,
                 "compress":'lzw'})
In [ ]:
with rasterio.open(out_fp, "w", **out_meta) as dest:
    dest.write(mosaic)
for src in src_files_to_mosaic:  # release the opened tile datasets
    src.close()
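With the mosaic written, a useful sanity check is the total mapped area: the mask stores 1 for eucalyptus, so the area is the pixel count times the pixel footprint (100 m² for Sentinel-2's 10 m bands). A sketch on a synthetic array (the function name is hypothetical):

```python
import numpy as np

def eucalyptus_area_ha(mask, pixel_size_m=10.0):
    """Area in hectares: number of mask pixels equal to 1 times the pixel area."""
    return int((mask == 1).sum()) * pixel_size_m ** 2 / 10_000

mask = np.zeros((100, 100), dtype=np.uint8)
mask[:50, :] = 1  # 5000 eucalyptus pixels of 100 m2 each
print(eucalyptus_area_ha(mask))  # 50.0
```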

By placing the resulting mask over the original image we can generate a map of eucalyptus areas in our AOI:

image.png