Advanced Semantic Segmentation Architectures - Part 1¶

How to create mask patches and orthomosaic images from shapefiles¶

image.png

I will demonstrate how we convert target shapefiles into masks to train semantic segmentation models. The dataset used in this example is part of the Open Cities AI challenge that we will use later:

In [ ]:
!pip install rasterio
Collecting rasterio
  Downloading rasterio-1.3.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 21.3/21.3 MB 44.9 MB/s eta 0:00:00
Collecting affine (from rasterio)
  Downloading affine-2.4.0-py3-none-any.whl (15 kB)
Requirement already satisfied: attrs in /usr/local/lib/python3.10/dist-packages (from rasterio) (23.1.0)
Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from rasterio) (2023.7.22)
Requirement already satisfied: click>=4.0 in /usr/local/lib/python3.10/dist-packages (from rasterio) (8.1.7)
Requirement already satisfied: cligj>=0.5 in /usr/local/lib/python3.10/dist-packages (from rasterio) (0.7.2)
Requirement already satisfied: numpy>=1.18 in /usr/local/lib/python3.10/dist-packages (from rasterio) (1.23.5)
Collecting snuggs>=1.4.1 (from rasterio)
  Downloading snuggs-1.4.7-py3-none-any.whl (5.4 kB)
Requirement already satisfied: click-plugins in /usr/local/lib/python3.10/dist-packages (from rasterio) (1.1.1)
Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from rasterio) (67.7.2)
Requirement already satisfied: pyparsing>=2.1.6 in /usr/local/lib/python3.10/dist-packages (from snuggs>=1.4.1->rasterio) (3.1.1)
Installing collected packages: snuggs, affine, rasterio
Successfully installed affine-2.4.0 rasterio-1.3.8 snuggs-1.4.7
Requirement already satisfied: geopandas in /usr/local/lib/python3.10/dist-packages (0.13.2)
Requirement already satisfied: fiona>=1.8.19 in /usr/local/lib/python3.10/dist-packages (from geopandas) (1.9.4.post1)
Requirement already satisfied: packaging in /usr/local/lib/python3.10/dist-packages (from geopandas) (23.1)
Requirement already satisfied: pandas>=1.1.0 in /usr/local/lib/python3.10/dist-packages (from geopandas) (1.5.3)
Requirement already satisfied: pyproj>=3.0.1 in /usr/local/lib/python3.10/dist-packages (from geopandas) (3.6.0)
Requirement already satisfied: shapely>=1.7.1 in /usr/local/lib/python3.10/dist-packages (from geopandas) (2.0.1)
Requirement already satisfied: attrs>=19.2.0 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (23.1.0)
Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (2023.7.22)
Requirement already satisfied: click~=8.0 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (8.1.7)
Requirement already satisfied: click-plugins>=1.0 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (1.1.1)
Requirement already satisfied: cligj>=0.5 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (0.7.2)
Requirement already satisfied: six in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (1.16.0)
Requirement already satisfied: python-dateutil>=2.8.1 in /usr/local/lib/python3.10/dist-packages (from pandas>=1.1.0->geopandas) (2.8.2)
Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas>=1.1.0->geopandas) (2023.3)
Requirement already satisfied: numpy>=1.21.0 in /usr/local/lib/python3.10/dist-packages (from pandas>=1.1.0->geopandas) (1.23.5)
In [ ]:
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
In [ ]:
import glob
import os
import cv2
import rasterio
import geopandas as gpd
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.utils import to_categorical
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from rasterio.merge import merge
from rasterio.plot import show
In [ ]:
path = '/content/drive/MyDrive/Datasets/OpenCitiesAI/mon/493701/493701.tif'

Let's open the file, read the raw data as numpy arrays and perform the band axis transposition:

In [ ]:
src = rasterio.open(path)
In [ ]:
img = src.read()
In [ ]:
img.shape
Out[ ]:
(4, 21783, 22333)
In [ ]:
img = img.transpose([1,2,0])

After opening the image stored in Drive, let's check the CRS:

In [ ]:
src.crs
Out[ ]:
CRS.from_epsg(32629)

Using matplotlib, let's see the image:

In [ ]:
plt.figure(figsize=[16,16])
plt.imshow(img)
plt.axis('off')
Out[ ]:
(-0.5, 22332.5, 21782.5, -0.5)
No description has been provided for this image

Now let's work with the labels. First we import them using Geopandas and then we set the same CRS as the Image:

In [ ]:
path_labels = '/content/drive/MyDrive/Datasets/OpenCitiesAI/mon/493701-labels/493701.geojson'
In [ ]:
label = gpd.read_file(path_labels)
In [ ]:
label = label.to_crs(32629)

We will also plot:

In [ ]:
label.plot(figsize=(16,16))
Out[ ]:
<Axes: >
No description has been provided for this image

Let's use the rasterize function to convert the target vectors into an image with 0's and 1's. The number 1 represents the polygons and the 0 represents the background.

In [ ]:
from rasterio.features import rasterize
In [ ]:
shape_target = src.shape
out_arr = np.zeros(shape_target)
out_target = src.meta.copy()
mask_rasterized = rasterize( [(x.geometry, 1) for i, x in label.iterrows()],
                                transform=src.transform,
                                fill=0,
                                out = out_arr,
                                dtype=rasterio.uint8)
del out_arr
out_target.update({"driver": "GTiff",
                              "nodata":0,
                               "dtype":rasterio.uint8,
                               "compress":'lzw',
                               "count":1})
path_exp_target = '/content/mask_target.tif'
with rasterio.open(path_exp_target, 'w', **out_target) as msk:
    msk.write(mask_rasterized, indexes=1)

The resulting file was saved in /content with the name mask_target.tiff

So, let's open it up and plot it:

In [ ]:
tgt = rasterio.open(path_exp_target)
In [ ]:
tgt_arr = tgt.read(1)
In [ ]:
plt.figure(figsize=[16,16])
plt.imshow(tgt_arr)
plt.axis('off')
Out[ ]:
(-0.5, 22332.5, 21782.5, -0.5)
No description has been provided for this image

We will create a folder in the content to store the image patches and masks:

In [ ]:
from os import mkdir
In [ ]:
mkdir('data')

And finally we will go through the image and the mask, dividing them into patches of 512x512 pixels.

In [ ]:
from rasterio.windows import Window
In [ ]:
qtd = 0
out_meta = src.meta.copy()
out_meta_tgt = tgt.meta.copy()
for n in range((src.meta['width']//512)):
  for m in range((src.meta['height']//512)):
    x = (n*512)
    y = (m*512)
    window = Window(x,y,512,512)
    win_transform = src.window_transform(window)
    arr_win = src.read(window=window)
    arr_win = arr_win[0:3]
    tgt_transform = tgt.window_transform(window)
    tgt_win = tgt.read(window=window)
    if (arr_win.max() != 0):
      qtd = qtd + 1
      path_exp = '/content/data/img_' + str(qtd) + '.tif'
      out_meta.update({"driver": "GTiff","height": 512,"width": 512, "count":len(arr_win), "compress":'lzw', "transform":win_transform})
      with rasterio.open(path_exp, 'w', **out_meta) as dst:
          for i, layer in enumerate(arr_win, start=1):
              dst.write_band(i, layer.reshape(-1, layer.shape[-1]))
      path_exp_mask = '/content/data/msk_' + str(qtd) + '.tif'
      out_meta_tgt.update({"driver": "GTiff","height": 512,"width": 512, "compress":'lzw', "transform":tgt_transform})
      with rasterio.open(path_exp_mask, 'w', **out_meta_tgt) as msk:
          msk.write(tgt_win.reshape(-1, tgt_win.shape[-1]), indexes=1)
      print('Create img and mask: ' + str(qtd))
    del tgt_win
    del arr_win
Create img and mask: 1
Create img and mask: 2
Create img and mask: 3
Create img and mask: 4
Create img and mask: 5
Create img and mask: 6
Create img and mask: 7
Create img and mask: 8
Create img and mask: 9
Create img and mask: 10
Create img and mask: 11
Create img and mask: 12
Create img and mask: 13
Create img and mask: 14
Create img and mask: 15
Create img and mask: 16
Create img and mask: 17
Create img and mask: 18
Create img and mask: 19
Create img and mask: 20
Create img and mask: 21
Create img and mask: 22
Create img and mask: 23
Create img and mask: 24
Create img and mask: 25
Create img and mask: 26
Create img and mask: 27
Create img and mask: 28
Create img and mask: 29
Create img and mask: 30
Create img and mask: 31
Create img and mask: 32
Create img and mask: 33
Create img and mask: 34
Create img and mask: 35
Create img and mask: 36
Create img and mask: 37
Create img and mask: 38
Create img and mask: 39
Create img and mask: 40
Create img and mask: 41
Create img and mask: 42
Create img and mask: 43
Create img and mask: 44
Create img and mask: 45
Create img and mask: 46
Create img and mask: 47
Create img and mask: 48
Create img and mask: 49
Create img and mask: 50
Create img and mask: 51
Create img and mask: 52
Create img and mask: 53
Create img and mask: 54
Create img and mask: 55
Create img and mask: 56
Create img and mask: 57
Create img and mask: 58
Create img and mask: 59
Create img and mask: 60
Create img and mask: 61
Create img and mask: 62
Create img and mask: 63
Create img and mask: 64
Create img and mask: 65
Create img and mask: 66
Create img and mask: 67
Create img and mask: 68
Create img and mask: 69
Create img and mask: 70
Create img and mask: 71
Create img and mask: 72
Create img and mask: 73
Create img and mask: 74
Create img and mask: 75
Create img and mask: 76
Create img and mask: 77
Create img and mask: 78
Create img and mask: 79
Create img and mask: 80
Create img and mask: 81
Create img and mask: 82
Create img and mask: 83
Create img and mask: 84
Create img and mask: 85
Create img and mask: 86
Create img and mask: 87
Create img and mask: 88
Create img and mask: 89
Create img and mask: 90
Create img and mask: 91
Create img and mask: 92
Create img and mask: 93
Create img and mask: 94
Create img and mask: 95
Create img and mask: 96
Create img and mask: 97
Create img and mask: 98
Create img and mask: 99
Create img and mask: 100
Create img and mask: 101
Create img and mask: 102
Create img and mask: 103
Create img and mask: 104
Create img and mask: 105
Create img and mask: 106
Create img and mask: 107
Create img and mask: 108
Create img and mask: 109
Create img and mask: 110
Create img and mask: 111
Create img and mask: 112
Create img and mask: 113
Create img and mask: 114
Create img and mask: 115
Create img and mask: 116
Create img and mask: 117
Create img and mask: 118
Create img and mask: 119
Create img and mask: 120
Create img and mask: 121
Create img and mask: 122
Create img and mask: 123
Create img and mask: 124
Create img and mask: 125
Create img and mask: 126
Create img and mask: 127
Create img and mask: 128
Create img and mask: 129
Create img and mask: 130
Create img and mask: 131
Create img and mask: 132
Create img and mask: 133
Create img and mask: 134
Create img and mask: 135
Create img and mask: 136
Create img and mask: 137
Create img and mask: 138
Create img and mask: 139
Create img and mask: 140
Create img and mask: 141
Create img and mask: 142
Create img and mask: 143
Create img and mask: 144
Create img and mask: 145
Create img and mask: 146
Create img and mask: 147
Create img and mask: 148
Create img and mask: 149
Create img and mask: 150
Create img and mask: 151
Create img and mask: 152
Create img and mask: 153
Create img and mask: 154
Create img and mask: 155
Create img and mask: 156
Create img and mask: 157
Create img and mask: 158
Create img and mask: 159
Create img and mask: 160
Create img and mask: 161
Create img and mask: 162
Create img and mask: 163
Create img and mask: 164
Create img and mask: 165
Create img and mask: 166
Create img and mask: 167
Create img and mask: 168
Create img and mask: 169
Create img and mask: 170
Create img and mask: 171
Create img and mask: 172
Create img and mask: 173
Create img and mask: 174
Create img and mask: 175
Create img and mask: 176
Create img and mask: 177
Create img and mask: 178
Create img and mask: 179
Create img and mask: 180
Create img and mask: 181
Create img and mask: 182
Create img and mask: 183
Create img and mask: 184
Create img and mask: 185
Create img and mask: 186
Create img and mask: 187
Create img and mask: 188
Create img and mask: 189
Create img and mask: 190
Create img and mask: 191
Create img and mask: 192
Create img and mask: 193
Create img and mask: 194
Create img and mask: 195
Create img and mask: 196
Create img and mask: 197
Create img and mask: 198
Create img and mask: 199
Create img and mask: 200
Create img and mask: 201
Create img and mask: 202
Create img and mask: 203
Create img and mask: 204
Create img and mask: 205
Create img and mask: 206
Create img and mask: 207
Create img and mask: 208
Create img and mask: 209
Create img and mask: 210
Create img and mask: 211
Create img and mask: 212
Create img and mask: 213
Create img and mask: 214
Create img and mask: 215
Create img and mask: 216
Create img and mask: 217
Create img and mask: 218
Create img and mask: 219
Create img and mask: 220
Create img and mask: 221
Create img and mask: 222
Create img and mask: 223
Create img and mask: 224
Create img and mask: 225
Create img and mask: 226
Create img and mask: 227
Create img and mask: 228
Create img and mask: 229
Create img and mask: 230
Create img and mask: 231
Create img and mask: 232
Create img and mask: 233
Create img and mask: 234
Create img and mask: 235
Create img and mask: 236
Create img and mask: 237
Create img and mask: 238
Create img and mask: 239
Create img and mask: 240
Create img and mask: 241
Create img and mask: 242
Create img and mask: 243
Create img and mask: 244
Create img and mask: 245
Create img and mask: 246
Create img and mask: 247
Create img and mask: 248
Create img and mask: 249
Create img and mask: 250
Create img and mask: 251
Create img and mask: 252
Create img and mask: 253
Create img and mask: 254
Create img and mask: 255
Create img and mask: 256
Create img and mask: 257
Create img and mask: 258
Create img and mask: 259
Create img and mask: 260
Create img and mask: 261
Create img and mask: 262
Create img and mask: 263
Create img and mask: 264
Create img and mask: 265
Create img and mask: 266
Create img and mask: 267
Create img and mask: 268
Create img and mask: 269
Create img and mask: 270
Create img and mask: 271
Create img and mask: 272
Create img and mask: 273
Create img and mask: 274
Create img and mask: 275
Create img and mask: 276
Create img and mask: 277
Create img and mask: 278
Create img and mask: 279
Create img and mask: 280
Create img and mask: 281
Create img and mask: 282
Create img and mask: 283
Create img and mask: 284
Create img and mask: 285
Create img and mask: 286
Create img and mask: 287
Create img and mask: 288
Create img and mask: 289
Create img and mask: 290
Create img and mask: 291
Create img and mask: 292
Create img and mask: 293
Create img and mask: 294
Create img and mask: 295
Create img and mask: 296
Create img and mask: 297
Create img and mask: 298
Create img and mask: 299
Create img and mask: 300
Create img and mask: 301
Create img and mask: 302
Create img and mask: 303
Create img and mask: 304
Create img and mask: 305
Create img and mask: 306
Create img and mask: 307
Create img and mask: 308
Create img and mask: 309
Create img and mask: 310
Create img and mask: 311
Create img and mask: 312
Create img and mask: 313
Create img and mask: 314
Create img and mask: 315
Create img and mask: 316
Create img and mask: 317
Create img and mask: 318
Create img and mask: 319
Create img and mask: 320
Create img and mask: 321
Create img and mask: 322
Create img and mask: 323
Create img and mask: 324
Create img and mask: 325
Create img and mask: 326
Create img and mask: 327
Create img and mask: 328
Create img and mask: 329
Create img and mask: 330
Create img and mask: 331
Create img and mask: 332
Create img and mask: 333
Create img and mask: 334
Create img and mask: 335
Create img and mask: 336
Create img and mask: 337
Create img and mask: 338
Create img and mask: 339
Create img and mask: 340
Create img and mask: 341
Create img and mask: 342
Create img and mask: 343
Create img and mask: 344
Create img and mask: 345
Create img and mask: 346
Create img and mask: 347
Create img and mask: 348
Create img and mask: 349
Create img and mask: 350
Create img and mask: 351
Create img and mask: 352
Create img and mask: 353
Create img and mask: 354
Create img and mask: 355
Create img and mask: 356
Create img and mask: 357
Create img and mask: 358
Create img and mask: 359
Create img and mask: 360
Create img and mask: 361
Create img and mask: 362
Create img and mask: 363
Create img and mask: 364
Create img and mask: 365
Create img and mask: 366
Create img and mask: 367
Create img and mask: 368
Create img and mask: 369
Create img and mask: 370
Create img and mask: 371
Create img and mask: 372
Create img and mask: 373
Create img and mask: 374
Create img and mask: 375
Create img and mask: 376
Create img and mask: 377
Create img and mask: 378
Create img and mask: 379
Create img and mask: 380
Create img and mask: 381
Create img and mask: 382
Create img and mask: 383
Create img and mask: 384
Create img and mask: 385
Create img and mask: 386
Create img and mask: 387
Create img and mask: 388
Create img and mask: 389
Create img and mask: 390
Create img and mask: 391
Create img and mask: 392
Create img and mask: 393
Create img and mask: 394
Create img and mask: 395
Create img and mask: 396
Create img and mask: 397
Create img and mask: 398
Create img and mask: 399
Create img and mask: 400
Create img and mask: 401
Create img and mask: 402
Create img and mask: 403
Create img and mask: 404
Create img and mask: 405
Create img and mask: 406
Create img and mask: 407
Create img and mask: 408
Create img and mask: 409
Create img and mask: 410
Create img and mask: 411
Create img and mask: 412
Create img and mask: 413
Create img and mask: 414
Create img and mask: 415
Create img and mask: 416
Create img and mask: 417
Create img and mask: 418
Create img and mask: 419
Create img and mask: 420
Create img and mask: 421
Create img and mask: 422
Create img and mask: 423
Create img and mask: 424
Create img and mask: 425
Create img and mask: 426
Create img and mask: 427
Create img and mask: 428
Create img and mask: 429
Create img and mask: 430
Create img and mask: 431
Create img and mask: 432
Create img and mask: 433
Create img and mask: 434
Create img and mask: 435
Create img and mask: 436
Create img and mask: 437
Create img and mask: 438
Create img and mask: 439
Create img and mask: 440
Create img and mask: 441
Create img and mask: 442
Create img and mask: 443
Create img and mask: 444
Create img and mask: 445
Create img and mask: 446
Create img and mask: 447
Create img and mask: 448
Create img and mask: 449
Create img and mask: 450
Create img and mask: 451
Create img and mask: 452
Create img and mask: 453
Create img and mask: 454
Create img and mask: 455
Create img and mask: 456
Create img and mask: 457
Create img and mask: 458
Create img and mask: 459
Create img and mask: 460
Create img and mask: 461
Create img and mask: 462
Create img and mask: 463
Create img and mask: 464
Create img and mask: 465
Create img and mask: 466
Create img and mask: 467
Create img and mask: 468
Create img and mask: 469
Create img and mask: 470
Create img and mask: 471
Create img and mask: 472
Create img and mask: 473
Create img and mask: 474
Create img and mask: 475
Create img and mask: 476
Create img and mask: 477
Create img and mask: 478
Create img and mask: 479
Create img and mask: 480
Create img and mask: 481
Create img and mask: 482
Create img and mask: 483
Create img and mask: 484
Create img and mask: 485
Create img and mask: 486
Create img and mask: 487
Create img and mask: 488
Create img and mask: 489
Create img and mask: 490
Create img and mask: 491
Create img and mask: 492
Create img and mask: 493
Create img and mask: 494
Create img and mask: 495
Create img and mask: 496
Create img and mask: 497
Create img and mask: 498
Create img and mask: 499
Create img and mask: 500
Create img and mask: 501
Create img and mask: 502
Create img and mask: 503
Create img and mask: 504
Create img and mask: 505
Create img and mask: 506
Create img and mask: 507
Create img and mask: 508
Create img and mask: 509
Create img and mask: 510
Create img and mask: 511
Create img and mask: 512
Create img and mask: 513
Create img and mask: 514
Create img and mask: 515
Create img and mask: 516
Create img and mask: 517
Create img and mask: 518
Create img and mask: 519
Create img and mask: 520
Create img and mask: 521
Create img and mask: 522
Create img and mask: 523
Create img and mask: 524
Create img and mask: 525
Create img and mask: 526
Create img and mask: 527
Create img and mask: 528
Create img and mask: 529
Create img and mask: 530
Create img and mask: 531
Create img and mask: 532
Create img and mask: 533
Create img and mask: 534
Create img and mask: 535
Create img and mask: 536
Create img and mask: 537
Create img and mask: 538
Create img and mask: 539
Create img and mask: 540
Create img and mask: 541
Create img and mask: 542
Create img and mask: 543
Create img and mask: 544
Create img and mask: 545
Create img and mask: 546
Create img and mask: 547
Create img and mask: 548
Create img and mask: 549
Create img and mask: 550
Create img and mask: 551
Create img and mask: 552
Create img and mask: 553
Create img and mask: 554
Create img and mask: 555
Create img and mask: 556
Create img and mask: 557
Create img and mask: 558
Create img and mask: 559
Create img and mask: 560
Create img and mask: 561
Create img and mask: 562
Create img and mask: 563
Create img and mask: 564
Create img and mask: 565
Create img and mask: 566
Create img and mask: 567
Create img and mask: 568
Create img and mask: 569
Create img and mask: 570
Create img and mask: 571
Create img and mask: 572
Create img and mask: 573
Create img and mask: 574
Create img and mask: 575
Create img and mask: 576
Create img and mask: 577
Create img and mask: 578
Create img and mask: 579
Create img and mask: 580
Create img and mask: 581
Create img and mask: 582
Create img and mask: 583
Create img and mask: 584
Create img and mask: 585
Create img and mask: 586
Create img and mask: 587
Create img and mask: 588
Create img and mask: 589
Create img and mask: 590
Create img and mask: 591
Create img and mask: 592
Create img and mask: 593
Create img and mask: 594
Create img and mask: 595
Create img and mask: 596
Create img and mask: 597
Create img and mask: 598
Create img and mask: 599
Create img and mask: 600
Create img and mask: 601
Create img and mask: 602
Create img and mask: 603
Create img and mask: 604
Create img and mask: 605
Create img and mask: 606
Create img and mask: 607
Create img and mask: 608
Create img and mask: 609
Create img and mask: 610
Create img and mask: 611
Create img and mask: 612
Create img and mask: 613
Create img and mask: 614
Create img and mask: 615
Create img and mask: 616
Create img and mask: 617
Create img and mask: 618
Create img and mask: 619
Create img and mask: 620
Create img and mask: 621
Create img and mask: 622
Create img and mask: 623
Create img and mask: 624
Create img and mask: 625
Create img and mask: 626
Create img and mask: 627
Create img and mask: 628
Create img and mask: 629
Create img and mask: 630
Create img and mask: 631
Create img and mask: 632
Create img and mask: 633
Create img and mask: 634
Create img and mask: 635
Create img and mask: 636
Create img and mask: 637
Create img and mask: 638
Create img and mask: 639
Create img and mask: 640
Create img and mask: 641
Create img and mask: 642
Create img and mask: 643
Create img and mask: 644
Create img and mask: 645
Create img and mask: 646
Create img and mask: 647
Create img and mask: 648
Create img and mask: 649
Create img and mask: 650
Create img and mask: 651
Create img and mask: 652
Create img and mask: 653
Create img and mask: 654
Create img and mask: 655
Create img and mask: 656
Create img and mask: 657
Create img and mask: 658
Create img and mask: 659
Create img and mask: 660
Create img and mask: 661
Create img and mask: 662
Create img and mask: 663
Create img and mask: 664
Create img and mask: 665
Create img and mask: 666
Create img and mask: 667
Create img and mask: 668
Create img and mask: 669
Create img and mask: 670
Create img and mask: 671
Create img and mask: 672
Create img and mask: 673
Create img and mask: 674
Create img and mask: 675
Create img and mask: 676
Create img and mask: 677
Create img and mask: 678
Create img and mask: 679
Create img and mask: 680
Create img and mask: 681
Create img and mask: 682
Create img and mask: 683
Create img and mask: 684
Create img and mask: 685
Create img and mask: 686
Create img and mask: 687
Create img and mask: 688
Create img and mask: 689
Create img and mask: 690
Create img and mask: 691
Create img and mask: 692
Create img and mask: 693
Create img and mask: 694
Create img and mask: 695
Create img and mask: 696
Create img and mask: 697
Create img and mask: 698
Create img and mask: 699
Create img and mask: 700
Create img and mask: 701
Create img and mask: 702
Create img and mask: 703
Create img and mask: 704
Create img and mask: 705
Create img and mask: 706
Create img and mask: 707
Create img and mask: 708
Create img and mask: 709
Create img and mask: 710
Create img and mask: 711
Create img and mask: 712
Create img and mask: 713
Create img and mask: 714
Create img and mask: 715
Create img and mask: 716
Create img and mask: 717
Create img and mask: 718
Create img and mask: 719
Create img and mask: 720
Create img and mask: 721
Create img and mask: 722
Create img and mask: 723
Create img and mask: 724
Create img and mask: 725
Create img and mask: 726
Create img and mask: 727
Create img and mask: 728
Create img and mask: 729
Create img and mask: 730
Create img and mask: 731
Create img and mask: 732
Create img and mask: 733
Create img and mask: 734
Create img and mask: 735
Create img and mask: 736
Create img and mask: 737
Create img and mask: 738
Create img and mask: 739
Create img and mask: 740
Create img and mask: 741
Create img and mask: 742
Create img and mask: 743
Create img and mask: 744
Create img and mask: 745
Create img and mask: 746
Create img and mask: 747
Create img and mask: 748
Create img and mask: 749
Create img and mask: 750
Create img and mask: 751
Create img and mask: 752
Create img and mask: 753
Create img and mask: 754
Create img and mask: 755
Create img and mask: 756
Create img and mask: 757
Create img and mask: 758
Create img and mask: 759
Create img and mask: 760
Create img and mask: 761
Create img and mask: 762
Create img and mask: 763
Create img and mask: 764
Create img and mask: 765
Create img and mask: 766
Create img and mask: 767
Create img and mask: 768
Create img and mask: 769
Create img and mask: 770
Create img and mask: 771
Create img and mask: 772
Create img and mask: 773
Create img and mask: 774
Create img and mask: 775
Create img and mask: 776
Create img and mask: 777
Create img and mask: 778
Create img and mask: 779
Create img and mask: 780
Create img and mask: 781
Create img and mask: 782
Create img and mask: 783
Create img and mask: 784
Create img and mask: 785
Create img and mask: 786
Create img and mask: 787
Create img and mask: 788
Create img and mask: 789
Create img and mask: 790
Create img and mask: 791
Create img and mask: 792
Create img and mask: 793
Create img and mask: 794
Create img and mask: 795
Create img and mask: 796
Create img and mask: 797
Create img and mask: 798
Create img and mask: 799
Create img and mask: 800
Create img and mask: 801
Create img and mask: 802
Create img and mask: 803
Create img and mask: 804
Create img and mask: 805
Create img and mask: 806
Create img and mask: 807
Create img and mask: 808
Create img and mask: 809
Create img and mask: 810
Create img and mask: 811
Create img and mask: 812
Create img and mask: 813
Create img and mask: 814
Create img and mask: 815
Create img and mask: 816
Create img and mask: 817
Create img and mask: 818
Create img and mask: 819
Create img and mask: 820
Create img and mask: 821
Create img and mask: 822
Create img and mask: 823
Create img and mask: 824
Create img and mask: 825
Create img and mask: 826
Create img and mask: 827
Create img and mask: 828
Create img and mask: 829
Create img and mask: 830
Create img and mask: 831
Create img and mask: 832
Create img and mask: 833
Create img and mask: 834
Create img and mask: 835
Create img and mask: 836
Create img and mask: 837
Create img and mask: 838
Create img and mask: 839
Create img and mask: 840
Create img and mask: 841
Create img and mask: 842
Create img and mask: 843
Create img and mask: 844
Create img and mask: 845
Create img and mask: 846
Create img and mask: 847
Create img and mask: 848
Create img and mask: 849
Create img and mask: 850
Create img and mask: 851
Create img and mask: 852
Create img and mask: 853
Create img and mask: 854
Create img and mask: 855
Create img and mask: 856
Create img and mask: 857
Create img and mask: 858
Create img and mask: 859
Create img and mask: 860
Create img and mask: 861
Create img and mask: 862
Create img and mask: 863
Create img and mask: 864
Create img and mask: 865
Create img and mask: 866
Create img and mask: 867
Create img and mask: 868
Create img and mask: 869
Create img and mask: 870
Create img and mask: 871
Create img and mask: 872
Create img and mask: 873
Create img and mask: 874
Create img and mask: 875
Create img and mask: 876
Create img and mask: 877
Create img and mask: 878
Create img and mask: 879
Create img and mask: 880
Create img and mask: 881
Create img and mask: 882
Create img and mask: 883
Create img and mask: 884
Create img and mask: 885
Create img and mask: 886
Create img and mask: 887
Create img and mask: 888
Create img and mask: 889
Create img and mask: 890
Create img and mask: 891
Create img and mask: 892
Create img and mask: 893
Create img and mask: 894
Create img and mask: 895
Create img and mask: 896
Create img and mask: 897
Create img and mask: 898
Create img and mask: 899
Create img and mask: 900
Create img and mask: 901
Create img and mask: 902
Create img and mask: 903
Create img and mask: 904
Create img and mask: 905
Create img and mask: 906
Create img and mask: 907
Create img and mask: 908
Create img and mask: 909
Create img and mask: 910
Create img and mask: 911
Create img and mask: 912
Create img and mask: 913
Create img and mask: 914
Create img and mask: 915
Create img and mask: 916
Create img and mask: 917
Create img and mask: 918
Create img and mask: 919
Create img and mask: 920
Create img and mask: 921
Create img and mask: 922
Create img and mask: 923
Create img and mask: 924
Create img and mask: 925
Create img and mask: 926
Create img and mask: 927
Create img and mask: 928
Create img and mask: 929
Create img and mask: 930
Create img and mask: 931
Create img and mask: 932
Create img and mask: 933
Create img and mask: 934
Create img and mask: 935
Create img and mask: 936
Create img and mask: 937
Create img and mask: 938
Create img and mask: 939
Create img and mask: 940
Create img and mask: 941
Create img and mask: 942
Create img and mask: 943
Create img and mask: 944
Create img and mask: 945
Create img and mask: 946
Create img and mask: 947
Create img and mask: 948
Create img and mask: 949
Create img and mask: 950
Create img and mask: 951
Create img and mask: 952
Create img and mask: 953
Create img and mask: 954
Create img and mask: 955
Create img and mask: 956
Create img and mask: 957
Create img and mask: 958
Create img and mask: 959
Create img and mask: 960
Create img and mask: 961
Create img and mask: 962
Create img and mask: 963
Create img and mask: 964
Create img and mask: 965
Create img and mask: 966
Create img and mask: 967
Create img and mask: 968
Create img and mask: 969
Create img and mask: 970
Create img and mask: 971
Create img and mask: 972
Create img and mask: 973
Create img and mask: 974
Create img and mask: 975
Create img and mask: 976
Create img and mask: 977
Create img and mask: 978
Create img and mask: 979
Create img and mask: 980
Create img and mask: 981
Create img and mask: 982
Create img and mask: 983
Create img and mask: 984
Create img and mask: 985
Create img and mask: 986
Create img and mask: 987
Create img and mask: 988
Create img and mask: 989
Create img and mask: 990
Create img and mask: 991
Create img and mask: 992
Create img and mask: 993
Create img and mask: 994
Create img and mask: 995
Create img and mask: 996
Create img and mask: 997
Create img and mask: 998
Create img and mask: 999
Create img and mask: 1000
Create img and mask: 1001
Create img and mask: 1002
Create img and mask: 1003
Create img and mask: 1004
Create img and mask: 1005
Create img and mask: 1006
Create img and mask: 1007
Create img and mask: 1008
Create img and mask: 1009
Create img and mask: 1010
Create img and mask: 1011
Create img and mask: 1012
Create img and mask: 1013
Create img and mask: 1014
Create img and mask: 1015
Create img and mask: 1016
Create img and mask: 1017
Create img and mask: 1018
Create img and mask: 1019
Create img and mask: 1020
Create img and mask: 1021
Create img and mask: 1022
Create img and mask: 1023
Create img and mask: 1024
Create img and mask: 1025
Create img and mask: 1026
Create img and mask: 1027
Create img and mask: 1028
Create img and mask: 1029
Create img and mask: 1030
Create img and mask: 1031
Create img and mask: 1032
Create img and mask: 1033
Create img and mask: 1034
Create img and mask: 1035
Create img and mask: 1036
Create img and mask: 1037
Create img and mask: 1038
Create img and mask: 1039
Create img and mask: 1040
Create img and mask: 1041
Create img and mask: 1042
Create img and mask: 1043
Create img and mask: 1044
Create img and mask: 1045
Create img and mask: 1046
Create img and mask: 1047
Create img and mask: 1048
Create img and mask: 1049
Create img and mask: 1050
Create img and mask: 1051
Create img and mask: 1052
Create img and mask: 1053
Create img and mask: 1054
Create img and mask: 1055
Create img and mask: 1056
Create img and mask: 1057
Create img and mask: 1058
Create img and mask: 1059
Create img and mask: 1060
Create img and mask: 1061
Create img and mask: 1062
Create img and mask: 1063
Create img and mask: 1064
Create img and mask: 1065
Create img and mask: 1066
Create img and mask: 1067
Create img and mask: 1068
Create img and mask: 1069
Create img and mask: 1070
Create img and mask: 1071
Create img and mask: 1072
Create img and mask: 1073
Create img and mask: 1074
Create img and mask: 1075
Create img and mask: 1076
Create img and mask: 1077
Create img and mask: 1078
Create img and mask: 1079
Create img and mask: 1080
Create img and mask: 1081
Create img and mask: 1082
Create img and mask: 1083
Create img and mask: 1084
Create img and mask: 1085
Create img and mask: 1086
Create img and mask: 1087
Create img and mask: 1088
Create img and mask: 1089
Create img and mask: 1090
Create img and mask: 1091
Create img and mask: 1092
Create img and mask: 1093
Create img and mask: 1094
Create img and mask: 1095
Create img and mask: 1096
Create img and mask: 1097
Create img and mask: 1098
Create img and mask: 1099
Create img and mask: 1100
Create img and mask: 1101
Create img and mask: 1102
Create img and mask: 1103
Create img and mask: 1104
Create img and mask: 1105
Create img and mask: 1106
Create img and mask: 1107
Create img and mask: 1108
Create img and mask: 1109
Create img and mask: 1110
Create img and mask: 1111
Create img and mask: 1112
Create img and mask: 1113
Create img and mask: 1114
Create img and mask: 1115
Create img and mask: 1116
Create img and mask: 1117
Create img and mask: 1118
Create img and mask: 1119
Create img and mask: 1120
Create img and mask: 1121
Create img and mask: 1122
Create img and mask: 1123
Create img and mask: 1124
Create img and mask: 1125
Create img and mask: 1126
Create img and mask: 1127
Create img and mask: 1128
Create img and mask: 1129
Create img and mask: 1130
Create img and mask: 1131
Create img and mask: 1132
Create img and mask: 1133
Create img and mask: 1134
Create img and mask: 1135
Create img and mask: 1136
Create img and mask: 1137
Create img and mask: 1138
Create img and mask: 1139
Create img and mask: 1140
Create img and mask: 1141
Create img and mask: 1142
Create img and mask: 1143
Create img and mask: 1144
Create img and mask: 1145
Create img and mask: 1146
Create img and mask: 1147
Create img and mask: 1148
Create img and mask: 1149
Create img and mask: 1150
Create img and mask: 1151
Create img and mask: 1152
Create img and mask: 1153
Create img and mask: 1154
Create img and mask: 1155
Create img and mask: 1156
Create img and mask: 1157
Create img and mask: 1158
Create img and mask: 1159
Create img and mask: 1160
Create img and mask: 1161
Create img and mask: 1162
Create img and mask: 1163
Create img and mask: 1164
Create img and mask: 1165
Create img and mask: 1166
Create img and mask: 1167
Create img and mask: 1168
Create img and mask: 1169
Create img and mask: 1170
Create img and mask: 1171
Create img and mask: 1172
Create img and mask: 1173
Create img and mask: 1174
Create img and mask: 1175
Create img and mask: 1176
Create img and mask: 1177
Create img and mask: 1178
Create img and mask: 1179
Create img and mask: 1180
Create img and mask: 1181
Create img and mask: 1182
Create img and mask: 1183
Create img and mask: 1184
Create img and mask: 1185
Create img and mask: 1186
Create img and mask: 1187
Create img and mask: 1188
Create img and mask: 1189
Create img and mask: 1190
Create img and mask: 1191
Create img and mask: 1192
Create img and mask: 1193
Create img and mask: 1194
Create img and mask: 1195
Create img and mask: 1196
Create img and mask: 1197
Create img and mask: 1198

Finally we can import the created patches and plot them with matplotlib:

In [ ]:
X = []
images_files = [f for f in os.listdir('/content/data/') if f.startswith('img')]
images_files.sort()
for files in images_files:
  import_raster = os.path.join('/content/data/',files)
  with rasterio.open(import_raster) as src:
    im = src.read()
    im = im.transpose([1,2,0])
    im = cv2.resize(im, (256,256))
  X.append(im)
X = np.array(X)
print(X.shape)
(1198, 256, 256, 3)
In [ ]:
Y = []
images_files = [f for f in os.listdir('/content/data/') if f.startswith('msk')]
images_files.sort()
for files in images_files:
  import_raster = os.path.join('/content/data/',files)
  with rasterio.open(import_raster) as src:
    im = src.read()
    im = im.transpose([1,2,0])
    im = cv2.resize(im, (256,256))
  Y.append(im)
Y = np.array(Y)
print(Y.shape)
(1198, 256, 256)
In [ ]:
plt.figure(figsize=[6,6])
plt.imshow(X[80,:,:,0:3])
plt.axis('off')
Out[ ]:
(-0.5, 255.5, 255.5, -0.5)
No description has been provided for this image
In [ ]:
plt.figure(figsize=[6,6])
plt.imshow(np.round(Y[80,:,:]))
plt.axis('off')
Out[ ]:
(-0.5, 255.5, 255.5, -0.5)
No description has been provided for this image

Building Segmentation with the Open Cities AI Challenge Dataset¶

"In this challenge, you will be segmenting houses and buildings from aerial imagery. The data consists of drone images from 10 different cities and regions across Africa. Your goal is to classify the presence or absence of a building on a pixel-by-pixel basis."

image.png

In this example, we will use the data prepared earlier to create a segmentation model for houses and buildings.

Let's plot again an example of an image and its respective mask:

In [ ]:
i = 1000
plt.figure(figsize=[20,20])
plt.subplot(121)
plt.imshow(X[i,:,:,:])
plt.title('RGB Image')
plt.axis('off')
plt.subplot(122)
plt.imshow(Y[i,:,:])
plt.title('True Image')
plt.axis('off')
Out[ ]:
(-0.5, 255.5, 255.5, -0.5)
No description has been provided for this image

Now let's prepare the data to feed the neural network, dividing it into training and testing data, rescaling the values ​​and also importing some functions from Keras:

In [ ]:
x_train, x_test, y_train, y_test = train_test_split(X, Y, test_size=0.3, random_state=10)
In [ ]:
x_train = x_train/255
x_test = x_test/255
In [ ]:
y_train = y_train.astype('float')
y_test = y_test.astype('float')
In [ ]:
from keras.models import Model
from keras.regularizers import l2
from keras.layers import *
from keras.models import *
import keras.backend as K
import tensorflow as tf
#from tensorflow.keras.optimizers import Adam
#from tensorflow.keras.optimizers.legacy import Adam
from keras.optimizers import Adam
from keras.losses import binary_crossentropy
from tensorflow.keras.losses import Dice

Let's apply data augmentation to generate more samples.

In [ ]:
img_datagen = ImageDataGenerator(
    rotation_range=90,
    vertical_flip = True,
    horizontal_flip=True)

mask_datagen = ImageDataGenerator(
    rotation_range=90,
    vertical_flip = True,
    horizontal_flip=True)
In [ ]:
img_datagen.fit(x_train, augment=True,seed=1200)
mask_datagen.fit(y_train[:,:,:,np.newaxis], augment=True,seed=1200)
In [ ]:
train_generator=img_datagen.flow(x_train,y_train[:,:,:,np.newaxis],batch_size=8,seed=1200)
In [ ]:
steps_per_epoch = len(x_train)//8
validation_steps = len(x_test)//8

ResUnet¶

RESUNET is a fully convolutional neural network designed to achieve high performance with fewer parameters. It is an improvement over the existing UNET architecture. RESUNET leverages the UNET architecture and Deep Residual Learning.

image.png

RESUNET Advantages:¶

The use of residual blocks helps in building a deeper network without worrying about the problem of gradient vanishing or gradient explosion. It also helps in easy training of the network. The rich skip connections in RESUNET help in better flow of information between different layers, which helps in better flow of gradients during training (backpropagation).

General architecture¶

RESUNET consists of an encoding network, a decoding network, and a bridge connecting both networks, like a U-Net. U-Net uses two 3 x 3 convolutions, each followed by a ReLU activation function. In the case of RESUNET, these layers are replaced by a pre-activated residual block.

  • Encoder:

The encoder takes the input image and passes it through different encoder blocks, which helps the network learn an abstract representation. The encoder consists of three encoder blocks, which are constructed using the pre-activated residual block. The output of each encoder block acts as a skip connection to the corresponding decoder block.

To reduce the spatial dimensions (height and width) of the feature maps, the first 3×3 convolution layer uses a stride of 2 in the second and third encoder blocks. A stride value of 2 reduces the spatial dimensions by half, i.e., from 256 to 128.

  • Bridge:

The bridge also consists of a pre-activated residual block with a stride value of 2.

  • Decoder:

The decoder takes the feature map from the bridge and the skip connections from different encoder blocks and learns a better semantic representation, which is used to generate a segmentation mask.

The decoder consists of three decoder blocks, and after each block, the spatial dimensions of the feature map are doubled and the number of feature channels is reduced.

Let's implement ResUnet using keras:

In [ ]:
def conv_block(input_tensor, filters, strides, d_rates):
    x = Conv2D(filters[0], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[0])(input_tensor)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[1], kernel_size=3, strides=strides, kernel_initializer='he_uniform', padding='same', dilation_rate=d_rates[1])(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[2], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[2])(x)
    x = BatchNormalization()(x)

    shortcut = Conv2D(filters[2], kernel_size=1, kernel_initializer='he_uniform', strides=strides)(input_tensor)
    shortcut = BatchNormalization()(shortcut)

    x = add([x, shortcut])
    x = Activation('relu')(x)

    return x


def identity_block(input_tensor, filters, d_rates):
    x = Conv2D(filters[0], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[0])(input_tensor)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[1], kernel_size=3, kernel_initializer='he_uniform', padding='same', dilation_rate=d_rates[1])(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    x = Conv2D(filters[2], kernel_size=1, kernel_initializer='he_uniform', dilation_rate=d_rates[2])(x)
    x = BatchNormalization()(x)

    x = add([x, input_tensor])
    x = Activation('relu')(x)

    return x

def one_side_pad(x):
    x = ZeroPadding2D((1, 1))(x)
    x = Lambda(lambda x: x[:, :-1, :-1, :])(x)
    return x
In [ ]:
droprate = 0.2
inputs = Input(shape=x_train.shape[1:])
conv_1 = Conv2D(32, (3, 3), strides=(1, 1), kernel_initializer='he_uniform', padding='same')(inputs)
conv_1 = BatchNormalization()(conv_1)
conv_1 = Activation("relu")(conv_1)
f1 = conv_1

conv_2 = Conv2D(64, (3, 3), strides=(2, 2), kernel_initializer='he_uniform', padding='same')(conv_1)
conv_2 = BatchNormalization()(conv_2)
conv_2 = Activation("relu")(conv_2)

conv_3 = Conv2D(64, (3, 3), strides=(1, 1), kernel_initializer='he_uniform', padding='same')(conv_2)
conv_3 = BatchNormalization()(conv_3)
conv_3 = Activation("relu")(conv_3)

f2 = conv_3


pool_1 = MaxPooling2D((2, 2), strides=(2, 2))(conv_3)

conv_block1 = conv_block(pool_1, filters=[64, 64, 128], strides=(1, 1), d_rates=[1, 1, 1])
identity_block1 = identity_block(conv_block1, filters=[64, 64, 128], d_rates=[1, 2, 1])
identity_block2 = identity_block(identity_block1, filters=[64, 64, 128], d_rates=[1, 3, 1])
f3 = identity_block2

conv_block2 = conv_block(identity_block2, filters=[128, 128, 256], strides=(2, 2), d_rates=[1, 1, 1])
identity_block3 = identity_block(conv_block2, filters=[128, 128, 256], d_rates=[1, 2, 1])
identity_block4 = identity_block(identity_block3, filters=[128, 128, 256], d_rates=[1, 3, 1])
identity_block5 = identity_block(identity_block4, filters=[128, 128, 256], d_rates=[1, 4, 1])
f4 = identity_block5


identity_block10 = conv_block(identity_block5, filters=[256, 256, 512], strides=(2, 2), d_rates=[1, 1, 1])
for i in range(5):
  identity_block10 = identity_block(identity_block10, filters=[256, 256, 512], d_rates=[1, 2, 1])

f5 = identity_block10

conv_block4 = conv_block(identity_block10, filters=[512, 512, 1024], strides=(2, 2), d_rates=[1, 1, 1])
identity_block11 = identity_block(conv_block4, filters=[512, 512, 1024], d_rates=[1, 4, 1])
identity_block12 = identity_block(identity_block11, filters=[512, 512, 1024], d_rates=[1, 4, 1])
f6 = identity_block12

o = f6

o = (BatchNormalization())(o)
o = Conv2D(1024, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(512, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)


o = Conv2DTranspose(512, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f5]))
o = (BatchNormalization())(o)
o = Conv2D(512, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(256, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)



o = Conv2DTranspose(256, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f4]))
o = (BatchNormalization())(o)
o = Conv2D(256, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(128, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)



o = Conv2DTranspose(128, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f3]))
o = (BatchNormalization())(o)
o = Conv2D(128, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(64, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)



o = Conv2DTranspose(64, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f2]))
o = (BatchNormalization())(o)
o = Conv2D(64, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(32, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Dropout(droprate)(o)


o = Conv2DTranspose(32, (2, 2), strides=(2, 2), padding='same')(o)
o = (concatenate([o, f1]))
o = (BatchNormalization())(o)
o = Conv2D(32, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)
o = Conv2D(32, (3, 3), activation='relu', kernel_initializer='he_uniform', padding='same')(o)


o = Conv2D(1, (3, 3), padding='same', activation='sigmoid')(o)

model = Model(inputs=inputs, outputs=o)
model.compile(optimizer=Adam(learning_rate = 1e-5), loss = Dice, metrics = ['accuracy'])
model.summary()
Model: "functional"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Layer (type)              ┃ Output Shape           ┃        Param # ┃ Connected to           ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━┩
│ input_layer (InputLayer)  │ (None, 320, 320, 3)    │              0 │ -                      │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d (Conv2D)           │ (None, 320, 320, 32)   │            896 │ input_layer[0][0]      │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization       │ (None, 320, 320, 32)   │            128 │ conv2d[0][0]           │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation (Activation)   │ (None, 320, 320, 32)   │              0 │ batch_normalization[0… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_1 (Conv2D)         │ (None, 160, 160, 64)   │         18,496 │ activation[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_1     │ (None, 160, 160, 64)   │            256 │ conv2d_1[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_1 (Activation) │ (None, 160, 160, 64)   │              0 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_2 (Conv2D)         │ (None, 160, 160, 64)   │         36,928 │ activation_1[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_2     │ (None, 160, 160, 64)   │            256 │ conv2d_2[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_2 (Activation) │ (None, 160, 160, 64)   │              0 │ batch_normalization_2… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ max_pooling2d             │ (None, 80, 80, 64)     │              0 │ activation_2[0][0]     │
│ (MaxPooling2D)            │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_3 (Conv2D)         │ (None, 80, 80, 64)     │          4,160 │ max_pooling2d[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_3     │ (None, 80, 80, 64)     │            256 │ conv2d_3[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_3 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_3… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_4 (Conv2D)         │ (None, 80, 80, 64)     │         36,928 │ activation_3[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_4     │ (None, 80, 80, 64)     │            256 │ conv2d_4[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_4 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_4… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_5 (Conv2D)         │ (None, 80, 80, 128)    │          8,320 │ activation_4[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_6 (Conv2D)         │ (None, 80, 80, 128)    │          8,320 │ max_pooling2d[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_5     │ (None, 80, 80, 128)    │            512 │ conv2d_5[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_6     │ (None, 80, 80, 128)    │            512 │ conv2d_6[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add (Add)                 │ (None, 80, 80, 128)    │              0 │ batch_normalization_5… │
│                           │                        │                │ batch_normalization_6… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_5 (Activation) │ (None, 80, 80, 128)    │              0 │ add[0][0]              │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_7 (Conv2D)         │ (None, 80, 80, 64)     │          8,256 │ activation_5[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_7     │ (None, 80, 80, 64)     │            256 │ conv2d_7[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_6 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_7… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_8 (Conv2D)         │ (None, 80, 80, 64)     │         36,928 │ activation_6[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_8     │ (None, 80, 80, 64)     │            256 │ conv2d_8[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_7 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_8… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_9 (Conv2D)         │ (None, 80, 80, 128)    │          8,320 │ activation_7[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_9     │ (None, 80, 80, 128)    │            512 │ conv2d_9[0][0]         │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_1 (Add)               │ (None, 80, 80, 128)    │              0 │ batch_normalization_9… │
│                           │                        │                │ activation_5[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_8 (Activation) │ (None, 80, 80, 128)    │              0 │ add_1[0][0]            │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_10 (Conv2D)        │ (None, 80, 80, 64)     │          8,256 │ activation_8[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_10    │ (None, 80, 80, 64)     │            256 │ conv2d_10[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_9 (Activation) │ (None, 80, 80, 64)     │              0 │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_11 (Conv2D)        │ (None, 80, 80, 64)     │         36,928 │ activation_9[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_11    │ (None, 80, 80, 64)     │            256 │ conv2d_11[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_10             │ (None, 80, 80, 64)     │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_12 (Conv2D)        │ (None, 80, 80, 128)    │          8,320 │ activation_10[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_12    │ (None, 80, 80, 128)    │            512 │ conv2d_12[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_2 (Add)               │ (None, 80, 80, 128)    │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_8[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_11             │ (None, 80, 80, 128)    │              0 │ add_2[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_13 (Conv2D)        │ (None, 80, 80, 128)    │         16,512 │ activation_11[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_13    │ (None, 80, 80, 128)    │            512 │ conv2d_13[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_12             │ (None, 80, 80, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_14 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_12[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_14    │ (None, 40, 40, 128)    │            512 │ conv2d_14[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_13             │ (None, 40, 40, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_15 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_13[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_16 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_11[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_15    │ (None, 40, 40, 256)    │          1,024 │ conv2d_15[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_16    │ (None, 40, 40, 256)    │          1,024 │ conv2d_16[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_3 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_1… │
│                           │                        │                │ batch_normalization_1… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_14             │ (None, 40, 40, 256)    │              0 │ add_3[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_17 (Conv2D)        │ (None, 40, 40, 128)    │         32,896 │ activation_14[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_17    │ (None, 40, 40, 128)    │            512 │ conv2d_17[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_15             │ (None, 40, 40, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_18 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_15[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_18    │ (None, 40, 40, 128)    │            512 │ conv2d_18[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_16             │ (None, 40, 40, 128)    │              0 │ batch_normalization_1… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_19 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_16[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_19    │ (None, 40, 40, 256)    │          1,024 │ conv2d_19[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_4 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_1… │
│                           │                        │                │ activation_14[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_17             │ (None, 40, 40, 256)    │              0 │ add_4[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_20 (Conv2D)        │ (None, 40, 40, 128)    │         32,896 │ activation_17[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_20    │ (None, 40, 40, 128)    │            512 │ conv2d_20[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_18             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_21 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_18[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_21    │ (None, 40, 40, 128)    │            512 │ conv2d_21[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_19             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_22 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_19[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_22    │ (None, 40, 40, 256)    │          1,024 │ conv2d_22[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_5 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_2… │
│                           │                        │                │ activation_17[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_20             │ (None, 40, 40, 256)    │              0 │ add_5[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_23 (Conv2D)        │ (None, 40, 40, 128)    │         32,896 │ activation_20[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_23    │ (None, 40, 40, 128)    │            512 │ conv2d_23[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_21             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_24 (Conv2D)        │ (None, 40, 40, 128)    │        147,584 │ activation_21[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_24    │ (None, 40, 40, 128)    │            512 │ conv2d_24[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_22             │ (None, 40, 40, 128)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_25 (Conv2D)        │ (None, 40, 40, 256)    │         33,024 │ activation_22[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_25    │ (None, 40, 40, 256)    │          1,024 │ conv2d_25[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_6 (Add)               │ (None, 40, 40, 256)    │              0 │ batch_normalization_2… │
│                           │                        │                │ activation_20[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_23             │ (None, 40, 40, 256)    │              0 │ add_6[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_26 (Conv2D)        │ (None, 40, 40, 256)    │         65,792 │ activation_23[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_26    │ (None, 40, 40, 256)    │          1,024 │ conv2d_26[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_24             │ (None, 40, 40, 256)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_27 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_24[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_27    │ (None, 20, 20, 256)    │          1,024 │ conv2d_27[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_25             │ (None, 20, 20, 256)    │              0 │ batch_normalization_2… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_28 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_25[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_29 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_23[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_28    │ (None, 20, 20, 512)    │          2,048 │ conv2d_28[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_29    │ (None, 20, 20, 512)    │          2,048 │ conv2d_29[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_7 (Add)               │ (None, 20, 20, 512)    │              0 │ batch_normalization_2… │
│                           │                        │                │ batch_normalization_2… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_26             │ (None, 20, 20, 512)    │              0 │ add_7[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_30 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_26[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_30    │ (None, 20, 20, 256)    │          1,024 │ conv2d_30[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_27             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_31 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_27[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_31    │ (None, 20, 20, 256)    │          1,024 │ conv2d_31[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_28             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_32 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_28[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_32    │ (None, 20, 20, 512)    │          2,048 │ conv2d_32[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_8 (Add)               │ (None, 20, 20, 512)    │              0 │ batch_normalization_3… │
│                           │                        │                │ activation_26[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_29             │ (None, 20, 20, 512)    │              0 │ add_8[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_33 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_29[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_33    │ (None, 20, 20, 256)    │          1,024 │ conv2d_33[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_30             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_34 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_30[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_34    │ (None, 20, 20, 256)    │          1,024 │ conv2d_34[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_31             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_35 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_31[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_35    │ (None, 20, 20, 512)    │          2,048 │ conv2d_35[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_9 (Add)               │ (None, 20, 20, 512)    │              0 │ batch_normalization_3… │
│                           │                        │                │ activation_29[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_32             │ (None, 20, 20, 512)    │              0 │ add_9[0][0]            │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_36 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_32[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_36    │ (None, 20, 20, 256)    │          1,024 │ conv2d_36[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_33             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_37 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_33[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_37    │ (None, 20, 20, 256)    │          1,024 │ conv2d_37[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_34             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_38 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_34[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_38    │ (None, 20, 20, 512)    │          2,048 │ conv2d_38[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_10 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_3… │
│                           │                        │                │ activation_32[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_35             │ (None, 20, 20, 512)    │              0 │ add_10[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_39 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_35[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_39    │ (None, 20, 20, 256)    │          1,024 │ conv2d_39[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_36             │ (None, 20, 20, 256)    │              0 │ batch_normalization_3… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_40 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_36[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_40    │ (None, 20, 20, 256)    │          1,024 │ conv2d_40[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_37             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_41 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_37[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_41    │ (None, 20, 20, 512)    │          2,048 │ conv2d_41[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_11 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_4… │
│                           │                        │                │ activation_35[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_38             │ (None, 20, 20, 512)    │              0 │ add_11[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_42 (Conv2D)        │ (None, 20, 20, 256)    │        131,328 │ activation_38[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_42    │ (None, 20, 20, 256)    │          1,024 │ conv2d_42[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_39             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_43 (Conv2D)        │ (None, 20, 20, 256)    │        590,080 │ activation_39[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_43    │ (None, 20, 20, 256)    │          1,024 │ conv2d_43[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_40             │ (None, 20, 20, 256)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_44 (Conv2D)        │ (None, 20, 20, 512)    │        131,584 │ activation_40[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_44    │ (None, 20, 20, 512)    │          2,048 │ conv2d_44[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_12 (Add)              │ (None, 20, 20, 512)    │              0 │ batch_normalization_4… │
│                           │                        │                │ activation_38[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_41             │ (None, 20, 20, 512)    │              0 │ add_12[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_45 (Conv2D)        │ (None, 20, 20, 512)    │        262,656 │ activation_41[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_45    │ (None, 20, 20, 512)    │          2,048 │ conv2d_45[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_42             │ (None, 20, 20, 512)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_46 (Conv2D)        │ (None, 10, 10, 512)    │      2,359,808 │ activation_42[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_46    │ (None, 10, 10, 512)    │          2,048 │ conv2d_46[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_43             │ (None, 10, 10, 512)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_47 (Conv2D)        │ (None, 10, 10, 1024)   │        525,312 │ activation_43[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_48 (Conv2D)        │ (None, 10, 10, 1024)   │        525,312 │ activation_41[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_47    │ (None, 10, 10, 1024)   │          4,096 │ conv2d_47[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_48    │ (None, 10, 10, 1024)   │          4,096 │ conv2d_48[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_13 (Add)              │ (None, 10, 10, 1024)   │              0 │ batch_normalization_4… │
│                           │                        │                │ batch_normalization_4… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_44             │ (None, 10, 10, 1024)   │              0 │ add_13[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_49 (Conv2D)        │ (None, 10, 10, 512)    │        524,800 │ activation_44[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_49    │ (None, 10, 10, 512)    │          2,048 │ conv2d_49[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_45             │ (None, 10, 10, 512)    │              0 │ batch_normalization_4… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_50 (Conv2D)        │ (None, 10, 10, 512)    │      2,359,808 │ activation_45[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_50    │ (None, 10, 10, 512)    │          2,048 │ conv2d_50[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_46             │ (None, 10, 10, 512)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_51 (Conv2D)        │ (None, 10, 10, 1024)   │        525,312 │ activation_46[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_51    │ (None, 10, 10, 1024)   │          4,096 │ conv2d_51[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_14 (Add)              │ (None, 10, 10, 1024)   │              0 │ batch_normalization_5… │
│                           │                        │                │ activation_44[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_47             │ (None, 10, 10, 1024)   │              0 │ add_14[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_52 (Conv2D)        │ (None, 10, 10, 512)    │        524,800 │ activation_47[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_52    │ (None, 10, 10, 512)    │          2,048 │ conv2d_52[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_48             │ (None, 10, 10, 512)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_53 (Conv2D)        │ (None, 10, 10, 512)    │      2,359,808 │ activation_48[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_53    │ (None, 10, 10, 512)    │          2,048 │ conv2d_53[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_49             │ (None, 10, 10, 512)    │              0 │ batch_normalization_5… │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_54 (Conv2D)        │ (None, 10, 10, 1024)   │        525,312 │ activation_49[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_54    │ (None, 10, 10, 1024)   │          4,096 │ conv2d_54[0][0]        │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ add_15 (Add)              │ (None, 10, 10, 1024)   │              0 │ batch_normalization_5… │
│                           │                        │                │ activation_47[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ activation_50             │ (None, 10, 10, 1024)   │              0 │ add_15[0][0]           │
│ (Activation)              │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_55    │ (None, 10, 10, 1024)   │          4,096 │ activation_50[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_55 (Conv2D)        │ (None, 10, 10, 1024)   │      9,438,208 │ batch_normalization_5… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_56 (Conv2D)        │ (None, 10, 10, 512)    │      4,719,104 │ conv2d_55[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout (Dropout)         │ (None, 10, 10, 512)    │              0 │ conv2d_56[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose          │ (None, 20, 20, 512)    │      1,049,088 │ dropout[0][0]          │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate (Concatenate) │ (None, 20, 20, 1024)   │              0 │ conv2d_transpose[0][0… │
│                           │                        │                │ activation_41[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_56    │ (None, 20, 20, 1024)   │          4,096 │ concatenate[0][0]      │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_57 (Conv2D)        │ (None, 20, 20, 512)    │      4,719,104 │ batch_normalization_5… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_58 (Conv2D)        │ (None, 20, 20, 256)    │      1,179,904 │ conv2d_57[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_1 (Dropout)       │ (None, 20, 20, 256)    │              0 │ conv2d_58[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_1        │ (None, 40, 40, 256)    │        262,400 │ dropout_1[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_1             │ (None, 40, 40, 512)    │              0 │ conv2d_transpose_1[0]… │
│ (Concatenate)             │                        │                │ activation_23[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_57    │ (None, 40, 40, 512)    │          2,048 │ concatenate_1[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_59 (Conv2D)        │ (None, 40, 40, 256)    │      1,179,904 │ batch_normalization_5… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_60 (Conv2D)        │ (None, 40, 40, 128)    │        295,040 │ conv2d_59[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_2 (Dropout)       │ (None, 40, 40, 128)    │              0 │ conv2d_60[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_2        │ (None, 80, 80, 128)    │         65,664 │ dropout_2[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_2             │ (None, 80, 80, 256)    │              0 │ conv2d_transpose_2[0]… │
│ (Concatenate)             │                        │                │ activation_11[0][0]    │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_58    │ (None, 80, 80, 256)    │          1,024 │ concatenate_2[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_61 (Conv2D)        │ (None, 80, 80, 128)    │        295,040 │ batch_normalization_5… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_62 (Conv2D)        │ (None, 80, 80, 64)     │         73,792 │ conv2d_61[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_3 (Dropout)       │ (None, 80, 80, 64)     │              0 │ conv2d_62[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_3        │ (None, 160, 160, 64)   │         16,448 │ dropout_3[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_3             │ (None, 160, 160, 128)  │              0 │ conv2d_transpose_3[0]… │
│ (Concatenate)             │                        │                │ activation_2[0][0]     │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_59    │ (None, 160, 160, 128)  │            512 │ concatenate_3[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_63 (Conv2D)        │ (None, 160, 160, 64)   │         73,792 │ batch_normalization_5… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_64 (Conv2D)        │ (None, 160, 160, 32)   │         18,464 │ conv2d_63[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ dropout_4 (Dropout)       │ (None, 160, 160, 32)   │              0 │ conv2d_64[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_transpose_4        │ (None, 320, 320, 32)   │          4,128 │ dropout_4[0][0]        │
│ (Conv2DTranspose)         │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ concatenate_4             │ (None, 320, 320, 64)   │              0 │ conv2d_transpose_4[0]… │
│ (Concatenate)             │                        │                │ activation[0][0]       │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ batch_normalization_60    │ (None, 320, 320, 64)   │            256 │ concatenate_4[0][0]    │
│ (BatchNormalization)      │                        │                │                        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_65 (Conv2D)        │ (None, 320, 320, 32)   │         18,464 │ batch_normalization_6… │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_66 (Conv2D)        │ (None, 320, 320, 32)   │          9,248 │ conv2d_65[0][0]        │
├───────────────────────────┼────────────────────────┼────────────────┼────────────────────────┤
│ conv2d_67 (Conv2D)        │ (None, 320, 320, 1)    │            289 │ conv2d_66[0][0]        │
└───────────────────────────┴────────────────────────┴────────────────┴────────────────────────┘
 Total params: 40,267,489 (153.61 MB)
 Trainable params: 40,227,105 (153.45 MB)
 Non-trainable params: 40,384 (157.75 KB)

And then we can train the model for 300 epochs:

In [ ]:
history = model.fit(train_generator,steps_per_epoch=steps_per_epoch, validation_steps=validation_steps,
                              epochs=300, validation_data=(x_test,y_test))
<ipython-input-53-3caab27a3ebc>:1: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators.
  history = model.fit_generator(train_generator,steps_per_epoch=steps_per_epoch, validation_steps=validation_steps,
Epoch 1/300
104/104 [==============================] - 40s 179ms/step - loss: 0.5674 - accuracy: 0.3436 - val_loss: 0.6001 - val_accuracy: 0.3957
Epoch 2/300
104/104 [==============================] - 18s 153ms/step - loss: 0.5023 - accuracy: 0.6548 - val_loss: 0.5434 - val_accuracy: 0.8271
Epoch 3/300
104/104 [==============================] - 16s 156ms/step - loss: 0.4243 - accuracy: 0.8225 - val_loss: 0.4241 - val_accuracy: 0.8999
Epoch 4/300
104/104 [==============================] - 16s 154ms/step - loss: 0.3489 - accuracy: 0.8631 - val_loss: 0.3038 - val_accuracy: 0.9031
Epoch 5/300
104/104 [==============================] - 16s 155ms/step - loss: 0.2850 - accuracy: 0.8841 - val_loss: 0.2561 - val_accuracy: 0.9027
Epoch 6/300
104/104 [==============================] - 16s 151ms/step - loss: 0.2511 - accuracy: 0.8897 - val_loss: 0.2368 - val_accuracy: 0.9020
Epoch 7/300
104/104 [==============================] - 16s 153ms/step - loss: 0.2437 - accuracy: 0.8916 - val_loss: 0.2220 - val_accuracy: 0.9071
Epoch 8/300
104/104 [==============================] - 16s 152ms/step - loss: 0.2283 - accuracy: 0.8998 - val_loss: 0.2135 - val_accuracy: 0.9094
Epoch 9/300
104/104 [==============================] - 16s 149ms/step - loss: 0.2104 - accuracy: 0.9018 - val_loss: 0.2049 - val_accuracy: 0.9132
Epoch 10/300
104/104 [==============================] - 16s 156ms/step - loss: 0.2152 - accuracy: 0.9007 - val_loss: 0.2016 - val_accuracy: 0.9163
Epoch 11/300
104/104 [==============================] - 16s 159ms/step - loss: 0.2074 - accuracy: 0.9033 - val_loss: 0.1972 - val_accuracy: 0.9138
Epoch 12/300
104/104 [==============================] - 16s 155ms/step - loss: 0.2152 - accuracy: 0.8997 - val_loss: 0.1930 - val_accuracy: 0.9202
Epoch 13/300
104/104 [==============================] - 16s 154ms/step - loss: 0.2067 - accuracy: 0.9044 - val_loss: 0.1876 - val_accuracy: 0.9177
Epoch 14/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1977 - accuracy: 0.9061 - val_loss: 0.1896 - val_accuracy: 0.9141
Epoch 15/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1949 - accuracy: 0.9099 - val_loss: 0.1813 - val_accuracy: 0.9201
Epoch 16/300
104/104 [==============================] - 16s 154ms/step - loss: 0.2014 - accuracy: 0.9045 - val_loss: 0.1813 - val_accuracy: 0.9180
Epoch 17/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1969 - accuracy: 0.9082 - val_loss: 0.1757 - val_accuracy: 0.9227
Epoch 18/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1937 - accuracy: 0.9101 - val_loss: 0.1792 - val_accuracy: 0.9189
Epoch 19/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1901 - accuracy: 0.9098 - val_loss: 0.1864 - val_accuracy: 0.9125
Epoch 20/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1849 - accuracy: 0.9104 - val_loss: 0.1927 - val_accuracy: 0.9064
Epoch 21/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1860 - accuracy: 0.9125 - val_loss: 0.1738 - val_accuracy: 0.9217
Epoch 22/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1881 - accuracy: 0.9104 - val_loss: 0.1749 - val_accuracy: 0.9206
Epoch 23/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1792 - accuracy: 0.9141 - val_loss: 0.1706 - val_accuracy: 0.9228
Epoch 24/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1857 - accuracy: 0.9112 - val_loss: 0.1766 - val_accuracy: 0.9198
Epoch 25/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1812 - accuracy: 0.9130 - val_loss: 0.1690 - val_accuracy: 0.9249
Epoch 26/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1839 - accuracy: 0.9145 - val_loss: 0.1678 - val_accuracy: 0.9261
Epoch 27/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1745 - accuracy: 0.9146 - val_loss: 0.1756 - val_accuracy: 0.9189
Epoch 28/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1835 - accuracy: 0.9166 - val_loss: 0.1701 - val_accuracy: 0.9215
Epoch 29/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1770 - accuracy: 0.9103 - val_loss: 0.1647 - val_accuracy: 0.9273
Epoch 30/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1690 - accuracy: 0.9169 - val_loss: 0.1645 - val_accuracy: 0.9283
Epoch 31/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1822 - accuracy: 0.9148 - val_loss: 0.1619 - val_accuracy: 0.9295
Epoch 32/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1872 - accuracy: 0.9110 - val_loss: 0.1626 - val_accuracy: 0.9264
Epoch 33/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1676 - accuracy: 0.9185 - val_loss: 0.1728 - val_accuracy: 0.9205
Epoch 34/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1741 - accuracy: 0.9190 - val_loss: 0.1586 - val_accuracy: 0.9306
Epoch 35/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1612 - accuracy: 0.9189 - val_loss: 0.1574 - val_accuracy: 0.9316
Epoch 36/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1748 - accuracy: 0.9155 - val_loss: 0.1610 - val_accuracy: 0.9277
Epoch 37/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1684 - accuracy: 0.9166 - val_loss: 0.1689 - val_accuracy: 0.9199
Epoch 38/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1724 - accuracy: 0.9177 - val_loss: 0.1547 - val_accuracy: 0.9309
Epoch 39/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1626 - accuracy: 0.9208 - val_loss: 0.1577 - val_accuracy: 0.9304
Epoch 40/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1716 - accuracy: 0.9174 - val_loss: 0.1740 - val_accuracy: 0.9162
Epoch 41/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1650 - accuracy: 0.9204 - val_loss: 0.1550 - val_accuracy: 0.9317
Epoch 42/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1667 - accuracy: 0.9191 - val_loss: 0.1572 - val_accuracy: 0.9305
Epoch 43/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1632 - accuracy: 0.9196 - val_loss: 0.1562 - val_accuracy: 0.9323
Epoch 44/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1682 - accuracy: 0.9177 - val_loss: 0.1562 - val_accuracy: 0.9326
Epoch 45/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1617 - accuracy: 0.9208 - val_loss: 0.1617 - val_accuracy: 0.9253
Epoch 46/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1651 - accuracy: 0.9192 - val_loss: 0.1531 - val_accuracy: 0.9322
Epoch 47/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1732 - accuracy: 0.9190 - val_loss: 0.1822 - val_accuracy: 0.9098
Epoch 48/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1622 - accuracy: 0.9199 - val_loss: 0.1593 - val_accuracy: 0.9258
Epoch 49/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1638 - accuracy: 0.9181 - val_loss: 0.1531 - val_accuracy: 0.9327
Epoch 50/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1747 - accuracy: 0.9180 - val_loss: 0.1594 - val_accuracy: 0.9320
Epoch 51/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1591 - accuracy: 0.9172 - val_loss: 0.1554 - val_accuracy: 0.9330
Epoch 52/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1642 - accuracy: 0.9232 - val_loss: 0.1528 - val_accuracy: 0.9335
Epoch 53/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1576 - accuracy: 0.9238 - val_loss: 0.1521 - val_accuracy: 0.9330
Epoch 54/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1693 - accuracy: 0.9180 - val_loss: 0.1544 - val_accuracy: 0.9332
Epoch 55/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1542 - accuracy: 0.9248 - val_loss: 0.1540 - val_accuracy: 0.9339
Epoch 56/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1589 - accuracy: 0.9218 - val_loss: 0.1514 - val_accuracy: 0.9315
Epoch 57/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1609 - accuracy: 0.9228 - val_loss: 0.1508 - val_accuracy: 0.9345
Epoch 58/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1544 - accuracy: 0.9232 - val_loss: 0.1549 - val_accuracy: 0.9338
Epoch 59/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1571 - accuracy: 0.9217 - val_loss: 0.1530 - val_accuracy: 0.9319
Epoch 60/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1623 - accuracy: 0.9236 - val_loss: 0.1552 - val_accuracy: 0.9314
Epoch 61/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1587 - accuracy: 0.9206 - val_loss: 0.1512 - val_accuracy: 0.9335
Epoch 62/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1599 - accuracy: 0.9197 - val_loss: 0.1483 - val_accuracy: 0.9342
Epoch 63/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1531 - accuracy: 0.9283 - val_loss: 0.1479 - val_accuracy: 0.9330
Epoch 64/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1585 - accuracy: 0.9195 - val_loss: 0.1475 - val_accuracy: 0.9351
Epoch 65/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1557 - accuracy: 0.9214 - val_loss: 0.1464 - val_accuracy: 0.9354
Epoch 66/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1509 - accuracy: 0.9259 - val_loss: 0.1465 - val_accuracy: 0.9355
Epoch 67/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1558 - accuracy: 0.9242 - val_loss: 0.1455 - val_accuracy: 0.9352
Epoch 68/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1515 - accuracy: 0.9260 - val_loss: 0.1453 - val_accuracy: 0.9361
Epoch 69/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1522 - accuracy: 0.9224 - val_loss: 0.1460 - val_accuracy: 0.9361
Epoch 70/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1535 - accuracy: 0.9247 - val_loss: 0.1467 - val_accuracy: 0.9341
Epoch 71/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1489 - accuracy: 0.9252 - val_loss: 0.1446 - val_accuracy: 0.9361
Epoch 72/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1493 - accuracy: 0.9245 - val_loss: 0.1460 - val_accuracy: 0.9366
Epoch 73/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1539 - accuracy: 0.9262 - val_loss: 0.1476 - val_accuracy: 0.9360
Epoch 74/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1444 - accuracy: 0.9257 - val_loss: 0.1454 - val_accuracy: 0.9355
Epoch 75/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1509 - accuracy: 0.9261 - val_loss: 0.1444 - val_accuracy: 0.9365
Epoch 76/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1505 - accuracy: 0.9241 - val_loss: 0.1460 - val_accuracy: 0.9355
Epoch 77/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1511 - accuracy: 0.9289 - val_loss: 0.1463 - val_accuracy: 0.9349
Epoch 78/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1490 - accuracy: 0.9228 - val_loss: 0.1490 - val_accuracy: 0.9310
Epoch 79/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1443 - accuracy: 0.9270 - val_loss: 0.1428 - val_accuracy: 0.9361
Epoch 80/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1450 - accuracy: 0.9263 - val_loss: 0.1431 - val_accuracy: 0.9372
Epoch 81/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1495 - accuracy: 0.9277 - val_loss: 0.1432 - val_accuracy: 0.9377
Epoch 82/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1522 - accuracy: 0.9245 - val_loss: 0.1455 - val_accuracy: 0.9329
Epoch 83/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1447 - accuracy: 0.9264 - val_loss: 0.1419 - val_accuracy: 0.9370
Epoch 84/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1420 - accuracy: 0.9282 - val_loss: 0.1424 - val_accuracy: 0.9355
Epoch 85/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1482 - accuracy: 0.9263 - val_loss: 0.1439 - val_accuracy: 0.9361
Epoch 86/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1468 - accuracy: 0.9259 - val_loss: 0.1422 - val_accuracy: 0.9381
Epoch 87/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1461 - accuracy: 0.9275 - val_loss: 0.1411 - val_accuracy: 0.9383
Epoch 88/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1434 - accuracy: 0.9285 - val_loss: 0.1416 - val_accuracy: 0.9364
Epoch 89/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1501 - accuracy: 0.9271 - val_loss: 0.1424 - val_accuracy: 0.9379
Epoch 90/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1445 - accuracy: 0.9271 - val_loss: 0.1451 - val_accuracy: 0.9369
Epoch 91/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1542 - accuracy: 0.9258 - val_loss: 0.1439 - val_accuracy: 0.9379
Epoch 92/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1451 - accuracy: 0.9262 - val_loss: 0.1446 - val_accuracy: 0.9363
Epoch 93/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1503 - accuracy: 0.9258 - val_loss: 0.1392 - val_accuracy: 0.9386
Epoch 94/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1426 - accuracy: 0.9290 - val_loss: 0.1412 - val_accuracy: 0.9378
Epoch 95/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1460 - accuracy: 0.9263 - val_loss: 0.1409 - val_accuracy: 0.9387
Epoch 96/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1523 - accuracy: 0.9263 - val_loss: 0.1412 - val_accuracy: 0.9348
Epoch 97/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1402 - accuracy: 0.9303 - val_loss: 0.1403 - val_accuracy: 0.9364
Epoch 98/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1500 - accuracy: 0.9248 - val_loss: 0.1422 - val_accuracy: 0.9354
Epoch 99/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1455 - accuracy: 0.9282 - val_loss: 0.1403 - val_accuracy: 0.9366
Epoch 100/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1427 - accuracy: 0.9279 - val_loss: 0.1407 - val_accuracy: 0.9363
Epoch 101/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1423 - accuracy: 0.9286 - val_loss: 0.1381 - val_accuracy: 0.9382
Epoch 102/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1424 - accuracy: 0.9293 - val_loss: 0.1393 - val_accuracy: 0.9384
Epoch 103/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1418 - accuracy: 0.9282 - val_loss: 0.1389 - val_accuracy: 0.9389
Epoch 104/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1447 - accuracy: 0.9283 - val_loss: 0.1380 - val_accuracy: 0.9398
Epoch 105/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1429 - accuracy: 0.9285 - val_loss: 0.1384 - val_accuracy: 0.9375
Epoch 106/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1454 - accuracy: 0.9280 - val_loss: 0.1357 - val_accuracy: 0.9392
Epoch 107/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1451 - accuracy: 0.9279 - val_loss: 0.1384 - val_accuracy: 0.9394
Epoch 108/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1423 - accuracy: 0.9277 - val_loss: 0.1454 - val_accuracy: 0.9325
Epoch 109/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1428 - accuracy: 0.9305 - val_loss: 0.1377 - val_accuracy: 0.9394
Epoch 110/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1480 - accuracy: 0.9273 - val_loss: 0.1468 - val_accuracy: 0.9369
Epoch 111/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1379 - accuracy: 0.9294 - val_loss: 0.1388 - val_accuracy: 0.9376
Epoch 112/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1407 - accuracy: 0.9292 - val_loss: 0.1492 - val_accuracy: 0.9367
Epoch 113/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1391 - accuracy: 0.9293 - val_loss: 0.1362 - val_accuracy: 0.9399
Epoch 114/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1412 - accuracy: 0.9310 - val_loss: 0.1374 - val_accuracy: 0.9400
Epoch 115/300
104/104 [==============================] - 15s 144ms/step - loss: 0.1412 - accuracy: 0.9293 - val_loss: 0.1365 - val_accuracy: 0.9403
Epoch 116/300
104/104 [==============================] - 15s 144ms/step - loss: 0.1416 - accuracy: 0.9276 - val_loss: 0.1574 - val_accuracy: 0.9250
Epoch 117/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1411 - accuracy: 0.9300 - val_loss: 0.1419 - val_accuracy: 0.9388
Epoch 118/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1361 - accuracy: 0.9303 - val_loss: 0.1374 - val_accuracy: 0.9382
Epoch 119/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1453 - accuracy: 0.9272 - val_loss: 0.1407 - val_accuracy: 0.9373
Epoch 120/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1338 - accuracy: 0.9306 - val_loss: 0.1426 - val_accuracy: 0.9386
Epoch 121/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1460 - accuracy: 0.9282 - val_loss: 0.1354 - val_accuracy: 0.9397
Epoch 122/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1356 - accuracy: 0.9314 - val_loss: 0.1399 - val_accuracy: 0.9392
Epoch 123/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1429 - accuracy: 0.9291 - val_loss: 0.1451 - val_accuracy: 0.9372
Epoch 124/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1411 - accuracy: 0.9283 - val_loss: 0.1364 - val_accuracy: 0.9394
Epoch 125/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1373 - accuracy: 0.9303 - val_loss: 0.1360 - val_accuracy: 0.9393
Epoch 126/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1466 - accuracy: 0.9305 - val_loss: 0.1364 - val_accuracy: 0.9378
Epoch 127/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1434 - accuracy: 0.9277 - val_loss: 0.1356 - val_accuracy: 0.9392
Epoch 128/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1448 - accuracy: 0.9283 - val_loss: 0.1369 - val_accuracy: 0.9398
Epoch 129/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1396 - accuracy: 0.9304 - val_loss: 0.1376 - val_accuracy: 0.9370
Epoch 130/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1350 - accuracy: 0.9308 - val_loss: 0.1364 - val_accuracy: 0.9402
Epoch 131/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1393 - accuracy: 0.9296 - val_loss: 0.1390 - val_accuracy: 0.9387
Epoch 132/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1308 - accuracy: 0.9361 - val_loss: 0.1367 - val_accuracy: 0.9386
Epoch 133/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1428 - accuracy: 0.9261 - val_loss: 0.1399 - val_accuracy: 0.9347
Epoch 134/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1403 - accuracy: 0.9304 - val_loss: 0.1332 - val_accuracy: 0.9405
Epoch 135/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1346 - accuracy: 0.9337 - val_loss: 0.1374 - val_accuracy: 0.9405
Epoch 136/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1370 - accuracy: 0.9316 - val_loss: 0.1386 - val_accuracy: 0.9370
Epoch 137/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1399 - accuracy: 0.9302 - val_loss: 0.1375 - val_accuracy: 0.9398
Epoch 138/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1320 - accuracy: 0.9305 - val_loss: 0.1417 - val_accuracy: 0.9394
Epoch 139/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1360 - accuracy: 0.9304 - val_loss: 0.1418 - val_accuracy: 0.9353
Epoch 140/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1347 - accuracy: 0.9333 - val_loss: 0.1456 - val_accuracy: 0.9374
Epoch 141/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1339 - accuracy: 0.9313 - val_loss: 0.1337 - val_accuracy: 0.9407
Epoch 142/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1405 - accuracy: 0.9297 - val_loss: 0.1324 - val_accuracy: 0.9390
Epoch 143/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1398 - accuracy: 0.9305 - val_loss: 0.1396 - val_accuracy: 0.9391
Epoch 144/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1330 - accuracy: 0.9344 - val_loss: 0.1354 - val_accuracy: 0.9384
Epoch 145/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1301 - accuracy: 0.9307 - val_loss: 0.1373 - val_accuracy: 0.9381
Epoch 146/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1436 - accuracy: 0.9313 - val_loss: 0.1355 - val_accuracy: 0.9381
Epoch 147/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1363 - accuracy: 0.9307 - val_loss: 0.1311 - val_accuracy: 0.9422
Epoch 148/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1328 - accuracy: 0.9346 - val_loss: 0.1332 - val_accuracy: 0.9418
Epoch 149/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1328 - accuracy: 0.9295 - val_loss: 0.1315 - val_accuracy: 0.9396
Epoch 150/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1331 - accuracy: 0.9367 - val_loss: 0.1351 - val_accuracy: 0.9394
Epoch 151/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1320 - accuracy: 0.9320 - val_loss: 0.1314 - val_accuracy: 0.9413
Epoch 152/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1332 - accuracy: 0.9326 - val_loss: 0.1363 - val_accuracy: 0.9386
Epoch 153/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1295 - accuracy: 0.9337 - val_loss: 0.1332 - val_accuracy: 0.9399
Epoch 154/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1324 - accuracy: 0.9344 - val_loss: 0.1383 - val_accuracy: 0.9403
Epoch 155/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1367 - accuracy: 0.9306 - val_loss: 0.1327 - val_accuracy: 0.9417
Epoch 156/300
104/104 [==============================] - 15s 143ms/step - loss: 0.1366 - accuracy: 0.9323 - val_loss: 0.1322 - val_accuracy: 0.9418
Epoch 157/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1343 - accuracy: 0.9339 - val_loss: 0.1306 - val_accuracy: 0.9424
Epoch 158/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1314 - accuracy: 0.9317 - val_loss: 0.1330 - val_accuracy: 0.9417
Epoch 159/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1345 - accuracy: 0.9339 - val_loss: 0.1344 - val_accuracy: 0.9399
Epoch 160/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1386 - accuracy: 0.9298 - val_loss: 0.1344 - val_accuracy: 0.9417
Epoch 161/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1349 - accuracy: 0.9324 - val_loss: 0.1292 - val_accuracy: 0.9419
Epoch 162/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1307 - accuracy: 0.9318 - val_loss: 0.1335 - val_accuracy: 0.9407
Epoch 163/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1288 - accuracy: 0.9341 - val_loss: 0.1290 - val_accuracy: 0.9428
Epoch 164/300
104/104 [==============================] - 15s 144ms/step - loss: 0.1322 - accuracy: 0.9346 - val_loss: 0.1291 - val_accuracy: 0.9432
Epoch 165/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1302 - accuracy: 0.9347 - val_loss: 0.1291 - val_accuracy: 0.9422
Epoch 166/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1316 - accuracy: 0.9326 - val_loss: 0.1310 - val_accuracy: 0.9423
Epoch 167/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1299 - accuracy: 0.9342 - val_loss: 0.1298 - val_accuracy: 0.9415
Epoch 168/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1367 - accuracy: 0.9302 - val_loss: 0.1325 - val_accuracy: 0.9422
Epoch 169/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1326 - accuracy: 0.9332 - val_loss: 0.1323 - val_accuracy: 0.9405
Epoch 170/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1258 - accuracy: 0.9344 - val_loss: 0.1431 - val_accuracy: 0.9395
Epoch 171/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1386 - accuracy: 0.9331 - val_loss: 0.1298 - val_accuracy: 0.9420
Epoch 172/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1304 - accuracy: 0.9321 - val_loss: 0.1296 - val_accuracy: 0.9408
Epoch 173/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1308 - accuracy: 0.9344 - val_loss: 0.1356 - val_accuracy: 0.9383
Epoch 174/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1247 - accuracy: 0.9356 - val_loss: 0.1293 - val_accuracy: 0.9426
Epoch 175/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1290 - accuracy: 0.9358 - val_loss: 0.1332 - val_accuracy: 0.9388
Epoch 176/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1233 - accuracy: 0.9356 - val_loss: 0.1295 - val_accuracy: 0.9426
Epoch 177/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1265 - accuracy: 0.9356 - val_loss: 0.1311 - val_accuracy: 0.9427
Epoch 178/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1255 - accuracy: 0.9343 - val_loss: 0.1290 - val_accuracy: 0.9417
Epoch 179/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1274 - accuracy: 0.9344 - val_loss: 0.1287 - val_accuracy: 0.9427
Epoch 180/300
104/104 [==============================] - 15s 144ms/step - loss: 0.1328 - accuracy: 0.9338 - val_loss: 0.1293 - val_accuracy: 0.9425
Epoch 181/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1271 - accuracy: 0.9347 - val_loss: 0.1291 - val_accuracy: 0.9429
Epoch 182/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1309 - accuracy: 0.9364 - val_loss: 0.1334 - val_accuracy: 0.9422
Epoch 183/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1269 - accuracy: 0.9331 - val_loss: 0.1259 - val_accuracy: 0.9438
Epoch 184/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1399 - accuracy: 0.9312 - val_loss: 0.1303 - val_accuracy: 0.9426
Epoch 185/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1264 - accuracy: 0.9344 - val_loss: 0.1287 - val_accuracy: 0.9437
Epoch 186/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1265 - accuracy: 0.9369 - val_loss: 0.1289 - val_accuracy: 0.9414
Epoch 187/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1267 - accuracy: 0.9346 - val_loss: 0.1315 - val_accuracy: 0.9416
Epoch 188/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1259 - accuracy: 0.9345 - val_loss: 0.1278 - val_accuracy: 0.9420
Epoch 189/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1324 - accuracy: 0.9358 - val_loss: 0.1285 - val_accuracy: 0.9428
Epoch 190/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1292 - accuracy: 0.9326 - val_loss: 0.1302 - val_accuracy: 0.9436
Epoch 191/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1195 - accuracy: 0.9398 - val_loss: 0.1262 - val_accuracy: 0.9440
Epoch 192/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1272 - accuracy: 0.9341 - val_loss: 0.1271 - val_accuracy: 0.9431
Epoch 193/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1265 - accuracy: 0.9366 - val_loss: 0.1263 - val_accuracy: 0.9428
Epoch 194/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1351 - accuracy: 0.9316 - val_loss: 0.1399 - val_accuracy: 0.9396
Epoch 195/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1234 - accuracy: 0.9364 - val_loss: 0.1316 - val_accuracy: 0.9428
Epoch 196/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1264 - accuracy: 0.9357 - val_loss: 0.1280 - val_accuracy: 0.9430
Epoch 197/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1290 - accuracy: 0.9333 - val_loss: 0.1324 - val_accuracy: 0.9423
Epoch 198/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1228 - accuracy: 0.9367 - val_loss: 0.1266 - val_accuracy: 0.9436
Epoch 199/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1297 - accuracy: 0.9346 - val_loss: 0.1272 - val_accuracy: 0.9440
Epoch 200/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1272 - accuracy: 0.9355 - val_loss: 0.1270 - val_accuracy: 0.9427
Epoch 201/300
104/104 [==============================] - 15s 144ms/step - loss: 0.1260 - accuracy: 0.9362 - val_loss: 0.1467 - val_accuracy: 0.9388
Epoch 202/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1250 - accuracy: 0.9349 - val_loss: 0.1306 - val_accuracy: 0.9415
Epoch 203/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1242 - accuracy: 0.9369 - val_loss: 0.1277 - val_accuracy: 0.9422
Epoch 204/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1295 - accuracy: 0.9353 - val_loss: 0.1282 - val_accuracy: 0.9440
Epoch 205/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1267 - accuracy: 0.9358 - val_loss: 0.1369 - val_accuracy: 0.9399
Epoch 206/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1232 - accuracy: 0.9359 - val_loss: 0.1266 - val_accuracy: 0.9446
Epoch 207/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1244 - accuracy: 0.9364 - val_loss: 0.1287 - val_accuracy: 0.9413
Epoch 208/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1246 - accuracy: 0.9362 - val_loss: 0.1296 - val_accuracy: 0.9435
Epoch 209/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1214 - accuracy: 0.9372 - val_loss: 0.1283 - val_accuracy: 0.9421
Epoch 210/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1311 - accuracy: 0.9348 - val_loss: 0.1324 - val_accuracy: 0.9429
Epoch 211/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1231 - accuracy: 0.9368 - val_loss: 0.1284 - val_accuracy: 0.9436
Epoch 212/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1249 - accuracy: 0.9358 - val_loss: 0.1362 - val_accuracy: 0.9419
Epoch 213/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1236 - accuracy: 0.9375 - val_loss: 0.1245 - val_accuracy: 0.9452
Epoch 214/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1226 - accuracy: 0.9366 - val_loss: 0.1291 - val_accuracy: 0.9423
Epoch 215/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1228 - accuracy: 0.9363 - val_loss: 0.1279 - val_accuracy: 0.9433
Epoch 216/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1283 - accuracy: 0.9358 - val_loss: 0.1289 - val_accuracy: 0.9432
Epoch 217/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1236 - accuracy: 0.9370 - val_loss: 0.1280 - val_accuracy: 0.9427
Epoch 218/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1235 - accuracy: 0.9363 - val_loss: 0.1256 - val_accuracy: 0.9443
Epoch 219/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1227 - accuracy: 0.9367 - val_loss: 0.1268 - val_accuracy: 0.9437
Epoch 220/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1253 - accuracy: 0.9336 - val_loss: 0.1266 - val_accuracy: 0.9440
Epoch 221/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1188 - accuracy: 0.9399 - val_loss: 0.1290 - val_accuracy: 0.9433
Epoch 222/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1233 - accuracy: 0.9366 - val_loss: 0.1311 - val_accuracy: 0.9432
Epoch 223/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1249 - accuracy: 0.9369 - val_loss: 0.1289 - val_accuracy: 0.9436
Epoch 224/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1254 - accuracy: 0.9367 - val_loss: 0.1265 - val_accuracy: 0.9444
Epoch 225/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1228 - accuracy: 0.9362 - val_loss: 0.1318 - val_accuracy: 0.9424
Epoch 226/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1205 - accuracy: 0.9391 - val_loss: 0.1270 - val_accuracy: 0.9444
Epoch 227/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1249 - accuracy: 0.9347 - val_loss: 0.1234 - val_accuracy: 0.9446
Epoch 228/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1229 - accuracy: 0.9388 - val_loss: 0.1253 - val_accuracy: 0.9431
Epoch 229/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1185 - accuracy: 0.9357 - val_loss: 0.1242 - val_accuracy: 0.9436
Epoch 230/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1227 - accuracy: 0.9394 - val_loss: 0.1264 - val_accuracy: 0.9442
Epoch 231/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1226 - accuracy: 0.9371 - val_loss: 0.1261 - val_accuracy: 0.9447
Epoch 232/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1208 - accuracy: 0.9393 - val_loss: 0.1270 - val_accuracy: 0.9445
Epoch 233/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1272 - accuracy: 0.9346 - val_loss: 0.1277 - val_accuracy: 0.9445
Epoch 234/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1182 - accuracy: 0.9415 - val_loss: 0.1330 - val_accuracy: 0.9425
Epoch 235/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1263 - accuracy: 0.9323 - val_loss: 0.1274 - val_accuracy: 0.9439
Epoch 236/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1186 - accuracy: 0.9404 - val_loss: 0.1258 - val_accuracy: 0.9437
Epoch 237/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1130 - accuracy: 0.9386 - val_loss: 0.1242 - val_accuracy: 0.9447
Epoch 238/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1269 - accuracy: 0.9367 - val_loss: 0.1243 - val_accuracy: 0.9444
Epoch 239/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1195 - accuracy: 0.9375 - val_loss: 0.1230 - val_accuracy: 0.9445
Epoch 240/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1171 - accuracy: 0.9396 - val_loss: 0.1223 - val_accuracy: 0.9453
Epoch 241/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1241 - accuracy: 0.9368 - val_loss: 0.1239 - val_accuracy: 0.9451
Epoch 242/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1219 - accuracy: 0.9392 - val_loss: 0.1269 - val_accuracy: 0.9438
Epoch 243/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1260 - accuracy: 0.9354 - val_loss: 0.1333 - val_accuracy: 0.9424
Epoch 244/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1223 - accuracy: 0.9377 - val_loss: 0.1295 - val_accuracy: 0.9433
Epoch 245/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1252 - accuracy: 0.9373 - val_loss: 0.1228 - val_accuracy: 0.9451
Epoch 246/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1180 - accuracy: 0.9389 - val_loss: 0.1248 - val_accuracy: 0.9448
Epoch 247/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1175 - accuracy: 0.9392 - val_loss: 0.1252 - val_accuracy: 0.9440
Epoch 248/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1175 - accuracy: 0.9404 - val_loss: 0.1258 - val_accuracy: 0.9446
Epoch 249/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1156 - accuracy: 0.9388 - val_loss: 0.1229 - val_accuracy: 0.9455
Epoch 250/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1224 - accuracy: 0.9379 - val_loss: 0.1228 - val_accuracy: 0.9461
Epoch 251/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1228 - accuracy: 0.9358 - val_loss: 0.1241 - val_accuracy: 0.9449
Epoch 252/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1150 - accuracy: 0.9411 - val_loss: 0.1244 - val_accuracy: 0.9452
Epoch 253/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1196 - accuracy: 0.9386 - val_loss: 0.1209 - val_accuracy: 0.9460
Epoch 254/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1203 - accuracy: 0.9366 - val_loss: 0.1210 - val_accuracy: 0.9456
Epoch 255/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1149 - accuracy: 0.9408 - val_loss: 0.1214 - val_accuracy: 0.9463
Epoch 256/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1221 - accuracy: 0.9395 - val_loss: 0.1227 - val_accuracy: 0.9458
Epoch 257/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1184 - accuracy: 0.9381 - val_loss: 0.1275 - val_accuracy: 0.9442
Epoch 258/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1110 - accuracy: 0.9429 - val_loss: 0.1247 - val_accuracy: 0.9439
Epoch 259/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1247 - accuracy: 0.9344 - val_loss: 0.1204 - val_accuracy: 0.9461
Epoch 260/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1190 - accuracy: 0.9410 - val_loss: 0.1232 - val_accuracy: 0.9452
Epoch 261/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1193 - accuracy: 0.9400 - val_loss: 0.1266 - val_accuracy: 0.9453
Epoch 262/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1168 - accuracy: 0.9394 - val_loss: 0.1245 - val_accuracy: 0.9429
Epoch 263/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1197 - accuracy: 0.9396 - val_loss: 0.1262 - val_accuracy: 0.9451
Epoch 264/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1205 - accuracy: 0.9368 - val_loss: 0.1280 - val_accuracy: 0.9429
Epoch 265/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1181 - accuracy: 0.9401 - val_loss: 0.1233 - val_accuracy: 0.9458
Epoch 266/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1135 - accuracy: 0.9406 - val_loss: 0.1224 - val_accuracy: 0.9450
Epoch 267/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1224 - accuracy: 0.9362 - val_loss: 0.1251 - val_accuracy: 0.9440
Epoch 268/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1156 - accuracy: 0.9409 - val_loss: 0.1265 - val_accuracy: 0.9451
Epoch 269/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1231 - accuracy: 0.9389 - val_loss: 0.1211 - val_accuracy: 0.9467
Epoch 270/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1191 - accuracy: 0.9382 - val_loss: 0.1295 - val_accuracy: 0.9435
Epoch 271/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1133 - accuracy: 0.9405 - val_loss: 0.1269 - val_accuracy: 0.9434
Epoch 272/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1168 - accuracy: 0.9395 - val_loss: 0.1265 - val_accuracy: 0.9441
Epoch 273/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1161 - accuracy: 0.9388 - val_loss: 0.1221 - val_accuracy: 0.9458
Epoch 274/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1220 - accuracy: 0.9388 - val_loss: 0.1255 - val_accuracy: 0.9450
Epoch 275/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1233 - accuracy: 0.9384 - val_loss: 0.1257 - val_accuracy: 0.9455
Epoch 276/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1195 - accuracy: 0.9385 - val_loss: 0.1214 - val_accuracy: 0.9451
Epoch 277/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1207 - accuracy: 0.9397 - val_loss: 0.1207 - val_accuracy: 0.9458
Epoch 278/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1165 - accuracy: 0.9396 - val_loss: 0.1230 - val_accuracy: 0.9458
Epoch 279/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1146 - accuracy: 0.9382 - val_loss: 0.1239 - val_accuracy: 0.9448
Epoch 280/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1165 - accuracy: 0.9419 - val_loss: 0.1471 - val_accuracy: 0.9382
Epoch 281/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1237 - accuracy: 0.9353 - val_loss: 0.1197 - val_accuracy: 0.9468
Epoch 282/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1149 - accuracy: 0.9418 - val_loss: 0.1228 - val_accuracy: 0.9464
Epoch 283/300
104/104 [==============================] - 15s 145ms/step - loss: 0.1157 - accuracy: 0.9393 - val_loss: 0.1214 - val_accuracy: 0.9458
Epoch 284/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1197 - accuracy: 0.9380 - val_loss: 0.1348 - val_accuracy: 0.9418
Epoch 285/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1168 - accuracy: 0.9409 - val_loss: 0.1215 - val_accuracy: 0.9463
Epoch 286/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1168 - accuracy: 0.9393 - val_loss: 0.1208 - val_accuracy: 0.9466
Epoch 287/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1210 - accuracy: 0.9394 - val_loss: 0.1222 - val_accuracy: 0.9468
Epoch 288/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1073 - accuracy: 0.9436 - val_loss: 0.1240 - val_accuracy: 0.9458
Epoch 289/300
104/104 [==============================] - 15s 147ms/step - loss: 0.1207 - accuracy: 0.9377 - val_loss: 0.1193 - val_accuracy: 0.9467
Epoch 290/300
104/104 [==============================] - 16s 150ms/step - loss: 0.1148 - accuracy: 0.9416 - val_loss: 0.1208 - val_accuracy: 0.9469
Epoch 291/300
104/104 [==============================] - 15s 146ms/step - loss: 0.1161 - accuracy: 0.9408 - val_loss: 0.1316 - val_accuracy: 0.9436
Epoch 292/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1113 - accuracy: 0.9416 - val_loss: 0.1234 - val_accuracy: 0.9438
Epoch 293/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1184 - accuracy: 0.9379 - val_loss: 0.1383 - val_accuracy: 0.9416
Epoch 294/300
104/104 [==============================] - 16s 151ms/step - loss: 0.1127 - accuracy: 0.9422 - val_loss: 0.1235 - val_accuracy: 0.9463
Epoch 295/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1175 - accuracy: 0.9381 - val_loss: 0.1202 - val_accuracy: 0.9461
Epoch 296/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1148 - accuracy: 0.9408 - val_loss: 0.1199 - val_accuracy: 0.9460
Epoch 297/300
104/104 [==============================] - 15s 149ms/step - loss: 0.1164 - accuracy: 0.9387 - val_loss: 0.1181 - val_accuracy: 0.9459
Epoch 298/300
104/104 [==============================] - 16s 152ms/step - loss: 0.1157 - accuracy: 0.9413 - val_loss: 0.1180 - val_accuracy: 0.9462
Epoch 299/300
104/104 [==============================] - 16s 153ms/step - loss: 0.1128 - accuracy: 0.9406 - val_loss: 0.1176 - val_accuracy: 0.9473
Epoch 300/300
104/104 [==============================] - 15s 148ms/step - loss: 0.1133 - accuracy: 0.9424 - val_loss: 0.1222 - val_accuracy: 0.9467

After training is complete, we will plot the accuracy and loss curves:

In [ ]:
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='lower right')
plt.show()

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper right')
plt.show()
No description has been provided for this image
No description has been provided for this image

Thus, we can calculate the accuracy for the test set:

In [ ]:
predict = model.predict(x_test)
12/12 [==============================] - 5s 73ms/step
In [ ]:
pred = np.round(predict)
In [ ]:
accuracy = accuracy_score(y_test.flatten(),pred.flatten())
print(accuracy)
0.9467164357503255
In [ ]:
y_test.shape
Out[ ]:
(360, 256, 256)

Finally, let's plot an example of the predicted result compared to the original mask:

In [ ]:
i = 0
plt.figure(figsize=(20,8))
plt.subplot(1,3,1),
plt.imshow(x_test[i])
plt.subplot(1,3,2),
plt.imshow(y_test[i,:,:])
plt.subplot(1,3,3),
plt.imshow(pred[i,:,:,0])
Out[ ]:
<matplotlib.image.AxesImage at 0x7e59a8a57fa0>
No description has been provided for this image

Orthomosaic Prediction¶

After training the model, we will apply it to the entire orthomosaic. First we divide it into 512x512 pixel windows and save it in tiff format.

In [ ]:
path_img_to_pred = '/content/drive/MyDrive/Datasets/OpenCitiesAI/mon/493701/493701.tif'
path_split = "/content/split_img"
if not os.path.isdir(path_split):
    os.mkdir(path_split)

path_exp = "/content/mask_predict"
if not os.path.isdir(path_exp):
    os.mkdir(path_exp)
In [ ]:
src = rasterio.open(path_img_to_pred)
out_meta = src.meta.copy()
qtd = 0
for n in range((src.meta['width']//512)):
  for m in range((src.meta['height']//512)):
    x = ((n*512))
    y = ((m*512))
    window = Window(x,y,512,512)
    win_transform = src.window_transform(window)
    arr_win = src.read(window=window)
    arr_win = arr_win[0:3]
    if (arr_win.max() != 0) and (arr_win.shape[1] == 512) and (arr_win.shape[2] == 512):
      qtd = qtd + 1
      path_exp_img = os.path.join(path_split, 'img_' + str(qtd) + '.tif')
      out_meta.update({"driver": "GTiff","height": 512,"width": 512, "count":len(arr_win), "compress":'lzw', "transform":win_transform})
      with rasterio.open(path_exp_img, 'w', **out_meta) as dst:
          for i, layer in enumerate(arr_win, start=1):
              dst.write_band(i, layer.reshape(-1, layer.shape[-1]))
      print('Create img: ' + str(qtd))
    del arr_win
Create img: 1
Create img: 2
Create img: 3
Create img: 4
Create img: 5
Create img: 6
Create img: 7
Create img: 8
Create img: 9
Create img: 10
Create img: 11
Create img: 12
Create img: 13
Create img: 14
Create img: 15
Create img: 16
Create img: 17
Create img: 18
Create img: 19
Create img: 20
Create img: 21
Create img: 22
Create img: 23
Create img: 24
Create img: 25
Create img: 26
Create img: 27
Create img: 28
Create img: 29
Create img: 30
Create img: 31
Create img: 32
Create img: 33
Create img: 34
Create img: 35
Create img: 36
Create img: 37
Create img: 38
Create img: 39
Create img: 40
Create img: 41
Create img: 42
Create img: 43
Create img: 44
Create img: 45
Create img: 46
Create img: 47
Create img: 48
Create img: 49
Create img: 50
Create img: 51
Create img: 52
Create img: 53
Create img: 54
Create img: 55
Create img: 56
Create img: 57
Create img: 58
Create img: 59
Create img: 60
Create img: 61
Create img: 62
Create img: 63
Create img: 64
Create img: 65
Create img: 66
Create img: 67
Create img: 68
Create img: 69
Create img: 70
Create img: 71
Create img: 72
Create img: 73
Create img: 74
Create img: 75
Create img: 76
Create img: 77
Create img: 78
Create img: 79
Create img: 80
Create img: 81
Create img: 82
Create img: 83
Create img: 84
Create img: 85
Create img: 86
Create img: 87
Create img: 88
Create img: 89
Create img: 90
Create img: 91
Create img: 92
Create img: 93
Create img: 94
Create img: 95
Create img: 96
Create img: 97
Create img: 98
Create img: 99
Create img: 100
Create img: 101
Create img: 102
Create img: 103
Create img: 104
Create img: 105
Create img: 106
Create img: 107
Create img: 108
Create img: 109
Create img: 110
Create img: 111
Create img: 112
Create img: 113
Create img: 114
Create img: 115
Create img: 116
Create img: 117
Create img: 118
Create img: 119
Create img: 120
Create img: 121
Create img: 122
Create img: 123
Create img: 124
Create img: 125
Create img: 126
Create img: 127
Create img: 128
Create img: 129
Create img: 130
Create img: 131
Create img: 132
Create img: 133
Create img: 134
Create img: 135
Create img: 136
Create img: 137
Create img: 138
Create img: 139
Create img: 140
Create img: 141
Create img: 142
Create img: 143
Create img: 144
Create img: 145
Create img: 146
Create img: 147
Create img: 148
Create img: 149
Create img: 150
Create img: 151
Create img: 152
Create img: 153
Create img: 154
Create img: 155
Create img: 156
Create img: 157
Create img: 158
Create img: 159
Create img: 160
Create img: 161
Create img: 162
Create img: 163
Create img: 164
Create img: 165
Create img: 166
Create img: 167
Create img: 168
Create img: 169
Create img: 170
Create img: 171
Create img: 172
Create img: 173
Create img: 174
Create img: 175
Create img: 176
Create img: 177
Create img: 178
Create img: 179
Create img: 180
Create img: 181
Create img: 182
Create img: 183
Create img: 184
Create img: 185
Create img: 186
Create img: 187
Create img: 188
Create img: 189
Create img: 190
Create img: 191
Create img: 192
Create img: 193
Create img: 194
Create img: 195
Create img: 196
Create img: 197
Create img: 198
Create img: 199
Create img: 200
Create img: 201
Create img: 202
Create img: 203
Create img: 204
Create img: 205
Create img: 206
Create img: 207
Create img: 208
Create img: 209
Create img: 210
Create img: 211
Create img: 212
Create img: 213
Create img: 214
Create img: 215
Create img: 216
Create img: 217
Create img: 218
Create img: 219
Create img: 220
Create img: 221
Create img: 222
Create img: 223
Create img: 224
Create img: 225
Create img: 226
Create img: 227
Create img: 228
Create img: 229
Create img: 230
Create img: 231
Create img: 232
Create img: 233
Create img: 234
Create img: 235
Create img: 236
Create img: 237
Create img: 238
Create img: 239
Create img: 240
Create img: 241
Create img: 242
Create img: 243
Create img: 244
Create img: 245
Create img: 246
Create img: 247
Create img: 248
Create img: 249
Create img: 250
Create img: 251
Create img: 252
Create img: 253
Create img: 254
Create img: 255
Create img: 256
Create img: 257
Create img: 258
Create img: 259
Create img: 260
Create img: 261
Create img: 262
Create img: 263
Create img: 264
Create img: 265
Create img: 266
Create img: 267
Create img: 268
Create img: 269
Create img: 270
Create img: 271
Create img: 272
Create img: 273
Create img: 274
Create img: 275
Create img: 276
Create img: 277
Create img: 278
Create img: 279
Create img: 280
Create img: 281
Create img: 282
Create img: 283
Create img: 284
Create img: 285
Create img: 286
Create img: 287
Create img: 288
Create img: 289
Create img: 290
Create img: 291
Create img: 292
Create img: 293
Create img: 294
Create img: 295
Create img: 296
Create img: 297
Create img: 298
Create img: 299
Create img: 300
Create img: 301
Create img: 302
Create img: 303
Create img: 304
Create img: 305
Create img: 306
Create img: 307
Create img: 308
Create img: 309
Create img: 310
Create img: 311
Create img: 312
Create img: 313
Create img: 314
Create img: 315
Create img: 316
Create img: 317
Create img: 318
Create img: 319
Create img: 320
Create img: 321
Create img: 322
Create img: 323
Create img: 324
Create img: 325
Create img: 326
Create img: 327
Create img: 328
Create img: 329
Create img: 330
Create img: 331
Create img: 332
Create img: 333
Create img: 334
Create img: 335
Create img: 336
Create img: 337
Create img: 338
Create img: 339
Create img: 340
Create img: 341
Create img: 342
Create img: 343
Create img: 344
Create img: 345
Create img: 346
Create img: 347
Create img: 348
Create img: 349
Create img: 350
Create img: 351
Create img: 352
Create img: 353
Create img: 354
Create img: 355
Create img: 356
Create img: 357
Create img: 358
Create img: 359
Create img: 360
Create img: 361
Create img: 362
Create img: 363
Create img: 364
Create img: 365
Create img: 366
Create img: 367
Create img: 368
Create img: 369
Create img: 370
Create img: 371
Create img: 372
Create img: 373
Create img: 374
Create img: 375
Create img: 376
Create img: 377
Create img: 378
Create img: 379
Create img: 380
Create img: 381
Create img: 382
Create img: 383
Create img: 384
Create img: 385
Create img: 386
Create img: 387
Create img: 388
Create img: 389
Create img: 390
Create img: 391
Create img: 392
Create img: 393
Create img: 394
Create img: 395
Create img: 396
Create img: 397
Create img: 398
Create img: 399
Create img: 400
Create img: 401
Create img: 402
Create img: 403
Create img: 404
Create img: 405
Create img: 406
Create img: 407
Create img: 408
Create img: 409
Create img: 410
Create img: 411
Create img: 412
Create img: 413
Create img: 414
Create img: 415
Create img: 416
Create img: 417
Create img: 418
Create img: 419
Create img: 420
Create img: 421
Create img: 422
Create img: 423
Create img: 424
Create img: 425
Create img: 426
Create img: 427
Create img: 428
Create img: 429
Create img: 430
Create img: 431
Create img: 432
Create img: 433
Create img: 434
Create img: 435
Create img: 436
Create img: 437
Create img: 438
Create img: 439
Create img: 440
Create img: 441
Create img: 442
Create img: 443
Create img: 444
Create img: 445
Create img: 446
Create img: 447
Create img: 448
Create img: 449
Create img: 450
Create img: 451
Create img: 452
Create img: 453
Create img: 454
Create img: 455
Create img: 456
Create img: 457
Create img: 458
Create img: 459
Create img: 460
Create img: 461
Create img: 462
Create img: 463
Create img: 464
Create img: 465
Create img: 466
Create img: 467
Create img: 468
Create img: 469
Create img: 470
Create img: 471
Create img: 472
Create img: 473
Create img: 474
Create img: 475
Create img: 476
Create img: 477
Create img: 478
Create img: 479
Create img: 480
Create img: 481
Create img: 482
Create img: 483
Create img: 484
Create img: 485
Create img: 486
Create img: 487
Create img: 488
Create img: 489
Create img: 490
Create img: 491
Create img: 492
Create img: 493
Create img: 494
Create img: 495
Create img: 496
Create img: 497
Create img: 498
Create img: 499
Create img: 500
Create img: 501
Create img: 502
Create img: 503
Create img: 504
Create img: 505
Create img: 506
Create img: 507
Create img: 508
Create img: 509
Create img: 510
Create img: 511
Create img: 512
Create img: 513
Create img: 514
Create img: 515
Create img: 516
Create img: 517
Create img: 518
Create img: 519
Create img: 520
Create img: 521
Create img: 522
Create img: 523
Create img: 524
Create img: 525
Create img: 526
Create img: 527
Create img: 528
Create img: 529
Create img: 530
Create img: 531
Create img: 532
Create img: 533
Create img: 534
Create img: 535
Create img: 536
Create img: 537
Create img: 538
Create img: 539
Create img: 540
Create img: 541
Create img: 542
Create img: 543
Create img: 544
Create img: 545
Create img: 546
Create img: 547
Create img: 548
Create img: 549
Create img: 550
Create img: 551
Create img: 552
Create img: 553
Create img: 554
Create img: 555
Create img: 556
Create img: 557
Create img: 558
Create img: 559
Create img: 560
Create img: 561
Create img: 562
Create img: 563
Create img: 564
Create img: 565
Create img: 566
Create img: 567
Create img: 568
Create img: 569
Create img: 570
Create img: 571
Create img: 572
Create img: 573
Create img: 574
Create img: 575
Create img: 576
Create img: 577
Create img: 578
Create img: 579
Create img: 580
Create img: 581
Create img: 582
Create img: 583
Create img: 584
Create img: 585
Create img: 586
Create img: 587
Create img: 588
Create img: 589
Create img: 590
Create img: 591
Create img: 592
Create img: 593
Create img: 594
Create img: 595
Create img: 596
Create img: 597
Create img: 598
Create img: 599
Create img: 600
Create img: 601
Create img: 602
Create img: 603
Create img: 604
Create img: 605
Create img: 606
Create img: 607
Create img: 608
Create img: 609
Create img: 610
Create img: 611
Create img: 612
Create img: 613
Create img: 614
Create img: 615
Create img: 616
Create img: 617
Create img: 618
Create img: 619
Create img: 620
Create img: 621
Create img: 622
Create img: 623
Create img: 624
Create img: 625
Create img: 626
Create img: 627
Create img: 628
Create img: 629
Create img: 630
Create img: 631
Create img: 632
Create img: 633
Create img: 634
Create img: 635
Create img: 636
Create img: 637
Create img: 638
Create img: 639
Create img: 640
Create img: 641
Create img: 642
Create img: 643
Create img: 644
Create img: 645
Create img: 646
Create img: 647
Create img: 648
Create img: 649
Create img: 650
Create img: 651
Create img: 652
Create img: 653
Create img: 654
Create img: 655
Create img: 656
Create img: 657
Create img: 658
Create img: 659
Create img: 660
Create img: 661
Create img: 662
Create img: 663
Create img: 664
Create img: 665
Create img: 666
Create img: 667
Create img: 668
Create img: 669
Create img: 670
Create img: 671
Create img: 672
Create img: 673
Create img: 674
Create img: 675
Create img: 676
Create img: 677
Create img: 678
Create img: 679
Create img: 680
Create img: 681
Create img: 682
Create img: 683
Create img: 684
Create img: 685
Create img: 686
Create img: 687
Create img: 688
Create img: 689
Create img: 690
Create img: 691
Create img: 692
Create img: 693
Create img: 694
Create img: 695
Create img: 696
Create img: 697
Create img: 698
Create img: 699
Create img: 700
Create img: 701
Create img: 702
Create img: 703
Create img: 704
Create img: 705
Create img: 706
Create img: 707
Create img: 708
Create img: 709
Create img: 710
Create img: 711
Create img: 712
Create img: 713
Create img: 714
Create img: 715
Create img: 716
Create img: 717
Create img: 718
Create img: 719
Create img: 720
Create img: 721
Create img: 722
Create img: 723
Create img: 724
Create img: 725
Create img: 726
Create img: 727
Create img: 728
Create img: 729
Create img: 730
Create img: 731
Create img: 732
Create img: 733
Create img: 734
Create img: 735
Create img: 736
Create img: 737
Create img: 738
Create img: 739
Create img: 740
Create img: 741
Create img: 742
Create img: 743
Create img: 744
Create img: 745
Create img: 746
Create img: 747
Create img: 748
Create img: 749
Create img: 750
Create img: 751
Create img: 752
Create img: 753
Create img: 754
Create img: 755
Create img: 756
Create img: 757
Create img: 758
Create img: 759
Create img: 760
Create img: 761
Create img: 762
Create img: 763
Create img: 764
Create img: 765
Create img: 766
Create img: 767
Create img: 768
Create img: 769
Create img: 770
Create img: 771
Create img: 772
Create img: 773
Create img: 774
Create img: 775
Create img: 776
Create img: 777
Create img: 778
Create img: 779
Create img: 780
Create img: 781
Create img: 782
Create img: 783
Create img: 784
Create img: 785
Create img: 786
Create img: 787
Create img: 788
Create img: 789
Create img: 790
Create img: 791
Create img: 792
Create img: 793
Create img: 794
Create img: 795
Create img: 796
Create img: 797
Create img: 798
Create img: 799
Create img: 800
Create img: 801
Create img: 802
Create img: 803
Create img: 804
Create img: 805
Create img: 806
Create img: 807
Create img: 808
Create img: 809
Create img: 810
Create img: 811
Create img: 812
Create img: 813
Create img: 814
Create img: 815
Create img: 816
Create img: 817
Create img: 818
Create img: 819
Create img: 820
Create img: 821
Create img: 822
Create img: 823
Create img: 824
Create img: 825
Create img: 826
Create img: 827
Create img: 828
Create img: 829
Create img: 830
Create img: 831
Create img: 832
Create img: 833
Create img: 834
Create img: 835
Create img: 836
Create img: 837
Create img: 838
Create img: 839
Create img: 840
Create img: 841
Create img: 842
Create img: 843
Create img: 844
Create img: 845
Create img: 846
Create img: 847
Create img: 848
Create img: 849
Create img: 850
Create img: 851
Create img: 852
Create img: 853
Create img: 854
Create img: 855
Create img: 856
Create img: 857
Create img: 858
Create img: 859
Create img: 860
Create img: 861
Create img: 862
Create img: 863
Create img: 864
Create img: 865
Create img: 866
Create img: 867
Create img: 868
Create img: 869
Create img: 870
Create img: 871
Create img: 872
Create img: 873
Create img: 874
Create img: 875
Create img: 876
Create img: 877
Create img: 878
Create img: 879
Create img: 880
Create img: 881
Create img: 882
Create img: 883
Create img: 884
Create img: 885
Create img: 886
Create img: 887
Create img: 888
Create img: 889
Create img: 890
Create img: 891
Create img: 892
Create img: 893
Create img: 894
Create img: 895
Create img: 896
Create img: 897
Create img: 898
Create img: 899
Create img: 900
Create img: 901
Create img: 902
Create img: 903
Create img: 904
Create img: 905
Create img: 906
Create img: 907
Create img: 908
Create img: 909
Create img: 910
Create img: 911
Create img: 912
Create img: 913
Create img: 914
Create img: 915
Create img: 916
Create img: 917
Create img: 918
Create img: 919
Create img: 920
Create img: 921
Create img: 922
Create img: 923
Create img: 924
Create img: 925
Create img: 926
Create img: 927
Create img: 928
Create img: 929
Create img: 930
Create img: 931
Create img: 932
Create img: 933
Create img: 934
Create img: 935
Create img: 936
Create img: 937
Create img: 938
Create img: 939
Create img: 940
Create img: 941
Create img: 942
Create img: 943
Create img: 944
Create img: 945
Create img: 946
Create img: 947
Create img: 948
Create img: 949
Create img: 950
Create img: 951
Create img: 952
Create img: 953
Create img: 954
Create img: 955
Create img: 956
Create img: 957
Create img: 958
Create img: 959
Create img: 960
Create img: 961
Create img: 962
Create img: 963
Create img: 964
Create img: 965
Create img: 966
Create img: 967
Create img: 968
Create img: 969
Create img: 970
Create img: 971
Create img: 972
Create img: 973
Create img: 974
Create img: 975
Create img: 976
Create img: 977
Create img: 978
Create img: 979
Create img: 980
Create img: 981
Create img: 982
Create img: 983
Create img: 984
Create img: 985
Create img: 986
Create img: 987
Create img: 988
Create img: 989
Create img: 990
Create img: 991
Create img: 992
Create img: 993
Create img: 994
Create img: 995
Create img: 996
Create img: 997
Create img: 998
Create img: 999
Create img: 1000
Create img: 1001
Create img: 1002
Create img: 1003
Create img: 1004
Create img: 1005
Create img: 1006
Create img: 1007
Create img: 1008
Create img: 1009
Create img: 1010
Create img: 1011
Create img: 1012
Create img: 1013
Create img: 1014
Create img: 1015
Create img: 1016
Create img: 1017
Create img: 1018
Create img: 1019
Create img: 1020
Create img: 1021
Create img: 1022
Create img: 1023
Create img: 1024
Create img: 1025
Create img: 1026
Create img: 1027
Create img: 1028
Create img: 1029
Create img: 1030
Create img: 1031
Create img: 1032
Create img: 1033
Create img: 1034
Create img: 1035
Create img: 1036
Create img: 1037
Create img: 1038
Create img: 1039
Create img: 1040
Create img: 1041
Create img: 1042
Create img: 1043
Create img: 1044
Create img: 1045
Create img: 1046
Create img: 1047
Create img: 1048
Create img: 1049
Create img: 1050
Create img: 1051
Create img: 1052
Create img: 1053
Create img: 1054
Create img: 1055
Create img: 1056
Create img: 1057
Create img: 1058
Create img: 1059
Create img: 1060
Create img: 1061
Create img: 1062
Create img: 1063
Create img: 1064
Create img: 1065
Create img: 1066
Create img: 1067
Create img: 1068
Create img: 1069
Create img: 1070
Create img: 1071
Create img: 1072
Create img: 1073
Create img: 1074
Create img: 1075
Create img: 1076
Create img: 1077
Create img: 1078
Create img: 1079
Create img: 1080
Create img: 1081
Create img: 1082
Create img: 1083
Create img: 1084
Create img: 1085
Create img: 1086
Create img: 1087
Create img: 1088
Create img: 1089
Create img: 1090
Create img: 1091
Create img: 1092
Create img: 1093
Create img: 1094
Create img: 1095
Create img: 1096
Create img: 1097
Create img: 1098
Create img: 1099
Create img: 1100
Create img: 1101
Create img: 1102
Create img: 1103
Create img: 1104
Create img: 1105
Create img: 1106
Create img: 1107
Create img: 1108
Create img: 1109
Create img: 1110
Create img: 1111
Create img: 1112
Create img: 1113
Create img: 1114
Create img: 1115
Create img: 1116
Create img: 1117
Create img: 1118
Create img: 1119
Create img: 1120
Create img: 1121
Create img: 1122
Create img: 1123
Create img: 1124
Create img: 1125
Create img: 1126
Create img: 1127
Create img: 1128
Create img: 1129
Create img: 1130
Create img: 1131
Create img: 1132
Create img: 1133
Create img: 1134
Create img: 1135
Create img: 1136
Create img: 1137
Create img: 1138
Create img: 1139
Create img: 1140
Create img: 1141
Create img: 1142
Create img: 1143
Create img: 1144
Create img: 1145
Create img: 1146
Create img: 1147
Create img: 1148
Create img: 1149
Create img: 1150
Create img: 1151
Create img: 1152
Create img: 1153
Create img: 1154
Create img: 1155
Create img: 1156
Create img: 1157
Create img: 1158
Create img: 1159
Create img: 1160
Create img: 1161
Create img: 1162
Create img: 1163
Create img: 1164
Create img: 1165
Create img: 1166
Create img: 1167
Create img: 1168
Create img: 1169
Create img: 1170
Create img: 1171
Create img: 1172
Create img: 1173
Create img: 1174
Create img: 1175
Create img: 1176
Create img: 1177
Create img: 1178
Create img: 1179
Create img: 1180
Create img: 1181
Create img: 1182
Create img: 1183
Create img: 1184
Create img: 1185
Create img: 1186
Create img: 1187
Create img: 1188
Create img: 1189
Create img: 1190
Create img: 1191
Create img: 1192
Create img: 1193
Create img: 1194
Create img: 1195
Create img: 1196
Create img: 1197
Create img: 1198

Now just apply the model to each of the images:

In [ ]:
n = [f for f in os.listdir(path_split)]
In [ ]:
for path_img in n:
  img = []
  path_full = os.path.join(path_split,path_img)
  ds = rasterio.open(path_full, 'r')
  im = ds.read()
  im = im.transpose([1,2,0])
  im = im/255
  im = im[np.newaxis,:,:,:]
  predict = model.predict(im)
  predict = np.round(predict).astype(np.uint8)
  print(path_img.split('_')[1])
  out_meta = ds.meta.copy()
  w = ds.meta['width']
  h = ds.meta['height']
  path_exp_1 = os.path.join(path_exp,'Pred_' + path_img.split('_')[1])
  out_meta.update({"driver": "GTiff","dtype":rasterio.uint8,"compress":'lzw',"count":1,"nodata":0})
  with rasterio.open(path_exp_1, 'w', **out_meta) as dst:
      dst.write(predict[0,:,:,0], indexes=1)
1/1 [==============================] - 3s 3s/step
649.tif
1/1 [==============================] - 0s 31ms/step
957.tif
1/1 [==============================] - 0s 32ms/step
715.tif
1/1 [==============================] - 0s 29ms/step
572.tif
1/1 [==============================] - 0s 30ms/step
655.tif
1/1 [==============================] - 0s 28ms/step
1083.tif
1/1 [==============================] - 0s 29ms/step
493.tif
1/1 [==============================] - 0s 30ms/step
366.tif
1/1 [==============================] - 0s 30ms/step
654.tif
1/1 [==============================] - 0s 31ms/step
168.tif
1/1 [==============================] - 0s 29ms/step
153.tif
1/1 [==============================] - 0s 32ms/step
1136.tif
1/1 [==============================] - 0s 32ms/step
974.tif
1/1 [==============================] - 0s 29ms/step
678.tif
1/1 [==============================] - 0s 30ms/step
443.tif
1/1 [==============================] - 0s 29ms/step
730.tif
1/1 [==============================] - 0s 31ms/step
294.tif
1/1 [==============================] - 0s 31ms/step
451.tif
1/1 [==============================] - 0s 29ms/step
722.tif
1/1 [==============================] - 0s 29ms/step
267.tif
1/1 [==============================] - 0s 30ms/step
320.tif
1/1 [==============================] - 0s 31ms/step
919.tif
1/1 [==============================] - 0s 32ms/step
1011.tif
1/1 [==============================] - 0s 32ms/step
1108.tif
1/1 [==============================] - 0s 33ms/step
855.tif
1/1 [==============================] - 0s 28ms/step
591.tif
1/1 [==============================] - 0s 29ms/step
619.tif
1/1 [==============================] - 0s 31ms/step
468.tif
1/1 [==============================] - 0s 30ms/step
401.tif
1/1 [==============================] - 0s 29ms/step
138.tif
1/1 [==============================] - 0s 30ms/step
679.tif
1/1 [==============================] - 0s 29ms/step
469.tif
1/1 [==============================] - 0s 31ms/step
492.tif
1/1 [==============================] - 0s 32ms/step
97.tif
1/1 [==============================] - 0s 30ms/step
1006.tif
1/1 [==============================] - 0s 29ms/step
474.tif
1/1 [==============================] - 0s 30ms/step
965.tif
1/1 [==============================] - 0s 31ms/step
348.tif
1/1 [==============================] - 0s 27ms/step
1158.tif
1/1 [==============================] - 0s 30ms/step
1015.tif
1/1 [==============================] - 0s 30ms/step
815.tif
1/1 [==============================] - 0s 30ms/step
753.tif
1/1 [==============================] - 0s 30ms/step
186.tif
1/1 [==============================] - 0s 29ms/step
484.tif
1/1 [==============================] - 0s 31ms/step
808.tif
1/1 [==============================] - 0s 29ms/step
222.tif
1/1 [==============================] - 0s 29ms/step
107.tif
1/1 [==============================] - 0s 33ms/step
755.tif
1/1 [==============================] - 0s 30ms/step
189.tif
1/1 [==============================] - 0s 35ms/step
874.tif
1/1 [==============================] - 0s 34ms/step
423.tif
1/1 [==============================] - 0s 34ms/step
416.tif
1/1 [==============================] - 0s 30ms/step
355.tif
1/1 [==============================] - 0s 30ms/step
552.tif
1/1 [==============================] - 0s 32ms/step
1017.tif
1/1 [==============================] - 0s 29ms/step
285.tif
1/1 [==============================] - 0s 30ms/step
91.tif
1/1 [==============================] - 0s 33ms/step
234.tif
1/1 [==============================] - 0s 32ms/step
307.tif
1/1 [==============================] - 0s 32ms/step
266.tif
1/1 [==============================] - 0s 30ms/step
47.tif
1/1 [==============================] - 0s 31ms/step
563.tif
1/1 [==============================] - 0s 31ms/step
702.tif
1/1 [==============================] - 0s 28ms/step
807.tif
1/1 [==============================] - 0s 31ms/step
256.tif
1/1 [==============================] - 0s 30ms/step
221.tif
1/1 [==============================] - 0s 29ms/step
548.tif
1/1 [==============================] - 0s 32ms/step
278.tif
1/1 [==============================] - 0s 33ms/step
645.tif
1/1 [==============================] - 0s 31ms/step
84.tif
1/1 [==============================] - 0s 30ms/step
851.tif
1/1 [==============================] - 0s 30ms/step
477.tif
1/1 [==============================] - 0s 31ms/step
524.tif
1/1 [==============================] - 0s 30ms/step
1157.tif
1/1 [==============================] - 0s 31ms/step
311.tif
1/1 [==============================] - 0s 29ms/step
1094.tif
1/1 [==============================] - 0s 32ms/step
224.tif
1/1 [==============================] - 0s 31ms/step
606.tif
1/1 [==============================] - 0s 29ms/step
119.tif
1/1 [==============================] - 0s 29ms/step
485.tif
1/1 [==============================] - 0s 28ms/step
737.tif
1/1 [==============================] - 0s 29ms/step
866.tif
1/1 [==============================] - 0s 29ms/step
900.tif
1/1 [==============================] - 0s 28ms/step
825.tif
1/1 [==============================] - 0s 30ms/step
24.tif
1/1 [==============================] - 0s 31ms/step
1196.tif
1/1 [==============================] - 0s 31ms/step
929.tif
1/1 [==============================] - 0s 30ms/step
212.tif
1/1 [==============================] - 0s 28ms/step
326.tif
1/1 [==============================] - 0s 29ms/step
1178.tif
1/1 [==============================] - 0s 30ms/step
459.tif
1/1 [==============================] - 0s 29ms/step
662.tif
1/1 [==============================] - 0s 30ms/step
639.tif
1/1 [==============================] - 0s 28ms/step
261.tif
1/1 [==============================] - 0s 31ms/step
646.tif
1/1 [==============================] - 0s 30ms/step
287.tif
1/1 [==============================] - 0s 30ms/step
522.tif
1/1 [==============================] - 0s 31ms/step
357.tif
1/1 [==============================] - 0s 28ms/step
681.tif
1/1 [==============================] - 0s 31ms/step
924.tif
1/1 [==============================] - 0s 29ms/step
685.tif
1/1 [==============================] - 0s 30ms/step
561.tif
1/1 [==============================] - 0s 29ms/step
616.tif
1/1 [==============================] - 0s 31ms/step
850.tif
1/1 [==============================] - 0s 28ms/step
704.tif
1/1 [==============================] - 0s 29ms/step
604.tif
1/1 [==============================] - 0s 30ms/step
279.tif
1/1 [==============================] - 0s 29ms/step
546.tif
1/1 [==============================] - 0s 30ms/step
579.tif
1/1 [==============================] - 0s 30ms/step
166.tif
1/1 [==============================] - 0s 29ms/step
816.tif
1/1 [==============================] - 0s 33ms/step
147.tif
1/1 [==============================] - 0s 31ms/step
669.tif
1/1 [==============================] - 0s 31ms/step
1148.tif
1/1 [==============================] - 0s 29ms/step
40.tif
1/1 [==============================] - 0s 31ms/step
445.tif
1/1 [==============================] - 0s 30ms/step
60.tif
1/1 [==============================] - 0s 29ms/step
844.tif
1/1 [==============================] - 0s 29ms/step
517.tif
1/1 [==============================] - 0s 30ms/step
1162.tif
1/1 [==============================] - 0s 29ms/step
32.tif
1/1 [==============================] - 0s 31ms/step
99.tif
1/1 [==============================] - 0s 30ms/step
529.tif
1/1 [==============================] - 0s 30ms/step
768.tif
1/1 [==============================] - 0s 29ms/step
247.tif
1/1 [==============================] - 0s 30ms/step
972.tif
1/1 [==============================] - 0s 31ms/step
1134.tif
1/1 [==============================] - 0s 31ms/step
511.tif
1/1 [==============================] - 0s 30ms/step
948.tif
1/1 [==============================] - 0s 33ms/step
1177.tif
1/1 [==============================] - 0s 32ms/step
519.tif
1/1 [==============================] - 0s 31ms/step
142.tif
1/1 [==============================] - 0s 31ms/step
444.tif
1/1 [==============================] - 0s 32ms/step
980.tif
1/1 [==============================] - 0s 30ms/step
1120.tif
1/1 [==============================] - 0s 30ms/step
132.tif
1/1 [==============================] - 0s 31ms/step
1008.tif
1/1 [==============================] - 0s 33ms/step
1198.tif
1/1 [==============================] - 0s 32ms/step
402.tif
1/1 [==============================] - 0s 30ms/step
707.tif
1/1 [==============================] - 0s 31ms/step
415.tif
1/1 [==============================] - 0s 30ms/step
1018.tif
1/1 [==============================] - 0s 32ms/step
764.tif
1/1 [==============================] - 0s 32ms/step
229.tif
1/1 [==============================] - 0s 31ms/step
1077.tif
1/1 [==============================] - 0s 30ms/step
719.tif
1/1 [==============================] - 0s 30ms/step
1172.tif
1/1 [==============================] - 0s 32ms/step
537.tif
1/1 [==============================] - 0s 31ms/step
130.tif
1/1 [==============================] - 0s 29ms/step
973.tif
1/1 [==============================] - 0s 32ms/step
143.tif
1/1 [==============================] - 0s 30ms/step
592.tif
1/1 [==============================] - 0s 31ms/step
295.tif
1/1 [==============================] - 0s 31ms/step
647.tif
1/1 [==============================] - 0s 30ms/step
321.tif
1/1 [==============================] - 0s 30ms/step
1010.tif
1/1 [==============================] - 0s 30ms/step
1047.tif
1/1 [==============================] - 0s 33ms/step
720.tif
1/1 [==============================] - 0s 32ms/step
299.tif
1/1 [==============================] - 0s 31ms/step
504.tif
1/1 [==============================] - 0s 30ms/step
314.tif
1/1 [==============================] - 0s 30ms/step
982.tif
1/1 [==============================] - 0s 31ms/step
810.tif
1/1 [==============================] - 0s 31ms/step
981.tif
1/1 [==============================] - 0s 30ms/step
617.tif
1/1 [==============================] - 0s 32ms/step
939.tif
1/1 [==============================] - 0s 32ms/step
240.tif
1/1 [==============================] - 0s 31ms/step
56.tif
1/1 [==============================] - 0s 31ms/step
465.tif
1/1 [==============================] - 0s 31ms/step
788.tif
1/1 [==============================] - 0s 32ms/step
928.tif
1/1 [==============================] - 0s 29ms/step
139.tif
1/1 [==============================] - 0s 31ms/step
88.tif
1/1 [==============================] - 0s 32ms/step
344.tif
1/1 [==============================] - 0s 33ms/step
293.tif
1/1 [==============================] - 0s 32ms/step
912.tif
1/1 [==============================] - 0s 31ms/step
677.tif
1/1 [==============================] - 0s 31ms/step
779.tif
1/1 [==============================] - 0s 30ms/step
498.tif
1/1 [==============================] - 0s 33ms/step
1007.tif
1/1 [==============================] - 0s 29ms/step
896.tif
1/1 [==============================] - 0s 30ms/step
569.tif
1/1 [==============================] - 0s 32ms/step
770.tif
1/1 [==============================] - 0s 30ms/step
657.tif
1/1 [==============================] - 0s 30ms/step
637.tif
1/1 [==============================] - 0s 29ms/step
336.tif
1/1 [==============================] - 0s 30ms/step
359.tif
1/1 [==============================] - 0s 30ms/step
687.tif
1/1 [==============================] - 0s 30ms/step
690.tif
1/1 [==============================] - 0s 32ms/step
354.tif
1/1 [==============================] - 0s 29ms/step
949.tif
1/1 [==============================] - 0s 31ms/step
241.tif
1/1 [==============================] - 0s 32ms/step
953.tif
1/1 [==============================] - 0s 31ms/step
520.tif
1/1 [==============================] - 0s 34ms/step
1040.tif
1/1 [==============================] - 0s 30ms/step
264.tif
1/1 [==============================] - 0s 31ms/step
997.tif
1/1 [==============================] - 0s 31ms/step
607.tif
1/1 [==============================] - 0s 31ms/step
608.tif
1/1 [==============================] - 0s 30ms/step
386.tif
1/1 [==============================] - 0s 30ms/step
780.tif
1/1 [==============================] - 0s 31ms/step
917.tif
1/1 [==============================] - 0s 30ms/step
1060.tif
1/1 [==============================] - 0s 33ms/step
933.tif
1/1 [==============================] - 0s 30ms/step
209.tif
1/1 [==============================] - 0s 30ms/step
978.tif
1/1 [==============================] - 0s 30ms/step
376.tif
1/1 [==============================] - 0s 32ms/step
800.tif
1/1 [==============================] - 0s 29ms/step
831.tif
1/1 [==============================] - 0s 32ms/step
120.tif
1/1 [==============================] - 0s 32ms/step
243.tif
1/1 [==============================] - 0s 30ms/step
656.tif
1/1 [==============================] - 0s 30ms/step
140.tif
1/1 [==============================] - 0s 29ms/step
789.tif
1/1 [==============================] - 0s 32ms/step
745.tif
1/1 [==============================] - 0s 29ms/step
76.tif
1/1 [==============================] - 0s 30ms/step
940.tif
1/1 [==============================] - 0s 30ms/step
996.tif
1/1 [==============================] - 0s 30ms/step
908.tif
1/1 [==============================] - 0s 31ms/step
599.tif
1/1 [==============================] - 0s 29ms/step
327.tif
1/1 [==============================] - 0s 30ms/step
46.tif
1/1 [==============================] - 0s 29ms/step
739.tif
1/1 [==============================] - 0s 31ms/step
824.tif
1/1 [==============================] - 0s 32ms/step
283.tif
1/1 [==============================] - 0s 30ms/step
886.tif
1/1 [==============================] - 0s 30ms/step
123.tif
1/1 [==============================] - 0s 28ms/step
159.tif
1/1 [==============================] - 0s 30ms/step
463.tif
1/1 [==============================] - 0s 30ms/step
52.tif
1/1 [==============================] - 0s 29ms/step
436.tif
1/1 [==============================] - 0s 28ms/step
837.tif
1/1 [==============================] - 0s 28ms/step
1085.tif
1/1 [==============================] - 0s 29ms/step
25.tif
1/1 [==============================] - 0s 29ms/step
42.tif
1/1 [==============================] - 0s 29ms/step
680.tif
1/1 [==============================] - 0s 30ms/step
624.tif
1/1 [==============================] - 0s 30ms/step
98.tif
1/1 [==============================] - 0s 30ms/step
1086.tif
1/1 [==============================] - 0s 29ms/step
923.tif
1/1 [==============================] - 0s 28ms/step
86.tif
1/1 [==============================] - 0s 29ms/step
1064.tif
1/1 [==============================] - 0s 28ms/step
38.tif
1/1 [==============================] - 0s 29ms/step
672.tif
1/1 [==============================] - 0s 28ms/step
172.tif
1/1 [==============================] - 0s 29ms/step
208.tif
1/1 [==============================] - 0s 31ms/step
398.tif
1/1 [==============================] - 0s 31ms/step
125.tif
1/1 [==============================] - 0s 30ms/step
351.tif
1/1 [==============================] - 0s 31ms/step
93.tif
1/1 [==============================] - 0s 29ms/step
290.tif
1/1 [==============================] - 0s 29ms/step
121.tif
1/1 [==============================] - 0s 28ms/step
10.tif
1/1 [==============================] - 0s 31ms/step
1096.tif
1/1 [==============================] - 0s 31ms/step
812.tif
1/1 [==============================] - 0s 30ms/step
803.tif
1/1 [==============================] - 0s 29ms/step
936.tif
1/1 [==============================] - 0s 30ms/step
192.tif
1/1 [==============================] - 0s 31ms/step
419.tif
1/1 [==============================] - 0s 30ms/step
255.tif
1/1 [==============================] - 0s 30ms/step
252.tif
1/1 [==============================] - 0s 29ms/step
570.tif
1/1 [==============================] - 0s 29ms/step
853.tif
1/1 [==============================] - 0s 31ms/step
630.tif
1/1 [==============================] - 0s 31ms/step
711.tif
1/1 [==============================] - 0s 32ms/step
65.tif
1/1 [==============================] - 0s 31ms/step
377.tif
1/1 [==============================] - 0s 30ms/step
1138.tif
1/1 [==============================] - 0s 29ms/step
986.tif
1/1 [==============================] - 0s 29ms/step
688.tif
1/1 [==============================] - 0s 29ms/step
1079.tif
1/1 [==============================] - 0s 30ms/step
589.tif
1/1 [==============================] - 0s 29ms/step
448.tif
1/1 [==============================] - 0s 29ms/step
1049.tif
1/1 [==============================] - 0s 30ms/step
916.tif
1/1 [==============================] - 0s 28ms/step
1116.tif
1/1 [==============================] - 0s 29ms/step
557.tif
1/1 [==============================] - 0s 30ms/step
1166.tif
1/1 [==============================] - 0s 30ms/step
743.tif
1/1 [==============================] - 0s 30ms/step
462.tif
1/1 [==============================] - 0s 29ms/step
967.tif
1/1 [==============================] - 0s 29ms/step
490.tif
1/1 [==============================] - 0s 31ms/step
984.tif
1/1 [==============================] - 0s 31ms/step
920.tif
1/1 [==============================] - 0s 31ms/step
228.tif
1/1 [==============================] - 0s 31ms/step
797.tif
1/1 [==============================] - 0s 31ms/step
602.tif
1/1 [==============================] - 0s 32ms/step
802.tif
1/1 [==============================] - 0s 31ms/step
262.tif
1/1 [==============================] - 0s 31ms/step
518.tif
1/1 [==============================] - 0s 31ms/step
456.tif
1/1 [==============================] - 0s 33ms/step
1030.tif
1/1 [==============================] - 0s 31ms/step
638.tif
1/1 [==============================] - 0s 30ms/step
499.tif
1/1 [==============================] - 0s 29ms/step
856.tif
1/1 [==============================] - 0s 30ms/step
849.tif
1/1 [==============================] - 0s 30ms/step
1046.tif
1/1 [==============================] - 0s 31ms/step
823.tif
1/1 [==============================] - 0s 29ms/step
1174.tif
1/1 [==============================] - 0s 30ms/step
1135.tif
1/1 [==============================] - 0s 31ms/step
210.tif
1/1 [==============================] - 0s 34ms/step
551.tif
1/1 [==============================] - 0s 31ms/step
75.tif
1/1 [==============================] - 0s 29ms/step
843.tif
1/1 [==============================] - 0s 31ms/step
614.tif
1/1 [==============================] - 0s 30ms/step
286.tif
1/1 [==============================] - 0s 33ms/step
494.tif
1/1 [==============================] - 0s 32ms/step
621.tif
1/1 [==============================] - 0s 35ms/step
635.tif
1/1 [==============================] - 0s 34ms/step
127.tif
1/1 [==============================] - 0s 31ms/step
313.tif
1/1 [==============================] - 0s 32ms/step
1173.tif
1/1 [==============================] - 0s 32ms/step
1142.tif
1/1 [==============================] - 0s 31ms/step
104.tif
1/1 [==============================] - 0s 30ms/step
388.tif
1/1 [==============================] - 0s 33ms/step
732.tif
1/1 [==============================] - 0s 31ms/step
408.tif
1/1 [==============================] - 0s 33ms/step
728.tif
1/1 [==============================] - 0s 32ms/step
791.tif
1/1 [==============================] - 0s 30ms/step
434.tif
1/1 [==============================] - 0s 31ms/step
735.tif
1/1 [==============================] - 0s 30ms/step
772.tif
1/1 [==============================] - 0s 33ms/step
409.tif
1/1 [==============================] - 0s 32ms/step
632.tif
1/1 [==============================] - 0s 31ms/step
1119.tif
1/1 [==============================] - 0s 32ms/step
158.tif
1/1 [==============================] - 0s 33ms/step
1035.tif
1/1 [==============================] - 0s 32ms/step
489.tif
1/1 [==============================] - 0s 31ms/step
1084.tif
1/1 [==============================] - 0s 29ms/step
979.tif
1/1 [==============================] - 0s 31ms/step
236.tif
1/1 [==============================] - 0s 29ms/step
799.tif
1/1 [==============================] - 0s 31ms/step
353.tif
1/1 [==============================] - 0s 30ms/step
633.tif
1/1 [==============================] - 0s 31ms/step
971.tif
1/1 [==============================] - 0s 31ms/step
861.tif
1/1 [==============================] - 0s 31ms/step
1123.tif
1/1 [==============================] - 0s 31ms/step
558.tif
1/1 [==============================] - 0s 31ms/step
345.tif
1/1 [==============================] - 0s 30ms/step
1192.tif
1/1 [==============================] - 0s 31ms/step
530.tif
1/1 [==============================] - 0s 31ms/step
513.tif
1/1 [==============================] - 0s 32ms/step
152.tif
1/1 [==============================] - 0s 31ms/step
538.tif
1/1 [==============================] - 0s 32ms/step
962.tif
1/1 [==============================] - 0s 32ms/step
580.tif
1/1 [==============================] - 0s 34ms/step
943.tif
1/1 [==============================] - 0s 31ms/step
721.tif
1/1 [==============================] - 0s 30ms/step
593.tif
1/1 [==============================] - 0s 32ms/step
987.tif
1/1 [==============================] - 0s 33ms/step
439.tif
1/1 [==============================] - 0s 31ms/step
1151.tif
1/1 [==============================] - 0s 32ms/step
9.tif
1/1 [==============================] - 0s 32ms/step
858.tif
1/1 [==============================] - 0s 31ms/step
395.tif
1/1 [==============================] - 0s 30ms/step
246.tif
1/1 [==============================] - 0s 31ms/step
1165.tif
1/1 [==============================] - 0s 32ms/step
582.tif
1/1 [==============================] - 0s 31ms/step
767.tif
1/1 [==============================] - 0s 30ms/step
1112.tif
1/1 [==============================] - 0s 28ms/step
1106.tif
1/1 [==============================] - 0s 32ms/step
1073.tif
1/1 [==============================] - 0s 31ms/step
734.tif
1/1 [==============================] - 0s 30ms/step
298.tif
1/1 [==============================] - 0s 30ms/step
1004.tif
1/1 [==============================] - 0s 30ms/step
841.tif
1/1 [==============================] - 0s 30ms/step
338.tif
1/1 [==============================] - 0s 30ms/step
793.tif
1/1 [==============================] - 0s 31ms/step
1099.tif
1/1 [==============================] - 0s 31ms/step
725.tif
1/1 [==============================] - 0s 30ms/step
332.tif
1/1 [==============================] - 0s 32ms/step
1131.tif
1/1 [==============================] - 0s 30ms/step
544.tif
1/1 [==============================] - 0s 31ms/step
312.tif
1/1 [==============================] - 0s 32ms/step
217.tif
1/1 [==============================] - 0s 32ms/step
693.tif
1/1 [==============================] - 0s 31ms/step
342.tif
1/1 [==============================] - 0s 29ms/step
961.tif
1/1 [==============================] - 0s 31ms/step
108.tif
1/1 [==============================] - 0s 30ms/step
1140.tif
1/1 [==============================] - 0s 31ms/step
1143.tif
1/1 [==============================] - 0s 31ms/step
101.tif
1/1 [==============================] - 0s 30ms/step
170.tif
1/1 [==============================] - 0s 29ms/step
951.tif
1/1 [==============================] - 0s 29ms/step
95.tif
1/1 [==============================] - 0s 29ms/step
969.tif
1/1 [==============================] - 0s 30ms/step
1103.tif
1/1 [==============================] - 0s 30ms/step
418.tif
1/1 [==============================] - 0s 31ms/step
554.tif
1/1 [==============================] - 0s 30ms/step
876.tif
1/1 [==============================] - 0s 30ms/step
749.tif
1/1 [==============================] - 0s 30ms/step
389.tif
1/1 [==============================] - 0s 29ms/step
988.tif
1/1 [==============================] - 0s 30ms/step
165.tif
1/1 [==============================] - 0s 32ms/step
1093.tif
1/1 [==============================] - 0s 29ms/step
54.tif
1/1 [==============================] - 0s 30ms/step
141.tif
1/1 [==============================] - 0s 31ms/step
867.tif
1/1 [==============================] - 0s 31ms/step
507.tif
1/1 [==============================] - 0s 31ms/step
414.tif
1/1 [==============================] - 0s 29ms/step
584.tif
1/1 [==============================] - 0s 31ms/step
754.tif
1/1 [==============================] - 0s 28ms/step
1045.tif
1/1 [==============================] - 0s 30ms/step
868.tif
1/1 [==============================] - 0s 30ms/step
92.tif
1/1 [==============================] - 0s 29ms/step
369.tif
1/1 [==============================] - 0s 32ms/step
288.tif
1/1 [==============================] - 0s 29ms/step
184.tif
1/1 [==============================] - 0s 31ms/step
626.tif
1/1 [==============================] - 0s 30ms/step
4.tif
1/1 [==============================] - 0s 30ms/step
509.tif
1/1 [==============================] - 0s 30ms/step
577.tif
1/1 [==============================] - 0s 30ms/step
15.tif
1/1 [==============================] - 0s 30ms/step
610.tif
1/1 [==============================] - 0s 29ms/step
723.tif
1/1 [==============================] - 0s 29ms/step
795.tif
1/1 [==============================] - 0s 31ms/step
87.tif
1/1 [==============================] - 0s 31ms/step
174.tif
1/1 [==============================] - 0s 32ms/step
1043.tif
1/1 [==============================] - 0s 31ms/step
686.tif
1/1 [==============================] - 0s 31ms/step
854.tif
1/1 [==============================] - 0s 29ms/step
1089.tif
1/1 [==============================] - 0s 30ms/step
407.tif
1/1 [==============================] - 0s 30ms/step
1056.tif
1/1 [==============================] - 0s 33ms/step
116.tif
1/1 [==============================] - 0s 32ms/step
238.tif
1/1 [==============================] - 0s 31ms/step
1150.tif
1/1 [==============================] - 0s 28ms/step
239.tif
1/1 [==============================] - 0s 29ms/step
390.tif
1/1 [==============================] - 0s 30ms/step
676.tif
1/1 [==============================] - 0s 31ms/step
1000.tif
1/1 [==============================] - 0s 32ms/step
1087.tif
1/1 [==============================] - 0s 29ms/step
464.tif
1/1 [==============================] - 0s 31ms/step
1062.tif
1/1 [==============================] - 0s 33ms/step
786.tif
1/1 [==============================] - 0s 29ms/step
1130.tif
1/1 [==============================] - 0s 29ms/step
8.tif
1/1 [==============================] - 0s 30ms/step
1156.tif
1/1 [==============================] - 0s 32ms/step
899.tif
1/1 [==============================] - 0s 32ms/step
310.tif
1/1 [==============================] - 0s 31ms/step
505.tif
1/1 [==============================] - 0s 32ms/step
667.tif
1/1 [==============================] - 0s 33ms/step
956.tif
1/1 [==============================] - 0s 31ms/step
1146.tif
1/1 [==============================] - 0s 29ms/step
741.tif
1/1 [==============================] - 0s 34ms/step
615.tif
1/1 [==============================] - 0s 30ms/step
1027.tif
1/1 [==============================] - 0s 30ms/step
983.tif
1/1 [==============================] - 0s 32ms/step
848.tif
1/1 [==============================] - 0s 31ms/step
301.tif
1/1 [==============================] - 0s 30ms/step
450.tif
1/1 [==============================] - 0s 31ms/step
834.tif
1/1 [==============================] - 0s 32ms/step
838.tif
1/1 [==============================] - 0s 32ms/step
526.tif
1/1 [==============================] - 0s 32ms/step
1187.tif
1/1 [==============================] - 0s 31ms/step
429.tif
1/1 [==============================] - 0s 33ms/step
718.tif
1/1 [==============================] - 0s 32ms/step
975.tif
1/1 [==============================] - 0s 30ms/step
124.tif
1/1 [==============================] - 0s 31ms/step
29.tif
1/1 [==============================] - 0s 32ms/step
549.tif
1/1 [==============================] - 0s 31ms/step
828.tif
1/1 [==============================] - 0s 29ms/step
1180.tif
1/1 [==============================] - 0s 31ms/step
441.tif
1/1 [==============================] - 0s 29ms/step
426.tif
1/1 [==============================] - 0s 31ms/step
248.tif
1/1 [==============================] - 0s 30ms/step
999.tif
1/1 [==============================] - 0s 31ms/step
479.tif
1/1 [==============================] - 0s 29ms/step
653.tif
1/1 [==============================] - 0s 32ms/step
932.tif
1/1 [==============================] - 0s 31ms/step
49.tif
1/1 [==============================] - 0s 31ms/step
594.tif
1/1 [==============================] - 0s 30ms/step
736.tif
1/1 [==============================] - 0s 29ms/step
897.tif
1/1 [==============================] - 0s 30ms/step
503.tif
1/1 [==============================] - 0s 30ms/step
1139.tif
1/1 [==============================] - 0s 29ms/step
1163.tif
1/1 [==============================] - 0s 29ms/step
869.tif
1/1 [==============================] - 0s 32ms/step
571.tif
1/1 [==============================] - 0s 31ms/step
535.tif
1/1 [==============================] - 0s 32ms/step
291.tif
1/1 [==============================] - 0s 30ms/step
100.tif
1/1 [==============================] - 0s 31ms/step
68.tif
1/1 [==============================] - 0s 30ms/step
762.tif
1/1 [==============================] - 0s 32ms/step
364.tif
1/1 [==============================] - 0s 32ms/step
870.tif
1/1 [==============================] - 0s 30ms/step
709.tif
1/1 [==============================] - 0s 31ms/step
733.tif
1/1 [==============================] - 0s 32ms/step
641.tif
1/1 [==============================] - 0s 31ms/step
250.tif
1/1 [==============================] - 0s 30ms/step
335.tif
1/1 [==============================] - 0s 31ms/step
1050.tif
1/1 [==============================] - 0s 31ms/step
115.tif
1/1 [==============================] - 0s 33ms/step
35.tif
1/1 [==============================] - 0s 31ms/step
731.tif
1/1 [==============================] - 0s 33ms/step
790.tif
1/1 [==============================] - 0s 31ms/step
181.tif
1/1 [==============================] - 0s 30ms/step
941.tif
1/1 [==============================] - 0s 30ms/step
460.tif
1/1 [==============================] - 0s 31ms/step
760.tif
1/1 [==============================] - 0s 30ms/step
340.tif
1/1 [==============================] - 0s 29ms/step
1014.tif
1/1 [==============================] - 0s 30ms/step
71.tif
1/1 [==============================] - 0s 30ms/step
371.tif
1/1 [==============================] - 0s 32ms/step
603.tif
1/1 [==============================] - 0s 31ms/step
257.tif
1/1 [==============================] - 0s 30ms/step
276.tif
1/1 [==============================] - 0s 32ms/step
830.tif
1/1 [==============================] - 0s 30ms/step
1054.tif
1/1 [==============================] - 0s 31ms/step
697.tif
1/1 [==============================] - 0s 32ms/step
309.tif
1/1 [==============================] - 0s 30ms/step
550.tif
1/1 [==============================] - 0s 31ms/step
36.tif
1/1 [==============================] - 0s 32ms/step
1052.tif
1/1 [==============================] - 0s 31ms/step
846.tif
1/1 [==============================] - 0s 31ms/step
683.tif
1/1 [==============================] - 0s 30ms/step
845.tif
1/1 [==============================] - 0s 30ms/step
521.tif
1/1 [==============================] - 0s 30ms/step
863.tif
1/1 [==============================] - 0s 34ms/step
328.tif
1/1 [==============================] - 0s 32ms/step
480.tif
1/1 [==============================] - 0s 31ms/step
726.tif
1/1 [==============================] - 0s 39ms/step
1042.tif
1/1 [==============================] - 0s 39ms/step
483.tif
1/1 [==============================] - 0s 35ms/step
1095.tif
1/1 [==============================] - 0s 32ms/step
1179.tif
1/1 [==============================] - 0s 30ms/step
341.tif
1/1 [==============================] - 0s 37ms/step
742.tif
1/1 [==============================] - 0s 38ms/step
765.tif
1/1 [==============================] - 0s 38ms/step
915.tif
1/1 [==============================] - 0s 32ms/step
53.tif
1/1 [==============================] - 0s 38ms/step
752.tif
1/1 [==============================] - 0s 52ms/step
274.tif
1/1 [==============================] - 0s 45ms/step
420.tif
1/1 [==============================] - 0s 38ms/step
814.tif
1/1 [==============================] - 0s 51ms/step
523.tif
1/1 [==============================] - 0s 34ms/step
761.tif
1/1 [==============================] - 0s 50ms/step
6.tif
1/1 [==============================] - 0s 52ms/step
289.tif
1/1 [==============================] - 0s 50ms/step
968.tif
1/1 [==============================] - 0s 43ms/step
304.tif
1/1 [==============================] - 0s 36ms/step
601.tif
1/1 [==============================] - 0s 32ms/step
774.tif
1/1 [==============================] - 0s 34ms/step
1159.tif
1/1 [==============================] - 0s 32ms/step
171.tif
1/1 [==============================] - 0s 33ms/step
1066.tif
1/1 [==============================] - 0s 32ms/step
330.tif
1/1 [==============================] - 0s 32ms/step
245.tif
1/1 [==============================] - 0s 31ms/step
648.tif
1/1 [==============================] - 0s 34ms/step
396.tif
1/1 [==============================] - 0s 35ms/step
955.tif
1/1 [==============================] - 0s 33ms/step
178.tif
1/1 [==============================] - 0s 30ms/step
137.tif
1/1 [==============================] - 0s 33ms/step
190.tif
1/1 [==============================] - 0s 33ms/step
1153.tif
1/1 [==============================] - 0s 32ms/step
150.tif
1/1 [==============================] - 0s 31ms/step
440.tif
1/1 [==============================] - 0s 36ms/step
316.tif
1/1 [==============================] - 0s 32ms/step
1184.tif
1/1 [==============================] - 0s 32ms/step
989.tif
1/1 [==============================] - 0s 32ms/step
244.tif
1/1 [==============================] - 0s 33ms/step
329.tif
1/1 [==============================] - 0s 32ms/step
215.tif
1/1 [==============================] - 0s 30ms/step
964.tif
1/1 [==============================] - 0s 33ms/step
41.tif
1/1 [==============================] - 0s 31ms/step
45.tif
1/1 [==============================] - 0s 36ms/step
1019.tif
1/1 [==============================] - 0s 34ms/step
684.tif
1/1 [==============================] - 0s 30ms/step
500.tif
1/1 [==============================] - 0s 30ms/step
1197.tif
1/1 [==============================] - 0s 37ms/step
204.tif
1/1 [==============================] - 0s 40ms/step
220.tif
1/1 [==============================] - 0s 34ms/step
963.tif
1/1 [==============================] - 0s 33ms/step
902.tif
1/1 [==============================] - 0s 34ms/step
508.tif
1/1 [==============================] - 0s 32ms/step
394.tif
1/1 [==============================] - 0s 33ms/step
640.tif
1/1 [==============================] - 0s 31ms/step
776.tif
1/1 [==============================] - 0s 32ms/step
536.tif
1/1 [==============================] - 0s 33ms/step
665.tif
1/1 [==============================] - 0s 30ms/step
698.tif
1/1 [==============================] - 0s 30ms/step
906.tif
1/1 [==============================] - 0s 32ms/step
882.tif
1/1 [==============================] - 0s 30ms/step
1070.tif
1/1 [==============================] - 0s 33ms/step
829.tif
1/1 [==============================] - 0s 31ms/step
106.tif
1/1 [==============================] - 0s 31ms/step
628.tif
1/1 [==============================] - 0s 31ms/step
708.tif
1/1 [==============================] - 0s 29ms/step
958.tif
1/1 [==============================] - 0s 30ms/step
1005.tif
1/1 [==============================] - 0s 29ms/step
61.tif
1/1 [==============================] - 0s 30ms/step
300.tif
1/1 [==============================] - 0s 30ms/step
476.tif
1/1 [==============================] - 0s 31ms/step
1053.tif
1/1 [==============================] - 0s 30ms/step
362.tif
1/1 [==============================] - 0s 30ms/step
453.tif
1/1 [==============================] - 0s 29ms/step
766.tif
1/1 [==============================] - 0s 31ms/step
195.tif
1/1 [==============================] - 0s 31ms/step
763.tif
1/1 [==============================] - 0s 30ms/step
1122.tif
1/1 [==============================] - 0s 32ms/step
275.tif
1/1 [==============================] - 0s 29ms/step
412.tif
1/1 [==============================] - 0s 31ms/step
631.tif
1/1 [==============================] - 0s 30ms/step
1182.tif
1/1 [==============================] - 0s 30ms/step
1090.tif
1/1 [==============================] - 0s 31ms/step
22.tif
1/1 [==============================] - 0s 31ms/step
634.tif
1/1 [==============================] - 0s 32ms/step
319.tif
1/1 [==============================] - 0s 33ms/step
1092.tif
1/1 [==============================] - 0s 31ms/step
746.tif
1/1 [==============================] - 0s 31ms/step
747.tif
1/1 [==============================] - 0s 30ms/step
1.tif
1/1 [==============================] - 0s 29ms/step
903.tif
1/1 [==============================] - 0s 30ms/step
81.tif
1/1 [==============================] - 0s 29ms/step
562.tif
1/1 [==============================] - 0s 31ms/step
918.tif
1/1 [==============================] - 0s 31ms/step
1024.tif
1/1 [==============================] - 0s 30ms/step
80.tif
1/1 [==============================] - 0s 31ms/step
303.tif
1/1 [==============================] - 0s 31ms/step
90.tif
1/1 [==============================] - 0s 33ms/step
135.tif
1/1 [==============================] - 0s 31ms/step
1111.tif
1/1 [==============================] - 0s 29ms/step
1147.tif
1/1 [==============================] - 0s 30ms/step
265.tif
1/1 [==============================] - 0s 30ms/step
777.tif
1/1 [==============================] - 0s 30ms/step
213.tif
1/1 [==============================] - 0s 30ms/step
931.tif
1/1 [==============================] - 0s 29ms/step
689.tif
1/1 [==============================] - 0s 32ms/step
717.tif
1/1 [==============================] - 0s 31ms/step
131.tif
1/1 [==============================] - 0s 31ms/step
428.tif
1/1 [==============================] - 0s 30ms/step
952.tif
1/1 [==============================] - 0s 32ms/step
1020.tif
1/1 [==============================] - 0s 31ms/step
792.tif
1/1 [==============================] - 0s 30ms/step
1128.tif
1/1 [==============================] - 0s 31ms/step
1185.tif
1/1 [==============================] - 0s 29ms/step
895.tif
1/1 [==============================] - 0s 29ms/step
785.tif
1/1 [==============================] - 0s 32ms/step
759.tif
1/1 [==============================] - 0s 30ms/step
129.tif
1/1 [==============================] - 0s 30ms/step
836.tif
1/1 [==============================] - 0s 31ms/step
891.tif
1/1 [==============================] - 0s 29ms/step
696.tif
1/1 [==============================] - 0s 29ms/step
343.tif
1/1 [==============================] - 0s 29ms/step
724.tif
1/1 [==============================] - 0s 28ms/step
796.tif
1/1 [==============================] - 0s 27ms/step
149.tif
1/1 [==============================] - 0s 30ms/step
79.tif
1/1 [==============================] - 0s 30ms/step
21.tif
1/1 [==============================] - 0s 30ms/step
944.tif
1/1 [==============================] - 0s 30ms/step
148.tif
1/1 [==============================] - 0s 30ms/step
1072.tif
1/1 [==============================] - 0s 30ms/step
1164.tif
1/1 [==============================] - 0s 30ms/step
1069.tif
1/1 [==============================] - 0s 28ms/step
1039.tif
1/1 [==============================] - 0s 29ms/step
1175.tif
1/1 [==============================] - 0s 31ms/step
705.tif
1/1 [==============================] - 0s 32ms/step
910.tif
1/1 [==============================] - 0s 31ms/step
358.tif
1/1 [==============================] - 0s 31ms/step
422.tif
1/1 [==============================] - 0s 30ms/step
1032.tif
1/1 [==============================] - 0s 28ms/step
512.tif
1/1 [==============================] - 0s 30ms/step
347.tif
1/1 [==============================] - 0s 30ms/step
574.tif
1/1 [==============================] - 0s 31ms/step
666.tif
1/1 [==============================] - 0s 31ms/step
455.tif
1/1 [==============================] - 0s 30ms/step
2.tif
1/1 [==============================] - 0s 30ms/step
260.tif
1/1 [==============================] - 0s 30ms/step
771.tif
1/1 [==============================] - 0s 32ms/step
1155.tif
1/1 [==============================] - 0s 30ms/step
151.tif
1/1 [==============================] - 0s 30ms/step
950.tif
1/1 [==============================] - 0s 31ms/step
862.tif
1/1 [==============================] - 0s 31ms/step
1137.tif
1/1 [==============================] - 0s 31ms/step
787.tif
1/1 [==============================] - 0s 33ms/step
259.tif
1/1 [==============================] - 0s 30ms/step
782.tif
1/1 [==============================] - 0s 31ms/step
1125.tif
1/1 [==============================] - 0s 32ms/step
1076.tif
1/1 [==============================] - 0s 31ms/step
378.tif
1/1 [==============================] - 0s 31ms/step
496.tif
1/1 [==============================] - 0s 31ms/step
618.tif
1/1 [==============================] - 0s 32ms/step
187.tif
1/1 [==============================] - 0s 31ms/step
361.tif
1/1 [==============================] - 0s 31ms/step
447.tif
1/1 [==============================] - 0s 31ms/step
339.tif
1/1 [==============================] - 0s 33ms/step
1098.tif
1/1 [==============================] - 0s 31ms/step
325.tif
1/1 [==============================] - 0s 31ms/step
727.tif
1/1 [==============================] - 0s 32ms/step
1149.tif
1/1 [==============================] - 0s 29ms/step
253.tif
1/1 [==============================] - 0s 31ms/step
559.tif
1/1 [==============================] - 0s 30ms/step
913.tif
1/1 [==============================] - 0s 32ms/step
273.tif
1/1 [==============================] - 0s 35ms/step
528.tif
1/1 [==============================] - 0s 32ms/step
114.tif
1/1 [==============================] - 0s 33ms/step
993.tif
1/1 [==============================] - 0s 31ms/step
1161.tif
1/1 [==============================] - 0s 30ms/step
827.tif
1/1 [==============================] - 0s 32ms/step
1041.tif
1/1 [==============================] - 0s 30ms/step
527.tif
1/1 [==============================] - 0s 33ms/step
1021.tif
1/1 [==============================] - 0s 32ms/step
1191.tif
1/1 [==============================] - 0s 31ms/step
1145.tif
1/1 [==============================] - 0s 31ms/step
475.tif
1/1 [==============================] - 0s 30ms/step
1091.tif
1/1 [==============================] - 0s 32ms/step
43.tif
1/1 [==============================] - 0s 30ms/step
515.tif
1/1 [==============================] - 0s 30ms/step
857.tif
1/1 [==============================] - 0s 31ms/step
842.tif
1/1 [==============================] - 0s 32ms/step
197.tif
1/1 [==============================] - 0s 32ms/step
650.tif
1/1 [==============================] - 0s 29ms/step
39.tif
1/1 [==============================] - 0s 31ms/step
1186.tif
1/1 [==============================] - 0s 33ms/step
497.tif
1/1 [==============================] - 0s 30ms/step
1160.tif
1/1 [==============================] - 0s 31ms/step
431.tif
1/1 [==============================] - 0s 30ms/step
729.tif
1/1 [==============================] - 0s 30ms/step
811.tif
1/1 [==============================] - 0s 33ms/step
387.tif
1/1 [==============================] - 0s 33ms/step
122.tif
1/1 [==============================] - 0s 33ms/step
70.tif
1/1 [==============================] - 0s 30ms/step
1176.tif
1/1 [==============================] - 0s 32ms/step
435.tif
1/1 [==============================] - 0s 31ms/step
1028.tif
1/1 [==============================] - 0s 30ms/step
486.tif
1/1 [==============================] - 0s 30ms/step
573.tif
1/1 [==============================] - 0s 32ms/step
472.tif
1/1 [==============================] - 0s 34ms/step
937.tif
1/1 [==============================] - 0s 30ms/step
865.tif
1/1 [==============================] - 0s 32ms/step
977.tif
1/1 [==============================] - 0s 30ms/step
20.tif
1/1 [==============================] - 0s 33ms/step
94.tif
1/1 [==============================] - 0s 30ms/step
156.tif
1/1 [==============================] - 0s 30ms/step
613.tif
1/1 [==============================] - 0s 31ms/step
832.tif
1/1 [==============================] - 0s 30ms/step
269.tif
1/1 [==============================] - 0s 32ms/step
282.tif
1/1 [==============================] - 0s 31ms/step
367.tif
1/1 [==============================] - 0s 31ms/step
911.tif
1/1 [==============================] - 0s 31ms/step
644.tif
1/1 [==============================] - 0s 31ms/step
1048.tif
1/1 [==============================] - 0s 31ms/step
691.tif
1/1 [==============================] - 0s 30ms/step
873.tif
1/1 [==============================] - 0s 29ms/step
203.tif
1/1 [==============================] - 0s 29ms/step
58.tif
1/1 [==============================] - 0s 32ms/step
481.tif
1/1 [==============================] - 0s 33ms/step
461.tif
1/1 [==============================] - 0s 29ms/step
218.tif
1/1 [==============================] - 0s 28ms/step
385.tif
1/1 [==============================] - 0s 30ms/step
985.tif
1/1 [==============================] - 0s 31ms/step
163.tif
1/1 [==============================] - 0s 32ms/step
258.tif
1/1 [==============================] - 0s 31ms/step
227.tif
1/1 [==============================] - 0s 31ms/step
458.tif
1/1 [==============================] - 0s 33ms/step
703.tif
1/1 [==============================] - 0s 33ms/step
946.tif
1/1 [==============================] - 0s 32ms/step
1088.tif
1/1 [==============================] - 0s 31ms/step
318.tif
1/1 [==============================] - 0s 31ms/step
1104.tif
1/1 [==============================] - 0s 34ms/step
73.tif
1/1 [==============================] - 0s 32ms/step
864.tif
1/1 [==============================] - 0s 33ms/step
72.tif
1/1 [==============================] - 0s 33ms/step
740.tif
1/1 [==============================] - 0s 32ms/step
905.tif
1/1 [==============================] - 0s 32ms/step
1033.tif
1/1 [==============================] - 0s 30ms/step
909.tif
1/1 [==============================] - 0s 32ms/step
813.tif
1/1 [==============================] - 0s 31ms/step
413.tif
1/1 [==============================] - 0s 29ms/step
1036.tif
1/1 [==============================] - 0s 32ms/step
193.tif
1/1 [==============================] - 0s 29ms/step
164.tif
1/1 [==============================] - 0s 31ms/step
1067.tif
1/1 [==============================] - 0s 31ms/step
652.tif
1/1 [==============================] - 0s 31ms/step
317.tif
1/1 [==============================] - 0s 30ms/step
839.tif
1/1 [==============================] - 0s 31ms/step
466.tif
1/1 [==============================] - 0s 31ms/step
1169.tif
1/1 [==============================] - 0s 31ms/step
292.tif
1/1 [==============================] - 0s 31ms/step
162.tif
1/1 [==============================] - 0s 33ms/step
938.tif
1/1 [==============================] - 0s 29ms/step
598.tif
1/1 [==============================] - 0s 33ms/step
205.tif
1/1 [==============================] - 0s 29ms/step
425.tif
1/1 [==============================] - 0s 31ms/step
534.tif
1/1 [==============================] - 0s 30ms/step
16.tif
1/1 [==============================] - 0s 30ms/step
1034.tif
1/1 [==============================] - 0s 32ms/step
820.tif
1/1 [==============================] - 0s 30ms/step
712.tif
1/1 [==============================] - 0s 30ms/step
437.tif
1/1 [==============================] - 0s 32ms/step
540.tif
1/1 [==============================] - 0s 32ms/step
393.tif
1/1 [==============================] - 0s 32ms/step
622.tif
1/1 [==============================] - 0s 29ms/step
33.tif
1/1 [==============================] - 0s 32ms/step
331.tif
1/1 [==============================] - 0s 29ms/step
670.tif
1/1 [==============================] - 0s 31ms/step
281.tif
1/1 [==============================] - 0s 31ms/step
840.tif
1/1 [==============================] - 0s 30ms/step
878.tif
1/1 [==============================] - 0s 31ms/step
514.tif
1/1 [==============================] - 0s 33ms/step
994.tif
1/1 [==============================] - 0s 32ms/step
892.tif
1/1 [==============================] - 0s 31ms/step
901.tif
1/1 [==============================] - 0s 31ms/step
397.tif
1/1 [==============================] - 0s 34ms/step
826.tif
1/1 [==============================] - 0s 31ms/step
1170.tif
1/1 [==============================] - 0s 33ms/step
176.tif
1/1 [==============================] - 0s 34ms/step
384.tif
1/1 [==============================] - 0s 31ms/step
154.tif
1/1 [==============================] - 0s 35ms/step
13.tif
1/1 [==============================] - 0s 32ms/step
1144.tif
1/1 [==============================] - 0s 29ms/step
804.tif
1/1 [==============================] - 0s 30ms/step
404.tif
1/1 [==============================] - 0s 31ms/step
1044.tif
1/1 [==============================] - 0s 30ms/step
502.tif
1/1 [==============================] - 0s 32ms/step
177.tif
1/1 [==============================] - 0s 32ms/step
893.tif
1/1 [==============================] - 0s 30ms/step
167.tif
1/1 [==============================] - 0s 32ms/step
664.tif
1/1 [==============================] - 0s 31ms/step
482.tif
1/1 [==============================] - 0s 29ms/step
103.tif
1/1 [==============================] - 0s 30ms/step
588.tif
1/1 [==============================] - 0s 28ms/step
202.tif
1/1 [==============================] - 0s 30ms/step
1063.tif
1/1 [==============================] - 0s 30ms/step
37.tif
1/1 [==============================] - 0s 32ms/step
467.tif
1/1 [==============================] - 0s 30ms/step
14.tif
1/1 [==============================] - 0s 31ms/step
185.tif
1/1 [==============================] - 0s 32ms/step
543.tif
1/1 [==============================] - 0s 31ms/step
223.tif
1/1 [==============================] - 0s 29ms/step
714.tif
1/1 [==============================] - 0s 31ms/step
34.tif
1/1 [==============================] - 0s 30ms/step
586.tif
1/1 [==============================] - 0s 31ms/step
406.tif
1/1 [==============================] - 0s 28ms/step
1016.tif
1/1 [==============================] - 0s 31ms/step
643.tif
1/1 [==============================] - 0s 31ms/step
411.tif
1/1 [==============================] - 0s 32ms/step
442.tif
1/1 [==============================] - 0s 32ms/step
781.tif
1/1 [==============================] - 0s 29ms/step
305.tif
1/1 [==============================] - 0s 31ms/step
661.tif
1/1 [==============================] - 0s 29ms/step
1114.tif
1/1 [==============================] - 0s 29ms/step
1109.tif
1/1 [==============================] - 0s 31ms/step
894.tif
1/1 [==============================] - 0s 29ms/step
568.tif
1/1 [==============================] - 0s 31ms/step
1154.tif
1/1 [==============================] - 0s 30ms/step
478.tif
1/1 [==============================] - 0s 30ms/step
449.tif
1/1 [==============================] - 0s 30ms/step
82.tif
1/1 [==============================] - 0s 29ms/step
200.tif
1/1 [==============================] - 0s 31ms/step
457.tif
1/1 [==============================] - 0s 31ms/step
306.tif
1/1 [==============================] - 0s 30ms/step
1193.tif
1/1 [==============================] - 0s 30ms/step
169.tif
1/1 [==============================] - 0s 32ms/step
1126.tif
1/1 [==============================] - 0s 31ms/step
145.tif
1/1 [==============================] - 0s 30ms/step
333.tif
1/1 [==============================] - 0s 31ms/step
133.tif
1/1 [==============================] - 0s 31ms/step
1059.tif
1/1 [==============================] - 0s 29ms/step
627.tif
1/1 [==============================] - 0s 29ms/step
623.tif
1/1 [==============================] - 0s 30ms/step
713.tif
1/1 [==============================] - 0s 30ms/step
1194.tif
1/1 [==============================] - 0s 33ms/step
405.tif
1/1 [==============================] - 0s 33ms/step
452.tif
1/1 [==============================] - 0s 31ms/step
296.tif
1/1 [==============================] - 0s 28ms/step
531.tif
1/1 [==============================] - 0s 31ms/step
1068.tif
1/1 [==============================] - 0s 30ms/step
1195.tif
1/1 [==============================] - 0s 31ms/step
1075.tif
1/1 [==============================] - 0s 30ms/step
888.tif
1/1 [==============================] - 0s 30ms/step
516.tif
1/1 [==============================] - 0s 32ms/step
48.tif
1/1 [==============================] - 0s 30ms/step
374.tif
1/1 [==============================] - 0s 29ms/step
471.tif
1/1 [==============================] - 0s 32ms/step
211.tif
1/1 [==============================] - 0s 29ms/step
400.tif
1/1 [==============================] - 0s 30ms/step
3.tif
1/1 [==============================] - 0s 30ms/step
206.tif
1/1 [==============================] - 0s 30ms/step
263.tif
1/1 [==============================] - 0s 30ms/step
801.tif
1/1 [==============================] - 0s 31ms/step
242.tif
1/1 [==============================] - 0s 31ms/step
417.tif
1/1 [==============================] - 0s 29ms/step
271.tif
1/1 [==============================] - 0s 32ms/step
597.tif
1/1 [==============================] - 0s 30ms/step
757.tif
1/1 [==============================] - 0s 29ms/step
433.tif
1/1 [==============================] - 0s 30ms/step
230.tif
1/1 [==============================] - 0s 29ms/step
564.tif
1/1 [==============================] - 0s 31ms/step
881.tif
1/1 [==============================] - 0s 32ms/step
89.tif
1/1 [==============================] - 0s 31ms/step
1037.tif
1/1 [==============================] - 0s 30ms/step
659.tif
1/1 [==============================] - 0s 30ms/step
18.tif
1/1 [==============================] - 0s 31ms/step
566.tif
1/1 [==============================] - 0s 31ms/step
501.tif
1/1 [==============================] - 0s 30ms/step
487.tif
1/1 [==============================] - 0s 30ms/step
750.tif
1/1 [==============================] - 0s 29ms/step
235.tif
1/1 [==============================] - 0s 32ms/step
254.tif
1/1 [==============================] - 0s 36ms/step
784.tif
1/1 [==============================] - 0s 36ms/step
126.tif
1/1 [==============================] - 0s 37ms/step
596.tif
1/1 [==============================] - 0s 37ms/step
1118.tif
1/1 [==============================] - 0s 38ms/step
1097.tif
1/1 [==============================] - 0s 35ms/step
360.tif
1/1 [==============================] - 0s 38ms/step
214.tif
1/1 [==============================] - 0s 39ms/step
488.tif
1/1 [==============================] - 0s 39ms/step
1003.tif
1/1 [==============================] - 0s 39ms/step
769.tif
1/1 [==============================] - 0s 38ms/step
710.tif
1/1 [==============================] - 0s 37ms/step
105.tif
1/1 [==============================] - 0s 39ms/step
109.tif
1/1 [==============================] - 0s 40ms/step
959.tif
1/1 [==============================] - 0s 40ms/step
611.tif
1/1 [==============================] - 0s 39ms/step
547.tif
1/1 [==============================] - 0s 40ms/step
822.tif
1/1 [==============================] - 0s 39ms/step
17.tif
1/1 [==============================] - 0s 39ms/step
947.tif
1/1 [==============================] - 0s 36ms/step
26.tif
1/1 [==============================] - 0s 38ms/step
775.tif
1/1 [==============================] - 0s 35ms/step
819.tif
1/1 [==============================] - 0s 38ms/step
1002.tif
1/1 [==============================] - 0s 37ms/step
942.tif
1/1 [==============================] - 0s 40ms/step
1057.tif
1/1 [==============================] - 0s 38ms/step
553.tif
1/1 [==============================] - 0s 40ms/step
380.tif
1/1 [==============================] - 0s 36ms/step
179.tif
1/1 [==============================] - 0s 38ms/step
1181.tif
1/1 [==============================] - 0s 38ms/step
85.tif
1/1 [==============================] - 0s 39ms/step
1129.tif
1/1 [==============================] - 0s 38ms/step
144.tif
1/1 [==============================] - 0s 36ms/step
268.tif
1/1 [==============================] - 0s 38ms/step
651.tif
1/1 [==============================] - 0s 40ms/step
1051.tif
1/1 [==============================] - 0s 42ms/step
1189.tif
1/1 [==============================] - 0s 37ms/step
642.tif
1/1 [==============================] - 0s 40ms/step
280.tif
1/1 [==============================] - 0s 37ms/step
744.tif
1/1 [==============================] - 0s 40ms/step
375.tif
1/1 [==============================] - 0s 40ms/step
1023.tif
1/1 [==============================] - 0s 38ms/step
438.tif
1/1 [==============================] - 0s 39ms/step
322.tif
1/1 [==============================] - 0s 40ms/step
173.tif
1/1 [==============================] - 0s 37ms/step
847.tif
1/1 [==============================] - 0s 37ms/step
219.tif
1/1 [==============================] - 0s 36ms/step
658.tif
1/1 [==============================] - 0s 40ms/step
372.tif
1/1 [==============================] - 0s 35ms/step
945.tif
1/1 [==============================] - 0s 35ms/step
50.tif
1/1 [==============================] - 0s 38ms/step
805.tif
1/1 [==============================] - 0s 35ms/step
128.tif
1/1 [==============================] - 0s 38ms/step
207.tif
1/1 [==============================] - 0s 34ms/step
904.tif
1/1 [==============================] - 0s 36ms/step
136.tif
1/1 [==============================] - 0s 35ms/step
758.tif
1/1 [==============================] - 0s 35ms/step
884.tif
1/1 [==============================] - 0s 36ms/step
636.tif
1/1 [==============================] - 0s 33ms/step
595.tif
1/1 [==============================] - 0s 37ms/step
880.tif
1/1 [==============================] - 0s 35ms/step
30.tif
1/1 [==============================] - 0s 35ms/step
1055.tif
1/1 [==============================] - 0s 34ms/step
55.tif
1/1 [==============================] - 0s 36ms/step
555.tif
1/1 [==============================] - 0s 36ms/step
249.tif
1/1 [==============================] - 0s 35ms/step
198.tif
1/1 [==============================] - 0s 34ms/step
160.tif
1/1 [==============================] - 0s 34ms/step
19.tif
1/1 [==============================] - 0s 37ms/step
391.tif
1/1 [==============================] - 0s 37ms/step
1080.tif
1/1 [==============================] - 0s 36ms/step
27.tif
1/1 [==============================] - 0s 38ms/step
798.tif
1/1 [==============================] - 0s 37ms/step
921.tif
1/1 [==============================] - 0s 36ms/step
883.tif
1/1 [==============================] - 0s 34ms/step
852.tif
1/1 [==============================] - 0s 36ms/step
675.tif
1/1 [==============================] - 0s 36ms/step
410.tif
1/1 [==============================] - 0s 35ms/step
134.tif
1/1 [==============================] - 0s 38ms/step
323.tif
1/1 [==============================] - 0s 38ms/step
57.tif
1/1 [==============================] - 0s 36ms/step
885.tif
1/1 [==============================] - 0s 34ms/step
773.tif
1/1 [==============================] - 0s 38ms/step
835.tif
1/1 [==============================] - 0s 35ms/step
756.tif
1/1 [==============================] - 0s 34ms/step
1031.tif
1/1 [==============================] - 0s 30ms/step
539.tif
1/1 [==============================] - 0s 34ms/step
1183.tif
1/1 [==============================] - 0s 31ms/step
1127.tif
1/1 [==============================] - 0s 37ms/step
315.tif
1/1 [==============================] - 0s 34ms/step
872.tif
1/1 [==============================] - 0s 34ms/step
302.tif
1/1 [==============================] - 0s 35ms/step
922.tif
1/1 [==============================] - 0s 31ms/step
225.tif
1/1 [==============================] - 0s 32ms/step
1013.tif
1/1 [==============================] - 0s 30ms/step
392.tif
1/1 [==============================] - 0s 32ms/step
1124.tif
1/1 [==============================] - 0s 32ms/step
1065.tif
1/1 [==============================] - 0s 34ms/step
182.tif
1/1 [==============================] - 0s 35ms/step
898.tif
1/1 [==============================] - 0s 31ms/step
609.tif
1/1 [==============================] - 0s 34ms/step
67.tif
1/1 [==============================] - 0s 30ms/step
692.tif
1/1 [==============================] - 0s 32ms/step
1101.tif
1/1 [==============================] - 0s 31ms/step
66.tif
1/1 [==============================] - 0s 30ms/step
199.tif
1/1 [==============================] - 0s 31ms/step
600.tif
1/1 [==============================] - 0s 31ms/step
59.tif
1/1 [==============================] - 0s 33ms/step
175.tif
1/1 [==============================] - 0s 32ms/step
112.tif
1/1 [==============================] - 0s 29ms/step
69.tif
1/1 [==============================] - 0s 31ms/step
990.tif
1/1 [==============================] - 0s 29ms/step
695.tif
1/1 [==============================] - 0s 32ms/step
78.tif
1/1 [==============================] - 0s 28ms/step
1082.tif
1/1 [==============================] - 0s 28ms/step
216.tif
1/1 [==============================] - 0s 33ms/step
809.tif
1/1 [==============================] - 0s 32ms/step
1188.tif
1/1 [==============================] - 0s 30ms/step
998.tif
1/1 [==============================] - 0s 30ms/step
382.tif
1/1 [==============================] - 0s 29ms/step
1078.tif
1/1 [==============================] - 0s 29ms/step
161.tif
1/1 [==============================] - 0s 30ms/step
373.tif
1/1 [==============================] - 0s 31ms/step
859.tif
1/1 [==============================] - 0s 29ms/step
860.tif
1/1 [==============================] - 0s 33ms/step
1038.tif
1/1 [==============================] - 0s 31ms/step
620.tif
1/1 [==============================] - 0s 29ms/step
117.tif
1/1 [==============================] - 0s 29ms/step
432.tif
1/1 [==============================] - 0s 29ms/step
349.tif
1/1 [==============================] - 0s 31ms/step
821.tif
1/1 [==============================] - 0s 29ms/step
1117.tif
1/1 [==============================] - 0s 30ms/step
270.tif
1/1 [==============================] - 0s 32ms/step
1071.tif
1/1 [==============================] - 0s 31ms/step
491.tif
1/1 [==============================] - 0s 31ms/step
576.tif
1/1 [==============================] - 0s 29ms/step
817.tif
1/1 [==============================] - 0s 31ms/step
1009.tif
1/1 [==============================] - 0s 30ms/step
1171.tif
1/1 [==============================] - 0s 30ms/step
1132.tif
1/1 [==============================] - 0s 29ms/step
625.tif
1/1 [==============================] - 0s 31ms/step
532.tif
1/1 [==============================] - 0s 31ms/step
365.tif
1/1 [==============================] - 0s 32ms/step
748.tif
1/1 [==============================] - 0s 31ms/step
926.tif
1/1 [==============================] - 0s 29ms/step
806.tif
1/1 [==============================] - 0s 30ms/step
23.tif
1/1 [==============================] - 0s 33ms/step
62.tif
1/1 [==============================] - 0s 31ms/step
1081.tif
1/1 [==============================] - 0s 34ms/step
545.tif
1/1 [==============================] - 0s 35ms/step
1025.tif
1/1 [==============================] - 0s 32ms/step
188.tif
1/1 [==============================] - 0s 34ms/step
454.tif
1/1 [==============================] - 0s 32ms/step
1022.tif
1/1 [==============================] - 0s 31ms/step
196.tif
1/1 [==============================] - 0s 29ms/step
74.tif
1/1 [==============================] - 0s 29ms/step
663.tif
1/1 [==============================] - 0s 32ms/step
272.tif
1/1 [==============================] - 0s 31ms/step
907.tif
1/1 [==============================] - 0s 30ms/step
954.tif
1/1 [==============================] - 0s 31ms/step
541.tif
1/1 [==============================] - 0s 30ms/step
682.tif
1/1 [==============================] - 0s 31ms/step
118.tif
1/1 [==============================] - 0s 30ms/step
1001.tif
1/1 [==============================] - 0s 33ms/step
506.tif
1/1 [==============================] - 0s 30ms/step
473.tif
1/1 [==============================] - 0s 33ms/step
778.tif
1/1 [==============================] - 0s 31ms/step
970.tif
1/1 [==============================] - 0s 30ms/step
673.tif
1/1 [==============================] - 0s 32ms/step
1058.tif
1/1 [==============================] - 0s 33ms/step
1168.tif
1/1 [==============================] - 0s 34ms/step
1121.tif
1/1 [==============================] - 0s 32ms/step
818.tif
1/1 [==============================] - 0s 31ms/step
1029.tif
1/1 [==============================] - 0s 32ms/step
399.tif
1/1 [==============================] - 0s 33ms/step
403.tif
1/1 [==============================] - 0s 32ms/step
960.tif
1/1 [==============================] - 0s 32ms/step
995.tif
1/1 [==============================] - 0s 31ms/step
914.tif
1/1 [==============================] - 0s 31ms/step
308.tif
1/1 [==============================] - 0s 30ms/step
1141.tif
1/1 [==============================] - 0s 30ms/step
871.tif
1/1 [==============================] - 0s 30ms/step
157.tif
1/1 [==============================] - 0s 28ms/step
111.tif
1/1 [==============================] - 0s 30ms/step
700.tif
1/1 [==============================] - 0s 30ms/step
794.tif
1/1 [==============================] - 0s 30ms/step
877.tif
1/1 [==============================] - 0s 29ms/step
992.tif
1/1 [==============================] - 0s 31ms/step
233.tif
1/1 [==============================] - 0s 29ms/step
352.tif
1/1 [==============================] - 0s 30ms/step
430.tif
1/1 [==============================] - 0s 30ms/step
525.tif
1/1 [==============================] - 0s 30ms/step
350.tif
1/1 [==============================] - 0s 31ms/step
1074.tif
1/1 [==============================] - 0s 30ms/step
324.tif
1/1 [==============================] - 0s 31ms/step
612.tif
1/1 [==============================] - 0s 31ms/step
671.tif
1/1 [==============================] - 0s 30ms/step
889.tif
1/1 [==============================] - 0s 31ms/step
660.tif
1/1 [==============================] - 0s 31ms/step
379.tif
1/1 [==============================] - 0s 31ms/step
12.tif
1/1 [==============================] - 0s 30ms/step
1152.tif
1/1 [==============================] - 0s 29ms/step
1113.tif
1/1 [==============================] - 0s 31ms/step
556.tif
1/1 [==============================] - 0s 30ms/step
585.tif
1/1 [==============================] - 0s 34ms/step
991.tif
1/1 [==============================] - 0s 31ms/step
232.tif
1/1 [==============================] - 0s 31ms/step
510.tif
1/1 [==============================] - 0s 32ms/step
356.tif
1/1 [==============================] - 0s 31ms/step
346.tif
1/1 [==============================] - 0s 31ms/step
427.tif
1/1 [==============================] - 0s 30ms/step
567.tif
1/1 [==============================] - 0s 31ms/step
925.tif
1/1 [==============================] - 0s 31ms/step
738.tif
1/1 [==============================] - 0s 33ms/step
1110.tif
1/1 [==============================] - 0s 32ms/step
337.tif
1/1 [==============================] - 0s 32ms/step
930.tif
1/1 [==============================] - 0s 31ms/step
976.tif
1/1 [==============================] - 0s 30ms/step
542.tif
1/1 [==============================] - 0s 31ms/step
1190.tif
1/1 [==============================] - 0s 32ms/step
284.tif
1/1 [==============================] - 0s 34ms/step
5.tif
1/1 [==============================] - 0s 33ms/step
370.tif
1/1 [==============================] - 0s 34ms/step
231.tif
1/1 [==============================] - 0s 33ms/step
1167.tif
1/1 [==============================] - 0s 32ms/step
716.tif
1/1 [==============================] - 0s 31ms/step
560.tif
1/1 [==============================] - 0s 31ms/step
694.tif
1/1 [==============================] - 0s 31ms/step
578.tif
1/1 [==============================] - 0s 31ms/step
63.tif
1/1 [==============================] - 0s 30ms/step
446.tif
1/1 [==============================] - 0s 32ms/step
966.tif
1/1 [==============================] - 0s 34ms/step
605.tif
1/1 [==============================] - 0s 30ms/step
11.tif
1/1 [==============================] - 0s 31ms/step
96.tif
1/1 [==============================] - 0s 30ms/step
194.tif
1/1 [==============================] - 0s 30ms/step
83.tif
1/1 [==============================] - 0s 29ms/step
110.tif
1/1 [==============================] - 0s 30ms/step
935.tif
1/1 [==============================] - 0s 30ms/step
191.tif
1/1 [==============================] - 0s 28ms/step
1115.tif
1/1 [==============================] - 0s 31ms/step
155.tif
1/1 [==============================] - 0s 29ms/step
201.tif
1/1 [==============================] - 0s 30ms/step
751.tif
1/1 [==============================] - 0s 29ms/step
470.tif
1/1 [==============================] - 0s 31ms/step
1012.tif
1/1 [==============================] - 0s 30ms/step
706.tif
1/1 [==============================] - 0s 31ms/step
51.tif
1/1 [==============================] - 0s 33ms/step
64.tif
1/1 [==============================] - 0s 31ms/step
875.tif
1/1 [==============================] - 0s 32ms/step
277.tif
1/1 [==============================] - 0s 33ms/step
699.tif
1/1 [==============================] - 0s 30ms/step
701.tif
1/1 [==============================] - 0s 31ms/step
783.tif
1/1 [==============================] - 0s 32ms/step
565.tif
1/1 [==============================] - 0s 30ms/step
180.tif
1/1 [==============================] - 0s 30ms/step
575.tif
1/1 [==============================] - 0s 29ms/step
879.tif
1/1 [==============================] - 0s 29ms/step
890.tif
1/1 [==============================] - 0s 32ms/step
1133.tif
1/1 [==============================] - 0s 31ms/step
424.tif
1/1 [==============================] - 0s 30ms/step
674.tif
1/1 [==============================] - 0s 29ms/step
668.tif
1/1 [==============================] - 0s 31ms/step
183.tif
1/1 [==============================] - 0s 32ms/step
102.tif
1/1 [==============================] - 0s 30ms/step
1107.tif
1/1 [==============================] - 0s 31ms/step
887.tif
1/1 [==============================] - 0s 27ms/step
368.tif
1/1 [==============================] - 0s 32ms/step
421.tif
1/1 [==============================] - 0s 32ms/step
237.tif
1/1 [==============================] - 0s 31ms/step
1026.tif
1/1 [==============================] - 0s 30ms/step
833.tif
1/1 [==============================] - 0s 31ms/step
629.tif
1/1 [==============================] - 0s 33ms/step
495.tif
1/1 [==============================] - 0s 33ms/step
7.tif
1/1 [==============================] - 0s 34ms/step
334.tif
1/1 [==============================] - 0s 31ms/step
77.tif
1/1 [==============================] - 0s 32ms/step
381.tif
1/1 [==============================] - 0s 32ms/step
383.tif
1/1 [==============================] - 0s 31ms/step
146.tif
1/1 [==============================] - 0s 30ms/step
581.tif
1/1 [==============================] - 0s 29ms/step
1102.tif
1/1 [==============================] - 0s 31ms/step
31.tif
1/1 [==============================] - 0s 31ms/step
927.tif
1/1 [==============================] - 0s 32ms/step
113.tif
1/1 [==============================] - 0s 30ms/step
583.tif
1/1 [==============================] - 0s 34ms/step
226.tif
1/1 [==============================] - 0s 31ms/step
363.tif
1/1 [==============================] - 0s 34ms/step
934.tif
1/1 [==============================] - 0s 31ms/step
590.tif
1/1 [==============================] - 0s 31ms/step
533.tif
1/1 [==============================] - 0s 30ms/step
1105.tif
1/1 [==============================] - 0s 30ms/step
28.tif
1/1 [==============================] - 0s 32ms/step
1061.tif
1/1 [==============================] - 0s 33ms/step
44.tif
1/1 [==============================] - 0s 32ms/step
587.tif
1/1 [==============================] - 0s 31ms/step
1100.tif
1/1 [==============================] - 0s 30ms/step
297.tif
1/1 [==============================] - 0s 33ms/step
251.tif

After predicting the images, we will mosaic the resulting masks into a single .tif file:

In [ ]:
out_fp = r"/content/Pred_mosaic.tif"
In [ ]:
images_files = [f for f in os.listdir(path_exp)]
print(images_files)
['Pred_764.tif', 'Pred_1167.tif', 'Pred_434.tif', 'Pred_700.tif', 'Pred_79.tif', 'Pred_548.tif', 'Pred_778.tif', 'Pred_499.tif', 'Pred_199.tif', 'Pred_992.tif', 'Pred_506.tif', 'Pred_108.tif', 'Pred_186.tif', 'Pred_101.tif', 'Pred_797.tif', 'Pred_1021.tif', 'Pred_848.tif', 'Pred_396.tif', 'Pred_891.tif', 'Pred_52.tif', 'Pred_329.tif', 'Pred_30.tif', 'Pred_720.tif', 'Pred_675.tif', 'Pred_896.tif', 'Pred_1159.tif', 'Pred_602.tif', 'Pred_209.tif', 'Pred_1005.tif', 'Pred_4.tif', 'Pred_902.tif', 'Pred_243.tif', 'Pred_605.tif', 'Pred_776.tif', 'Pred_367.tif', 'Pred_683.tif', 'Pred_598.tif', 'Pred_1132.tif', 'Pred_159.tif', 'Pred_875.tif', 'Pred_348.tif', 'Pred_494.tif', 'Pred_571.tif', 'Pred_450.tif', 'Pred_238.tif', 'Pred_948.tif', 'Pred_76.tif', 'Pred_301.tif', 'Pred_333.tif', 'Pred_308.tif', 'Pred_435.tif', 'Pred_217.tif', 'Pred_206.tif', 'Pred_1170.tif', 'Pred_43.tif', 'Pred_211.tif', 'Pred_962.tif', 'Pred_922.tif', 'Pred_1099.tif', 'Pred_1139.tif', 'Pred_175.tif', 'Pred_654.tif', 'Pred_786.tif', 'Pred_193.tif', 'Pred_1047.tif', 'Pred_1066.tif', 'Pred_275.tif', 'Pred_770.tif', 'Pred_381.tif', 'Pred_714.tif', 'Pred_997.tif', 'Pred_472.tif', 'Pred_315.tif', 'Pred_459.tif', 'Pred_283.tif', 'Pred_181.tif', 'Pred_565.tif', 'Pred_97.tif', 'Pred_562.tif', 'Pred_58.tif', 'Pred_1084.tif', 'Pred_1008.tif', 'Pred_668.tif', 'Pred_7.tif', 'Pred_620.tif', 'Pred_856.tif', 'Pred_431.tif', 'Pred_595.tif', 'Pred_660.tif', 'Pred_1104.tif', 'Pred_735.tif', 'Pred_727.tif', 'Pred_1111.tif', 'Pred_733.tif', 'Pred_1086.tif', 'Pred_218.tif', 'Pred_954.tif', 'Pred_188.tif', 'Pred_252.tif', 'Pred_607.tif', 'Pred_1077.tif', 'Pred_807.tif', 'Pred_372.tif', 'Pred_959.tif', 'Pred_547.tif', 'Pred_1043.tif', 'Pred_1060.tif', 'Pred_905.tif', 'Pred_853.tif', 'Pred_798.tif', 'Pred_11.tif', 'Pred_642.tif', 'Pred_104.tif', 'Pred_385.tif', 'Pred_692.tif', 'Pred_302.tif', 'Pred_36.tif', 'Pred_1097.tif', 'Pred_204.tif', 'Pred_6.tif', 'Pred_328.tif', 'Pred_1071.tif', 'Pred_214.tif', 'Pred_443.tif', 'Pred_237.tif', 'Pred_1187.tif', 'Pred_71.tif', 'Pred_567.tif', 'Pred_260.tif', 'Pred_1127.tif', 'Pred_38.tif', 'Pred_317.tif', 'Pred_1017.tif', 'Pred_939.tif', 'Pred_1101.tif', 'Pred_1125.tif', 'Pred_89.tif', 'Pred_75.tif', 'Pred_1078.tif', 'Pred_643.tif', 'Pred_1087.tif', 'Pred_947.tif', 'Pred_336.tif', 'Pred_883.tif', 'Pred_201.tif', 'Pred_626.tif', 'Pred_284.tif', 'Pred_486.tif', 'Pred_130.tif', 'Pred_180.tif', 'Pred_326.tif', 'Pred_979.tif', 'Pred_657.tif', 'Pred_519.tif', 'Pred_1165.tif', 'Pred_305.tif', 'Pred_1175.tif', 'Pred_1025.tif', 'Pred_1026.tif', 'Pred_137.tif', 'Pred_531.tif', 'Pred_444.tif', 'Pred_1002.tif', 'Pred_511.tif', 'Pred_831.tif', 'Pred_1004.tif', 'Pred_324.tif', 'Pred_387.tif', 'Pred_49.tif', 'Pred_665.tif', 'Pred_1069.tif', 'Pred_170.tif', 'Pred_314.tif', 'Pred_887.tif', 'Pred_1093.tif', 'Pred_528.tif', 'Pred_1107.tif', 'Pred_244.tif', 'Pred_810.tif', 'Pred_409.tif', 'Pred_847.tif', 'Pred_1158.tif', 'Pred_1076.tif', 'Pred_92.tif', 'Pred_661.tif', 'Pred_771.tif', 'Pred_343.tif', 'Pred_155.tif', 'Pred_81.tif', 'Pred_1081.tif', 'Pred_357.tif', 'Pred_374.tif', 'Pred_710.tif', 'Pred_1146.tif', 'Pred_57.tif', 'Pred_977.tif', 'Pred_876.tif', 'Pred_1041.tif', 'Pred_677.tif', 'Pred_757.tif', 'Pred_802.tif', 'Pred_158.tif', 'Pred_351.tif', 'Pred_232.tif', 'Pred_588.tif', 'Pred_345.tif', 'Pred_1029.tif', 'Pred_882.tif', 'Pred_207.tif', 'Pred_1044.tif', 'Pred_1122.tif', 'Pred_22.tif', 'Pred_93.tif', 'Pred_940.tif', 'Pred_832.tif', 'Pred_26.tif', 'Pred_917.tif', 'Pred_1128.tif', 'Pred_334.tif', 'Pred_774.tif', 'Pred_629.tif', 'Pred_975.tif', 'Pred_412.tif', 'Pred_213.tif', 'Pred_391.tif', 'Pred_291.tif', 'Pred_919.tif', 'Pred_895.tif', 'Pred_280.tif', 'Pred_355.tif', 'Pred_156.tif', 'Pred_577.tif', 'Pred_952.tif', 'Pred_316.tif', 'Pred_811.tif', 'Pred_795.tif', 'Pred_480.tif', 'Pred_1068.tif', 'Pred_191.tif', 'Pred_1194.tif', 'Pred_383.tif', 'Pred_742.tif', 'Pred_1006.tif', 'Pred_1137.tif', 'Pred_649.tif', 'Pred_380.tif', 'Pred_1112.tif', 'Pred_1094.tif', 'Pred_98.tif', 'Pred_747.tif', 'Pred_352.tif', 'Pred_56.tif', 'Pred_906.tif', 'Pred_1126.tif', 'Pred_344.tif', 'Pred_737.tif', 'Pred_163.tif', 'Pred_945.tif', 'Pred_769.tif', 'Pred_857.tif', 'Pred_151.tif', 'Pred_231.tif', 'Pred_95.tif', 'Pred_716.tif', 'Pred_559.tif', 'Pred_871.tif', 'Pred_1033.tif', 'Pred_542.tif', 'Pred_1143.tif', 'Pred_942.tif', 'Pred_676.tif', 'Pred_78.tif', 'Pred_473.tif', 'Pred_179.tif', 'Pred_982.tif', 'Pred_1049.tif', 'Pred_701.tif', 'Pred_510.tif', 'Pred_814.tif', 'Pred_698.tif', 'Pred_481.tif', 'Pred_221.tif', 'Pred_17.tif', 'Pred_845.tif', 'Pred_1031.tif', 'Pred_581.tif', 'Pred_637.tif', 'Pred_1103.tif', 'Pred_178.tif', 'Pred_1196.tif', 'Pred_622.tif', 'Pred_288.tif', 'Pred_161.tif', 'Pred_594.tif', 'Pred_624.tif', 'Pred_378.tif', 'Pred_680.tif', 'Pred_1042.tif', 'Pred_775.tif', 'Pred_1151.tif', 'Pred_1062.tif', 'Pred_907.tif', 'Pred_985.tif', 'Pred_756.tif', 'Pred_402.tif', 'Pred_938.tif', 'Pred_840.tif', 'Pred_177.tif', 'Pred_167.tif', 'Pred_860.tif', 'Pred_1193.tif', 'Pred_1003.tif', 'Pred_825.tif', 'Pred_564.tif', 'Pred_1142.tif', 'Pred_1110.tif', 'Pred_909.tif', 'Pred_709.tif', 'Pred_41.tif', 'Pred_646.tif', 'Pred_503.tif', 'Pred_169.tif', 'Pred_353.tif', 'Pred_1035.tif', 'Pred_560.tif', 'Pred_592.tif', 'Pred_446.tif', 'Pred_298.tif', 'Pred_644.tif', 'Pred_392.tif', 'Pred_1116.tif', 'Pred_358.tif', 'Pred_399.tif', 'Pred_445.tif', 'Pred_870.tif', 'Pred_530.tif', 'Pred_1130.tif', 'Pred_1015.tif', 'Pred_142.tif', 'Pred_339.tif', 'Pred_785.tif', 'Pred_46.tif', 'Pred_479.tif', 'Pred_1024.tif', 'Pred_768.tif', 'Pred_838.tif', 'Pred_262.tif', 'Pred_1184.tif', 'Pred_690.tif', 'Pred_971.tif', 'Pred_73.tif', 'Pred_63.tif', 'Pred_449.tif', 'Pred_1176.tif', 'Pred_617.tif', 'Pred_684.tif', 'Pred_761.tif', 'Pred_1095.tif', 'Pred_133.tif', 'Pred_613.tif', 'Pred_890.tif', 'Pred_570.tif', 'Pred_363.tif', 'Pred_576.tif', 'Pred_109.tif', 'Pred_833.tif', 'Pred_421.tif', 'Pred_85.tif', 'Pred_240.tif', 'Pred_388.tif', 'Pred_184.tif', 'Pred_422.tif', 'Pred_726.tif', 'Pred_1067.tif', 'Pred_223.tif', 'Pred_937.tif', 'Pred_212.tif', 'Pred_1046.tif', 'Pred_405.tif', 'Pred_470.tif', 'Pred_332.tif', 'Pred_1007.tif', 'Pred_498.tif', 'Pred_767.tif', 'Pred_903.tif', 'Pred_323.tif', 'Pred_1091.tif', 'Pred_50.tif', 'Pred_976.tif', 'Pred_981.tif', 'Pred_8.tif', 'Pred_719.tif', 'Pred_292.tif', 'Pred_805.tif', 'Pred_877.tif', 'Pred_651.tif', 'Pred_844.tif', 'Pred_475.tif', 'Pred_689.tif', 'Pred_911.tif', 'Pred_648.tif', 'Pred_1070.tif', 'Pred_257.tif', 'Pred_222.tif', 'Pred_471.tif', 'Pred_812.tif', 'Pred_557.tif', 'Pred_921.tif', 'Pred_965.tif', 'Pred_718.tif', 'Pred_509.tif', 'Pred_256.tif', 'Pred_604.tif', 'Pred_1182.tif', 'Pred_712.tif', 'Pred_616.tif', 'Pred_115.tif', 'Pred_39.tif', 'Pred_1140.tif', 'Pred_210.tif', 'Pred_67.tif', 'Pred_194.tif', 'Pred_968.tif', 'Pred_462.tif', 'Pred_105.tif', 'Pred_1064.tif', 'Pred_293.tif', 'Pred_368.tif', 'Pred_241.tif', 'Pred_666.tif', 'Pred_932.tif', 'Pred_609.tif', 'Pred_303.tif', 'Pred_707.tif', 'Pred_414.tif', 'Pred_70.tif', 'Pred_566.tif', 'Pred_1038.tif', 'Pred_14.tif', 'Pred_730.tif', 'Pred_474.tif', 'Pred_881.tif', 'Pred_615.tif', 'Pred_281.tif', 'Pred_360.tif', 'Pred_880.tif', 'Pred_885.tif', 'Pred_23.tif', 'Pred_1032.tif', 'Pred_145.tif', 'Pred_94.tif', 'Pred_1148.tif', 'Pred_523.tif', 'Pred_37.tif', 'Pred_998.tif', 'Pred_107.tif', 'Pred_197.tif', 'Pred_362.tif', 'Pred_134.tif', 'Pred_868.tif', 'Pred_174.tif', 'Pred_1100.tif', 'Pred_233.tif', 'Pred_321.tif', 'Pred_171.tif', 'Pred_960.tif', 'Pred_1166.tif', 'Pred_185.tif', 'Pred_116.tif', 'Pred_653.tif', 'Pred_550.tif', 'Pred_970.tif', 'Pred_157.tif', 'Pred_872.tif', 'Pred_740.tif', 'Pred_924.tif', 'Pred_48.tif', 'Pred_790.tif', 'Pred_394.tif', 'Pred_678.tif', 'Pred_889.tif', 'Pred_861.tif', 'Pred_27.tif', 'Pred_627.tif', 'Pred_248.tif', 'Pred_899.tif', 'Pred_68.tif', 'Pred_482.tif', 'Pred_575.tif', 'Pred_1022.tif', 'Pred_647.tif', 'Pred_1051.tif', 'Pred_621.tif', 'Pred_1109.tif', 'Pred_850.tif', 'Pred_195.tif', 'Pred_277.tif', 'Pred_1135.tif', 'Pred_454.tif', 'Pred_419.tif', 'Pred_664.tif', 'Pred_113.tif', 'Pred_346.tif', 'Pred_330.tif', 'Pred_610.tif', 'Pred_507.tif', 'Pred_1102.tif', 'Pred_751.tif', 'Pred_1136.tif', 'Pred_120.tif', 'Pred_420.tif', 'Pred_813.tif', 'Pred_236.tif', 'Pred_760.tif', 'Pred_477.tif', 'Pred_650.tif', 'Pred_512.tif', 'Pred_583.tif', 'Pred_640.tif', 'Pred_219.tif', 'Pred_572.tif', 'Pred_708.tif', 'Pred_693.tif', 'Pred_127.tif', 'Pred_569.tif', 'Pred_100.tif', 'Pred_397.tif', 'Pred_254.tif', 'Pred_794.tif', 'Pred_492.tif', 'Pred_725.tif', 'Pred_438.tif', 'Pred_1152.tif', 'Pred_943.tif', 'Pred_376.tif', 'Pred_466.tif', 'Pred_62.tif', 'Pred_74.tif', 'Pred_717.tif', 'Pred_131.tif', 'Pred_670.tif', 'Pred_913.tif', 'Pred_271.tif', 'Pred_554.tif', 'Pred_1061.tif', 'Pred_705.tif', 'Pred_593.tif', 'Pred_432.tif', 'Pred_715.tif', 'Pred_544.tif', 'Pred_373.tif', 'Pred_1108.tif', 'Pred_601.tif', 'Pred_173.tif', 'Pred_1083.tif', 'Pred_777.tif', 'Pred_656.tif', 'Pred_920.tif', 'Pred_110.tif', 'Pred_129.tif', 'Pred_1012.tif', 'Pred_748.tif', 'Pred_928.tif', 'Pred_858.tif', 'Pred_263.tif', 'Pred_390.tif', 'Pred_585.tif', 'Pred_1089.tif', 'Pred_1106.tif', 'Pred_524.tif', 'Pred_1115.tif', 'Pred_746.tif', 'Pred_864.tif', 'Pred_1195.tif', 'Pred_673.tif', 'Pred_1096.tif', 'Pred_561.tif', 'Pred_28.tif', 'Pred_659.tif', 'Pred_830.tif', 'Pred_436.tif', 'Pred_623.tif', 'Pred_51.tif', 'Pred_1162.tif', 'Pred_13.tif', 'Pred_999.tif', 'Pred_389.tif', 'Pred_160.tif', 'Pred_205.tif', 'Pred_753.tif', 'Pred_842.tif', 'Pred_953.tif', 'Pred_285.tif', 'Pred_991.tif', 'Pred_582.tif', 'Pred_546.tif', 'Pred_884.tif', 'Pred_279.tif', 'Pred_1186.tif', 'Pred_722.tif', 'Pred_702.tif', 'Pred_259.tif', 'Pred_525.tif', 'Pred_310.tif', 'Pred_153.tif', 'Pred_750.tif', 'Pred_888.tif', 'Pred_269.tif', 'Pred_780.tif', 'Pred_619.tif', 'Pred_215.tif', 'Pred_251.tif', 'Pred_1082.tif', 'Pred_1034.tif', 'Pred_1.tif', 'Pred_1059.tif', 'Pred_534.tif', 'Pred_914.tif', 'Pred_611.tif', 'Pred_741.tif', 'Pred_228.tif', 'Pred_941.tif', 'Pred_297.tif', 'Pred_736.tif', 'Pred_433.tif', 'Pred_817.tif', 'Pred_927.tif', 'Pred_824.tif', 'Pred_923.tif', 'Pred_1131.tif', 'Pred_496.tif', 'Pred_628.tif', 'Pred_983.tif', 'Pred_247.tif', 'Pred_823.tif', 'Pred_1141.tif', 'Pred_21.tif', 'Pred_239.tif', 'Pred_865.tif', 'Pred_1121.tif', 'Pred_90.tif', 'Pred_487.tif', 'Pred_1014.tif', 'Pred_723.tif', 'Pred_497.tif', 'Pred_461.tif', 'Pred_949.tif', 'Pred_290.tif', 'Pred_258.tif', 'Pred_441.tif', 'Pred_448.tif', 'Pred_416.tif', 'Pred_468.tif', 'Pred_2.tif', 'Pred_580.tif', 'Pred_273.tif', 'Pred_1117.tif', 'Pred_354.tif', 'Pred_234.tif', 'Pred_453.tif', 'Pred_452.tif', 'Pred_967.tif', 'Pred_77.tif', 'Pred_442.tif', 'Pred_984.tif', 'Pred_573.tif', 'Pred_772.tif', 'Pred_309.tif', 'Pred_1118.tif', 'Pred_359.tif', 'Pred_32.tif', 'Pred_1191.tif', 'Pred_168.tif', 'Pred_340.tif', 'Pred_1088.tif', 'Pred_669.tif', 'Pred_307.tif', 'Pred_294.tif', 'Pred_143.tif', 'Pred_809.tif', 'Pred_410.tif', 'Pred_1023.tif', 'Pred_974.tif', 'Pred_456.tif', 'Pred_1113.tif', 'Pred_1145.tif', 'Pred_1055.tif', 'Pred_1197.tif', 'Pred_563.tif', 'Pred_900.tif', 'Pred_393.tif', 'Pred_791.tif', 'Pred_765.tif', 'Pred_551.tif', 'Pred_729.tif', 'Pred_516.tif', 'Pred_543.tif', 'Pred_216.tif', 'Pred_490.tif', 'Pred_176.tif', 'Pred_1074.tif', 'Pred_224.tif', 'Pred_54.tif', 'Pred_483.tif', 'Pred_458.tif', 'Pred_1153.tif', 'Pred_146.tif', 'Pred_755.tif', 'Pred_738.tif', 'Pred_936.tif', 'Pred_264.tif', 'Pred_545.tif', 'Pred_1179.tif', 'Pred_792.tif', 'Pred_987.tif', 'Pred_555.tif', 'Pred_655.tif', 'Pred_91.tif', 'Pred_955.tif', 'Pred_371.tif', 'Pred_578.tif', 'Pred_514.tif', 'Pred_829.tif', 'Pred_931.tif', 'Pred_467.tif', 'Pred_916.tif', 'Pred_1085.tif', 'Pred_128.tif', 'Pred_926.tif', 'Pred_189.tif', 'Pred_908.tif', 'Pred_682.tif', 'Pred_632.tif', 'Pred_591.tif', 'Pred_1027.tif', 'Pred_679.tif', 'Pred_1147.tif', 'Pred_1129.tif', 'Pred_1016.tif', 'Pred_721.tif', 'Pred_20.tif', 'Pred_1144.tif', 'Pred_935.tif', 'Pred_915.tif', 'Pred_522.tif', 'Pred_556.tif', 'Pred_950.tif', 'Pred_667.tif', 'Pred_784.tif', 'Pred_766.tif', 'Pred_752.tif', 'Pred_235.tif', 'Pred_10.tif', 'Pred_304.tif', 'Pred_638.tif', 'Pred_154.tif', 'Pred_119.tif', 'Pred_44.tif', 'Pred_671.tif', 'Pred_144.tif', 'Pred_366.tif', 'Pred_1119.tif', 'Pred_873.tif', 'Pred_782.tif', 'Pred_521.tif', 'Pred_834.tif', 'Pred_691.tif', 'Pred_403.tif', 'Pred_1053.tif', 'Pred_370.tif', 'Pred_964.tif', 'Pred_963.tif', 'Pred_815.tif', 'Pred_584.tif', 'Pred_440.tif', 'Pred_65.tif', 'Pred_672.tif', 'Pred_1190.tif', 'Pred_526.tif', 'Pred_322.tif', 'Pred_377.tif', 'Pred_347.tif', 'Pred_1058.tif', 'Pred_325.tif', 'Pred_426.tif', 'Pred_633.tif', 'Pred_369.tif', 'Pred_816.tif', 'Pred_1172.tif', 'Pred_597.tif', 'Pred_493.tif', 'Pred_451.tif', 'Pred_411.tif', 'Pred_688.tif', 'Pred_596.tif', 'Pred_331.tif', 'Pred_406.tif', 'Pred_196.tif', 'Pred_464.tif', 'Pred_739.tif', 'Pred_779.tif', 'Pred_783.tif', 'Pred_711.tif', 'Pred_150.tif', 'Pred_886.tif', 'Pred_66.tif', 'Pred_1098.tif', 'Pred_463.tif', 'Pred_614.tif', 'Pred_268.tif', 'Pred_1105.tif', 'Pred_246.tif', 'Pred_957.tif', 'Pred_139.tif', 'Pred_540.tif', 'Pred_80.tif', 'Pred_1114.tif', 'Pred_455.tif', 'Pred_365.tif', 'Pred_852.tif', 'Pred_425.tif', 'Pred_427.tif', 'Pred_350.tif', 'Pred_972.tif', 'Pred_508.tif', 'Pred_428.tif', 'Pred_1037.tif', 'Pred_773.tif', 'Pred_400.tif', 'Pred_686.tif', 'Pred_841.tif', 'Pred_491.tif', 'Pred_1073.tif', 'Pred_1056.tif', 'Pred_762.tif', 'Pred_1185.tif', 'Pred_759.tif', 'Pred_86.tif', 'Pred_296.tif', 'Pred_800.tif', 'Pred_460.tif', 'Pred_1168.tif', 'Pred_117.tif', 'Pred_1011.tif', 'Pred_532.tif', 'Pred_634.tif', 'Pred_136.tif', 'Pred_123.tif', 'Pred_1177.tif', 'Pred_513.tif', 'Pred_587.tif', 'Pred_415.tif', 'Pred_31.tif', 'Pred_879.tif', 'Pred_863.tif', 'Pred_1163.tif', 'Pred_162.tif', 'Pred_515.tif', 'Pred_261.tif', 'Pred_1010.tif', 'Pred_541.tif', 'Pred_249.tif', 'Pred_138.tif', 'Pred_318.tif', 'Pred_395.tif', 'Pred_1001.tif', 'Pred_703.tif', 'Pred_695.tif', 'Pred_34.tif', 'Pred_837.tif', 'Pred_111.tif', 'Pred_132.tif', 'Pred_855.tif', 'Pred_1048.tif', 'Pred_652.tif', 'Pred_118.tif', 'Pred_505.tif', 'Pred_788.tif', 'Pred_478.tif', 'Pred_1028.tif', 'Pred_404.tif', 'Pred_439.tif', 'Pred_135.tif', 'Pred_29.tif', 'Pred_645.tif', 'Pred_287.tif', 'Pred_230.tif', 'Pred_951.tif', 'Pred_694.tif', 'Pred_187.tif', 'Pred_821.tif', 'Pred_99.tif', 'Pred_5.tif', 'Pred_295.tif', 'Pred_579.tif', 'Pred_1040.tif', 'Pred_1188.tif', 'Pred_42.tif', 'Pred_1164.tif', 'Pred_1123.tif', 'Pred_253.tif', 'Pred_19.tif', 'Pred_418.tif', 'Pred_836.tif', 'Pred_106.tif', 'Pred_337.tif', 'Pred_894.tif', 'Pred_745.tif', 'Pred_141.tif', 'Pred_993.tif', 'Pred_12.tif', 'Pred_929.tif', 'Pred_172.tif', 'Pred_978.tif', 'Pred_874.tif', 'Pred_862.tif', 'Pred_122.tif', 'Pred_818.tif', 'Pred_33.tif', 'Pred_1018.tif', 'Pred_386.tif', 'Pred_276.tif', 'Pred_18.tif', 'Pred_286.tif', 'Pred_988.tif', 'Pred_229.tif', 'Pred_45.tif', 'Pred_986.tif', 'Pred_457.tif', 'Pred_849.tif', 'Pred_59.tif', 'Pred_1092.tif', 'Pred_148.tif', 'Pred_1050.tif', 'Pred_1054.tif', 'Pred_164.tif', 'Pred_1156.tif', 'Pred_1039.tif', 'Pred_535.tif', 'Pred_728.tif', 'Pred_69.tif', 'Pred_867.tif', 'Pred_600.tif', 'Pred_447.tif', 'Pred_25.tif', 'Pred_501.tif', 'Pred_892.tif', 'Pred_586.tif', 'Pred_533.tif', 'Pred_437.tif', 'Pred_124.tif', 'Pred_1020.tif', 'Pred_398.tif', 'Pred_539.tif', 'Pred_147.tif', 'Pred_379.tif', 'Pred_697.tif', 'Pred_1155.tif', 'Pred_413.tif', 'Pred_208.tif', 'Pred_744.tif', 'Pred_828.tif', 'Pred_826.tif', 'Pred_558.tif', 'Pred_1013.tif', 'Pred_429.tif', 'Pred_635.tif', 'Pred_401.tif', 'Pred_361.tif', 'Pred_9.tif', 'Pred_84.tif', 'Pred_96.tif', 'Pred_82.tif', 'Pred_819.tif', 'Pred_356.tif', 'Pred_267.tif', 'Pred_83.tif', 'Pred_946.tif', 'Pred_226.tif', 'Pred_1072.tif', 'Pred_1154.tif', 'Pred_270.tif', 'Pred_658.tif', 'Pred_489.tif', 'Pred_342.tif', 'Pred_192.tif', 'Pred_625.tif', 'Pred_24.tif', 'Pred_202.tif', 'Pred_1150.tif', 'Pred_274.tif', 'Pred_55.tif', 'Pred_568.tif', 'Pred_1057.tif', 'Pred_1189.tif', 'Pred_88.tif', 'Pred_502.tif', 'Pred_804.tif', 'Pred_1052.tif', 'Pred_704.tif', 'Pred_973.tif', 'Pred_320.tif', 'Pred_60.tif', 'Pred_912.tif', 'Pred_612.tif', 'Pred_72.tif', 'Pred_47.tif', 'Pred_713.tif', 'Pred_200.tif', 'Pred_327.tif', 'Pred_1181.tif', 'Pred_408.tif', 'Pred_839.tif', 'Pred_674.tif', 'Pred_1138.tif', 'Pred_384.tif', 'Pred_15.tif', 'Pred_382.tif', 'Pred_618.tif', 'Pred_149.tif', 'Pred_599.tif', 'Pred_787.tif', 'Pred_1171.tif', 'Pred_910.tif', 'Pred_995.tif', 'Pred_282.tif', 'Pred_904.tif', 'Pred_846.tif', 'Pred_1174.tif', 'Pred_182.tif', 'Pred_958.tif', 'Pred_574.tif', 'Pred_495.tif', 'Pred_793.tif', 'Pred_1149.tif', 'Pred_520.tif', 'Pred_527.tif', 'Pred_430.tif', 'Pred_590.tif', 'Pred_537.tif', 'Pred_469.tif', 'Pred_1134.tif', 'Pred_1133.tif', 'Pred_1124.tif', 'Pred_808.tif', 'Pred_220.tif', 'Pred_1178.tif', 'Pred_529.tif', 'Pred_706.tif', 'Pred_758.tif', 'Pred_803.tif', 'Pred_64.tif', 'Pred_126.tif', 'Pred_3.tif', 'Pred_278.tif', 'Pred_165.tif', 'Pred_465.tif', 'Pred_1183.tif', 'Pred_866.tif', 'Pred_754.tif', 'Pred_606.tif', 'Pred_306.tif', 'Pred_225.tif', 'Pred_636.tif', 'Pred_121.tif', 'Pred_1157.tif', 'Pred_1045.tif', 'Pred_87.tif', 'Pred_796.tif', 'Pred_1173.tif', 'Pred_289.tif', 'Pred_630.tif', 'Pred_1030.tif', 'Pred_300.tif', 'Pred_552.tif', 'Pred_724.tif', 'Pred_994.tif', 'Pred_734.tif', 'Pred_476.tif', 'Pred_934.tif', 'Pred_250.tif', 'Pred_944.tif', 'Pred_835.tif', 'Pred_272.tif', 'Pred_801.tif', 'Pred_227.tif', 'Pred_898.tif', 'Pred_140.tif', 'Pred_245.tif', 'Pred_822.tif', 'Pred_53.tif', 'Pred_1160.tif', 'Pred_990.tif', 'Pred_820.tif', 'Pred_423.tif', 'Pred_925.tif', 'Pred_731.tif', 'Pred_859.tif', 'Pred_1161.tif', 'Pred_685.tif', 'Pred_319.tif', 'Pred_699.tif', 'Pred_966.tif', 'Pred_827.tif', 'Pred_1120.tif', 'Pred_255.tif', 'Pred_696.tif', 'Pred_639.tif', 'Pred_1063.tif', 'Pred_485.tif', 'Pred_424.tif', 'Pred_538.tif', 'Pred_166.tif', 'Pred_265.tif', 'Pred_851.tif', 'Pred_749.tif', 'Pred_781.tif', 'Pred_996.tif', 'Pred_854.tif', 'Pred_1090.tif', 'Pred_312.tif', 'Pred_806.tif', 'Pred_518.tif', 'Pred_313.tif', 'Pred_349.tif', 'Pred_989.tif', 'Pred_417.tif', 'Pred_311.tif', 'Pred_980.tif', 'Pred_335.tif', 'Pred_102.tif', 'Pred_125.tif', 'Pred_61.tif', 'Pred_603.tif', 'Pred_484.tif', 'Pred_1079.tif', 'Pred_799.tif', 'Pred_553.tif', 'Pred_930.tif', 'Pred_687.tif', 'Pred_893.tif', 'Pred_763.tif', 'Pred_536.tif', 'Pred_242.tif', 'Pred_843.tif', 'Pred_40.tif', 'Pred_517.tif', 'Pred_504.tif', 'Pred_681.tif', 'Pred_901.tif', 'Pred_407.tif', 'Pred_1019.tif', 'Pred_1009.tif', 'Pred_961.tif', 'Pred_956.tif', 'Pred_663.tif', 'Pred_869.tif', 'Pred_190.tif', 'Pred_933.tif', 'Pred_969.tif', 'Pred_608.tif', 'Pred_364.tif', 'Pred_1080.tif', 'Pred_1075.tif', 'Pred_338.tif', 'Pred_1000.tif', 'Pred_203.tif', 'Pred_589.tif', 'Pred_103.tif', 'Pred_1198.tif', 'Pred_743.tif', 'Pred_152.tif', 'Pred_488.tif', 'Pred_662.tif', 'Pred_1065.tif', 'Pred_789.tif', 'Pred_114.tif', 'Pred_897.tif', 'Pred_1192.tif', 'Pred_35.tif', 'Pred_1036.tif', 'Pred_375.tif', 'Pred_112.tif', 'Pred_16.tif', 'Pred_918.tif', 'Pred_183.tif', 'Pred_500.tif', 'Pred_266.tif', 'Pred_549.tif', 'Pred_299.tif', 'Pred_1169.tif', 'Pred_198.tif', 'Pred_341.tif', 'Pred_1180.tif', 'Pred_631.tif', 'Pred_641.tif', 'Pred_878.tif', 'Pred_732.tif']
In [ ]:
src_files_to_mosaic = []
for fp in images_files:
  src = rasterio.open(os.path.join(path_exp,fp))
  src_files_to_mosaic.append(src)

We use the rasterio merge function to join all the masks and save the result:

In [ ]:
mosaic, out_trans = merge(src_files_to_mosaic)
In [ ]:
out_meta.update({"driver": "GTiff",
                 "height": mosaic.shape[1],
                 "width": mosaic.shape[2],
                 "transform": out_trans,
                 "compress":'lzw'})
In [ ]:
with rasterio.open(out_fp, "w", **out_meta) as dest:
    dest.write(mosaic)
In [ ]:
predic_orto = rasterio.open('/content/Pred_mosaic.tif')
In [ ]:
pred_img_orto = predic_orto.read(1)
In [ ]:
plt.figure(figsize=[16,16])
plt.imshow(pred_img_orto)
plt.axis('off')
Out[ ]:
(-0.5, 22015.5, 21503.5, -0.5)
No description has been provided for this image

Let's then convert the binary image to a vector file by transforming each area defined as a built area into a polygon:

In [ ]:
from rasterio.features import shapes
from shapely.geometry import shape
In [ ]:
shape_gen = ((shape(s), v) for s, v in shapes(pred_img_orto, mask=pred_img_orto, transform=predic_orto.transform))
In [ ]:
Poly_gdf = gpd.GeoDataFrame(dict(zip(["geometry", "class"], zip(*shape_gen))), crs=predic_orto.crs)
In [ ]:
Poly_gdf
Out[ ]:
geometry class
0 POLYGON ((302697.176 700533.202, 302697.386 70... 1.0
1 POLYGON ((302706.290 700526.692, 302706.290 70... 1.0
2 POLYGON ((302711.876 700520.938, 302711.876 70... 1.0
3 POLYGON ((302709.776 700518.796, 302709.776 70... 1.0
4 POLYGON ((302709.482 700517.284, 302709.482 70... 1.0
... ... ...
13569 POLYGON ((302490.704 699631.546, 302490.830 69... 1.0
13570 POLYGON ((302501.876 699631.378, 302501.876 69... 1.0
13571 POLYGON ((302491.922 699634.990, 302492.006 69... 1.0
13572 POLYGON ((302511.158 699637.258, 302511.200 69... 1.0
13573 POLYGON ((302513.678 699630.958, 302513.678 69... 1.0

13574 rows × 2 columns

We can filter some polygons with an area smaller than 1 square meter to remove noise.

In [ ]:
Poly_gdf_filtred = Poly_gdf[Poly_gdf.area > 1].copy()

Finally, we plot the result and save it in a json:

In [ ]:
fig, ax = plt.subplots(figsize=(14, 14))
with rasterio.open(path_img_to_pred) as src:
    gdf = Poly_gdf_filtred.to_crs(src.crs.to_dict()['init'])
    show(src,ax=ax)
gdf.boundary.plot(ax=ax, edgecolor='red')
Out[ ]:
<Axes: >
No description has been provided for this image
In [ ]:
Poly_gdf_filtred.to_file('Constructions.json')

Attention U-Net¶

To improve segmentation performance, Khened et al. and Roth et al. relied on additional object localization prior models to separate the localization and subsequent segmentation steps. This can be achieved by integrating attention gates on top of the U-Net architecture, without training additional models. As a result, attention gates incorporated into U-Net can improve the model’s sensitivity and accuracy to foreground pixels without requiring significant computational overhead. Attention gates can progressively suppress feature responses in irrelevant background regions.

image.png

Attention gates are implemented before the concatenation operation to merge only relevant activations. Gradients originating from background regions are weighted down during the backward pass. This allows model parameters in earlier layers to be updated based on the spatial regions relevant to a given task.

To further improve the attention mechanism, Oktay et al. proposed a grid-based attention mechanism. When implementing grid-based gating, the gating signal is not a single global vector for all pixels in the image, but a grid signal conditioned on the spatial information of the image. The gating signal for each skip connection aggregates image features from multiple image scales. When using grid-based gating, this allows the attention coefficients to be more specific to local regions by increasing the grid resolution of the query signal. This achieves better performance compared to gating based on a global feature vector.

Attention Module:¶

“Need to pay attention” by Jetley et al. introduced the end-to-end trainable attention module. Attention gates are commonly used in natural image analysis and natural language processing.

image.png

Attention is used to perform class-specific clustering, which results in more accurate and robust image classification performance. These attention maps can zoom in on relevant regions, thus demonstrating superior generalization across multiple benchmark datasets.

The way soft attention works is by using an image region by iterative region proposal and cropping. But this is generally not differentiable and relies on reinforcement learning (a sampling-based technique called REINFORCE) for parameter updates that result in optimization of these more difficult models. In contrast, soft attention is probabilistic and uses standard backpropagation without the need for Monte Carlo sampling. The soft attention method by Seo et al. demonstrates improvements by implementing non-uniform and non-rigid attention maps that are better suited to natural object shapes seen in real images.

Let's implement Attention U-Net:

In [ ]:
from keras.layers import LeakyReLU
from keras.layers import multiply
from keras import backend as K

def expend_as(tensor, rep):
		my_repeat = Lambda(lambda x, repnum: K.repeat_elements(x, repnum, axis=3), arguments={'repnum': rep})(tensor)
		return my_repeat


def UnetGatingSignal(input, is_batchnorm=False):
		shape = K.int_shape(input)
		x = Conv2D(shape[3] * 2, (1, 1), strides=(1, 1), padding="same")(input)
		if is_batchnorm:
			x = BatchNormalization()(x)
		x = Activation('relu')(x)
		return x

def AttnGatingBlock(x, g, inter_shape):
		shape_x = K.int_shape(x)  # 32
		shape_g = K.int_shape(g)  # 16

		theta_x = Conv2D(inter_shape, (2, 2), strides=(2, 2), padding='same')(x)  # 16
		shape_theta_x = K.int_shape(theta_x)

		phi_g = Conv2D(inter_shape, (1, 1), padding='same')(g)
		upsample_g = Conv2DTranspose(inter_shape, (3, 3),strides=(shape_theta_x[1] // shape_g[1], shape_theta_x[2] // shape_g[2]),padding='same')(phi_g)  # 16

		concat_xg = add([upsample_g, theta_x])
		act_xg = Activation('relu')(concat_xg)
		psi = Conv2D(1, (1, 1), padding='same')(act_xg)
		sigmoid_xg = Activation('sigmoid')(psi)
		shape_sigmoid = K.int_shape(sigmoid_xg)
		upsample_psi = UpSampling2D(size=(shape_x[1] // shape_sigmoid[1], shape_x[2] // shape_sigmoid[2]))(sigmoid_xg)  # 32

		upsample_psi = expend_as(upsample_psi, shape_x[3])

		y = multiply([upsample_psi, x])

		result = Conv2D(shape_x[3], (1, 1), padding='same')(y)
		result_bn = BatchNormalization()(result)
		return result_bn
In [ ]:
inputs = Input(shape=x_train.shape[1:])
conv = Conv2D(32, (3, 3), kernel_initializer='he_uniform', padding='same')(inputs)
conv = LeakyReLU(alpha=0.3)(conv)

conv1 = Conv2D(32, (3, 3), kernel_initializer='he_uniform', padding='same')(conv)
conv1 = BatchNormalization()(conv1)
conv1 = Activation('relu')(conv1)
conv1 = Conv2D(32, (3, 3), kernel_initializer='he_uniform', padding='same')(conv1)
conv1 = BatchNormalization()(conv1)
conv1 = Activation('relu')(conv1)
pool1 = MaxPooling2D(pool_size=(2, 2))(conv1)

conv2 = Conv2D(64, (3, 3), kernel_initializer='he_uniform', padding='same')(pool1)
conv2 = BatchNormalization()(conv2)
conv2 = Activation('relu')(conv2)
conv2 = Conv2D(64, (3, 3), kernel_initializer='he_uniform', padding='same')(conv2)
conv2 = BatchNormalization()(conv2)
conv2 = Activation('relu')(conv2)
pool2 = MaxPooling2D(pool_size=(2, 2))(conv2)

conv3 = Conv2D(128, (3, 3), kernel_initializer='he_uniform', padding='same')(pool2)
conv3 = BatchNormalization()(conv3)
conv3 = Activation('relu')(conv3)
conv3 = Conv2D(128, (3, 3), kernel_initializer='he_uniform', padding='same')(conv3)
conv3 = BatchNormalization()(conv3)
conv3 = Activation('relu')(conv3)
pool3 = MaxPooling2D(pool_size=(2, 2))(conv3)

conv4 = Conv2D(256, (3, 3), kernel_initializer='he_uniform', padding='same')(pool3)
conv4 = BatchNormalization()(conv4)
conv4 = Activation('relu')(conv4)
conv4 = Conv2D(256, (3, 3), kernel_initializer='he_uniform', padding='same')(conv4)
conv4 = BatchNormalization()(conv4)
conv4 = Activation('relu')(conv4)
pool4 = MaxPooling2D(pool_size=(2, 2))(conv4)

conv5 = Conv2D(512, (3, 3), kernel_initializer='he_uniform', padding='same')(pool4)
conv5 = BatchNormalization()(conv5)
conv5 = Activation('relu')(conv5)
conv5 = Conv2D(512, (3, 3), kernel_initializer='he_uniform', padding='same')(conv5)
conv5 = BatchNormalization()(conv5)
conv5 = Activation('relu')(conv5)
pool5 = MaxPooling2D(pool_size=(2, 2))(conv5)

conv6 = Conv2D(1024, (3, 3), kernel_initializer='he_uniform', padding='same')(pool5)
conv6 = BatchNormalization()(conv6)
conv6 = Activation('relu')(conv6)
conv6 = Conv2D(512, (3, 3), kernel_initializer='he_uniform', padding='same')(conv6)
conv6 = BatchNormalization()(conv6)
conv6 = Activation('relu')(conv6)


gating1 = UnetGatingSignal(conv6, is_batchnorm=True)
attn_1 = AttnGatingBlock(conv5, gating1, 512)
up1 = concatenate([Conv2DTranspose(512, (3, 3), strides=(2, 2), padding='same',activation="relu")(conv6), attn_1], axis=3)
conv7 = Conv2D(512, (3, 3), kernel_initializer='he_uniform', padding='same')(up1)
conv7 = BatchNormalization()(conv7)
conv7 = Activation('relu')(conv7)
conv7 = Conv2D(256, (3, 3), kernel_initializer='he_uniform', padding='same')(conv7)
conv7 = BatchNormalization()(conv7)
conv7 = Activation('relu')(conv7)

gating2 = UnetGatingSignal(conv7, is_batchnorm=True)
attn_2 = AttnGatingBlock(conv4, gating2, 256)
up2 = concatenate([Conv2DTranspose(256, (3, 3), strides=(2, 2), padding='same',activation="relu")(up1), attn_2], axis=3)
conv8 = Conv2D(256, (3, 3), kernel_initializer='he_uniform', padding='same')(up2)
conv8 = BatchNormalization()(conv8)
conv8 = Activation('relu')(conv8)
conv8 = Conv2D(128, (3, 3), kernel_initializer='he_uniform', padding='same')(conv8)
conv8 = BatchNormalization()(conv8)
conv8 = Activation('relu')(conv8)


gating3 = UnetGatingSignal(conv8, is_batchnorm=True)
attn_3 = AttnGatingBlock(conv3, gating3, 128)
up3 = concatenate([Conv2DTranspose(128, (3, 3), strides=(2, 2), padding='same',activation="relu")(up2), attn_3], axis=3)
conv9 = Conv2D(128, (3, 3), kernel_initializer='he_uniform', padding='same')(up3)
conv9 = BatchNormalization()(conv9)
conv9 = Activation('relu')(conv9)
conv9 = Conv2D(64, (3, 3), kernel_initializer='he_uniform', padding='same')(conv9)
conv9 = BatchNormalization()(conv9)
conv9 = Activation('relu')(conv9)


gating4 = UnetGatingSignal(conv9, is_batchnorm=True)
attn_4 = AttnGatingBlock(conv2, gating4, 64)
up4 = concatenate([Conv2DTranspose(64, (3, 3), strides=(2, 2), padding='same',activation="relu")(up3), attn_4], axis=3)
conv10 = Conv2D(64, (3, 3), kernel_initializer='he_uniform', padding='same')(up4)
conv10 = BatchNormalization()(conv10)
conv10 = Activation('relu')(conv10)
conv10 = Conv2D(32, (3, 3), kernel_initializer='he_uniform', padding='same')(conv10)
conv10 = BatchNormalization()(conv10)
conv10 = Activation('relu')(conv10)

gating5 = UnetGatingSignal(conv10, is_batchnorm=True)
attn_5 = AttnGatingBlock(conv1, gating5, 32)
up5 = concatenate([Conv2DTranspose(32, (3, 3), strides=(2, 2), padding='same',activation="relu")(up4), attn_5], axis=3)
conv11 = Conv2D(32, (3, 3), kernel_initializer='he_uniform', padding='same')(up5)
conv11 = BatchNormalization()(conv11)
conv11 = Activation('relu')(conv11)
conv11 = Conv2D(32, (3, 3), kernel_initializer='he_uniform', padding='same')(conv11)
conv11 = BatchNormalization()(conv11)
conv11 = Activation('relu')(conv11)

conv12 = Conv2D(1, (1, 1), activation='sigmoid')(conv11)

model = Model(inputs=inputs, outputs=conv12)
model.compile(optimizer=Adam(learning_rate = 1e-5), loss = Dice, metrics=['accuracy'])
model.summary()
Model: "model_1"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_2 (InputLayer)           [(None, 256, 256, 3  0           []                               
                                )]                                                                
                                                                                                  
 conv2d_68 (Conv2D)             (None, 256, 256, 32  896         ['input_2[0][0]']                
                                )                                                                 
                                                                                                  
 leaky_re_lu (LeakyReLU)        (None, 256, 256, 32  0           ['conv2d_68[0][0]']              
                                )                                                                 
                                                                                                  
 conv2d_69 (Conv2D)             (None, 256, 256, 32  9248        ['leaky_re_lu[0][0]']            
                                )                                                                 
                                                                                                  
 batch_normalization_61 (BatchN  (None, 256, 256, 32  128        ['conv2d_69[0][0]']              
 ormalization)                  )                                                                 
                                                                                                  
 activation_51 (Activation)     (None, 256, 256, 32  0           ['batch_normalization_61[0][0]'] 
                                )                                                                 
                                                                                                  
 conv2d_70 (Conv2D)             (None, 256, 256, 32  9248        ['activation_51[0][0]']          
                                )                                                                 
                                                                                                  
 batch_normalization_62 (BatchN  (None, 256, 256, 32  128        ['conv2d_70[0][0]']              
 ormalization)                  )                                                                 
                                                                                                  
 activation_52 (Activation)     (None, 256, 256, 32  0           ['batch_normalization_62[0][0]'] 
                                )                                                                 
                                                                                                  
 max_pooling2d_1 (MaxPooling2D)  (None, 128, 128, 32  0          ['activation_52[0][0]']          
                                )                                                                 
                                                                                                  
 conv2d_71 (Conv2D)             (None, 128, 128, 64  18496       ['max_pooling2d_1[0][0]']        
                                )                                                                 
                                                                                                  
 batch_normalization_63 (BatchN  (None, 128, 128, 64  256        ['conv2d_71[0][0]']              
 ormalization)                  )                                                                 
                                                                                                  
 activation_53 (Activation)     (None, 128, 128, 64  0           ['batch_normalization_63[0][0]'] 
                                )                                                                 
                                                                                                  
 conv2d_72 (Conv2D)             (None, 128, 128, 64  36928       ['activation_53[0][0]']          
                                )                                                                 
                                                                                                  
 batch_normalization_64 (BatchN  (None, 128, 128, 64  256        ['conv2d_72[0][0]']              
 ormalization)                  )                                                                 
                                                                                                  
 activation_54 (Activation)     (None, 128, 128, 64  0           ['batch_normalization_64[0][0]'] 
                                )                                                                 
                                                                                                  
 max_pooling2d_2 (MaxPooling2D)  (None, 64, 64, 64)  0           ['activation_54[0][0]']          
                                                                                                  
 conv2d_73 (Conv2D)             (None, 64, 64, 128)  73856       ['max_pooling2d_2[0][0]']        
                                                                                                  
 batch_normalization_65 (BatchN  (None, 64, 64, 128)  512        ['conv2d_73[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_55 (Activation)     (None, 64, 64, 128)  0           ['batch_normalization_65[0][0]'] 
                                                                                                  
 conv2d_74 (Conv2D)             (None, 64, 64, 128)  147584      ['activation_55[0][0]']          
                                                                                                  
 batch_normalization_66 (BatchN  (None, 64, 64, 128)  512        ['conv2d_74[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_56 (Activation)     (None, 64, 64, 128)  0           ['batch_normalization_66[0][0]'] 
                                                                                                  
 max_pooling2d_3 (MaxPooling2D)  (None, 32, 32, 128)  0          ['activation_56[0][0]']          
                                                                                                  
 conv2d_75 (Conv2D)             (None, 32, 32, 256)  295168      ['max_pooling2d_3[0][0]']        
                                                                                                  
 batch_normalization_67 (BatchN  (None, 32, 32, 256)  1024       ['conv2d_75[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_57 (Activation)     (None, 32, 32, 256)  0           ['batch_normalization_67[0][0]'] 
                                                                                                  
 conv2d_76 (Conv2D)             (None, 32, 32, 256)  590080      ['activation_57[0][0]']          
                                                                                                  
 batch_normalization_68 (BatchN  (None, 32, 32, 256)  1024       ['conv2d_76[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_58 (Activation)     (None, 32, 32, 256)  0           ['batch_normalization_68[0][0]'] 
                                                                                                  
 max_pooling2d_4 (MaxPooling2D)  (None, 16, 16, 256)  0          ['activation_58[0][0]']          
                                                                                                  
 conv2d_77 (Conv2D)             (None, 16, 16, 512)  1180160     ['max_pooling2d_4[0][0]']        
                                                                                                  
 batch_normalization_69 (BatchN  (None, 16, 16, 512)  2048       ['conv2d_77[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_59 (Activation)     (None, 16, 16, 512)  0           ['batch_normalization_69[0][0]'] 
                                                                                                  
 conv2d_78 (Conv2D)             (None, 16, 16, 512)  2359808     ['activation_59[0][0]']          
                                                                                                  
 batch_normalization_70 (BatchN  (None, 16, 16, 512)  2048       ['conv2d_78[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_60 (Activation)     (None, 16, 16, 512)  0           ['batch_normalization_70[0][0]'] 
                                                                                                  
 max_pooling2d_5 (MaxPooling2D)  (None, 8, 8, 512)   0           ['activation_60[0][0]']          
                                                                                                  
 conv2d_79 (Conv2D)             (None, 8, 8, 1024)   4719616     ['max_pooling2d_5[0][0]']        
                                                                                                  
 batch_normalization_71 (BatchN  (None, 8, 8, 1024)  4096        ['conv2d_79[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_61 (Activation)     (None, 8, 8, 1024)   0           ['batch_normalization_71[0][0]'] 
                                                                                                  
 conv2d_80 (Conv2D)             (None, 8, 8, 512)    4719104     ['activation_61[0][0]']          
                                                                                                  
 batch_normalization_72 (BatchN  (None, 8, 8, 512)   2048        ['conv2d_80[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_62 (Activation)     (None, 8, 8, 512)    0           ['batch_normalization_72[0][0]'] 
                                                                                                  
 conv2d_81 (Conv2D)             (None, 8, 8, 1024)   525312      ['activation_62[0][0]']          
                                                                                                  
 batch_normalization_73 (BatchN  (None, 8, 8, 1024)  4096        ['conv2d_81[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_63 (Activation)     (None, 8, 8, 1024)   0           ['batch_normalization_73[0][0]'] 
                                                                                                  
 conv2d_83 (Conv2D)             (None, 8, 8, 512)    524800      ['activation_63[0][0]']          
                                                                                                  
 conv2d_transpose_5 (Conv2DTran  (None, 8, 8, 512)   2359808     ['conv2d_83[0][0]']              
 spose)                                                                                           
                                                                                                  
 conv2d_82 (Conv2D)             (None, 8, 8, 512)    1049088     ['activation_60[0][0]']          
                                                                                                  
 add_16 (Add)                   (None, 8, 8, 512)    0           ['conv2d_transpose_5[0][0]',     
                                                                  'conv2d_82[0][0]']              
                                                                                                  
 activation_64 (Activation)     (None, 8, 8, 512)    0           ['add_16[0][0]']                 
                                                                                                  
 conv2d_84 (Conv2D)             (None, 8, 8, 1)      513         ['activation_64[0][0]']          
                                                                                                  
 activation_65 (Activation)     (None, 8, 8, 1)      0           ['conv2d_84[0][0]']              
                                                                                                  
 up_sampling2d (UpSampling2D)   (None, 16, 16, 1)    0           ['activation_65[0][0]']          
                                                                                                  
 lambda (Lambda)                (None, 16, 16, 512)  0           ['up_sampling2d[0][0]']          
                                                                                                  
 multiply (Multiply)            (None, 16, 16, 512)  0           ['lambda[0][0]',                 
                                                                  'activation_60[0][0]']          
                                                                                                  
 conv2d_85 (Conv2D)             (None, 16, 16, 512)  262656      ['multiply[0][0]']               
                                                                                                  
 conv2d_transpose_6 (Conv2DTran  (None, 16, 16, 512)  2359808    ['activation_62[0][0]']          
 spose)                                                                                           
                                                                                                  
 batch_normalization_74 (BatchN  (None, 16, 16, 512)  2048       ['conv2d_85[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 concatenate_5 (Concatenate)    (None, 16, 16, 1024  0           ['conv2d_transpose_6[0][0]',     
                                )                                 'batch_normalization_74[0][0]'] 
                                                                                                  
 conv2d_86 (Conv2D)             (None, 16, 16, 512)  4719104     ['concatenate_5[0][0]']          
                                                                                                  
/usr/local/lib/python3.10/dist-packages/keras/optimizers/legacy/adam.py:117: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super().__init__(name, **kwargs)
 batch_normalization_75 (BatchN  (None, 16, 16, 512)  2048       ['conv2d_86[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_66 (Activation)     (None, 16, 16, 512)  0           ['batch_normalization_75[0][0]'] 
                                                                                                  
 conv2d_87 (Conv2D)             (None, 16, 16, 256)  1179904     ['activation_66[0][0]']          
                                                                                                  
 batch_normalization_76 (BatchN  (None, 16, 16, 256)  1024       ['conv2d_87[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_67 (Activation)     (None, 16, 16, 256)  0           ['batch_normalization_76[0][0]'] 
                                                                                                  
 conv2d_88 (Conv2D)             (None, 16, 16, 512)  131584      ['activation_67[0][0]']          
                                                                                                  
 batch_normalization_77 (BatchN  (None, 16, 16, 512)  2048       ['conv2d_88[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_68 (Activation)     (None, 16, 16, 512)  0           ['batch_normalization_77[0][0]'] 
                                                                                                  
 conv2d_90 (Conv2D)             (None, 16, 16, 256)  131328      ['activation_68[0][0]']          
                                                                                                  
 conv2d_transpose_7 (Conv2DTran  (None, 16, 16, 256)  590080     ['conv2d_90[0][0]']              
 spose)                                                                                           
                                                                                                  
 conv2d_89 (Conv2D)             (None, 16, 16, 256)  262400      ['activation_58[0][0]']          
                                                                                                  
 add_17 (Add)                   (None, 16, 16, 256)  0           ['conv2d_transpose_7[0][0]',     
                                                                  'conv2d_89[0][0]']              
                                                                                                  
 activation_69 (Activation)     (None, 16, 16, 256)  0           ['add_17[0][0]']                 
                                                                                                  
 conv2d_91 (Conv2D)             (None, 16, 16, 1)    257         ['activation_69[0][0]']          
                                                                                                  
 activation_70 (Activation)     (None, 16, 16, 1)    0           ['conv2d_91[0][0]']              
                                                                                                  
 up_sampling2d_1 (UpSampling2D)  (None, 32, 32, 1)   0           ['activation_70[0][0]']          
                                                                                                  
 lambda_1 (Lambda)              (None, 32, 32, 256)  0           ['up_sampling2d_1[0][0]']        
                                                                                                  
 multiply_1 (Multiply)          (None, 32, 32, 256)  0           ['lambda_1[0][0]',               
                                                                  'activation_58[0][0]']          
                                                                                                  
 conv2d_92 (Conv2D)             (None, 32, 32, 256)  65792       ['multiply_1[0][0]']             
                                                                                                  
 conv2d_transpose_8 (Conv2DTran  (None, 32, 32, 256)  2359552    ['concatenate_5[0][0]']          
 spose)                                                                                           
                                                                                                  
 batch_normalization_78 (BatchN  (None, 32, 32, 256)  1024       ['conv2d_92[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 concatenate_6 (Concatenate)    (None, 32, 32, 512)  0           ['conv2d_transpose_8[0][0]',     
                                                                  'batch_normalization_78[0][0]'] 
                                                                                                  
 conv2d_93 (Conv2D)             (None, 32, 32, 256)  1179904     ['concatenate_6[0][0]']          
                                                                                                  
 batch_normalization_79 (BatchN  (None, 32, 32, 256)  1024       ['conv2d_93[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_71 (Activation)     (None, 32, 32, 256)  0           ['batch_normalization_79[0][0]'] 
                                                                                                  
 conv2d_94 (Conv2D)             (None, 32, 32, 128)  295040      ['activation_71[0][0]']          
                                                                                                  
 batch_normalization_80 (BatchN  (None, 32, 32, 128)  512        ['conv2d_94[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_72 (Activation)     (None, 32, 32, 128)  0           ['batch_normalization_80[0][0]'] 
                                                                                                  
 conv2d_95 (Conv2D)             (None, 32, 32, 256)  33024       ['activation_72[0][0]']          
                                                                                                  
 batch_normalization_81 (BatchN  (None, 32, 32, 256)  1024       ['conv2d_95[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 activation_73 (Activation)     (None, 32, 32, 256)  0           ['batch_normalization_81[0][0]'] 
                                                                                                  
 conv2d_97 (Conv2D)             (None, 32, 32, 128)  32896       ['activation_73[0][0]']          
                                                                                                  
 conv2d_transpose_9 (Conv2DTran  (None, 32, 32, 128)  147584     ['conv2d_97[0][0]']              
 spose)                                                                                           
                                                                                                  
 conv2d_96 (Conv2D)             (None, 32, 32, 128)  65664       ['activation_56[0][0]']          
                                                                                                  
 add_18 (Add)                   (None, 32, 32, 128)  0           ['conv2d_transpose_9[0][0]',     
                                                                  'conv2d_96[0][0]']              
                                                                                                  
 activation_74 (Activation)     (None, 32, 32, 128)  0           ['add_18[0][0]']                 
                                                                                                  
 conv2d_98 (Conv2D)             (None, 32, 32, 1)    129         ['activation_74[0][0]']          
                                                                                                  
 activation_75 (Activation)     (None, 32, 32, 1)    0           ['conv2d_98[0][0]']              
                                                                                                  
 up_sampling2d_2 (UpSampling2D)  (None, 64, 64, 1)   0           ['activation_75[0][0]']          
                                                                                                  
 lambda_2 (Lambda)              (None, 64, 64, 128)  0           ['up_sampling2d_2[0][0]']        
                                                                                                  
 multiply_2 (Multiply)          (None, 64, 64, 128)  0           ['lambda_2[0][0]',               
                                                                  'activation_56[0][0]']          
                                                                                                  
 conv2d_99 (Conv2D)             (None, 64, 64, 128)  16512       ['multiply_2[0][0]']             
                                                                                                  
 conv2d_transpose_10 (Conv2DTra  (None, 64, 64, 128)  589952     ['concatenate_6[0][0]']          
 nspose)                                                                                          
                                                                                                  
 batch_normalization_82 (BatchN  (None, 64, 64, 128)  512        ['conv2d_99[0][0]']              
 ormalization)                                                                                    
                                                                                                  
 concatenate_7 (Concatenate)    (None, 64, 64, 256)  0           ['conv2d_transpose_10[0][0]',    
                                                                  'batch_normalization_82[0][0]'] 
                                                                                                  
 conv2d_100 (Conv2D)            (None, 64, 64, 128)  295040      ['concatenate_7[0][0]']          
                                                                                                  
 batch_normalization_83 (BatchN  (None, 64, 64, 128)  512        ['conv2d_100[0][0]']             
 ormalization)                                                                                    
                                                                                                  
 activation_76 (Activation)     (None, 64, 64, 128)  0           ['batch_normalization_83[0][0]'] 
                                                                                                  
 conv2d_101 (Conv2D)            (None, 64, 64, 64)   73792       ['activation_76[0][0]']          
                                                                                                  
 batch_normalization_84 (BatchN  (None, 64, 64, 64)  256         ['conv2d_101[0][0]']             
 ormalization)                                                                                    
                                                                                                  
 activation_77 (Activation)     (None, 64, 64, 64)   0           ['batch_normalization_84[0][0]'] 
                                                                                                  
 conv2d_102 (Conv2D)            (None, 64, 64, 128)  8320        ['activation_77[0][0]']          
                                                                                                  
 batch_normalization_85 (BatchN  (None, 64, 64, 128)  512        ['conv2d_102[0][0]']             
 ormalization)                                                                                    
                                                                                                  
 activation_78 (Activation)     (None, 64, 64, 128)  0           ['batch_normalization_85[0][0]'] 
                                                                                                  
 conv2d_104 (Conv2D)            (None, 64, 64, 64)   8256        ['activation_78[0][0]']          
                                                                                                  
 conv2d_transpose_11 (Conv2DTra  (None, 64, 64, 64)  36928       ['conv2d_104[0][0]']             
 nspose)                                                                                          
                                                                                                  
 conv2d_103 (Conv2D)            (None, 64, 64, 64)   16448       ['activation_54[0][0]']          
                                                                                                  
 add_19 (Add)                   (None, 64, 64, 64)   0           ['conv2d_transpose_11[0][0]',    
                                                                  'conv2d_103[0][0]']             
                                                                                                  
 activation_79 (Activation)     (None, 64, 64, 64)   0           ['add_19[0][0]']                 
                                                                                                  
 conv2d_105 (Conv2D)            (None, 64, 64, 1)    65          ['activation_79[0][0]']          
                                                                                                  
 activation_80 (Activation)     (None, 64, 64, 1)    0           ['conv2d_105[0][0]']             
                                                                                                  
 up_sampling2d_3 (UpSampling2D)  (None, 128, 128, 1)  0          ['activation_80[0][0]']          
                                                                                                  
 lambda_3 (Lambda)              (None, 128, 128, 64  0           ['up_sampling2d_3[0][0]']        
                                )                                                                 
                                                                                                  
 multiply_3 (Multiply)          (None, 128, 128, 64  0           ['lambda_3[0][0]',               
                                )                                 'activation_54[0][0]']          
                                                                                                  
 conv2d_106 (Conv2D)            (None, 128, 128, 64  4160        ['multiply_3[0][0]']             
                                )                                                                 
                                                                                                  
 conv2d_transpose_12 (Conv2DTra  (None, 128, 128, 64  147520     ['concatenate_7[0][0]']          
 nspose)                        )                                                                 
                                                                                                  
 batch_normalization_86 (BatchN  (None, 128, 128, 64  256        ['conv2d_106[0][0]']             
 ormalization)                  )                                                                 
                                                                                                  
 concatenate_8 (Concatenate)    (None, 128, 128, 12  0           ['conv2d_transpose_12[0][0]',    
                                8)                                'batch_normalization_86[0][0]'] 
                                                                                                  
 conv2d_107 (Conv2D)            (None, 128, 128, 64  73792       ['concatenate_8[0][0]']          
                                )                                                                 
                                                                                                  
 batch_normalization_87 (BatchN  (None, 128, 128, 64  256        ['conv2d_107[0][0]']             
 ormalization)                  )                                                                 
                                                                                                  
 activation_81 (Activation)     (None, 128, 128, 64  0           ['batch_normalization_87[0][0]'] 
                                )                                                                 
                                                                                                  
 conv2d_108 (Conv2D)            (None, 128, 128, 32  18464       ['activation_81[0][0]']          
                                )                                                                 
                                                                                                  
 batch_normalization_88 (BatchN  (None, 128, 128, 32  128        ['conv2d_108[0][0]']             
 ormalization)                  )                                                                 
                                                                                                  
 activation_82 (Activation)     (None, 128, 128, 32  0           ['batch_normalization_88[0][0]'] 
                                )                                                                 
                                                                                                  
 conv2d_109 (Conv2D)            (None, 128, 128, 64  2112        ['activation_82[0][0]']          
                                )                                                                 
                                                                                                  
 batch_normalization_89 (BatchN  (None, 128, 128, 64  256        ['conv2d_109[0][0]']             
 ormalization)                  )                                                                 
                                                                                                  
 activation_83 (Activation)     (None, 128, 128, 64  0           ['batch_normalization_89[0][0]'] 
                                )                                                                 
                                                                                                  
 conv2d_111 (Conv2D)            (None, 128, 128, 32  2080        ['activation_83[0][0]']          
                                )                                                                 
                                                                                                  
 conv2d_transpose_13 (Conv2DTra  (None, 128, 128, 32  9248       ['conv2d_111[0][0]']             
 nspose)                        )                                                                 
                                                                                                  
 conv2d_110 (Conv2D)            (None, 128, 128, 32  4128        ['activation_52[0][0]']          
                                )                                                                 
                                                                                                  
 add_20 (Add)                   (None, 128, 128, 32  0           ['conv2d_transpose_13[0][0]',    
                                )                                 'conv2d_110[0][0]']             
                                                                                                  
 activation_84 (Activation)     (None, 128, 128, 32  0           ['add_20[0][0]']                 
                                )                                                                 
                                                                                                  
 conv2d_112 (Conv2D)            (None, 128, 128, 1)  33          ['activation_84[0][0]']          
                                                                                                  
 activation_85 (Activation)     (None, 128, 128, 1)  0           ['conv2d_112[0][0]']             
                                                                                                  
 up_sampling2d_4 (UpSampling2D)  (None, 256, 256, 1)  0          ['activation_85[0][0]']          
                                                                                                  
 lambda_4 (Lambda)              (None, 256, 256, 32  0           ['up_sampling2d_4[0][0]']        
                                )                                                                 
                                                                                                  
 multiply_4 (Multiply)          (None, 256, 256, 32  0           ['lambda_4[0][0]',               
                                )                                 'activation_52[0][0]']          
                                                                                                  
 conv2d_113 (Conv2D)            (None, 256, 256, 32  1056        ['multiply_4[0][0]']             
                                )                                                                 
                                                                                                  
 conv2d_transpose_14 (Conv2DTra  (None, 256, 256, 32  36896      ['concatenate_8[0][0]']          
 nspose)                        )                                                                 
                                                                                                  
 batch_normalization_90 (BatchN  (None, 256, 256, 32  128        ['conv2d_113[0][0]']             
 ormalization)                  )                                                                 
                                                                                                  
 concatenate_9 (Concatenate)    (None, 256, 256, 64  0           ['conv2d_transpose_14[0][0]',    
                                )                                 'batch_normalization_90[0][0]'] 
                                                                                                  
 conv2d_114 (Conv2D)            (None, 256, 256, 32  18464       ['concatenate_9[0][0]']          
                                )                                                                 
                                                                                                  
 batch_normalization_91 (BatchN  (None, 256, 256, 32  128        ['conv2d_114[0][0]']             
 ormalization)                  )                                                                 
                                                                                                  
 activation_86 (Activation)     (None, 256, 256, 32  0           ['batch_normalization_91[0][0]'] 
                                )                                                                 
                                                                                                  
 conv2d_115 (Conv2D)            (None, 256, 256, 32  9248        ['activation_86[0][0]']          
                                )                                                                 
                                                                                                  
 batch_normalization_92 (BatchN  (None, 256, 256, 32  128        ['conv2d_115[0][0]']             
 ormalization)                  )                                                                 
                                                                                                  
 activation_87 (Activation)     (None, 256, 256, 32  0           ['batch_normalization_92[0][0]'] 
                                )                                                                 
                                                                                                  
 conv2d_116 (Conv2D)            (None, 256, 256, 1)  33          ['activation_87[0][0]']          
                                                                                                  
==================================================================================================
Total params: 33,840,966
Trainable params: 33,824,966
Non-trainable params: 16,000
__________________________________________________________________________________________________
In [ ]:
history = model.fit_generator(train_generator,steps_per_epoch=steps_per_epoch, validation_steps=validation_steps,
                              epochs=300, validation_data=(x_test,y_test))
<ipython-input-97-3caab27a3ebc>:1: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators.
  history = model.fit_generator(train_generator,steps_per_epoch=steps_per_epoch, validation_steps=validation_steps,
Epoch 1/300
104/104 [==============================] - 31s 206ms/step - loss: 0.6939 - jaccard_coef: 0.3060 - accuracy: 0.5419 - val_loss: 0.7874 - val_jaccard_coef: 0.2126 - val_accuracy: 0.2997
Epoch 2/300
104/104 [==============================] - 18s 169ms/step - loss: 0.6275 - jaccard_coef: 0.3728 - accuracy: 0.7115 - val_loss: 0.7468 - val_jaccard_coef: 0.2532 - val_accuracy: 0.7158
Epoch 3/300
104/104 [==============================] - 17s 167ms/step - loss: 0.5639 - jaccard_coef: 0.4360 - accuracy: 0.7975 - val_loss: 0.6669 - val_jaccard_coef: 0.3331 - val_accuracy: 0.8746
Epoch 4/300
104/104 [==============================] - 17s 166ms/step - loss: 0.5235 - jaccard_coef: 0.4769 - accuracy: 0.8472 - val_loss: 0.5751 - val_jaccard_coef: 0.4249 - val_accuracy: 0.9055
Epoch 5/300
104/104 [==============================] - 17s 167ms/step - loss: 0.4873 - jaccard_coef: 0.5128 - accuracy: 0.8716 - val_loss: 0.5262 - val_jaccard_coef: 0.4738 - val_accuracy: 0.8951
Epoch 6/300
104/104 [==============================] - 17s 164ms/step - loss: 0.4710 - jaccard_coef: 0.5289 - accuracy: 0.8775 - val_loss: 0.5105 - val_jaccard_coef: 0.4895 - val_accuracy: 0.8754
Epoch 7/300
104/104 [==============================] - 17s 166ms/step - loss: 0.4424 - jaccard_coef: 0.5577 - accuracy: 0.8868 - val_loss: 0.5019 - val_jaccard_coef: 0.4981 - val_accuracy: 0.8728
Epoch 8/300
104/104 [==============================] - 17s 164ms/step - loss: 0.4343 - jaccard_coef: 0.5656 - accuracy: 0.8895 - val_loss: 0.4918 - val_jaccard_coef: 0.5082 - val_accuracy: 0.8762
Epoch 9/300
104/104 [==============================] - 17s 166ms/step - loss: 0.4314 - jaccard_coef: 0.5686 - accuracy: 0.8950 - val_loss: 0.4795 - val_jaccard_coef: 0.5205 - val_accuracy: 0.8862
Epoch 10/300
104/104 [==============================] - 17s 166ms/step - loss: 0.4157 - jaccard_coef: 0.5844 - accuracy: 0.8973 - val_loss: 0.4528 - val_jaccard_coef: 0.5472 - val_accuracy: 0.9091
Epoch 11/300
104/104 [==============================] - 17s 166ms/step - loss: 0.4078 - jaccard_coef: 0.5925 - accuracy: 0.8999 - val_loss: 0.4495 - val_jaccard_coef: 0.5505 - val_accuracy: 0.9116
Epoch 12/300
104/104 [==============================] - 17s 167ms/step - loss: 0.4094 - jaccard_coef: 0.5907 - accuracy: 0.9001 - val_loss: 0.4427 - val_jaccard_coef: 0.5573 - val_accuracy: 0.9098
Epoch 13/300
104/104 [==============================] - 17s 165ms/step - loss: 0.3996 - jaccard_coef: 0.6002 - accuracy: 0.9028 - val_loss: 0.4411 - val_jaccard_coef: 0.5589 - val_accuracy: 0.9108
Epoch 14/300
104/104 [==============================] - 17s 164ms/step - loss: 0.3930 - jaccard_coef: 0.6071 - accuracy: 0.9063 - val_loss: 0.4496 - val_jaccard_coef: 0.5504 - val_accuracy: 0.9008
Epoch 15/300
104/104 [==============================] - 18s 170ms/step - loss: 0.3915 - jaccard_coef: 0.6080 - accuracy: 0.9084 - val_loss: 0.4339 - val_jaccard_coef: 0.5661 - val_accuracy: 0.9142
Epoch 16/300
104/104 [==============================] - 18s 171ms/step - loss: 0.3836 - jaccard_coef: 0.6163 - accuracy: 0.9065 - val_loss: 0.4235 - val_jaccard_coef: 0.5765 - val_accuracy: 0.9210
Epoch 17/300
104/104 [==============================] - 17s 168ms/step - loss: 0.3825 - jaccard_coef: 0.6175 - accuracy: 0.9096 - val_loss: 0.4125 - val_jaccard_coef: 0.5875 - val_accuracy: 0.9335
Epoch 18/300
104/104 [==============================] - 17s 166ms/step - loss: 0.3857 - jaccard_coef: 0.6143 - accuracy: 0.9101 - val_loss: 0.4164 - val_jaccard_coef: 0.5836 - val_accuracy: 0.9326
Epoch 19/300
104/104 [==============================] - 17s 165ms/step - loss: 0.3810 - jaccard_coef: 0.6190 - accuracy: 0.9122 - val_loss: 0.4070 - val_jaccard_coef: 0.5930 - val_accuracy: 0.9291
Epoch 20/300
104/104 [==============================] - 17s 166ms/step - loss: 0.3682 - jaccard_coef: 0.6321 - accuracy: 0.9124 - val_loss: 0.4092 - val_jaccard_coef: 0.5908 - val_accuracy: 0.9253
Epoch 21/300
104/104 [==============================] - 17s 165ms/step - loss: 0.3718 - jaccard_coef: 0.6285 - accuracy: 0.9136 - val_loss: 0.4049 - val_jaccard_coef: 0.5951 - val_accuracy: 0.9209
Epoch 22/300
104/104 [==============================] - 17s 165ms/step - loss: 0.3626 - jaccard_coef: 0.6375 - accuracy: 0.9194 - val_loss: 0.3963 - val_jaccard_coef: 0.6037 - val_accuracy: 0.9323
Epoch 23/300
104/104 [==============================] - 17s 166ms/step - loss: 0.3630 - jaccard_coef: 0.6371 - accuracy: 0.9181 - val_loss: 0.3936 - val_jaccard_coef: 0.6064 - val_accuracy: 0.9365
Epoch 24/300
104/104 [==============================] - 17s 164ms/step - loss: 0.3546 - jaccard_coef: 0.6453 - accuracy: 0.9173 - val_loss: 0.4020 - val_jaccard_coef: 0.5980 - val_accuracy: 0.9208
Epoch 25/300
104/104 [==============================] - 17s 161ms/step - loss: 0.3550 - jaccard_coef: 0.6447 - accuracy: 0.9188 - val_loss: 0.4194 - val_jaccard_coef: 0.5806 - val_accuracy: 0.9065
Epoch 26/300
104/104 [==============================] - 17s 164ms/step - loss: 0.3496 - jaccard_coef: 0.6504 - accuracy: 0.9202 - val_loss: 0.4088 - val_jaccard_coef: 0.5912 - val_accuracy: 0.9155
Epoch 27/300
104/104 [==============================] - 17s 160ms/step - loss: 0.3583 - jaccard_coef: 0.6417 - accuracy: 0.9187 - val_loss: 0.3838 - val_jaccard_coef: 0.6162 - val_accuracy: 0.9358
Epoch 28/300
104/104 [==============================] - 17s 166ms/step - loss: 0.3515 - jaccard_coef: 0.6486 - accuracy: 0.9213 - val_loss: 0.3767 - val_jaccard_coef: 0.6233 - val_accuracy: 0.9357
Epoch 29/300
104/104 [==============================] - 17s 165ms/step - loss: 0.3429 - jaccard_coef: 0.6571 - accuracy: 0.9181 - val_loss: 0.3810 - val_jaccard_coef: 0.6190 - val_accuracy: 0.9307
Epoch 30/300
104/104 [==============================] - 17s 164ms/step - loss: 0.3494 - jaccard_coef: 0.6505 - accuracy: 0.9215 - val_loss: 0.3748 - val_jaccard_coef: 0.6252 - val_accuracy: 0.9318
Epoch 31/300
104/104 [==============================] - 17s 160ms/step - loss: 0.3365 - jaccard_coef: 0.6638 - accuracy: 0.9232 - val_loss: 0.3805 - val_jaccard_coef: 0.6195 - val_accuracy: 0.9274
Epoch 32/300
104/104 [==============================] - 17s 162ms/step - loss: 0.3384 - jaccard_coef: 0.6612 - accuracy: 0.9251 - val_loss: 0.3670 - val_jaccard_coef: 0.6330 - val_accuracy: 0.9411
Epoch 33/300
104/104 [==============================] - 17s 163ms/step - loss: 0.3380 - jaccard_coef: 0.6622 - accuracy: 0.9236 - val_loss: 0.3922 - val_jaccard_coef: 0.6078 - val_accuracy: 0.9173
Epoch 34/300
104/104 [==============================] - 17s 161ms/step - loss: 0.3344 - jaccard_coef: 0.6657 - accuracy: 0.9220 - val_loss: 0.3630 - val_jaccard_coef: 0.6370 - val_accuracy: 0.9371
Epoch 35/300
104/104 [==============================] - 17s 165ms/step - loss: 0.3344 - jaccard_coef: 0.6652 - accuracy: 0.9262 - val_loss: 0.3765 - val_jaccard_coef: 0.6235 - val_accuracy: 0.9320
Epoch 36/300
104/104 [==============================] - 17s 164ms/step - loss: 0.3359 - jaccard_coef: 0.6641 - accuracy: 0.9228 - val_loss: 0.3673 - val_jaccard_coef: 0.6327 - val_accuracy: 0.9346
Epoch 37/300
104/104 [==============================] - 17s 160ms/step - loss: 0.3270 - jaccard_coef: 0.6732 - accuracy: 0.9240 - val_loss: 0.3641 - val_jaccard_coef: 0.6359 - val_accuracy: 0.9336
Epoch 38/300
104/104 [==============================] - 17s 162ms/step - loss: 0.3276 - jaccard_coef: 0.6722 - accuracy: 0.9237 - val_loss: 0.3759 - val_jaccard_coef: 0.6241 - val_accuracy: 0.9220
Epoch 39/300
104/104 [==============================] - 17s 160ms/step - loss: 0.3228 - jaccard_coef: 0.6774 - accuracy: 0.9292 - val_loss: 0.3578 - val_jaccard_coef: 0.6422 - val_accuracy: 0.9385
Epoch 40/300
104/104 [==============================] - 17s 162ms/step - loss: 0.3192 - jaccard_coef: 0.6808 - accuracy: 0.9255 - val_loss: 0.3626 - val_jaccard_coef: 0.6374 - val_accuracy: 0.9303
Epoch 41/300
104/104 [==============================] - 19s 179ms/step - loss: 0.3221 - jaccard_coef: 0.6781 - accuracy: 0.9240 - val_loss: 0.3512 - val_jaccard_coef: 0.6488 - val_accuracy: 0.9362
Epoch 42/300
104/104 [==============================] - 17s 163ms/step - loss: 0.3154 - jaccard_coef: 0.6845 - accuracy: 0.9297 - val_loss: 0.3516 - val_jaccard_coef: 0.6484 - val_accuracy: 0.9363
Epoch 43/300
104/104 [==============================] - 17s 161ms/step - loss: 0.3168 - jaccard_coef: 0.6834 - accuracy: 0.9278 - val_loss: 0.3502 - val_jaccard_coef: 0.6498 - val_accuracy: 0.9378
Epoch 44/300
104/104 [==============================] - 17s 161ms/step - loss: 0.3209 - jaccard_coef: 0.6791 - accuracy: 0.9295 - val_loss: 0.3633 - val_jaccard_coef: 0.6367 - val_accuracy: 0.9268
Epoch 45/300
104/104 [==============================] - 17s 163ms/step - loss: 0.3146 - jaccard_coef: 0.6855 - accuracy: 0.9292 - val_loss: 0.3426 - val_jaccard_coef: 0.6574 - val_accuracy: 0.9409
Epoch 46/300
104/104 [==============================] - 17s 165ms/step - loss: 0.2979 - jaccard_coef: 0.7022 - accuracy: 0.9307 - val_loss: 0.3467 - val_jaccard_coef: 0.6533 - val_accuracy: 0.9372
Epoch 47/300
104/104 [==============================] - 17s 162ms/step - loss: 0.3090 - jaccard_coef: 0.6912 - accuracy: 0.9275 - val_loss: 0.3743 - val_jaccard_coef: 0.6257 - val_accuracy: 0.9332
Epoch 48/300
104/104 [==============================] - 16s 158ms/step - loss: 0.3127 - jaccard_coef: 0.6873 - accuracy: 0.9306 - val_loss: 0.3393 - val_jaccard_coef: 0.6607 - val_accuracy: 0.9392
Epoch 49/300
104/104 [==============================] - 17s 162ms/step - loss: 0.3055 - jaccard_coef: 0.6947 - accuracy: 0.9288 - val_loss: 0.3370 - val_jaccard_coef: 0.6630 - val_accuracy: 0.9394
Epoch 50/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2901 - jaccard_coef: 0.7101 - accuracy: 0.9310 - val_loss: 0.3429 - val_jaccard_coef: 0.6571 - val_accuracy: 0.9351
Epoch 51/300
104/104 [==============================] - 17s 163ms/step - loss: 0.3024 - jaccard_coef: 0.6975 - accuracy: 0.9288 - val_loss: 0.3474 - val_jaccard_coef: 0.6526 - val_accuracy: 0.9313
Epoch 52/300
104/104 [==============================] - 17s 162ms/step - loss: 0.3038 - jaccard_coef: 0.6960 - accuracy: 0.9327 - val_loss: 0.3561 - val_jaccard_coef: 0.6439 - val_accuracy: 0.9241
Epoch 53/300
104/104 [==============================] - 17s 168ms/step - loss: 0.2981 - jaccard_coef: 0.7021 - accuracy: 0.9306 - val_loss: 0.3504 - val_jaccard_coef: 0.6496 - val_accuracy: 0.9377
Epoch 54/300
104/104 [==============================] - 17s 162ms/step - loss: 0.3009 - jaccard_coef: 0.6992 - accuracy: 0.9319 - val_loss: 0.3388 - val_jaccard_coef: 0.6612 - val_accuracy: 0.9323
Epoch 55/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2823 - jaccard_coef: 0.7178 - accuracy: 0.9354 - val_loss: 0.3278 - val_jaccard_coef: 0.6722 - val_accuracy: 0.9395
Epoch 56/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2937 - jaccard_coef: 0.7064 - accuracy: 0.9315 - val_loss: 0.3299 - val_jaccard_coef: 0.6701 - val_accuracy: 0.9372
Epoch 57/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2889 - jaccard_coef: 0.7111 - accuracy: 0.9310 - val_loss: 0.3336 - val_jaccard_coef: 0.6664 - val_accuracy: 0.9412
Epoch 58/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2823 - jaccard_coef: 0.7175 - accuracy: 0.9343 - val_loss: 0.3319 - val_jaccard_coef: 0.6681 - val_accuracy: 0.9378
Epoch 59/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2851 - jaccard_coef: 0.7149 - accuracy: 0.9322 - val_loss: 0.3261 - val_jaccard_coef: 0.6739 - val_accuracy: 0.9359
Epoch 60/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2938 - jaccard_coef: 0.7062 - accuracy: 0.9326 - val_loss: 0.3288 - val_jaccard_coef: 0.6712 - val_accuracy: 0.9368
Epoch 61/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2804 - jaccard_coef: 0.7195 - accuracy: 0.9355 - val_loss: 0.3388 - val_jaccard_coef: 0.6612 - val_accuracy: 0.9314
Epoch 62/300
104/104 [==============================] - 16s 158ms/step - loss: 0.2808 - jaccard_coef: 0.7193 - accuracy: 0.9334 - val_loss: 0.3151 - val_jaccard_coef: 0.6849 - val_accuracy: 0.9446
Epoch 63/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2763 - jaccard_coef: 0.7237 - accuracy: 0.9346 - val_loss: 0.3182 - val_jaccard_coef: 0.6818 - val_accuracy: 0.9401
Epoch 64/300
104/104 [==============================] - 17s 159ms/step - loss: 0.2859 - jaccard_coef: 0.7142 - accuracy: 0.9318 - val_loss: 0.3087 - val_jaccard_coef: 0.6913 - val_accuracy: 0.9438
Epoch 65/300
104/104 [==============================] - 17s 167ms/step - loss: 0.2825 - jaccard_coef: 0.7175 - accuracy: 0.9317 - val_loss: 0.3309 - val_jaccard_coef: 0.6691 - val_accuracy: 0.9298
Epoch 66/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2743 - jaccard_coef: 0.7257 - accuracy: 0.9345 - val_loss: 0.3093 - val_jaccard_coef: 0.6907 - val_accuracy: 0.9441
Epoch 67/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2759 - jaccard_coef: 0.7242 - accuracy: 0.9355 - val_loss: 0.3089 - val_jaccard_coef: 0.6911 - val_accuracy: 0.9429
Epoch 68/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2790 - jaccard_coef: 0.7209 - accuracy: 0.9340 - val_loss: 0.3080 - val_jaccard_coef: 0.6920 - val_accuracy: 0.9438
Epoch 69/300
104/104 [==============================] - 16s 159ms/step - loss: 0.2663 - jaccard_coef: 0.7337 - accuracy: 0.9382 - val_loss: 0.3225 - val_jaccard_coef: 0.6775 - val_accuracy: 0.9326
Epoch 70/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2752 - jaccard_coef: 0.7249 - accuracy: 0.9347 - val_loss: 0.3169 - val_jaccard_coef: 0.6831 - val_accuracy: 0.9350
Epoch 71/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2663 - jaccard_coef: 0.7338 - accuracy: 0.9365 - val_loss: 0.3004 - val_jaccard_coef: 0.6996 - val_accuracy: 0.9448
Epoch 72/300
104/104 [==============================] - 17s 159ms/step - loss: 0.2749 - jaccard_coef: 0.7249 - accuracy: 0.9352 - val_loss: 0.3066 - val_jaccard_coef: 0.6934 - val_accuracy: 0.9430
Epoch 73/300
104/104 [==============================] - 17s 166ms/step - loss: 0.2669 - jaccard_coef: 0.7332 - accuracy: 0.9346 - val_loss: 0.2988 - val_jaccard_coef: 0.7012 - val_accuracy: 0.9449
Epoch 74/300
104/104 [==============================] - 17s 163ms/step - loss: 0.2578 - jaccard_coef: 0.7420 - accuracy: 0.9373 - val_loss: 0.2966 - val_jaccard_coef: 0.7034 - val_accuracy: 0.9431
Epoch 75/300
104/104 [==============================] - 17s 165ms/step - loss: 0.2714 - jaccard_coef: 0.7286 - accuracy: 0.9362 - val_loss: 0.3108 - val_jaccard_coef: 0.6892 - val_accuracy: 0.9412
Epoch 76/300
104/104 [==============================] - 17s 164ms/step - loss: 0.2604 - jaccard_coef: 0.7398 - accuracy: 0.9338 - val_loss: 0.3043 - val_jaccard_coef: 0.6957 - val_accuracy: 0.9371
Epoch 77/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2622 - jaccard_coef: 0.7377 - accuracy: 0.9390 - val_loss: 0.3002 - val_jaccard_coef: 0.6998 - val_accuracy: 0.9415
Epoch 78/300
104/104 [==============================] - 17s 159ms/step - loss: 0.2576 - jaccard_coef: 0.7426 - accuracy: 0.9360 - val_loss: 0.2903 - val_jaccard_coef: 0.7097 - val_accuracy: 0.9442
Epoch 79/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2612 - jaccard_coef: 0.7389 - accuracy: 0.9383 - val_loss: 0.2914 - val_jaccard_coef: 0.7086 - val_accuracy: 0.9436
Epoch 80/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2537 - jaccard_coef: 0.7463 - accuracy: 0.9354 - val_loss: 0.2884 - val_jaccard_coef: 0.7116 - val_accuracy: 0.9444
Epoch 81/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2465 - jaccard_coef: 0.7536 - accuracy: 0.9408 - val_loss: 0.2876 - val_jaccard_coef: 0.7124 - val_accuracy: 0.9442
Epoch 82/300
104/104 [==============================] - 17s 164ms/step - loss: 0.2586 - jaccard_coef: 0.7414 - accuracy: 0.9400 - val_loss: 0.2948 - val_jaccard_coef: 0.7052 - val_accuracy: 0.9432
Epoch 83/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2458 - jaccard_coef: 0.7543 - accuracy: 0.9360 - val_loss: 0.2921 - val_jaccard_coef: 0.7079 - val_accuracy: 0.9397
Epoch 84/300
104/104 [==============================] - 16s 158ms/step - loss: 0.2509 - jaccard_coef: 0.7491 - accuracy: 0.9363 - val_loss: 0.2779 - val_jaccard_coef: 0.7221 - val_accuracy: 0.9469
Epoch 85/300
104/104 [==============================] - 16s 157ms/step - loss: 0.2494 - jaccard_coef: 0.7507 - accuracy: 0.9403 - val_loss: 0.2837 - val_jaccard_coef: 0.7163 - val_accuracy: 0.9428
Epoch 86/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2470 - jaccard_coef: 0.7527 - accuracy: 0.9427 - val_loss: 0.2865 - val_jaccard_coef: 0.7135 - val_accuracy: 0.9432
Epoch 87/300
104/104 [==============================] - 17s 165ms/step - loss: 0.2443 - jaccard_coef: 0.7554 - accuracy: 0.9376 - val_loss: 0.3035 - val_jaccard_coef: 0.6965 - val_accuracy: 0.9421
Epoch 88/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2545 - jaccard_coef: 0.7453 - accuracy: 0.9372 - val_loss: 0.3103 - val_jaccard_coef: 0.6897 - val_accuracy: 0.9345
Epoch 89/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2513 - jaccard_coef: 0.7490 - accuracy: 0.9374 - val_loss: 0.2766 - val_jaccard_coef: 0.7234 - val_accuracy: 0.9466
Epoch 90/300
104/104 [==============================] - 17s 159ms/step - loss: 0.2385 - jaccard_coef: 0.7617 - accuracy: 0.9408 - val_loss: 0.2765 - val_jaccard_coef: 0.7235 - val_accuracy: 0.9451
Epoch 91/300
104/104 [==============================] - 17s 164ms/step - loss: 0.2452 - jaccard_coef: 0.7550 - accuracy: 0.9360 - val_loss: 0.2749 - val_jaccard_coef: 0.7251 - val_accuracy: 0.9438
Epoch 92/300
104/104 [==============================] - 16s 156ms/step - loss: 0.2417 - jaccard_coef: 0.7583 - accuracy: 0.9412 - val_loss: 0.2722 - val_jaccard_coef: 0.7278 - val_accuracy: 0.9459
Epoch 93/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2401 - jaccard_coef: 0.7599 - accuracy: 0.9391 - val_loss: 0.2732 - val_jaccard_coef: 0.7268 - val_accuracy: 0.9456
Epoch 94/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2354 - jaccard_coef: 0.7647 - accuracy: 0.9405 - val_loss: 0.2908 - val_jaccard_coef: 0.7092 - val_accuracy: 0.9352
Epoch 95/300
104/104 [==============================] - 17s 163ms/step - loss: 0.2383 - jaccard_coef: 0.7618 - accuracy: 0.9417 - val_loss: 0.2772 - val_jaccard_coef: 0.7228 - val_accuracy: 0.9460
Epoch 96/300
104/104 [==============================] - 16s 159ms/step - loss: 0.2386 - jaccard_coef: 0.7613 - accuracy: 0.9399 - val_loss: 0.2742 - val_jaccard_coef: 0.7258 - val_accuracy: 0.9447
Epoch 97/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2367 - jaccard_coef: 0.7633 - accuracy: 0.9386 - val_loss: 0.2700 - val_jaccard_coef: 0.7300 - val_accuracy: 0.9464
Epoch 98/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2435 - jaccard_coef: 0.7565 - accuracy: 0.9366 - val_loss: 0.3061 - val_jaccard_coef: 0.6939 - val_accuracy: 0.9275
Epoch 99/300
104/104 [==============================] - 16s 158ms/step - loss: 0.2361 - jaccard_coef: 0.7639 - accuracy: 0.9397 - val_loss: 0.2736 - val_jaccard_coef: 0.7264 - val_accuracy: 0.9453
Epoch 100/300
104/104 [==============================] - 16s 156ms/step - loss: 0.2302 - jaccard_coef: 0.7698 - accuracy: 0.9424 - val_loss: 0.2694 - val_jaccard_coef: 0.7306 - val_accuracy: 0.9450
Epoch 101/300
104/104 [==============================] - 16s 159ms/step - loss: 0.2362 - jaccard_coef: 0.7639 - accuracy: 0.9382 - val_loss: 0.2658 - val_jaccard_coef: 0.7342 - val_accuracy: 0.9458
Epoch 102/300
104/104 [==============================] - 16s 159ms/step - loss: 0.2251 - jaccard_coef: 0.7750 - accuracy: 0.9411 - val_loss: 0.2954 - val_jaccard_coef: 0.7046 - val_accuracy: 0.9302
Epoch 103/300
104/104 [==============================] - 16s 157ms/step - loss: 0.2258 - jaccard_coef: 0.7742 - accuracy: 0.9414 - val_loss: 0.2663 - val_jaccard_coef: 0.7337 - val_accuracy: 0.9464
Epoch 104/300
104/104 [==============================] - 17s 165ms/step - loss: 0.2311 - jaccard_coef: 0.7690 - accuracy: 0.9425 - val_loss: 0.2679 - val_jaccard_coef: 0.7321 - val_accuracy: 0.9466
Epoch 105/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2263 - jaccard_coef: 0.7734 - accuracy: 0.9401 - val_loss: 0.2698 - val_jaccard_coef: 0.7302 - val_accuracy: 0.9426
Epoch 106/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2263 - jaccard_coef: 0.7738 - accuracy: 0.9400 - val_loss: 0.2760 - val_jaccard_coef: 0.7240 - val_accuracy: 0.9429
Epoch 107/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2294 - jaccard_coef: 0.7706 - accuracy: 0.9420 - val_loss: 0.2628 - val_jaccard_coef: 0.7372 - val_accuracy: 0.9452
Epoch 108/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2196 - jaccard_coef: 0.7804 - accuracy: 0.9431 - val_loss: 0.2869 - val_jaccard_coef: 0.7131 - val_accuracy: 0.9414
Epoch 109/300
104/104 [==============================] - 16s 157ms/step - loss: 0.2231 - jaccard_coef: 0.7768 - accuracy: 0.9424 - val_loss: 0.2575 - val_jaccard_coef: 0.7425 - val_accuracy: 0.9477
Epoch 110/300
104/104 [==============================] - 16s 159ms/step - loss: 0.2234 - jaccard_coef: 0.7768 - accuracy: 0.9422 - val_loss: 0.2708 - val_jaccard_coef: 0.7292 - val_accuracy: 0.9411
Epoch 111/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2147 - jaccard_coef: 0.7853 - accuracy: 0.9430 - val_loss: 0.2700 - val_jaccard_coef: 0.7300 - val_accuracy: 0.9398
Epoch 112/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2286 - jaccard_coef: 0.7712 - accuracy: 0.9423 - val_loss: 0.2606 - val_jaccard_coef: 0.7394 - val_accuracy: 0.9468
Epoch 113/300
104/104 [==============================] - 17s 165ms/step - loss: 0.2151 - jaccard_coef: 0.7848 - accuracy: 0.9416 - val_loss: 0.2560 - val_jaccard_coef: 0.7440 - val_accuracy: 0.9471
Epoch 114/300
104/104 [==============================] - 17s 163ms/step - loss: 0.2223 - jaccard_coef: 0.7777 - accuracy: 0.9400 - val_loss: 0.2689 - val_jaccard_coef: 0.7311 - val_accuracy: 0.9388
Epoch 115/300
104/104 [==============================] - 17s 165ms/step - loss: 0.2195 - jaccard_coef: 0.7801 - accuracy: 0.9412 - val_loss: 0.2541 - val_jaccard_coef: 0.7459 - val_accuracy: 0.9459
Epoch 116/300
104/104 [==============================] - 17s 165ms/step - loss: 0.2161 - jaccard_coef: 0.7838 - accuracy: 0.9409 - val_loss: 0.2487 - val_jaccard_coef: 0.7513 - val_accuracy: 0.9488
Epoch 117/300
104/104 [==============================] - 17s 163ms/step - loss: 0.2178 - jaccard_coef: 0.7823 - accuracy: 0.9425 - val_loss: 0.2631 - val_jaccard_coef: 0.7369 - val_accuracy: 0.9452
Epoch 118/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2174 - jaccard_coef: 0.7824 - accuracy: 0.9422 - val_loss: 0.2497 - val_jaccard_coef: 0.7503 - val_accuracy: 0.9473
Epoch 119/300
104/104 [==============================] - 17s 166ms/step - loss: 0.2207 - jaccard_coef: 0.7791 - accuracy: 0.9414 - val_loss: 0.2542 - val_jaccard_coef: 0.7458 - val_accuracy: 0.9465
Epoch 120/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2164 - jaccard_coef: 0.7838 - accuracy: 0.9428 - val_loss: 0.2573 - val_jaccard_coef: 0.7427 - val_accuracy: 0.9451
Epoch 121/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2158 - jaccard_coef: 0.7842 - accuracy: 0.9419 - val_loss: 0.2533 - val_jaccard_coef: 0.7467 - val_accuracy: 0.9445
Epoch 122/300
104/104 [==============================] - 17s 163ms/step - loss: 0.2120 - jaccard_coef: 0.7880 - accuracy: 0.9427 - val_loss: 0.2487 - val_jaccard_coef: 0.7513 - val_accuracy: 0.9468
Epoch 123/300
104/104 [==============================] - 17s 162ms/step - loss: 0.2133 - jaccard_coef: 0.7867 - accuracy: 0.9426 - val_loss: 0.2598 - val_jaccard_coef: 0.7402 - val_accuracy: 0.9413
Epoch 124/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2193 - jaccard_coef: 0.7808 - accuracy: 0.9400 - val_loss: 0.2653 - val_jaccard_coef: 0.7347 - val_accuracy: 0.9410
Epoch 125/300
104/104 [==============================] - 17s 167ms/step - loss: 0.2061 - jaccard_coef: 0.7940 - accuracy: 0.9436 - val_loss: 0.2559 - val_jaccard_coef: 0.7441 - val_accuracy: 0.9423
Epoch 126/300
104/104 [==============================] - 17s 163ms/step - loss: 0.2118 - jaccard_coef: 0.7884 - accuracy: 0.9420 - val_loss: 0.2417 - val_jaccard_coef: 0.7583 - val_accuracy: 0.9483
Epoch 127/300
104/104 [==============================] - 16s 159ms/step - loss: 0.2066 - jaccard_coef: 0.7933 - accuracy: 0.9435 - val_loss: 0.2408 - val_jaccard_coef: 0.7592 - val_accuracy: 0.9489
Epoch 128/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2068 - jaccard_coef: 0.7933 - accuracy: 0.9425 - val_loss: 0.2454 - val_jaccard_coef: 0.7546 - val_accuracy: 0.9468
Epoch 129/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2167 - jaccard_coef: 0.7831 - accuracy: 0.9409 - val_loss: 0.2620 - val_jaccard_coef: 0.7380 - val_accuracy: 0.9438
Epoch 130/300
104/104 [==============================] - 17s 166ms/step - loss: 0.2038 - jaccard_coef: 0.7964 - accuracy: 0.9437 - val_loss: 0.2436 - val_jaccard_coef: 0.7564 - val_accuracy: 0.9463
Epoch 131/300
104/104 [==============================] - 17s 161ms/step - loss: 0.2051 - jaccard_coef: 0.7947 - accuracy: 0.9433 - val_loss: 0.2417 - val_jaccard_coef: 0.7583 - val_accuracy: 0.9490
Epoch 132/300
104/104 [==============================] - 17s 163ms/step - loss: 0.2092 - jaccard_coef: 0.7906 - accuracy: 0.9435 - val_loss: 0.2452 - val_jaccard_coef: 0.7548 - val_accuracy: 0.9455
Epoch 133/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1987 - jaccard_coef: 0.8013 - accuracy: 0.9441 - val_loss: 0.2458 - val_jaccard_coef: 0.7542 - val_accuracy: 0.9444
Epoch 134/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1992 - jaccard_coef: 0.8009 - accuracy: 0.9461 - val_loss: 0.2413 - val_jaccard_coef: 0.7587 - val_accuracy: 0.9463
Epoch 135/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1971 - jaccard_coef: 0.8026 - accuracy: 0.9450 - val_loss: 0.2427 - val_jaccard_coef: 0.7573 - val_accuracy: 0.9457
Epoch 136/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1993 - jaccard_coef: 0.8007 - accuracy: 0.9431 - val_loss: 0.2701 - val_jaccard_coef: 0.7299 - val_accuracy: 0.9405
Epoch 137/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1981 - jaccard_coef: 0.8020 - accuracy: 0.9452 - val_loss: 0.2630 - val_jaccard_coef: 0.7370 - val_accuracy: 0.9381
Epoch 138/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1979 - jaccard_coef: 0.8022 - accuracy: 0.9444 - val_loss: 0.2350 - val_jaccard_coef: 0.7650 - val_accuracy: 0.9483
Epoch 139/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1955 - jaccard_coef: 0.8046 - accuracy: 0.9449 - val_loss: 0.2397 - val_jaccard_coef: 0.7603 - val_accuracy: 0.9456
Epoch 140/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1998 - jaccard_coef: 0.8003 - accuracy: 0.9444 - val_loss: 0.2331 - val_jaccard_coef: 0.7669 - val_accuracy: 0.9487
Epoch 141/300
104/104 [==============================] - 17s 164ms/step - loss: 0.2003 - jaccard_coef: 0.7996 - accuracy: 0.9417 - val_loss: 0.2340 - val_jaccard_coef: 0.7660 - val_accuracy: 0.9479
Epoch 142/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1904 - jaccard_coef: 0.8097 - accuracy: 0.9458 - val_loss: 0.2438 - val_jaccard_coef: 0.7562 - val_accuracy: 0.9442
Epoch 143/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1958 - jaccard_coef: 0.8042 - accuracy: 0.9467 - val_loss: 0.2389 - val_jaccard_coef: 0.7611 - val_accuracy: 0.9471
Epoch 144/300
104/104 [==============================] - 17s 160ms/step - loss: 0.2057 - jaccard_coef: 0.7944 - accuracy: 0.9421 - val_loss: 0.2416 - val_jaccard_coef: 0.7584 - val_accuracy: 0.9474
Epoch 145/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1953 - jaccard_coef: 0.8048 - accuracy: 0.9425 - val_loss: 0.2426 - val_jaccard_coef: 0.7574 - val_accuracy: 0.9464
Epoch 146/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1962 - jaccard_coef: 0.8037 - accuracy: 0.9449 - val_loss: 0.2322 - val_jaccard_coef: 0.7678 - val_accuracy: 0.9488
Epoch 147/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1930 - jaccard_coef: 0.8071 - accuracy: 0.9437 - val_loss: 0.2315 - val_jaccard_coef: 0.7685 - val_accuracy: 0.9485
Epoch 148/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1922 - jaccard_coef: 0.8079 - accuracy: 0.9438 - val_loss: 0.2297 - val_jaccard_coef: 0.7703 - val_accuracy: 0.9483
Epoch 149/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1950 - jaccard_coef: 0.8049 - accuracy: 0.9452 - val_loss: 0.2386 - val_jaccard_coef: 0.7614 - val_accuracy: 0.9468
Epoch 150/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1917 - jaccard_coef: 0.8082 - accuracy: 0.9440 - val_loss: 0.2311 - val_jaccard_coef: 0.7689 - val_accuracy: 0.9489
Epoch 151/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1910 - jaccard_coef: 0.8090 - accuracy: 0.9450 - val_loss: 0.2274 - val_jaccard_coef: 0.7726 - val_accuracy: 0.9494
Epoch 152/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1931 - jaccard_coef: 0.8069 - accuracy: 0.9454 - val_loss: 0.2354 - val_jaccard_coef: 0.7646 - val_accuracy: 0.9469
Epoch 153/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1873 - jaccard_coef: 0.8128 - accuracy: 0.9451 - val_loss: 0.2411 - val_jaccard_coef: 0.7589 - val_accuracy: 0.9420
Epoch 154/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1911 - jaccard_coef: 0.8090 - accuracy: 0.9459 - val_loss: 0.2271 - val_jaccard_coef: 0.7729 - val_accuracy: 0.9489
Epoch 155/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1883 - jaccard_coef: 0.8118 - accuracy: 0.9438 - val_loss: 0.2258 - val_jaccard_coef: 0.7742 - val_accuracy: 0.9491
Epoch 156/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1859 - jaccard_coef: 0.8139 - accuracy: 0.9482 - val_loss: 0.2249 - val_jaccard_coef: 0.7751 - val_accuracy: 0.9503
Epoch 157/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1882 - jaccard_coef: 0.8119 - accuracy: 0.9454 - val_loss: 0.2340 - val_jaccard_coef: 0.7660 - val_accuracy: 0.9454
Epoch 158/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1845 - jaccard_coef: 0.8154 - accuracy: 0.9438 - val_loss: 0.2227 - val_jaccard_coef: 0.7773 - val_accuracy: 0.9497
Epoch 159/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1841 - jaccard_coef: 0.8160 - accuracy: 0.9465 - val_loss: 0.2314 - val_jaccard_coef: 0.7686 - val_accuracy: 0.9460
Epoch 160/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1842 - jaccard_coef: 0.8159 - accuracy: 0.9456 - val_loss: 0.2353 - val_jaccard_coef: 0.7647 - val_accuracy: 0.9451
Epoch 161/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1882 - jaccard_coef: 0.8119 - accuracy: 0.9454 - val_loss: 0.2268 - val_jaccard_coef: 0.7732 - val_accuracy: 0.9483
Epoch 162/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1927 - jaccard_coef: 0.8065 - accuracy: 0.9433 - val_loss: 0.2327 - val_jaccard_coef: 0.7673 - val_accuracy: 0.9484
Epoch 163/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1796 - jaccard_coef: 0.8206 - accuracy: 0.9454 - val_loss: 0.2301 - val_jaccard_coef: 0.7699 - val_accuracy: 0.9480
Epoch 164/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1885 - jaccard_coef: 0.8116 - accuracy: 0.9466 - val_loss: 0.2384 - val_jaccard_coef: 0.7616 - val_accuracy: 0.9464
Epoch 165/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1872 - jaccard_coef: 0.8129 - accuracy: 0.9433 - val_loss: 0.2423 - val_jaccard_coef: 0.7577 - val_accuracy: 0.9452
Epoch 166/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1793 - jaccard_coef: 0.8207 - accuracy: 0.9476 - val_loss: 0.2247 - val_jaccard_coef: 0.7753 - val_accuracy: 0.9491
Epoch 167/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1858 - jaccard_coef: 0.8142 - accuracy: 0.9454 - val_loss: 0.2238 - val_jaccard_coef: 0.7762 - val_accuracy: 0.9490
Epoch 168/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1847 - jaccard_coef: 0.8154 - accuracy: 0.9470 - val_loss: 0.2178 - val_jaccard_coef: 0.7822 - val_accuracy: 0.9500
Epoch 169/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1808 - jaccard_coef: 0.8193 - accuracy: 0.9454 - val_loss: 0.2275 - val_jaccard_coef: 0.7725 - val_accuracy: 0.9463
Epoch 170/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1803 - jaccard_coef: 0.8193 - accuracy: 0.9460 - val_loss: 0.2303 - val_jaccard_coef: 0.7697 - val_accuracy: 0.9441
Epoch 171/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1843 - jaccard_coef: 0.8158 - accuracy: 0.9452 - val_loss: 0.2228 - val_jaccard_coef: 0.7772 - val_accuracy: 0.9480
Epoch 172/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1786 - jaccard_coef: 0.8213 - accuracy: 0.9459 - val_loss: 0.2197 - val_jaccard_coef: 0.7803 - val_accuracy: 0.9492
Epoch 173/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1782 - jaccard_coef: 0.8216 - accuracy: 0.9462 - val_loss: 0.2324 - val_jaccard_coef: 0.7676 - val_accuracy: 0.9477
Epoch 174/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1820 - jaccard_coef: 0.8181 - accuracy: 0.9467 - val_loss: 0.2156 - val_jaccard_coef: 0.7844 - val_accuracy: 0.9505
Epoch 175/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1769 - jaccard_coef: 0.8233 - accuracy: 0.9462 - val_loss: 0.2122 - val_jaccard_coef: 0.7878 - val_accuracy: 0.9511
Epoch 176/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1743 - jaccard_coef: 0.8258 - accuracy: 0.9492 - val_loss: 0.2123 - val_jaccard_coef: 0.7877 - val_accuracy: 0.9516
Epoch 177/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1787 - jaccard_coef: 0.8213 - accuracy: 0.9448 - val_loss: 0.2298 - val_jaccard_coef: 0.7702 - val_accuracy: 0.9481
Epoch 178/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1729 - jaccard_coef: 0.8273 - accuracy: 0.9475 - val_loss: 0.2178 - val_jaccard_coef: 0.7822 - val_accuracy: 0.9498
Epoch 179/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1722 - jaccard_coef: 0.8279 - accuracy: 0.9466 - val_loss: 0.2221 - val_jaccard_coef: 0.7779 - val_accuracy: 0.9476
Epoch 180/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1848 - jaccard_coef: 0.8151 - accuracy: 0.9458 - val_loss: 0.2171 - val_jaccard_coef: 0.7829 - val_accuracy: 0.9507
Epoch 181/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1694 - jaccard_coef: 0.8307 - accuracy: 0.9457 - val_loss: 0.2209 - val_jaccard_coef: 0.7791 - val_accuracy: 0.9476
Epoch 182/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1740 - jaccard_coef: 0.8261 - accuracy: 0.9482 - val_loss: 0.2121 - val_jaccard_coef: 0.7879 - val_accuracy: 0.9511
Epoch 183/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1743 - jaccard_coef: 0.8258 - accuracy: 0.9478 - val_loss: 0.2110 - val_jaccard_coef: 0.7890 - val_accuracy: 0.9521
Epoch 184/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1768 - jaccard_coef: 0.8232 - accuracy: 0.9460 - val_loss: 0.2229 - val_jaccard_coef: 0.7771 - val_accuracy: 0.9461
Epoch 185/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1818 - jaccard_coef: 0.8181 - accuracy: 0.9450 - val_loss: 0.2139 - val_jaccard_coef: 0.7861 - val_accuracy: 0.9511
Epoch 186/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1751 - jaccard_coef: 0.8249 - accuracy: 0.9470 - val_loss: 0.2141 - val_jaccard_coef: 0.7859 - val_accuracy: 0.9499
Epoch 187/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1733 - jaccard_coef: 0.8268 - accuracy: 0.9460 - val_loss: 0.2169 - val_jaccard_coef: 0.7831 - val_accuracy: 0.9489
Epoch 188/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1780 - jaccard_coef: 0.8222 - accuracy: 0.9454 - val_loss: 0.2257 - val_jaccard_coef: 0.7743 - val_accuracy: 0.9456
Epoch 189/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1681 - jaccard_coef: 0.8320 - accuracy: 0.9487 - val_loss: 0.2131 - val_jaccard_coef: 0.7869 - val_accuracy: 0.9504
Epoch 190/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1693 - jaccard_coef: 0.8308 - accuracy: 0.9478 - val_loss: 0.2212 - val_jaccard_coef: 0.7788 - val_accuracy: 0.9482
Epoch 191/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1792 - jaccard_coef: 0.8207 - accuracy: 0.9466 - val_loss: 0.2198 - val_jaccard_coef: 0.7802 - val_accuracy: 0.9470
Epoch 192/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1658 - jaccard_coef: 0.8343 - accuracy: 0.9474 - val_loss: 0.2138 - val_jaccard_coef: 0.7862 - val_accuracy: 0.9498
Epoch 193/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1699 - jaccard_coef: 0.8302 - accuracy: 0.9478 - val_loss: 0.2147 - val_jaccard_coef: 0.7853 - val_accuracy: 0.9499
Epoch 194/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1736 - jaccard_coef: 0.8262 - accuracy: 0.9445 - val_loss: 0.2172 - val_jaccard_coef: 0.7828 - val_accuracy: 0.9482
Epoch 195/300
104/104 [==============================] - 18s 169ms/step - loss: 0.1675 - jaccard_coef: 0.8326 - accuracy: 0.9485 - val_loss: 0.2257 - val_jaccard_coef: 0.7743 - val_accuracy: 0.9437
Epoch 196/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1743 - jaccard_coef: 0.8256 - accuracy: 0.9463 - val_loss: 0.2127 - val_jaccard_coef: 0.7873 - val_accuracy: 0.9497
Epoch 197/300
104/104 [==============================] - 17s 169ms/step - loss: 0.1728 - jaccard_coef: 0.8272 - accuracy: 0.9470 - val_loss: 0.2136 - val_jaccard_coef: 0.7864 - val_accuracy: 0.9489
Epoch 198/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1633 - jaccard_coef: 0.8367 - accuracy: 0.9478 - val_loss: 0.2097 - val_jaccard_coef: 0.7903 - val_accuracy: 0.9511
Epoch 199/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1684 - jaccard_coef: 0.8314 - accuracy: 0.9454 - val_loss: 0.2252 - val_jaccard_coef: 0.7748 - val_accuracy: 0.9447
Epoch 200/300
104/104 [==============================] - 18s 171ms/step - loss: 0.1622 - jaccard_coef: 0.8377 - accuracy: 0.9503 - val_loss: 0.2224 - val_jaccard_coef: 0.7776 - val_accuracy: 0.9449
Epoch 201/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1659 - jaccard_coef: 0.8341 - accuracy: 0.9474 - val_loss: 0.2094 - val_jaccard_coef: 0.7906 - val_accuracy: 0.9510
Epoch 202/300
104/104 [==============================] - 18s 170ms/step - loss: 0.1677 - jaccard_coef: 0.8322 - accuracy: 0.9485 - val_loss: 0.2205 - val_jaccard_coef: 0.7795 - val_accuracy: 0.9472
Epoch 203/300
104/104 [==============================] - 17s 167ms/step - loss: 0.1633 - jaccard_coef: 0.8366 - accuracy: 0.9483 - val_loss: 0.2088 - val_jaccard_coef: 0.7912 - val_accuracy: 0.9509
Epoch 204/300
104/104 [==============================] - 17s 169ms/step - loss: 0.1650 - jaccard_coef: 0.8348 - accuracy: 0.9470 - val_loss: 0.2279 - val_jaccard_coef: 0.7721 - val_accuracy: 0.9427
Epoch 205/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1696 - jaccard_coef: 0.8305 - accuracy: 0.9466 - val_loss: 0.2083 - val_jaccard_coef: 0.7917 - val_accuracy: 0.9513
Epoch 206/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1652 - jaccard_coef: 0.8348 - accuracy: 0.9495 - val_loss: 0.2209 - val_jaccard_coef: 0.7791 - val_accuracy: 0.9485
Epoch 207/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1777 - jaccard_coef: 0.8224 - accuracy: 0.9448 - val_loss: 0.2130 - val_jaccard_coef: 0.7870 - val_accuracy: 0.9483
Epoch 208/300
104/104 [==============================] - 17s 167ms/step - loss: 0.1672 - jaccard_coef: 0.8329 - accuracy: 0.9483 - val_loss: 0.2171 - val_jaccard_coef: 0.7829 - val_accuracy: 0.9468
Epoch 209/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1689 - jaccard_coef: 0.8312 - accuracy: 0.9461 - val_loss: 0.2139 - val_jaccard_coef: 0.7861 - val_accuracy: 0.9474
Epoch 210/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1576 - jaccard_coef: 0.8420 - accuracy: 0.9494 - val_loss: 0.2121 - val_jaccard_coef: 0.7879 - val_accuracy: 0.9479
Epoch 211/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1660 - jaccard_coef: 0.8341 - accuracy: 0.9477 - val_loss: 0.2115 - val_jaccard_coef: 0.7885 - val_accuracy: 0.9481
Epoch 212/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1652 - jaccard_coef: 0.8349 - accuracy: 0.9471 - val_loss: 0.2088 - val_jaccard_coef: 0.7912 - val_accuracy: 0.9504
Epoch 213/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1755 - jaccard_coef: 0.8246 - accuracy: 0.9445 - val_loss: 0.2194 - val_jaccard_coef: 0.7806 - val_accuracy: 0.9480
Epoch 214/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1609 - jaccard_coef: 0.8391 - accuracy: 0.9491 - val_loss: 0.2154 - val_jaccard_coef: 0.7846 - val_accuracy: 0.9493
Epoch 215/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1648 - jaccard_coef: 0.8350 - accuracy: 0.9464 - val_loss: 0.2257 - val_jaccard_coef: 0.7743 - val_accuracy: 0.9473
Epoch 216/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1698 - jaccard_coef: 0.8303 - accuracy: 0.9467 - val_loss: 0.2138 - val_jaccard_coef: 0.7862 - val_accuracy: 0.9487
Epoch 217/300
104/104 [==============================] - 17s 159ms/step - loss: 0.1618 - jaccard_coef: 0.8381 - accuracy: 0.9487 - val_loss: 0.2134 - val_jaccard_coef: 0.7866 - val_accuracy: 0.9479
Epoch 218/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1613 - jaccard_coef: 0.8387 - accuracy: 0.9470 - val_loss: 0.2142 - val_jaccard_coef: 0.7858 - val_accuracy: 0.9480
Epoch 219/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1628 - jaccard_coef: 0.8373 - accuracy: 0.9481 - val_loss: 0.2022 - val_jaccard_coef: 0.7978 - val_accuracy: 0.9514
Epoch 220/300
104/104 [==============================] - 17s 167ms/step - loss: 0.1570 - jaccard_coef: 0.8430 - accuracy: 0.9500 - val_loss: 0.2036 - val_jaccard_coef: 0.7964 - val_accuracy: 0.9516
Epoch 221/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1624 - jaccard_coef: 0.8377 - accuracy: 0.9478 - val_loss: 0.2044 - val_jaccard_coef: 0.7956 - val_accuracy: 0.9504
Epoch 222/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1603 - jaccard_coef: 0.8399 - accuracy: 0.9493 - val_loss: 0.2204 - val_jaccard_coef: 0.7796 - val_accuracy: 0.9466
Epoch 223/300
104/104 [==============================] - 17s 164ms/step - loss: 0.1590 - jaccard_coef: 0.8408 - accuracy: 0.9487 - val_loss: 0.2223 - val_jaccard_coef: 0.7777 - val_accuracy: 0.9430
Epoch 224/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1589 - jaccard_coef: 0.8411 - accuracy: 0.9481 - val_loss: 0.2079 - val_jaccard_coef: 0.7921 - val_accuracy: 0.9491
Epoch 225/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1603 - jaccard_coef: 0.8398 - accuracy: 0.9494 - val_loss: 0.2065 - val_jaccard_coef: 0.7935 - val_accuracy: 0.9505
Epoch 226/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1616 - jaccard_coef: 0.8384 - accuracy: 0.9474 - val_loss: 0.2074 - val_jaccard_coef: 0.7926 - val_accuracy: 0.9494
Epoch 227/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1629 - jaccard_coef: 0.8371 - accuracy: 0.9477 - val_loss: 0.2044 - val_jaccard_coef: 0.7956 - val_accuracy: 0.9513
Epoch 228/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1555 - jaccard_coef: 0.8446 - accuracy: 0.9498 - val_loss: 0.2030 - val_jaccard_coef: 0.7970 - val_accuracy: 0.9510
Epoch 229/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1591 - jaccard_coef: 0.8407 - accuracy: 0.9478 - val_loss: 0.2028 - val_jaccard_coef: 0.7972 - val_accuracy: 0.9509
Epoch 230/300
104/104 [==============================] - 17s 168ms/step - loss: 0.1590 - jaccard_coef: 0.8412 - accuracy: 0.9482 - val_loss: 0.2064 - val_jaccard_coef: 0.7936 - val_accuracy: 0.9483
Epoch 231/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1566 - jaccard_coef: 0.8434 - accuracy: 0.9489 - val_loss: 0.2035 - val_jaccard_coef: 0.7965 - val_accuracy: 0.9509
Epoch 232/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1574 - jaccard_coef: 0.8427 - accuracy: 0.9491 - val_loss: 0.2081 - val_jaccard_coef: 0.7919 - val_accuracy: 0.9494
Epoch 233/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1614 - jaccard_coef: 0.8386 - accuracy: 0.9473 - val_loss: 0.2058 - val_jaccard_coef: 0.7942 - val_accuracy: 0.9501
Epoch 234/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1583 - jaccard_coef: 0.8418 - accuracy: 0.9477 - val_loss: 0.2469 - val_jaccard_coef: 0.7531 - val_accuracy: 0.9332
Epoch 235/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1592 - jaccard_coef: 0.8406 - accuracy: 0.9487 - val_loss: 0.2102 - val_jaccard_coef: 0.7898 - val_accuracy: 0.9481
Epoch 236/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1559 - jaccard_coef: 0.8442 - accuracy: 0.9483 - val_loss: 0.1988 - val_jaccard_coef: 0.8012 - val_accuracy: 0.9517
Epoch 237/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1574 - jaccard_coef: 0.8426 - accuracy: 0.9503 - val_loss: 0.2114 - val_jaccard_coef: 0.7886 - val_accuracy: 0.9469
Epoch 238/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1542 - jaccard_coef: 0.8458 - accuracy: 0.9490 - val_loss: 0.2075 - val_jaccard_coef: 0.7925 - val_accuracy: 0.9476
Epoch 239/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1561 - jaccard_coef: 0.8439 - accuracy: 0.9477 - val_loss: 0.2028 - val_jaccard_coef: 0.7972 - val_accuracy: 0.9500
Epoch 240/300
104/104 [==============================] - 17s 166ms/step - loss: 0.1589 - jaccard_coef: 0.8411 - accuracy: 0.9496 - val_loss: 0.1985 - val_jaccard_coef: 0.8015 - val_accuracy: 0.9520
Epoch 241/300
104/104 [==============================] - 17s 159ms/step - loss: 0.1546 - jaccard_coef: 0.8454 - accuracy: 0.9486 - val_loss: 0.2040 - val_jaccard_coef: 0.7960 - val_accuracy: 0.9514
Epoch 242/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1624 - jaccard_coef: 0.8376 - accuracy: 0.9463 - val_loss: 0.2005 - val_jaccard_coef: 0.7995 - val_accuracy: 0.9506
Epoch 243/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1490 - jaccard_coef: 0.8509 - accuracy: 0.9508 - val_loss: 0.1973 - val_jaccard_coef: 0.8027 - val_accuracy: 0.9526
Epoch 244/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1587 - jaccard_coef: 0.8413 - accuracy: 0.9474 - val_loss: 0.2059 - val_jaccard_coef: 0.7941 - val_accuracy: 0.9511
Epoch 245/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1585 - jaccard_coef: 0.8416 - accuracy: 0.9490 - val_loss: 0.1983 - val_jaccard_coef: 0.8018 - val_accuracy: 0.9518
Epoch 246/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1565 - jaccard_coef: 0.8436 - accuracy: 0.9499 - val_loss: 0.1989 - val_jaccard_coef: 0.8011 - val_accuracy: 0.9503
Epoch 247/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1519 - jaccard_coef: 0.8481 - accuracy: 0.9480 - val_loss: 0.2025 - val_jaccard_coef: 0.7975 - val_accuracy: 0.9514
Epoch 248/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1553 - jaccard_coef: 0.8446 - accuracy: 0.9500 - val_loss: 0.2069 - val_jaccard_coef: 0.7931 - val_accuracy: 0.9494
Epoch 249/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1551 - jaccard_coef: 0.8450 - accuracy: 0.9495 - val_loss: 0.2011 - val_jaccard_coef: 0.7989 - val_accuracy: 0.9512
Epoch 250/300
104/104 [==============================] - 17s 165ms/step - loss: 0.1555 - jaccard_coef: 0.8446 - accuracy: 0.9496 - val_loss: 0.2060 - val_jaccard_coef: 0.7940 - val_accuracy: 0.9506
Epoch 251/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1459 - jaccard_coef: 0.8541 - accuracy: 0.9510 - val_loss: 0.2065 - val_jaccard_coef: 0.7935 - val_accuracy: 0.9497
Epoch 252/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1553 - jaccard_coef: 0.8449 - accuracy: 0.9499 - val_loss: 0.2043 - val_jaccard_coef: 0.7957 - val_accuracy: 0.9499
Epoch 253/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1556 - jaccard_coef: 0.8442 - accuracy: 0.9480 - val_loss: 0.2015 - val_jaccard_coef: 0.7985 - val_accuracy: 0.9498
Epoch 254/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1554 - jaccard_coef: 0.8444 - accuracy: 0.9483 - val_loss: 0.1999 - val_jaccard_coef: 0.8001 - val_accuracy: 0.9499
Epoch 255/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1577 - jaccard_coef: 0.8423 - accuracy: 0.9494 - val_loss: 0.2023 - val_jaccard_coef: 0.7977 - val_accuracy: 0.9506
Epoch 256/300
104/104 [==============================] - 17s 159ms/step - loss: 0.1502 - jaccard_coef: 0.8498 - accuracy: 0.9497 - val_loss: 0.1982 - val_jaccard_coef: 0.8018 - val_accuracy: 0.9516
Epoch 257/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1548 - jaccard_coef: 0.8453 - accuracy: 0.9490 - val_loss: 0.2213 - val_jaccard_coef: 0.7787 - val_accuracy: 0.9431
Epoch 258/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1573 - jaccard_coef: 0.8428 - accuracy: 0.9488 - val_loss: 0.2046 - val_jaccard_coef: 0.7954 - val_accuracy: 0.9488
Epoch 259/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1567 - jaccard_coef: 0.8434 - accuracy: 0.9455 - val_loss: 0.2072 - val_jaccard_coef: 0.7928 - val_accuracy: 0.9494
Epoch 260/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1563 - jaccard_coef: 0.8436 - accuracy: 0.9509 - val_loss: 0.2024 - val_jaccard_coef: 0.7976 - val_accuracy: 0.9500
Epoch 261/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1555 - jaccard_coef: 0.8445 - accuracy: 0.9468 - val_loss: 0.2002 - val_jaccard_coef: 0.7998 - val_accuracy: 0.9498
Epoch 262/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1482 - jaccard_coef: 0.8518 - accuracy: 0.9526 - val_loss: 0.2139 - val_jaccard_coef: 0.7861 - val_accuracy: 0.9492
Epoch 263/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1516 - jaccard_coef: 0.8483 - accuracy: 0.9480 - val_loss: 0.1936 - val_jaccard_coef: 0.8064 - val_accuracy: 0.9526
Epoch 264/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1524 - jaccard_coef: 0.8473 - accuracy: 0.9510 - val_loss: 0.1958 - val_jaccard_coef: 0.8042 - val_accuracy: 0.9524
Epoch 265/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1552 - jaccard_coef: 0.8449 - accuracy: 0.9475 - val_loss: 0.1971 - val_jaccard_coef: 0.8029 - val_accuracy: 0.9508
Epoch 266/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1479 - jaccard_coef: 0.8522 - accuracy: 0.9513 - val_loss: 0.1971 - val_jaccard_coef: 0.8029 - val_accuracy: 0.9520
Epoch 267/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1528 - jaccard_coef: 0.8473 - accuracy: 0.9494 - val_loss: 0.1987 - val_jaccard_coef: 0.8013 - val_accuracy: 0.9507
Epoch 268/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1552 - jaccard_coef: 0.8448 - accuracy: 0.9460 - val_loss: 0.1979 - val_jaccard_coef: 0.8021 - val_accuracy: 0.9507
Epoch 269/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1489 - jaccard_coef: 0.8512 - accuracy: 0.9512 - val_loss: 0.2011 - val_jaccard_coef: 0.7989 - val_accuracy: 0.9512
Epoch 270/300
104/104 [==============================] - 17s 161ms/step - loss: 0.1487 - jaccard_coef: 0.8514 - accuracy: 0.9499 - val_loss: 0.1954 - val_jaccard_coef: 0.8046 - val_accuracy: 0.9523
Epoch 271/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1520 - jaccard_coef: 0.8481 - accuracy: 0.9491 - val_loss: 0.2011 - val_jaccard_coef: 0.7989 - val_accuracy: 0.9515
Epoch 272/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1501 - jaccard_coef: 0.8498 - accuracy: 0.9502 - val_loss: 0.1952 - val_jaccard_coef: 0.8048 - val_accuracy: 0.9519
Epoch 273/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1418 - jaccard_coef: 0.8583 - accuracy: 0.9514 - val_loss: 0.1951 - val_jaccard_coef: 0.8049 - val_accuracy: 0.9518
Epoch 274/300
104/104 [==============================] - 17s 163ms/step - loss: 0.1520 - jaccard_coef: 0.8478 - accuracy: 0.9508 - val_loss: 0.1977 - val_jaccard_coef: 0.8023 - val_accuracy: 0.9511
Epoch 275/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1492 - jaccard_coef: 0.8509 - accuracy: 0.9487 - val_loss: 0.1949 - val_jaccard_coef: 0.8051 - val_accuracy: 0.9517
Epoch 276/300
104/104 [==============================] - 16s 154ms/step - loss: 0.1484 - jaccard_coef: 0.8517 - accuracy: 0.9498 - val_loss: 0.1949 - val_jaccard_coef: 0.8051 - val_accuracy: 0.9523
Epoch 277/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1478 - jaccard_coef: 0.8520 - accuracy: 0.9522 - val_loss: 0.1973 - val_jaccard_coef: 0.8027 - val_accuracy: 0.9516
Epoch 278/300
104/104 [==============================] - 17s 159ms/step - loss: 0.1519 - jaccard_coef: 0.8482 - accuracy: 0.9487 - val_loss: 0.2007 - val_jaccard_coef: 0.7993 - val_accuracy: 0.9517
Epoch 279/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1498 - jaccard_coef: 0.8503 - accuracy: 0.9489 - val_loss: 0.1948 - val_jaccard_coef: 0.8052 - val_accuracy: 0.9520
Epoch 280/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1501 - jaccard_coef: 0.8500 - accuracy: 0.9506 - val_loss: 0.1974 - val_jaccard_coef: 0.8026 - val_accuracy: 0.9509
Epoch 281/300
104/104 [==============================] - 16s 159ms/step - loss: 0.1440 - jaccard_coef: 0.8558 - accuracy: 0.9499 - val_loss: 0.1969 - val_jaccard_coef: 0.8031 - val_accuracy: 0.9511
Epoch 282/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1548 - jaccard_coef: 0.8451 - accuracy: 0.9484 - val_loss: 0.2532 - val_jaccard_coef: 0.7468 - val_accuracy: 0.9387
Epoch 283/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1505 - jaccard_coef: 0.8496 - accuracy: 0.9485 - val_loss: 0.1956 - val_jaccard_coef: 0.8044 - val_accuracy: 0.9522
Epoch 284/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1446 - jaccard_coef: 0.8554 - accuracy: 0.9529 - val_loss: 0.2016 - val_jaccard_coef: 0.7984 - val_accuracy: 0.9508
Epoch 285/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1440 - jaccard_coef: 0.8557 - accuracy: 0.9503 - val_loss: 0.1962 - val_jaccard_coef: 0.8038 - val_accuracy: 0.9515
Epoch 286/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1468 - jaccard_coef: 0.8533 - accuracy: 0.9497 - val_loss: 0.2042 - val_jaccard_coef: 0.7958 - val_accuracy: 0.9475
Epoch 287/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1516 - jaccard_coef: 0.8484 - accuracy: 0.9513 - val_loss: 0.2037 - val_jaccard_coef: 0.7963 - val_accuracy: 0.9485
Epoch 288/300
104/104 [==============================] - 17s 162ms/step - loss: 0.1495 - jaccard_coef: 0.8506 - accuracy: 0.9477 - val_loss: 0.1970 - val_jaccard_coef: 0.8030 - val_accuracy: 0.9513
Epoch 289/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1428 - jaccard_coef: 0.8572 - accuracy: 0.9525 - val_loss: 0.1931 - val_jaccard_coef: 0.8069 - val_accuracy: 0.9518
Epoch 290/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1474 - jaccard_coef: 0.8528 - accuracy: 0.9499 - val_loss: 0.2001 - val_jaccard_coef: 0.7999 - val_accuracy: 0.9503
Epoch 291/300
104/104 [==============================] - 17s 160ms/step - loss: 0.1456 - jaccard_coef: 0.8544 - accuracy: 0.9509 - val_loss: 0.1983 - val_jaccard_coef: 0.8017 - val_accuracy: 0.9509
Epoch 292/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1417 - jaccard_coef: 0.8579 - accuracy: 0.9523 - val_loss: 0.1966 - val_jaccard_coef: 0.8034 - val_accuracy: 0.9512
Epoch 293/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1463 - jaccard_coef: 0.8538 - accuracy: 0.9515 - val_loss: 0.1995 - val_jaccard_coef: 0.8005 - val_accuracy: 0.9506
Epoch 294/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1503 - jaccard_coef: 0.8499 - accuracy: 0.9506 - val_loss: 0.1937 - val_jaccard_coef: 0.8063 - val_accuracy: 0.9525
Epoch 295/300
104/104 [==============================] - 16s 158ms/step - loss: 0.1501 - jaccard_coef: 0.8500 - accuracy: 0.9480 - val_loss: 0.1935 - val_jaccard_coef: 0.8065 - val_accuracy: 0.9522
Epoch 296/300
104/104 [==============================] - 16s 156ms/step - loss: 0.1459 - jaccard_coef: 0.8541 - accuracy: 0.9492 - val_loss: 0.2017 - val_jaccard_coef: 0.7983 - val_accuracy: 0.9489
Epoch 297/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1524 - jaccard_coef: 0.8477 - accuracy: 0.9496 - val_loss: 0.1943 - val_jaccard_coef: 0.8057 - val_accuracy: 0.9525
Epoch 298/300
104/104 [==============================] - 16s 157ms/step - loss: 0.1404 - jaccard_coef: 0.8598 - accuracy: 0.9515 - val_loss: 0.1932 - val_jaccard_coef: 0.8068 - val_accuracy: 0.9525
Epoch 299/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1466 - jaccard_coef: 0.8535 - accuracy: 0.9519 - val_loss: 0.1989 - val_jaccard_coef: 0.8011 - val_accuracy: 0.9496
Epoch 300/300
104/104 [==============================] - 16s 155ms/step - loss: 0.1534 - jaccard_coef: 0.8467 - accuracy: 0.9474 - val_loss: 0.1940 - val_jaccard_coef: 0.8060 - val_accuracy: 0.9517
In [ ]:
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='lower right')
plt.show()

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper right')
plt.show()
No description has been provided for this image
No description has been provided for this image

Thus, we can calculate the accuracy for the test set:

In [ ]:
predict = model.predict(x_test)
12/12 [==============================] - 5s 90ms/step
In [ ]:
pred = np.round(predict)
In [ ]:
accuracy = accuracy_score(y_test.flatten(),pred.flatten())
print(accuracy)
0.9517484452989367
In [ ]:
y_test.shape
Out[ ]:
(360, 256, 256)

Finally, let's plot an example of the predicted result compared to the original mask:

In [ ]:
i = 0
plt.figure(figsize=(20,8))
plt.subplot(1,3,1),
plt.imshow(x_test[i])
plt.subplot(1,3,2),
plt.imshow(y_test[i,:,:])
plt.subplot(1,3,3),
plt.imshow(pred[i,:,:,0])
Out[ ]:
<matplotlib.image.AxesImage at 0x7e5bb1cdcc40>
No description has been provided for this image