Pixel classification of satellite images with artificial neural networks.¶

Pixel land use and land cover classification is a process of assigning specific labels or classes to each pixel in a remote sensing image, indicating the type of land use or land cover present at that location. There are several approaches to performing this classification, including supervised and unsupervised techniques.

Supervised Techniques: In supervised techniques, you need a labeled dataset, where each pixel in the image is assigned to a specific land use or land cover class. Labels can be obtained through field surveys, visual interpretation or other remote sensing techniques. Supervised classification involves training a model based on this labeled data and then using the trained model to classify new pixels.

The most common supervised techniques include machine learning algorithms such as Random Forest, Support Vector Machines (SVM), Artificial Neural Networks (ANN), and others. These algorithms use characteristics extracted from pixels, such as spectral values, textures, vegetation indices, among others, to perform the classification.

image.png

Let's use MLP to perform a pixel-by-pixel classification:

Implementation using SkLearn¶

First we need to install and import the necessary packages:

In [ ]:
!pip install rasterio
Collecting rasterio
  Downloading rasterio-1.3.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 21.3/21.3 MB 27.5 MB/s eta 0:00:00
Collecting affine (from rasterio)
  Downloading affine-2.4.0-py3-none-any.whl (15 kB)
Requirement already satisfied: attrs in /usr/local/lib/python3.10/dist-packages (from rasterio) (23.1.0)
Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from rasterio) (2023.7.22)
Requirement already satisfied: click>=4.0 in /usr/local/lib/python3.10/dist-packages (from rasterio) (8.1.6)
Requirement already satisfied: cligj>=0.5 in /usr/local/lib/python3.10/dist-packages (from rasterio) (0.7.2)
Requirement already satisfied: numpy>=1.18 in /usr/local/lib/python3.10/dist-packages (from rasterio) (1.22.4)
Collecting snuggs>=1.4.1 (from rasterio)
  Downloading snuggs-1.4.7-py3-none-any.whl (5.4 kB)
Requirement already satisfied: click-plugins in /usr/local/lib/python3.10/dist-packages (from rasterio) (1.1.1)
Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from rasterio) (67.7.2)
Requirement already satisfied: pyparsing>=2.1.6 in /usr/local/lib/python3.10/dist-packages (from snuggs>=1.4.1->rasterio) (3.1.0)
Installing collected packages: snuggs, affine, rasterio
Successfully installed affine-2.4.0 rasterio-1.3.8 snuggs-1.4.7
Requirement already satisfied: geopandas in /usr/local/lib/python3.10/dist-packages (0.13.2)
Requirement already satisfied: fiona>=1.8.19 in /usr/local/lib/python3.10/dist-packages (from geopandas) (1.9.4.post1)
Requirement already satisfied: packaging in /usr/local/lib/python3.10/dist-packages (from geopandas) (23.1)
Requirement already satisfied: pandas>=1.1.0 in /usr/local/lib/python3.10/dist-packages (from geopandas) (1.5.3)
Requirement already satisfied: pyproj>=3.0.1 in /usr/local/lib/python3.10/dist-packages (from geopandas) (3.6.0)
Requirement already satisfied: shapely>=1.7.1 in /usr/local/lib/python3.10/dist-packages (from geopandas) (2.0.1)
Requirement already satisfied: attrs>=19.2.0 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (23.1.0)
Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (2023.7.22)
Requirement already satisfied: click~=8.0 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (8.1.6)
Requirement already satisfied: click-plugins>=1.0 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (1.1.1)
Requirement already satisfied: cligj>=0.5 in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (0.7.2)
Requirement already satisfied: six in /usr/local/lib/python3.10/dist-packages (from fiona>=1.8.19->geopandas) (1.16.0)
Requirement already satisfied: python-dateutil>=2.8.1 in /usr/local/lib/python3.10/dist-packages (from pandas>=1.1.0->geopandas) (2.8.2)
Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas>=1.1.0->geopandas) (2022.7.1)
Requirement already satisfied: numpy>=1.21.0 in /usr/local/lib/python3.10/dist-packages (from pandas>=1.1.0->geopandas) (1.22.4)
In [ ]:
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
In [ ]:
import rasterio
import numpy as np
from matplotlib import pyplot as plt
import cv2
from matplotlib import cm
import pandas as pd
import seaborn as sns
import geopandas as gpd
from rasterio.plot import show
from matplotlib import pyplot as plt
from matplotlib import cm
from matplotlib.colors import ListedColormap

We will use a Sentinel 2 image with 10 spectral bands and with normalized values between 0 and 1:

image.png

Let's use rasterio to import the image:

In [ ]:
path = '/content/drive/MyDrive/Datasets/LULC_Netherland/Netherlands_2021.tif'
In [ ]:
src = rasterio.open(path)
im = src.read()
In [ ]:
print(im.shape)
(10, 1571, 3701)
In [ ]:
im = im.transpose([1,2,0])
In [ ]:
im.shape
Out[ ]:
(1571, 3701, 10)

To plot, we separate the RGB bands and stack them in a new variable:

In [ ]:
R = im[:,:,2]
G = im[:,:,1]
B = im[:,:,0]
In [ ]:
rgb = np.dstack((R,G,B))
In [ ]:
plt.figure(figsize=(16,12))
plt.imshow(rgb*4)
plt.axis('off')
WARNING:matplotlib.image:Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
Out[ ]:
(-0.5, 3700.5, 1570.5, -0.5)
No description has been provided for this image

We will use points that were sampled in 5 use and coverage classes: Water, Urban, Forest, Agriculture_vegetation and Agriculture_exposed_soil:

In [ ]:
samples = gpd.read_file('/content/drive/MyDrive/Datasets/LULC_Netherland/Samples.shp')
In [ ]:
samples
Out[ ]:
id geometry
0 1 POINT (5.53812 52.73801)
1 1 POINT (5.54938 52.69688)
2 1 POINT (5.58365 52.63372)
3 1 POINT (5.56897 52.73556)
4 1 POINT (5.56101 52.63513)
... ... ...
145 5 POINT (5.95186 52.76716)
146 5 POINT (5.91306 52.77823)
147 5 POINT (5.66679 52.55724)
148 5 POINT (5.67285 52.56134)
149 5 POINT (5.72456 52.55253)

150 rows × 2 columns

We check if the image and points are in the same coordinate system:

In [ ]:
print(samples.crs)
EPSG:4326
In [ ]:
print(src.crs)
EPSG:4326

We create a new column to add the class names:

In [ ]:
samples['label'] = samples['id'].replace({1:'Agua', 2:'Urbano', 3:'Floresta', 4:'Agricola_Vegetaçao', 5:'Agricola_Solo'})
In [ ]:
samples
Out[ ]:
id geometry label
0 1 POINT (5.53812 52.73801) Agua
1 1 POINT (5.54938 52.69688) Agua
2 1 POINT (5.58365 52.63372) Agua
3 1 POINT (5.56897 52.73556) Agua
4 1 POINT (5.56101 52.63513) Agua
... ... ... ...
145 5 POINT (5.95186 52.76716) Agricola_Solo
146 5 POINT (5.91306 52.77823) Agricola_Solo
147 5 POINT (5.66679 52.55724) Agricola_Solo
148 5 POINT (5.67285 52.56134) Agricola_Solo
149 5 POINT (5.72456 52.55253) Agricola_Solo

150 rows × 3 columns

In [ ]:
cmap = ListedColormap(['yellow','green','blue','darkgreen','red'])
fig, ax = plt.subplots(figsize=(20,20))
samples.plot(column='label', categorical=True, cmap=cmap, legend=True, legend_kwds={'bbox_to_anchor':(1, 0.5),'loc':'upper left','fontsize':16,'frameon':False}, ax=ax)
ax.axis('off')
show(rgb.transpose([2,0,1])*4, transform=src.transform, ax=ax)
WARNING:matplotlib.image:Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
Out[ ]:
<Axes: >
No description has been provided for this image

So we will create a for for each point, selecting the corresponding pixel value for each spectral band:

In [ ]:
samples['geometry']
Out[ ]:
0      POINT (5.53812 52.73801)
1      POINT (5.54938 52.69688)
2      POINT (5.58365 52.63372)
3      POINT (5.56897 52.73556)
4      POINT (5.56101 52.63513)
                 ...           
145    POINT (5.95186 52.76716)
146    POINT (5.91306 52.77823)
147    POINT (5.66679 52.55724)
148    POINT (5.67285 52.56134)
149    POINT (5.72456 52.55253)
Name: geometry, Length: 150, dtype: geometry
In [ ]:
array_samples = []
for point in samples['geometry']:
  x = point.xy[0][0]
  y = point.xy[1][0]
  row, col = src.index(x,y)
  band_value = []
  for i in range(src.count):
    band_value.append(src.read(i+1)[row,col])
  array_samples.append(band_value)

At the end we convert the list to an array:

In [ ]:
X = np.array(array_samples)

This array will have the number of points X number of bands:

In [ ]:
X.shape
Out[ ]:
(150, 10)
In [ ]:
dataset = pd.DataFrame(data=X, columns=['B2','B3','B4','B5','B6','B7','B8','B8A','B11','B12'])
In [ ]:
dataset['label'] = samples['id']
In [ ]:
dataset
Out[ ]:
B2 B3 B4 B5 B6 B7 B8 B8A B11 B12 label
0 0.08895 0.06430 0.03805 0.03390 0.02710 0.02410 0.02085 0.01915 0.00510 0.00345 1
1 0.09045 0.06295 0.04010 0.03370 0.02945 0.02600 0.02240 0.01970 0.00445 0.00350 1
2 0.08905 0.06225 0.04045 0.03430 0.02940 0.02630 0.02235 0.02055 0.00535 0.00375 1
3 0.09050 0.06525 0.04105 0.03480 0.02830 0.02680 0.02175 0.02000 0.00615 0.00425 1
4 0.09040 0.06080 0.03865 0.03320 0.02830 0.02590 0.02095 0.01920 0.00530 0.00385 1
... ... ... ... ... ... ... ... ... ... ... ...
145 0.13585 0.12690 0.15220 0.16930 0.24035 0.27875 0.29300 0.32295 0.26210 0.16145 5
146 0.11155 0.09240 0.09975 0.12525 0.19760 0.23015 0.19315 0.26100 0.13890 0.08525 5
147 0.16380 0.16275 0.18500 0.19815 0.21450 0.23335 0.21905 0.24180 0.24210 0.22695 5
148 0.14335 0.13130 0.12980 0.15135 0.25745 0.29175 0.28280 0.30685 0.17595 0.13955 5
149 0.14075 0.13310 0.14955 0.16235 0.18070 0.19295 0.19540 0.20715 0.21625 0.19620 5

150 rows × 11 columns

The next step is to separate the spectral data from the target:

In [ ]:
X = dataset.iloc[:,0:-1].values
Y = dataset.iloc[:,-1].values

So let's import some Sklearn functions:

In [ ]:
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.metrics import confusion_matrix
from sklearn.preprocessing import OneHotEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report

Since we are working with categorical data, we need to encode categorical values into binary values so that they are compatible with the expected outputs of a neural network:

image.png

In [ ]:
new_Y = Y[:,np.newaxis]
In [ ]:
new_Y.shape
Out[ ]:
(150, 1)
In [ ]:
enc = OneHotEncoder()

enc.fit(new_Y)

onehotlabels = enc.transform(new_Y).toarray()
In [ ]:
onehotlabels.shape
Out[ ]:
(150, 5)
In [ ]:
onehotlabels
Out[ ]:
array([[1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.],
       [0., 0., 0., 0., 1.]])

We can then separate the data into training and testing:

In [ ]:
X_train, X_test, Y_train, Y_test = train_test_split(X, onehotlabels, test_size = 0.3, random_state = 42)

Let's then create the MLP instance using the implementation available in Sklearn. The hidden layers will have 16, 32 and 8 neurons. We will use Adam optimizer, Relu activation function and train for 100 iterations.

In [ ]:
classifier = MLPClassifier(hidden_layer_sizes=(16,32,8), max_iter=5000,activation = 'relu',solver='adam',random_state=1,verbose=10)
In [ ]:
classifier.fit(X_train, Y_train)
Iteration 1, loss = 3.87036050
Iteration 2, loss = 3.85071566
Iteration 3, loss = 3.83101681
Iteration 4, loss = 3.81124053
Iteration 5, loss = 3.79151764
Iteration 6, loss = 3.77189049
Iteration 7, loss = 3.75235660
Iteration 8, loss = 3.73289004
Iteration 9, loss = 3.71338557
Iteration 10, loss = 3.69376269
Iteration 11, loss = 3.67409729
Iteration 12, loss = 3.65444722
Iteration 13, loss = 3.63478774
Iteration 14, loss = 3.61528445
Iteration 15, loss = 3.59612260
Iteration 16, loss = 3.57791285
Iteration 17, loss = 3.56088203
Iteration 18, loss = 3.54439202
Iteration 19, loss = 3.52870800
Iteration 20, loss = 3.51382676
Iteration 21, loss = 3.49983748
Iteration 22, loss = 3.48681313
Iteration 23, loss = 3.47428713
Iteration 24, loss = 3.46232792
Iteration 25, loss = 3.45061288
Iteration 26, loss = 3.43886370
Iteration 27, loss = 3.42701062
Iteration 28, loss = 3.41497451
Iteration 29, loss = 3.40290618
Iteration 30, loss = 3.39090042
Iteration 31, loss = 3.37886731
Iteration 32, loss = 3.36678986
Iteration 33, loss = 3.35471647
Iteration 34, loss = 3.34258875
Iteration 35, loss = 3.33042126
Iteration 36, loss = 3.31820292
Iteration 37, loss = 3.30591293
Iteration 38, loss = 3.29361900
Iteration 39, loss = 3.28148536
Iteration 40, loss = 3.26931175
Iteration 41, loss = 3.25710230
Iteration 42, loss = 3.24486390
Iteration 43, loss = 3.23259433
Iteration 44, loss = 3.22031242
Iteration 45, loss = 3.20799065
Iteration 46, loss = 3.19565254
Iteration 47, loss = 3.18332125
Iteration 48, loss = 3.17098158
Iteration 49, loss = 3.15866825
Iteration 50, loss = 3.14641447
Iteration 51, loss = 3.13416827
Iteration 52, loss = 3.12187861
Iteration 53, loss = 3.10955416
Iteration 54, loss = 3.09720298
Iteration 55, loss = 3.08481484
Iteration 56, loss = 3.07238200
Iteration 57, loss = 3.05991821
Iteration 58, loss = 3.04742448
Iteration 59, loss = 3.03487014
Iteration 60, loss = 3.02228184
Iteration 61, loss = 3.00966407
Iteration 62, loss = 2.99699599
Iteration 63, loss = 2.98428920
Iteration 64, loss = 2.97152023
Iteration 65, loss = 2.95870407
Iteration 66, loss = 2.94584590
Iteration 67, loss = 2.93291684
Iteration 68, loss = 2.91993119
Iteration 69, loss = 2.90688551
Iteration 70, loss = 2.89377409
Iteration 71, loss = 2.88058572
Iteration 72, loss = 2.86740406
Iteration 73, loss = 2.85421453
Iteration 74, loss = 2.84103104
Iteration 75, loss = 2.82790435
Iteration 76, loss = 2.81486466
Iteration 77, loss = 2.80190837
Iteration 78, loss = 2.78899457
Iteration 79, loss = 2.77616638
Iteration 80, loss = 2.76345326
Iteration 81, loss = 2.75084420
Iteration 82, loss = 2.73832683
Iteration 83, loss = 2.72588770
Iteration 84, loss = 2.71358514
Iteration 85, loss = 2.70143521
Iteration 86, loss = 2.68947048
Iteration 87, loss = 2.67776621
Iteration 88, loss = 2.66638264
Iteration 89, loss = 2.65529559
Iteration 90, loss = 2.64444294
Iteration 91, loss = 2.63384847
Iteration 92, loss = 2.62350494
Iteration 93, loss = 2.61343491
Iteration 94, loss = 2.60364653
Iteration 95, loss = 2.59414433
Iteration 96, loss = 2.58495753
Iteration 97, loss = 2.57611289
Iteration 98, loss = 2.56758628
Iteration 99, loss = 2.55937539
Iteration 100, loss = 2.55148534
Iteration 101, loss = 2.54391632
Iteration 102, loss = 2.53666756
Iteration 103, loss = 2.52974020
Iteration 104, loss = 2.52313193
Iteration 105, loss = 2.51683769
Iteration 106, loss = 2.51085531
Iteration 107, loss = 2.50517682
Iteration 108, loss = 2.49979339
Iteration 109, loss = 2.49469612
Iteration 110, loss = 2.48989956
Iteration 111, loss = 2.48538668
Iteration 112, loss = 2.48112647
Iteration 113, loss = 2.47710106
Iteration 114, loss = 2.47329706
Iteration 115, loss = 2.46970120
Iteration 116, loss = 2.46630406
Iteration 117, loss = 2.46308664
Iteration 118, loss = 2.46003570
Iteration 119, loss = 2.45713971
Iteration 120, loss = 2.45438172
Iteration 121, loss = 2.45175351
Iteration 122, loss = 2.44923897
Iteration 123, loss = 2.44684118
Iteration 124, loss = 2.44455233
Iteration 125, loss = 2.44234853
Iteration 126, loss = 2.44022259
Iteration 127, loss = 2.43816602
Iteration 128, loss = 2.43617675
Iteration 129, loss = 2.43424280
Iteration 130, loss = 2.43236123
Iteration 131, loss = 2.43050908
Iteration 132, loss = 2.42868432
Iteration 133, loss = 2.42688577
Iteration 134, loss = 2.42510521
Iteration 135, loss = 2.42334304
Iteration 136, loss = 2.42160029
Iteration 137, loss = 2.41987251
Iteration 138, loss = 2.41815859
Iteration 139, loss = 2.41644991
Iteration 140, loss = 2.41473818
Iteration 141, loss = 2.41302461
Iteration 142, loss = 2.41130919
Iteration 143, loss = 2.40958860
Iteration 144, loss = 2.40786096
Iteration 145, loss = 2.40612503
Iteration 146, loss = 2.40438014
Iteration 147, loss = 2.40262132
Iteration 148, loss = 2.40084864
Iteration 149, loss = 2.39906163
Iteration 150, loss = 2.39725946
Iteration 151, loss = 2.39544115
Iteration 152, loss = 2.39360240
Iteration 153, loss = 2.39174360
Iteration 154, loss = 2.38986358
Iteration 155, loss = 2.38796252
Iteration 156, loss = 2.38603848
Iteration 157, loss = 2.38409209
Iteration 158, loss = 2.38212016
Iteration 159, loss = 2.38012651
Iteration 160, loss = 2.37811113
Iteration 161, loss = 2.37606771
Iteration 162, loss = 2.37399559
Iteration 163, loss = 2.37189916
Iteration 164, loss = 2.36977400
Iteration 165, loss = 2.36762049
Iteration 166, loss = 2.36543965
Iteration 167, loss = 2.36322789
Iteration 168, loss = 2.36097895
Iteration 169, loss = 2.35868767
Iteration 170, loss = 2.35636195
Iteration 171, loss = 2.35399750
Iteration 172, loss = 2.35159554
Iteration 173, loss = 2.34915987
Iteration 174, loss = 2.34668969
Iteration 175, loss = 2.34418379
Iteration 176, loss = 2.34163948
Iteration 177, loss = 2.33905638
Iteration 178, loss = 2.33643364
Iteration 179, loss = 2.33377035
Iteration 180, loss = 2.33107603
Iteration 181, loss = 2.32834583
Iteration 182, loss = 2.32557548
Iteration 183, loss = 2.32276431
Iteration 184, loss = 2.31991443
Iteration 185, loss = 2.31703459
Iteration 186, loss = 2.31411270
Iteration 187, loss = 2.31114932
Iteration 188, loss = 2.30814769
Iteration 189, loss = 2.30511541
Iteration 190, loss = 2.30205564
Iteration 191, loss = 2.29895139
Iteration 192, loss = 2.29580646
Iteration 193, loss = 2.29263348
Iteration 194, loss = 2.28942603
Iteration 195, loss = 2.28617956
Iteration 196, loss = 2.28289302
Iteration 197, loss = 2.27957223
Iteration 198, loss = 2.27627877
Iteration 199, loss = 2.27300248
Iteration 200, loss = 2.26969535
Iteration 201, loss = 2.26639865
Iteration 202, loss = 2.26306458
Iteration 203, loss = 2.25973902
Iteration 204, loss = 2.25637188
Iteration 205, loss = 2.25295590
Iteration 206, loss = 2.24950066
Iteration 207, loss = 2.24601221
Iteration 208, loss = 2.24251265
Iteration 209, loss = 2.23901390
Iteration 210, loss = 2.23551392
Iteration 211, loss = 2.23196944
Iteration 212, loss = 2.22842416
Iteration 213, loss = 2.22488477
Iteration 214, loss = 2.22133664
Iteration 215, loss = 2.21775611
Iteration 216, loss = 2.21414507
Iteration 217, loss = 2.21051072
Iteration 218, loss = 2.20684829
Iteration 219, loss = 2.20315354
Iteration 220, loss = 2.19942793
Iteration 221, loss = 2.19566489
Iteration 222, loss = 2.19187107
Iteration 223, loss = 2.18806606
Iteration 224, loss = 2.18424549
Iteration 225, loss = 2.18041002
Iteration 226, loss = 2.17656702
Iteration 227, loss = 2.17269579
Iteration 228, loss = 2.16879710
Iteration 229, loss = 2.16486741
Iteration 230, loss = 2.16091307
Iteration 231, loss = 2.15692871
Iteration 232, loss = 2.15291564
Iteration 233, loss = 2.14888092
Iteration 234, loss = 2.14483534
Iteration 235, loss = 2.14075889
Iteration 236, loss = 2.13665522
Iteration 237, loss = 2.13253039
Iteration 238, loss = 2.12839897
Iteration 239, loss = 2.12425330
Iteration 240, loss = 2.12009141
Iteration 241, loss = 2.11592157
Iteration 242, loss = 2.11174948
Iteration 243, loss = 2.10757303
Iteration 244, loss = 2.10338682
Iteration 245, loss = 2.09917213
Iteration 246, loss = 2.09492606
Iteration 247, loss = 2.09065011
Iteration 248, loss = 2.08637035
Iteration 249, loss = 2.08207593
Iteration 250, loss = 2.07774514
Iteration 251, loss = 2.07337846
Iteration 252, loss = 2.06898048
Iteration 253, loss = 2.06455640
Iteration 254, loss = 2.06010903
Iteration 255, loss = 2.05566018
Iteration 256, loss = 2.05118645
Iteration 257, loss = 2.04667632
Iteration 258, loss = 2.04213891
Iteration 259, loss = 2.03759645
Iteration 260, loss = 2.03303502
Iteration 261, loss = 2.02844944
Iteration 262, loss = 2.02384359
Iteration 263, loss = 2.01921593
Iteration 264, loss = 2.01456136
Iteration 265, loss = 2.00988180
Iteration 266, loss = 2.00518388
Iteration 267, loss = 2.00046601
Iteration 268, loss = 1.99572537
Iteration 269, loss = 1.99096199
Iteration 270, loss = 1.98617638
Iteration 271, loss = 1.98136940
Iteration 272, loss = 1.97654070
Iteration 273, loss = 1.97169014
Iteration 274, loss = 1.96681407
Iteration 275, loss = 1.96191574
Iteration 276, loss = 1.95699443
Iteration 277, loss = 1.95207842
Iteration 278, loss = 1.94715877
Iteration 279, loss = 1.94221702
Iteration 280, loss = 1.93725221
Iteration 281, loss = 1.93227904
Iteration 282, loss = 1.92729564
Iteration 283, loss = 1.92225782
Iteration 284, loss = 1.91719871
Iteration 285, loss = 1.91211522
Iteration 286, loss = 1.90700818
Iteration 287, loss = 1.90187677
Iteration 288, loss = 1.89671900
Iteration 289, loss = 1.89155252
Iteration 290, loss = 1.88637892
Iteration 291, loss = 1.88118403
Iteration 292, loss = 1.87597086
Iteration 293, loss = 1.87076874
Iteration 294, loss = 1.86556816
Iteration 295, loss = 1.86035057
Iteration 296, loss = 1.85511583
Iteration 297, loss = 1.84986587
Iteration 298, loss = 1.84460388
Iteration 299, loss = 1.83935576
Iteration 300, loss = 1.83410614
Iteration 301, loss = 1.82885934
Iteration 302, loss = 1.82362981
Iteration 303, loss = 1.81839660
Iteration 304, loss = 1.81315976
Iteration 305, loss = 1.80791881
Iteration 306, loss = 1.80267503
Iteration 307, loss = 1.79742900
Iteration 308, loss = 1.79215969
Iteration 309, loss = 1.78687356
Iteration 310, loss = 1.78159862
Iteration 311, loss = 1.77629882
Iteration 312, loss = 1.77096429
Iteration 313, loss = 1.76563313
Iteration 314, loss = 1.76030176
Iteration 315, loss = 1.75499908
Iteration 316, loss = 1.74968771
Iteration 317, loss = 1.74441027
Iteration 318, loss = 1.73914028
Iteration 319, loss = 1.73387317
Iteration 320, loss = 1.72862637
Iteration 321, loss = 1.72339413
Iteration 322, loss = 1.71816834
Iteration 323, loss = 1.71295519
Iteration 324, loss = 1.70776278
Iteration 325, loss = 1.70258911
Iteration 326, loss = 1.69741918
Iteration 327, loss = 1.69223375
Iteration 328, loss = 1.68705933
Iteration 329, loss = 1.68191336
Iteration 330, loss = 1.67678765
Iteration 331, loss = 1.67168751
Iteration 332, loss = 1.66659600
Iteration 333, loss = 1.66153242
Iteration 334, loss = 1.65648323
Iteration 335, loss = 1.65145136
Iteration 336, loss = 1.64644137
Iteration 337, loss = 1.64145508
Iteration 338, loss = 1.63649231
Iteration 339, loss = 1.63155471
Iteration 340, loss = 1.62664443
Iteration 341, loss = 1.62176021
Iteration 342, loss = 1.61689732
Iteration 343, loss = 1.61205215
Iteration 344, loss = 1.60722154
Iteration 345, loss = 1.60240343
Iteration 346, loss = 1.59760034
Iteration 347, loss = 1.59281380
Iteration 348, loss = 1.58805406
Iteration 349, loss = 1.58331440
Iteration 350, loss = 1.57857996
Iteration 351, loss = 1.57385991
Iteration 352, loss = 1.56914996
Iteration 353, loss = 1.56445110
Iteration 354, loss = 1.55977598
Iteration 355, loss = 1.55510898
Iteration 356, loss = 1.55044584
Iteration 357, loss = 1.54578711
Iteration 358, loss = 1.54115324
Iteration 359, loss = 1.53655005
Iteration 360, loss = 1.53195275
Iteration 361, loss = 1.52738616
Iteration 362, loss = 1.52285025
Iteration 363, loss = 1.51834509
Iteration 364, loss = 1.51386358
Iteration 365, loss = 1.50940584
Iteration 366, loss = 1.50496616
Iteration 367, loss = 1.50054135
Iteration 368, loss = 1.49613053
Iteration 369, loss = 1.49173232
Iteration 370, loss = 1.48734963
Iteration 371, loss = 1.48298787
Iteration 372, loss = 1.47865543
Iteration 373, loss = 1.47434442
Iteration 374, loss = 1.47005605
Iteration 375, loss = 1.46580818
Iteration 376, loss = 1.46158276
Iteration 377, loss = 1.45737957
Iteration 378, loss = 1.45319899
Iteration 379, loss = 1.44904935
Iteration 380, loss = 1.44492704
Iteration 381, loss = 1.44082930
Iteration 382, loss = 1.43675791
Iteration 383, loss = 1.43271464
Iteration 384, loss = 1.42869943
Iteration 385, loss = 1.42470934
Iteration 386, loss = 1.42074924
Iteration 387, loss = 1.41681282
Iteration 388, loss = 1.41290043
Iteration 389, loss = 1.40902343
Iteration 390, loss = 1.40517756
Iteration 391, loss = 1.40135974
Iteration 392, loss = 1.39757467
Iteration 393, loss = 1.39382049
Iteration 394, loss = 1.39009450
Iteration 395, loss = 1.38639594
Iteration 396, loss = 1.38272775
Iteration 397, loss = 1.37909147
Iteration 398, loss = 1.37548355
Iteration 399, loss = 1.37190363
Iteration 400, loss = 1.36835494
Iteration 401, loss = 1.36484438
Iteration 402, loss = 1.36135668
Iteration 403, loss = 1.35789413
Iteration 404, loss = 1.35446144
Iteration 405, loss = 1.35106801
Iteration 406, loss = 1.34770025
Iteration 407, loss = 1.34435770
Iteration 408, loss = 1.34104806
Iteration 409, loss = 1.33777289
Iteration 410, loss = 1.33452236
Iteration 411, loss = 1.33129707
Iteration 412, loss = 1.32810473
Iteration 413, loss = 1.32494009
Iteration 414, loss = 1.32180392
Iteration 415, loss = 1.31869655
Iteration 416, loss = 1.31561160
Iteration 417, loss = 1.31255614
Iteration 418, loss = 1.30952855
Iteration 419, loss = 1.30653123
Iteration 420, loss = 1.30356145
Iteration 421, loss = 1.30061809
Iteration 422, loss = 1.29770074
Iteration 423, loss = 1.29480946
Iteration 424, loss = 1.29194523
Iteration 425, loss = 1.28911496
Iteration 426, loss = 1.28631527
Iteration 427, loss = 1.28353891
Iteration 428, loss = 1.28079159
Iteration 429, loss = 1.27806145
Iteration 430, loss = 1.27535735
Iteration 431, loss = 1.27267688
Iteration 432, loss = 1.27001883
Iteration 433, loss = 1.26738757
Iteration 434, loss = 1.26477979
Iteration 435, loss = 1.26219431
Iteration 436, loss = 1.25963126
Iteration 437, loss = 1.25709153
Iteration 438, loss = 1.25457265
Iteration 439, loss = 1.25207679
Iteration 440, loss = 1.24959594
Iteration 441, loss = 1.24714142
Iteration 442, loss = 1.24470913
Iteration 443, loss = 1.24230052
Iteration 444, loss = 1.23991495
Iteration 445, loss = 1.23755112
Iteration 446, loss = 1.23521027
Iteration 447, loss = 1.23288870
Iteration 448, loss = 1.23058609
Iteration 449, loss = 1.22830130
Iteration 450, loss = 1.22603263
Iteration 451, loss = 1.22378275
Iteration 452, loss = 1.22155080
Iteration 453, loss = 1.21933648
Iteration 454, loss = 1.21713994
Iteration 455, loss = 1.21496118
Iteration 456, loss = 1.21279990
Iteration 457, loss = 1.21066439
Iteration 458, loss = 1.20854606
Iteration 459, loss = 1.20645790
Iteration 460, loss = 1.20438738
Iteration 461, loss = 1.20233319
Iteration 462, loss = 1.20029253
Iteration 463, loss = 1.19826756
Iteration 464, loss = 1.19626011
Iteration 465, loss = 1.19426829
Iteration 466, loss = 1.19229394
Iteration 467, loss = 1.19033628
Iteration 468, loss = 1.18839479
Iteration 469, loss = 1.18646942
Iteration 470, loss = 1.18456056
Iteration 471, loss = 1.18266772
Iteration 472, loss = 1.18079127
Iteration 473, loss = 1.17893455
Iteration 474, loss = 1.17708695
Iteration 475, loss = 1.17525497
Iteration 476, loss = 1.17343828
Iteration 477, loss = 1.17163423
Iteration 478, loss = 1.16984359
Iteration 479, loss = 1.16806636
Iteration 480, loss = 1.16630227
Iteration 481, loss = 1.16455045
Iteration 482, loss = 1.16281422
Iteration 483, loss = 1.16108817
Iteration 484, loss = 1.15937605
Iteration 485, loss = 1.15767592
Iteration 486, loss = 1.15598754
Iteration 487, loss = 1.15431310
Iteration 488, loss = 1.15264675
Iteration 489, loss = 1.15099451
Iteration 490, loss = 1.14935414
Iteration 491, loss = 1.14772383
Iteration 492, loss = 1.14610305
Iteration 493, loss = 1.14449253
Iteration 494, loss = 1.14289275
Iteration 495, loss = 1.14129243
Iteration 496, loss = 1.13969766
Iteration 497, loss = 1.13811109
Iteration 498, loss = 1.13654778
Iteration 499, loss = 1.13499232
Iteration 500, loss = 1.13344406
Iteration 501, loss = 1.13190498
Iteration 502, loss = 1.13038408
Iteration 503, loss = 1.12887283
Iteration 504, loss = 1.12737054
Iteration 505, loss = 1.12587540
Iteration 506, loss = 1.12438743
Iteration 507, loss = 1.12290725
Iteration 508, loss = 1.12144494
Iteration 509, loss = 1.11998424
Iteration 510, loss = 1.11852662
Iteration 511, loss = 1.11708244
Iteration 512, loss = 1.11565071
Iteration 513, loss = 1.11422613
Iteration 514, loss = 1.11280847
Iteration 515, loss = 1.11139658
Iteration 516, loss = 1.10999008
Iteration 517, loss = 1.10858931
Iteration 518, loss = 1.10719819
Iteration 519, loss = 1.10582108
Iteration 520, loss = 1.10444896
Iteration 521, loss = 1.10308257
Iteration 522, loss = 1.10172179
Iteration 523, loss = 1.10037309
Iteration 524, loss = 1.09903148
Iteration 525, loss = 1.09769410
Iteration 526, loss = 1.09636364
Iteration 527, loss = 1.09504308
Iteration 528, loss = 1.09372658
Iteration 529, loss = 1.09241882
Iteration 530, loss = 1.09111641
Iteration 531, loss = 1.08981706
Iteration 532, loss = 1.08852115
Iteration 533, loss = 1.08723250
Iteration 534, loss = 1.08594990
Iteration 535, loss = 1.08467382
Iteration 536, loss = 1.08340445
Iteration 537, loss = 1.08214743
Iteration 538, loss = 1.08090955
Iteration 539, loss = 1.07967182
Iteration 540, loss = 1.07844368
Iteration 541, loss = 1.07722066
Iteration 542, loss = 1.07600245
Iteration 543, loss = 1.07479235
Iteration 544, loss = 1.07358603
Iteration 545, loss = 1.07238987
Iteration 546, loss = 1.07120011
Iteration 547, loss = 1.06998181
Iteration 548, loss = 1.06869345
Iteration 549, loss = 1.06739081
Iteration 550, loss = 1.06609642
Iteration 551, loss = 1.06481410
Iteration 552, loss = 1.06339825
Iteration 553, loss = 1.06189139
Iteration 554, loss = 1.06045332
Iteration 555, loss = 1.05904069
Iteration 556, loss = 1.05763907
Iteration 557, loss = 1.05623334
Iteration 558, loss = 1.05481075
Iteration 559, loss = 1.05336543
Iteration 560, loss = 1.05189877
Iteration 561, loss = 1.05042082
Iteration 562, loss = 1.04894750
Iteration 563, loss = 1.04746498
Iteration 564, loss = 1.04598519
Iteration 565, loss = 1.04451924
Iteration 566, loss = 1.04308639
Iteration 567, loss = 1.04167330
Iteration 568, loss = 1.04027618
Iteration 569, loss = 1.03888110
Iteration 570, loss = 1.03747878
Iteration 571, loss = 1.03607097
Iteration 572, loss = 1.03447745
Iteration 573, loss = 1.03270519
Iteration 574, loss = 1.03091563
Iteration 575, loss = 1.02911630
Iteration 576, loss = 1.02732903
Iteration 577, loss = 1.02556486
Iteration 578, loss = 1.02381241
Iteration 579, loss = 1.02206216
Iteration 580, loss = 1.02031032
Iteration 581, loss = 1.01853083
Iteration 582, loss = 1.01670701
Iteration 583, loss = 1.01486757
Iteration 584, loss = 1.01304935
Iteration 585, loss = 1.01123936
Iteration 586, loss = 1.00944825
Iteration 587, loss = 1.00767351
Iteration 588, loss = 1.00590897
Iteration 589, loss = 1.00414503
Iteration 590, loss = 1.00238240
Iteration 591, loss = 1.00060524
Iteration 592, loss = 0.99881844
Iteration 593, loss = 0.99703269
Iteration 594, loss = 0.99525094
Iteration 595, loss = 0.99347742
Iteration 596, loss = 0.99172137
Iteration 597, loss = 0.98997781
Iteration 598, loss = 0.98824326
Iteration 599, loss = 0.98650611
Iteration 600, loss = 0.98475806
Iteration 601, loss = 0.98301699
Iteration 602, loss = 0.98127044
Iteration 603, loss = 0.97952580
Iteration 604, loss = 0.97778591
Iteration 605, loss = 0.97603760
Iteration 606, loss = 0.97427803
Iteration 607, loss = 0.97252274
Iteration 608, loss = 0.97077503
Iteration 609, loss = 0.96902878
Iteration 610, loss = 0.96728459
Iteration 611, loss = 0.96554886
Iteration 612, loss = 0.96383997
Iteration 613, loss = 0.96214848
Iteration 614, loss = 0.96046375
Iteration 615, loss = 0.95878496
Iteration 616, loss = 0.95709312
Iteration 617, loss = 0.95539873
Iteration 618, loss = 0.95372297
Iteration 619, loss = 0.95205976
Iteration 620, loss = 0.95040260
Iteration 621, loss = 0.94875181
Iteration 622, loss = 0.94711971
Iteration 623, loss = 0.94549681
Iteration 624, loss = 0.94387097
Iteration 625, loss = 0.94225172
Iteration 626, loss = 0.94063895
Iteration 627, loss = 0.93903345
Iteration 628, loss = 0.93744675
Iteration 629, loss = 0.93586518
Iteration 630, loss = 0.93428581
Iteration 631, loss = 0.93271401
Iteration 632, loss = 0.93116856
Iteration 633, loss = 0.92965458
Iteration 634, loss = 0.92814281
Iteration 635, loss = 0.92663783
Iteration 636, loss = 0.92515187
Iteration 637, loss = 0.92367448
Iteration 638, loss = 0.92220608
Iteration 639, loss = 0.92075210
Iteration 640, loss = 0.91930841
Iteration 641, loss = 0.91787022
Iteration 642, loss = 0.91643929
Iteration 643, loss = 0.91501490
Iteration 644, loss = 0.91359517
Iteration 645, loss = 0.91218064
Iteration 646, loss = 0.91077314
Iteration 647, loss = 0.90937293
Iteration 648, loss = 0.90797844
Iteration 649, loss = 0.90658933
Iteration 650, loss = 0.90520705
Iteration 651, loss = 0.90383087
Iteration 652, loss = 0.90246030
Iteration 653, loss = 0.90109531
Iteration 654, loss = 0.89973610
Iteration 655, loss = 0.89838279
Iteration 656, loss = 0.89703422
Iteration 657, loss = 0.89569162
Iteration 658, loss = 0.89435570
Iteration 659, loss = 0.89302630
Iteration 660, loss = 0.89170278
Iteration 661, loss = 0.89038596
Iteration 662, loss = 0.88907513
Iteration 663, loss = 0.88777024
Iteration 664, loss = 0.88647132
Iteration 665, loss = 0.88517851
Iteration 666, loss = 0.88389191
Iteration 667, loss = 0.88261163
Iteration 668, loss = 0.88133791
Iteration 669, loss = 0.88007067
Iteration 670, loss = 0.87880889
Iteration 671, loss = 0.87755273
Iteration 672, loss = 0.87630394
Iteration 673, loss = 0.87506212
Iteration 674, loss = 0.87382585
Iteration 675, loss = 0.87259522
Iteration 676, loss = 0.87137030
Iteration 677, loss = 0.87015127
Iteration 678, loss = 0.86893831
Iteration 679, loss = 0.86773137
Iteration 680, loss = 0.86653224
Iteration 681, loss = 0.86533899
Iteration 682, loss = 0.86415082
Iteration 683, loss = 0.86296831
Iteration 684, loss = 0.86179203
Iteration 685, loss = 0.86062240
Iteration 686, loss = 0.85945928
Iteration 687, loss = 0.85830204
Iteration 688, loss = 0.85715049
Iteration 689, loss = 0.85600563
Iteration 690, loss = 0.85486647
Iteration 691, loss = 0.85373321
Iteration 692, loss = 0.85260572
Iteration 693, loss = 0.85148446
Iteration 694, loss = 0.85036942
Iteration 695, loss = 0.84926353
Iteration 696, loss = 0.84816450
Iteration 697, loss = 0.84707524
Iteration 698, loss = 0.84598865
Iteration 699, loss = 0.84491417
Iteration 700, loss = 0.84384546
Iteration 701, loss = 0.84278020
Iteration 702, loss = 0.84171785
Iteration 703, loss = 0.84065876
Iteration 704, loss = 0.83961341
Iteration 705, loss = 0.83857253
Iteration 706, loss = 0.83754325
Iteration 707, loss = 0.83651762
Iteration 708, loss = 0.83549317
Iteration 709, loss = 0.83448429
Iteration 710, loss = 0.83347948
Iteration 711, loss = 0.83247800
Iteration 712, loss = 0.83148119
Iteration 713, loss = 0.83048984
Iteration 714, loss = 0.82950861
Iteration 715, loss = 0.82853094
Iteration 716, loss = 0.82755837
Iteration 717, loss = 0.82659536
Iteration 718, loss = 0.82563730
Iteration 719, loss = 0.82468360
Iteration 720, loss = 0.82373478
Iteration 721, loss = 0.82279184
Iteration 722, loss = 0.82185718
Iteration 723, loss = 0.82092832
Iteration 724, loss = 0.82000586
Iteration 725, loss = 0.81908871
Iteration 726, loss = 0.81817714
Iteration 727, loss = 0.81727086
Iteration 728, loss = 0.81637031
Iteration 729, loss = 0.81547559
Iteration 730, loss = 0.81458698
Iteration 731, loss = 0.81370390
Iteration 732, loss = 0.81282598
Iteration 733, loss = 0.81195381
Iteration 734, loss = 0.81108735
Iteration 735, loss = 0.81022614
Iteration 736, loss = 0.80937014
Iteration 737, loss = 0.80851941
Iteration 738, loss = 0.80767457
Iteration 739, loss = 0.80683488
Iteration 740, loss = 0.80600011
Iteration 741, loss = 0.80517061
Iteration 742, loss = 0.80434658
Iteration 743, loss = 0.80352753
Iteration 744, loss = 0.80271354
Iteration 745, loss = 0.80190445
Iteration 746, loss = 0.80110083
Iteration 747, loss = 0.80030177
Iteration 748, loss = 0.79950732
Iteration 749, loss = 0.79871837
Iteration 750, loss = 0.79793458
Iteration 751, loss = 0.79715538
Iteration 752, loss = 0.79638072
Iteration 753, loss = 0.79561101
Iteration 754, loss = 0.79484591
Iteration 755, loss = 0.79408542
Iteration 756, loss = 0.79333029
Iteration 757, loss = 0.79257966
Iteration 758, loss = 0.79183333
Iteration 759, loss = 0.79109186
Iteration 760, loss = 0.79035543
Iteration 761, loss = 0.78962347
Iteration 762, loss = 0.78889565
Iteration 763, loss = 0.78817322
Iteration 764, loss = 0.78745522
Iteration 765, loss = 0.78674158
Iteration 766, loss = 0.78603223
Iteration 767, loss = 0.78532737
Iteration 768, loss = 0.78462835
Iteration 769, loss = 0.78393283
Iteration 770, loss = 0.78324088
Iteration 771, loss = 0.78255358
Iteration 772, loss = 0.78187049
Iteration 773, loss = 0.78119225
Iteration 774, loss = 0.78051802
Iteration 775, loss = 0.77984783
Iteration 776, loss = 0.77918181
Iteration 777, loss = 0.77852021
Iteration 778, loss = 0.77786250
Iteration 779, loss = 0.77720880
Iteration 780, loss = 0.77655925
Iteration 781, loss = 0.77591368
Iteration 782, loss = 0.77527217
Iteration 783, loss = 0.77463457
Iteration 784, loss = 0.77400087
Iteration 785, loss = 0.77337102
Iteration 786, loss = 0.77274507
Iteration 787, loss = 0.77212277
Iteration 788, loss = 0.77150436
Iteration 789, loss = 0.77088966
Iteration 790, loss = 0.77027873
Iteration 791, loss = 0.76967144
Iteration 792, loss = 0.76906785
Iteration 793, loss = 0.76846773
Iteration 794, loss = 0.76787121
Iteration 795, loss = 0.76727825
Iteration 796, loss = 0.76668877
Iteration 797, loss = 0.76610282
Iteration 798, loss = 0.76552020
Iteration 799, loss = 0.76494094
Iteration 800, loss = 0.76436507
Iteration 801, loss = 0.76379252
Iteration 802, loss = 0.76322333
Iteration 803, loss = 0.76265735
Iteration 804, loss = 0.76209463
Iteration 805, loss = 0.76153506
Iteration 806, loss = 0.76097867
Iteration 807, loss = 0.76042542
Iteration 808, loss = 0.75987524
Iteration 809, loss = 0.75932814
Iteration 810, loss = 0.75878403
Iteration 811, loss = 0.75824296
Iteration 812, loss = 0.75770474
Iteration 813, loss = 0.75716953
Iteration 814, loss = 0.75663722
Iteration 815, loss = 0.75610770
Iteration 816, loss = 0.75558102
Iteration 817, loss = 0.75505709
Iteration 818, loss = 0.75453595
Iteration 819, loss = 0.75401744
Iteration 820, loss = 0.75350172
Iteration 821, loss = 0.75298865
Iteration 822, loss = 0.75247809
Iteration 823, loss = 0.75197027
Iteration 824, loss = 0.75146490
Iteration 825, loss = 0.75096204
Iteration 826, loss = 0.75046176
Iteration 827, loss = 0.74996387
Iteration 828, loss = 0.74946848
Iteration 829, loss = 0.74897356
Iteration 830, loss = 0.74847838
Iteration 831, loss = 0.74798453
Iteration 832, loss = 0.74749226
Iteration 833, loss = 0.74700153
Iteration 834, loss = 0.74651220
Iteration 835, loss = 0.74602480
Iteration 836, loss = 0.74553888
Iteration 837, loss = 0.74505433
Iteration 838, loss = 0.74457145
Iteration 839, loss = 0.74409042
Iteration 840, loss = 0.74361098
Iteration 841, loss = 0.74313958
Iteration 842, loss = 0.74267469
Iteration 843, loss = 0.74221202
Iteration 844, loss = 0.74175149
Iteration 845, loss = 0.74129303
Iteration 846, loss = 0.74083909
Iteration 847, loss = 0.74039179
Iteration 848, loss = 0.73994670
Iteration 849, loss = 0.73950294
Iteration 850, loss = 0.73906034
Iteration 851, loss = 0.73861933
Iteration 852, loss = 0.73817971
Iteration 853, loss = 0.73774157
Iteration 854, loss = 0.73730495
Iteration 855, loss = 0.73687216
Iteration 856, loss = 0.73644130
Iteration 857, loss = 0.73601207
Iteration 858, loss = 0.73558446
Iteration 859, loss = 0.73515860
Iteration 860, loss = 0.73473443
Iteration 861, loss = 0.73431203
Iteration 862, loss = 0.73389235
Iteration 863, loss = 0.73347447
Iteration 864, loss = 0.73305836
Iteration 865, loss = 0.73264399
Iteration 866, loss = 0.73223145
Iteration 867, loss = 0.73182050
Iteration 868, loss = 0.73141272
Iteration 869, loss = 0.73100624
Iteration 870, loss = 0.73060080
Iteration 871, loss = 0.73019664
Iteration 872, loss = 0.72979378
Iteration 873, loss = 0.72939415
Iteration 874, loss = 0.72899589
Iteration 875, loss = 0.72859879
Iteration 876, loss = 0.72820315
Iteration 877, loss = 0.72780881
Iteration 878, loss = 0.72741588
Iteration 879, loss = 0.72702442
Iteration 880, loss = 0.72663465
Iteration 881, loss = 0.72624666
Iteration 882, loss = 0.72585975
Iteration 883, loss = 0.72547420
Iteration 884, loss = 0.72509038
Iteration 885, loss = 0.72470771
Iteration 886, loss = 0.72432649
Iteration 887, loss = 0.72394673
Iteration 888, loss = 0.72356778
Iteration 889, loss = 0.72319137
Iteration 890, loss = 0.72281530
Iteration 891, loss = 0.72244018
Iteration 892, loss = 0.72206672
Iteration 893, loss = 0.72169480
Iteration 894, loss = 0.72132395
Iteration 895, loss = 0.72095403
Iteration 896, loss = 0.72058530
Iteration 897, loss = 0.72021819
Iteration 898, loss = 0.71985173
Iteration 899, loss = 0.71948642
Iteration 900, loss = 0.71912234
Iteration 901, loss = 0.71876135
Iteration 902, loss = 0.71840220
Iteration 903, loss = 0.71804380
Iteration 904, loss = 0.71768641
Iteration 905, loss = 0.71732994
Iteration 906, loss = 0.71697398
Iteration 907, loss = 0.71661885
Iteration 908, loss = 0.71626514
Iteration 909, loss = 0.71591250
Iteration 910, loss = 0.71556161
Iteration 911, loss = 0.71521019
Iteration 912, loss = 0.71486043
Iteration 913, loss = 0.71451196
Iteration 914, loss = 0.71416578
Iteration 915, loss = 0.71381860
Iteration 916, loss = 0.71347502
Iteration 917, loss = 0.71312889
Iteration 918, loss = 0.71278514
Iteration 919, loss = 0.71244149
Iteration 920, loss = 0.71209955
Iteration 921, loss = 0.71175825
Iteration 922, loss = 0.71141933
Iteration 923, loss = 0.71108087
Iteration 924, loss = 0.71074208
Iteration 925, loss = 0.71040630
Iteration 926, loss = 0.71006874
Iteration 927, loss = 0.70973354
Iteration 928, loss = 0.70939903
Iteration 929, loss = 0.70906621
Iteration 930, loss = 0.70873444
Iteration 931, loss = 0.70840403
Iteration 932, loss = 0.70807160
Iteration 933, loss = 0.70774429
Iteration 934, loss = 0.70741438
Iteration 935, loss = 0.70708686
Iteration 936, loss = 0.70676118
Iteration 937, loss = 0.70643489
Iteration 938, loss = 0.70610708
Iteration 939, loss = 0.70578434
Iteration 940, loss = 0.70545995
Iteration 941, loss = 0.70513380
Iteration 942, loss = 0.70481279
Iteration 943, loss = 0.70449068
Iteration 944, loss = 0.70416926
Iteration 945, loss = 0.70384990
Iteration 946, loss = 0.70352813
Iteration 947, loss = 0.70321050
Iteration 948, loss = 0.70289194
Iteration 949, loss = 0.70257282
Iteration 950, loss = 0.70225774
Iteration 951, loss = 0.70194053
Iteration 952, loss = 0.70162274
Iteration 953, loss = 0.70130867
Iteration 954, loss = 0.70099446
Iteration 955, loss = 0.70067969
Iteration 956, loss = 0.70036612
Iteration 957, loss = 0.70005361
Iteration 958, loss = 0.69974127
Iteration 959, loss = 0.69943102
Iteration 960, loss = 0.69912055
Iteration 961, loss = 0.69880993
Iteration 962, loss = 0.69850006
Iteration 963, loss = 0.69819275
Iteration 964, loss = 0.69788241
Iteration 965, loss = 0.69757454
Iteration 966, loss = 0.69726762
Iteration 967, loss = 0.69695983
Iteration 968, loss = 0.69665401
Iteration 969, loss = 0.69634732
Iteration 970, loss = 0.69604527
Iteration 971, loss = 0.69573834
Iteration 972, loss = 0.69543417
Iteration 973, loss = 0.69513108
Iteration 974, loss = 0.69482812
Iteration 975, loss = 0.69452540
Iteration 976, loss = 0.69422286
Iteration 977, loss = 0.69392108
Iteration 978, loss = 0.69361982
Iteration 979, loss = 0.69331896
Iteration 980, loss = 0.69301851
Iteration 981, loss = 0.69271849
Iteration 982, loss = 0.69241887
Iteration 983, loss = 0.69211979
Iteration 984, loss = 0.69182127
Iteration 985, loss = 0.69152289
Iteration 986, loss = 0.69122513
Iteration 987, loss = 0.69092765
Iteration 988, loss = 0.69063055
Iteration 989, loss = 0.69033556
Iteration 990, loss = 0.69003969
Iteration 991, loss = 0.68974338
Iteration 992, loss = 0.68944917
Iteration 993, loss = 0.68915563
Iteration 994, loss = 0.68886290
Iteration 995, loss = 0.68856881
Iteration 996, loss = 0.68827604
Iteration 997, loss = 0.68798358
Iteration 998, loss = 0.68769116
Iteration 999, loss = 0.68739889
Iteration 1000, loss = 0.68710687
Iteration 1001, loss = 0.68681556
Iteration 1002, loss = 0.68652415
Iteration 1003, loss = 0.68623338
Iteration 1004, loss = 0.68594301
Iteration 1005, loss = 0.68565367
Iteration 1006, loss = 0.68536436
Iteration 1007, loss = 0.68507481
Iteration 1008, loss = 0.68478516
Iteration 1009, loss = 0.68449710
Iteration 1010, loss = 0.68420916
Iteration 1011, loss = 0.68392101
Iteration 1012, loss = 0.68363300
Iteration 1013, loss = 0.68334493
Iteration 1014, loss = 0.68305701
Iteration 1015, loss = 0.68276907
Iteration 1016, loss = 0.68248319
Iteration 1017, loss = 0.68219547
Iteration 1018, loss = 0.68190858
Iteration 1019, loss = 0.68162205
Iteration 1020, loss = 0.68133627
Iteration 1021, loss = 0.68105036
Iteration 1022, loss = 0.68076444
Iteration 1023, loss = 0.68047826
Iteration 1024, loss = 0.68019371
Iteration 1025, loss = 0.67990992
Iteration 1026, loss = 0.67962361
Iteration 1027, loss = 0.67933800
Iteration 1028, loss = 0.67905398
Iteration 1029, loss = 0.67876955
Iteration 1030, loss = 0.67848489
Iteration 1031, loss = 0.67820120
Iteration 1032, loss = 0.67791705
Iteration 1033, loss = 0.67763258
Iteration 1034, loss = 0.67734894
Iteration 1035, loss = 0.67706576
Iteration 1036, loss = 0.67678298
Iteration 1037, loss = 0.67650146
Iteration 1038, loss = 0.67622036
Iteration 1039, loss = 0.67593927
Iteration 1040, loss = 0.67565790
Iteration 1041, loss = 0.67537686
Iteration 1042, loss = 0.67509598
Iteration 1043, loss = 0.67481520
Iteration 1044, loss = 0.67453433
Iteration 1045, loss = 0.67425430
Iteration 1046, loss = 0.67397381
Iteration 1047, loss = 0.67369396
Iteration 1048, loss = 0.67341385
Iteration 1049, loss = 0.67313442
Iteration 1050, loss = 0.67285485
Iteration 1051, loss = 0.67257508
Iteration 1052, loss = 0.67229520
Iteration 1053, loss = 0.67201530
Iteration 1054, loss = 0.67173817
Iteration 1055, loss = 0.67145728
Iteration 1056, loss = 0.67117799
Iteration 1057, loss = 0.67089926
Iteration 1058, loss = 0.67062060
Iteration 1059, loss = 0.67034178
Iteration 1060, loss = 0.67006262
Iteration 1061, loss = 0.66977516
Iteration 1062, loss = 0.66936725
Iteration 1063, loss = 0.66891520
Iteration 1064, loss = 0.66852196
Iteration 1065, loss = 0.66819384
Iteration 1066, loss = 0.66789354
Iteration 1067, loss = 0.66759603
Iteration 1068, loss = 0.66728083
Iteration 1069, loss = 0.66695077
Iteration 1070, loss = 0.66662282
Iteration 1071, loss = 0.66629917
Iteration 1072, loss = 0.66596366
Iteration 1073, loss = 0.66561654
Iteration 1074, loss = 0.66525341
Iteration 1075, loss = 0.66489770
Iteration 1076, loss = 0.66455794
Iteration 1077, loss = 0.66423740
Iteration 1078, loss = 0.66392289
Iteration 1079, loss = 0.66360543
Iteration 1080, loss = 0.66327495
Iteration 1081, loss = 0.66293556
Iteration 1082, loss = 0.66259128
Iteration 1083, loss = 0.66225725
Iteration 1084, loss = 0.66192768
Iteration 1085, loss = 0.66159913
Iteration 1086, loss = 0.66127269
Iteration 1087, loss = 0.66094004
Iteration 1088, loss = 0.66060833
Iteration 1089, loss = 0.66028312
Iteration 1090, loss = 0.65995891
Iteration 1091, loss = 0.65963589
Iteration 1092, loss = 0.65931353
Iteration 1093, loss = 0.65898753
Iteration 1094, loss = 0.65865539
Iteration 1095, loss = 0.65832596
Iteration 1096, loss = 0.65800183
Iteration 1097, loss = 0.65768256
Iteration 1098, loss = 0.65736455
Iteration 1099, loss = 0.65704309
Iteration 1100, loss = 0.65671641
Iteration 1101, loss = 0.65639558
Iteration 1102, loss = 0.65607305
Iteration 1103, loss = 0.65574946
Iteration 1104, loss = 0.65542704
Iteration 1105, loss = 0.65510378
Iteration 1106, loss = 0.65478474
Iteration 1107, loss = 0.65446117
Iteration 1108, loss = 0.65413975
Iteration 1109, loss = 0.65382233
Iteration 1110, loss = 0.65350314
Iteration 1111, loss = 0.65318383
Iteration 1112, loss = 0.65286036
Iteration 1113, loss = 0.65253569
Iteration 1114, loss = 0.65222149
Iteration 1115, loss = 0.65190301
Iteration 1116, loss = 0.65158044
Iteration 1117, loss = 0.65126067
Iteration 1118, loss = 0.65093961
Iteration 1119, loss = 0.65062023
Iteration 1120, loss = 0.65029970
Iteration 1121, loss = 0.64997779
Iteration 1122, loss = 0.64966420
Iteration 1123, loss = 0.64934732
Iteration 1124, loss = 0.64902441
Iteration 1125, loss = 0.64870698
Iteration 1126, loss = 0.64839064
Iteration 1127, loss = 0.64807231
Iteration 1128, loss = 0.64775333
Iteration 1129, loss = 0.64743908
Iteration 1130, loss = 0.64711843
Iteration 1131, loss = 0.64680465
Iteration 1132, loss = 0.64649062
Iteration 1133, loss = 0.64617310
Iteration 1134, loss = 0.64585738
Iteration 1135, loss = 0.64554507
Iteration 1136, loss = 0.64522947
Iteration 1137, loss = 0.64491546
Iteration 1138, loss = 0.64460009
Iteration 1139, loss = 0.64428853
Iteration 1140, loss = 0.64397762
Iteration 1141, loss = 0.64366348
Iteration 1142, loss = 0.64335019
Iteration 1143, loss = 0.64303698
Iteration 1144, loss = 0.64272299
Iteration 1145, loss = 0.64241360
Iteration 1146, loss = 0.64209862
Iteration 1147, loss = 0.64178704
Iteration 1148, loss = 0.64147741
Iteration 1149, loss = 0.64116351
Iteration 1150, loss = 0.64085334
Iteration 1151, loss = 0.64054112
Iteration 1152, loss = 0.64022723
Iteration 1153, loss = 0.63991489
Iteration 1154, loss = 0.63960840
Iteration 1155, loss = 0.63929762
Iteration 1156, loss = 0.63898733
Iteration 1157, loss = 0.63867922
Iteration 1158, loss = 0.63837043
Iteration 1159, loss = 0.63806071
Iteration 1160, loss = 0.63774962
Iteration 1161, loss = 0.63743876
Iteration 1162, loss = 0.63713771
Iteration 1163, loss = 0.63682773
Iteration 1164, loss = 0.63651314
Iteration 1165, loss = 0.63620556
Iteration 1166, loss = 0.63589905
Iteration 1167, loss = 0.63559079
Iteration 1168, loss = 0.63528299
Iteration 1169, loss = 0.63497750
Iteration 1170, loss = 0.63466589
Iteration 1171, loss = 0.63435998
Iteration 1172, loss = 0.63405875
Iteration 1173, loss = 0.63375025
Iteration 1174, loss = 0.63344456
Iteration 1175, loss = 0.63314108
Iteration 1176, loss = 0.63283539
Iteration 1177, loss = 0.63252950
Iteration 1178, loss = 0.63222367
Iteration 1179, loss = 0.63191690
Iteration 1180, loss = 0.63161002
Iteration 1181, loss = 0.63130471
Iteration 1182, loss = 0.63099924
Iteration 1183, loss = 0.63069389
Iteration 1184, loss = 0.63038969
Iteration 1185, loss = 0.63008774
Iteration 1186, loss = 0.62978439
Iteration 1187, loss = 0.62947784
Iteration 1188, loss = 0.62917561
Iteration 1189, loss = 0.62887366
Iteration 1190, loss = 0.62857155
Iteration 1191, loss = 0.62826907
Iteration 1192, loss = 0.62796638
Iteration 1193, loss = 0.62766845
Iteration 1194, loss = 0.62736662
Iteration 1195, loss = 0.62706630
Iteration 1196, loss = 0.62676645
Iteration 1197, loss = 0.62646925
Iteration 1198, loss = 0.62617377
Iteration 1199, loss = 0.62587369
Iteration 1200, loss = 0.62557790
Iteration 1201, loss = 0.62528122
Iteration 1202, loss = 0.62498459
Iteration 1203, loss = 0.62469219
Iteration 1204, loss = 0.62439707
Iteration 1205, loss = 0.62410098
Iteration 1206, loss = 0.62380769
Iteration 1207, loss = 0.62351449
Iteration 1208, loss = 0.62322124
Iteration 1209, loss = 0.62292813
Iteration 1210, loss = 0.62263624
Iteration 1211, loss = 0.62234294
Iteration 1212, loss = 0.62205086
Iteration 1213, loss = 0.62175902
Iteration 1214, loss = 0.62146785
Iteration 1215, loss = 0.62117796
Iteration 1216, loss = 0.62088740
Iteration 1217, loss = 0.62059856
Iteration 1218, loss = 0.62030984
Iteration 1219, loss = 0.62001893
Iteration 1220, loss = 0.61972813
Iteration 1221, loss = 0.61944163
Iteration 1222, loss = 0.61915376
Iteration 1223, loss = 0.61886319
Iteration 1224, loss = 0.61857639
Iteration 1225, loss = 0.61828916
Iteration 1226, loss = 0.61800234
Iteration 1227, loss = 0.61771655
Iteration 1228, loss = 0.61742944
Iteration 1229, loss = 0.61714176
Iteration 1230, loss = 0.61685899
Iteration 1231, loss = 0.61657190
Iteration 1232, loss = 0.61628974
Iteration 1233, loss = 0.61600634
Iteration 1234, loss = 0.61572076
Iteration 1235, loss = 0.61543356
Iteration 1236, loss = 0.61515106
Iteration 1237, loss = 0.61486734
Iteration 1238, loss = 0.61458229
Iteration 1239, loss = 0.61430127
Iteration 1240, loss = 0.61401846
Iteration 1241, loss = 0.61373642
Iteration 1242, loss = 0.61345537
Iteration 1243, loss = 0.61317473
Iteration 1244, loss = 0.61289457
Iteration 1245, loss = 0.61261543
Iteration 1246, loss = 0.61233481
Iteration 1247, loss = 0.61205548
Iteration 1248, loss = 0.61177591
Iteration 1249, loss = 0.61149651
Iteration 1250, loss = 0.61121772
Iteration 1251, loss = 0.61093874
Iteration 1252, loss = 0.61066218
Iteration 1253, loss = 0.61038303
Iteration 1254, loss = 0.61010603
Iteration 1255, loss = 0.60983079
Iteration 1256, loss = 0.60955313
Iteration 1257, loss = 0.60927795
Iteration 1258, loss = 0.60900298
Iteration 1259, loss = 0.60872661
Iteration 1260, loss = 0.60845115
Iteration 1261, loss = 0.60817582
Iteration 1262, loss = 0.60789869
Iteration 1263, loss = 0.60761929
Iteration 1264, loss = 0.60734279
Iteration 1265, loss = 0.60706607
Iteration 1266, loss = 0.60679262
Iteration 1267, loss = 0.60652000
Iteration 1268, loss = 0.60624504
Iteration 1269, loss = 0.60597487
Iteration 1270, loss = 0.60570415
Iteration 1271, loss = 0.60543171
Iteration 1272, loss = 0.60516083
Iteration 1273, loss = 0.60489397
Iteration 1274, loss = 0.60462711
Iteration 1275, loss = 0.60436024
Iteration 1276, loss = 0.60409367
Iteration 1277, loss = 0.60382941
Iteration 1278, loss = 0.60356274
Iteration 1279, loss = 0.60329777
Iteration 1280, loss = 0.60303233
Iteration 1281, loss = 0.60277045
Iteration 1282, loss = 0.60251026
Iteration 1283, loss = 0.60224197
Iteration 1284, loss = 0.60197788
Iteration 1285, loss = 0.60171592
Iteration 1286, loss = 0.60145103
Iteration 1287, loss = 0.60118728
Iteration 1288, loss = 0.60092284
Iteration 1289, loss = 0.60065926
Iteration 1290, loss = 0.60039656
Iteration 1291, loss = 0.60013451
Iteration 1292, loss = 0.59987515
Iteration 1293, loss = 0.59961446
Iteration 1294, loss = 0.59935580
Iteration 1295, loss = 0.59909415
Iteration 1296, loss = 0.59883379
Iteration 1297, loss = 0.59857518
Iteration 1298, loss = 0.59831856
Iteration 1299, loss = 0.59806002
Iteration 1300, loss = 0.59780136
Iteration 1301, loss = 0.59754460
Iteration 1302, loss = 0.59728813
Iteration 1303, loss = 0.59703069
Iteration 1304, loss = 0.59677340
Iteration 1305, loss = 0.59652088
Iteration 1306, loss = 0.59626485
Iteration 1307, loss = 0.59600930
Iteration 1308, loss = 0.59575634
Iteration 1309, loss = 0.59550406
Iteration 1310, loss = 0.59525017
Iteration 1311, loss = 0.59499544
Iteration 1312, loss = 0.59474271
Iteration 1313, loss = 0.59449272
Iteration 1314, loss = 0.59423616
Iteration 1315, loss = 0.59399039
Iteration 1316, loss = 0.59374150
Iteration 1317, loss = 0.59348868
Iteration 1318, loss = 0.59323853
Iteration 1319, loss = 0.59298891
Iteration 1320, loss = 0.59273820
Iteration 1321, loss = 0.59249750
Iteration 1322, loss = 0.59225414
Iteration 1323, loss = 0.59200730
Iteration 1324, loss = 0.59175768
Iteration 1325, loss = 0.59150867
Iteration 1326, loss = 0.59126541
Iteration 1327, loss = 0.59102203
Iteration 1328, loss = 0.59077661
Iteration 1329, loss = 0.59053163
Iteration 1330, loss = 0.59029341
Iteration 1331, loss = 0.59004774
Iteration 1332, loss = 0.58980582
Iteration 1333, loss = 0.58956150
Iteration 1334, loss = 0.58932262
Iteration 1335, loss = 0.58908294
Iteration 1336, loss = 0.58884234
Iteration 1337, loss = 0.58860252
Iteration 1338, loss = 0.58836175
Iteration 1339, loss = 0.58812376
Iteration 1340, loss = 0.58788390
Iteration 1341, loss = 0.58764043
Iteration 1342, loss = 0.58739911
Iteration 1343, loss = 0.58716421
Iteration 1344, loss = 0.58692458
Iteration 1345, loss = 0.58668610
Iteration 1346, loss = 0.58644936
Iteration 1347, loss = 0.58621283
Iteration 1348, loss = 0.58597764
Iteration 1349, loss = 0.58574218
Iteration 1350, loss = 0.58550541
Iteration 1351, loss = 0.58527285
Iteration 1352, loss = 0.58503720
Iteration 1353, loss = 0.58480073
Iteration 1354, loss = 0.58457026
Iteration 1355, loss = 0.58433846
Iteration 1356, loss = 0.58410278
Iteration 1357, loss = 0.58386724
Iteration 1358, loss = 0.58363441
Iteration 1359, loss = 0.58339882
Iteration 1360, loss = 0.58317138
Iteration 1361, loss = 0.58293994
Iteration 1362, loss = 0.58270723
Iteration 1363, loss = 0.58247719
Iteration 1364, loss = 0.58224528
Iteration 1365, loss = 0.58201420
Iteration 1366, loss = 0.58178442
Iteration 1367, loss = 0.58155290
Iteration 1368, loss = 0.58132190
Iteration 1369, loss = 0.58109525
Iteration 1370, loss = 0.58086579
Iteration 1371, loss = 0.58063812
Iteration 1372, loss = 0.58041309
Iteration 1373, loss = 0.58018204
Iteration 1374, loss = 0.57995232
Iteration 1375, loss = 0.57973035
Iteration 1376, loss = 0.57949759
Iteration 1377, loss = 0.57927332
Iteration 1378, loss = 0.57904977
Iteration 1379, loss = 0.57882015
Iteration 1380, loss = 0.57859227
Iteration 1381, loss = 0.57836823
Iteration 1382, loss = 0.57814088
Iteration 1383, loss = 0.57791436
Iteration 1384, loss = 0.57769142
Iteration 1385, loss = 0.57746918
Iteration 1386, loss = 0.57725223
Iteration 1387, loss = 0.57702751
Iteration 1388, loss = 0.57680245
Iteration 1389, loss = 0.57658592
Iteration 1390, loss = 0.57636726
Iteration 1391, loss = 0.57614697
Iteration 1392, loss = 0.57592541
Iteration 1393, loss = 0.57570273
Iteration 1394, loss = 0.57548841
Iteration 1395, loss = 0.57526781
Iteration 1396, loss = 0.57504542
Iteration 1397, loss = 0.57482832
Iteration 1398, loss = 0.57460848
Iteration 1399, loss = 0.57439530
Iteration 1400, loss = 0.57418006
Iteration 1401, loss = 0.57396381
Iteration 1402, loss = 0.57374605
Iteration 1403, loss = 0.57353283
Iteration 1404, loss = 0.57331963
Iteration 1405, loss = 0.57310591
Iteration 1406, loss = 0.57289431
Iteration 1407, loss = 0.57268088
Iteration 1408, loss = 0.57246966
Iteration 1409, loss = 0.57225855
Iteration 1410, loss = 0.57204437
Iteration 1411, loss = 0.57183297
Iteration 1412, loss = 0.57162182
Iteration 1413, loss = 0.57141122
Iteration 1414, loss = 0.57120454
Iteration 1415, loss = 0.57099017
Iteration 1416, loss = 0.57078351
Iteration 1417, loss = 0.57057658
Iteration 1418, loss = 0.57036636
Iteration 1419, loss = 0.57015725
Iteration 1420, loss = 0.56994978
Iteration 1421, loss = 0.56973761
Iteration 1422, loss = 0.56952938
Iteration 1423, loss = 0.56932165
Iteration 1424, loss = 0.56911445
Iteration 1425, loss = 0.56890613
Iteration 1426, loss = 0.56869927
Iteration 1427, loss = 0.56849134
Iteration 1428, loss = 0.56828449
Iteration 1429, loss = 0.56808879
Iteration 1430, loss = 0.56787718
Iteration 1431, loss = 0.56766951
Iteration 1432, loss = 0.56747414
Iteration 1433, loss = 0.56726555
Iteration 1434, loss = 0.56706298
Iteration 1435, loss = 0.56685658
Iteration 1436, loss = 0.56665546
Iteration 1437, loss = 0.56645090
Iteration 1438, loss = 0.56625191
Iteration 1439, loss = 0.56604504
Iteration 1440, loss = 0.56583966
Iteration 1441, loss = 0.56563799
Iteration 1442, loss = 0.56543696
Iteration 1443, loss = 0.56523840
Iteration 1444, loss = 0.56503937
Iteration 1445, loss = 0.56483324
Iteration 1446, loss = 0.56463280
Iteration 1447, loss = 0.56443102
Iteration 1448, loss = 0.56422954
Iteration 1449, loss = 0.56403419
Iteration 1450, loss = 0.56384038
Iteration 1451, loss = 0.56363569
Iteration 1452, loss = 0.56343939
Iteration 1453, loss = 0.56324344
Iteration 1454, loss = 0.56304707
Iteration 1455, loss = 0.56285172
Iteration 1456, loss = 0.56265375
Iteration 1457, loss = 0.56245487
Iteration 1458, loss = 0.56226008
Iteration 1459, loss = 0.56206300
Iteration 1460, loss = 0.56186655
Iteration 1461, loss = 0.56167389
Iteration 1462, loss = 0.56147932
Iteration 1463, loss = 0.56128623
Iteration 1464, loss = 0.56109220
Iteration 1465, loss = 0.56089752
Iteration 1466, loss = 0.56070513
Iteration 1467, loss = 0.56050878
Iteration 1468, loss = 0.56032084
Iteration 1469, loss = 0.56012226
Iteration 1470, loss = 0.55993459
Iteration 1471, loss = 0.55974143
Iteration 1472, loss = 0.55954454
Iteration 1473, loss = 0.55935261
Iteration 1474, loss = 0.55916313
Iteration 1475, loss = 0.55897330
Iteration 1476, loss = 0.55877890
Iteration 1477, loss = 0.55858766
Iteration 1478, loss = 0.55840182
Iteration 1479, loss = 0.55820606
Iteration 1480, loss = 0.55801632
Iteration 1481, loss = 0.55782736
Iteration 1482, loss = 0.55763467
Iteration 1483, loss = 0.55744560
Iteration 1484, loss = 0.55725313
Iteration 1485, loss = 0.55706521
Iteration 1486, loss = 0.55687531
Iteration 1487, loss = 0.55668892
Iteration 1488, loss = 0.55649440
Iteration 1489, loss = 0.55630951
Iteration 1490, loss = 0.55611747
Iteration 1491, loss = 0.55592785
Iteration 1492, loss = 0.55574203
Iteration 1493, loss = 0.55555458
Iteration 1494, loss = 0.55536433
Iteration 1495, loss = 0.55517412
Iteration 1496, loss = 0.55499419
Iteration 1497, loss = 0.55480012
Iteration 1498, loss = 0.55461285
Iteration 1499, loss = 0.55442578
Iteration 1500, loss = 0.55423649
Iteration 1501, loss = 0.55404672
Iteration 1502, loss = 0.55386275
Iteration 1503, loss = 0.55367363
Iteration 1504, loss = 0.55348678
Iteration 1505, loss = 0.55329939
Iteration 1506, loss = 0.55311247
Iteration 1507, loss = 0.55292975
Iteration 1508, loss = 0.55274453
Iteration 1509, loss = 0.55255584
Iteration 1510, loss = 0.55237443
Iteration 1511, loss = 0.55218106
Iteration 1512, loss = 0.55200301
Iteration 1513, loss = 0.55181734
Iteration 1514, loss = 0.55162783
Iteration 1515, loss = 0.55144156
Iteration 1516, loss = 0.55125667
Iteration 1517, loss = 0.55107137
Iteration 1518, loss = 0.55088683
Iteration 1519, loss = 0.55069854
Iteration 1520, loss = 0.55051641
Iteration 1521, loss = 0.55033354
Iteration 1522, loss = 0.55015223
Iteration 1523, loss = 0.54997122
Iteration 1524, loss = 0.54978256
Iteration 1525, loss = 0.54961156
Iteration 1526, loss = 0.54942443
Iteration 1527, loss = 0.54923392
Iteration 1528, loss = 0.54905603
Iteration 1529, loss = 0.54886580
Iteration 1530, loss = 0.54870015
Iteration 1531, loss = 0.54850906
Iteration 1532, loss = 0.54832464
Iteration 1533, loss = 0.54813667
Iteration 1534, loss = 0.54797316
Iteration 1535, loss = 0.54777968
Iteration 1536, loss = 0.54760738
Iteration 1537, loss = 0.54743398
Iteration 1538, loss = 0.54724608
Iteration 1539, loss = 0.54705765
Iteration 1540, loss = 0.54689730
Iteration 1541, loss = 0.54669457
Iteration 1542, loss = 0.54655056
Iteration 1543, loss = 0.54638043
Iteration 1544, loss = 0.54617667
Iteration 1545, loss = 0.54601520
Iteration 1546, loss = 0.54583367
Iteration 1547, loss = 0.54564795
Iteration 1548, loss = 0.54549020
Iteration 1549, loss = 0.54531639
Iteration 1550, loss = 0.54512716
Iteration 1551, loss = 0.54494105
Iteration 1552, loss = 0.54479493
Iteration 1553, loss = 0.54458853
Iteration 1554, loss = 0.54442954
Iteration 1555, loss = 0.54425790
Iteration 1556, loss = 0.54407292
Iteration 1557, loss = 0.54390733
Iteration 1558, loss = 0.54372219
Iteration 1559, loss = 0.54355057
Iteration 1560, loss = 0.54337249
Iteration 1561, loss = 0.54320061
Iteration 1562, loss = 0.54302729
Iteration 1563, loss = 0.54285455
Iteration 1564, loss = 0.54268175
Iteration 1565, loss = 0.54250881
Iteration 1566, loss = 0.54233727
Iteration 1567, loss = 0.54216330
Iteration 1568, loss = 0.54199447
Iteration 1569, loss = 0.54181951
Iteration 1570, loss = 0.54164844
Iteration 1571, loss = 0.54147892
Iteration 1572, loss = 0.54131628
Iteration 1573, loss = 0.54114430
Iteration 1574, loss = 0.54096951
Iteration 1575, loss = 0.54080214
Iteration 1576, loss = 0.54062253
Iteration 1577, loss = 0.54046051
Iteration 1578, loss = 0.54028313
Iteration 1579, loss = 0.54012048
Iteration 1580, loss = 0.53994462
Iteration 1581, loss = 0.53977795
Iteration 1582, loss = 0.53960859
Iteration 1583, loss = 0.53943766
Iteration 1584, loss = 0.53926846
Iteration 1585, loss = 0.53910017
Iteration 1586, loss = 0.53893432
Iteration 1587, loss = 0.53876756
Iteration 1588, loss = 0.53859768
Iteration 1589, loss = 0.53843501
Iteration 1590, loss = 0.53826273
Iteration 1591, loss = 0.53809975
Iteration 1592, loss = 0.53792799
Iteration 1593, loss = 0.53776407
Iteration 1594, loss = 0.53759768
Iteration 1595, loss = 0.53743239
Iteration 1596, loss = 0.53726986
Iteration 1597, loss = 0.53709994
Iteration 1598, loss = 0.53693543
Iteration 1599, loss = 0.53676589
Iteration 1600, loss = 0.53660197
Iteration 1601, loss = 0.53643612
Iteration 1602, loss = 0.53627349
Iteration 1603, loss = 0.53610611
Iteration 1604, loss = 0.53594181
Iteration 1605, loss = 0.53577662
Iteration 1606, loss = 0.53561688
Iteration 1607, loss = 0.53545053
Iteration 1608, loss = 0.53528547
Iteration 1609, loss = 0.53512175
Iteration 1610, loss = 0.53496694
Iteration 1611, loss = 0.53479620
Iteration 1612, loss = 0.53463612
Iteration 1613, loss = 0.53447235
Iteration 1614, loss = 0.53430592
Iteration 1615, loss = 0.53414921
Iteration 1616, loss = 0.53398416
Iteration 1617, loss = 0.53382140
Iteration 1618, loss = 0.53365505
Iteration 1619, loss = 0.53349985
Iteration 1620, loss = 0.53333145
Iteration 1621, loss = 0.53316894
Iteration 1622, loss = 0.53300951
Iteration 1623, loss = 0.53284837
Iteration 1624, loss = 0.53268824
Iteration 1625, loss = 0.53252769
Iteration 1626, loss = 0.53236606
Iteration 1627, loss = 0.53220503
Iteration 1628, loss = 0.53204239
Iteration 1629, loss = 0.53188435
Iteration 1630, loss = 0.53172261
Iteration 1631, loss = 0.53156192
Iteration 1632, loss = 0.53140331
Iteration 1633, loss = 0.53124586
Iteration 1634, loss = 0.53108171
Iteration 1635, loss = 0.53092481
Iteration 1636, loss = 0.53077391
Iteration 1637, loss = 0.53061545
Iteration 1638, loss = 0.53044635
Iteration 1639, loss = 0.53029794
Iteration 1640, loss = 0.53012696
Iteration 1641, loss = 0.52998072
Iteration 1642, loss = 0.52982205
Iteration 1643, loss = 0.52965404
Iteration 1644, loss = 0.52950082
Iteration 1645, loss = 0.52934309
Iteration 1646, loss = 0.52917957
Iteration 1647, loss = 0.52902920
Iteration 1648, loss = 0.52887055
Iteration 1649, loss = 0.52870431
Iteration 1650, loss = 0.52855522
Iteration 1651, loss = 0.52841357
Iteration 1652, loss = 0.52824148
Iteration 1653, loss = 0.52807963
Iteration 1654, loss = 0.52793443
Iteration 1655, loss = 0.52777869
Iteration 1656, loss = 0.52761799
Iteration 1657, loss = 0.52746526
Iteration 1658, loss = 0.52730929
Iteration 1659, loss = 0.52714557
Iteration 1660, loss = 0.52699255
Iteration 1661, loss = 0.52684089
Iteration 1662, loss = 0.52668047
Iteration 1663, loss = 0.52652702
Iteration 1664, loss = 0.52637364
Iteration 1665, loss = 0.52621157
Iteration 1666, loss = 0.52605409
Iteration 1667, loss = 0.52590110
Iteration 1668, loss = 0.52574522
Iteration 1669, loss = 0.52558956
Iteration 1670, loss = 0.52543597
Iteration 1671, loss = 0.52527983
Iteration 1672, loss = 0.52512803
Iteration 1673, loss = 0.52497349
Iteration 1674, loss = 0.52481940
Iteration 1675, loss = 0.52466148
Iteration 1676, loss = 0.52450996
Iteration 1677, loss = 0.52435679
Iteration 1678, loss = 0.52420222
Iteration 1679, loss = 0.52404851
Iteration 1680, loss = 0.52389152
Iteration 1681, loss = 0.52373940
Iteration 1682, loss = 0.52358694
Iteration 1683, loss = 0.52343512
Iteration 1684, loss = 0.52327756
Iteration 1685, loss = 0.52312564
Iteration 1686, loss = 0.52297614
Iteration 1687, loss = 0.52282284
Iteration 1688, loss = 0.52266841
Iteration 1689, loss = 0.52251982
Iteration 1690, loss = 0.52236680
Iteration 1691, loss = 0.52221084
Iteration 1692, loss = 0.52206000
Iteration 1693, loss = 0.52190712
Iteration 1694, loss = 0.52175428
Iteration 1695, loss = 0.52160573
Iteration 1696, loss = 0.52145324
Iteration 1697, loss = 0.52130370
Iteration 1698, loss = 0.52115343
Iteration 1699, loss = 0.52099950
Iteration 1700, loss = 0.52085263
Iteration 1701, loss = 0.52070118
Iteration 1702, loss = 0.52054683
Iteration 1703, loss = 0.52040352
Iteration 1704, loss = 0.52024918
Iteration 1705, loss = 0.52009525
Iteration 1706, loss = 0.51994557
Iteration 1707, loss = 0.51979605
Iteration 1708, loss = 0.51964820
Iteration 1709, loss = 0.51949558
Iteration 1710, loss = 0.51934282
Iteration 1711, loss = 0.51919331
Iteration 1712, loss = 0.51904743
Iteration 1713, loss = 0.51889737
Iteration 1714, loss = 0.51874986
Iteration 1715, loss = 0.51859260
Iteration 1716, loss = 0.51845412
Iteration 1717, loss = 0.51830485
Iteration 1718, loss = 0.51815323
Iteration 1719, loss = 0.51800657
Iteration 1720, loss = 0.51785192
Iteration 1721, loss = 0.51769439
Iteration 1722, loss = 0.51754814
Iteration 1723, loss = 0.51739694
Iteration 1724, loss = 0.51724951
Iteration 1725, loss = 0.51709876
Iteration 1726, loss = 0.51694962
Iteration 1727, loss = 0.51680187
Iteration 1728, loss = 0.51665308
Iteration 1729, loss = 0.51649977
Iteration 1730, loss = 0.51635225
Iteration 1731, loss = 0.51620819
Iteration 1732, loss = 0.51606037
Iteration 1733, loss = 0.51591714
Iteration 1734, loss = 0.51576218
Iteration 1735, loss = 0.51562725
Iteration 1736, loss = 0.51547331
Iteration 1737, loss = 0.51531884
Iteration 1738, loss = 0.51518051
Iteration 1739, loss = 0.51502856
Iteration 1740, loss = 0.51487781
Iteration 1741, loss = 0.51472936
Iteration 1742, loss = 0.51458061
Iteration 1743, loss = 0.51443477
Iteration 1744, loss = 0.51428230
Iteration 1745, loss = 0.51414231
Iteration 1746, loss = 0.51399318
Iteration 1747, loss = 0.51384448
Iteration 1748, loss = 0.51369519
Iteration 1749, loss = 0.51354632
Iteration 1750, loss = 0.51340102
Iteration 1751, loss = 0.51325572
Iteration 1752, loss = 0.51310194
Iteration 1753, loss = 0.51295736
Iteration 1754, loss = 0.51280936
Iteration 1755, loss = 0.51266244
Iteration 1756, loss = 0.51252210
Iteration 1757, loss = 0.51237368
Iteration 1758, loss = 0.51222106
Iteration 1759, loss = 0.51207641
Iteration 1760, loss = 0.51193046
Iteration 1761, loss = 0.51178255
Iteration 1762, loss = 0.51163841
Iteration 1763, loss = 0.51149040
Iteration 1764, loss = 0.51134257
Iteration 1765, loss = 0.51120038
Iteration 1766, loss = 0.51105551
Iteration 1767, loss = 0.51090948
Iteration 1768, loss = 0.51075820
Iteration 1769, loss = 0.51061597
Iteration 1770, loss = 0.51047470
Iteration 1771, loss = 0.51032338
Iteration 1772, loss = 0.51017560
Iteration 1773, loss = 0.51003594
Iteration 1774, loss = 0.50989591
Iteration 1775, loss = 0.50974635
Iteration 1776, loss = 0.50959540
Iteration 1777, loss = 0.50944661
Iteration 1778, loss = 0.50930914
Iteration 1779, loss = 0.50916088
Iteration 1780, loss = 0.50901102
Iteration 1781, loss = 0.50886679
Iteration 1782, loss = 0.50872315
Iteration 1783, loss = 0.50857526
Iteration 1784, loss = 0.50843630
Iteration 1785, loss = 0.50828216
Iteration 1786, loss = 0.50815340
Iteration 1787, loss = 0.50799542
Iteration 1788, loss = 0.50784854
Iteration 1789, loss = 0.50772155
Iteration 1790, loss = 0.50756350
Iteration 1791, loss = 0.50742504
Iteration 1792, loss = 0.50728622
Iteration 1793, loss = 0.50713204
Iteration 1794, loss = 0.50699894
Iteration 1795, loss = 0.50685203
Iteration 1796, loss = 0.50669637
Iteration 1797, loss = 0.50655116
Iteration 1798, loss = 0.50640290
Iteration 1799, loss = 0.50625673
Iteration 1800, loss = 0.50611625
Iteration 1801, loss = 0.50597244
Iteration 1802, loss = 0.50582618
Iteration 1803, loss = 0.50567881
Iteration 1804, loss = 0.50553801
Iteration 1805, loss = 0.50539171
Iteration 1806, loss = 0.50524904
Iteration 1807, loss = 0.50510311
Iteration 1808, loss = 0.50495625
Iteration 1809, loss = 0.50481587
Iteration 1810, loss = 0.50467698
Iteration 1811, loss = 0.50452280
Iteration 1812, loss = 0.50439880
Iteration 1813, loss = 0.50425152
Iteration 1814, loss = 0.50410476
Iteration 1815, loss = 0.50396824
Iteration 1816, loss = 0.50382316
Iteration 1817, loss = 0.50367592
Iteration 1818, loss = 0.50352773
Iteration 1819, loss = 0.50339338
Iteration 1820, loss = 0.50324261
Iteration 1821, loss = 0.50309813
Iteration 1822, loss = 0.50295485
Iteration 1823, loss = 0.50281072
Iteration 1824, loss = 0.50266722
Iteration 1825, loss = 0.50252128
Iteration 1826, loss = 0.50237529
Iteration 1827, loss = 0.50222943
Iteration 1828, loss = 0.50209701
Iteration 1829, loss = 0.50194477
Iteration 1830, loss = 0.50180688
Iteration 1831, loss = 0.50166176
Iteration 1832, loss = 0.50152083
Iteration 1833, loss = 0.50137597
Iteration 1834, loss = 0.50123261
Iteration 1835, loss = 0.50109572
Iteration 1836, loss = 0.50095345
Iteration 1837, loss = 0.50080213
Iteration 1838, loss = 0.50066284
Iteration 1839, loss = 0.50051887
Iteration 1840, loss = 0.50037452
Iteration 1841, loss = 0.50023436
Iteration 1842, loss = 0.50009893
Iteration 1843, loss = 0.49994562
Iteration 1844, loss = 0.49980453
Iteration 1845, loss = 0.49966091
Iteration 1846, loss = 0.49952480
Iteration 1847, loss = 0.49937427
Iteration 1848, loss = 0.49923751
Iteration 1849, loss = 0.49909444
Iteration 1850, loss = 0.49895760
Iteration 1851, loss = 0.49881513
Iteration 1852, loss = 0.49866636
Iteration 1853, loss = 0.49852285
Iteration 1854, loss = 0.49837833
Iteration 1855, loss = 0.49824689
Iteration 1856, loss = 0.49810585
Iteration 1857, loss = 0.49795511
Iteration 1858, loss = 0.49781969
Iteration 1859, loss = 0.49768341
Iteration 1860, loss = 0.49754255
Iteration 1861, loss = 0.49739497
Iteration 1862, loss = 0.49724313
Iteration 1863, loss = 0.49710821
Iteration 1864, loss = 0.49697365
Iteration 1865, loss = 0.49682984
Iteration 1866, loss = 0.49669453
Iteration 1867, loss = 0.49654499
Iteration 1868, loss = 0.49641167
Iteration 1869, loss = 0.49626888
Iteration 1870, loss = 0.49612438
Iteration 1871, loss = 0.49598956
Iteration 1872, loss = 0.49585748
Iteration 1873, loss = 0.49572154
Iteration 1874, loss = 0.49557375
Iteration 1875, loss = 0.49543172
Iteration 1876, loss = 0.49531546
Iteration 1877, loss = 0.49515019
Iteration 1878, loss = 0.49503833
Iteration 1879, loss = 0.49489716
Iteration 1880, loss = 0.49473862
Iteration 1881, loss = 0.49460473
Iteration 1882, loss = 0.49446109
Iteration 1883, loss = 0.49431554
Iteration 1884, loss = 0.49420221
Iteration 1885, loss = 0.49405284
Iteration 1886, loss = 0.49389897
Iteration 1887, loss = 0.49376205
Iteration 1888, loss = 0.49363285
Iteration 1889, loss = 0.49348583
Iteration 1890, loss = 0.49334192
Iteration 1891, loss = 0.49320274
Iteration 1892, loss = 0.49306296
Iteration 1893, loss = 0.49292712
Iteration 1894, loss = 0.49278762
Iteration 1895, loss = 0.49266816
Iteration 1896, loss = 0.49250999
Iteration 1897, loss = 0.49237758
Iteration 1898, loss = 0.49223454
Iteration 1899, loss = 0.49210049
Iteration 1900, loss = 0.49195866
Iteration 1901, loss = 0.49182089
Iteration 1902, loss = 0.49168310
Iteration 1903, loss = 0.49155090
Iteration 1904, loss = 0.49141567
Iteration 1905, loss = 0.49127026
Iteration 1906, loss = 0.49113175
Iteration 1907, loss = 0.49100041
Iteration 1908, loss = 0.49086146
Iteration 1909, loss = 0.49071568
Iteration 1910, loss = 0.49058641
Iteration 1911, loss = 0.49044902
Iteration 1912, loss = 0.49030686
Iteration 1913, loss = 0.49017792
Iteration 1914, loss = 0.49004089
Iteration 1915, loss = 0.48990149
Iteration 1916, loss = 0.48976273
Iteration 1917, loss = 0.48962876
Iteration 1918, loss = 0.48949138
Iteration 1919, loss = 0.48935032
Iteration 1920, loss = 0.48920905
Iteration 1921, loss = 0.48907202
Iteration 1922, loss = 0.48893761
Iteration 1923, loss = 0.48881227
Iteration 1924, loss = 0.48868257
Iteration 1925, loss = 0.48853236
Iteration 1926, loss = 0.48841259
Iteration 1927, loss = 0.48829105
Iteration 1928, loss = 0.48813314
Iteration 1929, loss = 0.48801017
Iteration 1930, loss = 0.48787253
Iteration 1931, loss = 0.48770874
Iteration 1932, loss = 0.48759473
Iteration 1933, loss = 0.48745946
Iteration 1934, loss = 0.48729672
Iteration 1935, loss = 0.48717459
Iteration 1936, loss = 0.48704259
Iteration 1937, loss = 0.48689935
Iteration 1938, loss = 0.48676118
Iteration 1939, loss = 0.48662458
Iteration 1940, loss = 0.48649091
Iteration 1941, loss = 0.48635700
Iteration 1942, loss = 0.48622373
Iteration 1943, loss = 0.48607991
Iteration 1944, loss = 0.48594806
Iteration 1945, loss = 0.48580738
Iteration 1946, loss = 0.48568873
Iteration 1947, loss = 0.48554285
Iteration 1948, loss = 0.48539469
Iteration 1949, loss = 0.48526761
Iteration 1950, loss = 0.48512526
Iteration 1951, loss = 0.48498532
Iteration 1952, loss = 0.48485034
Iteration 1953, loss = 0.48471209
Iteration 1954, loss = 0.48457509
Iteration 1955, loss = 0.48444881
Iteration 1956, loss = 0.48430615
Iteration 1957, loss = 0.48416566
Iteration 1958, loss = 0.48402885
Iteration 1959, loss = 0.48388903
Iteration 1960, loss = 0.48375583
Iteration 1961, loss = 0.48361205
Iteration 1962, loss = 0.48347566
Iteration 1963, loss = 0.48335417
Iteration 1964, loss = 0.48320687
Iteration 1965, loss = 0.48307525
Iteration 1966, loss = 0.48292247
Iteration 1967, loss = 0.48281679
Iteration 1968, loss = 0.48266094
Iteration 1969, loss = 0.48253163
Iteration 1970, loss = 0.48238587
Iteration 1971, loss = 0.48227558
Iteration 1972, loss = 0.48213970
Iteration 1973, loss = 0.48196933
Iteration 1974, loss = 0.48185961
Iteration 1975, loss = 0.48171245
Iteration 1976, loss = 0.48159213
Iteration 1977, loss = 0.48143945
Iteration 1978, loss = 0.48131550
Iteration 1979, loss = 0.48117242
Iteration 1980, loss = 0.48102969
Iteration 1981, loss = 0.48088814
Iteration 1982, loss = 0.48075158
Iteration 1983, loss = 0.48063021
Iteration 1984, loss = 0.48048685
Iteration 1985, loss = 0.48034810
Iteration 1986, loss = 0.48020859
Iteration 1987, loss = 0.48008663
Iteration 1988, loss = 0.47993941
Iteration 1989, loss = 0.47980685
Iteration 1990, loss = 0.47967367
Iteration 1991, loss = 0.47952912
Iteration 1992, loss = 0.47940530
Iteration 1993, loss = 0.47926257
Iteration 1994, loss = 0.47912961
Iteration 1995, loss = 0.47898232
Iteration 1996, loss = 0.47885635
Iteration 1997, loss = 0.47873640
Iteration 1998, loss = 0.47860146
Iteration 1999, loss = 0.47844875
Iteration 2000, loss = 0.47832517
Iteration 2001, loss = 0.47819929
Iteration 2002, loss = 0.47805630
Iteration 2003, loss = 0.47791932
Iteration 2004, loss = 0.47777064
Iteration 2005, loss = 0.47763928
Iteration 2006, loss = 0.47751499
Iteration 2007, loss = 0.47737609
Iteration 2008, loss = 0.47722838
Iteration 2009, loss = 0.47710088
Iteration 2010, loss = 0.47696267
Iteration 2011, loss = 0.47682668
Iteration 2012, loss = 0.47669448
Iteration 2013, loss = 0.47655379
Iteration 2014, loss = 0.47641511
Iteration 2015, loss = 0.47628053
Iteration 2016, loss = 0.47615159
Iteration 2017, loss = 0.47600977
Iteration 2018, loss = 0.47586641
Iteration 2019, loss = 0.47574596
Iteration 2020, loss = 0.47561422
Iteration 2021, loss = 0.47546704
Iteration 2022, loss = 0.47533569
Iteration 2023, loss = 0.47520501
Iteration 2024, loss = 0.47506463
Iteration 2025, loss = 0.47493404
Iteration 2026, loss = 0.47480063
Iteration 2027, loss = 0.47465948
Iteration 2028, loss = 0.47451420
Iteration 2029, loss = 0.47438708
Iteration 2030, loss = 0.47426046
Iteration 2031, loss = 0.47410894
Iteration 2032, loss = 0.47396545
Iteration 2033, loss = 0.47384377
Iteration 2034, loss = 0.47370612
Iteration 2035, loss = 0.47357368
Iteration 2036, loss = 0.47345003
Iteration 2037, loss = 0.47331650
Iteration 2038, loss = 0.47316205
Iteration 2039, loss = 0.47304746
Iteration 2040, loss = 0.47291559
Iteration 2041, loss = 0.47276993
Iteration 2042, loss = 0.47262434
Iteration 2043, loss = 0.47247740
Iteration 2044, loss = 0.47237079
Iteration 2045, loss = 0.47219900
Iteration 2046, loss = 0.47209521
Iteration 2047, loss = 0.47196693
Iteration 2048, loss = 0.47180886
Iteration 2049, loss = 0.47169971
Iteration 2050, loss = 0.47157789
Iteration 2051, loss = 0.47141616
Iteration 2052, loss = 0.47127749
Iteration 2053, loss = 0.47113818
Iteration 2054, loss = 0.47099001
Iteration 2055, loss = 0.47086751
Iteration 2056, loss = 0.47071116
Iteration 2057, loss = 0.47056808
Iteration 2058, loss = 0.47043785
Iteration 2059, loss = 0.47029902
Iteration 2060, loss = 0.47016051
Iteration 2061, loss = 0.47002658
Iteration 2062, loss = 0.46987637
Iteration 2063, loss = 0.46974873
Iteration 2064, loss = 0.46960546
Iteration 2065, loss = 0.46947374
Iteration 2066, loss = 0.46932932
Iteration 2067, loss = 0.46919746
Iteration 2068, loss = 0.46905482
Iteration 2069, loss = 0.46892114
Iteration 2070, loss = 0.46878154
Iteration 2071, loss = 0.46865167
Iteration 2072, loss = 0.46849878
Iteration 2073, loss = 0.46837059
Iteration 2074, loss = 0.46822863
Iteration 2075, loss = 0.46808925
Iteration 2076, loss = 0.46794389
Iteration 2077, loss = 0.46781874
Iteration 2078, loss = 0.46768224
Iteration 2079, loss = 0.46754686
Iteration 2080, loss = 0.46740698
Iteration 2081, loss = 0.46726695
Iteration 2082, loss = 0.46712228
Iteration 2083, loss = 0.46699566
Iteration 2084, loss = 0.46685424
Iteration 2085, loss = 0.46671899
Iteration 2086, loss = 0.46658509
Iteration 2087, loss = 0.46644285
Iteration 2088, loss = 0.46630751
Iteration 2089, loss = 0.46618051
Iteration 2090, loss = 0.46604192
Iteration 2091, loss = 0.46588597
Iteration 2092, loss = 0.46577289
Iteration 2093, loss = 0.46564692
Iteration 2094, loss = 0.46549566
Iteration 2095, loss = 0.46533746
Iteration 2096, loss = 0.46520069
Iteration 2097, loss = 0.46507312
Iteration 2098, loss = 0.46492510
Iteration 2099, loss = 0.46480298
Iteration 2100, loss = 0.46463899
Iteration 2101, loss = 0.46450835
Iteration 2102, loss = 0.46436848
Iteration 2103, loss = 0.46421946
Iteration 2104, loss = 0.46409167
Iteration 2105, loss = 0.46393858
Iteration 2106, loss = 0.46379633
Iteration 2107, loss = 0.46365584
Iteration 2108, loss = 0.46352605
Iteration 2109, loss = 0.46339030
Iteration 2110, loss = 0.46324581
Iteration 2111, loss = 0.46309824
Iteration 2112, loss = 0.46296866
Iteration 2113, loss = 0.46282496
Iteration 2114, loss = 0.46268452
Iteration 2115, loss = 0.46254669
Iteration 2116, loss = 0.46241028
Iteration 2117, loss = 0.46225863
Iteration 2118, loss = 0.46213072
Iteration 2119, loss = 0.46197964
Iteration 2120, loss = 0.46186722
Iteration 2121, loss = 0.46172069
Iteration 2122, loss = 0.46155661
Iteration 2123, loss = 0.46146460
Iteration 2124, loss = 0.46129790
Iteration 2125, loss = 0.46114577
Iteration 2126, loss = 0.46101448
Iteration 2127, loss = 0.46087102
Iteration 2128, loss = 0.46076514
Iteration 2129, loss = 0.46060816
Iteration 2130, loss = 0.46045561
Iteration 2131, loss = 0.46032471
Iteration 2132, loss = 0.46017756
Iteration 2133, loss = 0.46002598
Iteration 2134, loss = 0.45988927
Iteration 2135, loss = 0.45973184
Iteration 2136, loss = 0.45959456
Iteration 2137, loss = 0.45945801
Iteration 2138, loss = 0.45930195
Iteration 2139, loss = 0.45916338
Iteration 2140, loss = 0.45903480
Iteration 2141, loss = 0.45889472
Iteration 2142, loss = 0.45874816
Iteration 2143, loss = 0.45860345
Iteration 2144, loss = 0.45845667
Iteration 2145, loss = 0.45833314
Iteration 2146, loss = 0.45819134
Iteration 2147, loss = 0.45805172
Iteration 2148, loss = 0.45791381
Iteration 2149, loss = 0.45775865
Iteration 2150, loss = 0.45763284
Iteration 2151, loss = 0.45747654
Iteration 2152, loss = 0.45733509
Iteration 2153, loss = 0.45720421
Iteration 2154, loss = 0.45704753
Iteration 2155, loss = 0.45691182
Iteration 2156, loss = 0.45675758
Iteration 2157, loss = 0.45662287
Iteration 2158, loss = 0.45648424
Iteration 2159, loss = 0.45633122
Iteration 2160, loss = 0.45619319
Iteration 2161, loss = 0.45604996
Iteration 2162, loss = 0.45590065
Iteration 2163, loss = 0.45575742
Iteration 2164, loss = 0.45561034
Iteration 2165, loss = 0.45546490
Iteration 2166, loss = 0.45532497
Iteration 2167, loss = 0.45517273
Iteration 2168, loss = 0.45503118
Iteration 2169, loss = 0.45488772
Iteration 2170, loss = 0.45474222
Iteration 2171, loss = 0.45459685
Iteration 2172, loss = 0.45445882
Iteration 2173, loss = 0.45431669
Iteration 2174, loss = 0.45417419
Iteration 2175, loss = 0.45402155
Iteration 2176, loss = 0.45387565
Iteration 2177, loss = 0.45372992
Iteration 2178, loss = 0.45359189
Iteration 2179, loss = 0.45344190
Iteration 2180, loss = 0.45328980
Iteration 2181, loss = 0.45315416
Iteration 2182, loss = 0.45300436
Iteration 2183, loss = 0.45285491
Iteration 2184, loss = 0.45270630
Iteration 2185, loss = 0.45256971
Iteration 2186, loss = 0.45241437
Iteration 2187, loss = 0.45226521
Iteration 2188, loss = 0.45212627
Iteration 2189, loss = 0.45197213
Iteration 2190, loss = 0.45183199
Iteration 2191, loss = 0.45167931
Iteration 2192, loss = 0.45154545
Iteration 2193, loss = 0.45138315
Iteration 2194, loss = 0.45124330
Iteration 2195, loss = 0.45109513
Iteration 2196, loss = 0.45094281
Iteration 2197, loss = 0.45080176
Iteration 2198, loss = 0.45065368
Iteration 2199, loss = 0.45050291
Iteration 2200, loss = 0.45034869
Iteration 2201, loss = 0.45019956
Iteration 2202, loss = 0.45004684
Iteration 2203, loss = 0.44990971
Iteration 2204, loss = 0.44974642
Iteration 2205, loss = 0.44960394
Iteration 2206, loss = 0.44944895
Iteration 2207, loss = 0.44932408
Iteration 2208, loss = 0.44915707
Iteration 2209, loss = 0.44903175
Iteration 2210, loss = 0.44888147
Iteration 2211, loss = 0.44870983
Iteration 2212, loss = 0.44857019
Iteration 2213, loss = 0.44837142
Iteration 2214, loss = 0.44821247
Iteration 2215, loss = 0.44803112
Iteration 2216, loss = 0.44783718
Iteration 2217, loss = 0.44764582
Iteration 2218, loss = 0.44747646
Iteration 2219, loss = 0.44728269
Iteration 2220, loss = 0.44709223
Iteration 2221, loss = 0.44691007
Iteration 2222, loss = 0.44671474
Iteration 2223, loss = 0.44654413
Iteration 2224, loss = 0.44634105
Iteration 2225, loss = 0.44614794
Iteration 2226, loss = 0.44596623
Iteration 2227, loss = 0.44576157
Iteration 2228, loss = 0.44556571
Iteration 2229, loss = 0.44537406
Iteration 2230, loss = 0.44518458
Iteration 2231, loss = 0.44498946
Iteration 2232, loss = 0.44479865
Iteration 2233, loss = 0.44459614
Iteration 2234, loss = 0.44440281
Iteration 2235, loss = 0.44421294
Iteration 2236, loss = 0.44403654
Iteration 2237, loss = 0.44383079
Iteration 2238, loss = 0.44364492
Iteration 2239, loss = 0.44343417
Iteration 2240, loss = 0.44325222
Iteration 2241, loss = 0.44306221
Iteration 2242, loss = 0.44284952
Iteration 2243, loss = 0.44264354
Iteration 2244, loss = 0.44243766
Iteration 2245, loss = 0.44222934
Iteration 2246, loss = 0.44204230
Iteration 2247, loss = 0.44183496
Iteration 2248, loss = 0.44167476
Iteration 2249, loss = 0.44148378
Iteration 2250, loss = 0.44127695
Iteration 2251, loss = 0.44110190
Iteration 2252, loss = 0.44092511
Iteration 2253, loss = 0.44071813
Iteration 2254, loss = 0.44052617
Iteration 2255, loss = 0.44036040
Iteration 2256, loss = 0.44014713
Iteration 2257, loss = 0.43997657
Iteration 2258, loss = 0.43978250
Iteration 2259, loss = 0.43958099
Iteration 2260, loss = 0.43939943
Iteration 2261, loss = 0.43921066
Iteration 2262, loss = 0.43901429
Iteration 2263, loss = 0.43880737
Iteration 2264, loss = 0.43863449
Iteration 2265, loss = 0.43845044
Iteration 2266, loss = 0.43823576
Iteration 2267, loss = 0.43805562
Iteration 2268, loss = 0.43786563
Iteration 2269, loss = 0.43766940
Iteration 2270, loss = 0.43746418
Iteration 2271, loss = 0.43728095
Iteration 2272, loss = 0.43708613
Iteration 2273, loss = 0.43689315
Iteration 2274, loss = 0.43669685
Iteration 2275, loss = 0.43648999
Iteration 2276, loss = 0.43630402
Iteration 2277, loss = 0.43610463
Iteration 2278, loss = 0.43589057
Iteration 2279, loss = 0.43572803
Iteration 2280, loss = 0.43552918
Iteration 2281, loss = 0.43531427
Iteration 2282, loss = 0.43512169
Iteration 2283, loss = 0.43495305
Iteration 2284, loss = 0.43477038
Iteration 2285, loss = 0.43457396
Iteration 2286, loss = 0.43436855
Iteration 2287, loss = 0.43414532
Iteration 2288, loss = 0.43393376
Iteration 2289, loss = 0.43375475
Iteration 2290, loss = 0.43356156
Iteration 2291, loss = 0.43334546
Iteration 2292, loss = 0.43314154
Iteration 2293, loss = 0.43298562
Iteration 2294, loss = 0.43277449
Iteration 2295, loss = 0.43257259
Iteration 2296, loss = 0.43234582
Iteration 2297, loss = 0.43216561
Iteration 2298, loss = 0.43197841
Iteration 2299, loss = 0.43176550
Iteration 2300, loss = 0.43157990
Iteration 2301, loss = 0.43137490
Iteration 2302, loss = 0.43118791
Iteration 2303, loss = 0.43097887
Iteration 2304, loss = 0.43082489
Iteration 2305, loss = 0.43061723
Iteration 2306, loss = 0.43038432
Iteration 2307, loss = 0.43019060
Iteration 2308, loss = 0.43002659
Iteration 2309, loss = 0.42982100
Iteration 2310, loss = 0.42956854
Iteration 2311, loss = 0.42942258
Iteration 2312, loss = 0.42922668
Iteration 2313, loss = 0.42900261
Iteration 2314, loss = 0.42879950
Iteration 2315, loss = 0.42862763
Iteration 2316, loss = 0.42841992
Iteration 2317, loss = 0.42816839
Iteration 2318, loss = 0.42800845
Iteration 2319, loss = 0.42783171
Iteration 2320, loss = 0.42760787
Iteration 2321, loss = 0.42740571
Iteration 2322, loss = 0.42720304
Iteration 2323, loss = 0.42701405
Iteration 2324, loss = 0.42681366
Iteration 2325, loss = 0.42661526
Iteration 2326, loss = 0.42641703
Iteration 2327, loss = 0.42620746
Iteration 2328, loss = 0.42601467
Iteration 2329, loss = 0.42585442
Iteration 2330, loss = 0.42567027
Iteration 2331, loss = 0.42545206
Iteration 2332, loss = 0.42527056
Iteration 2333, loss = 0.42505565
Iteration 2334, loss = 0.42484779
Iteration 2335, loss = 0.42466244
Iteration 2336, loss = 0.42446832
Iteration 2337, loss = 0.42425192
Iteration 2338, loss = 0.42406875
Iteration 2339, loss = 0.42387873
Iteration 2340, loss = 0.42367717
Iteration 2341, loss = 0.42347135
Iteration 2342, loss = 0.42332112
Iteration 2343, loss = 0.42312798
Iteration 2344, loss = 0.42290077
Iteration 2345, loss = 0.42271876
Iteration 2346, loss = 0.42255684
Iteration 2347, loss = 0.42233896
Iteration 2348, loss = 0.42217107
Iteration 2349, loss = 0.42194840
Iteration 2350, loss = 0.42179690
Iteration 2351, loss = 0.42157466
Iteration 2352, loss = 0.42140291
Iteration 2353, loss = 0.42118943
Iteration 2354, loss = 0.42100192
Iteration 2355, loss = 0.42083486
Iteration 2356, loss = 0.42062102
Iteration 2357, loss = 0.42044211
Iteration 2358, loss = 0.42025307
Iteration 2359, loss = 0.42005220
Iteration 2360, loss = 0.41986438
Iteration 2361, loss = 0.41967408
Iteration 2362, loss = 0.41946995
Iteration 2363, loss = 0.41927203
Iteration 2364, loss = 0.41911620
Iteration 2365, loss = 0.41890078
Iteration 2366, loss = 0.41872322
Iteration 2367, loss = 0.41851073
Iteration 2368, loss = 0.41837434
Iteration 2369, loss = 0.41813807
Iteration 2370, loss = 0.41798741
Iteration 2371, loss = 0.41779889
Iteration 2372, loss = 0.41758132
Iteration 2373, loss = 0.41739399
Iteration 2374, loss = 0.41722199
Iteration 2375, loss = 0.41706267
Iteration 2376, loss = 0.41684384
Iteration 2377, loss = 0.41668826
Iteration 2378, loss = 0.41647211
Iteration 2379, loss = 0.41627207
Iteration 2380, loss = 0.41603496
Iteration 2381, loss = 0.41583710
Iteration 2382, loss = 0.41564298
Iteration 2383, loss = 0.41542073
Iteration 2384, loss = 0.41520056
Iteration 2385, loss = 0.41497751
Iteration 2386, loss = 0.41478920
Iteration 2387, loss = 0.41457774
Iteration 2388, loss = 0.41435018
Iteration 2389, loss = 0.41412599
Iteration 2390, loss = 0.41392510
Iteration 2391, loss = 0.41373419
Iteration 2392, loss = 0.41351640
Iteration 2393, loss = 0.41331131
Iteration 2394, loss = 0.41311085
Iteration 2395, loss = 0.41291528
Iteration 2396, loss = 0.41270958
Iteration 2397, loss = 0.41248050
Iteration 2398, loss = 0.41226833
Iteration 2399, loss = 0.41204555
Iteration 2400, loss = 0.41182932
Iteration 2401, loss = 0.41163216
Iteration 2402, loss = 0.41138578
Iteration 2403, loss = 0.41119229
Iteration 2404, loss = 0.41092332
Iteration 2405, loss = 0.41074774
Iteration 2406, loss = 0.41050966
Iteration 2407, loss = 0.41028071
Iteration 2408, loss = 0.41013716
Iteration 2409, loss = 0.40990363
Iteration 2410, loss = 0.40972571
Iteration 2411, loss = 0.40951588
Iteration 2412, loss = 0.40937452
Iteration 2413, loss = 0.40916094
Iteration 2414, loss = 0.40897205
Iteration 2415, loss = 0.40879576
Iteration 2416, loss = 0.40859992
Iteration 2417, loss = 0.40841763
Iteration 2418, loss = 0.40822640
Iteration 2419, loss = 0.40804985
Iteration 2420, loss = 0.40785193
Iteration 2421, loss = 0.40766844
Iteration 2422, loss = 0.40747176
Iteration 2423, loss = 0.40727897
Iteration 2424, loss = 0.40712396
Iteration 2425, loss = 0.40689069
Iteration 2426, loss = 0.40671464
Iteration 2427, loss = 0.40655463
Iteration 2428, loss = 0.40631948
Iteration 2429, loss = 0.40617645
Iteration 2430, loss = 0.40592287
Iteration 2431, loss = 0.40581867
Iteration 2432, loss = 0.40558690
Iteration 2433, loss = 0.40541980
Iteration 2434, loss = 0.40522713
Iteration 2435, loss = 0.40503676
Iteration 2436, loss = 0.40488555
Iteration 2437, loss = 0.40465062
Iteration 2438, loss = 0.40450001
Iteration 2439, loss = 0.40429210
Iteration 2440, loss = 0.40412118
Iteration 2441, loss = 0.40394341
Iteration 2442, loss = 0.40374367
Iteration 2443, loss = 0.40358960
Iteration 2444, loss = 0.40339712
Iteration 2445, loss = 0.40322242
Iteration 2446, loss = 0.40304757
Iteration 2447, loss = 0.40284610
Iteration 2448, loss = 0.40269546
Iteration 2449, loss = 0.40250716
Iteration 2450, loss = 0.40234708
Iteration 2451, loss = 0.40214078
Iteration 2452, loss = 0.40196404
Iteration 2453, loss = 0.40180882
Iteration 2454, loss = 0.40164937
Iteration 2455, loss = 0.40146581
Iteration 2456, loss = 0.40129842
Iteration 2457, loss = 0.40111612
Iteration 2458, loss = 0.40091246
Iteration 2459, loss = 0.40074819
Iteration 2460, loss = 0.40056177
Iteration 2461, loss = 0.40040356
Iteration 2462, loss = 0.40023170
Iteration 2463, loss = 0.40003128
Iteration 2464, loss = 0.39989028
Iteration 2465, loss = 0.39969456
Iteration 2466, loss = 0.39955399
Iteration 2467, loss = 0.39934229
Iteration 2468, loss = 0.39918707
Iteration 2469, loss = 0.39857444
Iteration 2470, loss = 0.39834908
Iteration 2471, loss = 0.39792505
Iteration 2472, loss = 0.39760048
Iteration 2473, loss = 0.39739717
Iteration 2474, loss = 0.39725127
Iteration 2475, loss = 0.39669329
Iteration 2476, loss = 0.39635150
Iteration 2477, loss = 0.39609545
Iteration 2478, loss = 0.39563764
Iteration 2479, loss = 0.39537704
Iteration 2480, loss = 0.39514560
Iteration 2481, loss = 0.39476290
Iteration 2482, loss = 0.39436544
Iteration 2483, loss = 0.39414884
Iteration 2484, loss = 0.39382519
Iteration 2485, loss = 0.39345793
Iteration 2486, loss = 0.39320991
Iteration 2487, loss = 0.39288122
Iteration 2488, loss = 0.39251738
Iteration 2489, loss = 0.39222515
Iteration 2490, loss = 0.39197758
Iteration 2491, loss = 0.39170173
Iteration 2492, loss = 0.39140754
Iteration 2493, loss = 0.39112046
Iteration 2494, loss = 0.39082514
Iteration 2495, loss = 0.39056635
Iteration 2496, loss = 0.39032784
Iteration 2497, loss = 0.39007237
Iteration 2498, loss = 0.38980178
Iteration 2499, loss = 0.38953324
Iteration 2500, loss = 0.38927698
Iteration 2501, loss = 0.38901503
Iteration 2502, loss = 0.38875484
Iteration 2503, loss = 0.38850994
Iteration 2504, loss = 0.38827445
Iteration 2505, loss = 0.38804788
Iteration 2506, loss = 0.38779587
Iteration 2507, loss = 0.38754551
Iteration 2508, loss = 0.38730315
Iteration 2509, loss = 0.38708812
Iteration 2510, loss = 0.38685559
Iteration 2511, loss = 0.38662015
Iteration 2512, loss = 0.38639109
Iteration 2513, loss = 0.38614341
Iteration 2514, loss = 0.38591649
Iteration 2515, loss = 0.38571603
Iteration 2516, loss = 0.38549880
Iteration 2517, loss = 0.38527281
Iteration 2518, loss = 0.38504010
Iteration 2519, loss = 0.38480508
Iteration 2520, loss = 0.38463040
Iteration 2521, loss = 0.38443245
Iteration 2522, loss = 0.38421101
Iteration 2523, loss = 0.38398442
Iteration 2524, loss = 0.38375737
Iteration 2525, loss = 0.38351895
Iteration 2526, loss = 0.38333003
Iteration 2527, loss = 0.38314435
Iteration 2528, loss = 0.38293157
Iteration 2529, loss = 0.38269400
Iteration 2530, loss = 0.38247225
Iteration 2531, loss = 0.38227945
Iteration 2532, loss = 0.38208895
Iteration 2533, loss = 0.38188164
Iteration 2534, loss = 0.38166316
Iteration 2535, loss = 0.38144162
Iteration 2536, loss = 0.38123589
Iteration 2537, loss = 0.38104213
Iteration 2538, loss = 0.38084743
Iteration 2539, loss = 0.38062248
Iteration 2540, loss = 0.38044014
Iteration 2541, loss = 0.38022919
Iteration 2542, loss = 0.38002612
Iteration 2543, loss = 0.37981171
Iteration 2544, loss = 0.37963838
Iteration 2545, loss = 0.37943579
Iteration 2546, loss = 0.37924246
Iteration 2547, loss = 0.37903227
Iteration 2548, loss = 0.37885622
Iteration 2549, loss = 0.37865223
Iteration 2550, loss = 0.37846076
Iteration 2551, loss = 0.37825109
Iteration 2552, loss = 0.37805259
Iteration 2553, loss = 0.37787125
Iteration 2554, loss = 0.37768355
Iteration 2555, loss = 0.37748471
Iteration 2556, loss = 0.37727589
Iteration 2557, loss = 0.37709567
Iteration 2558, loss = 0.37691099
Iteration 2559, loss = 0.37672964
Iteration 2560, loss = 0.37650795
Iteration 2561, loss = 0.37631771
Iteration 2562, loss = 0.37615010
Iteration 2563, loss = 0.37595846
Iteration 2564, loss = 0.37575116
Iteration 2565, loss = 0.37560595
Iteration 2566, loss = 0.37537406
Iteration 2567, loss = 0.37522221
Iteration 2568, loss = 0.37503129
Iteration 2569, loss = 0.37482902
Iteration 2570, loss = 0.37463555
Iteration 2571, loss = 0.37446315
Iteration 2572, loss = 0.37425711
Iteration 2573, loss = 0.37406882
Iteration 2574, loss = 0.37390035
Iteration 2575, loss = 0.37372290
Iteration 2576, loss = 0.37353496
Iteration 2577, loss = 0.37335247
Iteration 2578, loss = 0.37318825
Iteration 2579, loss = 0.37302022
Iteration 2580, loss = 0.37279972
Iteration 2581, loss = 0.37265154
Iteration 2582, loss = 0.37246739
Iteration 2583, loss = 0.37229295
Iteration 2584, loss = 0.37211110
Iteration 2585, loss = 0.37190472
Iteration 2586, loss = 0.37175608
Iteration 2587, loss = 0.37155942
Iteration 2588, loss = 0.37136528
Iteration 2589, loss = 0.37122055
Iteration 2590, loss = 0.37102462
Iteration 2591, loss = 0.37088644
Iteration 2592, loss = 0.37067362
Iteration 2593, loss = 0.37052638
Iteration 2594, loss = 0.37033250
Iteration 2595, loss = 0.37015315
Iteration 2596, loss = 0.36998218
Iteration 2597, loss = 0.36981885
Iteration 2598, loss = 0.36965007
Iteration 2599, loss = 0.36944418
Iteration 2600, loss = 0.36930915
Iteration 2601, loss = 0.36911636
Iteration 2602, loss = 0.36893012
Iteration 2603, loss = 0.36875808
Iteration 2604, loss = 0.36858598
Iteration 2605, loss = 0.36841768
Iteration 2606, loss = 0.36824789
Iteration 2607, loss = 0.36808131
Iteration 2608, loss = 0.36791083
Iteration 2609, loss = 0.36773670
Iteration 2610, loss = 0.36755255
Iteration 2611, loss = 0.36741665
Iteration 2612, loss = 0.36723772
Iteration 2613, loss = 0.36707019
Iteration 2614, loss = 0.36690793
Iteration 2615, loss = 0.36672326
Iteration 2616, loss = 0.36653728
Iteration 2617, loss = 0.36637494
Iteration 2618, loss = 0.36619666
Iteration 2619, loss = 0.36603588
Iteration 2620, loss = 0.36585899
Iteration 2621, loss = 0.36567595
Iteration 2622, loss = 0.36549271
Iteration 2623, loss = 0.36533723
Iteration 2624, loss = 0.36515140
Iteration 2625, loss = 0.36499756
Iteration 2626, loss = 0.36480512
Iteration 2627, loss = 0.36462139
Iteration 2628, loss = 0.36449657
Iteration 2629, loss = 0.36430941
Iteration 2630, loss = 0.36411606
Iteration 2631, loss = 0.36395622
Iteration 2632, loss = 0.36376723
Iteration 2633, loss = 0.36358715
Iteration 2634, loss = 0.36342483
Iteration 2635, loss = 0.36325088
Iteration 2636, loss = 0.36308872
Iteration 2637, loss = 0.36291724
Iteration 2638, loss = 0.36273424
Iteration 2639, loss = 0.36260109
Iteration 2640, loss = 0.36240815
Iteration 2641, loss = 0.36224174
Iteration 2642, loss = 0.36207951
Iteration 2643, loss = 0.36189084
Iteration 2644, loss = 0.36174464
Iteration 2645, loss = 0.36156215
Iteration 2646, loss = 0.36139409
Iteration 2647, loss = 0.36124400
Iteration 2648, loss = 0.36104990
Iteration 2649, loss = 0.36089629
Iteration 2650, loss = 0.36080355
Iteration 2651, loss = 0.36055364
Iteration 2652, loss = 0.36047974
Iteration 2653, loss = 0.36021932
Iteration 2654, loss = 0.36013413
Iteration 2655, loss = 0.35987411
Iteration 2656, loss = 0.35977842
Iteration 2657, loss = 0.35962524
Iteration 2658, loss = 0.35949067
Iteration 2659, loss = 0.35928655
Iteration 2660, loss = 0.35907151
Iteration 2661, loss = 0.35892456
Iteration 2662, loss = 0.35874324
Iteration 2663, loss = 0.35860866
Iteration 2664, loss = 0.35845377
Iteration 2665, loss = 0.35830190
Iteration 2666, loss = 0.35811337
Iteration 2667, loss = 0.35793995
Iteration 2668, loss = 0.35777843
Iteration 2669, loss = 0.35760756
Iteration 2670, loss = 0.35744485
Iteration 2671, loss = 0.35728546
Iteration 2672, loss = 0.35714049
Iteration 2673, loss = 0.35698232
Iteration 2674, loss = 0.35679048
Iteration 2675, loss = 0.35663810
Iteration 2676, loss = 0.35650993
Iteration 2677, loss = 0.35633247
Iteration 2678, loss = 0.35620241
Iteration 2679, loss = 0.35600692
Iteration 2680, loss = 0.35585037
Iteration 2681, loss = 0.35569581
Iteration 2682, loss = 0.35549979
Iteration 2683, loss = 0.35535776
Iteration 2684, loss = 0.35523223
Iteration 2685, loss = 0.35505401
Iteration 2686, loss = 0.35488274
Iteration 2687, loss = 0.35471789
Iteration 2688, loss = 0.35456194
Iteration 2689, loss = 0.35440477
Iteration 2690, loss = 0.35421892
Iteration 2691, loss = 0.35405436
Iteration 2692, loss = 0.35394571
Iteration 2693, loss = 0.35377383
Iteration 2694, loss = 0.35361971
Iteration 2695, loss = 0.35345081
Iteration 2696, loss = 0.35334689
Iteration 2697, loss = 0.35312860
Iteration 2698, loss = 0.35309161
Iteration 2699, loss = 0.35280785
Iteration 2700, loss = 0.35275867
Iteration 2701, loss = 0.35251091
Iteration 2702, loss = 0.35246276
Iteration 2703, loss = 0.35220306
Iteration 2704, loss = 0.35205962
Iteration 2705, loss = 0.35187261
Iteration 2706, loss = 0.35177462
Iteration 2707, loss = 0.35161235
Iteration 2708, loss = 0.35142111
Iteration 2709, loss = 0.35134679
Iteration 2710, loss = 0.35110893
Iteration 2711, loss = 0.35093898
Iteration 2712, loss = 0.35082073
Iteration 2713, loss = 0.35064962
Iteration 2714, loss = 0.35046853
Iteration 2715, loss = 0.35033528
Iteration 2716, loss = 0.35024515
Iteration 2717, loss = 0.35003827
Iteration 2718, loss = 0.34988944
Iteration 2719, loss = 0.34971476
Iteration 2720, loss = 0.34955454
Iteration 2721, loss = 0.34939612
Iteration 2722, loss = 0.34923194
Iteration 2723, loss = 0.34909528
Iteration 2724, loss = 0.34898041
Iteration 2725, loss = 0.34878159
Iteration 2726, loss = 0.34873304
Iteration 2727, loss = 0.34849397
Iteration 2728, loss = 0.34833612
Iteration 2729, loss = 0.34822037
Iteration 2730, loss = 0.34801928
Iteration 2731, loss = 0.34788247
Iteration 2732, loss = 0.34771679
Iteration 2733, loss = 0.34755890
Iteration 2734, loss = 0.34742875
Iteration 2735, loss = 0.34724530
Iteration 2736, loss = 0.34718530
Iteration 2737, loss = 0.34694925
Iteration 2738, loss = 0.34679860
Iteration 2739, loss = 0.34664533
Iteration 2740, loss = 0.34655420
Iteration 2741, loss = 0.34640670
Iteration 2742, loss = 0.34618789
Iteration 2743, loss = 0.34618362
Iteration 2744, loss = 0.34588755
Iteration 2745, loss = 0.34581789
Iteration 2746, loss = 0.34559196
Iteration 2747, loss = 0.34555698
Iteration 2748, loss = 0.34528982
Iteration 2749, loss = 0.34514881
Iteration 2750, loss = 0.34498016
Iteration 2751, loss = 0.34474596
Iteration 2752, loss = 0.34457748
Iteration 2753, loss = 0.34445774
Iteration 2754, loss = 0.34430134
Iteration 2755, loss = 0.34411485
Iteration 2756, loss = 0.34400842
Iteration 2757, loss = 0.34385545
Iteration 2758, loss = 0.34368683
Iteration 2759, loss = 0.34359027
Iteration 2760, loss = 0.34337608
Iteration 2761, loss = 0.34321779
Iteration 2762, loss = 0.34304705
Iteration 2763, loss = 0.34293405
Iteration 2764, loss = 0.34275243
Iteration 2765, loss = 0.34256968
Iteration 2766, loss = 0.34247608
Iteration 2767, loss = 0.34227667
Iteration 2768, loss = 0.34212455
Iteration 2769, loss = 0.34194890
Iteration 2770, loss = 0.34180350
Iteration 2771, loss = 0.34163762
Iteration 2772, loss = 0.34152751
Iteration 2773, loss = 0.34137659
Iteration 2774, loss = 0.34120730
Iteration 2775, loss = 0.34108237
Iteration 2776, loss = 0.34085923
Iteration 2777, loss = 0.34073709
Iteration 2778, loss = 0.34060170
Iteration 2779, loss = 0.34042715
Iteration 2780, loss = 0.34028286
Iteration 2781, loss = 0.34014911
Iteration 2782, loss = 0.33996036
Iteration 2783, loss = 0.33979702
Iteration 2784, loss = 0.33964505
Iteration 2785, loss = 0.33948431
Iteration 2786, loss = 0.33933488
Iteration 2787, loss = 0.33922546
Iteration 2788, loss = 0.33899891
Iteration 2789, loss = 0.33887357
Iteration 2790, loss = 0.33876259
Iteration 2791, loss = 0.33855389
Iteration 2792, loss = 0.33848159
Iteration 2793, loss = 0.33823834
Iteration 2794, loss = 0.33824320
Iteration 2795, loss = 0.33796018
Iteration 2796, loss = 0.33796933
Iteration 2797, loss = 0.33771499
Iteration 2798, loss = 0.33751627
Iteration 2799, loss = 0.33756759
Iteration 2800, loss = 0.33721637
Iteration 2801, loss = 0.33732639
Iteration 2802, loss = 0.33691614
Iteration 2803, loss = 0.33700687
Iteration 2804, loss = 0.33656561
Iteration 2805, loss = 0.33661425
Iteration 2806, loss = 0.33625960
Iteration 2807, loss = 0.33621071
Iteration 2808, loss = 0.33605616
Iteration 2809, loss = 0.33585901
Iteration 2810, loss = 0.33578350
Iteration 2811, loss = 0.33556825
Iteration 2812, loss = 0.33549999
Iteration 2813, loss = 0.33524811
Iteration 2814, loss = 0.33518464
Iteration 2815, loss = 0.33489980
Iteration 2816, loss = 0.33484866
Iteration 2817, loss = 0.33469212
Iteration 2818, loss = 0.33450294
Iteration 2819, loss = 0.33443971
Iteration 2820, loss = 0.33424852
Iteration 2821, loss = 0.33410667
Iteration 2822, loss = 0.33390443
Iteration 2823, loss = 0.33383823
Iteration 2824, loss = 0.33353507
Iteration 2825, loss = 0.33353215
Iteration 2826, loss = 0.33328144
Iteration 2827, loss = 0.33318764
Iteration 2828, loss = 0.33302737
Iteration 2829, loss = 0.33287725
Iteration 2830, loss = 0.33273299
Iteration 2831, loss = 0.33255819
Iteration 2832, loss = 0.33244422
Iteration 2833, loss = 0.33219970
Iteration 2834, loss = 0.33214052
Iteration 2835, loss = 0.33191360
Iteration 2836, loss = 0.33185393
Iteration 2837, loss = 0.33167945
Iteration 2838, loss = 0.33151894
Iteration 2839, loss = 0.33141658
Iteration 2840, loss = 0.33123502
Iteration 2841, loss = 0.33112579
Iteration 2842, loss = 0.33091612
Iteration 2843, loss = 0.33077698
Iteration 2844, loss = 0.33058025
Iteration 2845, loss = 0.33045465
Iteration 2846, loss = 0.33031268
Iteration 2847, loss = 0.33017090
Iteration 2848, loss = 0.33004360
Iteration 2849, loss = 0.32982743
Iteration 2850, loss = 0.32971822
Iteration 2851, loss = 0.32956902
Iteration 2852, loss = 0.32947924
Iteration 2853, loss = 0.32924619
Iteration 2854, loss = 0.32926377
Iteration 2855, loss = 0.32896887
Iteration 2856, loss = 0.32903081
Iteration 2857, loss = 0.32866928
Iteration 2858, loss = 0.32859775
Iteration 2859, loss = 0.32836501
Iteration 2860, loss = 0.32823732
Iteration 2861, loss = 0.32820513
Iteration 2862, loss = 0.32795625
Iteration 2863, loss = 0.32793510
Iteration 2864, loss = 0.32772276
Iteration 2865, loss = 0.32757555
Iteration 2866, loss = 0.32738925
Iteration 2867, loss = 0.32734952
Iteration 2868, loss = 0.32705641
Iteration 2869, loss = 0.32695528
Iteration 2870, loss = 0.32679074
Iteration 2871, loss = 0.32663268
Iteration 2872, loss = 0.32651401
Iteration 2873, loss = 0.32633069
Iteration 2874, loss = 0.32620550
Iteration 2875, loss = 0.32608148
Iteration 2876, loss = 0.32591715
Iteration 2877, loss = 0.32577707
Iteration 2878, loss = 0.32564684
Iteration 2879, loss = 0.32547938
Iteration 2880, loss = 0.32529722
Iteration 2881, loss = 0.32515796
Iteration 2882, loss = 0.32498313
Iteration 2883, loss = 0.32486256
Iteration 2884, loss = 0.32469709
Iteration 2885, loss = 0.32453245
Iteration 2886, loss = 0.32440944
Iteration 2887, loss = 0.32425250
Iteration 2888, loss = 0.32408647
Iteration 2889, loss = 0.32399901
Iteration 2890, loss = 0.32382320
Iteration 2891, loss = 0.32367610
Iteration 2892, loss = 0.32352924
Iteration 2893, loss = 0.32337700
Iteration 2894, loss = 0.32324922
Iteration 2895, loss = 0.32309751
Iteration 2896, loss = 0.32294496
Iteration 2897, loss = 0.32283201
Iteration 2898, loss = 0.32266170
Iteration 2899, loss = 0.32251908
Iteration 2900, loss = 0.32240179
Iteration 2901, loss = 0.32220040
Iteration 2902, loss = 0.32216190
Iteration 2903, loss = 0.32190112
Iteration 2904, loss = 0.32180559
Iteration 2905, loss = 0.32166325
Iteration 2906, loss = 0.32148139
Iteration 2907, loss = 0.32138716
Iteration 2908, loss = 0.32119102
Iteration 2909, loss = 0.32105916
Iteration 2910, loss = 0.32091600
Iteration 2911, loss = 0.32073037
Iteration 2912, loss = 0.32058315
Iteration 2913, loss = 0.32043548
Iteration 2914, loss = 0.32032558
Iteration 2915, loss = 0.32013664
Iteration 2916, loss = 0.32000626
Iteration 2917, loss = 0.31984081
Iteration 2918, loss = 0.31970156
Iteration 2919, loss = 0.31958660
Iteration 2920, loss = 0.31946762
Iteration 2921, loss = 0.31926376
Iteration 2922, loss = 0.31912141
Iteration 2923, loss = 0.31897794
Iteration 2924, loss = 0.31888778
Iteration 2925, loss = 0.31867193
Iteration 2926, loss = 0.31852552
Iteration 2927, loss = 0.31838240
Iteration 2928, loss = 0.31824146
Iteration 2929, loss = 0.31813055
Iteration 2930, loss = 0.31795366
Iteration 2931, loss = 0.31782139
Iteration 2932, loss = 0.31767799
Iteration 2933, loss = 0.31752218
Iteration 2934, loss = 0.31741090
Iteration 2935, loss = 0.31721960
Iteration 2936, loss = 0.31708334
Iteration 2937, loss = 0.31697456
Iteration 2938, loss = 0.31683383
Iteration 2939, loss = 0.31666333
Iteration 2940, loss = 0.31649977
Iteration 2941, loss = 0.31637533
Iteration 2942, loss = 0.31627011
Iteration 2943, loss = 0.31608711
Iteration 2944, loss = 0.31606217
Iteration 2945, loss = 0.31580131
Iteration 2946, loss = 0.31567240
Iteration 2947, loss = 0.31555128
Iteration 2948, loss = 0.31538446
Iteration 2949, loss = 0.31525109
Iteration 2950, loss = 0.31507632
Iteration 2951, loss = 0.31493383
Iteration 2952, loss = 0.31479638
Iteration 2953, loss = 0.31466105
Iteration 2954, loss = 0.31453541
Iteration 2955, loss = 0.31435576
Iteration 2956, loss = 0.31432481
Iteration 2957, loss = 0.31408047
Iteration 2958, loss = 0.31395674
Iteration 2959, loss = 0.31384633
Iteration 2960, loss = 0.31365977
Iteration 2961, loss = 0.31352782
Iteration 2962, loss = 0.31340252
Iteration 2963, loss = 0.31323019
Iteration 2964, loss = 0.31311012
Iteration 2965, loss = 0.31294633
Iteration 2966, loss = 0.31284181
Iteration 2967, loss = 0.31273924
Iteration 2968, loss = 0.31253852
Iteration 2969, loss = 0.31241245
Iteration 2970, loss = 0.31229366
Iteration 2971, loss = 0.31208781
Iteration 2972, loss = 0.31198695
Iteration 2973, loss = 0.31183233
Iteration 2974, loss = 0.31167922
Iteration 2975, loss = 0.31156223
Iteration 2976, loss = 0.31142381
Iteration 2977, loss = 0.31125795
Iteration 2978, loss = 0.31113928
Iteration 2979, loss = 0.31101521
Iteration 2980, loss = 0.31084214
Iteration 2981, loss = 0.31070179
Iteration 2982, loss = 0.31055506
Iteration 2983, loss = 0.31043082
Iteration 2984, loss = 0.31030388
Iteration 2985, loss = 0.31014886
Iteration 2986, loss = 0.30999974
Iteration 2987, loss = 0.30988852
Iteration 2988, loss = 0.30974015
Iteration 2989, loss = 0.30959541
Iteration 2990, loss = 0.30949072
Iteration 2991, loss = 0.30935665
Iteration 2992, loss = 0.30916489
Iteration 2993, loss = 0.30909457
Iteration 2994, loss = 0.30892424
Iteration 2995, loss = 0.30879740
Iteration 2996, loss = 0.30865102
Iteration 2997, loss = 0.30847902
Iteration 2998, loss = 0.30834668
Iteration 2999, loss = 0.30822318
Iteration 3000, loss = 0.30807925
Iteration 3001, loss = 0.30796069
Iteration 3002, loss = 0.30781036
Iteration 3003, loss = 0.30768833
Iteration 3004, loss = 0.30750182
Iteration 3005, loss = 0.30736014
Iteration 3006, loss = 0.30724610
Iteration 3007, loss = 0.30711570
Iteration 3008, loss = 0.30696724
Iteration 3009, loss = 0.30682456
Iteration 3010, loss = 0.30669603
Iteration 3011, loss = 0.30655865
Iteration 3012, loss = 0.30643030
Iteration 3013, loss = 0.30630375
Iteration 3014, loss = 0.30617043
Iteration 3015, loss = 0.30608765
Iteration 3016, loss = 0.30587345
Iteration 3017, loss = 0.30579486
Iteration 3018, loss = 0.30571037
Iteration 3019, loss = 0.30553478
Iteration 3020, loss = 0.30542296
Iteration 3021, loss = 0.30521596
Iteration 3022, loss = 0.30516275
Iteration 3023, loss = 0.30493776
Iteration 3024, loss = 0.30493117
Iteration 3025, loss = 0.30475218
Iteration 3026, loss = 0.30454770
Iteration 3027, loss = 0.30447668
Iteration 3028, loss = 0.30430708
Iteration 3029, loss = 0.30420241
Iteration 3030, loss = 0.30403256
Iteration 3031, loss = 0.30386372
Iteration 3032, loss = 0.30375563
Iteration 3033, loss = 0.30359596
Iteration 3034, loss = 0.30348795
Iteration 3035, loss = 0.30337691
Iteration 3036, loss = 0.30320431
Iteration 3037, loss = 0.30318906
Iteration 3038, loss = 0.30296393
Iteration 3039, loss = 0.30283188
Iteration 3040, loss = 0.30275177
Iteration 3041, loss = 0.30257878
Iteration 3042, loss = 0.30246335
Iteration 3043, loss = 0.30230176
Iteration 3044, loss = 0.30219130
Iteration 3045, loss = 0.30207648
Iteration 3046, loss = 0.30192816
Iteration 3047, loss = 0.30181161
Iteration 3048, loss = 0.30165248
Iteration 3049, loss = 0.30152475
Iteration 3050, loss = 0.30140624
Iteration 3051, loss = 0.30129401
Iteration 3052, loss = 0.30114634
Iteration 3053, loss = 0.30101117
Iteration 3054, loss = 0.30087696
Iteration 3055, loss = 0.30076074
Iteration 3056, loss = 0.30064290
Iteration 3057, loss = 0.30051002
Iteration 3058, loss = 0.30036929
Iteration 3059, loss = 0.30023687
Iteration 3060, loss = 0.30009713
Iteration 3061, loss = 0.29995040
Iteration 3062, loss = 0.29982917
Iteration 3063, loss = 0.29972588
Iteration 3064, loss = 0.29956827
Iteration 3065, loss = 0.29942503
Iteration 3066, loss = 0.29932227
Iteration 3067, loss = 0.29923885
Iteration 3068, loss = 0.29906932
Iteration 3069, loss = 0.29896350
Iteration 3070, loss = 0.29887102
Iteration 3071, loss = 0.29867191
Iteration 3072, loss = 0.29867340
Iteration 3073, loss = 0.29850733
Iteration 3074, loss = 0.29833681
Iteration 3075, loss = 0.29830777
Iteration 3076, loss = 0.29801311
Iteration 3077, loss = 0.29804050
Iteration 3078, loss = 0.29787490
Iteration 3079, loss = 0.29765649
Iteration 3080, loss = 0.29769364
Iteration 3081, loss = 0.29737825
Iteration 3082, loss = 0.29749853
Iteration 3083, loss = 0.29723237
Iteration 3084, loss = 0.29707712
Iteration 3085, loss = 0.29709191
Iteration 3086, loss = 0.29676423
Iteration 3087, loss = 0.29676716
Iteration 3088, loss = 0.29652003
Iteration 3089, loss = 0.29648719
Iteration 3090, loss = 0.29645007
Iteration 3091, loss = 0.29619051
Iteration 3092, loss = 0.29621832
Iteration 3093, loss = 0.29590823
Iteration 3094, loss = 0.29587714
Iteration 3095, loss = 0.29571410
Iteration 3096, loss = 0.29554031
Iteration 3097, loss = 0.29553294
Iteration 3098, loss = 0.29530970
Iteration 3099, loss = 0.29527922
Iteration 3100, loss = 0.29505966
Iteration 3101, loss = 0.29495040
Iteration 3102, loss = 0.29485629
Iteration 3103, loss = 0.29465132
Iteration 3104, loss = 0.29459469
Iteration 3105, loss = 0.29439139
Iteration 3106, loss = 0.29432911
Iteration 3107, loss = 0.29423223
Iteration 3108, loss = 0.29404609
Iteration 3109, loss = 0.29401607
Iteration 3110, loss = 0.29382359
Iteration 3111, loss = 0.29370141
Iteration 3112, loss = 0.29357422
Iteration 3113, loss = 0.29342260
Iteration 3114, loss = 0.29330754
Iteration 3115, loss = 0.29319120
Iteration 3116, loss = 0.29307155
Iteration 3117, loss = 0.29296653
Iteration 3118, loss = 0.29284427
Iteration 3119, loss = 0.29275505
Iteration 3120, loss = 0.29256901
Iteration 3121, loss = 0.29249844
Iteration 3122, loss = 0.29235919
Iteration 3123, loss = 0.29223804
Iteration 3124, loss = 0.29214254
Iteration 3125, loss = 0.29196238
Iteration 3126, loss = 0.29187065
Iteration 3127, loss = 0.29177298
Iteration 3128, loss = 0.29164365
Iteration 3129, loss = 0.29151417
Iteration 3130, loss = 0.29136943
Iteration 3131, loss = 0.29126295
Iteration 3132, loss = 0.29117050
Iteration 3133, loss = 0.29103226
Iteration 3134, loss = 0.29093231
Iteration 3135, loss = 0.29080865
Iteration 3136, loss = 0.29066790
Iteration 3137, loss = 0.29056049
Iteration 3138, loss = 0.29043380
Iteration 3139, loss = 0.29034671
Iteration 3140, loss = 0.29023336
Iteration 3141, loss = 0.29009405
Iteration 3142, loss = 0.29006006
Iteration 3143, loss = 0.28980961
Iteration 3144, loss = 0.28981618
Iteration 3145, loss = 0.28968839
Iteration 3146, loss = 0.28952477
Iteration 3147, loss = 0.28949082
Iteration 3148, loss = 0.28926417
Iteration 3149, loss = 0.28918973
Iteration 3150, loss = 0.28903067
Iteration 3151, loss = 0.28887900
Iteration 3152, loss = 0.28880059
Iteration 3153, loss = 0.28866638
Iteration 3154, loss = 0.28861693
Iteration 3155, loss = 0.28845912
Iteration 3156, loss = 0.28832949
Iteration 3157, loss = 0.28822610
Iteration 3158, loss = 0.28807777
Iteration 3159, loss = 0.28798203
Iteration 3160, loss = 0.28785043
Iteration 3161, loss = 0.28771887
Iteration 3162, loss = 0.28772187
Iteration 3163, loss = 0.28746877
Iteration 3164, loss = 0.28755257
Iteration 3165, loss = 0.28733840
Iteration 3166, loss = 0.28718976
Iteration 3167, loss = 0.28711288
Iteration 3168, loss = 0.28694398
Iteration 3169, loss = 0.28691326
Iteration 3170, loss = 0.28670929
Iteration 3171, loss = 0.28663840
Iteration 3172, loss = 0.28646850
Iteration 3173, loss = 0.28637272
Iteration 3174, loss = 0.28626165
Iteration 3175, loss = 0.28616145
Iteration 3176, loss = 0.28602359
Iteration 3177, loss = 0.28584849
Iteration 3178, loss = 0.28578076
Iteration 3179, loss = 0.28569218
Iteration 3180, loss = 0.28552210
Iteration 3181, loss = 0.28543414
Iteration 3182, loss = 0.28534037
Iteration 3183, loss = 0.28523163
Iteration 3184, loss = 0.28510511
Iteration 3185, loss = 0.28494006
Iteration 3186, loss = 0.28490031
Iteration 3187, loss = 0.28472346
Iteration 3188, loss = 0.28465485
Iteration 3189, loss = 0.28458064
Iteration 3190, loss = 0.28441765
Iteration 3191, loss = 0.28434233
Iteration 3192, loss = 0.28414789
Iteration 3193, loss = 0.28408089
Iteration 3194, loss = 0.28395586
Iteration 3195, loss = 0.28380293
Iteration 3196, loss = 0.28373321
Iteration 3197, loss = 0.28357063
Iteration 3198, loss = 0.28352330
Iteration 3199, loss = 0.28338682
Iteration 3200, loss = 0.28322993
Iteration 3201, loss = 0.28320444
Iteration 3202, loss = 0.28302163
Iteration 3203, loss = 0.28291881
Iteration 3204, loss = 0.28283347
Iteration 3205, loss = 0.28268400
Iteration 3206, loss = 0.28257372
Iteration 3207, loss = 0.28244658
Iteration 3208, loss = 0.28227245
Iteration 3209, loss = 0.28215716
Iteration 3210, loss = 0.28196742
Iteration 3211, loss = 0.28185234
Iteration 3212, loss = 0.28171116
Iteration 3213, loss = 0.28156928
Iteration 3214, loss = 0.28142305
Iteration 3215, loss = 0.28126012
Iteration 3216, loss = 0.28114994
Iteration 3217, loss = 0.28099951
Iteration 3218, loss = 0.28086417
Iteration 3219, loss = 0.28070702
Iteration 3220, loss = 0.28059539
Iteration 3221, loss = 0.28043780
Iteration 3222, loss = 0.28030377
Iteration 3223, loss = 0.28014920
Iteration 3224, loss = 0.28002105
Iteration 3225, loss = 0.27988377
Iteration 3226, loss = 0.27976951
Iteration 3227, loss = 0.27961778
Iteration 3228, loss = 0.27948749
Iteration 3229, loss = 0.27931989
Iteration 3230, loss = 0.27918309
Iteration 3231, loss = 0.27906688
Iteration 3232, loss = 0.27890994
Iteration 3233, loss = 0.27879327
Iteration 3234, loss = 0.27867903
Iteration 3235, loss = 0.27854279
Iteration 3236, loss = 0.27838097
Iteration 3237, loss = 0.27824299
Iteration 3238, loss = 0.27814107
Iteration 3239, loss = 0.27800209
Iteration 3240, loss = 0.27788405
Iteration 3241, loss = 0.27773311
Iteration 3242, loss = 0.27760257
Iteration 3243, loss = 0.27746794
Iteration 3244, loss = 0.27735558
Iteration 3245, loss = 0.27719679
Iteration 3246, loss = 0.27709407
Iteration 3247, loss = 0.27696115
Iteration 3248, loss = 0.27681934
Iteration 3249, loss = 0.27668568
Iteration 3250, loss = 0.27657189
Iteration 3251, loss = 0.27642393
Iteration 3252, loss = 0.27631815
Iteration 3253, loss = 0.27623489
Iteration 3254, loss = 0.27603830
Iteration 3255, loss = 0.27600755
Iteration 3256, loss = 0.27580351
Iteration 3257, loss = 0.27568125
Iteration 3258, loss = 0.27558594
Iteration 3259, loss = 0.27529995
Iteration 3260, loss = 0.27497495
Iteration 3261, loss = 0.27454623
Iteration 3262, loss = 0.27419618
Iteration 3263, loss = 0.27389447
Iteration 3264, loss = 0.27362309
Iteration 3265, loss = 0.27347427
Iteration 3266, loss = 0.27332465
Iteration 3267, loss = 0.27321289
Iteration 3268, loss = 0.27307755
Iteration 3269, loss = 0.27297599
Iteration 3270, loss = 0.27286297
Iteration 3271, loss = 0.27274434
Iteration 3272, loss = 0.27260670
Iteration 3273, loss = 0.27245253
Iteration 3274, loss = 0.27227744
Iteration 3275, loss = 0.27210064
Iteration 3276, loss = 0.27192940
Iteration 3277, loss = 0.27173930
Iteration 3278, loss = 0.27157427
Iteration 3279, loss = 0.27141955
Iteration 3280, loss = 0.27125953
Iteration 3281, loss = 0.27111921
Iteration 3282, loss = 0.27098464
Iteration 3283, loss = 0.27088562
Iteration 3284, loss = 0.27072413
Iteration 3285, loss = 0.27059061
Iteration 3286, loss = 0.27058201
Iteration 3287, loss = 0.27029567
Iteration 3288, loss = 0.27038109
Iteration 3289, loss = 0.27021148
Iteration 3290, loss = 0.26991791
Iteration 3291, loss = 0.27002507
Iteration 3292, loss = 0.26968350
Iteration 3293, loss = 0.26958511
Iteration 3294, loss = 0.26948356
Iteration 3295, loss = 0.26918542
Iteration 3296, loss = 0.26910199
Iteration 3297, loss = 0.26891675
Iteration 3298, loss = 0.26885134
Iteration 3299, loss = 0.26884803
Iteration 3300, loss = 0.26851357
Iteration 3301, loss = 0.26863646
Iteration 3302, loss = 0.26835796
Iteration 3303, loss = 0.26816362
Iteration 3304, loss = 0.26819382
Iteration 3305, loss = 0.26787432
Iteration 3306, loss = 0.26782076
Iteration 3307, loss = 0.26766109
Iteration 3308, loss = 0.26744130
Iteration 3309, loss = 0.26732500
Iteration 3310, loss = 0.26723249
Iteration 3311, loss = 0.26706610
Iteration 3312, loss = 0.26693349
Iteration 3313, loss = 0.26681218
Iteration 3314, loss = 0.26670840
Iteration 3315, loss = 0.26653801
Iteration 3316, loss = 0.26648528
Iteration 3317, loss = 0.26634299
Iteration 3318, loss = 0.26615780
Iteration 3319, loss = 0.26611294
Iteration 3320, loss = 0.26590146
Iteration 3321, loss = 0.26576663
Iteration 3322, loss = 0.26577559
Iteration 3323, loss = 0.26552060
Iteration 3324, loss = 0.26537731
Iteration 3325, loss = 0.26530541
Iteration 3326, loss = 0.26515271
Iteration 3327, loss = 0.26501206
Iteration 3328, loss = 0.26492726
Iteration 3329, loss = 0.26479210
Iteration 3330, loss = 0.26461463
Iteration 3331, loss = 0.26458626
Iteration 3332, loss = 0.26440777
Iteration 3333, loss = 0.26426436
Iteration 3334, loss = 0.26418198
Iteration 3335, loss = 0.26399752
Iteration 3336, loss = 0.26387439
Iteration 3337, loss = 0.26376364
Iteration 3338, loss = 0.26362828
Iteration 3339, loss = 0.26349206
Iteration 3340, loss = 0.26338201
Iteration 3341, loss = 0.26321787
Iteration 3342, loss = 0.26326736
Iteration 3343, loss = 0.26296315
Iteration 3344, loss = 0.26299814
Iteration 3345, loss = 0.26288243
Iteration 3346, loss = 0.26264369
Iteration 3347, loss = 0.26269392
Iteration 3348, loss = 0.26237729
Iteration 3349, loss = 0.26231741
Iteration 3350, loss = 0.26216688
Iteration 3351, loss = 0.26197057
Iteration 3352, loss = 0.26189480
Iteration 3353, loss = 0.26171165
Iteration 3354, loss = 0.26164025
Iteration 3355, loss = 0.26160644
Iteration 3356, loss = 0.26136495
Iteration 3357, loss = 0.26137533
Iteration 3358, loss = 0.26117964
Iteration 3359, loss = 0.26103230
Iteration 3360, loss = 0.26100914
Iteration 3361, loss = 0.26075085
Iteration 3362, loss = 0.26066397
Iteration 3363, loss = 0.26052409
Iteration 3364, loss = 0.26037532
Iteration 3365, loss = 0.26029631
Iteration 3366, loss = 0.26011630
Iteration 3367, loss = 0.26011389
Iteration 3368, loss = 0.25990790
Iteration 3369, loss = 0.25976947
Iteration 3370, loss = 0.25972606
Iteration 3371, loss = 0.25952920
Iteration 3372, loss = 0.25940461
Iteration 3373, loss = 0.25929153
Iteration 3374, loss = 0.25916424
Iteration 3375, loss = 0.25904396
Iteration 3376, loss = 0.25890101
Iteration 3377, loss = 0.25874374
Iteration 3378, loss = 0.25862102
Iteration 3379, loss = 0.25851997
Iteration 3380, loss = 0.25836020
Iteration 3381, loss = 0.25823410
Iteration 3382, loss = 0.25808029
Iteration 3383, loss = 0.25792663
Iteration 3384, loss = 0.25780648
Iteration 3385, loss = 0.25765817
Iteration 3386, loss = 0.25751125
Iteration 3387, loss = 0.25733996
Iteration 3388, loss = 0.25713285
Iteration 3389, loss = 0.25696417
Iteration 3390, loss = 0.25675038
Iteration 3391, loss = 0.25658219
Iteration 3392, loss = 0.25639469
Iteration 3393, loss = 0.25613636
Iteration 3394, loss = 0.25594015
Iteration 3395, loss = 0.25567284
Iteration 3396, loss = 0.25540091
Iteration 3397, loss = 0.25506338
Iteration 3398, loss = 0.25475065
Iteration 3399, loss = 0.25439483
Iteration 3400, loss = 0.25404170
Iteration 3401, loss = 0.25363496
Iteration 3402, loss = 0.25338369
Iteration 3403, loss = 0.25312599
Iteration 3404, loss = 0.25280146
Iteration 3405, loss = 0.25274384
Iteration 3406, loss = 0.25246684
Iteration 3407, loss = 0.25231271
Iteration 3408, loss = 0.25212566
Iteration 3409, loss = 0.25190085
Iteration 3410, loss = 0.25185728
Iteration 3411, loss = 0.25164166
Iteration 3412, loss = 0.25162164
Iteration 3413, loss = 0.25143781
Iteration 3414, loss = 0.25126646
Iteration 3415, loss = 0.25133465
Iteration 3416, loss = 0.25099362
Iteration 3417, loss = 0.25098402
Iteration 3418, loss = 0.25075573
Iteration 3419, loss = 0.25062961
Iteration 3420, loss = 0.25060427
Iteration 3421, loss = 0.25025248
Iteration 3422, loss = 0.25031595
Iteration 3423, loss = 0.24994894
Iteration 3424, loss = 0.24978780
Iteration 3425, loss = 0.24956340
Iteration 3426, loss = 0.24926026
Iteration 3427, loss = 0.24910622
Iteration 3428, loss = 0.24873916
Iteration 3429, loss = 0.24847764
Iteration 3430, loss = 0.24812259
Iteration 3431, loss = 0.24771681
Iteration 3432, loss = 0.24731421
Iteration 3433, loss = 0.24679849
Iteration 3434, loss = 0.24612483
Iteration 3435, loss = 0.24550816
Iteration 3436, loss = 0.24484703
Iteration 3437, loss = 0.24436938
Iteration 3438, loss = 0.24391639
Iteration 3439, loss = 0.24360623
Iteration 3440, loss = 0.24337032
Iteration 3441, loss = 0.24305929
Iteration 3442, loss = 0.24292777
Iteration 3443, loss = 0.24280188
Iteration 3444, loss = 0.24270262
Iteration 3445, loss = 0.24254329
Iteration 3446, loss = 0.24238568
Iteration 3447, loss = 0.24223181
Iteration 3448, loss = 0.24208726
Iteration 3449, loss = 0.24196336
Iteration 3450, loss = 0.24190708
Iteration 3451, loss = 0.24165931
Iteration 3452, loss = 0.24159515
Iteration 3453, loss = 0.24142345
Iteration 3454, loss = 0.24118139
Iteration 3455, loss = 0.24107749
Iteration 3456, loss = 0.24079735
Iteration 3457, loss = 0.24074388
Iteration 3458, loss = 0.24048049
Iteration 3459, loss = 0.24031971
Iteration 3460, loss = 0.24015502
Iteration 3461, loss = 0.24000291
Iteration 3462, loss = 0.23981512
Iteration 3463, loss = 0.23961911
Iteration 3464, loss = 0.23944752
Iteration 3465, loss = 0.23928239
Iteration 3466, loss = 0.23913036
Iteration 3467, loss = 0.23892066
Iteration 3468, loss = 0.23877740
Iteration 3469, loss = 0.23865942
Iteration 3470, loss = 0.23844017
Iteration 3471, loss = 0.23825187
Iteration 3472, loss = 0.23809304
Iteration 3473, loss = 0.23790362
Iteration 3474, loss = 0.23774774
Iteration 3475, loss = 0.23758778
Iteration 3476, loss = 0.23742208
Iteration 3477, loss = 0.23724380
Iteration 3478, loss = 0.23706921
Iteration 3479, loss = 0.23688811
Iteration 3480, loss = 0.23672711
Iteration 3481, loss = 0.23655334
Iteration 3482, loss = 0.23636458
Iteration 3483, loss = 0.23622519
Iteration 3484, loss = 0.23602859
Iteration 3485, loss = 0.23592916
Iteration 3486, loss = 0.23594419
Iteration 3487, loss = 0.23559737
Iteration 3488, loss = 0.23566040
Iteration 3489, loss = 0.23533320
Iteration 3490, loss = 0.23515994
Iteration 3491, loss = 0.23514176
Iteration 3492, loss = 0.23477303
Iteration 3493, loss = 0.23474527
Iteration 3494, loss = 0.23448150
Iteration 3495, loss = 0.23439176
Iteration 3496, loss = 0.23425828
Iteration 3497, loss = 0.23397810
Iteration 3498, loss = 0.23401174
Iteration 3499, loss = 0.23370992
Iteration 3500, loss = 0.23356626
Iteration 3501, loss = 0.23337726
Iteration 3502, loss = 0.23318411
Iteration 3503, loss = 0.23309671
Iteration 3504, loss = 0.23285937
Iteration 3505, loss = 0.23280156
Iteration 3506, loss = 0.23283514
Iteration 3507, loss = 0.23247606
Iteration 3508, loss = 0.23256044
Iteration 3509, loss = 0.23212892
Iteration 3510, loss = 0.23208413
Iteration 3511, loss = 0.23179018
Iteration 3512, loss = 0.23174384
Iteration 3513, loss = 0.23151911
Iteration 3514, loss = 0.23132882
Iteration 3515, loss = 0.23129518
Iteration 3516, loss = 0.23098694
Iteration 3517, loss = 0.23089255
Iteration 3518, loss = 0.23071766
Iteration 3519, loss = 0.23054650
Iteration 3520, loss = 0.23034827
Iteration 3521, loss = 0.23029057
Iteration 3522, loss = 0.23008015
Iteration 3523, loss = 0.22994991
Iteration 3524, loss = 0.22977959
Iteration 3525, loss = 0.22959449
Iteration 3526, loss = 0.22950799
Iteration 3527, loss = 0.22928608
Iteration 3528, loss = 0.22914891
Iteration 3529, loss = 0.22904856
Iteration 3530, loss = 0.22888635
Iteration 3531, loss = 0.22872049
Iteration 3532, loss = 0.22860803
Iteration 3533, loss = 0.22837327
Iteration 3534, loss = 0.22816750
Iteration 3535, loss = 0.22809384
Iteration 3536, loss = 0.22795789
Iteration 3537, loss = 0.22768938
Iteration 3538, loss = 0.22756447
Iteration 3539, loss = 0.22742518
Iteration 3540, loss = 0.22727185
Iteration 3541, loss = 0.22709697
Iteration 3542, loss = 0.22691860
Iteration 3543, loss = 0.22683566
Iteration 3544, loss = 0.22666016
Iteration 3545, loss = 0.22644118
Iteration 3546, loss = 0.22637955
Iteration 3547, loss = 0.22641380
Iteration 3548, loss = 0.22610619
Iteration 3549, loss = 0.22618572
Iteration 3550, loss = 0.22577237
Iteration 3551, loss = 0.22570388
Iteration 3552, loss = 0.22542643
Iteration 3553, loss = 0.22524300
Iteration 3554, loss = 0.22513956
Iteration 3555, loss = 0.22489286
Iteration 3556, loss = 0.22493632
Iteration 3557, loss = 0.22474354
Iteration 3558, loss = 0.22451329
Iteration 3559, loss = 0.22442445
Iteration 3560, loss = 0.22415578
Iteration 3561, loss = 0.22418351
Iteration 3562, loss = 0.22384032
Iteration 3563, loss = 0.22391360
Iteration 3564, loss = 0.22370039
Iteration 3565, loss = 0.22361809
Iteration 3566, loss = 0.22339662
Iteration 3567, loss = 0.22314419
Iteration 3568, loss = 0.22304671
Iteration 3569, loss = 0.22294541
Iteration 3570, loss = 0.22279334
Iteration 3571, loss = 0.22256768
Iteration 3572, loss = 0.22251173
Iteration 3573, loss = 0.22223566
Iteration 3574, loss = 0.22205317
Iteration 3575, loss = 0.22199755
Iteration 3576, loss = 0.22179837
Iteration 3577, loss = 0.22149369
Iteration 3578, loss = 0.22155066
Iteration 3579, loss = 0.22138929
Iteration 3580, loss = 0.22107266
Iteration 3581, loss = 0.22111029
Iteration 3582, loss = 0.22097968
Iteration 3583, loss = 0.22057430
Iteration 3584, loss = 0.22066994
Iteration 3585, loss = 0.22047802
Iteration 3586, loss = 0.22028248
Iteration 3587, loss = 0.22019897
Iteration 3588, loss = 0.21992520
Iteration 3589, loss = 0.21990344
Iteration 3590, loss = 0.21958075
Iteration 3591, loss = 0.21961294
Iteration 3592, loss = 0.21921354
Iteration 3593, loss = 0.21926109
Iteration 3594, loss = 0.21896410
Iteration 3595, loss = 0.21881997
Iteration 3596, loss = 0.21867888
Iteration 3597, loss = 0.21853563
Iteration 3598, loss = 0.21833320
Iteration 3599, loss = 0.21814593
Iteration 3600, loss = 0.21804686
Iteration 3601, loss = 0.21788767
Iteration 3602, loss = 0.21772621
Iteration 3603, loss = 0.21761019
Iteration 3604, loss = 0.21741128
Iteration 3605, loss = 0.21725542
Iteration 3606, loss = 0.21709288
Iteration 3607, loss = 0.21696274
Iteration 3608, loss = 0.21681298
Iteration 3609, loss = 0.21667088
Iteration 3610, loss = 0.21648961
Iteration 3611, loss = 0.21633886
Iteration 3612, loss = 0.21625740
Iteration 3613, loss = 0.21605309
Iteration 3614, loss = 0.21591867
Iteration 3615, loss = 0.21577572
Iteration 3616, loss = 0.21558469
Iteration 3617, loss = 0.21549675
Iteration 3618, loss = 0.21535386
Iteration 3619, loss = 0.21518109
Iteration 3620, loss = 0.21501712
Iteration 3621, loss = 0.21491206
Iteration 3622, loss = 0.21473166
Iteration 3623, loss = 0.21458956
Iteration 3624, loss = 0.21445765
Iteration 3625, loss = 0.21424647
Iteration 3626, loss = 0.21417888
Iteration 3627, loss = 0.21398999
Iteration 3628, loss = 0.21381537
Iteration 3629, loss = 0.21370765
Iteration 3630, loss = 0.21349537
Iteration 3631, loss = 0.21336570
Iteration 3632, loss = 0.21321629
Iteration 3633, loss = 0.21305238
Iteration 3634, loss = 0.21293874
Iteration 3635, loss = 0.21277108
Iteration 3636, loss = 0.21265219
Iteration 3637, loss = 0.21251491
Iteration 3638, loss = 0.21230933
Iteration 3639, loss = 0.21226626
Iteration 3640, loss = 0.21200434
Iteration 3641, loss = 0.21194581
Iteration 3642, loss = 0.21176445
Iteration 3643, loss = 0.21155400
Iteration 3644, loss = 0.21140876
Iteration 3645, loss = 0.21125530
Iteration 3646, loss = 0.21111053
Iteration 3647, loss = 0.21094311
Iteration 3648, loss = 0.21079088
Iteration 3649, loss = 0.21063709
Iteration 3650, loss = 0.21048451
Iteration 3651, loss = 0.21033557
Iteration 3652, loss = 0.21019055
Iteration 3653, loss = 0.21001609
Iteration 3654, loss = 0.20989672
Iteration 3655, loss = 0.20973847
Iteration 3656, loss = 0.20957315
Iteration 3657, loss = 0.20945651
Iteration 3658, loss = 0.20925312
Iteration 3659, loss = 0.20914787
Iteration 3660, loss = 0.20901965
Iteration 3661, loss = 0.20880648
Iteration 3662, loss = 0.20872637
Iteration 3663, loss = 0.20854079
Iteration 3664, loss = 0.20834305
Iteration 3665, loss = 0.20824521
Iteration 3666, loss = 0.20804494
Iteration 3667, loss = 0.20791971
Iteration 3668, loss = 0.20778370
Iteration 3669, loss = 0.20756781
Iteration 3670, loss = 0.20744809
Iteration 3671, loss = 0.20731399
Iteration 3672, loss = 0.20711319
Iteration 3673, loss = 0.20705738
Iteration 3674, loss = 0.20689084
Iteration 3675, loss = 0.20664750
Iteration 3676, loss = 0.20654251
Iteration 3677, loss = 0.20634473
Iteration 3678, loss = 0.20619948
Iteration 3679, loss = 0.20604595
Iteration 3680, loss = 0.20587407
Iteration 3681, loss = 0.20571969
Iteration 3682, loss = 0.20557121
Iteration 3683, loss = 0.20540973
Iteration 3684, loss = 0.20523756
Iteration 3685, loss = 0.20514749
Iteration 3686, loss = 0.20493351
Iteration 3687, loss = 0.20480689
Iteration 3688, loss = 0.20459084
Iteration 3689, loss = 0.20426311
Iteration 3690, loss = 0.20401001
Iteration 3691, loss = 0.20373028
Iteration 3692, loss = 0.20354980
Iteration 3693, loss = 0.20335871
Iteration 3694, loss = 0.20323238
Iteration 3695, loss = 0.20317683
Iteration 3696, loss = 0.20293732
Iteration 3697, loss = 0.20295284
Iteration 3698, loss = 0.20270057
Iteration 3699, loss = 0.20250163
Iteration 3700, loss = 0.20236207
Iteration 3701, loss = 0.20207677
Iteration 3702, loss = 0.20197717
Iteration 3703, loss = 0.20173969
Iteration 3704, loss = 0.20161895
Iteration 3705, loss = 0.20144703
Iteration 3706, loss = 0.20126170
Iteration 3707, loss = 0.20115869
Iteration 3708, loss = 0.20091607
Iteration 3709, loss = 0.20079288
Iteration 3710, loss = 0.20064527
Iteration 3711, loss = 0.20042748
Iteration 3712, loss = 0.20031263
Iteration 3713, loss = 0.20011389
Iteration 3714, loss = 0.19996052
Iteration 3715, loss = 0.19980899
Iteration 3716, loss = 0.19961238
Iteration 3717, loss = 0.19948213
Iteration 3718, loss = 0.19939842
Iteration 3719, loss = 0.19917766
Iteration 3720, loss = 0.19910037
Iteration 3721, loss = 0.19882243
Iteration 3722, loss = 0.19873343
Iteration 3723, loss = 0.19851671
Iteration 3724, loss = 0.19838476
Iteration 3725, loss = 0.19819636
Iteration 3726, loss = 0.19800629
Iteration 3727, loss = 0.19787974
Iteration 3728, loss = 0.19771693
Iteration 3729, loss = 0.19761306
Iteration 3730, loss = 0.19750358
Iteration 3731, loss = 0.19721066
Iteration 3732, loss = 0.19707573
Iteration 3733, loss = 0.19693586
Iteration 3734, loss = 0.19680143
Iteration 3735, loss = 0.19663352
Iteration 3736, loss = 0.19645321
Iteration 3737, loss = 0.19630391
Iteration 3738, loss = 0.19622862
Iteration 3739, loss = 0.19607493
Iteration 3740, loss = 0.19585046
Iteration 3741, loss = 0.19577776
Iteration 3742, loss = 0.19552253
Iteration 3743, loss = 0.19538301
Iteration 3744, loss = 0.19517961
Iteration 3745, loss = 0.19498547
Iteration 3746, loss = 0.19487650
Iteration 3747, loss = 0.19464184
Iteration 3748, loss = 0.19455480
Iteration 3749, loss = 0.19445797
Iteration 3750, loss = 0.19418880
Iteration 3751, loss = 0.19412320
Iteration 3752, loss = 0.19392316
Iteration 3753, loss = 0.19378212
Iteration 3754, loss = 0.19372194
Iteration 3755, loss = 0.19340772
Iteration 3756, loss = 0.19335090
Iteration 3757, loss = 0.19310172
Iteration 3758, loss = 0.19302828
Iteration 3759, loss = 0.19281811
Iteration 3760, loss = 0.19263319
Iteration 3761, loss = 0.19256327
Iteration 3762, loss = 0.19237884
Iteration 3763, loss = 0.19223902
Iteration 3764, loss = 0.19214555
Iteration 3765, loss = 0.19189869
Iteration 3766, loss = 0.19175131
Iteration 3767, loss = 0.19166231
Iteration 3768, loss = 0.19145835
Iteration 3769, loss = 0.19133135
Iteration 3770, loss = 0.19112426
Iteration 3771, loss = 0.19101188
Iteration 3772, loss = 0.19079018
Iteration 3773, loss = 0.19068651
Iteration 3774, loss = 0.19053660
Iteration 3775, loss = 0.19032227
Iteration 3776, loss = 0.19021175
Iteration 3777, loss = 0.19010150
Iteration 3778, loss = 0.19004792
Iteration 3779, loss = 0.18987209
Iteration 3780, loss = 0.18975172
Iteration 3781, loss = 0.18963114
Iteration 3782, loss = 0.18933178
Iteration 3783, loss = 0.18941531
Iteration 3784, loss = 0.18912635
Iteration 3785, loss = 0.18901504
Iteration 3786, loss = 0.18892744
Iteration 3787, loss = 0.18868110
Iteration 3788, loss = 0.18887534
Iteration 3789, loss = 0.18848446
Iteration 3790, loss = 0.18834685
Iteration 3791, loss = 0.18814537
Iteration 3792, loss = 0.18795856
Iteration 3793, loss = 0.18776513
Iteration 3794, loss = 0.18761696
Iteration 3795, loss = 0.18749871
Iteration 3796, loss = 0.18732413
Iteration 3797, loss = 0.18721636
Iteration 3798, loss = 0.18703014
Iteration 3799, loss = 0.18689216
Iteration 3800, loss = 0.18671863
Iteration 3801, loss = 0.18655274
Iteration 3802, loss = 0.18649026
Iteration 3803, loss = 0.18633793
Iteration 3804, loss = 0.18617393
Iteration 3805, loss = 0.18598778
Iteration 3806, loss = 0.18592282
Iteration 3807, loss = 0.18580944
Iteration 3808, loss = 0.18560761
Iteration 3809, loss = 0.18541956
Iteration 3810, loss = 0.18530110
Iteration 3811, loss = 0.18523584
Iteration 3812, loss = 0.18513037
Iteration 3813, loss = 0.18501941
Iteration 3814, loss = 0.18481417
Iteration 3815, loss = 0.18465978
Iteration 3816, loss = 0.18455058
Iteration 3817, loss = 0.18437693
Iteration 3818, loss = 0.18434584
Iteration 3819, loss = 0.18408816
Iteration 3820, loss = 0.18392668
Iteration 3821, loss = 0.18387612
Iteration 3822, loss = 0.18368201
Iteration 3823, loss = 0.18369846
Iteration 3824, loss = 0.18346668
Iteration 3825, loss = 0.18333811
Iteration 3826, loss = 0.18316557
Iteration 3827, loss = 0.18299095
Iteration 3828, loss = 0.18331607
Iteration 3829, loss = 0.18299390
Iteration 3830, loss = 0.18292190
Iteration 3831, loss = 0.18265506
Iteration 3832, loss = 0.18254462
Iteration 3833, loss = 0.18262886
Iteration 3834, loss = 0.18214216
Iteration 3835, loss = 0.18214571
Iteration 3836, loss = 0.18188662
Iteration 3837, loss = 0.18167964
Iteration 3838, loss = 0.18164075
Iteration 3839, loss = 0.18148315
Iteration 3840, loss = 0.18119533
Iteration 3841, loss = 0.18107641
Iteration 3842, loss = 0.18086376
Iteration 3843, loss = 0.18079338
Iteration 3844, loss = 0.18072391
Iteration 3845, loss = 0.18049640
Iteration 3846, loss = 0.18043159
Iteration 3847, loss = 0.18017229
Iteration 3848, loss = 0.17990519
Iteration 3849, loss = 0.18010732
Iteration 3850, loss = 0.17967996
Iteration 3851, loss = 0.17946193
Iteration 3852, loss = 0.17954799
Iteration 3853, loss = 0.17938571
Iteration 3854, loss = 0.17923941
Iteration 3855, loss = 0.17920732
Iteration 3856, loss = 0.17890287
Iteration 3857, loss = 0.17884941
Iteration 3858, loss = 0.17861523
Iteration 3859, loss = 0.17841091
Iteration 3860, loss = 0.17853691
Iteration 3861, loss = 0.17828609
Iteration 3862, loss = 0.17823642
Iteration 3863, loss = 0.17802062
Iteration 3864, loss = 0.17790067
Iteration 3865, loss = 0.17755623
Iteration 3866, loss = 0.17752942
Iteration 3867, loss = 0.17730998
Iteration 3868, loss = 0.17725101
Iteration 3869, loss = 0.17708374
Iteration 3870, loss = 0.17692818
Iteration 3871, loss = 0.17689973
Iteration 3872, loss = 0.17676376
Iteration 3873, loss = 0.17654160
Iteration 3874, loss = 0.17635878
Iteration 3875, loss = 0.17640687
Iteration 3876, loss = 0.17623963
Iteration 3877, loss = 0.17602476
Iteration 3878, loss = 0.17597041
Iteration 3879, loss = 0.17566852
Iteration 3880, loss = 0.17565761
Iteration 3881, loss = 0.17570233
Iteration 3882, loss = 0.17537694
Iteration 3883, loss = 0.17537369
Iteration 3884, loss = 0.17514948
Iteration 3885, loss = 0.17490209
Iteration 3886, loss = 0.17503280
Iteration 3887, loss = 0.17471822
Iteration 3888, loss = 0.17467549
Iteration 3889, loss = 0.17456140
Iteration 3890, loss = 0.17436124
Iteration 3891, loss = 0.17415271
Iteration 3892, loss = 0.17437016
Iteration 3893, loss = 0.17410025
Iteration 3894, loss = 0.17383988
Iteration 3895, loss = 0.17389598
Iteration 3896, loss = 0.17353268
Iteration 3897, loss = 0.17359098
Iteration 3898, loss = 0.17329493
Iteration 3899, loss = 0.17331444
Iteration 3900, loss = 0.17309605
Iteration 3901, loss = 0.17302856
Iteration 3902, loss = 0.17293044
Iteration 3903, loss = 0.17274411
Iteration 3904, loss = 0.17265754
Iteration 3905, loss = 0.17252146
Iteration 3906, loss = 0.17237021
Iteration 3907, loss = 0.17225501
Iteration 3908, loss = 0.17223373
Iteration 3909, loss = 0.17212971
Iteration 3910, loss = 0.17198569
Iteration 3911, loss = 0.17185782
Iteration 3912, loss = 0.17170573
Iteration 3913, loss = 0.17163447
Iteration 3914, loss = 0.17144720
Iteration 3915, loss = 0.17142651
Iteration 3916, loss = 0.17124759
Iteration 3917, loss = 0.17110778
Iteration 3918, loss = 0.17121878
Iteration 3919, loss = 0.17089151
Iteration 3920, loss = 0.17075984
Iteration 3921, loss = 0.17067111
Iteration 3922, loss = 0.17052288
Iteration 3923, loss = 0.17041807
Iteration 3924, loss = 0.17042160
Iteration 3925, loss = 0.17029236
Iteration 3926, loss = 0.17014566
Iteration 3927, loss = 0.17011144
Iteration 3928, loss = 0.16992175
Iteration 3929, loss = 0.16983963
Iteration 3930, loss = 0.16966475
Iteration 3931, loss = 0.16954057
Iteration 3932, loss = 0.16944381
Iteration 3933, loss = 0.16926357
Iteration 3934, loss = 0.16923887
Iteration 3935, loss = 0.16915703
Iteration 3936, loss = 0.16897976
Iteration 3937, loss = 0.16889642
Iteration 3938, loss = 0.16872381
Iteration 3939, loss = 0.16867541
Iteration 3940, loss = 0.16852891
Iteration 3941, loss = 0.16845481
Iteration 3942, loss = 0.16831575
Iteration 3943, loss = 0.16819104
Iteration 3944, loss = 0.16810910
Iteration 3945, loss = 0.16799355
Iteration 3946, loss = 0.16788614
Iteration 3947, loss = 0.16773654
Iteration 3948, loss = 0.16772226
Iteration 3949, loss = 0.16761860
Iteration 3950, loss = 0.16746717
Iteration 3951, loss = 0.16736544
Iteration 3952, loss = 0.16723048
Iteration 3953, loss = 0.16712940
Iteration 3954, loss = 0.16696314
Iteration 3955, loss = 0.16688654
Iteration 3956, loss = 0.16677566
Iteration 3957, loss = 0.16668564
Iteration 3958, loss = 0.16657643
Iteration 3959, loss = 0.16648670
Iteration 3960, loss = 0.16637557
Iteration 3961, loss = 0.16623912
Iteration 3962, loss = 0.16617374
Iteration 3963, loss = 0.16602864
Iteration 3964, loss = 0.16593453
Iteration 3965, loss = 0.16580462
Iteration 3966, loss = 0.16586956
Iteration 3967, loss = 0.16565621
Iteration 3968, loss = 0.16564885
Iteration 3969, loss = 0.16543904
Iteration 3970, loss = 0.16533921
Iteration 3971, loss = 0.16551961
Iteration 3972, loss = 0.16517814
Iteration 3973, loss = 0.16506146
Iteration 3974, loss = 0.16498754
Iteration 3975, loss = 0.16482085
Iteration 3976, loss = 0.16477313
Iteration 3977, loss = 0.16462131
Iteration 3978, loss = 0.16460749
Iteration 3979, loss = 0.16456774
Iteration 3980, loss = 0.16435889
Iteration 3981, loss = 0.16427055
Iteration 3982, loss = 0.16415666
Iteration 3983, loss = 0.16401105
Iteration 3984, loss = 0.16394792
Iteration 3985, loss = 0.16389244
Iteration 3986, loss = 0.16366952
Iteration 3987, loss = 0.16358579
Iteration 3988, loss = 0.16358533
Iteration 3989, loss = 0.16342078
Iteration 3990, loss = 0.16330544
Iteration 3991, loss = 0.16317574
Iteration 3992, loss = 0.16306504
Iteration 3993, loss = 0.16293934
Iteration 3994, loss = 0.16284799
Iteration 3995, loss = 0.16273427
Iteration 3996, loss = 0.16261862
Iteration 3997, loss = 0.16247754
Iteration 3998, loss = 0.16247069
Iteration 3999, loss = 0.16239028
Iteration 4000, loss = 0.16223250
Iteration 4001, loss = 0.16210515
Iteration 4002, loss = 0.16211040
Iteration 4003, loss = 0.16190082
Iteration 4004, loss = 0.16185450
Iteration 4005, loss = 0.16173628
Iteration 4006, loss = 0.16158331
Iteration 4007, loss = 0.16164234
Iteration 4008, loss = 0.16140883
Iteration 4009, loss = 0.16135625
Iteration 4010, loss = 0.16129410
Iteration 4011, loss = 0.16115542
Iteration 4012, loss = 0.16103664
Iteration 4013, loss = 0.16091394
Iteration 4014, loss = 0.16088360
Iteration 4015, loss = 0.16097120
Iteration 4016, loss = 0.16071429
Iteration 4017, loss = 0.16071564
Iteration 4018, loss = 0.16046141
Iteration 4019, loss = 0.16045532
Iteration 4020, loss = 0.16022927
Iteration 4021, loss = 0.16023017
Iteration 4022, loss = 0.16007428
Iteration 4023, loss = 0.15996474
Iteration 4024, loss = 0.15986630
Iteration 4025, loss = 0.15974283
Iteration 4026, loss = 0.15964289
Iteration 4027, loss = 0.15954363
Iteration 4028, loss = 0.15943573
Iteration 4029, loss = 0.15933293
Iteration 4030, loss = 0.15928340
Iteration 4031, loss = 0.15917297
Iteration 4032, loss = 0.15905939
Iteration 4033, loss = 0.15897813
Iteration 4034, loss = 0.15885075
Iteration 4035, loss = 0.15878939
Iteration 4036, loss = 0.15873243
Iteration 4037, loss = 0.15856627
Iteration 4038, loss = 0.15847472
Iteration 4039, loss = 0.15852118
Iteration 4040, loss = 0.15828564
Iteration 4041, loss = 0.15821430
Iteration 4042, loss = 0.15814803
Iteration 4043, loss = 0.15797825
Iteration 4044, loss = 0.15794462
Iteration 4045, loss = 0.15780654
Iteration 4046, loss = 0.15773297
Iteration 4047, loss = 0.15767022
Iteration 4048, loss = 0.15754213
Iteration 4049, loss = 0.15739788
Iteration 4050, loss = 0.15754723
Iteration 4051, loss = 0.15738557
Iteration 4052, loss = 0.15719661
Iteration 4053, loss = 0.15720646
Iteration 4054, loss = 0.15704686
Iteration 4055, loss = 0.15694908
Iteration 4056, loss = 0.15686760
Iteration 4057, loss = 0.15667521
Iteration 4058, loss = 0.15685536
Iteration 4059, loss = 0.15659544
Iteration 4060, loss = 0.15658055
Iteration 4061, loss = 0.15636830
Iteration 4062, loss = 0.15638497
Iteration 4063, loss = 0.15612935
Iteration 4064, loss = 0.15614590
Iteration 4065, loss = 0.15618678
Iteration 4066, loss = 0.15603550
Iteration 4067, loss = 0.15589834
Iteration 4068, loss = 0.15584522
Iteration 4069, loss = 0.15571513
Iteration 4070, loss = 0.15559923
Iteration 4071, loss = 0.15545651
Iteration 4072, loss = 0.15536979
Iteration 4073, loss = 0.15541548
Iteration 4074, loss = 0.15530085
Iteration 4075, loss = 0.15513607
Iteration 4076, loss = 0.15515262
Iteration 4077, loss = 0.15491353
Iteration 4078, loss = 0.15480516
Iteration 4079, loss = 0.15489377
Iteration 4080, loss = 0.15463164
Iteration 4081, loss = 0.15455477
Iteration 4082, loss = 0.15448299
Iteration 4083, loss = 0.15439408
Iteration 4084, loss = 0.15432843
Iteration 4085, loss = 0.15415083
Iteration 4086, loss = 0.15410149
Iteration 4087, loss = 0.15396230
Iteration 4088, loss = 0.15384153
Iteration 4089, loss = 0.15404907
Iteration 4090, loss = 0.15370716
Iteration 4091, loss = 0.15357292
Iteration 4092, loss = 0.15356118
Iteration 4093, loss = 0.15342852
Iteration 4094, loss = 0.15334954
Iteration 4095, loss = 0.15325602
Iteration 4096, loss = 0.15311309
Iteration 4097, loss = 0.15323430
Iteration 4098, loss = 0.15297960
Iteration 4099, loss = 0.15298053
Iteration 4100, loss = 0.15275380
Iteration 4101, loss = 0.15269174
Iteration 4102, loss = 0.15258016
Iteration 4103, loss = 0.15245326
Iteration 4104, loss = 0.15241485
Iteration 4105, loss = 0.15224979
Iteration 4106, loss = 0.15214242
Iteration 4107, loss = 0.15218091
Iteration 4108, loss = 0.15204846
Iteration 4109, loss = 0.15191018
Iteration 4110, loss = 0.15187063
Iteration 4111, loss = 0.15173889
Iteration 4112, loss = 0.15158257
Iteration 4113, loss = 0.15157676
Iteration 4114, loss = 0.15149167
Iteration 4115, loss = 0.15138029
Iteration 4116, loss = 0.15133536
Iteration 4117, loss = 0.15113090
Iteration 4118, loss = 0.15111319
Iteration 4119, loss = 0.15106734
Iteration 4120, loss = 0.15083186
Iteration 4121, loss = 0.15081206
Iteration 4122, loss = 0.15074765
Iteration 4123, loss = 0.15062700
Iteration 4124, loss = 0.15050722
Iteration 4125, loss = 0.15055530
Iteration 4126, loss = 0.15036126
Iteration 4127, loss = 0.15025786
Iteration 4128, loss = 0.15021989
Iteration 4129, loss = 0.15011923
Iteration 4130, loss = 0.15000946
Iteration 4131, loss = 0.14992122
Iteration 4132, loss = 0.14987849
Iteration 4133, loss = 0.14978255
Iteration 4134, loss = 0.14967164
Iteration 4135, loss = 0.14958875
Iteration 4136, loss = 0.14949395
Iteration 4137, loss = 0.14948584
Iteration 4138, loss = 0.14950159
Iteration 4139, loss = 0.14930159
Iteration 4140, loss = 0.14927095
Iteration 4141, loss = 0.14914401
Iteration 4142, loss = 0.14900623
Iteration 4143, loss = 0.14924320
Iteration 4144, loss = 0.14903210
Iteration 4145, loss = 0.14888967
Iteration 4146, loss = 0.14904873
Iteration 4147, loss = 0.14866300
Iteration 4148, loss = 0.14886553
Iteration 4149, loss = 0.14861910
Iteration 4150, loss = 0.14849815
Iteration 4151, loss = 0.14842604
Iteration 4152, loss = 0.14820582
Iteration 4153, loss = 0.14832209
Iteration 4154, loss = 0.14807590
Iteration 4155, loss = 0.14795418
Iteration 4156, loss = 0.14801644
Iteration 4157, loss = 0.14789506
Iteration 4158, loss = 0.14786542
Iteration 4159, loss = 0.14777437
Iteration 4160, loss = 0.14757329
Iteration 4161, loss = 0.14752596
Iteration 4162, loss = 0.14743616
Iteration 4163, loss = 0.14729004
Iteration 4164, loss = 0.14721711
Iteration 4165, loss = 0.14714070
Iteration 4166, loss = 0.14712089
Iteration 4167, loss = 0.14700801
Iteration 4168, loss = 0.14691275
Iteration 4169, loss = 0.14682160
Iteration 4170, loss = 0.14682238
Iteration 4171, loss = 0.14665562
Iteration 4172, loss = 0.14660774
Iteration 4173, loss = 0.14651884
Iteration 4174, loss = 0.14641388
Iteration 4175, loss = 0.14635514
Iteration 4176, loss = 0.14626942
Iteration 4177, loss = 0.14614157
Iteration 4178, loss = 0.14605653
Iteration 4179, loss = 0.14613033
Iteration 4180, loss = 0.14596538
Iteration 4181, loss = 0.14581253
Iteration 4182, loss = 0.14586850
Iteration 4183, loss = 0.14585028
Iteration 4184, loss = 0.14566374
Iteration 4185, loss = 0.14574624
Iteration 4186, loss = 0.14555157
Iteration 4187, loss = 0.14556220
Iteration 4188, loss = 0.14539681
Iteration 4189, loss = 0.14530108
Iteration 4190, loss = 0.14527182
Iteration 4191, loss = 0.14507809
Iteration 4192, loss = 0.14505403
Iteration 4193, loss = 0.14502749
Iteration 4194, loss = 0.14490590
Iteration 4195, loss = 0.14486196
Iteration 4196, loss = 0.14465650
Iteration 4197, loss = 0.14474817
Iteration 4198, loss = 0.14450923
Iteration 4199, loss = 0.14460292
Iteration 4200, loss = 0.14438181
Iteration 4201, loss = 0.14440764
Iteration 4202, loss = 0.14425710
Iteration 4203, loss = 0.14423520
Iteration 4204, loss = 0.14421499
Iteration 4205, loss = 0.14400726
Iteration 4206, loss = 0.14425005
Iteration 4207, loss = 0.14393538
Iteration 4208, loss = 0.14389633
Iteration 4209, loss = 0.14377738
Iteration 4210, loss = 0.14367432
Iteration 4211, loss = 0.14379001
Iteration 4212, loss = 0.14348009
Iteration 4213, loss = 0.14363773
Iteration 4214, loss = 0.14346066
Iteration 4215, loss = 0.14336475
Iteration 4216, loss = 0.14341105
Iteration 4217, loss = 0.14312342
Iteration 4218, loss = 0.14322939
Iteration 4219, loss = 0.14288099
Iteration 4220, loss = 0.14322518
Iteration 4221, loss = 0.14301029
Iteration 4222, loss = 0.14280418
Iteration 4223, loss = 0.14285060
Iteration 4224, loss = 0.14254435
Iteration 4225, loss = 0.14263316
Iteration 4226, loss = 0.14241407
Iteration 4227, loss = 0.14232664
Iteration 4228, loss = 0.14234734
Iteration 4229, loss = 0.14224384
Iteration 4230, loss = 0.14212579
Iteration 4231, loss = 0.14206968
Iteration 4232, loss = 0.14201861
Iteration 4233, loss = 0.14186685
Iteration 4234, loss = 0.14178292
Iteration 4235, loss = 0.14174888
Iteration 4236, loss = 0.14178651
Iteration 4237, loss = 0.14161140
Iteration 4238, loss = 0.14157392
Iteration 4239, loss = 0.14153497
Iteration 4240, loss = 0.14136235
Iteration 4241, loss = 0.14134096
Iteration 4242, loss = 0.14131952
Iteration 4243, loss = 0.14125143
Iteration 4244, loss = 0.14113837
Iteration 4245, loss = 0.14098834
Iteration 4246, loss = 0.14103797
Iteration 4247, loss = 0.14091331
Iteration 4248, loss = 0.14076589
Iteration 4249, loss = 0.14068686
Iteration 4250, loss = 0.14063992
Iteration 4251, loss = 0.14057756
Iteration 4252, loss = 0.14046897
Iteration 4253, loss = 0.14037951
Iteration 4254, loss = 0.14035741
Iteration 4255, loss = 0.14026164
Iteration 4256, loss = 0.14013116
Iteration 4257, loss = 0.14009666
Iteration 4258, loss = 0.14011018
Iteration 4259, loss = 0.13997690
Iteration 4260, loss = 0.13986130
Iteration 4261, loss = 0.13980514
Iteration 4262, loss = 0.13974957
Iteration 4263, loss = 0.13964396
Iteration 4264, loss = 0.13957004
Iteration 4265, loss = 0.13949084
Iteration 4266, loss = 0.13948717
Iteration 4267, loss = 0.13942194
Iteration 4268, loss = 0.13932149
Iteration 4269, loss = 0.13922951
Iteration 4270, loss = 0.13918568
Iteration 4271, loss = 0.13909214
Iteration 4272, loss = 0.13902106
Iteration 4273, loss = 0.13893838
Iteration 4274, loss = 0.13883124
Iteration 4275, loss = 0.13899555
Iteration 4276, loss = 0.13878757
Iteration 4277, loss = 0.13866936
Iteration 4278, loss = 0.13866321
Iteration 4279, loss = 0.13848494
Iteration 4280, loss = 0.13853585
Iteration 4281, loss = 0.13835620
Iteration 4282, loss = 0.13846388
Iteration 4283, loss = 0.13849045
Iteration 4284, loss = 0.13825055
Iteration 4285, loss = 0.13835195
Iteration 4286, loss = 0.13804762
Iteration 4287, loss = 0.13815356
Iteration 4288, loss = 0.13796709
Iteration 4289, loss = 0.13784908
Iteration 4290, loss = 0.13779855
Iteration 4291, loss = 0.13769877
Iteration 4292, loss = 0.13774796
Iteration 4293, loss = 0.13755525
Iteration 4294, loss = 0.13746866
Iteration 4295, loss = 0.13738719
Iteration 4296, loss = 0.13726755
Iteration 4297, loss = 0.13722781
Iteration 4298, loss = 0.13722058
Iteration 4299, loss = 0.13711058
Iteration 4300, loss = 0.13704389
Iteration 4301, loss = 0.13697606
Iteration 4302, loss = 0.13692410
Iteration 4303, loss = 0.13684858
Iteration 4304, loss = 0.13676673
Iteration 4305, loss = 0.13668337
Iteration 4306, loss = 0.13660772
Iteration 4307, loss = 0.13651071
Iteration 4308, loss = 0.13644971
Iteration 4309, loss = 0.13642746
Iteration 4310, loss = 0.13635902
Training loss did not improve more than tol=0.000100 for 10 consecutive epochs. Stopping.
Out[ ]:
MLPClassifier(hidden_layer_sizes=(16, 32, 8), max_iter=5000, random_state=1,
              verbose=10)
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
MLPClassifier(hidden_layer_sizes=(16, 32, 8), max_iter=5000, random_state=1,
              verbose=10)

After finishing training, let's predict the test data:

In [ ]:
y_pred = classifier.predict(X_test)

To then calculate the accuracy:

In [ ]:
accuracy_score(Y_test, y_pred)
Out[ ]:
0.8888888888888888
In [ ]:
print(classification_report(Y_test, y_pred))
              precision    recall  f1-score   support

           0       1.00      1.00      1.00        13
           1       0.86      0.67      0.75         9
           2       1.00      1.00      1.00        10
           3       1.00      1.00      1.00         4
           4       0.78      0.78      0.78         9

   micro avg       0.93      0.89      0.91        45
   macro avg       0.93      0.89      0.91        45
weighted avg       0.93      0.89      0.91        45
 samples avg       0.89      0.89      0.89        45

/usr/local/lib/python3.10/dist-packages/sklearn/metrics/_classification.py:1344: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in samples with no predicted labels. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))

Predicting a usage and coverage map using the trained model:¶

After training the model, we will apply it to an image and generate a usage and coverage map in tiff format. Let's use the same training image, with the same bands.

In [ ]:
with rasterio.open(path) as src:
    im = src.read()
In [ ]:
out_meta = src.meta.copy()
In [ ]:
im = im.transpose([1,2,0])
In [ ]:
X = np.nan_to_num(im)
In [ ]:
flatten_X = X.reshape(X.shape[0]*X.shape[1],X.shape[2])
In [ ]:
pred = classifier.predict(flatten_X)
In [ ]:
pred
Out[ ]:
array([[1, 0, 0, 0, 0],
       [1, 0, 0, 0, 0],
       [1, 0, 0, 0, 0],
       ...,
       [0, 0, 0, 1, 0],
       [0, 0, 0, 1, 0],
       [0, 0, 0, 1, 0]])
In [ ]:
classify = np.argmax(pred, axis=1)
In [ ]:
classify = classify + 1
In [ ]:
classify = classify.reshape(X.shape[0],X.shape[1])
In [ ]:
plt.figure(figsize=[16,10])
plt.imshow(classify,cmap='tab20c')
plt.axis('off')
Out[ ]:
(-0.5, 3700.5, 1570.5, -0.5)
No description has been provided for this image
In [ ]:
export_image = classify[np.newaxis,:,:]
In [ ]:
out_meta.update({"driver": "GTiff",
                  "height": export_image.shape[1],
                  "width": export_image.shape[2],
                  "compress":'lzw',
                  "count":1
                  })
In [ ]:
with rasterio.open('/content/Netherlands_2020_LULC.tif', "w", **out_meta) as dest:
     dest.write(export_image)

Tensorflow¶

image.png

TensorFlow is an open-source framework developed by Google researchers to run machine learning, deep learning, and other statistical and predictive analytics workloads. Like similar platforms, it is designed to streamline the process of developing and running advanced analytics applications for users such as data scientists, statisticians, and predictive modelers.

TensorFlow software deals with datasets that are organized as computational nodes in graph form. The edges that connect nodes in a graph can represent vectors or multidimensional matrices, creating what we call tensors. Because TensorFlow programs use a dataflow architecture that works with generalized intermediate results of calculations, they are especially open to large-scale parallel processing applications, neural networks being a common example.

The framework includes sets of high- and low-level APIs. Google recommends using high-level ones when possible to simplify data pipeline development and application programming. However, knowing how to use the low-level APIs — called TensorFlow Core — can be valuable for experimenting and debugging applications, the company says; it also gives users a "mental model" of the inner workings of machine learning technology, in Google's words.

What is Tensor in Tensorflow?¶

TensorFlow, as the name implies, is a framework for defining and performing computations involving tensors. A tensor is a generalization of vectors and matrices to potentially higher dimensions. Internally, TensorFlow represents tensors as n-dimensional arrays of basic data types. Every element in Tensor has the same data type, and the data type is always known. The shape (i.e., the number of dimensions it has and the size of each dimension) may only be partially known. Most operations produce tensors of fully known shapes if the shapes of their inputs are also fully known, but in some cases it is only possible to find the shape of a tensor at graph run time.

image.png

Why use TensorFlow?¶

The single biggest benefit that TensorFlow offers for machine learning development is abstraction. Instead of dealing with the nitty-gritty details of implementing algorithms or figuring out suitable ways to link the output of one function to the input of another, the developer can focus on the overall logic of the application. TensorFlow takes care of the details behind the scenes.

Keras¶

image.png

Keras runs on top of open source machine libraries such as TensorFlow, Theano or Cognitive Toolkit (CNTK). Theano is a Python library used for fast numerical computation tasks. TensorFlow is the most famous symbolic mathematics library used to create neural networks and deep learning models. TensorFlow is very flexible and the main benefit is distributed computing. CNTK is a deep learning framework developed by Microsoft. It uses libraries like Python, C#, C++, or standalone machine learning toolkits. Theano and TensorFlow are very powerful but difficult to understand libraries for creating neural networks.

Keras is based on a minimal framework that provides a clean and easy way to create deep learning models based on TensorFlow or Theano. Keras is designed to quickly define deep learning models. Well, Keras is an ideal choice for deep learning applications.

Keras uses various optimization techniques to make the high-level neural network API easier and more performant. It supports the following features −

  • Consistent, simple and extensible API.

  • Minimal structure - easy to achieve results without frills.

  • Supports multiple platforms and backends.

  • It is a user-friendly framework that runs on both CPU and GPU.

  • High computing scalability.

Benefits¶

Keras is a highly powerful and dynamic framework and has the following advantages −

  • Greater community support.

  • Easy to test.

  • Keras neural networks are written in Python, which makes things simpler.

  • Keras supports convolution and recurrent networks.

  • Deep learning models are discrete components, so you can combine them in many ways.

Let's use Keras integrated with tensorflow:

Given that TensorFlow was the de facto standard backend for the Keras open source project, the integration means that a single library can now be used instead of two separate libraries. Additionally, the independent Keras project now recommends that all future Keras developments use the tf.keras API.

In [ ]:
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
In [ ]:
import numpy as np
from matplotlib import pyplot as plt
import cv2
from matplotlib import cm
import pandas as pd
import seaborn as sns
from matplotlib import pyplot as plt
from matplotlib import cm
from matplotlib.colors import ListedColormap
import geopandas as gpd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder
from sklearn.metrics import classification_report
from sklearn.metrics import confusion_matrix
In [ ]:
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
In [ ]:
samples = gpd.read_file('/content/drive/MyDrive/Datasets/LULC_Pixel_Classification/classes_de_uso.shp')
In [ ]:
samples
Out[ ]:
B11 classe B12 B8A label B1 B2 rand B3 random B4 B5 B6 B7 B8 B9 geometry
0 0.11060 3 0.04925 0.34265 agricultura 0.01640 0.02725 0.999976 0.05925 0.990412 0.03110 0.10555 0.25665 0.32045 0.32960 0.37570 POINT (-51.90767 -29.70724)
1 0.12030 3 0.05790 0.29270 agricultura 0.01665 0.02035 0.999933 0.03345 0.054960 0.02090 0.05785 0.19675 0.27700 0.21795 0.35335 POINT (-51.96534 -29.61651)
2 0.14800 3 0.07730 0.37535 agricultura 0.02515 0.04140 0.999918 0.08090 0.610448 0.06920 0.15110 0.27670 0.33200 0.32470 0.36825 POINT (-51.91476 -29.72386)
3 0.16370 3 0.10150 0.34395 agricultura 0.02620 0.04200 0.999899 0.08415 0.009023 0.07935 0.15470 0.26035 0.31145 0.33460 0.31360 POINT (-51.91099 -29.71236)
4 0.12560 3 0.06400 0.35155 agricultura 0.02160 0.03355 0.999895 0.06595 0.347600 0.04990 0.12265 0.26420 0.31950 0.31360 0.35290 POINT (-51.90767 -29.71272)
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
4995 0.34995 2 0.27430 0.25040 solo_exposto 0.05200 0.05640 0.954984 0.07990 0.986132 0.09175 0.13375 0.18910 0.21780 0.22310 0.26125 POINT (-51.93111 -29.71227)
4996 0.33655 2 0.24140 0.26275 solo_exposto 0.05255 0.07705 0.954976 0.10610 0.858527 0.14510 0.17850 0.19920 0.22540 0.23440 0.26600 POINT (-51.77894 -29.79348)
4997 0.35550 2 0.30085 0.28730 solo_exposto 0.05565 0.05985 0.954880 0.08320 0.503585 0.10155 0.17590 0.23170 0.25520 0.23840 0.28280 POINT (-51.77957 -29.78935)
4998 0.32695 2 0.20510 0.26720 solo_exposto 0.04175 0.05025 0.954838 0.07265 0.280611 0.07485 0.12135 0.20115 0.23260 0.23200 0.26075 POINT (-51.77948 -29.79150)
4999 0.31495 2 0.21080 0.30215 solo_exposto 0.05320 0.07360 0.954777 0.10425 0.767035 0.14460 0.16560 0.22965 0.25620 0.27530 0.29205 POINT (-51.81101 -29.80471)

5000 rows × 17 columns

In [ ]:
samples.columns
Out[ ]:
Index(['B11', 'classe', 'B12', 'B8A', 'label', 'B1', 'B2', 'rand', 'B3',
       'random', 'B4', 'B5', 'B6', 'B7', 'B8', 'B9', 'geometry'],
      dtype='object')
In [ ]:
X = samples[['B12', 'B8A', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8']].values
Y = samples['classe'].values
In [ ]:
Y = Y[:,np.newaxis]
In [ ]:
Y.shape
Out[ ]:
(5000, 1)
In [ ]:
enc = OneHotEncoder()

enc.fit(Y)

Y = enc.transform(Y).toarray()
In [ ]:
Y.shape
Out[ ]:
(5000, 5)
In [ ]:
Y
Out[ ]:
array([[0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 1., 0.],
       ...,
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 1., 0., 0.]])
In [ ]:
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size = 0.3, random_state = 42)

Now let's define the size of the input shape and the number of classes we have:

In [ ]:
input_shape = (X_train.shape[1:])
num_classes = len(np.unique(samples['classe'].values))
In [ ]:
input_shape
Out[ ]:
(9,)
In [ ]:
num_classes
Out[ ]:
5

So let's build our neural network by adding the dense layers:

In [ ]:
model = Sequential()
model.add(Dense(128, input_shape=input_shape, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))

In the input layer and hidden layers we use the relu activation function. Since we are dealing with a multi-class classification problem, we will use the softmax enable function in the output layer.

Now we can compile the model, defining the loss as categorical entropy, the Adam optimizer, and the validation metric as accuracy:

In [ ]:
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
In [ ]:
model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 128)               1280      
                                                                 
 dense_1 (Dense)             (None, 32)                4128      
                                                                 
 dense_2 (Dense)             (None, 8)                 264       
                                                                 
 dense_3 (Dense)             (None, 5)                 45        
                                                                 
=================================================================
Total params: 5,717
Trainable params: 5,717
Non-trainable params: 0
_________________________________________________________________

And finally, train the model for 100 iterations:

In [ ]:
history = model.fit(X_train, Y_train, epochs=300, batch_size=250, verbose=1, validation_split=0.2)
Epoch 1/300
12/12 [==============================] - 7s 23ms/step - loss: 1.5921 - accuracy: 0.2179 - val_loss: 1.5667 - val_accuracy: 0.3186
Epoch 2/300
12/12 [==============================] - 0s 5ms/step - loss: 1.5462 - accuracy: 0.3389 - val_loss: 1.5154 - val_accuracy: 0.3714
Epoch 3/300
12/12 [==============================] - 0s 5ms/step - loss: 1.4904 - accuracy: 0.3625 - val_loss: 1.4510 - val_accuracy: 0.3771
Epoch 4/300
12/12 [==============================] - 0s 6ms/step - loss: 1.4201 - accuracy: 0.3725 - val_loss: 1.3732 - val_accuracy: 0.3857
Epoch 5/300
12/12 [==============================] - 0s 6ms/step - loss: 1.3385 - accuracy: 0.3804 - val_loss: 1.2861 - val_accuracy: 0.3929
Epoch 6/300
12/12 [==============================] - 0s 5ms/step - loss: 1.2531 - accuracy: 0.3843 - val_loss: 1.2009 - val_accuracy: 0.3986
Epoch 7/300
12/12 [==============================] - 0s 5ms/step - loss: 1.1730 - accuracy: 0.3850 - val_loss: 1.1268 - val_accuracy: 0.3986
Epoch 8/300
12/12 [==============================] - 0s 5ms/step - loss: 1.1045 - accuracy: 0.3861 - val_loss: 1.0687 - val_accuracy: 0.4000
Epoch 9/300
12/12 [==============================] - 0s 5ms/step - loss: 1.0512 - accuracy: 0.4025 - val_loss: 1.0214 - val_accuracy: 0.4657
Epoch 10/300
12/12 [==============================] - 0s 5ms/step - loss: 1.0098 - accuracy: 0.4821 - val_loss: 0.9855 - val_accuracy: 0.5186
Epoch 11/300
12/12 [==============================] - 0s 6ms/step - loss: 0.9793 - accuracy: 0.5257 - val_loss: 0.9626 - val_accuracy: 0.5371
Epoch 12/300
12/12 [==============================] - 0s 5ms/step - loss: 0.9528 - accuracy: 0.5346 - val_loss: 0.9351 - val_accuracy: 0.5571
Epoch 13/300
12/12 [==============================] - 0s 5ms/step - loss: 0.9312 - accuracy: 0.5364 - val_loss: 0.9149 - val_accuracy: 0.5657
Epoch 14/300
12/12 [==============================] - 0s 5ms/step - loss: 0.9101 - accuracy: 0.5679 - val_loss: 0.8923 - val_accuracy: 0.5500
Epoch 15/300
12/12 [==============================] - 0s 5ms/step - loss: 0.8899 - accuracy: 0.6921 - val_loss: 0.8764 - val_accuracy: 0.5971
Epoch 16/300
12/12 [==============================] - 0s 5ms/step - loss: 0.8721 - accuracy: 0.7432 - val_loss: 0.8572 - val_accuracy: 0.7571
Epoch 17/300
12/12 [==============================] - 0s 5ms/step - loss: 0.8558 - accuracy: 0.7611 - val_loss: 0.8345 - val_accuracy: 0.7600
Epoch 18/300
12/12 [==============================] - 0s 5ms/step - loss: 0.8255 - accuracy: 0.7829 - val_loss: 0.8074 - val_accuracy: 0.7714
Epoch 19/300
12/12 [==============================] - 0s 5ms/step - loss: 0.7977 - accuracy: 0.7854 - val_loss: 0.7805 - val_accuracy: 0.7857
Epoch 20/300
12/12 [==============================] - 0s 5ms/step - loss: 0.7692 - accuracy: 0.8000 - val_loss: 0.7537 - val_accuracy: 0.7843
Epoch 21/300
12/12 [==============================] - 0s 5ms/step - loss: 0.7423 - accuracy: 0.7946 - val_loss: 0.7324 - val_accuracy: 0.7843
Epoch 22/300
12/12 [==============================] - 0s 5ms/step - loss: 0.7175 - accuracy: 0.7868 - val_loss: 0.7014 - val_accuracy: 0.7914
Epoch 23/300
12/12 [==============================] - 0s 5ms/step - loss: 0.6881 - accuracy: 0.7943 - val_loss: 0.6690 - val_accuracy: 0.7957
Epoch 24/300
12/12 [==============================] - 0s 5ms/step - loss: 0.6576 - accuracy: 0.7979 - val_loss: 0.6398 - val_accuracy: 0.7986
Epoch 25/300
12/12 [==============================] - 0s 5ms/step - loss: 0.6301 - accuracy: 0.8032 - val_loss: 0.6195 - val_accuracy: 0.7629
Epoch 26/300
12/12 [==============================] - 0s 5ms/step - loss: 0.6118 - accuracy: 0.7993 - val_loss: 0.5897 - val_accuracy: 0.7929
Epoch 27/300
12/12 [==============================] - 0s 5ms/step - loss: 0.5846 - accuracy: 0.8032 - val_loss: 0.5695 - val_accuracy: 0.7843
Epoch 28/300
12/12 [==============================] - 0s 5ms/step - loss: 0.5629 - accuracy: 0.8018 - val_loss: 0.5504 - val_accuracy: 0.8043
Epoch 29/300
12/12 [==============================] - 0s 5ms/step - loss: 0.5445 - accuracy: 0.8207 - val_loss: 0.5344 - val_accuracy: 0.8086
Epoch 30/300
12/12 [==============================] - 0s 5ms/step - loss: 0.5274 - accuracy: 0.8157 - val_loss: 0.5206 - val_accuracy: 0.8086
Epoch 31/300
12/12 [==============================] - 0s 5ms/step - loss: 0.5159 - accuracy: 0.8186 - val_loss: 0.5097 - val_accuracy: 0.8129
Epoch 32/300
12/12 [==============================] - 0s 5ms/step - loss: 0.5063 - accuracy: 0.8171 - val_loss: 0.5064 - val_accuracy: 0.8057
Epoch 33/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4985 - accuracy: 0.8211 - val_loss: 0.5010 - val_accuracy: 0.8100
Epoch 34/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4884 - accuracy: 0.8150 - val_loss: 0.4806 - val_accuracy: 0.8171
Epoch 35/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4783 - accuracy: 0.8254 - val_loss: 0.4735 - val_accuracy: 0.8114
Epoch 36/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4717 - accuracy: 0.8232 - val_loss: 0.4671 - val_accuracy: 0.8186
Epoch 37/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4669 - accuracy: 0.8229 - val_loss: 0.4705 - val_accuracy: 0.8129
Epoch 38/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4593 - accuracy: 0.8268 - val_loss: 0.4551 - val_accuracy: 0.8186
Epoch 39/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4548 - accuracy: 0.8221 - val_loss: 0.4498 - val_accuracy: 0.8229
Epoch 40/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4517 - accuracy: 0.8211 - val_loss: 0.4480 - val_accuracy: 0.8171
Epoch 41/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4424 - accuracy: 0.8307 - val_loss: 0.4385 - val_accuracy: 0.8243
Epoch 42/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4385 - accuracy: 0.8279 - val_loss: 0.4362 - val_accuracy: 0.8229
Epoch 43/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4348 - accuracy: 0.8311 - val_loss: 0.4424 - val_accuracy: 0.8100
Epoch 44/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4328 - accuracy: 0.8304 - val_loss: 0.4265 - val_accuracy: 0.8300
Epoch 45/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4255 - accuracy: 0.8286 - val_loss: 0.4223 - val_accuracy: 0.8257
Epoch 46/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4246 - accuracy: 0.8382 - val_loss: 0.4249 - val_accuracy: 0.8186
Epoch 47/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4193 - accuracy: 0.8325 - val_loss: 0.4166 - val_accuracy: 0.8286
Epoch 48/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4143 - accuracy: 0.8343 - val_loss: 0.4123 - val_accuracy: 0.8300
Epoch 49/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4135 - accuracy: 0.8364 - val_loss: 0.4117 - val_accuracy: 0.8386
Epoch 50/300
12/12 [==============================] - 0s 7ms/step - loss: 0.4096 - accuracy: 0.8382 - val_loss: 0.4081 - val_accuracy: 0.8329
Epoch 51/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4061 - accuracy: 0.8375 - val_loss: 0.4041 - val_accuracy: 0.8371
Epoch 52/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4037 - accuracy: 0.8389 - val_loss: 0.4032 - val_accuracy: 0.8386
Epoch 53/300
12/12 [==============================] - 0s 5ms/step - loss: 0.4003 - accuracy: 0.8386 - val_loss: 0.4035 - val_accuracy: 0.8329
Epoch 54/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3988 - accuracy: 0.8425 - val_loss: 0.3952 - val_accuracy: 0.8414
Epoch 55/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3973 - accuracy: 0.8393 - val_loss: 0.3969 - val_accuracy: 0.8443
Epoch 56/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3948 - accuracy: 0.8429 - val_loss: 0.3900 - val_accuracy: 0.8471
Epoch 57/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3954 - accuracy: 0.8407 - val_loss: 0.3977 - val_accuracy: 0.8329
Epoch 58/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3919 - accuracy: 0.8389 - val_loss: 0.3925 - val_accuracy: 0.8400
Epoch 59/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3904 - accuracy: 0.8454 - val_loss: 0.3902 - val_accuracy: 0.8400
Epoch 60/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3879 - accuracy: 0.8414 - val_loss: 0.3961 - val_accuracy: 0.8486
Epoch 61/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3873 - accuracy: 0.8446 - val_loss: 0.3836 - val_accuracy: 0.8414
Epoch 62/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3824 - accuracy: 0.8446 - val_loss: 0.3807 - val_accuracy: 0.8414
Epoch 63/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3819 - accuracy: 0.8446 - val_loss: 0.3790 - val_accuracy: 0.8486
Epoch 64/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3773 - accuracy: 0.8436 - val_loss: 0.3743 - val_accuracy: 0.8543
Epoch 65/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3782 - accuracy: 0.8511 - val_loss: 0.3807 - val_accuracy: 0.8457
Epoch 66/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3771 - accuracy: 0.8486 - val_loss: 0.3736 - val_accuracy: 0.8457
Epoch 67/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3738 - accuracy: 0.8489 - val_loss: 0.3760 - val_accuracy: 0.8471
Epoch 68/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3772 - accuracy: 0.8479 - val_loss: 0.3714 - val_accuracy: 0.8500
Epoch 69/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3695 - accuracy: 0.8489 - val_loss: 0.3671 - val_accuracy: 0.8514
Epoch 70/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3674 - accuracy: 0.8518 - val_loss: 0.3705 - val_accuracy: 0.8471
Epoch 71/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3675 - accuracy: 0.8493 - val_loss: 0.3675 - val_accuracy: 0.8514
Epoch 72/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3696 - accuracy: 0.8468 - val_loss: 0.3626 - val_accuracy: 0.8571
Epoch 73/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3665 - accuracy: 0.8511 - val_loss: 0.3608 - val_accuracy: 0.8586
Epoch 74/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3673 - accuracy: 0.8496 - val_loss: 0.3622 - val_accuracy: 0.8571
Epoch 75/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3628 - accuracy: 0.8557 - val_loss: 0.3625 - val_accuracy: 0.8543
Epoch 76/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3584 - accuracy: 0.8554 - val_loss: 0.3555 - val_accuracy: 0.8714
Epoch 77/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3600 - accuracy: 0.8514 - val_loss: 0.3688 - val_accuracy: 0.8457
Epoch 78/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3621 - accuracy: 0.8521 - val_loss: 0.3534 - val_accuracy: 0.8629
Epoch 79/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3565 - accuracy: 0.8529 - val_loss: 0.3563 - val_accuracy: 0.8657
Epoch 80/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3585 - accuracy: 0.8536 - val_loss: 0.3518 - val_accuracy: 0.8657
Epoch 81/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3549 - accuracy: 0.8561 - val_loss: 0.3561 - val_accuracy: 0.8543
Epoch 82/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3530 - accuracy: 0.8589 - val_loss: 0.3547 - val_accuracy: 0.8514
Epoch 83/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3521 - accuracy: 0.8575 - val_loss: 0.3477 - val_accuracy: 0.8671
Epoch 84/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3471 - accuracy: 0.8593 - val_loss: 0.3501 - val_accuracy: 0.8600
Epoch 85/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3479 - accuracy: 0.8561 - val_loss: 0.3445 - val_accuracy: 0.8771
Epoch 86/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3476 - accuracy: 0.8571 - val_loss: 0.3453 - val_accuracy: 0.8614
Epoch 87/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3439 - accuracy: 0.8611 - val_loss: 0.3459 - val_accuracy: 0.8643
Epoch 88/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3428 - accuracy: 0.8618 - val_loss: 0.3412 - val_accuracy: 0.8729
Epoch 89/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3436 - accuracy: 0.8611 - val_loss: 0.3393 - val_accuracy: 0.8729
Epoch 90/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3407 - accuracy: 0.8604 - val_loss: 0.3421 - val_accuracy: 0.8686
Epoch 91/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3403 - accuracy: 0.8575 - val_loss: 0.3368 - val_accuracy: 0.8743
Epoch 92/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3446 - accuracy: 0.8629 - val_loss: 0.3388 - val_accuracy: 0.8671
Epoch 93/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3394 - accuracy: 0.8621 - val_loss: 0.3434 - val_accuracy: 0.8600
Epoch 94/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3366 - accuracy: 0.8639 - val_loss: 0.3383 - val_accuracy: 0.8757
Epoch 95/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3357 - accuracy: 0.8611 - val_loss: 0.3395 - val_accuracy: 0.8629
Epoch 96/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3358 - accuracy: 0.8636 - val_loss: 0.3330 - val_accuracy: 0.8743
Epoch 97/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3336 - accuracy: 0.8607 - val_loss: 0.3363 - val_accuracy: 0.8714
Epoch 98/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3347 - accuracy: 0.8646 - val_loss: 0.3367 - val_accuracy: 0.8643
Epoch 99/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3299 - accuracy: 0.8639 - val_loss: 0.3303 - val_accuracy: 0.8743
Epoch 100/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3303 - accuracy: 0.8654 - val_loss: 0.3298 - val_accuracy: 0.8714
Epoch 101/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3306 - accuracy: 0.8646 - val_loss: 0.3310 - val_accuracy: 0.8714
Epoch 102/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3291 - accuracy: 0.8668 - val_loss: 0.3303 - val_accuracy: 0.8729
Epoch 103/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3280 - accuracy: 0.8639 - val_loss: 0.3261 - val_accuracy: 0.8800
Epoch 104/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3274 - accuracy: 0.8671 - val_loss: 0.3264 - val_accuracy: 0.8686
Epoch 105/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3311 - accuracy: 0.8618 - val_loss: 0.3308 - val_accuracy: 0.8714
Epoch 106/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3268 - accuracy: 0.8639 - val_loss: 0.3330 - val_accuracy: 0.8729
Epoch 107/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3286 - accuracy: 0.8657 - val_loss: 0.3271 - val_accuracy: 0.8771
Epoch 108/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3253 - accuracy: 0.8668 - val_loss: 0.3222 - val_accuracy: 0.8729
Epoch 109/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3253 - accuracy: 0.8704 - val_loss: 0.3237 - val_accuracy: 0.8771
Epoch 110/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3214 - accuracy: 0.8711 - val_loss: 0.3351 - val_accuracy: 0.8629
Epoch 111/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3236 - accuracy: 0.8664 - val_loss: 0.3325 - val_accuracy: 0.8643
Epoch 112/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3275 - accuracy: 0.8654 - val_loss: 0.3368 - val_accuracy: 0.8600
Epoch 113/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3232 - accuracy: 0.8643 - val_loss: 0.3162 - val_accuracy: 0.8800
Epoch 114/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3189 - accuracy: 0.8704 - val_loss: 0.3171 - val_accuracy: 0.8800
Epoch 115/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3211 - accuracy: 0.8661 - val_loss: 0.3257 - val_accuracy: 0.8743
Epoch 116/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3181 - accuracy: 0.8686 - val_loss: 0.3240 - val_accuracy: 0.8800
Epoch 117/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3180 - accuracy: 0.8700 - val_loss: 0.3193 - val_accuracy: 0.8700
Epoch 118/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3196 - accuracy: 0.8700 - val_loss: 0.3130 - val_accuracy: 0.8814
Epoch 119/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3137 - accuracy: 0.8732 - val_loss: 0.3179 - val_accuracy: 0.8786
Epoch 120/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3161 - accuracy: 0.8689 - val_loss: 0.3188 - val_accuracy: 0.8800
Epoch 121/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3122 - accuracy: 0.8704 - val_loss: 0.3159 - val_accuracy: 0.8829
Epoch 122/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3155 - accuracy: 0.8714 - val_loss: 0.3176 - val_accuracy: 0.8771
Epoch 123/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3131 - accuracy: 0.8757 - val_loss: 0.3115 - val_accuracy: 0.8814
Epoch 124/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3115 - accuracy: 0.8761 - val_loss: 0.3064 - val_accuracy: 0.8829
Epoch 125/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3092 - accuracy: 0.8779 - val_loss: 0.3075 - val_accuracy: 0.8843
Epoch 126/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3087 - accuracy: 0.8739 - val_loss: 0.3180 - val_accuracy: 0.8743
Epoch 127/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3106 - accuracy: 0.8779 - val_loss: 0.3095 - val_accuracy: 0.8786
Epoch 128/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3054 - accuracy: 0.8804 - val_loss: 0.3121 - val_accuracy: 0.8743
Epoch 129/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3179 - accuracy: 0.8661 - val_loss: 0.3152 - val_accuracy: 0.8800
Epoch 130/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3108 - accuracy: 0.8696 - val_loss: 0.3153 - val_accuracy: 0.8786
Epoch 131/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3073 - accuracy: 0.8757 - val_loss: 0.3060 - val_accuracy: 0.8800
Epoch 132/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3038 - accuracy: 0.8793 - val_loss: 0.3098 - val_accuracy: 0.8800
Epoch 133/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3055 - accuracy: 0.8768 - val_loss: 0.3010 - val_accuracy: 0.8843
Epoch 134/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3014 - accuracy: 0.8771 - val_loss: 0.3029 - val_accuracy: 0.8843
Epoch 135/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2992 - accuracy: 0.8789 - val_loss: 0.3154 - val_accuracy: 0.8771
Epoch 136/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3031 - accuracy: 0.8771 - val_loss: 0.2994 - val_accuracy: 0.8857
Epoch 137/300
12/12 [==============================] - 0s 5ms/step - loss: 0.3003 - accuracy: 0.8786 - val_loss: 0.3025 - val_accuracy: 0.8886
Epoch 138/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2999 - accuracy: 0.8789 - val_loss: 0.2997 - val_accuracy: 0.8843
Epoch 139/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2970 - accuracy: 0.8796 - val_loss: 0.2991 - val_accuracy: 0.8914
Epoch 140/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2974 - accuracy: 0.8789 - val_loss: 0.3093 - val_accuracy: 0.8786
Epoch 141/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2981 - accuracy: 0.8782 - val_loss: 0.2963 - val_accuracy: 0.8843
Epoch 142/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2980 - accuracy: 0.8789 - val_loss: 0.3011 - val_accuracy: 0.8814
Epoch 143/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2941 - accuracy: 0.8814 - val_loss: 0.2954 - val_accuracy: 0.8843
Epoch 144/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2923 - accuracy: 0.8821 - val_loss: 0.2957 - val_accuracy: 0.8829
Epoch 145/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2938 - accuracy: 0.8829 - val_loss: 0.3000 - val_accuracy: 0.8886
Epoch 146/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2931 - accuracy: 0.8789 - val_loss: 0.2899 - val_accuracy: 0.8914
Epoch 147/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2896 - accuracy: 0.8846 - val_loss: 0.2933 - val_accuracy: 0.8914
Epoch 148/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2903 - accuracy: 0.8846 - val_loss: 0.2927 - val_accuracy: 0.8843
Epoch 149/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2888 - accuracy: 0.8857 - val_loss: 0.2966 - val_accuracy: 0.8843
Epoch 150/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2899 - accuracy: 0.8846 - val_loss: 0.2998 - val_accuracy: 0.8800
Epoch 151/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2888 - accuracy: 0.8839 - val_loss: 0.2875 - val_accuracy: 0.8886
Epoch 152/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2848 - accuracy: 0.8861 - val_loss: 0.2968 - val_accuracy: 0.8871
Epoch 153/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2913 - accuracy: 0.8829 - val_loss: 0.2846 - val_accuracy: 0.8943
Epoch 154/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2858 - accuracy: 0.8886 - val_loss: 0.2866 - val_accuracy: 0.8900
Epoch 155/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2853 - accuracy: 0.8911 - val_loss: 0.2880 - val_accuracy: 0.8857
Epoch 156/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2821 - accuracy: 0.8893 - val_loss: 0.2879 - val_accuracy: 0.8871
Epoch 157/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2817 - accuracy: 0.8882 - val_loss: 0.2806 - val_accuracy: 0.8943
Epoch 158/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2816 - accuracy: 0.8889 - val_loss: 0.2817 - val_accuracy: 0.8957
Epoch 159/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2823 - accuracy: 0.8911 - val_loss: 0.2918 - val_accuracy: 0.8886
Epoch 160/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2863 - accuracy: 0.8871 - val_loss: 0.2784 - val_accuracy: 0.8929
Epoch 161/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2809 - accuracy: 0.8836 - val_loss: 0.2793 - val_accuracy: 0.8900
Epoch 162/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2811 - accuracy: 0.8857 - val_loss: 0.2810 - val_accuracy: 0.8914
Epoch 163/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2766 - accuracy: 0.8921 - val_loss: 0.2855 - val_accuracy: 0.8886
Epoch 164/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2756 - accuracy: 0.8875 - val_loss: 0.2734 - val_accuracy: 0.8943
Epoch 165/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2737 - accuracy: 0.8911 - val_loss: 0.2806 - val_accuracy: 0.8914
Epoch 166/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2735 - accuracy: 0.8918 - val_loss: 0.2739 - val_accuracy: 0.9000
Epoch 167/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2751 - accuracy: 0.8911 - val_loss: 0.2768 - val_accuracy: 0.8943
Epoch 168/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2747 - accuracy: 0.8961 - val_loss: 0.2788 - val_accuracy: 0.8914
Epoch 169/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2700 - accuracy: 0.8975 - val_loss: 0.2759 - val_accuracy: 0.8971
Epoch 170/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2691 - accuracy: 0.8954 - val_loss: 0.2756 - val_accuracy: 0.8957
Epoch 171/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2708 - accuracy: 0.8964 - val_loss: 0.2846 - val_accuracy: 0.8900
Epoch 172/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2736 - accuracy: 0.8943 - val_loss: 0.2688 - val_accuracy: 0.8957
Epoch 173/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2667 - accuracy: 0.9004 - val_loss: 0.2681 - val_accuracy: 0.8986
Epoch 174/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2682 - accuracy: 0.8946 - val_loss: 0.2673 - val_accuracy: 0.8943
Epoch 175/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2722 - accuracy: 0.8971 - val_loss: 0.2744 - val_accuracy: 0.8929
Epoch 176/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2634 - accuracy: 0.9007 - val_loss: 0.2644 - val_accuracy: 0.9014
Epoch 177/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2676 - accuracy: 0.9000 - val_loss: 0.2652 - val_accuracy: 0.8986
Epoch 178/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2638 - accuracy: 0.8950 - val_loss: 0.2902 - val_accuracy: 0.8900
Epoch 179/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2664 - accuracy: 0.8971 - val_loss: 0.2773 - val_accuracy: 0.8943
Epoch 180/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2635 - accuracy: 0.8968 - val_loss: 0.2651 - val_accuracy: 0.9014
Epoch 181/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2638 - accuracy: 0.8993 - val_loss: 0.2603 - val_accuracy: 0.9029
Epoch 182/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2613 - accuracy: 0.9018 - val_loss: 0.2676 - val_accuracy: 0.8957
Epoch 183/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2579 - accuracy: 0.9011 - val_loss: 0.2594 - val_accuracy: 0.8986
Epoch 184/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2625 - accuracy: 0.8961 - val_loss: 0.2607 - val_accuracy: 0.9000
Epoch 185/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2592 - accuracy: 0.8993 - val_loss: 0.2564 - val_accuracy: 0.9000
Epoch 186/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2546 - accuracy: 0.9050 - val_loss: 0.2611 - val_accuracy: 0.9000
Epoch 187/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2556 - accuracy: 0.9004 - val_loss: 0.2583 - val_accuracy: 0.9014
Epoch 188/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2551 - accuracy: 0.9054 - val_loss: 0.2656 - val_accuracy: 0.8971
Epoch 189/300
12/12 [==============================] - 0s 7ms/step - loss: 0.2541 - accuracy: 0.9029 - val_loss: 0.2578 - val_accuracy: 0.9014
Epoch 190/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2562 - accuracy: 0.9004 - val_loss: 0.2534 - val_accuracy: 0.9029
Epoch 191/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2550 - accuracy: 0.9014 - val_loss: 0.2521 - val_accuracy: 0.9029
Epoch 192/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2516 - accuracy: 0.9007 - val_loss: 0.2581 - val_accuracy: 0.9000
Epoch 193/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2487 - accuracy: 0.9082 - val_loss: 0.2574 - val_accuracy: 0.8986
Epoch 194/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2507 - accuracy: 0.9043 - val_loss: 0.2688 - val_accuracy: 0.8929
Epoch 195/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2560 - accuracy: 0.8993 - val_loss: 0.2671 - val_accuracy: 0.8943
Epoch 196/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2580 - accuracy: 0.9021 - val_loss: 0.2558 - val_accuracy: 0.9014
Epoch 197/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2493 - accuracy: 0.9064 - val_loss: 0.2622 - val_accuracy: 0.8986
Epoch 198/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2502 - accuracy: 0.9046 - val_loss: 0.2539 - val_accuracy: 0.9043
Epoch 199/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2492 - accuracy: 0.9036 - val_loss: 0.2536 - val_accuracy: 0.9057
Epoch 200/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2460 - accuracy: 0.9064 - val_loss: 0.2470 - val_accuracy: 0.9143
Epoch 201/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2475 - accuracy: 0.9064 - val_loss: 0.2479 - val_accuracy: 0.9086
Epoch 202/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2438 - accuracy: 0.9111 - val_loss: 0.2439 - val_accuracy: 0.9086
Epoch 203/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2439 - accuracy: 0.9079 - val_loss: 0.2483 - val_accuracy: 0.9043
Epoch 204/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2402 - accuracy: 0.9143 - val_loss: 0.2492 - val_accuracy: 0.9043
Epoch 205/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2401 - accuracy: 0.9104 - val_loss: 0.2590 - val_accuracy: 0.8957
Epoch 206/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2468 - accuracy: 0.9068 - val_loss: 0.2548 - val_accuracy: 0.8971
Epoch 207/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2391 - accuracy: 0.9129 - val_loss: 0.2651 - val_accuracy: 0.8957
Epoch 208/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2441 - accuracy: 0.9096 - val_loss: 0.2499 - val_accuracy: 0.9000
Epoch 209/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2406 - accuracy: 0.9093 - val_loss: 0.2374 - val_accuracy: 0.9157
Epoch 210/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2335 - accuracy: 0.9164 - val_loss: 0.2515 - val_accuracy: 0.9057
Epoch 211/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2388 - accuracy: 0.9104 - val_loss: 0.2397 - val_accuracy: 0.9114
Epoch 212/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2394 - accuracy: 0.9104 - val_loss: 0.2400 - val_accuracy: 0.9129
Epoch 213/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2383 - accuracy: 0.9121 - val_loss: 0.2376 - val_accuracy: 0.9143
Epoch 214/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2320 - accuracy: 0.9146 - val_loss: 0.2351 - val_accuracy: 0.9157
Epoch 215/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2318 - accuracy: 0.9168 - val_loss: 0.2371 - val_accuracy: 0.9171
Epoch 216/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2310 - accuracy: 0.9171 - val_loss: 0.2370 - val_accuracy: 0.9129
Epoch 217/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2289 - accuracy: 0.9171 - val_loss: 0.2392 - val_accuracy: 0.9100
Epoch 218/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2286 - accuracy: 0.9164 - val_loss: 0.2299 - val_accuracy: 0.9171
Epoch 219/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2278 - accuracy: 0.9182 - val_loss: 0.2336 - val_accuracy: 0.9129
Epoch 220/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2293 - accuracy: 0.9189 - val_loss: 0.2451 - val_accuracy: 0.9043
Epoch 221/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2267 - accuracy: 0.9179 - val_loss: 0.2294 - val_accuracy: 0.9143
Epoch 222/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2248 - accuracy: 0.9182 - val_loss: 0.2303 - val_accuracy: 0.9143
Epoch 223/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2242 - accuracy: 0.9186 - val_loss: 0.2296 - val_accuracy: 0.9129
Epoch 224/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2231 - accuracy: 0.9193 - val_loss: 0.2333 - val_accuracy: 0.9143
Epoch 225/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2307 - accuracy: 0.9168 - val_loss: 0.2302 - val_accuracy: 0.9143
Epoch 226/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2263 - accuracy: 0.9136 - val_loss: 0.2264 - val_accuracy: 0.9157
Epoch 227/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2310 - accuracy: 0.9157 - val_loss: 0.2266 - val_accuracy: 0.9214
Epoch 228/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2261 - accuracy: 0.9182 - val_loss: 0.2487 - val_accuracy: 0.9043
Epoch 229/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2270 - accuracy: 0.9175 - val_loss: 0.2418 - val_accuracy: 0.9071
Epoch 230/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2249 - accuracy: 0.9143 - val_loss: 0.2261 - val_accuracy: 0.9214
Epoch 231/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2213 - accuracy: 0.9196 - val_loss: 0.2223 - val_accuracy: 0.9171
Epoch 232/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2195 - accuracy: 0.9207 - val_loss: 0.2258 - val_accuracy: 0.9143
Epoch 233/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2203 - accuracy: 0.9236 - val_loss: 0.2279 - val_accuracy: 0.9143
Epoch 234/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2200 - accuracy: 0.9236 - val_loss: 0.2353 - val_accuracy: 0.9129
Epoch 235/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2200 - accuracy: 0.9182 - val_loss: 0.2186 - val_accuracy: 0.9214
Epoch 236/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2171 - accuracy: 0.9229 - val_loss: 0.2195 - val_accuracy: 0.9171
Epoch 237/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2185 - accuracy: 0.9193 - val_loss: 0.2421 - val_accuracy: 0.9057
Epoch 238/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2220 - accuracy: 0.9168 - val_loss: 0.2207 - val_accuracy: 0.9214
Epoch 239/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2145 - accuracy: 0.9232 - val_loss: 0.2162 - val_accuracy: 0.9229
Epoch 240/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2170 - accuracy: 0.9221 - val_loss: 0.2304 - val_accuracy: 0.9129
Epoch 241/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2134 - accuracy: 0.9243 - val_loss: 0.2320 - val_accuracy: 0.9143
Epoch 242/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2178 - accuracy: 0.9207 - val_loss: 0.2182 - val_accuracy: 0.9214
Epoch 243/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2149 - accuracy: 0.9225 - val_loss: 0.2163 - val_accuracy: 0.9243
Epoch 244/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2106 - accuracy: 0.9261 - val_loss: 0.2302 - val_accuracy: 0.9157
Epoch 245/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2102 - accuracy: 0.9236 - val_loss: 0.2216 - val_accuracy: 0.9214
Epoch 246/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2123 - accuracy: 0.9232 - val_loss: 0.2232 - val_accuracy: 0.9157
Epoch 247/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2077 - accuracy: 0.9254 - val_loss: 0.2111 - val_accuracy: 0.9286
Epoch 248/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2069 - accuracy: 0.9257 - val_loss: 0.2169 - val_accuracy: 0.9200
Epoch 249/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2055 - accuracy: 0.9250 - val_loss: 0.2219 - val_accuracy: 0.9214
Epoch 250/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2102 - accuracy: 0.9232 - val_loss: 0.2275 - val_accuracy: 0.9143
Epoch 251/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2132 - accuracy: 0.9193 - val_loss: 0.2146 - val_accuracy: 0.9214
Epoch 252/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2048 - accuracy: 0.9250 - val_loss: 0.2102 - val_accuracy: 0.9257
Epoch 253/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2029 - accuracy: 0.9286 - val_loss: 0.2107 - val_accuracy: 0.9257
Epoch 254/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2059 - accuracy: 0.9286 - val_loss: 0.2090 - val_accuracy: 0.9286
Epoch 255/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2021 - accuracy: 0.9268 - val_loss: 0.2087 - val_accuracy: 0.9271
Epoch 256/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2006 - accuracy: 0.9286 - val_loss: 0.2159 - val_accuracy: 0.9243
Epoch 257/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2046 - accuracy: 0.9243 - val_loss: 0.2055 - val_accuracy: 0.9271
Epoch 258/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2010 - accuracy: 0.9300 - val_loss: 0.2045 - val_accuracy: 0.9271
Epoch 259/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1986 - accuracy: 0.9311 - val_loss: 0.2249 - val_accuracy: 0.9200
Epoch 260/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2025 - accuracy: 0.9289 - val_loss: 0.2065 - val_accuracy: 0.9271
Epoch 261/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1994 - accuracy: 0.9296 - val_loss: 0.2053 - val_accuracy: 0.9271
Epoch 262/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1973 - accuracy: 0.9318 - val_loss: 0.2092 - val_accuracy: 0.9257
Epoch 263/300
12/12 [==============================] - 0s 5ms/step - loss: 0.2041 - accuracy: 0.9243 - val_loss: 0.2049 - val_accuracy: 0.9271
Epoch 264/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1964 - accuracy: 0.9296 - val_loss: 0.2040 - val_accuracy: 0.9243
Epoch 265/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1937 - accuracy: 0.9321 - val_loss: 0.2013 - val_accuracy: 0.9271
Epoch 266/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1926 - accuracy: 0.9329 - val_loss: 0.2058 - val_accuracy: 0.9229
Epoch 267/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1991 - accuracy: 0.9293 - val_loss: 0.2363 - val_accuracy: 0.9086
Epoch 268/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1989 - accuracy: 0.9293 - val_loss: 0.2064 - val_accuracy: 0.9300
Epoch 269/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1967 - accuracy: 0.9289 - val_loss: 0.2041 - val_accuracy: 0.9271
Epoch 270/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1924 - accuracy: 0.9343 - val_loss: 0.2033 - val_accuracy: 0.9286
Epoch 271/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1899 - accuracy: 0.9336 - val_loss: 0.2024 - val_accuracy: 0.9286
Epoch 272/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1894 - accuracy: 0.9346 - val_loss: 0.2076 - val_accuracy: 0.9286
Epoch 273/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1974 - accuracy: 0.9304 - val_loss: 0.2108 - val_accuracy: 0.9257
Epoch 274/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1929 - accuracy: 0.9329 - val_loss: 0.2002 - val_accuracy: 0.9300
Epoch 275/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1913 - accuracy: 0.9314 - val_loss: 0.1967 - val_accuracy: 0.9286
Epoch 276/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1869 - accuracy: 0.9343 - val_loss: 0.2076 - val_accuracy: 0.9286
Epoch 277/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1917 - accuracy: 0.9314 - val_loss: 0.1997 - val_accuracy: 0.9300
Epoch 278/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1882 - accuracy: 0.9314 - val_loss: 0.2036 - val_accuracy: 0.9257
Epoch 279/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1866 - accuracy: 0.9364 - val_loss: 0.1979 - val_accuracy: 0.9286
Epoch 280/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1882 - accuracy: 0.9368 - val_loss: 0.1943 - val_accuracy: 0.9286
Epoch 281/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1849 - accuracy: 0.9364 - val_loss: 0.1957 - val_accuracy: 0.9343
Epoch 282/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1874 - accuracy: 0.9321 - val_loss: 0.1931 - val_accuracy: 0.9314
Epoch 283/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1879 - accuracy: 0.9361 - val_loss: 0.2089 - val_accuracy: 0.9286
Epoch 284/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1885 - accuracy: 0.9350 - val_loss: 0.2063 - val_accuracy: 0.9257
Epoch 285/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1821 - accuracy: 0.9382 - val_loss: 0.1980 - val_accuracy: 0.9300
Epoch 286/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1873 - accuracy: 0.9336 - val_loss: 0.1969 - val_accuracy: 0.9286
Epoch 287/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1828 - accuracy: 0.9386 - val_loss: 0.1980 - val_accuracy: 0.9271
Epoch 288/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1820 - accuracy: 0.9400 - val_loss: 0.1913 - val_accuracy: 0.9329
Epoch 289/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1845 - accuracy: 0.9389 - val_loss: 0.1890 - val_accuracy: 0.9314
Epoch 290/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1781 - accuracy: 0.9404 - val_loss: 0.1860 - val_accuracy: 0.9329
Epoch 291/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1786 - accuracy: 0.9414 - val_loss: 0.1900 - val_accuracy: 0.9300
Epoch 292/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1797 - accuracy: 0.9396 - val_loss: 0.1909 - val_accuracy: 0.9314
Epoch 293/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1787 - accuracy: 0.9411 - val_loss: 0.1941 - val_accuracy: 0.9271
Epoch 294/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1802 - accuracy: 0.9382 - val_loss: 0.1916 - val_accuracy: 0.9314
Epoch 295/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1783 - accuracy: 0.9371 - val_loss: 0.1960 - val_accuracy: 0.9300
Epoch 296/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1789 - accuracy: 0.9382 - val_loss: 0.1854 - val_accuracy: 0.9329
Epoch 297/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1772 - accuracy: 0.9414 - val_loss: 0.1837 - val_accuracy: 0.9300
Epoch 298/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1783 - accuracy: 0.9389 - val_loss: 0.1944 - val_accuracy: 0.9271
Epoch 299/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1836 - accuracy: 0.9375 - val_loss: 0.1893 - val_accuracy: 0.9300
Epoch 300/300
12/12 [==============================] - 0s 5ms/step - loss: 0.1790 - accuracy: 0.9407 - val_loss: 0.1958 - val_accuracy: 0.9257

Let's plot the loss and accuracy graphs per epoch for the training and validation data:

In [ ]:
fig, ax = plt.subplots(1,2, figsize=(16,8))
ax[0].plot(history.history['loss'], color='b', label="Training loss")
ax[0].plot(history.history['val_loss'], color='r', label="validation loss",axes =ax[0])
legend = ax[0].legend(loc='best', shadow=True)

ax[1].plot(history.history['accuracy'], color='b', label="Training accuracy")
ax[1].plot(history.history['val_accuracy'], color='r',label="Validation accuracy")
legend = ax[1].legend(loc='best', shadow=True)
No description has been provided for this image
In [ ]:
score = model.evaluate(X_test, Y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])
Test loss: 0.20278969407081604
Test accuracy: 0.921999990940094
In [ ]:
y_pred = model.predict(X_test)
47/47 [==============================] - 0s 1ms/step
In [ ]:
y_pred_res = np.argmax(y_pred, axis=1)
In [ ]:
Y_test_res = np.argmax(Y_test, axis=1)
In [ ]:
print(classification_report(Y_test_res, y_pred_res))
              precision    recall  f1-score   support

           0       0.98      0.99      0.98       284
           1       0.86      0.97      0.91       309
           2       0.87      0.94      0.91       290
           3       0.95      0.85      0.90       316
           4       0.97      0.86      0.92       301

    accuracy                           0.92      1500
   macro avg       0.93      0.92      0.92      1500
weighted avg       0.93      0.92      0.92      1500

Dropout¶

Dropout is a regularization method that approximates the training of a large number of neural networks with different architectures in parallel.

During training, some layer outputs are randomly ignored or "dropped". This has the effect of making the layer look and feel like a layer with a different number of nodes and connectivity to the previous layer. In effect, each update of a layer during training is performed with a different “view” of the configured layer.

image.png

Batch Normalization¶

Normalization is a data preprocessing tool used to bring numerical data to a common scale without distorting its shape.

Generally, when we input data into a deep learning machine or algorithm, we tend to change the values to a balanced scale. The reason we normalize is, in part, to ensure that our model can generalize properly.

Now coming back to batch normalization, it is a process to make neural networks faster and more stable by adding extra layers in a deep neural network. The new layer performs standardization and normalization operations on the input of a layer from a previous layer.

But what is the reason behind the term “Batch” in batch normalization? A typical neural network is trained using a collected set of input data called a batch. Similarly, the normalization process in batch normalization occurs in batches, not as a single input.

In [ ]:
from tensorflow.keras.layers import Dropout, BatchNormalization

Let's then add a dropout and batch normalization layer between the dense layers and see how the model behaves:

In [ ]:
model_2 = Sequential()
model_2.add(Dense(128, input_shape=input_shape, activation='relu'))
model_2.add(Dropout(0.1))
model_2.add(BatchNormalization())
model_2.add(Dense(32, activation='relu'))
model_2.add(Dropout(0.1))
model_2.add(Dense(8, activation='relu'))
model_2.add(Dropout(0.1))
model_2.add(Dense(num_classes, activation='softmax'))
In [ ]:
model_2.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
In [ ]:
model_2.summary()
Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_4 (Dense)             (None, 128)               1280      
                                                                 
 dropout (Dropout)           (None, 128)               0         
                                                                 
 batch_normalization (BatchN  (None, 128)              512       
 ormalization)                                                   
                                                                 
 dense_5 (Dense)             (None, 32)                4128      
                                                                 
 dropout_1 (Dropout)         (None, 32)                0         
                                                                 
 dense_6 (Dense)             (None, 8)                 264       
                                                                 
 dropout_2 (Dropout)         (None, 8)                 0         
                                                                 
 dense_7 (Dense)             (None, 5)                 45        
                                                                 
=================================================================
Total params: 6,229
Trainable params: 5,973
Non-trainable params: 256
_________________________________________________________________

Let's train the model:

In [ ]:
history2 = model_2.fit(X_train, Y_train, epochs=300, batch_size=250, verbose=1, validation_split=0.2)
Epoch 1/300
12/12 [==============================] - 3s 23ms/step - loss: 1.3509 - accuracy: 0.4446 - val_loss: 1.5740 - val_accuracy: 0.4586
Epoch 2/300
12/12 [==============================] - 0s 6ms/step - loss: 1.0448 - accuracy: 0.5968 - val_loss: 1.5359 - val_accuracy: 0.6414
Epoch 3/300
12/12 [==============================] - 0s 7ms/step - loss: 0.8572 - accuracy: 0.6371 - val_loss: 1.4978 - val_accuracy: 0.3200
Epoch 4/300
12/12 [==============================] - 0s 6ms/step - loss: 0.7695 - accuracy: 0.6621 - val_loss: 1.4713 - val_accuracy: 0.2914
Epoch 5/300
12/12 [==============================] - 0s 6ms/step - loss: 0.7098 - accuracy: 0.7054 - val_loss: 1.4602 - val_accuracy: 0.2671
Epoch 6/300
12/12 [==============================] - 0s 6ms/step - loss: 0.6336 - accuracy: 0.7464 - val_loss: 1.4378 - val_accuracy: 0.2586
Epoch 7/300
12/12 [==============================] - 0s 6ms/step - loss: 0.6011 - accuracy: 0.7614 - val_loss: 1.4269 - val_accuracy: 0.2786
Epoch 8/300
12/12 [==============================] - 0s 6ms/step - loss: 0.5644 - accuracy: 0.7829 - val_loss: 1.4126 - val_accuracy: 0.3029
Epoch 9/300
12/12 [==============================] - 0s 6ms/step - loss: 0.5334 - accuracy: 0.7868 - val_loss: 1.4138 - val_accuracy: 0.3571
Epoch 10/300
12/12 [==============================] - 0s 6ms/step - loss: 0.5116 - accuracy: 0.8021 - val_loss: 1.4134 - val_accuracy: 0.3271
Epoch 11/300
12/12 [==============================] - 0s 6ms/step - loss: 0.5031 - accuracy: 0.8064 - val_loss: 1.3680 - val_accuracy: 0.3757
Epoch 12/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4731 - accuracy: 0.8211 - val_loss: 1.3689 - val_accuracy: 0.4043
Epoch 13/300
12/12 [==============================] - 0s 7ms/step - loss: 0.4875 - accuracy: 0.8075 - val_loss: 1.3591 - val_accuracy: 0.4543
Epoch 14/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4655 - accuracy: 0.8196 - val_loss: 1.3319 - val_accuracy: 0.4343
Epoch 15/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4577 - accuracy: 0.8293 - val_loss: 1.2853 - val_accuracy: 0.4600
Epoch 16/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4536 - accuracy: 0.8154 - val_loss: 1.2623 - val_accuracy: 0.5100
Epoch 17/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4170 - accuracy: 0.8404 - val_loss: 1.2486 - val_accuracy: 0.3986
Epoch 18/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4136 - accuracy: 0.8357 - val_loss: 1.1716 - val_accuracy: 0.4929
Epoch 19/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4234 - accuracy: 0.8329 - val_loss: 1.1771 - val_accuracy: 0.4100
Epoch 20/300
12/12 [==============================] - 0s 6ms/step - loss: 0.4075 - accuracy: 0.8461 - val_loss: 1.1657 - val_accuracy: 0.4457
Epoch 21/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3797 - accuracy: 0.8507 - val_loss: 1.1140 - val_accuracy: 0.4629
Epoch 22/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3994 - accuracy: 0.8393 - val_loss: 1.0791 - val_accuracy: 0.4786
Epoch 23/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3803 - accuracy: 0.8561 - val_loss: 1.0853 - val_accuracy: 0.4871
Epoch 24/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3694 - accuracy: 0.8529 - val_loss: 1.0301 - val_accuracy: 0.4586
Epoch 25/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3679 - accuracy: 0.8575 - val_loss: 0.9909 - val_accuracy: 0.4500
Epoch 26/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3648 - accuracy: 0.8557 - val_loss: 0.9550 - val_accuracy: 0.4829
Epoch 27/300
12/12 [==============================] - 0s 8ms/step - loss: 0.3445 - accuracy: 0.8586 - val_loss: 0.8954 - val_accuracy: 0.5814
Epoch 28/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3348 - accuracy: 0.8707 - val_loss: 0.8569 - val_accuracy: 0.6143
Epoch 29/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3390 - accuracy: 0.8636 - val_loss: 0.8470 - val_accuracy: 0.5943
Epoch 30/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3164 - accuracy: 0.8793 - val_loss: 0.8088 - val_accuracy: 0.6143
Epoch 31/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3287 - accuracy: 0.8693 - val_loss: 0.8090 - val_accuracy: 0.6129
Epoch 32/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3152 - accuracy: 0.8804 - val_loss: 0.7334 - val_accuracy: 0.6143
Epoch 33/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3154 - accuracy: 0.8732 - val_loss: 0.6454 - val_accuracy: 0.7214
Epoch 34/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2994 - accuracy: 0.8854 - val_loss: 0.6259 - val_accuracy: 0.6629
Epoch 35/300
12/12 [==============================] - 0s 6ms/step - loss: 0.3043 - accuracy: 0.8793 - val_loss: 0.6108 - val_accuracy: 0.6743
Epoch 36/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2904 - accuracy: 0.8821 - val_loss: 0.5818 - val_accuracy: 0.6957
Epoch 37/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2904 - accuracy: 0.8796 - val_loss: 0.5072 - val_accuracy: 0.7571
Epoch 38/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2932 - accuracy: 0.8825 - val_loss: 0.5017 - val_accuracy: 0.7686
Epoch 39/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2811 - accuracy: 0.8907 - val_loss: 0.4877 - val_accuracy: 0.8014
Epoch 40/300
12/12 [==============================] - 0s 7ms/step - loss: 0.2919 - accuracy: 0.8836 - val_loss: 0.4435 - val_accuracy: 0.7829
Epoch 41/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2867 - accuracy: 0.8850 - val_loss: 0.4207 - val_accuracy: 0.8186
Epoch 42/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2730 - accuracy: 0.8900 - val_loss: 0.4209 - val_accuracy: 0.8214
Epoch 43/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2712 - accuracy: 0.8950 - val_loss: 0.3019 - val_accuracy: 0.9343
Epoch 44/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2647 - accuracy: 0.8904 - val_loss: 0.3315 - val_accuracy: 0.8886
Epoch 45/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2671 - accuracy: 0.8861 - val_loss: 0.2783 - val_accuracy: 0.9400
Epoch 46/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2541 - accuracy: 0.8979 - val_loss: 0.3068 - val_accuracy: 0.9000
Epoch 47/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2618 - accuracy: 0.8875 - val_loss: 0.3118 - val_accuracy: 0.8886
Epoch 48/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2508 - accuracy: 0.8979 - val_loss: 0.2319 - val_accuracy: 0.9357
Epoch 49/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2492 - accuracy: 0.9011 - val_loss: 0.2415 - val_accuracy: 0.9300
Epoch 50/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2583 - accuracy: 0.9007 - val_loss: 0.2415 - val_accuracy: 0.9314
Epoch 51/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2372 - accuracy: 0.9132 - val_loss: 0.2361 - val_accuracy: 0.9200
Epoch 52/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2411 - accuracy: 0.9071 - val_loss: 0.1964 - val_accuracy: 0.9443
Epoch 53/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2360 - accuracy: 0.9125 - val_loss: 0.2024 - val_accuracy: 0.9386
Epoch 54/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2292 - accuracy: 0.9196 - val_loss: 0.1967 - val_accuracy: 0.9386
Epoch 55/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2452 - accuracy: 0.9064 - val_loss: 0.2300 - val_accuracy: 0.9200
Epoch 56/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2368 - accuracy: 0.9161 - val_loss: 0.1918 - val_accuracy: 0.9371
Epoch 57/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2279 - accuracy: 0.9193 - val_loss: 0.1846 - val_accuracy: 0.9443
Epoch 58/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2302 - accuracy: 0.9150 - val_loss: 0.1756 - val_accuracy: 0.9371
Epoch 59/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2198 - accuracy: 0.9186 - val_loss: 0.1684 - val_accuracy: 0.9414
Epoch 60/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2259 - accuracy: 0.9193 - val_loss: 0.1628 - val_accuracy: 0.9400
Epoch 61/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2166 - accuracy: 0.9296 - val_loss: 0.1966 - val_accuracy: 0.9329
Epoch 62/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2144 - accuracy: 0.9246 - val_loss: 0.1624 - val_accuracy: 0.9486
Epoch 63/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2124 - accuracy: 0.9296 - val_loss: 0.1577 - val_accuracy: 0.9371
Epoch 64/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2100 - accuracy: 0.9314 - val_loss: 0.1583 - val_accuracy: 0.9343
Epoch 65/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1960 - accuracy: 0.9318 - val_loss: 0.1544 - val_accuracy: 0.9400
Epoch 66/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1973 - accuracy: 0.9314 - val_loss: 0.1537 - val_accuracy: 0.9443
Epoch 67/300
12/12 [==============================] - 0s 7ms/step - loss: 0.2089 - accuracy: 0.9261 - val_loss: 0.1624 - val_accuracy: 0.9357
Epoch 68/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1994 - accuracy: 0.9371 - val_loss: 0.1608 - val_accuracy: 0.9443
Epoch 69/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1967 - accuracy: 0.9350 - val_loss: 0.1536 - val_accuracy: 0.9371
Epoch 70/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1967 - accuracy: 0.9336 - val_loss: 0.1881 - val_accuracy: 0.9329
Epoch 71/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1996 - accuracy: 0.9321 - val_loss: 0.1652 - val_accuracy: 0.9414
Epoch 72/300
12/12 [==============================] - 0s 6ms/step - loss: 0.2043 - accuracy: 0.9346 - val_loss: 0.1627 - val_accuracy: 0.9400
Epoch 73/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1953 - accuracy: 0.9354 - val_loss: 0.1493 - val_accuracy: 0.9429
Epoch 74/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1852 - accuracy: 0.9389 - val_loss: 0.1669 - val_accuracy: 0.9486
Epoch 75/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1923 - accuracy: 0.9361 - val_loss: 0.1409 - val_accuracy: 0.9443
Epoch 76/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1891 - accuracy: 0.9375 - val_loss: 0.1596 - val_accuracy: 0.9371
Epoch 77/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1840 - accuracy: 0.9386 - val_loss: 0.1566 - val_accuracy: 0.9429
Epoch 78/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1798 - accuracy: 0.9436 - val_loss: 0.1368 - val_accuracy: 0.9443
Epoch 79/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1749 - accuracy: 0.9346 - val_loss: 0.1397 - val_accuracy: 0.9429
Epoch 80/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1873 - accuracy: 0.9364 - val_loss: 0.1304 - val_accuracy: 0.9557
Epoch 81/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1744 - accuracy: 0.9454 - val_loss: 0.1326 - val_accuracy: 0.9500
Epoch 82/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1824 - accuracy: 0.9354 - val_loss: 0.1318 - val_accuracy: 0.9500
Epoch 83/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1809 - accuracy: 0.9357 - val_loss: 0.1593 - val_accuracy: 0.9400
Epoch 84/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1902 - accuracy: 0.9346 - val_loss: 0.1429 - val_accuracy: 0.9471
Epoch 85/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1723 - accuracy: 0.9461 - val_loss: 0.1953 - val_accuracy: 0.9271
Epoch 86/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1835 - accuracy: 0.9375 - val_loss: 0.1384 - val_accuracy: 0.9429
Epoch 87/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1827 - accuracy: 0.9364 - val_loss: 0.1257 - val_accuracy: 0.9471
Epoch 88/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1797 - accuracy: 0.9357 - val_loss: 0.2278 - val_accuracy: 0.9229
Epoch 89/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1868 - accuracy: 0.9343 - val_loss: 0.1736 - val_accuracy: 0.9300
Epoch 90/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1862 - accuracy: 0.9382 - val_loss: 0.1756 - val_accuracy: 0.9329
Epoch 91/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1702 - accuracy: 0.9393 - val_loss: 0.1550 - val_accuracy: 0.9429
Epoch 92/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1656 - accuracy: 0.9400 - val_loss: 0.1438 - val_accuracy: 0.9429
Epoch 93/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1814 - accuracy: 0.9350 - val_loss: 0.1652 - val_accuracy: 0.9400
Epoch 94/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1855 - accuracy: 0.9364 - val_loss: 0.1363 - val_accuracy: 0.9486
Epoch 95/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1702 - accuracy: 0.9379 - val_loss: 0.1446 - val_accuracy: 0.9486
Epoch 96/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1693 - accuracy: 0.9450 - val_loss: 0.1422 - val_accuracy: 0.9471
Epoch 97/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1700 - accuracy: 0.9454 - val_loss: 0.1433 - val_accuracy: 0.9414
Epoch 98/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1576 - accuracy: 0.9457 - val_loss: 0.1225 - val_accuracy: 0.9529
Epoch 99/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1719 - accuracy: 0.9443 - val_loss: 0.1121 - val_accuracy: 0.9529
Epoch 100/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1669 - accuracy: 0.9429 - val_loss: 0.1238 - val_accuracy: 0.9529
Epoch 101/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1622 - accuracy: 0.9436 - val_loss: 0.1182 - val_accuracy: 0.9529
Epoch 102/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1565 - accuracy: 0.9479 - val_loss: 0.1550 - val_accuracy: 0.9414
Epoch 103/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1586 - accuracy: 0.9482 - val_loss: 0.1429 - val_accuracy: 0.9514
Epoch 104/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1630 - accuracy: 0.9446 - val_loss: 0.1172 - val_accuracy: 0.9557
Epoch 105/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1780 - accuracy: 0.9382 - val_loss: 0.1260 - val_accuracy: 0.9500
Epoch 106/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1593 - accuracy: 0.9439 - val_loss: 0.1174 - val_accuracy: 0.9514
Epoch 107/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1599 - accuracy: 0.9486 - val_loss: 0.1612 - val_accuracy: 0.9386
Epoch 108/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1637 - accuracy: 0.9425 - val_loss: 0.2100 - val_accuracy: 0.9200
Epoch 109/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1587 - accuracy: 0.9471 - val_loss: 0.1808 - val_accuracy: 0.9400
Epoch 110/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1523 - accuracy: 0.9450 - val_loss: 0.1225 - val_accuracy: 0.9529
Epoch 111/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1363 - accuracy: 0.9554 - val_loss: 0.1422 - val_accuracy: 0.9443
Epoch 112/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1755 - accuracy: 0.9375 - val_loss: 0.1995 - val_accuracy: 0.9357
Epoch 113/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1644 - accuracy: 0.9461 - val_loss: 0.1596 - val_accuracy: 0.9386
Epoch 114/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1539 - accuracy: 0.9414 - val_loss: 0.1504 - val_accuracy: 0.9443
Epoch 115/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1605 - accuracy: 0.9443 - val_loss: 0.1594 - val_accuracy: 0.9386
Epoch 116/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1654 - accuracy: 0.9486 - val_loss: 0.1256 - val_accuracy: 0.9529
Epoch 117/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1575 - accuracy: 0.9507 - val_loss: 0.1761 - val_accuracy: 0.9357
Epoch 118/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1484 - accuracy: 0.9471 - val_loss: 0.1118 - val_accuracy: 0.9586
Epoch 119/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1476 - accuracy: 0.9536 - val_loss: 0.1846 - val_accuracy: 0.9243
Epoch 120/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1509 - accuracy: 0.9496 - val_loss: 0.1482 - val_accuracy: 0.9443
Epoch 121/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1535 - accuracy: 0.9507 - val_loss: 0.1269 - val_accuracy: 0.9514
Epoch 122/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1514 - accuracy: 0.9479 - val_loss: 0.1866 - val_accuracy: 0.9371
Epoch 123/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1547 - accuracy: 0.9464 - val_loss: 0.1302 - val_accuracy: 0.9457
Epoch 124/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1606 - accuracy: 0.9461 - val_loss: 0.1259 - val_accuracy: 0.9486
Epoch 125/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1455 - accuracy: 0.9550 - val_loss: 0.1187 - val_accuracy: 0.9543
Epoch 126/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1464 - accuracy: 0.9532 - val_loss: 0.1334 - val_accuracy: 0.9514
Epoch 127/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1517 - accuracy: 0.9489 - val_loss: 0.1333 - val_accuracy: 0.9486
Epoch 128/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1433 - accuracy: 0.9529 - val_loss: 0.1385 - val_accuracy: 0.9457
Epoch 129/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1507 - accuracy: 0.9539 - val_loss: 0.1254 - val_accuracy: 0.9500
Epoch 130/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1598 - accuracy: 0.9446 - val_loss: 0.2434 - val_accuracy: 0.9071
Epoch 131/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1616 - accuracy: 0.9471 - val_loss: 0.1288 - val_accuracy: 0.9471
Epoch 132/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1564 - accuracy: 0.9493 - val_loss: 0.1120 - val_accuracy: 0.9586
Epoch 133/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1508 - accuracy: 0.9475 - val_loss: 0.1180 - val_accuracy: 0.9629
Epoch 134/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1617 - accuracy: 0.9486 - val_loss: 0.1150 - val_accuracy: 0.9571
Epoch 135/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1399 - accuracy: 0.9539 - val_loss: 0.1171 - val_accuracy: 0.9529
Epoch 136/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1465 - accuracy: 0.9518 - val_loss: 0.1296 - val_accuracy: 0.9514
Epoch 137/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1375 - accuracy: 0.9532 - val_loss: 0.1163 - val_accuracy: 0.9557
Epoch 138/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1262 - accuracy: 0.9593 - val_loss: 0.1258 - val_accuracy: 0.9571
Epoch 139/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1484 - accuracy: 0.9496 - val_loss: 0.1105 - val_accuracy: 0.9557
Epoch 140/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1446 - accuracy: 0.9554 - val_loss: 0.1699 - val_accuracy: 0.9371
Epoch 141/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1596 - accuracy: 0.9468 - val_loss: 0.1367 - val_accuracy: 0.9471
Epoch 142/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1369 - accuracy: 0.9525 - val_loss: 0.1711 - val_accuracy: 0.9429
Epoch 143/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1561 - accuracy: 0.9482 - val_loss: 0.1381 - val_accuracy: 0.9429
Epoch 144/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1400 - accuracy: 0.9518 - val_loss: 0.1211 - val_accuracy: 0.9486
Epoch 145/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1404 - accuracy: 0.9546 - val_loss: 0.1045 - val_accuracy: 0.9586
Epoch 146/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1438 - accuracy: 0.9504 - val_loss: 0.1238 - val_accuracy: 0.9500
Epoch 147/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1647 - accuracy: 0.9446 - val_loss: 0.1412 - val_accuracy: 0.9486
Epoch 148/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1352 - accuracy: 0.9507 - val_loss: 0.1131 - val_accuracy: 0.9543
Epoch 149/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1386 - accuracy: 0.9550 - val_loss: 0.1353 - val_accuracy: 0.9514
Epoch 150/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1377 - accuracy: 0.9543 - val_loss: 0.1352 - val_accuracy: 0.9471
Epoch 151/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1381 - accuracy: 0.9571 - val_loss: 0.1122 - val_accuracy: 0.9529
Epoch 152/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1370 - accuracy: 0.9546 - val_loss: 0.0984 - val_accuracy: 0.9614
Epoch 153/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1317 - accuracy: 0.9596 - val_loss: 0.1136 - val_accuracy: 0.9614
Epoch 154/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1472 - accuracy: 0.9489 - val_loss: 0.1350 - val_accuracy: 0.9457
Epoch 155/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1373 - accuracy: 0.9550 - val_loss: 0.1048 - val_accuracy: 0.9586
Epoch 156/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1535 - accuracy: 0.9496 - val_loss: 0.1170 - val_accuracy: 0.9543
Epoch 157/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1366 - accuracy: 0.9557 - val_loss: 0.1130 - val_accuracy: 0.9571
Epoch 158/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1444 - accuracy: 0.9539 - val_loss: 0.1274 - val_accuracy: 0.9529
Epoch 159/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1321 - accuracy: 0.9600 - val_loss: 0.1209 - val_accuracy: 0.9557
Epoch 160/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1421 - accuracy: 0.9514 - val_loss: 0.1653 - val_accuracy: 0.9414
Epoch 161/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1349 - accuracy: 0.9532 - val_loss: 0.1046 - val_accuracy: 0.9586
Epoch 162/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1375 - accuracy: 0.9521 - val_loss: 0.1067 - val_accuracy: 0.9571
Epoch 163/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1424 - accuracy: 0.9514 - val_loss: 0.1285 - val_accuracy: 0.9500
Epoch 164/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1425 - accuracy: 0.9529 - val_loss: 0.1265 - val_accuracy: 0.9543
Epoch 165/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1290 - accuracy: 0.9525 - val_loss: 0.1110 - val_accuracy: 0.9586
Epoch 166/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1376 - accuracy: 0.9571 - val_loss: 0.1378 - val_accuracy: 0.9486
Epoch 167/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1439 - accuracy: 0.9486 - val_loss: 0.1643 - val_accuracy: 0.9314
Epoch 168/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1437 - accuracy: 0.9504 - val_loss: 0.1118 - val_accuracy: 0.9600
Epoch 169/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1420 - accuracy: 0.9543 - val_loss: 0.1265 - val_accuracy: 0.9471
Epoch 170/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1431 - accuracy: 0.9554 - val_loss: 0.1058 - val_accuracy: 0.9657
Epoch 171/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1403 - accuracy: 0.9557 - val_loss: 0.1324 - val_accuracy: 0.9500
Epoch 172/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1353 - accuracy: 0.9543 - val_loss: 0.1279 - val_accuracy: 0.9543
Epoch 173/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1493 - accuracy: 0.9539 - val_loss: 0.2212 - val_accuracy: 0.9100
Epoch 174/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1475 - accuracy: 0.9493 - val_loss: 0.1191 - val_accuracy: 0.9500
Epoch 175/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1336 - accuracy: 0.9568 - val_loss: 0.1047 - val_accuracy: 0.9614
Epoch 176/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1279 - accuracy: 0.9571 - val_loss: 0.0985 - val_accuracy: 0.9614
Epoch 177/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1333 - accuracy: 0.9536 - val_loss: 0.1006 - val_accuracy: 0.9571
Epoch 178/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1360 - accuracy: 0.9579 - val_loss: 0.1383 - val_accuracy: 0.9486
Epoch 179/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1364 - accuracy: 0.9554 - val_loss: 0.1418 - val_accuracy: 0.9457
Epoch 180/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1317 - accuracy: 0.9593 - val_loss: 0.1097 - val_accuracy: 0.9600
Epoch 181/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1374 - accuracy: 0.9546 - val_loss: 0.1050 - val_accuracy: 0.9629
Epoch 182/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1422 - accuracy: 0.9521 - val_loss: 0.1164 - val_accuracy: 0.9543
Epoch 183/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1368 - accuracy: 0.9550 - val_loss: 0.1036 - val_accuracy: 0.9543
Epoch 184/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1329 - accuracy: 0.9579 - val_loss: 0.1587 - val_accuracy: 0.9414
Epoch 185/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1402 - accuracy: 0.9550 - val_loss: 0.1187 - val_accuracy: 0.9500
Epoch 186/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1292 - accuracy: 0.9586 - val_loss: 0.1053 - val_accuracy: 0.9657
Epoch 187/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1339 - accuracy: 0.9568 - val_loss: 0.1121 - val_accuracy: 0.9614
Epoch 188/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1553 - accuracy: 0.9479 - val_loss: 0.1491 - val_accuracy: 0.9414
Epoch 189/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1339 - accuracy: 0.9571 - val_loss: 0.1258 - val_accuracy: 0.9514
Epoch 190/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1341 - accuracy: 0.9525 - val_loss: 0.1139 - val_accuracy: 0.9643
Epoch 191/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1366 - accuracy: 0.9539 - val_loss: 0.1257 - val_accuracy: 0.9500
Epoch 192/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1410 - accuracy: 0.9557 - val_loss: 0.1471 - val_accuracy: 0.9457
Epoch 193/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1426 - accuracy: 0.9479 - val_loss: 0.1651 - val_accuracy: 0.9371
Epoch 194/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1345 - accuracy: 0.9604 - val_loss: 0.1117 - val_accuracy: 0.9600
Epoch 195/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1492 - accuracy: 0.9518 - val_loss: 0.1268 - val_accuracy: 0.9443
Epoch 196/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1443 - accuracy: 0.9529 - val_loss: 0.1051 - val_accuracy: 0.9600
Epoch 197/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1372 - accuracy: 0.9568 - val_loss: 0.1783 - val_accuracy: 0.9300
Epoch 198/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1399 - accuracy: 0.9546 - val_loss: 0.1161 - val_accuracy: 0.9529
Epoch 199/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1403 - accuracy: 0.9539 - val_loss: 0.1138 - val_accuracy: 0.9543
Epoch 200/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1265 - accuracy: 0.9571 - val_loss: 0.1237 - val_accuracy: 0.9486
Epoch 201/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1451 - accuracy: 0.9525 - val_loss: 0.1170 - val_accuracy: 0.9514
Epoch 202/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1429 - accuracy: 0.9500 - val_loss: 0.1027 - val_accuracy: 0.9629
Epoch 203/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1470 - accuracy: 0.9525 - val_loss: 0.1040 - val_accuracy: 0.9629
Epoch 204/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1489 - accuracy: 0.9504 - val_loss: 0.1219 - val_accuracy: 0.9486
Epoch 205/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1415 - accuracy: 0.9511 - val_loss: 0.1097 - val_accuracy: 0.9557
Epoch 206/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1425 - accuracy: 0.9518 - val_loss: 0.1096 - val_accuracy: 0.9586
Epoch 207/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1307 - accuracy: 0.9625 - val_loss: 0.1068 - val_accuracy: 0.9600
Epoch 208/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1454 - accuracy: 0.9525 - val_loss: 0.1443 - val_accuracy: 0.9457
Epoch 209/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1389 - accuracy: 0.9554 - val_loss: 0.1810 - val_accuracy: 0.9400
Epoch 210/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1407 - accuracy: 0.9500 - val_loss: 0.1523 - val_accuracy: 0.9414
Epoch 211/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1399 - accuracy: 0.9554 - val_loss: 0.1785 - val_accuracy: 0.9371
Epoch 212/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1411 - accuracy: 0.9504 - val_loss: 0.1075 - val_accuracy: 0.9614
Epoch 213/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1570 - accuracy: 0.9496 - val_loss: 0.2404 - val_accuracy: 0.9086
Epoch 214/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1509 - accuracy: 0.9518 - val_loss: 0.1124 - val_accuracy: 0.9671
Epoch 215/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1455 - accuracy: 0.9557 - val_loss: 0.1515 - val_accuracy: 0.9414
Epoch 216/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1502 - accuracy: 0.9518 - val_loss: 0.1673 - val_accuracy: 0.9471
Epoch 217/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1310 - accuracy: 0.9575 - val_loss: 0.1239 - val_accuracy: 0.9514
Epoch 218/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1307 - accuracy: 0.9536 - val_loss: 0.1079 - val_accuracy: 0.9643
Epoch 219/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1204 - accuracy: 0.9607 - val_loss: 0.1268 - val_accuracy: 0.9557
Epoch 220/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1261 - accuracy: 0.9589 - val_loss: 0.1143 - val_accuracy: 0.9586
Epoch 221/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1246 - accuracy: 0.9600 - val_loss: 0.0982 - val_accuracy: 0.9614
Epoch 222/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1213 - accuracy: 0.9629 - val_loss: 0.1419 - val_accuracy: 0.9443
Epoch 223/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1294 - accuracy: 0.9579 - val_loss: 0.1168 - val_accuracy: 0.9557
Epoch 224/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1232 - accuracy: 0.9561 - val_loss: 0.2258 - val_accuracy: 0.9071
Epoch 225/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1341 - accuracy: 0.9536 - val_loss: 0.1150 - val_accuracy: 0.9571
Epoch 226/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1284 - accuracy: 0.9593 - val_loss: 0.1303 - val_accuracy: 0.9443
Epoch 227/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1173 - accuracy: 0.9632 - val_loss: 0.1289 - val_accuracy: 0.9514
Epoch 228/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1441 - accuracy: 0.9550 - val_loss: 0.1377 - val_accuracy: 0.9471
Epoch 229/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1303 - accuracy: 0.9600 - val_loss: 0.1079 - val_accuracy: 0.9643
Epoch 230/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1417 - accuracy: 0.9489 - val_loss: 0.1445 - val_accuracy: 0.9457
Epoch 231/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1250 - accuracy: 0.9561 - val_loss: 0.0988 - val_accuracy: 0.9657
Epoch 232/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1479 - accuracy: 0.9496 - val_loss: 0.1167 - val_accuracy: 0.9586
Epoch 233/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1478 - accuracy: 0.9500 - val_loss: 0.1340 - val_accuracy: 0.9457
Epoch 234/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1305 - accuracy: 0.9554 - val_loss: 0.1056 - val_accuracy: 0.9600
Epoch 235/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1177 - accuracy: 0.9636 - val_loss: 0.1089 - val_accuracy: 0.9586
Epoch 236/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1168 - accuracy: 0.9629 - val_loss: 0.1022 - val_accuracy: 0.9614
Epoch 237/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1389 - accuracy: 0.9507 - val_loss: 0.1460 - val_accuracy: 0.9500
Epoch 238/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1364 - accuracy: 0.9521 - val_loss: 0.1823 - val_accuracy: 0.9300
Epoch 239/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1326 - accuracy: 0.9582 - val_loss: 0.1411 - val_accuracy: 0.9471
Epoch 240/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1220 - accuracy: 0.9625 - val_loss: 0.1204 - val_accuracy: 0.9557
Epoch 241/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1247 - accuracy: 0.9596 - val_loss: 0.1043 - val_accuracy: 0.9571
Epoch 242/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1389 - accuracy: 0.9546 - val_loss: 0.1036 - val_accuracy: 0.9643
Epoch 243/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1222 - accuracy: 0.9568 - val_loss: 0.1037 - val_accuracy: 0.9600
Epoch 244/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1157 - accuracy: 0.9621 - val_loss: 0.0920 - val_accuracy: 0.9671
Epoch 245/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1234 - accuracy: 0.9571 - val_loss: 0.1124 - val_accuracy: 0.9571
Epoch 246/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1297 - accuracy: 0.9539 - val_loss: 0.1083 - val_accuracy: 0.9571
Epoch 247/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1297 - accuracy: 0.9571 - val_loss: 0.1354 - val_accuracy: 0.9471
Epoch 248/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1197 - accuracy: 0.9582 - val_loss: 0.1049 - val_accuracy: 0.9586
Epoch 249/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1295 - accuracy: 0.9539 - val_loss: 0.1042 - val_accuracy: 0.9600
Epoch 250/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1245 - accuracy: 0.9593 - val_loss: 0.1897 - val_accuracy: 0.9371
Epoch 251/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1374 - accuracy: 0.9532 - val_loss: 0.1129 - val_accuracy: 0.9557
Epoch 252/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1434 - accuracy: 0.9511 - val_loss: 0.1369 - val_accuracy: 0.9429
Epoch 253/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1332 - accuracy: 0.9571 - val_loss: 0.1243 - val_accuracy: 0.9529
Epoch 254/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1257 - accuracy: 0.9557 - val_loss: 0.2038 - val_accuracy: 0.9286
Epoch 255/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1403 - accuracy: 0.9571 - val_loss: 0.1530 - val_accuracy: 0.9443
Epoch 256/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1379 - accuracy: 0.9536 - val_loss: 0.1398 - val_accuracy: 0.9514
Epoch 257/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1338 - accuracy: 0.9561 - val_loss: 0.0977 - val_accuracy: 0.9657
Epoch 258/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1366 - accuracy: 0.9557 - val_loss: 0.1307 - val_accuracy: 0.9514
Epoch 259/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1176 - accuracy: 0.9621 - val_loss: 0.1414 - val_accuracy: 0.9486
Epoch 260/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1197 - accuracy: 0.9607 - val_loss: 0.1083 - val_accuracy: 0.9629
Epoch 261/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1348 - accuracy: 0.9546 - val_loss: 0.1371 - val_accuracy: 0.9500
Epoch 262/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1234 - accuracy: 0.9654 - val_loss: 0.1067 - val_accuracy: 0.9600
Epoch 263/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1109 - accuracy: 0.9689 - val_loss: 0.1030 - val_accuracy: 0.9614
Epoch 264/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1131 - accuracy: 0.9621 - val_loss: 0.0925 - val_accuracy: 0.9657
Epoch 265/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1169 - accuracy: 0.9614 - val_loss: 0.1166 - val_accuracy: 0.9557
Epoch 266/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1164 - accuracy: 0.9614 - val_loss: 0.1149 - val_accuracy: 0.9600
Epoch 267/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1131 - accuracy: 0.9636 - val_loss: 0.1084 - val_accuracy: 0.9600
Epoch 268/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1236 - accuracy: 0.9582 - val_loss: 0.1093 - val_accuracy: 0.9571
Epoch 269/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1270 - accuracy: 0.9532 - val_loss: 0.1077 - val_accuracy: 0.9629
Epoch 270/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1230 - accuracy: 0.9589 - val_loss: 0.0978 - val_accuracy: 0.9671
Epoch 271/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1328 - accuracy: 0.9586 - val_loss: 0.1045 - val_accuracy: 0.9614
Epoch 272/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1217 - accuracy: 0.9543 - val_loss: 0.1333 - val_accuracy: 0.9543
Epoch 273/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1269 - accuracy: 0.9539 - val_loss: 0.1784 - val_accuracy: 0.9429
Epoch 274/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1228 - accuracy: 0.9586 - val_loss: 0.0999 - val_accuracy: 0.9614
Epoch 275/300
12/12 [==============================] - 0s 6ms/step - loss: 0.1199 - accuracy: 0.9571 - val_loss: 0.0958 - val_accuracy: 0.9686
Epoch 276/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1193 - accuracy: 0.9614 - val_loss: 0.1149 - val_accuracy: 0.9500
Epoch 277/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1129 - accuracy: 0.9621 - val_loss: 0.1161 - val_accuracy: 0.9571
Epoch 278/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1069 - accuracy: 0.9643 - val_loss: 0.0972 - val_accuracy: 0.9671
Epoch 279/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1164 - accuracy: 0.9650 - val_loss: 0.0964 - val_accuracy: 0.9614
Epoch 280/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1175 - accuracy: 0.9593 - val_loss: 0.1083 - val_accuracy: 0.9600
Epoch 281/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1262 - accuracy: 0.9589 - val_loss: 0.0945 - val_accuracy: 0.9657
Epoch 282/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1193 - accuracy: 0.9575 - val_loss: 0.1182 - val_accuracy: 0.9571
Epoch 283/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1143 - accuracy: 0.9621 - val_loss: 0.1048 - val_accuracy: 0.9614
Epoch 284/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1253 - accuracy: 0.9596 - val_loss: 0.1298 - val_accuracy: 0.9529
Epoch 285/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1150 - accuracy: 0.9643 - val_loss: 0.1042 - val_accuracy: 0.9571
Epoch 286/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1174 - accuracy: 0.9600 - val_loss: 0.1226 - val_accuracy: 0.9557
Epoch 287/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1076 - accuracy: 0.9657 - val_loss: 0.1036 - val_accuracy: 0.9629
Epoch 288/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1266 - accuracy: 0.9568 - val_loss: 0.0924 - val_accuracy: 0.9657
Epoch 289/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1233 - accuracy: 0.9589 - val_loss: 0.1034 - val_accuracy: 0.9643
Epoch 290/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1154 - accuracy: 0.9621 - val_loss: 0.1005 - val_accuracy: 0.9643
Epoch 291/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1134 - accuracy: 0.9643 - val_loss: 0.1073 - val_accuracy: 0.9600
Epoch 292/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1145 - accuracy: 0.9618 - val_loss: 0.1249 - val_accuracy: 0.9514
Epoch 293/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1233 - accuracy: 0.9600 - val_loss: 0.1213 - val_accuracy: 0.9600
Epoch 294/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1291 - accuracy: 0.9586 - val_loss: 0.0962 - val_accuracy: 0.9686
Epoch 295/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1203 - accuracy: 0.9621 - val_loss: 0.1069 - val_accuracy: 0.9586
Epoch 296/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1214 - accuracy: 0.9607 - val_loss: 0.1070 - val_accuracy: 0.9629
Epoch 297/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1217 - accuracy: 0.9618 - val_loss: 0.1274 - val_accuracy: 0.9514
Epoch 298/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1182 - accuracy: 0.9586 - val_loss: 0.0983 - val_accuracy: 0.9614
Epoch 299/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1120 - accuracy: 0.9639 - val_loss: 0.1337 - val_accuracy: 0.9500
Epoch 300/300
12/12 [==============================] - 0s 7ms/step - loss: 0.1213 - accuracy: 0.9586 - val_loss: 0.1059 - val_accuracy: 0.9600

Next, let's plot the loss and accuracy curves:

In [ ]:
fig, ax = plt.subplots(1,2, figsize=(16,8))
ax[0].plot(history2.history['loss'], color='b', label="Training loss")
ax[0].plot(history2.history['val_loss'], color='r', label="validation loss",axes =ax[0])
legend = ax[0].legend(loc='best', shadow=True)

ax[1].plot(history2.history['accuracy'], color='b', label="Training accuracy")
ax[1].plot(history2.history['val_accuracy'], color='r',label="Validation accuracy")
legend = ax[1].legend(loc='best', shadow=True)
No description has been provided for this image
In [ ]:
y_pred_2 = model_2.predict(X_test)
47/47 [==============================] - 0s 1ms/step
In [ ]:
y_pred_res2 = np.argmax(y_pred_2, axis=1)
In [ ]:
Y_test_res2 = np.argmax(Y_test, axis=1)
In [ ]:
print(classification_report(y_pred_res2, Y_test_res2))
              precision    recall  f1-score   support

           0       1.00      0.98      0.99       290
           1       0.98      0.96      0.97       315
           2       0.93      0.95      0.94       284
           3       0.97      0.97      0.97       315
           4       0.95      0.96      0.95       296

    accuracy                           0.96      1500
   macro avg       0.96      0.96      0.96      1500
weighted avg       0.96      0.96      0.96      1500

In [ ]:
c_matrix = confusion_matrix(Y_test_res2, y_pred_res2)
In [ ]:
names = ['Agua','Vegetação','Solo','Agricultura','Infra_urbana']
In [ ]:
r1 = pd.DataFrame(data=c_matrix, index= names, columns=names)
fig, ax = plt.subplots(figsize=(8,8))
ax = sns.heatmap(r1, annot=True, annot_kws={"size": 18},fmt='d',cmap="Blues", cbar = False)
#for t in ax.texts: t.set_text(t.get_text() + " %")
ax.tick_params(labelsize=16)
ax.set_yticklabels(names, rotation=45)
ax.set_ylabel('True')
ax.set_xlabel('Predict')
Out[ ]:
Text(0.5, 58.5815972222222, 'Predict')
No description has been provided for this image