import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.cluster import KMeans
10 Unsupervised Learning
10.1 K-Means Clustering
So far, we have explored various supervised learning algorithms such as Decision Trees and Random Forests, which rely on labeled data with known outcomes. In contrast, unsupervised learning techniques analyze unlabeled data to identify patterns, making them particularly useful for clustering and association problems. Among these, K-means clustering stands out as one of the simplest and most widely used algorithms.
K-means clustering aims to divide a dataset into non-overlapping groups based on similarity. Given a set of data points, each represented as a vector in a multi-dimensional space, the algorithm assigns each point to one of \(k\) clusters in a way that minimizes the variation within each cluster. This is done by reducing the sum of squared distances between each point and its assigned cluster center. Mathematically, we seek to minimize:
\[\begin{equation*} \sum_{i=1}^{k}\sum_{\boldsymbol{x}\in S_i} \left\|\boldsymbol{x}-\boldsymbol{\mu}_i\right\|^2 \end{equation*}\]
where \(S_i\) represents each cluster and \(\boldsymbol{\mu}_i\) is the mean of the points within that cluster.
10.1.1 Lloyd’s Algorithm
K-means clustering is typically solved using Lloyd’s algorithm, which operates iteratively as follows:
- Initialization: Select \(k\) initial cluster centroids \(\boldsymbol{\mu}_i\) randomly.
- Iteration:
Assignment step: Assign each point \(\boldsymbol{x}\) to the cluster whose centroid is closest based on the squared Euclidean distance.
Update step: Recompute the centroids as the mean of all points assigned to each cluster:
\[\begin{equation*} \boldsymbol{\mu}_i \leftarrow \frac{1}{|S_i|} \sum_{\boldsymbol{x}_j \in S_i} \boldsymbol{x}_j \end{equation*}\]
- Termination: The process stops when either the assignments no longer change or a predefined number of iterations is reached.
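The steps above can be sketched directly in NumPy. This is a minimal illustration of Lloyd's algorithm, not the optimized scikit-learn implementation; the function name and defaults are our own.

```python
import numpy as np

def lloyd_kmeans(X, k, n_iter=100, seed=0):
    """Minimal sketch of Lloyd's algorithm (illustrative only)."""
    rng = np.random.default_rng(seed)
    # Initialization: pick k distinct data points as starting centroids
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(n_iter):
        # Assignment step: nearest centroid by squared Euclidean distance
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: centroid = mean of its assigned points
        # (a cluster that loses all points keeps its old centroid)
        new_centroids = centroids.copy()
        for j in range(k):
            members = X[labels == j]
            if len(members) > 0:
                new_centroids[j] = members.mean(axis=0)
        # Termination: stop once the centroids no longer move
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels
```

On well-separated data this converges in a handful of iterations; scikit-learn's `KMeans` adds smarter initialization (`k-means++`) and multiple restarts on top of the same loop.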
10.1.2 Example: Iris Data
K-means clustering can be implemented using the scikit-learn
library. Below, we apply it to the Iris dataset.
# Load the Iris dataset
iris = datasets.load_iris()
X = iris.data[:, :2]  # Using only two features
y = iris.target
We visualize the observations based on their true species labels.
# Scatter plot of true species labels
fig, ax = plt.subplots()
scatter = ax.scatter(X[:, 0], X[:, 1], c=y,
                     cmap='viridis', edgecolors='k')
ax.legend(*scatter.legend_elements(), loc="upper left",
          title="Species")
plt.xlabel("Feature 1")
plt.ylabel("Feature 2")
plt.title("True Species Distribution")
plt.show()
Now, we apply K-means clustering to the data.
# Train K-means model
Kmean = KMeans(n_clusters=3, init='k-means++',
               n_init=10, random_state=42)
Kmean.fit(X)
KMeans(n_clusters=3, n_init=10, random_state=42)
Several parameters can be adjusted for better performance. See: <https://scikit-learn.org/stable/modules/generated/sklearn.cluster.KMeans.html>
K-means provides cluster centroids, representing the center of each cluster.
# Print predicted cluster centers
print("Cluster Centers:")
print(Kmean.cluster_centers_)
Cluster Centers:
[[6.81276596 3.07446809]
[5.77358491 2.69245283]
[5.006 3.428 ]]
We plot the centroids along with clustered points.
# Plot centroids on the scatter plot
fig, ax = plt.subplots()
ax.scatter(X[:, 0], X[:, 1], c=Kmean.labels_,
           cmap='viridis', edgecolors='k', alpha=0.5)
ax.scatter(Kmean.cluster_centers_[:, 0],
           Kmean.cluster_centers_[:, 1],
           c="black", s=200, marker='s',
           label="Centroids")
ax.legend()
plt.xlabel("Feature 1")
plt.ylabel("Feature 2")
plt.title("K-means Clustering Results")
plt.show()
10.1.2.1 Comparing True and Predicted Labels
By plotting the results side by side, we can see how well K-means clustering approximates the true labels.
# Compare true vs. predicted labels
fig, axs = plt.subplots(ncols=2, figsize=(12, 5),
                        constrained_layout=True)

# True labels plot
axs[0].scatter(X[:, 0], X[:, 1], c=y,
               cmap='viridis', alpha=0.5,
               edgecolors='k')
axs[0].set_title("True Labels")
axs[0].set_xlabel("Feature 1")
axs[0].set_ylabel("Feature 2")

# Predicted clusters plot
axs[1].scatter(X[:, 0], X[:, 1], c=Kmean.labels_,
               cmap='viridis', alpha=0.5,
               edgecolors='k')
axs[1].scatter(Kmean.cluster_centers_[:, 0],
               Kmean.cluster_centers_[:, 1],
               marker="s", c="black", s=200,
               alpha=1, label="Centroids")
axs[1].set_title("Predicted Clusters")
axs[1].set_xlabel("Feature 1")
axs[1].set_ylabel("Feature 2")
axs[1].legend()
plt.show()
10.1.3 Making Predictions on New Data
Once trained, the model can classify new data points.
# Sample test data points
sample_test = np.array([[3, 4], [7, 4]])

# Predict cluster assignment
print("Predicted Clusters:", Kmean.predict(sample_test))
Predicted Clusters: [2 0]
10.1.4 Discussion
K-means is intuitive but has limitations:
- Sensitivity to initialization: Poor initialization can yield suboptimal results. The k-means++ scheme mitigates this issue.
- Choosing the number of clusters: The choice of \(k\) is critical. The elbow method helps determine an optimal value.
- Assumption of spherical clusters: K-means struggles when clusters have irregular shapes. Alternative methods such as kernel-based clustering may be more effective.
Despite its limitations, K-means is a fundamental tool in exploratory data analysis and practical applications.
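The elbow method mentioned above can be sketched with scikit-learn: fit K-means for a range of \(k\) values and plot the inertia (within-cluster sum of squares). The "elbow" where the curve flattens suggests a reasonable \(k\). A minimal sketch on the two Iris features used in this chapter:

```python
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X = load_iris().data[:, :2]  # same two features as in the example above

# inertia (within-cluster sum of squares) for k = 1..8
ks = range(1, 9)
inertias = []
for k in ks:
    km = KMeans(n_clusters=k, n_init=10, random_state=42).fit(X)
    inertias.append(km.inertia_)

plt.plot(list(ks), inertias, marker='o')
plt.xlabel("Number of clusters k")
plt.ylabel("Inertia")
plt.title("Elbow Method")
plt.show()
```

Inertia always decreases as \(k\) grows, so the point of diminishing returns, rather than the minimum, is what guides the choice.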
10.2 K-Prototypes Clustering
This section was prepared by Mario Tomaino, a Senior majoring in Mathematics/Statistics. This section explores K-Prototypes clustering, an unsupervised machine learning algorithm used to cluster data that contains both numerical and categorical variables.
We will be generating a dataset containing individuals of certain ages and incomes, and clustering them based on their technology product preferences. We will then create some visualizations to further understand the clusters and characteristics of the individuals.
10.2.1 What is K-Prototypes Clustering?
- Clustering is the unsupervised classification of patterns into groups
- It helps group similar observations into distinct, interpretable clusters
- Traditional algorithms like k-means only work with numerical data
- Real-world data often includes both numerical and categorical features, which makes K-Prototypes necessary
10.2.2 K-Means Clustering
- Works with numerical data only
- Implemented using the scikit-learn library
- Uses Euclidean distance to measure similarity (distance between two points)
- Here, cluster centroids are the mean of all points in the cluster
- Common in applications with quantitative features (incident zip, latitude, longitude)
10.2.3 K-Modes Clustering
- Designed for categorical data only
- Uses dissimilarity (counts how many attributes differ)
- Here, centroids are the mode (most common category)
- Useful when data contains strings or categories (borough, complaint type)
10.2.4 Why K-Prototypes?
- Many datasets include both numbers (latitude, longitude) and categories (borough, complaint type)
- K-Means: numerical only
- K-Modes: categorical only
- K-Prototypes: combines both types into a single clustering algorithm
- Here, the centroid is a mix:
- Mean for numeric features
- Mode for categorical features
10.2.5 Why is the centroid important?
The algorithm uses centroids to:
- Measure how close a data point is to a cluster
- Reassign points to the closest cluster
- Update the cluster’s center as new points are assigned
This process repeats until centroids stop changing much (convergence).
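As a toy illustration of a mixed prototype (the column names and values here are hypothetical, echoing the customer example below): the numeric part of the centroid is a mean, and the categorical part is a mode.

```python
import pandas as pd

# hypothetical members of one cluster
cluster = pd.DataFrame({
    'Income': [40, 50, 45, 90],
    'Product': ['Phone', 'Phone', 'Laptop', 'Phone'],
})

# numeric part of the prototype: the mean
income_center = cluster['Income'].mean()
# categorical part of the prototype: the mode (most common category)
product_center = cluster['Product'].mode()[0]

print(income_center, product_center)  # 56.25 Phone
```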
10.2.6 How K-Prototypes Works
- K-Prototypes combines K-Means and K-Modes
- Minimizes cost function by combining:
- Euclidean distance for numeric features (numerical distance from cluster average)
- Dissimilarity for categorical features
- A hyperparameter γ (gamma) balances the weight of numerical and categorical variables
10.2.7 Similarity Measure (Distance Function)
distance = (Euclidean distance) + γ × (categorical dissimilarity)

The precise numerical formula can be found in Huang (1997), page 2.
- Measures how different each data point is from the cluster centroids
- Helps assign each point to the most appropriate cluster
- γ (gamma) balances numeric and categorical importance
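A minimal sketch of this combined distance (the function and variable names are illustrative; Huang's formulation uses the squared Euclidean distance for the numeric part):

```python
import numpy as np

def kproto_distance(x_num, x_cat, c_num, c_cat, gamma=0.5):
    # squared Euclidean distance over the numeric features
    num_part = float(np.sum((np.asarray(x_num) - np.asarray(c_num)) ** 2))
    # simple matching dissimilarity: count of mismatched categories
    cat_part = sum(a != b for a, b in zip(x_cat, c_cat))
    return num_part + gamma * cat_part

# point (age 30, income 50, prefers 'Phone') vs.
# prototype (age 32, income 55, mode 'Laptop')
d = kproto_distance([30, 50], ['Phone'], [32, 55], ['Laptop'], gamma=0.5)
print(d)  # 4 + 25 + 0.5 * 1 = 29.5
```

A larger γ makes one categorical mismatch count for more relative to the numeric spread, which is why the choice of γ matters in the example below.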
10.2.8 Python Example
Basic example of a dataset containing customers that we will cluster based on Age, Income, and Preferred Product.
Here, we choose γ = 0.5 because it gives us an equal trade-off between the numeric and categorical parts of the distance function.
This is the code we will use to generate our dataset.
import pandas as pd
import numpy as np
from kmodes.kprototypes import KPrototypes

np.random.seed(42)
# generate 25 random ages between 20 and 59
ages = np.random.randint(20, 60, size=25)
# generate 25 random incomes between 30 and 119
# (thousands of dollars annually)
incomes = np.random.randint(30, 120, size=25)
# generate 25 random product categories, each with their own probability
products = np.random.choice(['Phone', 'Laptop', 'Tablet', 'Accessory'], size=25,
                            p=[0.4, 0.3, 0.2, 0.1])

df = pd.DataFrame({
    'Age': ages,
    'Income': incomes,
    'Product': products
})

# fitting k-prototypes models with several different gamma values
results = {}
for gamma in [0.1, 0.5, 1.0]:
    kproto = KPrototypes(
        n_clusters=2,
        init='Huang',
        random_state=42,
        gamma=gamma
    )
    clusters = kproto.fit_predict(df.to_numpy(), categorical=[2])
    results[gamma] = clusters

# choose gamma value and add to dataframe
chosen_gamma = 0.5
df['Cluster'] = results[chosen_gamma]

print(f"Using gamma = {chosen_gamma}")
print(df.head(10))
Using gamma = 0.5
Age Income Product Cluster
0 58 118 Tablet 0
1 48 78 Phone 0
2 34 88 Phone 0
3 27 71 Laptop 0
4 40 89 Laptop 0
5 58 109 Phone 0
6 38 44 Laptop 1
7 42 91 Phone 0
8 30 91 Accessory 0
9 30 76 Phone 0
Above, we see the first 10 rows of our dataset of 25 random customers.
10.2.9 Scatter Plot
This code will create a scatter plot visualizing our K-Prototypes clustering, splitting our customers into two clusters.
# visualizing cluster assignments for the chosen gamma value
import matplotlib.pyplot as plt

plt.figure(figsize=(8, 6))

# scatter plot by cluster
for cluster in df['Cluster'].unique():
    subset = df[df['Cluster'] == cluster]
    plt.scatter(
        subset['Age'],
        subset['Income'],
        label=f'Cluster {cluster}',
        s=100,
        edgecolor='black'
    )

plt.title(f'K-Prototypes Clustering (γ = {chosen_gamma})')
plt.xlabel('Age')
plt.ylabel('Income')
plt.legend()
plt.tight_layout()
plt.show()
Our model created two clusters:
- Cluster 0 (Blue):
  - Contains roughly 19/25 observations
  - Wide age range (late 20s through late 50s)
  - Mostly moderate to high incomes (roughly 70-120)
- Cluster 1 (Orange):
  - Contains roughly 6/25 observations; the smaller group
  - Younger age range (20s through early 40s)
  - Mostly lower incomes (30-50)
10.2.10 Bar Chart For Product Counts
The following code will create a bar chart of product counts: the raw counts of each product within each cluster, i.e., the number of customers in each cluster who prefer each product.
The following code will also give us the proportions of each product for each cluster, or the percentage of customers per cluster that prefer a given product.
# computing counts of products in each cluster
counts = pd.crosstab(df['Cluster'], df['Product'])
print("Raw counts:\n", counts)

# normalize counts to proportions per cluster
# divide each row by its total so values sum to 1
props = counts.div(counts.sum(axis=1), axis=0)
print("\nProportions:\n", props)

# plotting a grouped bar chart of product proportions
import matplotlib.pyplot as plt

props.plot(kind='bar', figsize=(8, 5))
plt.title(f'Product Preference by Cluster (γ={chosen_gamma})')
plt.xlabel('Cluster')
plt.ylabel('Proportion of Products')
plt.legend(title='Product', bbox_to_anchor=(1.02, 1))
plt.tight_layout()
plt.show()
Raw counts:
Product Accessory Laptop Phone Tablet
Cluster
0 2 6 8 2
1 2 1 3 1
Proportions:
Product Accessory Laptop Phone Tablet
Cluster
0 0.111111 0.333333 0.444444 0.111111
1 0.285714 0.142857 0.428571 0.142857
Bar Chart Analysis

|           | Accessory | Laptop    | Phone     | Tablet    |
|-----------|-----------|-----------|-----------|-----------|
| Cluster 0 | 2 (11.1%) | 6 (33.3%) | 8 (44.4%) | 2 (11.1%) |
| Cluster 1 | 2 (28.6%) | 1 (14.3%) | 3 (42.9%) | 1 (14.3%) |
- Cluster 0: (larger, higher-income & mixed-age cluster)
- Phones are the most popular (44.4% of purchases)
- Laptops follow (33.3% of purchases)
- Very few Accessory and Tablet purchases
- Cluster 1: (smaller, younger & lower income cluster)
- Phone is still the most popular (42.9%)
- Accessory increases (28.6%)
- Laptops drop considerably (14.3%), and tablets are slightly higher (14.3%)
We can see that both groups favor phones, although Cluster 0 purchases more laptops, and Cluster 1 purchases more accessories.
These results suggest to us that the younger/lower-income cluster is less likely to purchase the higher-priced laptops and more likely to pick smaller items, like accessories.
10.2.11 Real-World Use Cases
- Patient Profiling in Healthcare
- Age, BMI, Cholesterol level (Numeric)
- Smoking status, Medical history (Categorical)
- Insurance Customer Risk Grouping
- Age, Annual Premium (Numeric)
- Car type, Marital status (Categorical)
- Socioeconomic Grouping
- Household income, Number of dependents (Numeric)
- Home ownership status, Education level (categorical)
10.2.12 Conclusion
- K-Prototypes combines numerical and categorical variables into one clustering algorithm
- It is ideal for real-world datasets where not all features are numbers
- It is also customizable, as the γ (gamma) parameter allows you to balance numerical and categorical importance
10.2.13 Further Readings
10.3 Stochastic Neighbor Embedding
Stochastic Neighbor Embedding (SNE) is a dimensionality reduction technique used to project high-dimensional data into a lower-dimensional space (often 2D or 3D) while preserving local neighborhoods of points. It is particularly popular for visualization tasks, helping to reveal clusters or groupings among similar points. Key characteristics include:
- Unsupervised: It does not require labels, relying on similarity or distance metrics among data points.
- Probabilistic framework: Pairwise distances in the original space are interpreted as conditional probabilities, which SNE attempts to replicate in the lower-dimensional space.
- Common for exploratory data analysis: Especially useful for high-dimensional datasets such as images, text embeddings, or genetic data.
10.3.1 Statistical Rationale
The core idea behind SNE is to preserve local neighborhoods of each point in the data:
For each point \(x_i\) in the high-dimensional space, SNE defines a conditional probability \(p_{j|i}\) that represents how likely \(x_j\) is a neighbor of \(x_i\).
The probability \(p_{j|i}\) is modeled using a Gaussian distribution centered on \(x_i\):
\[ p_{j|i} = \frac{\exp\left(- \| x_i - x_j \|^2 / 2 \sigma_i^2\right)}{\sum_{k \neq i} \exp\left(- \| x_i - x_k \|^2 / 2 \sigma_i^2\right)}, \] where \(\sigma_i\) is a variance parameter controlling the neighborhood size.
Each point \(x_i\) is mapped to a lower-dimensional counterpart \(y_i\), and a corresponding probability \(q_{j|i}\) is defined similarly in that space.
The objective function minimizes the Kullback–Leibler (KL) divergence between the high-dimensional and low-dimensional conditional probabilities, encouraging a faithful representation of local neighborhoods.
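The conditional probabilities \(p_{j|i}\) can be computed directly from the formula above. A small NumPy sketch for one point with a fixed \(\sigma_i\) (in practice each \(\sigma_i\) is tuned so the neighborhood matches a target perplexity; the function name is our own):

```python
import numpy as np

def conditional_probs(X, i, sigma=1.0):
    """p_{j|i}: Gaussian affinity of every point j to point i,
    normalized over all j != i."""
    d2 = np.sum((X - X[i]) ** 2, axis=1)   # squared distances to x_i
    logits = -d2 / (2 * sigma ** 2)
    logits[i] = -np.inf                    # exclude j = i from the sum
    w = np.exp(logits)
    return w / w.sum()

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
p = conditional_probs(X, 0)
# the near neighbor receives almost all of the probability mass
```

Because the Gaussian decays so fast, distant points contribute essentially nothing, which is exactly the "local neighborhood" behavior SNE preserves.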
10.3.2 t-SNE Variation
The t-SNE (t-distributed Stochastic Neighbor Embedding) addresses two main issues in the original formulation of SNE:
- The crowding problem: In high dimensions, pairwise distances tend to spread out; in 2D or 3D, they can crowd together. t-SNE uses a Student t-distribution (with one degree of freedom) in the low-dimensional space, which has heavier tails than a Gaussian.
- Symmetric probabilities: t-SNE symmetrizes probabilities \(p_{ij} = (p_{j|i} + p_{i|j}) / (2N)\), simplifying computation.
The Student t-distribution for low-dimensional similarity is given by: \[ q_{ij} = \frac{\bigl(1 + \| y_i - y_j \|^2 \bigr)^{-1}}{\sum_{k \neq l} \bigl(1 + \| y_k - y_l \|^2 \bigr)^{-1}}. \] This heavier tail ensures that distant points are not forced too close, thus reducing the crowding effect.
10.3.3 Supervised Variation
Although SNE and t-SNE are fundamentally unsupervised, it is possible to integrate label information. In a supervised variant, distances between similarly labeled points may be reduced (or differently weighted), and additional constraints can be imposed to promote class separation in the lower-dimensional embedding. These approaches can help when partial label information is available and you want to blend supervised and unsupervised insights.
10.3.4 Demonstration with a Subset of the NIST Digits Data
Below is a brief example in Python using t-SNE on a small subset of the MNIST digits (which is itself a curated subset of the original NIST data).
import numpy as np
from sklearn.datasets import fetch_openml
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

mnist = fetch_openml('mnist_784', version=1)
X = mnist.data[:2000]
y = mnist.target[:2000]

tsne = TSNE(n_components=2, perplexity=30, learning_rate='auto',
            init='random', random_state=42)
X_embedded = tsne.fit_transform(X)

# Create a separate scatter plot for each digit to show a legend
plt.figure()
digits = np.unique(y)
for digit in digits:
    idx = (y == digit)
    plt.scatter(
        X_embedded[idx, 0],
        X_embedded[idx, 1],
        label=f"Digit {digit}",
        alpha=0.5
    )
plt.title("t-SNE on a Subset of MNIST Digits (by class)")
plt.xlabel("Dimension 1")
plt.ylabel("Dimension 2")
plt.legend()
plt.show()
In the visualization:
- Points belonging to the same digit typically cluster together.
- Ambiguous or poorly written digits often end up bridging two clusters.
- Some digits, such as 3 and 5, may be visually similar and can appear partially overlapping in the 2D space.
10.4 Principal Component Analysis (PCA)
The following section is written by Mezmur Edo, a PhD student in the physics department. This section will focus on the motivation, intuition and theory behind PCA. It will also demonstrate the importance of scaling for proper implementation of PCA.
10.4.1 Motivation
Some of the motivations behind PCA are:
- Computational Efficiency
- Feature Extraction
- Visualization
- Curse of Dimensionality
10.4.1.1 Curse of Dimensionality
As the number of dimensions grows, the relative contrast between Euclidean distances (the gap between the farthest and nearest distances, relative to the nearest) shrinks. To demonstrate this, let's generate 10,000 vectors of n dimensions each, where n ranges from 2 to 49, with integer entries ranging from -100 to 100. By selecting a random vector, Q, of the same dimension, we can calculate the Euclidean distance of Q to each of these 10,000 vectors. The plot below shows the base-10 logarithm of the difference between the maximum and minimum distances divided by the minimum distance, as a function of the number of dimensions.
#import libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler, scale, normalize
import os
import math
from matplotlib.ticker import AutoMinorLocator

#define a list to store delta values
#delta is the logarithm, to the base 10, of the difference between
#the maximum and minimum Euclidean distances divided
#by the minimum distance
deltas = []

#loop through dimensions from 2 to 49
for N in range(2, 50):
    #generate 10,000 random N-dimensional vectors, P, and
    #a single random N-dimensional vector, Q
    P = [np.random.randint(-100, 100, N) for _ in range(10000)]
    Q = np.random.randint(-100, 100, N)

    #calculate the Euclidean distances between each point in P and Q
    diffs = [np.linalg.norm(p - Q) for p in P]

    #find the maximum and minimum Euclidean distances
    mxd = max(diffs)
    mnd = min(diffs)

    #calculate delta: log10 of the relative contrast
    delta = math.log10((mxd - mnd) / mnd)
    deltas.append(delta)

#plot delta versus N, the number of dimensions
plt.plot(range(2, 50), deltas)
plt.xlabel('Number of dimensions', loc='right', fontsize=10)
plt.ylabel(r'$\log_{10}\left((d_{\max}-d_{\min})/d_{\min}\right)$',
           loc='top', fontsize=10)
ax = plt.gca()

#add minor locators to the axes
ax.xaxis.set_minor_locator(AutoMinorLocator())
ax.yaxis.set_minor_locator(AutoMinorLocator())
plt.show()
10.4.2 Intuition
We aim to find orthogonal directions of maximum variance in data. Directions with sufficiently low variance in the data can be removed.
rng = np.random.RandomState(0)
n_samples = 200

#generate a 2D dataset with 200 entries from
#a multivariate normal distribution
#with covariances [[3, 3], [3, 4]]
#and mean [0, 0]
X = rng.multivariate_normal(mean=[0, 0],
                            cov=[[3, 3], [3, 4]], size=n_samples)

#perform PCA on the generated data to find
#the two principal components
pca = PCA(n_components=2).fit(X)

#plot the generated data with label 'Data'
plt.scatter(X[:, 0], X[:, 1], label='Data')

#plot the first principal component scaled by
#its explained variance
#set color, linewidth and label
first_principal_cpt_explained_var = pca.explained_variance_[0]
first_principal_cpt = [[0, pca.components_[0][0] * first_principal_cpt_explained_var],
                       [0, pca.components_[0][1] * first_principal_cpt_explained_var]]
plt.plot(first_principal_cpt[0], first_principal_cpt[1],
         color='green', linewidth=5,
         label=r'First Principal Component ($p_1$)')

#plot the second principal component scaled by
#its explained variance
#set color, linewidth and label
second_principal_cpt_explained_var = pca.explained_variance_[1]
second_principal_cpt = [[0, pca.components_[1][0] * second_principal_cpt_explained_var],
                        [0, pca.components_[1][1] * second_principal_cpt_explained_var]]
plt.plot(second_principal_cpt[0], second_principal_cpt[1],
         color='red', linewidth=5,
         label=r'Second Principal Component ($p_2$)')

plt.xlabel("First Feature", loc='right', fontsize=10)
plt.ylabel("Second Feature", loc='top', fontsize=10)
plt.legend()
plt.show()
We can then project the data onto the first principal component direction, \(p_1\).
10.4.3 Theory
Let \(x\) be a data point with features \(f_1\), \(f_2\), \(f_3\), …, \(f_n\),
\[x = \begin{pmatrix} f_1\\ f_2\\ f_3\\ .\\ .\\ .\\ f_n \end{pmatrix}. \]
The projection of \(x\) onto a direction \(p\) is then
\[x^{T} \frac{p}{||p||}.\]
Hence, the projection of all data points onto the principal component direction \(p\) can be written as
\[\begin{pmatrix} x_1^{T} \frac{p}{||p||}\\ x_2^{T} \frac{p}{||p||}\\ x_3^{T} \frac{p}{||p||}\\ .\\ .\\ .\\ x_m^{T} \frac{p}{||p||} \end{pmatrix} = X\frac{p}{||p||},\]
where:
- \(X\) is the design matrix consisting of the \(m\) data points.
10.4.3.1 The Optimization Problem
Let \(\bar{x}\) be the sample mean vector such that,
\[\bar{x} = \frac{1}{m}\sum_{i=1}^{m}x^{(i)}.\]
The sample covariance matrix is then given by,
\[S = \frac{1}{m} X^TX - \bar{x}\bar{x}^T,\]
where:
- \(S_{ij}\) is the covariance of feature \(i\) and feature \(j\).
For a sample mean of the projected data, \(\bar{a}\),
\[\bar{a} = \frac{1}{m}\sum_{i=1}^{m}x^{(i)T}p = \bar{x}^Tp,\]
the sample variance of the projected data can be written as,
\[\sigma^{2}= \frac{1}{m}\sum_{i=1}^{m}(x^{(i)T}p)^2 - \bar{a}^{2} = p^{T}Sp.\]
Then, our optimization problem simplifies to maximizing the sample variance,
\[\max_p \; p^{T}Sp \quad \text{s.t.} \quad \|p\|=1,\]
which has the following solution,
\[Sp = \lambda p.\]
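A quick numerical check of this result: the leading eigenvector of \(S\) attains the maximum of \(p^{T}Sp\) over unit vectors, and no random unit direction does better. The covariance used here mirrors the 2D example above.

```python
import numpy as np

rng = np.random.RandomState(0)
X = rng.multivariate_normal(mean=[0, 0], cov=[[3, 3], [3, 4]], size=500)

# sample covariance matrix of the mean-centered data
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / len(X)

# eigh returns eigenvalues in ascending order for a symmetric matrix
eigvals, eigvecs = np.linalg.eigh(S)
p_star = eigvecs[:, -1]         # eigenvector of the largest eigenvalue
best = p_star @ S @ p_star      # equals the largest eigenvalue

# the Rayleigh quotient of any random unit direction is no larger
for _ in range(1000):
    p = rng.normal(size=2)
    p /= np.linalg.norm(p)
    assert p @ S @ p <= best + 1e-9
```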
10.4.3.2 Scikit-learn Implementation
Computation can be done using the singular value decomposition of \(X\),
\[X = U \Sigma V^T.\]
If the data is mean-centered (the default option in scikit-learn), the sample covariance matrix is given by,
\[S = \frac{1}{m} X^TX = \frac{1}{m} V\Sigma U^T U \Sigma V^T = V\frac{1}{m}\Sigma^2V^T,\]
which is the eigenvalue decomposition of S, with its eigenvectors as the columns of \(V\) and the corresponding eigenvalues as diagonal entries of \(\frac{1}{m}\Sigma^2\).
The variance explained by the j-th principal component, \(p_j\), is \(\lambda_{j}\), and the total variance explained is the sum of all the eigenvalues, which is also equal to the trace of \(S\). The total variance explained by the first k principal components is then given by,
\[\frac{\sum_{j=1}^{k} \lambda_j}{\operatorname{trace}(S)}.\]
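In scikit-learn this quantity is exposed directly as `explained_variance_ratio_`; a quick sketch checking that it agrees with the eigenvalue formula (the normalization constant cancels in the ratio):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
X = rng.multivariate_normal(mean=[0, 0], cov=[[3, 3], [3, 4]], size=500)

pca = PCA(n_components=2).fit(X)

# lambda_j / trace(S): each eigenvalue over the sum of all eigenvalues
ratios = pca.explained_variance_ / pca.explained_variance_.sum()
print(ratios)
print(pca.explained_variance_ratio_)  # the same values
```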
10.4.4 PCA With and Without Scaling
For proper implementation of PCA, data must be scaled. To demonstrate this, we generate a dataset with the first 4 features selected from a normal distribution with mean 0 and standard deviation 1. We then append a fifth feature drawn from a uniform distribution with integer entries ranging from 1 to 10. The plot of the projection of the data onto first principal component versus the projection onto the second principal component does not show the expected noise structure unless the data is scaled.
np.random.seed(42)

#generate a feature of size 10,000 with integer entries
#ranging from 1 to 10
feature = np.random.randint(1, 10, 10000)
N = 10000
P = 4

#generate a 4D dataset drawn from a normal distribution of 10,000 entries
#then append the feature to X, making it a 5D dataset
X = np.random.normal(size=[N, P])
X = np.append(X, feature.reshape(10000, 1), axis=1)

#perform PCA with 2 components on the dataset without scaling
pca = PCA(2)
pca_no_scale = pca.fit_transform(X)

#plot the projection of the data onto the first principal
#component versus the projection onto
#the second principal component
plt.scatter(pca_no_scale[:, 0], pca_no_scale[:, 1])
plt.title("PCA without Scaling")
plt.xlabel("Principal Component 1", loc='right', fontsize=10)
plt.ylabel("Principal Component 2", loc='top', fontsize=10)
plt.show()
#scale data, mean-center and divide by the standard deviation
Xn = scale(X)

#perform PCA with 2 components on the scaled data
pca = PCA(2)
pca_scale = pca.fit_transform(Xn)

#plot the projection of the data onto the first principal
#component versus the projection onto
#the second principal component
plt.scatter(pca_scale[:, 0], pca_scale[:, 1])
plt.title("PCA with Scaling")
plt.xlabel("Principal Component 1", loc='right', fontsize=10)
plt.ylabel("Principal Component 2", loc='top', fontsize=10)
plt.show()
10.4.5 Summary
PCA is a dimensionality reduction technique that projects data onto directions which explain the most variance in the data.
The principal component directions are the eigenvectors of the sample covariance matrix and the corresponding eigenvalues represent the variances explained.
For proper implementation of PCA, data must be mean-centered (the scikit-learn default) and scaled.
10.4.6 Further Readings
10.5 Choosing the Optimal Number of Clusters
This section was contributed by Nicholas Pfeifer, a junior majoring in Statistics and minoring in Real Estate and Computer Science.
This section will cover the following:
Why use clustering? What are its applications?
K-means Clustering and Hierarchical Clustering algorithms
How to determine the optimal number of clusters
10.5.1 Why Clustering? What is it?
Clustering is an exploratory approach that looks to identify natural categories in the data. The overall goal is to place observations into groups ("clusters") based on similarities or patterns. It can be viewed as an unsupervised learning technique since the algorithm does not use a target variable to discover patterns and form groups. This is in contrast to regression, for instance, where the target variable is used in the process of generating a model. Clustering can be effective at identifying trends, patterns, or outliers in a dataset.
- Clustering is useful when…
- the true number of clusters is not known in advance
- working with large unlabeled data
- looking to detect anomalies/outliers
10.5.1.1 Applications
Clustering has a plethora of applications. Some of the most popular ones are outlined below.
- Market Research
- Customer Segmentation - grouping customers by demographics or behaviors
- Sales Analysis - based on the clusters, which groups purchase the product/service and which groups do not
- Anomaly Detection
- Banks - combat fraud by distinguishing characteristics that stand out
- Image Segmentation
- Identifying sections, objects, or regions of interest
- Classify land using satellite imagery - vegetation, industrial use, etc.
10.5.2 How to measure the quality of clustering outcome
When assigning data points to clusters, there are two aspects to consider when judging the quality of the resulting clusters:
- Intra-cluster Distance: The distance between data points within a cluster (can also be referred to as within-cluster distance)
- The smaller the distance/variation within clusters, the better the clustering result
- Ideally similar data points are clustered together
- Inter-cluster Distance: The distance between data points in separate clusters (can also be referred to as between-cluster distance)
- The larger the distance/variation between clusters, the better the clustering result
- Ideally dissimilar data points are in different clusters
In essence, the objective is for points within a cluster to be as similar to each other as possible, and for points belonging to different clusters to be as distinct as possible.
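Both criteria can be folded into a single number with the silhouette score, which compares each point's mean intra-cluster distance to its distance to the nearest other cluster; values near +1 indicate tight, well-separated clusters. A sketch on a toy dataset:

```python
import numpy as np
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# two tight, well-separated blobs of 20 points each
X = np.vstack([rng.normal(0, 0.2, size=(20, 2)),
               rng.normal(5, 0.2, size=(20, 2))])

good_labels = np.array([0] * 20 + [1] * 20)  # matches the blobs
bad_labels = np.array([0, 1] * 20)           # splits each blob arbitrarily

print(silhouette_score(X, good_labels))  # close to +1
print(silhouette_score(X, bad_labels))   # far lower
```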
The following code outputs two possible ways to cluster 10 observations from the MNIST handwritten digits dataset introduced in the Unsupervised Learning chapter of these class notes. The dimensionality of the observations has been reduced to 2 dimensions using t-SNE in order to make visualization easier.
from sklearn.datasets import fetch_openml
import numpy as np
import pandas as pd
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

mnist = fetch_openml('mnist_784', version=1)
mnist_example_df = pd.DataFrame(mnist.data)
mnist_example_df = mnist_example_df[:10]

tsne = TSNE(n_components=2, perplexity=5,
            learning_rate='auto',
            init='random', random_state=416)

mnist_example_df = tsne.fit_transform(mnist_example_df)

mnist_example_df = pd.DataFrame(mnist_example_df)
mnist_example_df.columns = ['dimension_1', 'dimension_2']

mnist_example_df['clustering_1'] = [1, 1, 3, 2, 3, 3, 2, 1, 2, 3]
mnist_example_df['clustering_2'] = [1, 1, 2, 2, 3, 3, 1, 3, 2, 2]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 5))

ax1.scatter(mnist_example_df['dimension_1'],
            mnist_example_df['dimension_2'],
            c=mnist_example_df['clustering_1'],
            cmap='rainbow')
ax1.set_xlabel('Dimension 1')
ax1.set_ylabel('Dimension 2')
ax1.set_title('Clustering 1')

ax2.scatter(mnist_example_df['dimension_1'],
            mnist_example_df['dimension_2'],
            c=mnist_example_df['clustering_2'],
            cmap='rainbow')
ax2.set_xlabel('Dimension 1')
ax2.set_ylabel('Dimension 2')
ax2.set_title('Clustering 2')

plt.tight_layout();
Here are two different clusterings. Hopefully it is apparent which clustering is preferred. Clustering 1 is better than clustering 2 since points in the same cluster are closer to each other, and the clusters themselves are further apart. Some points in clustering 2 are more similar to points of other clusters than points within their own cluster. Ideally a clustering more closely resembles the result seen in clustering 1.
10.5.3 Clustering Algorithms
There are many different clustering algorithms out there, but for simplicity this section will focus on the K-means and Hierarchical clustering algorithms.
- K-means
- Top-down approach
- Centroid based
- Hierarchical (Agglomerative)
- Bottom-up approach
- Tree-like structure
- Others include:
- K-medoids, DBSCAN, Gaussian Mixture Model, etc.
10.5.3.1 K-means Algorithm
The K-means algorithm has already been introduced in the unsupervised learning chapter, so this will serve as a brief refresher. The steps of the algorithm are as follows:
- Must specify a number of clusters k
- Data points are randomly assigned to k initial clusters
- The centroid of each cluster is calculated
- Data points are reassigned to the cluster with the closest centroid according to Euclidean distance
- Iterate the previous 2 steps until cluster assignments no longer change or a set number of iterations has been completed
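The steps above can be sketched directly in NumPy. This is a minimal illustration of Lloyd's algorithm on made-up toy data, not the scikit-learn implementation used in the examples below:

```python
import numpy as np

def kmeans_sketch(X, k, n_iter=100, seed=0):
    """Minimal Lloyd's algorithm (illustration only, not sklearn's KMeans)."""
    rng = np.random.default_rng(seed)
    # Initialization: pick k distinct data points as starting centroids
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(n_iter):
        # Assignment step: nearest centroid by squared Euclidean distance
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points
        new_centroids = np.array([X[labels == j].mean(axis=0)
                                  if (labels == j).any() else centroids[j]
                                  for j in range(k)])
        # Termination: stop once the centroids (and hence assignments) settle
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Toy data: two well-separated groups (hypothetical, not the MNIST sample)
toy = np.array([[0., 0.], [0., 1.], [1., 0.],
                [10., 10.], [10., 11.], [11., 10.]])
labels, centroids = kmeans_sketch(toy, k=2)
print(labels)  # each group ends up in its own cluster
```

The cluster numbering depends on the random initialization, but for well-separated groups like these the partition itself is stable.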
from sklearn.cluster import KMeans

mnist_example_df = mnist_example_df.drop(['clustering_1', 'clustering_2'],
                                         axis = 1)

kmeans = KMeans(n_clusters = 3, random_state = 416,
                n_init = 16).fit(mnist_example_df)

mnist_example_df['labels'] = kmeans.labels_

plt.figure(figsize=(10, 7))
plt.scatter(mnist_example_df['dimension_1'],
            mnist_example_df['dimension_2'],
            c = mnist_example_df['labels'],
            cmap = 'rainbow')
plt.scatter(kmeans.cluster_centers_[:, 0],
            kmeans.cluster_centers_[:, 1],
            marker = '*', c = 'y', label = 'Centroids',
            s = 100)

plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')
plt.title('K-means with k = 3')
plt.legend();
Here is an example clustering result of K-means clustering with k = 3 (clusters) on the same 10 MNIST observations. The final centroids are included in the plot.
10.5.3.2 Hierarchical Clustering Algorithm
Hierarchical clustering is another algorithm which differs substantially from K-means. Particularly of note is that the hierarchical approach does not require the number of clusters to be specified in advance, which can be seen as a drawback of K-means. The decision on the number of clusters can be made based on the resulting tree-like structure called a dendrogram. The steps of this algorithm are shown below:
- Each data point is initially assigned to its own cluster
- Check the distance between every possible pair of clusters
- Merge the closest pair of clusters into one cluster
- Iterate the previous 2 steps until all of the data points are in one cluster
- Cut the resulting Dendrogram
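The merging steps above can be sketched with a naive implementation on toy data. For simplicity this sketch measures the distance between clusters by centroid distance, whereas the examples below use Ward linkage via scipy and sklearn:

```python
import numpy as np

def agglomerative_sketch(X, n_clusters):
    """Naive bottom-up (agglomerative) clustering with centroid linkage."""
    # Each data point starts in its own cluster
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > n_clusters:
        best = None
        # Check the distance between every possible pair of clusters
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.linalg.norm(X[clusters[a]].mean(axis=0)
                                   - X[clusters[b]].mean(axis=0))
                if best is None or d < best[0]:
                    best = (d, a, b)
        # Merge the closest pair into one cluster
        _, a, b = best
        clusters[a] = clusters[a] + clusters.pop(b)
    return clusters

# Three obvious pairs of points (hypothetical data)
X = np.array([[0., 0.], [0.1, 0.], [5., 5.], [5.1, 5.], [10., 0.], [10.1, 0.]])
print(agglomerative_sketch(X, 3))  # each nearby pair is merged together
```

Running the loop all the way down to one cluster and recording each merge distance is exactly the information a dendrogram displays.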
from sklearn.cluster import AgglomerativeClustering
from scipy.cluster.hierarchy import dendrogram, linkage

mnist_example_df = mnist_example_df.drop(['labels'], axis = 1)

fig, axs = plt.subplots(2, 2, figsize=(10, 8))

H_Clust = AgglomerativeClustering(n_clusters = None, distance_threshold = 0,
                                  linkage = 'ward')
clusters = H_Clust.fit_predict(mnist_example_df)

clust_linkage = linkage(mnist_example_df, method = 'ward',
                        metric = 'euclidean')

dendrogram(clust_linkage, ax = axs[0, 1])
axs[0, 1].set_title('Dendrogram')
axs[0, 1].set_xlabel('Sample Index')
axs[0, 1].set_ylabel('Distance')

axs[0, 0].scatter(mnist_example_df['dimension_1'],
                  mnist_example_df['dimension_2'], c=clusters, cmap='rainbow')
for i, label in enumerate(range(0, 10)):
    axs[0, 0].text(mnist_example_df['dimension_1'][i] - 3,
                   mnist_example_df['dimension_2'][i], str(label),
                   fontsize = 16, ha = 'right')
axs[0, 0].set_title('Hierarchical Clustering')
axs[0, 0].set_xlabel('Dimension 1')
axs[0, 0].set_ylabel('Dimension 2')

H_Clust = AgglomerativeClustering(n_clusters = 4, distance_threshold = None,
                                  linkage = 'ward')
clusters = H_Clust.fit_predict(mnist_example_df)

clust_linkage = linkage(mnist_example_df, method = 'ward',
                        metric = 'euclidean')

dendrogram(clust_linkage, color_threshold = 70, ax = axs[1, 1])
axs[1, 1].set_title('Dendrogram')
axs[1, 1].set_xlabel('Sample Index')
axs[1, 1].set_ylabel('Distance')

axs[1, 0].scatter(mnist_example_df['dimension_1'],
                  mnist_example_df['dimension_2'], c=clusters, cmap='rainbow')
for i, label in enumerate(range(0, 10)):
    axs[1, 0].text(mnist_example_df['dimension_1'][i] - 3,
                   mnist_example_df['dimension_2'][i], str(label),
                   fontsize = 16, ha = 'right')
axs[1, 0].set_title('Hierarchical Clustering')
axs[1, 0].set_xlabel('Dimension 1')
axs[1, 0].set_ylabel('Dimension 2')

plt.tight_layout()
plt.show()
Here is an example of Hierarchical clustering on the same 10 MNIST observations. The top row is the result when the number of clusters has not been specified. In the top left plot each data point is its own cluster. The indices of the data points can be seen in the dendrogram to the right. For this plot (top right) the colors are irrelevant. Potential clusterings of data points can be seen as the clusters are merged from bottom to top. The smaller the vertical distance the closer those clusters are to each other (and vice-versa). In the bottom row, the algorithm has been instructed to generate 4 clusters. The colors in the dendrogram do not align with those shown in the plot, so it is better to refer to the indices. Here, the dendrogram has been cut such that the closest clusters are merged together until there are 4 clusters. How to choose the height to cut the dendrogram will be discussed later on in the section.
10.5.4 Methods for selecting the optimal number of clusters
Selecting the optimal number of clusters is important since the results can be misleading if our clustering differs greatly from the true number of clusters. There are many different methods for selecting the optimal number of clusters, but for now we will delve into 4 of the most popular methods. It is important to note that no method works well in every scenario and that different methods can give differing results.
Here are the methods covered in this section:
- Inspect a Dendrogram
- Elbow Method
- Silhouette Method
- Gap Statistic
10.5.4.1 Hierarchical Clustering Example
In this example we will continue to use the MNIST dataset; however, this time 2000 observations will be selected at random to be clustered.
from sklearn.datasets import fetch_openml
import numpy as np
import pandas as pd
from sklearn.utils import resample

# Fetching MNIST dataset
mnist = fetch_openml('mnist_784', version=1)

mnist_df = pd.DataFrame(mnist.data)

# Taking a random sample of 2000 images
mnist_rand = resample(mnist_df, n_samples = 2000, random_state = 416)
mnist_rand = mnist_rand.reset_index().drop('index', axis = 1)

# Keeping track of the target values
mnist_target_df = pd.DataFrame(mnist.target)
mnist_target_rand = resample(mnist_target_df,
                             n_samples = 2000,
                             random_state = 416)
mnist_target_rand = mnist_target_rand.reset_index().drop('index', axis = 1)

# Distribution is fairly even
mnist_target_rand['class'].value_counts()
class
1 211
3 209
2 208
5 202
9 202
8 201
6 198
7 197
0 189
4 183
Name: count, dtype: int64
The distribution of the 2000 randomly sampled handwritten digits is shown above. The digits appear to be fairly evenly distributed.
Once again, the dimensionality of these images is reduced to 2 dimensions using t-SNE.
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

# t-SNE dimensionality reduction
tsne = TSNE(n_components=2, perplexity=30,
            learning_rate='auto',
            init='random', random_state=416)

mnist_embedded = tsne.fit_transform(mnist_rand)

mnist_embedded_df = pd.DataFrame(mnist_embedded)
mnist_embedded_df.columns = ['dimension_1', 'dimension_2']

plt.figure(figsize=(10, 7))
plt.scatter(mnist_embedded_df['dimension_1'],
            mnist_embedded_df['dimension_2'])
plt.title('Random Sample of 2000 MNIST Digits')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2');
Here is a scatterplot of the 2000 randomly sampled images, plotted without reference to their actual labels.
from sklearn.cluster import AgglomerativeClustering
from scipy.cluster.hierarchy import dendrogram, linkage

H_Clust = AgglomerativeClustering(n_clusters = None, distance_threshold = 0,
                                  linkage = 'ward')
clusters = H_Clust.fit_predict(mnist_embedded_df)

clust_linkage = linkage(mnist_embedded_df, method = 'ward')

plt.figure(figsize=(10, 7))
dendrogram(clust_linkage)
plt.title('Dendrogram')
plt.xlabel('Sample Index')
plt.ylabel('Distance')
plt.show()
After conducting Hierarchical clustering without specifying the number of clusters, we have a dendrogram. Now comes the decision of where to make a horizontal cut. There is a paper about “dynamic cuts” that are flexible and do not cut at a constant height, but that is outside of the current scope (Langfelder et al. (2008)). When looking at the dendrogram above, suppose we do not know the true number of clusters. Generally, when cutting the tree, we want the resulting clusters to be merged at around the same height. Vertical distance represents dissimilarity, so we do not want clusters of high dissimilarity to be merged together. Remember that good clustering involves small distances within clusters and large distances between clusters. This is a subjective approach, and sometimes it may be difficult to find the best height at which to cut the dendrogram. Perhaps with domain knowledge a predefined threshold could be a good cutting height. For this example I chose to cut the tree at a height of 200, which resulted in 11 clusters that will be analyzed below.
from scipy.cluster.hierarchy import cut_tree

# cut the tree
new_clusters = cut_tree(clust_linkage, height = 200)

mnist_embedded_df['cluster'] = new_clusters

# Plot the new clusters
plt.figure(figsize=(10, 7))
plt.scatter(mnist_embedded_df['dimension_1'],
            mnist_embedded_df['dimension_2'], c=mnist_embedded_df['cluster'],
            cmap='rainbow')
plt.title('Hierarchical Clustering (11 clusters)')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')
plt.show()
Here are the 11 clusters obtained after cutting the tree. What do these clusters signify? Perhaps adding labels to the clusters will make that more clear.
# Plot clusters with labels (cluster labels not actual!)
plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c=mnist_embedded_df['cluster'], cmap='rainbow')
plt.title('Hierarchical Clustering (11 clusters)')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')
legend1 = plt.legend(*scatter.legend_elements(), title="Cluster")
plt.gca().add_artist(legend1)
plt.show()
Now the clusters have been associated with their cluster label, but this does not represent the actual handwritten digits.
In this case it is hard to determine what the clusters signify if the target values are unknown. However, we do know the target value (actual handwritten digit) for each image. This information can help label the clusters and make them more interpretable.
mnist_embedded_df['actual'] = mnist_target_rand['class']

# calculating mode and proportion of observations in the cluster that are the
# mode in each cluster
modes = mnist_embedded_df.groupby('cluster').agg(
    {'actual': [lambda x: x.mode().iloc[0],
                lambda y: (y == y.mode().iloc[0]).sum()/len(y)]})
modes.columns = ['mode', 'proportion']
modes
| cluster | mode | proportion |
|---|---|---|
| 0 | 4 | 0.482394 |
| 1 | 8 | 0.680180 |
| 2 | 3 | 0.726923 |
| 3 | 0 | 0.942708 |
| 4 | 7 | 0.490196 |
| 5 | 6 | 0.936893 |
| 6 | 7 | 0.732394 |
| 7 | 5 | 0.897436 |
| 8 | 2 | 0.853846 |
| 9 | 1 | 0.892157 |
| 10 | 1 | 0.519481 |
The code above calculates the mode digit of each cluster along with the proportion of observations in the cluster that equal the mode. Now let’s label the clusters by their modes.
# Plot clusters with (actual) labels (modes)
new_labels = modes['mode']

plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c=mnist_embedded_df['cluster'], cmap='rainbow')
plt.title('Hierarchical Clustering (11 clusters)')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')

handles, _ = scatter.legend_elements()
plt.legend(handles, new_labels, title="Mode")

plt.show()
Now we can get a better understanding of the clustering. Although there are 11 clusters in total, you will notice that not every digit appears as the mode of a cluster. 9 is not the mode of any cluster, while 4 and 7 are the modes of multiple clusters. At the very least, the clusters with 4 and 7 as the mode are very close to each other. Also, intuitively the digits 0, 6, and 8 are written similarly, so it makes sense to see those clusters in the same general area.
Just out of curiosity, let’s look at the actual distribution of the digits.
# Showing the actual distribution of classes
mnist_embedded_df['actual'] = mnist_embedded_df['actual'].astype('int64')

plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c = mnist_embedded_df['actual'], cmap='rainbow')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')
plt.title('True Distribution of Target')
legend1 = plt.legend(*scatter.legend_elements(), title="Value")
plt.gca().add_artist(legend1);
Analyzing the cluster performance by viewing the actual distribution of the target verges on supervised learning, though not quite, since the clustering algorithm never sees or uses the target. For the purposes of this section it is just to see how well the clustering found the true clusters. For the most part it looks like the clustering did a moderately good job at identifying the true clusters of the digits in 2 dimensions. The digits 4, 7, and 9 seem to be very similar in 2D, so they are understandably more difficult for the algorithm to distinguish.
Since the true number of clusters is known, let’s see what it looks like with 10 clusters just out of curiosity again.
# Try cutting with 10 clusters instead
new_clusters = cut_tree(clust_linkage, n_clusters=10)
mnist_embedded_df['cluster'] = new_clusters

modes = mnist_embedded_df.groupby('cluster').agg(
    {'actual': [lambda x: x.mode().iloc[0],
                lambda y: (y == y.mode().iloc[0]).sum()/len(y),
                lambda x: x.value_counts().index[1],
                lambda y: (y == y.value_counts().index[1]).sum()/len(y)]})
modes.columns = ['mode', 'proportion', 'mode_2', 'proportion_2']

# Plot clusters with mode labels
new_labels = modes['mode']

plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c=mnist_embedded_df['cluster'], cmap='rainbow')
plt.title('Hierarchical Clustering (10 clusters)')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')

handles, _ = scatter.legend_elements()
plt.legend(handles, new_labels, title="Mode")

plt.show()
The difference appears to be that the two clusters with 4 as mode merged into one cluster.
modes
| cluster | mode | proportion | mode_2 | proportion_2 |
|---|---|---|---|---|
| 0 | 4 | 0.482394 | 9 | 0.454225 |
| 1 | 8 | 0.680180 | 5 | 0.274775 |
| 2 | 3 | 0.726923 | 5 | 0.192308 |
| 3 | 0 | 0.942708 | 2 | 0.031250 |
| 4 | 7 | 0.490196 | 9 | 0.385621 |
| 5 | 6 | 0.936893 | 0 | 0.029126 |
| 6 | 7 | 0.732394 | 4 | 0.232394 |
| 7 | 5 | 0.897436 | 3 | 0.064103 |
| 8 | 2 | 0.853846 | 8 | 0.084615 |
| 9 | 1 | 0.633634 | 2 | 0.243243 |
This table contains the mode of each cluster as well as the second most common value in each cluster, denoted as mode_2. Interestingly, 9 appears as the second most common value in 3 different clusters.
10.5.4.2 K-means Clustering Example: Elbow Method
For the next 3 methods the K-means algorithm will be used on the same random 2000 MNIST images in 2 dimensions.
The goal of the Elbow method is to minimize the within-cluster sum of squares (WSS), which is also referred to as inertia. The optimal number of clusters is K such that adding another cluster does not (significantly) improve WSS. Whenever the number of clusters increases, inertia decreases, since each cluster contains fewer points that lie closer to their cluster’s center. The idea of the Elbow method is that the rate of decrease in WSS changes around the optimal number of clusters, K. When k < K (approaching the optimal number), inertia decreases rapidly. When k > K (past the optimal number), inertia decreases slowly. K is found by plotting inertia over a range of k and looking for a bend or “elbow”, hence the name.
# K-means
from sklearn.cluster import KMeans

# removing non-MNIST columns
mnist_embedded_df = mnist_embedded_df.drop(['cluster', 'actual'], axis = 1)

# elbow method for k between 1 and 20 on same MNIST data
wcss = []

for k in range(1, 21):
    model = KMeans(n_clusters = k, random_state = 416).fit(mnist_embedded_df)
    wcss.append(model.inertia_)

plt.figure(figsize=(10, 7))
plt.plot(range(1, 21), wcss, 'bx-')
plt.xlabel('Number of Clusters (k)')
plt.ylabel('Within-Cluster Sum of Squares')
plt.title('Elbow Method')
plt.show()
# Seems inconclusive, maybe 7?
The code above stores the inertia for values of k between 1 and 20, and creates the plot. In this case it is somewhat inconclusive. It looks like the decrease in inertia starts to slow down at k = 7. Like with the dendrogram, this method is also subjective.
10.5.4.3 K-means Clustering Example: Silhouette Method
Next is the Silhouette method, which is the most objective of the 4 methods covered in this section.
Silhouette Score
Before delving into the Silhouette method, it is good to get an understanding of the silhouette score. The silhouette \(s\) of a data point is \[s = (b-a)/\max(a, b).\]
- At each data point, the distance to its own cluster’s center = a
- And the distance to the second-best cluster’s center = b
- Second best means the closest cluster that is not the point’s own cluster
- s can take any value between -1 and 1
Interpreting Silhouette Score
There are 3 main categories that a data point can fall into:
- If a data point is very close to its own cluster and very far from the second best cluster (a is small, and b is big), then s is close to 1 (close to \(b/b\))
- If a data point is roughly the same distance to its own cluster as the second best cluster (\(a \approx b\)), then s \(\approx\) 0
- If a data point is very far from its own cluster and very close to the second best cluster (a is big, and b is small), then s is close to -1 (close to -a/a)
For optimal clustering, we want most data points to fall into the first category. In other words we want silhouette scores to be as close to 1 as possible.
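Using the simplified center-distance definition above (scikit-learn’s silhouette_score instead uses mean pairwise distances), the score of a single point can be computed directly. The points here are made up for illustration:

```python
import numpy as np

def simple_silhouette(point, own_center, second_center):
    """s = (b - a) / max(a, b) with the center-distance simplification."""
    a = np.linalg.norm(point - own_center)     # distance to own cluster's center
    b = np.linalg.norm(point - second_center)  # distance to second-best center
    return (b - a) / max(a, b)

p = np.array([0., 1.])
c_own = np.array([0., 0.])     # own center is 1 unit away
c_other = np.array([0., 10.])  # second-best center is 9 units away
print(simple_silhouette(p, c_own, c_other))  # close to 1: well clustered
```

Moving the point halfway between the two centers drives the score toward 0, and moving it next to the other center drives it toward -1, matching the three categories above.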
Silhouette Coefficient
The Silhouette Coefficient is represented by the average silhouette score of the data points. This metric does a good job of summarizing both within-cluster and between-cluster variation. The closer the Silhouette Coefficient is to 1, the better the clustering. Similar to the Elbow method, the optimal K is selected by calculating the Silhouette Coefficient over a range of k’s, and choosing K with the maximum Silhouette Coefficient.
# Silhouette method
from sklearn.metrics import silhouette_score

silhouette_average_scores = []

for k in range(2, 21):
    kmeans = KMeans(n_clusters = k, random_state = 416)
    cluster_labels = kmeans.fit_predict(mnist_embedded_df)

    silhouette_avg = silhouette_score(mnist_embedded_df, cluster_labels)
    silhouette_average_scores.append(silhouette_avg)

# Plot silhouette scores
plt.figure(figsize=(10, 7))
plt.plot(list(range(2, 21)), silhouette_average_scores, marker='o')
plt.title("Silhouette Coefficients")
plt.xlabel("Number of Clusters (k)")
plt.ylabel("Average Silhouette Score")
plt.show()
# k = 7 has the highest average silhouette score
Here the Silhouette Coefficient is calculated for k between 2 and 20. The maximum occurs at k = 7, which is coincidentally the same result as the Elbow method. Let’s visualize how these 7 clusters look on our 2000 MNIST digits.
kmeans = KMeans(n_clusters = 7, random_state = 416)
cluster_labels = kmeans.fit_predict(mnist_embedded_df)
mnist_embedded_df['cluster'] = cluster_labels

# K-means with k = 7 (cluster labels, not actual!)
plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c=mnist_embedded_df['cluster'], cmap='rainbow')
plt.title('K-means with k = 7 clusters')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')
legend1 = plt.legend(*scatter.legend_elements(), title="Cluster")
plt.gca().add_artist(legend1)
plt.show()
These are just the cluster labels, not the actual digits.
mnist_embedded_df['actual'] = mnist_target_rand['class']

modes = mnist_embedded_df.groupby('cluster').agg(
    {'actual': [lambda x: x.mode().iloc[0],
                lambda y: (y == y.mode().iloc[0]).sum()/len(y),
                lambda x: x.value_counts().index[1],
                lambda y: (y == y.value_counts().index[1]).sum()/len(y)]})
modes.columns = ['mode', 'proportion', 'mode_2', 'proportion_2']
modes
| cluster | mode | proportion | mode_2 | proportion_2 |
|---|---|---|---|---|
| 0 | 1 | 0.655280 | 2 | 0.248447 |
| 1 | 8 | 0.458781 | 2 | 0.390681 |
| 2 | 7 | 0.550769 | 4 | 0.215385 |
| 3 | 6 | 0.726592 | 5 | 0.205993 |
| 4 | 3 | 0.606349 | 5 | 0.228571 |
| 5 | 9 | 0.451957 | 4 | 0.380783 |
| 6 | 0 | 0.862559 | 5 | 0.075829 |
Here are the modes which can be used to label the 7 clusters.
# Plot clusters with (actual) labels (modes)
new_labels = modes['mode']

plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c=mnist_embedded_df['cluster'], cmap='rainbow')
plt.title('K-means with k = 7 clusters')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')

handles, _ = scatter.legend_elements()
plt.legend(handles, new_labels, title="Mode")

plt.show()
Now we have the actual mode labels of the 7 clusters obtained from K-means. Interestingly, the area that used to have 4 as the label now has 9. The digits that do not appear as the mode in any cluster are now 4, 5, and 8. Looking back at the modes table, we see that these digits frequently appear as the second most common value in a cluster. Obviously 7 is not the true number of clusters, but perhaps the 2D representation is obscuring the ability to find dissimilarities between some of the digits.
# Showing the actual distribution of classes
mnist_embedded_df['actual'] = mnist_embedded_df['actual'].astype('int64')

plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c = mnist_embedded_df['actual'], cmap='rainbow')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')
plt.title('True Distribution of Target')
legend1 = plt.legend(*scatter.legend_elements(), title="Value")
plt.gca().add_artist(legend1);
For reference, here is the true distribution of the handwritten digits again.
10.5.4.4 K-means Clustering Example: Gap Statistic
The last method to be covered is the Gap Statistic. The Gap Statistic for a number of clusters k can be written as
\[Gap(k) = \frac{1}{B}\sum_{b=1}^{B} \log(W_{kb}) - \log(W_k).\]
- Compares the total (within) intra-cluster variation for a range of k’s with their expected values
- Calculated by comparing the inertia of a clustered dataset with the inertia of a uniformly distributed random data set (covering the same ranges in the data space)
- A number of random samples (B) are generated that are then clustered over a range of k’s while keeping track of the inertia
- \(W_{kb}\) is the inertia of the b-th random sample with k clusters and \(W_k\) is the inertia of the original data with k clusters
We also need the standard deviation,
\[s_k = \sqrt{1 + \frac{1}{B}}\sqrt{\frac{1}{B}\sum_{b=1}^{B} (\log(W_{kb}) - \overline{W})^2}.\]
Where
\[\overline{W} = \frac{1}{B}\sum_{b=1}^{B} \log(W_{kb}).\]
Choose the smallest k such that the gap statistic is within one standard deviation of the gap at k + 1.
This can be represented by the expression,
\[Gap(k) \geq Gap(k+1) - s_{k+1}.\]
The optimal k may vary over multiple gap statistic simulations since there is randomness involved.
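The selection rule can be sketched as a small helper. The gap and standard-deviation arrays below are hypothetical, chosen so the rule picks k = 3; the example that follows computes actual gap values for the MNIST sample:

```python
import numpy as np

def choose_k_by_gap(gaps, s):
    """Smallest k with Gap(k) >= Gap(k+1) - s_{k+1}.
    gaps[i] and s[i] hold the values for k = i + 1."""
    for i in range(len(gaps) - 1):
        if gaps[i] >= gaps[i + 1] - s[i + 1]:
            return i + 1  # k is 1-indexed
    return len(gaps)

# Hypothetical values: the gap peaks at k = 3 with small standard deviations
gaps = np.array([0.2, 0.5, 0.9, 0.85, 0.8])
s = np.array([0.02, 0.02, 0.02, 0.02, 0.02])
print(choose_k_by_gap(gaps, s))  # → 3
```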
# gap statistic
# removing non-MNIST columns
mnist_embedded_df = mnist_embedded_df.drop(['cluster', 'actual'], axis = 1)

def calc_gap_statistic(data, max_k, n = 10):
    # Generate reference data from a uniform distribution
    def generate_reference_data(X):
        return np.random.uniform(low = data.min(axis=0),
                                 high = data.max(axis=0),
                                 size = X.shape)

    gap_values = []

    # Loop over a range of k values
    for k in range(1, max_k + 1):
        # Fit K-means to the original data
        kmeans = KMeans(n_clusters = k, random_state = 416)
        kmeans.fit(data)
        original_inertia = kmeans.inertia_

        # Compute the average inertia for the reference datasets
        reference_inertia = []
        for _ in range(n):
            random_data = generate_reference_data(data)
            kmeans.fit(random_data)
            reference_inertia.append(kmeans.inertia_)

        # Calculate the Gap statistic
        gap = np.log(np.mean(reference_inertia)) - np.log(original_inertia)
        gap_values.append(gap)

    return gap_values

gap_values = calc_gap_statistic(mnist_embedded_df, 20, n = 100)

plt.figure(figsize=(10, 7))
plt.plot(range(1, 21), gap_values, marker='o')
plt.title('Gap Statistic vs Number of Clusters')
plt.xlabel('Number of Clusters (k)')
plt.ylabel('Gap Statistic')
plt.grid()
plt.show()
# 2 is the best?
Here a function is defined to calculate the gap statistic. It is calculated for k between 1 and 20 with B = 100 random datasets (the more datasets used, the more computationally expensive the calculation). In the plot we are looking for the smallest k where the gap statistic is greater than or equal to the gap statistic at k + 1 minus its standard deviation. In this case we do not even need the standard deviation, since we observe that Gap(2) is greater than Gap(3). This means that the optimal K is 2 based on this method.
This process can also be conducted using the gapstatistics package.
pip install gapstatistics
from gapstatistics import gapstatistics

gs = gapstatistics.GapStatistics(distance_metric='euclidean')

optimal = gs.fit_predict(K = 20, X = np.array(mnist_embedded_df))

print(f'Optimal: {optimal}')
Optimal: 2
The result is also an optimal K of 2. It appears that this method is not very good for this dataset.
kmeans = KMeans(n_clusters = 2, random_state = 416)
cluster_labels = kmeans.fit_predict(mnist_embedded_df)
mnist_embedded_df['cluster'] = cluster_labels

# Cluster labels!
plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c=mnist_embedded_df['cluster'], cmap='rainbow')
plt.title('K-means with k = 2 clusters')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')
legend1 = plt.legend(*scatter.legend_elements(), title="Cluster")
plt.gca().add_artist(legend1)
plt.show()
Here we have K-means with k = 2 with default cluster labels.
mnist_embedded_df['actual'] = mnist_target_rand['class']

modes = mnist_embedded_df.groupby('cluster').agg(
    {'actual': [lambda x: x.mode().iloc[0],
                lambda y: (y == y.mode().iloc[0]).sum()/len(y),
                lambda x: x.value_counts().index[1],
                lambda y: (y == y.value_counts().index[1]).sum()/len(y)]})
modes.columns = ['mode', 'proportion', 'mode_2', 'proportion_2']

# Plot clusters with labels (actual)
new_labels = modes['mode']

plt.figure(figsize=(10, 7))
scatter = plt.scatter(mnist_embedded_df['dimension_1'],
                      mnist_embedded_df['dimension_2'],
                      c=mnist_embedded_df['cluster'], cmap='rainbow')
plt.title('K-means with k = 2 clusters')
plt.xlabel('Dimension 1')
plt.ylabel('Dimension 2')

handles, _ = scatter.legend_elements()
plt.legend(handles, new_labels, title="Mode")

plt.show()
Here are the mode labels, but they do not tell us very much.
mnist_embedded_df[mnist_embedded_df['cluster'] == 0]['actual'].value_counts()
# 1, 9, 7, 4, and 2 are similar
actual
1 211
9 197
7 196
4 181
2 109
5 77
8 38
3 11
6 1
0 0
Name: count, dtype: int64
mnist_embedded_df[mnist_embedded_df['cluster'] == 1]['actual'].value_counts()
# 3, 6, 0, 8, and 5 are similar
actual
3 198
6 197
0 189
8 163
5 125
2 99
9 5
4 2
7 1
1 0
Name: count, dtype: int64
At the very least we can see which images of handwritten digits look similar in 2 dimensions.
10.5.5 Conclusions
- Using clustering we can figure out which digits look similar to each other when written by hand
- The true number of clusters in 2D may be different from that in the original dimensions. Maybe the algorithms would be better at identifying the different clusters of the MNIST data in 3D
- Choosing the right number of clusters can be challenging but is very important
- There are many methods for selecting the optimal number of clusters and they can yield different results
10.5.6 Further Readings
Defining clusters from a hierarchical cluster tree
sklearn AgglomerativeClustering Documentation
10.6 Autoencoders
This section was written by Kyle Reed, a senior at the University of Connecticut double majoring in Applied Data Analysis and Geographic Information Science.
This section will explore:
What an Autoencoder is
How an Autoencoder works
Potential applications of autoencoders
10.6.1 Introduction
Autoencoders are a specialized type of neural network used in unsupervised learning. They are trained to encode input data into a compressed representation and then decode it back to something as close as possible to the original. This process forces the model to learn the most important features or patterns in the data.
Unlike traditional supervised learning models, autoencoders do not require labeled data. Instead, the model learns from the data itself by minimizing the difference between the original input and the reconstructed output.
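Formally, training minimizes a reconstruction loss such as the squared error between the input and its reconstruction, where \(f\) denotes the encoder and \(g\) the decoder:
\[\begin{equation*} \mathcal{L}(\boldsymbol{x}) = \left\|\boldsymbol{x} - g(f(\boldsymbol{x}))\right\|^2 \end{equation*}\]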
10.6.2 How it works
An autoencoder consists of two primary components:
Encoder: Compresses the input data into a lower-dimensional representation.
Decoder: Reconstructs the original input from the compressed encoding.
10.6.2.1 Encoder
The encoder compresses the data into a smaller, lower-dimensional representation, identifying the most essential features while discarding redundant information.
10.6.2.2 Decoder
The decoder reconstructs the data from the compressed representation generated by the encoder, attempting to reproduce the original data as closely as possible by reversing the compression process.
10.6.3 Application
Autoencoders can be used for:
- Data Compression: Reduce dataset size for storage and transmission while retaining key information.
- Anomaly Detection: Identify unusual patterns that differ from the learned norm based on reconstruction error.
- Image/Audio Refining: Remove noise, fill missing pixels or sound samples, colorize images, and more.
- Data Refining/Denoising: Improve dataset quality by correcting errors and filling missing values.
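As a schematic of the anomaly-detection use case: score an input by its reconstruction error and flag it when the error exceeds a chosen threshold. The "model" here is a toy stand-in for a trained autoencoder, and the threshold value is arbitrary:

```python
import numpy as np

def is_anomaly(x, reconstruct, threshold):
    """Flag an input whose mean squared reconstruction error exceeds a threshold."""
    error = np.mean((x - reconstruct(x)) ** 2)
    return bool(error > threshold)

# Toy stand-in for a trained autoencoder that only reproduces values near 0.5
reconstruct = lambda x: np.clip(x, 0.4, 0.6)

normal = np.array([0.45, 0.55, 0.5])   # reconstructed perfectly
weird = np.array([0.0, 1.0, 0.9])      # poorly reconstructed
print(is_anomaly(normal, reconstruct, threshold=0.01))  # False
print(is_anomaly(weird, reconstruct, threshold=0.01))   # True
```

In practice the threshold is usually chosen from the distribution of reconstruction errors on held-out normal data.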
10.6.4 Example usage
For the example, I wish to show how autoencoders can be used to compress and refine data and images. The MNIST dataset will be used, which is a collection of handwritten digits stored as small pixel images.
10.6.4.1 Load and Prepare Data
#Import necessary packages
from tensorflow.keras.datasets import mnist
from tensorflow.keras import layers, models
import numpy as np
import matplotlib.pyplot as plt

#Load data from dataset
(X_train, _), (X_test, _) = mnist.load_data()

#Ensure proper format and divide by 255 to normalize data
X_train = X_train.astype("float32") / 255.
X_test = X_test.astype("float32") / 255.

#Reshape the image to make it one-dimensional
X_train = X_train.reshape((len(X_train), -1))
X_test = X_test.reshape((len(X_test), -1))
2025-05-06 14:59:56.288201: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
Here I used the TensorFlow Keras package, one of the most common and easiest-to-use frameworks for building and training autoencoders.
The data is split into train and test sets and converted to floating-point format. It is also divided by 255, the maximum greyscale value, which normalizes the pixels to the range [0, 1].
Reshaping the images, as seen in the last two lines, flattens each 28x28 image into a one-dimensional vector, which makes the data easier for the neural network to compress.
10.6.4.2 Build the Autoencoder
#Set the input size to the number of pixels for each image
input_dim = X_train.shape[1]

#Create and define elements of the autoencoder
autoencoder = models.Sequential([
    layers.Input(shape=(input_dim,)),
    layers.Dense(256, activation='relu'),           #Initial encoding
    layers.Dense(128, activation='relu'),           #Compressed version (Bottleneck)
    layers.Dense(256, activation='relu'),           #Reconstruction
    layers.Dense(input_dim, activation='sigmoid')   #Final reconstructed version
])
For each of the Dense layers, the number is the number of neurons in that layer, so ‘256’ means the layer transforms its input into a 256-dimensional representation.
When choosing the number of neurons, pick sizes that fit your data. Larger, more complex datasets need more neurons, while smaller, simpler ones do not; using more than necessary there can cause overfitting.
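As a quick sanity check on the bottleneck chosen above, the 128-unit layer represents each 784-pixel image with only 128 numbers, a roughly sixfold compression:

```python
# Sizes from the model above: 28x28 = 784 input pixels, 128-unit bottleneck
input_dim, bottleneck = 784, 128

compression_ratio = input_dim / bottleneck
print(compression_ratio)  # 6.125
```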
10.6.4.3 Train the Autoencoder
#Prepare the autoencoder with the optimizer
autoencoder.compile(optimizer='adam', loss='binary_crossentropy')

#Train the autoencoder
autoencoder.fit(X_train, X_train,
                epochs=40,
                batch_size=256,
                shuffle=True,
                validation_data=(X_test, X_test))
Epoch 1/40 235/235 ━━━━━━━━━━━━━━━━━━━━ 8s 27ms/step - loss: 0.2903 - val_loss: 0.1134
Epoch 2/40 235/235 ━━━━━━━━━━━━━━━━━━━━ 5s 22ms/step - loss: 0.1082 - val_loss: 0.0937
Epoch 3/40 235/235 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - loss: 0.0930 - val_loss: 0.0873
Epoch 4/40 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0866 - val_loss: 0.0826
Epoch 5/40 235/235 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - loss: 0.0824 - val_loss: 0.0795
Epoch 6/40 235/235 ━━━━━━━━━━━━━━━━━━━━ 6s 23ms/step - loss: 0.0799 - val_loss: 0.0778
Epoch 7/40 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 19ms/step - loss: 0.0781 - val_loss: 0.0762
(per-step progress bars trimmed; output truncated during epoch 8)
0.0753 Epoch 9/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 10s 43ms/step - loss: 0.0752 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0762 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0764 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0764 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0764 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0764 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0764 28/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0763 32/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0763 36/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0762 40/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0762 44/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0762 48/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0762 51/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0761 55/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0761 59/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0761 63/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0761 67/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0761 70/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0761 73/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0760 77/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0760 81/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0760 85/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0760 87/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0760 91/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0760 95/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0760 99/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0759 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0759 106/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0759 109/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0759 112/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0759 116/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0759 119/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0759 122/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0759 125/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0759 129/235 
━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0759 133/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 137/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 141/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 145/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 149/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 153/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 157/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 160/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 164/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 168/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 171/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0758 172/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0758 175/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0758 178/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0757 181/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0757 184/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0757 187/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0757 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0757 195/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0757 196/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 198/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 201/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 204/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 210/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 213/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 217/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 221/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 224/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 228/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 232/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0757 235/235 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - loss: 0.0757 - val_loss: 0.0744 Epoch 10/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 9s 42ms/step - loss: 0.0755 4/235 ━━━━━━━━━━━━━━━━━━━━ 4s 20ms/step 
- loss: 0.0749 8/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0747 12/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0745 16/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0744 19/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0743 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0743 24/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0742 27/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0742 30/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0742 34/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0742 37/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0742 39/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0742 41/235 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - loss: 0.0742 44/235 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - loss: 0.0742 48/235 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - loss: 0.0742 52/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0742 56/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0742 60/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0742 64/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0742 68/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0743 71/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0743 74/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0743 77/235 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - loss: 0.0743 80/235 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - loss: 0.0743 83/235 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - loss: 0.0743 86/235 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - loss: 0.0743 90/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0743 94/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0743 98/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0743 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0744 106/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0744 110/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0744 114/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0744 118/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0744 122/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0744 126/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0744 130/235 
━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0744 134/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0744 138/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0744 142/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0744 146/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0744 150/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0744 154/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0744 158/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0745 162/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0745 166/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0745 170/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0745 174/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0745 178/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 182/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 186/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 190/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 193/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 197/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 201/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 204/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 208/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 212/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 216/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 220/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0745 224/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0745 228/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0745 232/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0745 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0745 - val_loss: 0.0735 Epoch 11/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 13s 57ms/step - loss: 0.0757 4/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0751 7/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0748 10/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0745 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0744 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0743 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - 
loss: 0.0742 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0742 29/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0742 33/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0742 37/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0742 41/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0742 45/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0741 49/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0741 53/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0741 57/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0741 60/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0741 64/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0741 68/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0741 72/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0741 76/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0741 80/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0741 84/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0740 88/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0740 92/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0740 96/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0740 99/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0740 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0740 105/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0740 108/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0740 111/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0740 114/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0740 117/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0740 121/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0740 125/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0740 129/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0740 133/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 137/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 141/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 145/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 149/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 152/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 155/235 
━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 158/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 162/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 166/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 170/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0739 174/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0739 176/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0739 179/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 182/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 185/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 188/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 194/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 197/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 200/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 204/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 208/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 212/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0739 215/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0738 219/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0738 223/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0738 227/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0738 231/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0738 234/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0738 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0738 - val_loss: 0.0728 Epoch 12/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 11s 48ms/step - loss: 0.0729 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0728 8/235 ━━━━━━━━━━━━━━━━━━━━ 5s 25ms/step - loss: 0.0730 9/235 ━━━━━━━━━━━━━━━━━━━━ 7s 34ms/step - loss: 0.0730 12/235 ━━━━━━━━━━━━━━━━━━━━ 7s 32ms/step - loss: 0.0731 15/235 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - loss: 0.0731 18/235 ━━━━━━━━━━━━━━━━━━━━ 6s 28ms/step - loss: 0.0731 20/235 ━━━━━━━━━━━━━━━━━━━━ 5s 28ms/step - loss: 0.0731 22/235 ━━━━━━━━━━━━━━━━━━━━ 6s 30ms/step - loss: 0.0731 23/235 ━━━━━━━━━━━━━━━━━━━━ 6s 31ms/step - loss: 
0.0731 25/235 ━━━━━━━━━━━━━━━━━━━━ 6s 31ms/step - loss: 0.0731 27/235 ━━━━━━━━━━━━━━━━━━━━ 6s 31ms/step - loss: 0.0732 29/235 ━━━━━━━━━━━━━━━━━━━━ 6s 31ms/step - loss: 0.0732 31/235 ━━━━━━━━━━━━━━━━━━━━ 6s 31ms/step - loss: 0.0732 33/235 ━━━━━━━━━━━━━━━━━━━━ 6s 31ms/step - loss: 0.0732 36/235 ━━━━━━━━━━━━━━━━━━━━ 6s 31ms/step - loss: 0.0732 39/235 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - loss: 0.0732 42/235 ━━━━━━━━━━━━━━━━━━━━ 5s 29ms/step - loss: 0.0732 46/235 ━━━━━━━━━━━━━━━━━━━━ 5s 28ms/step - loss: 0.0732 50/235 ━━━━━━━━━━━━━━━━━━━━ 4s 27ms/step - loss: 0.0732 54/235 ━━━━━━━━━━━━━━━━━━━━ 4s 26ms/step - loss: 0.0732 58/235 ━━━━━━━━━━━━━━━━━━━━ 4s 25ms/step - loss: 0.0732 62/235 ━━━━━━━━━━━━━━━━━━━━ 4s 25ms/step - loss: 0.0732 66/235 ━━━━━━━━━━━━━━━━━━━━ 4s 24ms/step - loss: 0.0732 70/235 ━━━━━━━━━━━━━━━━━━━━ 3s 24ms/step - loss: 0.0732 74/235 ━━━━━━━━━━━━━━━━━━━━ 3s 23ms/step - loss: 0.0732 78/235 ━━━━━━━━━━━━━━━━━━━━ 3s 23ms/step - loss: 0.0732 82/235 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - loss: 0.0732 86/235 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - loss: 0.0732 90/235 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - loss: 0.0732 94/235 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - loss: 0.0732 98/235 ━━━━━━━━━━━━━━━━━━━━ 2s 21ms/step - loss: 0.0732 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 21ms/step - loss: 0.0732 106/235 ━━━━━━━━━━━━━━━━━━━━ 2s 21ms/step - loss: 0.0732 110/235 ━━━━━━━━━━━━━━━━━━━━ 2s 21ms/step - loss: 0.0732 114/235 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - loss: 0.0732 117/235 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - loss: 0.0732 120/235 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - loss: 0.0732 123/235 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - loss: 0.0732 126/235 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - loss: 0.0732 130/235 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - loss: 0.0732 134/235 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - loss: 0.0731 138/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 142/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 146/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 150/235 
━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0731 154/235 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0731 158/235 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0731 162/235 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0731 165/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 168/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 172/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 175/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 179/235 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - loss: 0.0731 183/235 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0731 187/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 190/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 193/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 196/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 199/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 203/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 211/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 214/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 217/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 220/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 223/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 226/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 230/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 234/235 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0731 235/235 ━━━━━━━━━━━━━━━━━━━━ 5s 20ms/step - loss: 0.0731 - val_loss: 0.0723 Epoch 13/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 1:22 354ms/step - loss: 0.0725 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0723 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0723 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0723 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0724 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0724 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0724 29/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0723 33/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - 
loss: 0.0723 37/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 41/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 45/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 49/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 53/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 57/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 61/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 64/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 67/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0723 70/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 72/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 75/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 78/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 81/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 84/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 88/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 92/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 95/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 98/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 106/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 110/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0723 114/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 118/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 122/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 126/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 129/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 130/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 133/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 137/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 141/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 145/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 149/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 153/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 157/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 161/235 
━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 165/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 169/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 173/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0723 177/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 181/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 185/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 189/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 193/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 197/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 200/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 204/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 208/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 212/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 216/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 220/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 224/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 228/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 231/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 235/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0723 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0723 - val_loss: 0.0733 Epoch 14/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 10s 46ms/step - loss: 0.0740 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0736 8/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0736 12/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0736 16/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0735 20/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0734 22/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0733 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0732 28/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0732 31/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0731 35/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0730 39/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0730 43/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0729 47/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 
0.0728 51/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0728 54/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0728 58/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0727 61/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0727 64/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0727 68/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0727 72/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0726 76/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0726 80/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0726 84/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0725 88/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0725 92/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0725 95/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0725 98/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0725 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0725 106/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0725 109/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0724 112/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0724 115/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0724 118/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0724 121/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0724 123/235 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - loss: 0.0724 126/235 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - loss: 0.0724 128/235 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - loss: 0.0724 131/235 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0724 135/235 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0724 139/235 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - loss: 0.0724 143/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 147/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 151/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 154/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 158/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 162/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 165/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 169/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 173/235 
━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 177/235 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - loss: 0.0723 181/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0723 185/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0723 188/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0723 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0723 194/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0723 197/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 200/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 204/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 208/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 212/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 216/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 220/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 224/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 228/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 231/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 234/235 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0722 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 19ms/step - loss: 0.0722 - val_loss: 0.0712 Epoch 15/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 13s 58ms/step - loss: 0.0720 4/235 ━━━━━━━━━━━━━━━━━━━━ 12s 56ms/step - loss: 0.0716 5/235 ━━━━━━━━━━━━━━━━━━━━ 12s 56ms/step - loss: 0.0716 9/235 ━━━━━━━━━━━━━━━━━━━━ 8s 36ms/step - loss: 0.0717 13/235 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - loss: 0.0717 17/235 ━━━━━━━━━━━━━━━━━━━━ 5s 26ms/step - loss: 0.0717 21/235 ━━━━━━━━━━━━━━━━━━━━ 5s 24ms/step - loss: 0.0716 25/235 ━━━━━━━━━━━━━━━━━━━━ 4s 23ms/step - loss: 0.0716 29/235 ━━━━━━━━━━━━━━━━━━━━ 4s 22ms/step - loss: 0.0716 33/235 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - loss: 0.0716 37/235 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - loss: 0.0716 40/235 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - loss: 0.0716 44/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0716 47/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0716 51/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0716 55/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 
0.0716 58/235 ━━━━━━━━━━━━━━━━━━━━ 3s 19ms/step - loss: 0.0716 62/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0716 66/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0716 70/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0716 74/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0716 78/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0716 82/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0716 86/235 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - loss: 0.0716 90/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0716 94/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0716 98/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0716 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0716 105/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0716 109/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0715 113/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0715 117/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 121/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 125/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 128/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 132/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 136/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 140/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 143/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 146/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 149/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 152/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 155/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 158/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 161/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 164/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 168/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 172/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0715 176/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0715 180/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0715 184/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0715 188/235 
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0714 - val_loss: 0.0708
Epoch 16/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0710 - val_loss: 0.0707
Epoch 17/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - loss: 0.0708 - val_loss: 0.0701
Epoch 18/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0705 - val_loss: 0.0701
Epoch 19/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0700 - val_loss: 0.0698
Epoch 20/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0699 - val_loss: 0.0696
Epoch 21/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - loss: 0.0697 - val_loss: 0.0692
Epoch 22/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - loss: 0.0692 - val_loss: 0.0690
Epoch 23/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0693 - val_loss: 0.0690
Epoch 24/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - loss: 0.0690 - val_loss: 0.0687
Epoch 25/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0688 - val_loss: 0.0693
Epoch 26/40
235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0687 - val_loss: 0.0688
Epoch 27/40
━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0684 91/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0684 95/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0684 99/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0684 103/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0685 107/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0685 111/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0685 115/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0685 119/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0685 122/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0685 126/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0685 130/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 134/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 138/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 142/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 146/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 150/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 154/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 158/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 162/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 165/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 168/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 171/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 174/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 177/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 181/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 185/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 189/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 193/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 197/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 201/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 204/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 210/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 213/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 216/235 
━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 219/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 222/235 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0685 226/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 230/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 234/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0685 - val_loss: 0.0683 Epoch 28/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 9s 40ms/step - loss: 0.0698 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0691 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0688 12/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0688 16/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0688 20/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0687 24/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0686 27/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0686 31/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0685 35/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0685 39/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0684 43/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0684 47/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0684 51/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0684 55/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 59/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 63/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 67/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 71/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 75/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 79/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 83/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 87/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 91/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 95/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 99/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0683 103/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 107/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 111/235 
━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 115/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 119/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 123/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 127/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 131/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 135/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 139/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 143/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 147/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 151/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 155/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 159/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 163/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 167/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0683 171/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 175/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 179/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 183/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 187/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 195/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 199/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 203/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 210/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 214/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 218/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 222/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 226/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 229/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 233/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0683 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - loss: 0.0683 - val_loss: 0.0688 Epoch 29/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 11s 51ms/step - loss: 0.0704 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step 
- loss: 0.0696 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0694 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0693 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0692 20/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0691 23/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0690 26/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0690 29/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0689 32/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0689 35/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0689 38/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0689 42/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0688 46/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0688 50/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0688 54/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0688 58/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0687 62/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0687 66/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0687 70/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0687 74/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0687 78/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0687 82/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0686 86/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0686 90/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0686 94/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0686 97/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0686 99/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0686 102/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0686 105/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0686 109/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0686 113/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0686 117/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0685 121/235 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - loss: 0.0685 125/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 129/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 133/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 137/235 
━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 141/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 144/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 147/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 151/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 155/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 159/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 163/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 167/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 171/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0685 175/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 179/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 183/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 187/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 195/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 199/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 203/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 211/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 215/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 219/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 223/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 227/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 231/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 235/235 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0684 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0684 - val_loss: 0.0681 Epoch 30/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 10s 46ms/step - loss: 0.0654 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0666 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0672 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0675 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0676 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0677 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0678 29/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - 
loss: 0.0678 33/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0679 37/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0679 41/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0679 45/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0680 49/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0680 53/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0680 57/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 61/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 65/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 69/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 73/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 77/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 81/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 85/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 89/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 93/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 97/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 101/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 105/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 109/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 112/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 115/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 118/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 121/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 124/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 127/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 131/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 135/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 139/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 143/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 147/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 151/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 155/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 159/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 163/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 167/235 
━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0681 171/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 175/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 179/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 183/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 187/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 195/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 199/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 203/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 211/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 215/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 219/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 223/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 227/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 231/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 235/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - loss: 0.0681 - val_loss: 0.0682 Epoch 31/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 10s 44ms/step - loss: 0.0668 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 13ms/step - loss: 0.0674 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0676 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0677 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0678 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0679 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0679 29/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0679 33/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0680 37/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 41/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 45/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 49/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 53/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 57/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 61/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 
65/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 69/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 73/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 77/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 81/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 85/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 89/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 93/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 97/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 101/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0682 105/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 109/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 113/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 117/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 121/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 125/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 129/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 133/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 137/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 141/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 144/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 148/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 152/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 156/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 160/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 164/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 168/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0682 171/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 174/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 178/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 182/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 186/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 190/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 194/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 198/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 202/235 
━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 206/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 209/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 212/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 215/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 218/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0682 221/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 224/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 228/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 232/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0681 - val_loss: 0.0678 Epoch 32/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 13s 59ms/step - loss: 0.0686 4/235 ━━━━━━━━━━━━━━━━━━━━ 5s 22ms/step - loss: 0.0681 7/235 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - loss: 0.0681 11/235 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - loss: 0.0680 15/235 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - loss: 0.0680 19/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0679 23/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0679 27/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0679 31/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0678 34/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0678 38/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0678 42/235 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - loss: 0.0678 45/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0678 48/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0678 51/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0678 55/235 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - loss: 0.0678 59/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0678 63/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0678 67/235 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - loss: 0.0678 71/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 75/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 79/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 83/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 87/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 91/235 
━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 95/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 99/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 103/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 107/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0678 111/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 115/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 119/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 123/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 127/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 131/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 135/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 139/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 143/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 147/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 151/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 155/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 159/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 163/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 167/235 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - loss: 0.0678 171/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 175/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 179/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 183/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 187/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 195/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 199/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 203/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 211/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 215/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 219/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 223/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 227/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 231/235 
━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 235/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0678 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - loss: 0.0678 - val_loss: 0.0677 Epoch 33/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 10s 44ms/step - loss: 0.0682 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0676 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0675 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0676 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0676 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0675 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0675 29/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0676 33/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0676 37/235 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - loss: 0.0676 41/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 44/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 47/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 50/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0676 53/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0676 56/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0676 60/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0676 64/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0676 68/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0676 72/235 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - loss: 0.0676 76/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 80/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 84/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 88/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 92/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 96/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 100/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0676 104/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0676 107/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0676 111/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0676 114/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 118/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 122/235 
━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 126/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 130/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 134/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 137/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 141/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 144/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 148/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 152/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 156/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 160/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 164/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 168/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0677 172/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 175/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 179/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 183/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 187/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 191/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 195/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 199/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 203/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 207/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 211/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 215/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 219/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 223/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 227/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 231/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 235/235 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 235/235 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - loss: 0.0677 - val_loss: 0.0677 Epoch 34/40 1/235 ━━━━━━━━━━━━━━━━━━━━ 10s 44ms/step - loss: 0.0697 5/235 ━━━━━━━━━━━━━━━━━━━━ 3s 14ms/step - loss: 0.0688 9/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0685 13/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - 
loss: 0.0684 17/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0683 21/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0683 25/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0682 29/235 ━━━━━━━━━━━━━━━━━━━━ 3s 15ms/step - loss: 0.0682 33/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0682 37/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0682 41/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 45/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 49/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 53/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0681 57/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 61/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 65/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 69/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 73/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 77/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 81/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 85/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 89/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 93/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 97/235 ━━━━━━━━━━━━━━━━━━━━ 2s 15ms/step - loss: 0.0680 101/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0680 105/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 109/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 113/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 116/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 119/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 122/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 126/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 129/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 132/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 136/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 140/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 144/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 148/235 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - loss: 0.0679 152/235 
[Per-batch training output truncated; final epochs summarized]
Epoch 34/40: 235/235 - loss: 0.0678 - val_loss: 0.0680
Epoch 35/40: 235/235 - loss: 0.0676 - val_loss: 0.0674
Epoch 36/40: 235/235 - loss: 0.0675 - val_loss: 0.0676
Epoch 37/40: 235/235 - loss: 0.0675 - val_loss: 0.0673
Epoch 38/40: 235/235 - loss: 0.0673 - val_loss: 0.0672
Epoch 39/40: 235/235 - loss: 0.0673 - val_loss: 0.0670
Epoch 40/40: 235/235 - loss: 0.0673 - val_loss: 0.0671
For the model compilation I used “Adam,” which stands for Adaptive Moment Estimation. This is a very popular optimizer for autoencoders that works well with noisy or large data sets. The binary_crossentropy loss measures the difference between each original image and its reconstruction.
For training: ‘epochs’ is the number of full passes through the data, ‘batch_size’ is the number of samples per gradient update, and ‘shuffle’ = True reshuffles the data each epoch so the model does not learn anything from the order of the examples.
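To make the loss concrete, here is a minimal NumPy sketch of the per-pixel binary cross-entropy computation (the function name and test values are illustrative, not part of the chapter's code):

```python
import numpy as np

def binary_crossentropy(original, reconstructed, eps=1e-7):
    """Mean per-pixel binary cross-entropy between two images with values in [0, 1]."""
    r = np.clip(reconstructed, eps, 1 - eps)  # avoid log(0)
    return -np.mean(original * np.log(r) + (1 - original) * np.log(1 - r))

x = np.array([0.0, 1.0, 1.0, 0.0])
print(binary_crossentropy(x, x))      # near zero: perfect reconstruction
print(binary_crossentropy(x, 1 - x))  # large: every pixel inverted
```

The loss is small when the reconstruction matches the original pixel-by-pixel and grows quickly as pixels diverge, which is exactly what the optimizer minimizes during training.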
10.6.4.4 Reconstruct The Data
#Reconstruct the images
reconstruct_images = autoencoder.predict(X_test)

#Set the parameters for the plot with original and reconstructed images
n = 10
plt.figure(figsize=(16, 4))
for i in range(n):
    #Top plot for original images
    ax = plt.subplot(2, n, i + 1)
    plt.imshow(X_test[i].reshape(28, 28), cmap="gray")
    plt.title("Original")
    plt.axis("off")

    #Bottom plot for reconstructed images
    ax = plt.subplot(2, n, i + 1 + n)
    plt.imshow(reconstruct_images[i].reshape(28, 28), cmap="gray")
    plt.title("Reconstructed")
    plt.axis("off")
plt.show()
313/313 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step
As we can see above, the reconstructed images are slightly more pixelated and not as distinct as the originals. To improve on this, we could increase the number of epochs or use a larger encoding dimension when building the model.
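A variant with a larger encoding dimension could be sketched as follows; the layer sizes (128 and 64) and the extra hidden layers are illustrative assumptions, not the chapter's tuned architecture:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical deeper autoencoder with a wider bottleneck (64 units).
inputs = keras.Input(shape=(784,))
encoded = layers.Dense(128, activation="relu")(inputs)
encoded = layers.Dense(64, activation="relu")(encoded)   # wider bottleneck
decoded = layers.Dense(128, activation="relu")(encoded)
decoded = layers.Dense(784, activation="sigmoid")(decoded)

autoencoder_v2 = keras.Model(inputs, decoded)
autoencoder_v2.compile(optimizer="adam", loss="binary_crossentropy")
# Training for more epochs would also help, e.g.:
# autoencoder_v2.fit(X_train, X_train, epochs=80, batch_size=256,
#                    shuffle=True, validation_data=(X_test, X_test))
```

A wider bottleneck preserves more of each image's detail at the cost of less compression, so there is a trade-off between reconstruction quality and the compactness of the learned representation.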
10.6.5 Conclusion
Autoencoders compress and reconstruct data, enabling pattern recognition without labeled data.
Useful in tasks like anomaly detection, data cleaning, and feature extraction.
Training involves minimizing reconstruction error, using loss functions such as MSE.
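As a small illustration of the last two points, reconstruction error under MSE can be computed directly with NumPy; the arrays here are synthetic stand-ins for flattened images and their reconstructions:

```python
import numpy as np

# Synthetic stand-ins: 5 "images" of 784 pixels and noisy "reconstructions".
rng = np.random.default_rng(0)
X = rng.random((5, 784))
X_hat = np.clip(X + 0.05 * rng.standard_normal((5, 784)), 0.0, 1.0)

# Per-image mean squared reconstruction error.
mse_per_image = np.mean((X - X_hat) ** 2, axis=1)
print(mse_per_image)
```

In anomaly detection, this per-example error is the key quantity: inputs whose reconstruction error exceeds a chosen threshold are flagged as anomalies, since the autoencoder reconstructs poorly what it did not see during training.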
10.6.6 Further Readings
- Doersch’s Tutorial on Variational Autoencoders
- Michelucci’s Deep Learning with TensorFlow 2