Train Sleep Detection Model¶
Date created: 2024/10/02
Last Modified: 2024/10/02
Description: Train a simple wrist-based sleep detection model using accelerometer data.
Overview¶
In this guide, we will train a small TCN network to detect sleep and wake stages using accelerometer data collected from the wrist.
Input
- Sensor: IMU
- Location: Wrist
- Sampling Rate: 0.2 Hz
- Frame Size: 60 seconds
Class Mapping
Classify activity into one of two categories: WAKE or SLEEP.
Base Class | Target Class | Label |
---|---|---|
0-WAKE | 0 | WAKE |
1-SLEEP | 1 | SLEEP |
Datasets
- CMIDSS: The Child Mind Institute - Detect Sleep States (CMIDSS) dataset comprises 300 subjects with over 500 multi-day recordings of wrist-worn accelerometer data annotated with two event types: onset, the beginning of sleep, and wakeup, the end of sleep.
Setup¶
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
import IPython
import contextlib
import tempfile
from pathlib import Path
import keras
import neuralspot_edge as nse
import sleepkit as sk
import matplotlib.pyplot as plt
import plotly.io as pio
# Be sure to set the dataset path to the correct location
datasets_dir = Path(os.getenv('SK_DATASET_PATH', '../../datasets'))
plot_theme = sk.utils.dark_theme
nse.utils.silence_tensorflow()
sk.utils.setup_plotting(plot_theme)
logger = nse.utils.setup_logger(__name__)
Configure datasets¶
We are going to train our model using the [CMIDSS Dataset](https://ambiqai.github.io/sleepkit/datasets/cmidss/). This dataset uses the slug cmidss within SleepKit. We will download the dataset if it is not already available.
datasets = [sk.NamedParams(
name="cmidss",
params=dict(
path=datasets_dir / "cmidss",
)
)]
Target classes¶
For this task, we are simply going to classify the data into two classes: WAKE and SLEEP.
class_map = {
sk.SleepStage.wake: 0,
sk.SleepStage.stage1: 1,
sk.SleepStage.stage2: 1,
sk.SleepStage.stage3: 1,
sk.SleepStage.stage4: 1,
sk.SleepStage.rem: 1,
}
class_names = ["WAKE", "SLEEP"]
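For example, any non-wake base stage collapses to the SLEEP target class:
# Example: any non-wake base stage maps to the SLEEP target class.
target = class_map[sk.SleepStage.stage2]  # -> 1
print(class_names[target])  # "SLEEP"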
Feature set¶
From the dataset, we will create a feature set using the FS-W-A-5 features. This feature set computes 5 features over 60-second windows captured from the accelerometer sensor collected on the wrist. The CMIDSS dataset already provides accelerometer data averaged over 5 seconds (i.e. Fs = 0.2 Hz). Therefore, we will use a frame size of 12 to capture 60 seconds of data (i.e. 12 samples at 0.2 Hz) with a 50% overlap.
feature = dict(
name="FS-W-A-5",
sampling_rate=0.2,
frame_size=12,
loader="hdf5",
feat_key="features",
label_key="detect_labels",
mask_key="mask",
feat_cols=None,
save_path=datasets_dir / "store" / "fs-w-a-5-60",
params={},
)
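As a quick sanity check of the window math above (illustrative only, not part of the SleepKit pipeline), these settings imply a 60-second window with a 30-second hop:
# Illustrative check of the feature windowing math (not part of SleepKit).
sampling_rate_hz = 0.2  # CMIDSS accelerometer averaged over 5 s -> 1 sample every 5 s
frame_size = 12         # samples per feature window
overlap = 0.5           # 50% overlap between consecutive windows
window_sec = frame_size / sampling_rate_hz               # 12 / 0.2 = 60 seconds
hop_sec = frame_size * (1 - overlap) / sampling_rate_hz  # 30 seconds between feature rows
print(f"window={window_sec:.0f}s, hop={hop_sec:.0f}s")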
Define TCN model architecture¶
For this task, we are going to leverage a customized TCN model architecture that is smaller and can handle 1D signals. The model consists of 4 TCN blocks with a depth of 1. Each block leverages dilated depthwise-separable convolutions along with inverted expansion and squeeze-and-excitation layers. The model is followed by a 1D convolutional layer and a final dense layer for classification. Unlike vision tasks, we leverage larger kernel sizes and strides to capture temporal dependencies in the signal.
architecture = sk.NamedParams(
name="tcn",
params=dict(
input_kernel=[1, 5],
input_norm="batch",
blocks=[
dict(depth=1, branch=1, filters=16, kernel=(1, 5), dilation=[1, 1], dropout=0.10, ex_ratio=1, se_ratio=4, norm="batch"),
dict(depth=1, branch=1, filters=32, kernel=(1, 5), dilation=[1, 2], dropout=0.10, ex_ratio=1, se_ratio=4, norm="batch"),
dict(depth=1, branch=1, filters=48, kernel=(1, 5), dilation=[1, 4], dropout=0.10, ex_ratio=1, se_ratio=4, norm="batch"),
dict(depth=1, branch=1, filters=64, kernel=(1, 5), dilation=[1, 8], dropout=0.10, ex_ratio=1, se_ratio=4, norm="batch")
],
output_kernel=(1, 5),
include_top=True,
use_logits=True,
model_name="tcn"
)
)
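To get a feel for how the dilation schedule grows the temporal context, here is a rough, illustrative estimate of the receptive field of the dilated convolution stack above (one depthwise convolution per block since depth=1; it ignores the input and output convolutions):
# Rough receptive-field estimate for the dilated conv stack (illustrative only).
kernel = 5
dilations = [1, 2, 4, 8]  # time-axis dilation of each TCN block above
rf = 1
for d in dilations:
    rf += (kernel - 1) * d  # each layer adds (kernel - 1) * dilation frames of context
print(f"receptive field ~ {rf} feature frames (~{rf * 30 // 60} minutes at a 30 s hop)")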
Task configuration¶
Here we provide the complete configuration for the task. This includes the dataset configuration, features, model architecture, and training parameters.
params = sk.TaskParams(
name="sk-detect",
job_dir=Path(tempfile.gettempdir()) / "sk-detect",
verbose=1,
datasets=datasets,
feature=feature,
sampling_rate=0.0083333,
frame_size=240,
num_classes=len(class_names),
class_map=class_map,
class_names=class_names,
samples_per_subject=100,
val_samples_per_subject=100,
test_samples_per_subject=50,
val_size=4000,
test_size=2500,
val_subjects=0.20,
batch_size=128,
buffer_size=10000,
epochs=200,
steps_per_epoch=25,
val_steps_per_epoch=25,
val_metric="loss",
lr_rate=1e-3,
lr_cycles=1,
label_smoothing=0,
test_metric="f1",
test_metric_threshold=0.02,
tflm_var_name="sk_detect_flatbuffer",
tflm_file="sk_detect_flatbuffer.h",
backend="pc",
display_report=False,
model_file="model.keras",
use_logits=False,
architecture=architecture
)
Load detect task¶
SleepKit provides a TaskFactory that includes a number of ready-to-use tasks. Each task provides methods for training, evaluating, exporting, and demoing. We will grab the detect task and configure it for our use case.
task = sk.TaskFactory.get("detect")
Download the datasets¶
We will download the datasets using the sleepkit library. If already downloaded, this step will be skipped.
task.download(params=params)
Generate the features¶
Next, we will generate the features from the given dataset. The features will be generated using the fs_w_a_5 feature set.
Once the command finishes, the feature set will be saved in the feature.save_path directory. These features will be stored in HDF5 files with one file per subject. Each HDF5 file will include the following entries:
- /features: Time x Feature tensor (fp32). Features are computed over windows of sensor data.
- /mask: Time x Mask tensor (bool). Mask indicates valid feature values.
- /detect_labels: Time x Label (int). Labels are awake/sleep.
task.feature(params=params)
Gen features for cmidss: 100%|██████████| 277/277 [01:59<00:00, 2.32it/s]
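If you want to peek at one of the generated files, a small snippet like the one below can be used (this assumes h5py is installed and that the per-subject files use a .h5 extension; it simply grabs the first subject file found in the save path):
import h5py

# Inspect one generated feature file (first subject file found in the save path).
subject_file = sorted(feature["save_path"].glob("*.h5"))[0]
with h5py.File(subject_file, "r") as h5:
    feats = h5["features"][:]        # Time x Feature (fp32)
    mask = h5["mask"][:]             # Time x Mask (bool)
    labels = h5["detect_labels"][:]  # Time x Label (int)
print(subject_file.name, feats.shape, mask.shape, labels.shape)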
Visualize the model¶
Let's quickly instantiate and visualize the model.
model = nse.models.tcn.TcnModel.model_from_params(
inputs=keras.Input(shape=(params.frame_size, 5), name="inputs"),
params=architecture.params,
num_classes=len(class_names)
)
model.summary(layer_range=('inputs', model.layers[10].name))
Model: "functional"
| Layer (type) | Output Shape | Param # | Connected to |
| --- | --- | --- | --- |
| inputs (InputLayer) | (None, 240, 5) | 0 | - |
| reshape (Reshape) | (None, 1, 240, 5) | 0 | inputs[0][0] |
| ENC.CN (DepthwiseConv2D) | (None, 1, 240, 5) | 25 | reshape[0][0] |
| ENC.BN (BatchNormalizatio…) | (None, 1, 240, 5) | 20 | ENC.CN[0][0] |
| B1.D1.DW.B1.CN (DepthwiseConv2D) | (None, 1, 240, 5) | 25 | ENC.BN[0][0] |
| B1.D1.DW.B1.BN (BatchNormalizatio…) | (None, 1, 240, 5) | 20 | B1.D1.DW.B1.CN[0… |
| B1.D1.DW.ACT (Activation) | (None, 1, 240, 5) | 0 | B1.D1.DW.B1.BN[0… |
| B1.D1.SE.pool (GlobalAveragePool…) | (None, 1, 1, 5) | 0 | B1.D1.DW.ACT[0][… |
| B1.D1.SE.sq (Conv2D) | (None, 1, 1, 1) | 6 | B1.D1.SE.pool[0]… |
| B1.D1.SE.sq.act (Activation) | (None, 1, 1, 1) | 0 | B1.D1.SE.sq[0][0] |
| B1.D1.SE.ex (Conv2D) | (None, 1, 1, 5) | 10 | B1.D1.SE.sq.act[… |
| B1.D1.SE.ex.act (Activation) | (None, 1, 1, 5) | 0 | B1.D1.SE.ex[0][0] |
| B1.D1.SE.ex.mul (Multiply) | (None, 1, 240, 5) | 0 | B1.D1.DW.ACT[0][…, B1.D1.SE.ex.act[… |
Total params: 9,364 (36.58 KB)
Trainable params: 8,832 (34.50 KB)
Non-trainable params: 532 (2.08 KB)
Train the model¶
At this point, we can train the model from the generated feature set for the sleep detect task. The model will be trained for 200 epochs with a batch size of 128 and a learning rate of 1e-3. The model will be fed a frame_size of 240 samples, which equates to 120 minutes of data.
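As a quick check of that figure (illustrative only), each feature row spans a 60-second window with a 30-second hop due to the 50% overlap, so 240 rows cover roughly two hours:
# Illustrative check of the model input duration (not part of the training code).
feature_hop_sec = 30  # 12-sample window at 0.2 Hz with 50% overlap
duration_min = params.frame_size * feature_hop_sec / 60
print(f"{params.frame_size} feature frames ~ {duration_min:.0f} minutes")  # ~ 120 minutes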
Using the task configuration, we will train the model on the dataset.
task.train(params)
Epoch 1/200
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR I0000 00:00:1727880905.268682 6547 service.cc:146] XLA service 0x7cb4a4011fd0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: I0000 00:00:1727880905.268702 6547 service.cc:154] StreamExecutor device (0): NVIDIA GeForce RTX 4090, Compute Capability 8.9
7/25 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - acc: 0.5802 - f1: 0.6041 - iou: 0.3636 - loss: 0.4143
I0000 00:00:1727880911.515249 6547 device_compiler.h:188] Compiled cluster using XLA! This line is logged at most once for the lifetime of the process.
25/25 ━━━━━━━━━━━━━━━━━━━━ 13s 80ms/step - acc: 0.6424 - f1: 0.6496 - iou: 0.4014 - loss: 0.3955 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.3363 Epoch 2/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 35ms/step - acc: 0.7426 - f1: 0.7359 - iou: 0.4776 - loss: 0.3269 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.2995 Epoch 3/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.7694 - f1: 0.7657 - iou: 0.5360 - loss: 0.2839 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.2656 Epoch 4/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.7952 - f1: 0.7877 - iou: 0.5491 - loss: 0.2479 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.2366 Epoch 5/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.7991 - f1: 0.7946 - iou: 0.5685 - loss: 0.2184 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.2112 Epoch 6/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.8120 - f1: 0.8088 - iou: 0.6102 - loss: 0.1927 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.1893 Epoch 7/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.8154 - f1: 0.8166 - iou: 0.6085 - loss: 0.1709 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.1704 Epoch 8/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.8334 - f1: 0.8362 - iou: 0.6520 - loss: 0.1513 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.1542 Epoch 9/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.8465 - f1: 0.8449 - iou: 0.6582 - loss: 0.1339 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.1395 Epoch 10/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.8530 - f1: 0.8566 - iou: 0.6995 - loss: 0.1197 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.1279 Epoch 11/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.8705 - f1: 0.8701 - iou: 0.7140 - loss: 0.1071 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.1168 Epoch 12/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.8801 - f1: 0.8817 - iou: 0.7295 - loss: 0.0959 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.1073 Epoch 13/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.8821 - f1: 0.8822 - iou: 0.7343 - loss: 0.0866 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0996 Epoch 14/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - acc: 0.9043 - f1: 0.9051 - iou: 0.7723 - loss: 0.0769 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0936 Epoch 15/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9053 - f1: 0.9056 - iou: 0.7832 - loss: 0.0701 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0872 Epoch 16/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9114 - f1: 0.9132 - iou: 0.7963 - loss: 0.0631 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0831 Epoch 17/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9196 - f1: 0.9200 - iou: 0.8157 - loss: 0.0569 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0792 Epoch 18/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9307 - f1: 0.9313 - iou: 0.8343 - loss: 0.0510 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0771 Epoch 19/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9224 - f1: 0.9230 - iou: 0.8160 - loss: 0.0484 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0732 Epoch 20/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9274 - f1: 0.9278 - iou: 0.8212 - loss: 0.0440 - val_acc: 0.7596 - 
val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0704 Epoch 21/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9152 - f1: 0.9169 - iou: 0.8028 - loss: 0.0418 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0626 Epoch 22/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9177 - f1: 0.9191 - iou: 0.8108 - loss: 0.0394 - val_acc: 0.7596 - val_f1: 0.6558 - val_iou: 0.3798 - val_loss: 0.0639 Epoch 23/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9278 - f1: 0.9286 - iou: 0.8268 - loss: 0.0357 - val_acc: 0.7594 - val_f1: 0.6563 - val_iou: 0.3803 - val_loss: 0.0605 Epoch 24/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9209 - f1: 0.9226 - iou: 0.8139 - loss: 0.0338 - val_acc: 0.7600 - val_f1: 0.6597 - val_iou: 0.3839 - val_loss: 0.0577 Epoch 25/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9236 - f1: 0.9249 - iou: 0.8217 - loss: 0.0318 - val_acc: 0.7676 - val_f1: 0.6847 - val_iou: 0.4114 - val_loss: 0.0487 Epoch 26/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - acc: 0.9201 - f1: 0.9216 - iou: 0.8130 - loss: 0.0305 - val_acc: 0.7635 - val_f1: 0.6727 - val_iou: 0.3979 - val_loss: 0.0540 Epoch 27/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9367 - f1: 0.9374 - iou: 0.8488 - loss: 0.0274 - val_acc: 0.7632 - val_f1: 0.6713 - val_iou: 0.3964 - val_loss: 0.0556 Epoch 28/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9260 - f1: 0.9274 - iou: 0.8260 - loss: 0.0270 - val_acc: 0.7709 - val_f1: 0.6933 - val_iou: 0.4214 - val_loss: 0.0478 Epoch 29/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - acc: 0.9237 - f1: 0.9252 - iou: 0.8189 - loss: 0.0259 - val_acc: 0.7616 - val_f1: 0.6638 - val_iou: 0.3882 - val_loss: 0.0633 Epoch 30/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9338 - f1: 0.9349 - iou: 0.8392 - loss: 0.0240 - val_acc: 0.7889 - val_f1: 0.7352 - val_iou: 0.4745 - val_loss: 0.0435 Epoch 31/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - acc: 0.9353 - f1: 0.9362 - iou: 0.8455 - loss: 0.0230 - val_acc: 0.7769 - val_f1: 0.7076 - val_iou: 0.4387 - val_loss: 0.0487 Epoch 32/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9331 - f1: 0.9341 - iou: 0.8339 - loss: 0.0219 - val_acc: 0.7825 - val_f1: 0.7210 - val_iou: 0.4557 - val_loss: 0.0451 Epoch 33/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9316 - f1: 0.9327 - iou: 0.8439 - loss: 0.0213 - val_acc: 0.7917 - val_f1: 0.7400 - val_iou: 0.4811 - val_loss: 0.0414 Epoch 34/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9280 - f1: 0.9290 - iou: 0.8377 - loss: 0.0215 - val_acc: 0.8168 - val_f1: 0.7842 - val_iou: 0.5459 - val_loss: 0.0352 Epoch 35/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9336 - f1: 0.9349 - iou: 0.8405 - loss: 0.0197 - val_acc: 0.8428 - val_f1: 0.8241 - val_iou: 0.6127 - val_loss: 0.0297 Epoch 36/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9314 - f1: 0.9327 - iou: 0.8417 - loss: 0.0194 - val_acc: 0.8045 - val_f1: 0.7640 - val_iou: 0.5153 - val_loss: 0.0356 Epoch 37/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9366 - f1: 0.9375 - iou: 0.8476 - loss: 0.0183 - val_acc: 0.7892 - val_f1: 0.7341 - val_iou: 0.4729 - val_loss: 0.0422 Epoch 38/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9399 - f1: 0.9412 - iou: 0.8570 - loss: 0.0173 - val_acc: 0.7822 - val_f1: 0.7176 - val_iou: 0.4511 - val_loss: 0.0469 Epoch 39/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9421 - f1: 0.9429 - iou: 0.8587 - loss: 0.0166 - val_acc: 0.8549 - val_f1: 0.8420 - val_iou: 0.6459 - val_loss: 0.0287 Epoch 40/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 
0.9376 - f1: 0.9383 - iou: 0.8513 - loss: 0.0166 - val_acc: 0.8677 - val_f1: 0.8584 - val_iou: 0.6769 - val_loss: 0.0247 Epoch 41/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9298 - f1: 0.9310 - iou: 0.8284 - loss: 0.0170 - val_acc: 0.8873 - val_f1: 0.8860 - val_iou: 0.7365 - val_loss: 0.0226 Epoch 42/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9367 - f1: 0.9378 - iou: 0.8478 - loss: 0.0164 - val_acc: 0.8938 - val_f1: 0.8963 - val_iou: 0.7630 - val_loss: 0.0201 Epoch 43/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9432 - f1: 0.9441 - iou: 0.8625 - loss: 0.0151 - val_acc: 0.8997 - val_f1: 0.8997 - val_iou: 0.7655 - val_loss: 0.0194 Epoch 44/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9439 - f1: 0.9446 - iou: 0.8636 - loss: 0.0147 - val_acc: 0.8770 - val_f1: 0.8682 - val_iou: 0.6950 - val_loss: 0.0243 Epoch 45/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9415 - f1: 0.9421 - iou: 0.8580 - loss: 0.0149 - val_acc: 0.9113 - val_f1: 0.9121 - val_iou: 0.7922 - val_loss: 0.0189 Epoch 46/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9383 - f1: 0.9394 - iou: 0.8530 - loss: 0.0150 - val_acc: 0.9101 - val_f1: 0.9107 - val_iou: 0.7888 - val_loss: 0.0178 Epoch 47/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9337 - f1: 0.9348 - iou: 0.8452 - loss: 0.0155 - val_acc: 0.9075 - val_f1: 0.9078 - val_iou: 0.7823 - val_loss: 0.0178 Epoch 48/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9431 - f1: 0.9439 - iou: 0.8686 - loss: 0.0144 - val_acc: 0.9060 - val_f1: 0.9042 - val_iou: 0.7716 - val_loss: 0.0182 Epoch 49/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9450 - f1: 0.9457 - iou: 0.8727 - loss: 0.0136 - val_acc: 0.9094 - val_f1: 0.9120 - val_iou: 0.7958 - val_loss: 0.0171 Epoch 50/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9408 - f1: 0.9418 - iou: 0.8567 - loss: 0.0139 - val_acc: 0.9075 - val_f1: 0.9099 - val_iou: 0.7910 - val_loss: 0.0169 Epoch 51/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9387 - f1: 0.9399 - iou: 0.8487 - loss: 0.0137 - val_acc: 0.9144 - val_f1: 0.9158 - val_iou: 0.8010 - val_loss: 0.0165 Epoch 52/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - acc: 0.9346 - f1: 0.9350 - iou: 0.8463 - loss: 0.0142 - val_acc: 0.9045 - val_f1: 0.9081 - val_iou: 0.7897 - val_loss: 0.0169 Epoch 53/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9390 - f1: 0.9395 - iou: 0.8490 - loss: 0.0145 - val_acc: 0.9114 - val_f1: 0.9135 - val_iou: 0.7979 - val_loss: 0.0161 Epoch 54/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9416 - f1: 0.9423 - iou: 0.8567 - loss: 0.0129 - val_acc: 0.8999 - val_f1: 0.9037 - val_iou: 0.7811 - val_loss: 0.0175 Epoch 55/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9421 - f1: 0.9428 - iou: 0.8567 - loss: 0.0128 - val_acc: 0.9144 - val_f1: 0.9147 - val_iou: 0.7969 - val_loss: 0.0172 Epoch 56/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9422 - f1: 0.9431 - iou: 0.8513 - loss: 0.0128 - val_acc: 0.8573 - val_f1: 0.8423 - val_iou: 0.6448 - val_loss: 0.0300 Epoch 57/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9484 - f1: 0.9492 - iou: 0.8688 - loss: 0.0116 - val_acc: 0.9030 - val_f1: 0.9036 - val_iou: 0.7743 - val_loss: 0.0177 Epoch 58/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9424 - f1: 0.9430 - iou: 0.8671 - loss: 0.0133 - val_acc: 0.8995 - val_f1: 0.8980 - val_iou: 0.7596 - val_loss: 0.0184 Epoch 59/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9425 - f1: 0.9432 - iou: 0.8596 - loss: 0.0123 - val_acc: 0.9223 - val_f1: 0.9233 - val_iou: 0.8162 - val_loss: 
0.0146 Epoch 60/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9324 - f1: 0.9329 - iou: 0.8439 - loss: 0.0148 - val_acc: 0.9216 - val_f1: 0.9219 - val_iou: 0.8120 - val_loss: 0.0146 Epoch 61/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9506 - f1: 0.9511 - iou: 0.8775 - loss: 0.0112 - val_acc: 0.9112 - val_f1: 0.9147 - val_iou: 0.8037 - val_loss: 0.0166 Epoch 62/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9426 - f1: 0.9431 - iou: 0.8659 - loss: 0.0126 - val_acc: 0.9232 - val_f1: 0.9242 - val_iou: 0.8184 - val_loss: 0.0143 Epoch 63/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9486 - f1: 0.9494 - iou: 0.8724 - loss: 0.0116 - val_acc: 0.9201 - val_f1: 0.9226 - val_iou: 0.8179 - val_loss: 0.0151 Epoch 64/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9324 - f1: 0.9332 - iou: 0.8377 - loss: 0.0138 - val_acc: 0.8908 - val_f1: 0.8965 - val_iou: 0.7708 - val_loss: 0.0205 Epoch 65/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9397 - f1: 0.9407 - iou: 0.8572 - loss: 0.0123 - val_acc: 0.9289 - val_f1: 0.9304 - val_iou: 0.8330 - val_loss: 0.0146 Epoch 66/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9468 - f1: 0.9475 - iou: 0.8720 - loss: 0.0112 - val_acc: 0.9133 - val_f1: 0.9168 - val_iou: 0.8077 - val_loss: 0.0149 Epoch 67/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9407 - f1: 0.9418 - iou: 0.8597 - loss: 0.0122 - val_acc: 0.9174 - val_f1: 0.9200 - val_iou: 0.8124 - val_loss: 0.0146 Epoch 68/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9461 - f1: 0.9464 - iou: 0.8674 - loss: 0.0110 - val_acc: 0.9084 - val_f1: 0.9123 - val_iou: 0.7996 - val_loss: 0.0154 Epoch 69/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9291 - f1: 0.9302 - iou: 0.8383 - loss: 0.0131 - val_acc: 0.9256 - val_f1: 0.9274 - val_iou: 0.8270 - val_loss: 0.0139 Epoch 70/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9466 - f1: 0.9473 - iou: 0.8730 - loss: 0.0117 - val_acc: 0.9168 - val_f1: 0.9195 - val_iou: 0.8119 - val_loss: 0.0143 Epoch 71/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9443 - f1: 0.9452 - iou: 0.8654 - loss: 0.0116 - val_acc: 0.9243 - val_f1: 0.9264 - val_iou: 0.8254 - val_loss: 0.0137 Epoch 72/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - acc: 0.9486 - f1: 0.9494 - iou: 0.8737 - loss: 0.0117 - val_acc: 0.9260 - val_f1: 0.9276 - val_iou: 0.8269 - val_loss: 0.0143 Epoch 73/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9464 - f1: 0.9471 - iou: 0.8724 - loss: 0.0114 - val_acc: 0.9246 - val_f1: 0.9244 - val_iou: 0.8163 - val_loss: 0.0142 Epoch 74/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9444 - f1: 0.9449 - iou: 0.8658 - loss: 0.0116 - val_acc: 0.9275 - val_f1: 0.9289 - val_iou: 0.8292 - val_loss: 0.0134 Epoch 75/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - acc: 0.9477 - f1: 0.9484 - iou: 0.8706 - loss: 0.0107 - val_acc: 0.9225 - val_f1: 0.9249 - val_iou: 0.8228 - val_loss: 0.0137 Epoch 76/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9462 - f1: 0.9469 - iou: 0.8717 - loss: 0.0121 - val_acc: 0.9197 - val_f1: 0.9224 - val_iou: 0.8178 - val_loss: 0.0141 Epoch 77/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9492 - f1: 0.9498 - iou: 0.8743 - loss: 0.0102 - val_acc: 0.9013 - val_f1: 0.9060 - val_iou: 0.7882 - val_loss: 0.0183 Epoch 78/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9514 - f1: 0.9519 - iou: 0.8800 - loss: 0.0106 - val_acc: 0.9171 - val_f1: 0.9201 - val_iou: 0.8140 - val_loss: 0.0160 Epoch 79/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9480 - f1: 0.9488 - iou: 0.8691 - loss: 0.0110 
- val_acc: 0.9244 - val_f1: 0.9265 - val_iou: 0.8258 - val_loss: 0.0134 Epoch 80/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9382 - f1: 0.9394 - iou: 0.8458 - loss: 0.0116 - val_acc: 0.9140 - val_f1: 0.9174 - val_iou: 0.8089 - val_loss: 0.0148 Epoch 81/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9425 - f1: 0.9436 - iou: 0.8597 - loss: 0.0117 - val_acc: 0.9157 - val_f1: 0.9189 - val_iou: 0.8120 - val_loss: 0.0161 Epoch 82/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9513 - f1: 0.9517 - iou: 0.8822 - loss: 0.0109 - val_acc: 0.9190 - val_f1: 0.9219 - val_iou: 0.8177 - val_loss: 0.0141 Epoch 83/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9527 - f1: 0.9534 - iou: 0.8803 - loss: 0.0101 - val_acc: 0.9193 - val_f1: 0.9221 - val_iou: 0.8178 - val_loss: 0.0154 Epoch 84/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9502 - f1: 0.9508 - iou: 0.8778 - loss: 0.0110 - val_acc: 0.9191 - val_f1: 0.9220 - val_iou: 0.8176 - val_loss: 0.0142 Epoch 85/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9410 - f1: 0.9419 - iou: 0.8604 - loss: 0.0115 - val_acc: 0.9278 - val_f1: 0.9297 - val_iou: 0.8324 - val_loss: 0.0130 Epoch 86/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - acc: 0.9442 - f1: 0.9453 - iou: 0.8597 - loss: 0.0108 - val_acc: 0.9228 - val_f1: 0.9254 - val_iou: 0.8243 - val_loss: 0.0144 Epoch 87/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9446 - f1: 0.9455 - iou: 0.8676 - loss: 0.0108 - val_acc: 0.9276 - val_f1: 0.9297 - val_iou: 0.8328 - val_loss: 0.0131 Epoch 88/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9504 - f1: 0.9511 - iou: 0.8790 - loss: 0.0101 - val_acc: 0.9083 - val_f1: 0.9124 - val_iou: 0.8000 - val_loss: 0.0184 Epoch 89/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9469 - f1: 0.9479 - iou: 0.8692 - loss: 0.0106 - val_acc: 0.9253 - val_f1: 0.9277 - val_iou: 0.8287 - val_loss: 0.0135 Epoch 90/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9455 - f1: 0.9463 - iou: 0.8681 - loss: 0.0108 - val_acc: 0.9295 - val_f1: 0.9318 - val_iou: 0.8378 - val_loss: 0.0135 Epoch 91/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9455 - f1: 0.9462 - iou: 0.8664 - loss: 0.0104 - val_acc: 0.9224 - val_f1: 0.9251 - val_iou: 0.8239 - val_loss: 0.0137 Epoch 92/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9446 - f1: 0.9451 - iou: 0.8661 - loss: 0.0110 - val_acc: 0.9234 - val_f1: 0.9261 - val_iou: 0.8263 - val_loss: 0.0148 Epoch 93/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9420 - f1: 0.9431 - iou: 0.8610 - loss: 0.0116 - val_acc: 0.9197 - val_f1: 0.9227 - val_iou: 0.8193 - val_loss: 0.0143 Epoch 94/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - acc: 0.9508 - f1: 0.9513 - iou: 0.8832 - loss: 0.0104 - val_acc: 0.9196 - val_f1: 0.9225 - val_iou: 0.8190 - val_loss: 0.0139 Epoch 95/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9502 - f1: 0.9506 - iou: 0.8771 - loss: 0.0102 - val_acc: 0.9295 - val_f1: 0.9314 - val_iou: 0.8361 - val_loss: 0.0131 Epoch 96/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9570 - f1: 0.9574 - iou: 0.8918 - loss: 0.0092 - val_acc: 0.9348 - val_f1: 0.9358 - val_iou: 0.8438 - val_loss: 0.0123 Epoch 97/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - acc: 0.9497 - f1: 0.9502 - iou: 0.8796 - loss: 0.0106 - val_acc: 0.9120 - val_f1: 0.9157 - val_iou: 0.8062 - val_loss: 0.0152 Epoch 98/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9431 - f1: 0.9442 - iou: 0.8592 - loss: 0.0114 - val_acc: 0.9139 - val_f1: 0.9175 - val_iou: 0.8100 - val_loss: 0.0157 Epoch 99/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 
24ms/step - acc: 0.9483 - f1: 0.9489 - iou: 0.8771 - loss: 0.0111 - val_acc: 0.9368 - val_f1: 0.9382 - val_iou: 0.8501 - val_loss: 0.0120 Epoch 100/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9413 - f1: 0.9421 - iou: 0.8579 - loss: 0.0110 - val_acc: 0.9081 - val_f1: 0.9123 - val_iou: 0.8000 - val_loss: 0.0158 Epoch 101/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9489 - f1: 0.9498 - iou: 0.8752 - loss: 0.0100 - val_acc: 0.9314 - val_f1: 0.9333 - val_iou: 0.8402 - val_loss: 0.0124 Epoch 102/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9418 - f1: 0.9427 - iou: 0.8575 - loss: 0.0111 - val_acc: 0.9379 - val_f1: 0.9388 - val_iou: 0.8503 - val_loss: 0.0118 Epoch 103/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9516 - f1: 0.9522 - iou: 0.8802 - loss: 0.0101 - val_acc: 0.9332 - val_f1: 0.9349 - val_iou: 0.8433 - val_loss: 0.0130 Epoch 104/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9458 - f1: 0.9468 - iou: 0.8600 - loss: 0.0101 - val_acc: 0.9303 - val_f1: 0.9323 - val_iou: 0.8381 - val_loss: 0.0126 Epoch 105/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - acc: 0.9438 - f1: 0.9448 - iou: 0.8622 - loss: 0.0108 - val_acc: 0.9275 - val_f1: 0.9297 - val_iou: 0.8331 - val_loss: 0.0125 Epoch 106/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 29ms/step - acc: 0.9514 - f1: 0.9520 - iou: 0.8796 - loss: 0.0106 - val_acc: 0.9162 - val_f1: 0.9195 - val_iou: 0.8134 - val_loss: 0.0147 Epoch 107/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9446 - f1: 0.9457 - iou: 0.8631 - loss: 0.0107 - val_acc: 0.9249 - val_f1: 0.9275 - val_iou: 0.8290 - val_loss: 0.0138 Epoch 108/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 36ms/step - acc: 0.9493 - f1: 0.9502 - iou: 0.8735 - loss: 0.0100 - val_acc: 0.9346 - val_f1: 0.9364 - val_iou: 0.8466 - val_loss: 0.0119 Epoch 109/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9518 - f1: 0.9525 - iou: 0.8761 - loss: 0.0102 - val_acc: 0.9309 - val_f1: 0.9331 - val_iou: 0.8405 - val_loss: 0.0128 Epoch 110/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9451 - f1: 0.9464 - iou: 0.8617 - loss: 0.0105 - val_acc: 0.9307 - val_f1: 0.9327 - val_iou: 0.8394 - val_loss: 0.0124 Epoch 111/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - acc: 0.9495 - f1: 0.9501 - iou: 0.8743 - loss: 0.0099 - val_acc: 0.9312 - val_f1: 0.9332 - val_iou: 0.8401 - val_loss: 0.0125 Epoch 112/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9540 - f1: 0.9547 - iou: 0.8830 - loss: 0.0095 - val_acc: 0.9202 - val_f1: 0.9233 - val_iou: 0.8210 - val_loss: 0.0150 Epoch 113/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 36ms/step - acc: 0.9475 - f1: 0.9487 - iou: 0.8701 - loss: 0.0103 - val_acc: 0.9338 - val_f1: 0.9355 - val_iou: 0.8448 - val_loss: 0.0119 Epoch 114/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9517 - f1: 0.9525 - iou: 0.8793 - loss: 0.0099 - val_acc: 0.9360 - val_f1: 0.9375 - val_iou: 0.8485 - val_loss: 0.0123 Epoch 115/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9447 - f1: 0.9454 - iou: 0.8623 - loss: 0.0113 - val_acc: 0.9244 - val_f1: 0.9270 - val_iou: 0.8280 - val_loss: 0.0129 Epoch 116/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9503 - f1: 0.9509 - iou: 0.8784 - loss: 0.0100 - val_acc: 0.9249 - val_f1: 0.9273 - val_iou: 0.8280 - val_loss: 0.0136 Epoch 117/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9520 - f1: 0.9525 - iou: 0.8810 - loss: 0.0101 - val_acc: 0.9227 - val_f1: 0.9255 - val_iou: 0.8253 - val_loss: 0.0136 Epoch 118/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 29ms/step - acc: 0.9516 - f1: 0.9522 - iou: 0.8837 - loss: 0.0102 - val_acc: 0.9322 - val_f1: 
0.9343 - val_iou: 0.8431 - val_loss: 0.0138 Epoch 119/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9521 - f1: 0.9528 - iou: 0.8843 - loss: 0.0099 - val_acc: 0.9248 - val_f1: 0.9275 - val_iou: 0.8294 - val_loss: 0.0147 Epoch 120/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - acc: 0.9550 - f1: 0.9556 - iou: 0.8865 - loss: 0.0094 - val_acc: 0.9344 - val_f1: 0.9362 - val_iou: 0.8464 - val_loss: 0.0120 Epoch 121/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9455 - f1: 0.9464 - iou: 0.8658 - loss: 0.0108 - val_acc: 0.9359 - val_f1: 0.9376 - val_iou: 0.8496 - val_loss: 0.0120 Epoch 122/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9517 - f1: 0.9525 - iou: 0.8791 - loss: 0.0100 - val_acc: 0.9236 - val_f1: 0.9263 - val_iou: 0.8268 - val_loss: 0.0130 Epoch 123/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9565 - f1: 0.9571 - iou: 0.8947 - loss: 0.0092 - val_acc: 0.9259 - val_f1: 0.9286 - val_iou: 0.8316 - val_loss: 0.0146 Epoch 124/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9477 - f1: 0.9487 - iou: 0.8694 - loss: 0.0102 - val_acc: 0.9288 - val_f1: 0.9313 - val_iou: 0.8370 - val_loss: 0.0129 Epoch 125/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - acc: 0.9469 - f1: 0.9480 - iou: 0.8663 - loss: 0.0102 - val_acc: 0.9231 - val_f1: 0.9260 - val_iou: 0.8265 - val_loss: 0.0134 Epoch 126/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9491 - f1: 0.9499 - iou: 0.8731 - loss: 0.0097 - val_acc: 0.9270 - val_f1: 0.9295 - val_iou: 0.8335 - val_loss: 0.0132 Epoch 127/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 37ms/step - acc: 0.9496 - f1: 0.9505 - iou: 0.8692 - loss: 0.0097 - val_acc: 0.9392 - val_f1: 0.9406 - val_iou: 0.8553 - val_loss: 0.0113 Epoch 128/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9505 - f1: 0.9510 - iou: 0.8845 - loss: 0.0100 - val_acc: 0.9387 - val_f1: 0.9401 - val_iou: 0.8542 - val_loss: 0.0114 Epoch 129/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9448 - f1: 0.9453 - iou: 0.8596 - loss: 0.0109 - val_acc: 0.9356 - val_f1: 0.9375 - val_iou: 0.8495 - val_loss: 0.0119 Epoch 130/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9479 - f1: 0.9485 - iou: 0.8739 - loss: 0.0107 - val_acc: 0.9318 - val_f1: 0.9340 - val_iou: 0.8425 - val_loss: 0.0128 Epoch 131/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9580 - f1: 0.9584 - iou: 0.8994 - loss: 0.0091 - val_acc: 0.9331 - val_f1: 0.9351 - val_iou: 0.8444 - val_loss: 0.0126 Epoch 132/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9563 - f1: 0.9568 - iou: 0.8925 - loss: 0.0093 - val_acc: 0.9254 - val_f1: 0.9280 - val_iou: 0.8300 - val_loss: 0.0130 Epoch 133/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9492 - f1: 0.9501 - iou: 0.8690 - loss: 0.0105 - val_acc: 0.9268 - val_f1: 0.9293 - val_iou: 0.8324 - val_loss: 0.0125 Epoch 134/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9517 - f1: 0.9527 - iou: 0.8806 - loss: 0.0095 - val_acc: 0.9232 - val_f1: 0.9261 - val_iou: 0.8265 - val_loss: 0.0132 Epoch 135/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9548 - f1: 0.9553 - iou: 0.8878 - loss: 0.0092 - val_acc: 0.9273 - val_f1: 0.9297 - val_iou: 0.8335 - val_loss: 0.0125 Epoch 136/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 29ms/step - acc: 0.9474 - f1: 0.9478 - iou: 0.8689 - loss: 0.0102 - val_acc: 0.9279 - val_f1: 0.9304 - val_iou: 0.8349 - val_loss: 0.0137 Epoch 137/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9513 - f1: 0.9523 - iou: 0.8797 - loss: 0.0101 - val_acc: 0.9353 - val_f1: 0.9371 - val_iou: 0.8487 - val_loss: 0.0124 Epoch 138/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 
26ms/step - acc: 0.9526 - f1: 0.9531 - iou: 0.8869 - loss: 0.0100 - val_acc: 0.9325 - val_f1: 0.9346 - val_iou: 0.8436 - val_loss: 0.0122 Epoch 139/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9512 - f1: 0.9518 - iou: 0.8831 - loss: 0.0093 - val_acc: 0.9273 - val_f1: 0.9298 - val_iou: 0.8340 - val_loss: 0.0138 Epoch 140/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9534 - f1: 0.9541 - iou: 0.8865 - loss: 0.0091 - val_acc: 0.9411 - val_f1: 0.9422 - val_iou: 0.8585 - val_loss: 0.0112 Epoch 141/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9524 - f1: 0.9529 - iou: 0.8814 - loss: 0.0096 - val_acc: 0.9325 - val_f1: 0.9346 - val_iou: 0.8437 - val_loss: 0.0123 Epoch 142/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - acc: 0.9473 - f1: 0.9480 - iou: 0.8716 - loss: 0.0101 - val_acc: 0.9386 - val_f1: 0.9402 - val_iou: 0.8547 - val_loss: 0.0113 Epoch 143/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9483 - f1: 0.9490 - iou: 0.8731 - loss: 0.0100 - val_acc: 0.9389 - val_f1: 0.9405 - val_iou: 0.8557 - val_loss: 0.0112 Epoch 144/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9517 - f1: 0.9523 - iou: 0.8841 - loss: 0.0096 - val_acc: 0.9267 - val_f1: 0.9293 - val_iou: 0.8328 - val_loss: 0.0127 Epoch 145/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9516 - f1: 0.9524 - iou: 0.8816 - loss: 0.0097 - val_acc: 0.9384 - val_f1: 0.9400 - val_iou: 0.8544 - val_loss: 0.0114 Epoch 146/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9535 - f1: 0.9537 - iou: 0.8898 - loss: 0.0095 - val_acc: 0.9317 - val_f1: 0.9338 - val_iou: 0.8420 - val_loss: 0.0128 Epoch 147/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9564 - f1: 0.9569 - iou: 0.8933 - loss: 0.0088 - val_acc: 0.9394 - val_f1: 0.9409 - val_iou: 0.8562 - val_loss: 0.0117 Epoch 148/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9532 - f1: 0.9538 - iou: 0.8774 - loss: 0.0090 - val_acc: 0.9281 - val_f1: 0.9306 - val_iou: 0.8356 - val_loss: 0.0131 Epoch 149/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9555 - f1: 0.9559 - iou: 0.8915 - loss: 0.0094 - val_acc: 0.9353 - val_f1: 0.9373 - val_iou: 0.8494 - val_loss: 0.0126 Epoch 150/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9530 - f1: 0.9537 - iou: 0.8853 - loss: 0.0095 - val_acc: 0.9207 - val_f1: 0.9237 - val_iou: 0.8220 - val_loss: 0.0143 Epoch 151/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9575 - f1: 0.9582 - iou: 0.8921 - loss: 0.0087 - val_acc: 0.9392 - val_f1: 0.9407 - val_iou: 0.8559 - val_loss: 0.0113 Epoch 152/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9489 - f1: 0.9495 - iou: 0.8713 - loss: 0.0099 - val_acc: 0.9330 - val_f1: 0.9350 - val_iou: 0.8445 - val_loss: 0.0122 Epoch 153/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9499 - f1: 0.9503 - iou: 0.8806 - loss: 0.0096 - val_acc: 0.9405 - val_f1: 0.9419 - val_iou: 0.8585 - val_loss: 0.0113 Epoch 154/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9460 - f1: 0.9468 - iou: 0.8681 - loss: 0.0100 - val_acc: 0.9386 - val_f1: 0.9402 - val_iou: 0.8549 - val_loss: 0.0113 Epoch 155/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9533 - f1: 0.9535 - iou: 0.8871 - loss: 0.0094 - val_acc: 0.9362 - val_f1: 0.9381 - val_iou: 0.8508 - val_loss: 0.0120 Epoch 156/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9489 - f1: 0.9498 - iou: 0.8694 - loss: 0.0100 - val_acc: 0.9375 - val_f1: 0.9394 - val_iou: 0.8537 - val_loss: 0.0117 Epoch 157/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9540 - f1: 0.9544 - iou: 0.8870 - loss: 0.0090 - val_acc: 0.9296 - val_f1: 
0.9320 - val_iou: 0.8385 - val_loss: 0.0130 Epoch 158/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9604 - f1: 0.9608 - iou: 0.8996 - loss: 0.0084 - val_acc: 0.9320 - val_f1: 0.9342 - val_iou: 0.8427 - val_loss: 0.0128 Epoch 159/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9566 - f1: 0.9569 - iou: 0.8928 - loss: 0.0087 - val_acc: 0.9344 - val_f1: 0.9364 - val_iou: 0.8475 - val_loss: 0.0124 Epoch 160/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9624 - f1: 0.9630 - iou: 0.9030 - loss: 0.0082 - val_acc: 0.9346 - val_f1: 0.9366 - val_iou: 0.8477 - val_loss: 0.0119 Epoch 161/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9576 - f1: 0.9580 - iou: 0.8941 - loss: 0.0089 - val_acc: 0.9300 - val_f1: 0.9323 - val_iou: 0.8393 - val_loss: 0.0134 Epoch 162/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9606 - f1: 0.9612 - iou: 0.9038 - loss: 0.0085 - val_acc: 0.9340 - val_f1: 0.9361 - val_iou: 0.8469 - val_loss: 0.0127 Epoch 163/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9546 - f1: 0.9551 - iou: 0.8882 - loss: 0.0095 - val_acc: 0.9420 - val_f1: 0.9433 - val_iou: 0.8615 - val_loss: 0.0111 Epoch 164/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9460 - f1: 0.9470 - iou: 0.8642 - loss: 0.0100 - val_acc: 0.9402 - val_f1: 0.9417 - val_iou: 0.8582 - val_loss: 0.0112 Epoch 165/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9490 - f1: 0.9497 - iou: 0.8757 - loss: 0.0098 - val_acc: 0.9316 - val_f1: 0.9338 - val_iou: 0.8420 - val_loss: 0.0121 Epoch 166/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9565 - f1: 0.9568 - iou: 0.8938 - loss: 0.0088 - val_acc: 0.9299 - val_f1: 0.9323 - val_iou: 0.8392 - val_loss: 0.0129 Epoch 167/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9528 - f1: 0.9534 - iou: 0.8868 - loss: 0.0095 - val_acc: 0.9340 - val_f1: 0.9361 - val_iou: 0.8471 - val_loss: 0.0124 Epoch 168/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9547 - f1: 0.9554 - iou: 0.8871 - loss: 0.0095 - val_acc: 0.9379 - val_f1: 0.9397 - val_iou: 0.8543 - val_loss: 0.0117 Epoch 169/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9583 - f1: 0.9587 - iou: 0.8958 - loss: 0.0090 - val_acc: 0.9330 - val_f1: 0.9351 - val_iou: 0.8449 - val_loss: 0.0126 Epoch 170/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - acc: 0.9584 - f1: 0.9588 - iou: 0.8966 - loss: 0.0089 - val_acc: 0.9307 - val_f1: 0.9330 - val_iou: 0.8406 - val_loss: 0.0129 Epoch 171/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 29ms/step - acc: 0.9616 - f1: 0.9620 - iou: 0.9034 - loss: 0.0083 - val_acc: 0.9360 - val_f1: 0.9380 - val_iou: 0.8508 - val_loss: 0.0123 Epoch 172/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - acc: 0.9563 - f1: 0.9565 - iou: 0.8916 - loss: 0.0092 - val_acc: 0.9326 - val_f1: 0.9348 - val_iou: 0.8443 - val_loss: 0.0129 Epoch 173/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9524 - f1: 0.9530 - iou: 0.8802 - loss: 0.0092 - val_acc: 0.9340 - val_f1: 0.9361 - val_iou: 0.8469 - val_loss: 0.0123 Epoch 174/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - acc: 0.9523 - f1: 0.9530 - iou: 0.8803 - loss: 0.0095 - val_acc: 0.9386 - val_f1: 0.9404 - val_iou: 0.8557 - val_loss: 0.0116 Epoch 175/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - acc: 0.9529 - f1: 0.9536 - iou: 0.8810 - loss: 0.0091 - val_acc: 0.9372 - val_f1: 0.9390 - val_iou: 0.8529 - val_loss: 0.0118 Epoch 176/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 35ms/step - acc: 0.9571 - f1: 0.9574 - iou: 0.8940 - loss: 0.0089 - val_acc: 0.9342 - val_f1: 0.9362 - val_iou: 0.8471 - val_loss: 0.0124 Epoch 177/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 
28ms/step - acc: 0.9430 - f1: 0.9437 - iou: 0.8637 - loss: 0.0106 - val_acc: 0.9329 - val_f1: 0.9350 - val_iou: 0.8446 - val_loss: 0.0124 Epoch 178/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9535 - f1: 0.9541 - iou: 0.8860 - loss: 0.0094 - val_acc: 0.9324 - val_f1: 0.9346 - val_iou: 0.8438 - val_loss: 0.0124 Epoch 179/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9615 - f1: 0.9619 - iou: 0.9038 - loss: 0.0082 - val_acc: 0.9324 - val_f1: 0.9346 - val_iou: 0.8437 - val_loss: 0.0125 Epoch 180/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9557 - f1: 0.9561 - iou: 0.8909 - loss: 0.0091 - val_acc: 0.9328 - val_f1: 0.9349 - val_iou: 0.8445 - val_loss: 0.0124 Epoch 181/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9544 - f1: 0.9547 - iou: 0.8889 - loss: 0.0096 - val_acc: 0.9324 - val_f1: 0.9346 - val_iou: 0.8437 - val_loss: 0.0125 Epoch 182/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - acc: 0.9514 - f1: 0.9521 - iou: 0.8815 - loss: 0.0096 - val_acc: 0.9325 - val_f1: 0.9347 - val_iou: 0.8439 - val_loss: 0.0124 Epoch 183/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - acc: 0.9459 - f1: 0.9462 - iou: 0.8747 - loss: 0.0115 - val_acc: 0.9330 - val_f1: 0.9351 - val_iou: 0.8449 - val_loss: 0.0123 Epoch 184/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9504 - f1: 0.9509 - iou: 0.8806 - loss: 0.0093 - val_acc: 0.9352 - val_f1: 0.9372 - val_iou: 0.8490 - val_loss: 0.0119 Epoch 185/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9499 - f1: 0.9503 - iou: 0.8773 - loss: 0.0093 - val_acc: 0.9353 - val_f1: 0.9372 - val_iou: 0.8491 - val_loss: 0.0119 Epoch 186/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 29ms/step - acc: 0.9576 - f1: 0.9583 - iou: 0.8935 - loss: 0.0091 - val_acc: 0.9352 - val_f1: 0.9372 - val_iou: 0.8490 - val_loss: 0.0119 Epoch 187/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9477 - f1: 0.9484 - iou: 0.8735 - loss: 0.0097 - val_acc: 0.9350 - val_f1: 0.9369 - val_iou: 0.8485 - val_loss: 0.0119 Epoch 188/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - acc: 0.9592 - f1: 0.9597 - iou: 0.8980 - loss: 0.0085 - val_acc: 0.9346 - val_f1: 0.9366 - val_iou: 0.8478 - val_loss: 0.0120 Epoch 189/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9576 - f1: 0.9579 - iou: 0.8971 - loss: 0.0087 - val_acc: 0.9343 - val_f1: 0.9364 - val_iou: 0.8473 - val_loss: 0.0120 Epoch 190/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - acc: 0.9539 - f1: 0.9544 - iou: 0.8848 - loss: 0.0085 - val_acc: 0.9340 - val_f1: 0.9360 - val_iou: 0.8466 - val_loss: 0.0121 Epoch 191/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9629 - f1: 0.9633 - iou: 0.9051 - loss: 0.0077 - val_acc: 0.9338 - val_f1: 0.9359 - val_iou: 0.8464 - val_loss: 0.0122 Epoch 192/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 29ms/step - acc: 0.9532 - f1: 0.9538 - iou: 0.8860 - loss: 0.0097 - val_acc: 0.9340 - val_f1: 0.9360 - val_iou: 0.8467 - val_loss: 0.0122 Epoch 193/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - acc: 0.9605 - f1: 0.9610 - iou: 0.8996 - loss: 0.0084 - val_acc: 0.9341 - val_f1: 0.9361 - val_iou: 0.8468 - val_loss: 0.0122 Epoch 194/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - acc: 0.9522 - f1: 0.9528 - iou: 0.8815 - loss: 0.0095 - val_acc: 0.9340 - val_f1: 0.9361 - val_iou: 0.8468 - val_loss: 0.0122 Epoch 195/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - acc: 0.9529 - f1: 0.9535 - iou: 0.8859 - loss: 0.0093 - val_acc: 0.9339 - val_f1: 0.9359 - val_iou: 0.8465 - val_loss: 0.0122 Epoch 196/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 24ms/step - acc: 0.9525 - f1: 0.9534 - iou: 0.8785 - loss: 0.0095 - val_acc: 0.9337 - val_f1: 
0.9358 - val_iou: 0.8462 - val_loss: 0.0122 Epoch 197/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 25ms/step - acc: 0.9536 - f1: 0.9542 - iou: 0.8846 - loss: 0.0089 - val_acc: 0.9337 - val_f1: 0.9358 - val_iou: 0.8461 - val_loss: 0.0122 Epoch 198/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - acc: 0.9636 - f1: 0.9640 - iou: 0.9078 - loss: 0.0079 - val_acc: 0.9334 - val_f1: 0.9355 - val_iou: 0.8457 - val_loss: 0.0123 Epoch 199/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - acc: 0.9538 - f1: 0.9544 - iou: 0.8844 - loss: 0.0097 - val_acc: 0.9335 - val_f1: 0.9356 - val_iou: 0.8458 - val_loss: 0.0123 Epoch 200/200 25/25 ━━━━━━━━━━━━━━━━━━━━ 1s 28ms/step - acc: 0.9491 - f1: 0.9499 - iou: 0.8771 - loss: 0.0099 - val_acc: 0.9336 - val_f1: 0.9357 - val_iou: 0.8459 - val_loss: 0.0123 31/31 ━━━━━━━━━━━━━━━━━━━━ 1s 918us/step 31/31 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - acc: 0.9448 - f1: 0.9461 - iou: 0.8636 - loss: 0.0111
Model evaluation¶
Now that we have trained the model, we will evaluate it on the test dataset. As with training, we simply pass the high-level configuration to the task's evaluate method.
task.evaluate(params)
Subject: 100%|██████████| 56/56 [00:17<00:00, 3.21it/s]
INFO Testing Results evaluate.py:130
19/19 ━━━━━━━━━━━━━━━━━━━━ 2s 25ms/step - acc: 0.9436 - f1: 0.9448 - iou: 0.8600 - loss: 0.0101
INFO [TEST SET] acc=94.59%, f1=94.65%, iou=86.69%, loss=0.97% evaluate.py:132
Confusion matrix¶
Let's visualize the confusion matrix to understand the model's performance on each class.
IPython.display.Image(filename=params.job_dir / "confusion_matrix_test.png", width=500)
Export model to TF Lite / TFLM¶
Once we have trained and evaluated the model, we need to export the model into a format that can be used for inference on the edge. Currently, we export the model to TensorFlow Lite flatbuffer format. This will also generate a C header file that can be used with TensorFlow Lite for Microcontrollers (TFLM).
Apply post-training quantization (PTQ)¶
For running on bare metal, post-training quantization is typically used to convert the model to an 8-bit integer model: weights and activations are quantized to 8 bits and biases to 32 bits, which reduces the model size and improves inference speed. For this guide, however, we keep the exported model in full floating-point (FP32) format, as configured below.
quantization = sk.QuantizationParams(
enabled=True,
format="FP32",
io_type="float32",
conversion="CONCRETE",
)
params.quantization = quantization
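For reference, if a fully int8 model were desired instead, the standard TensorFlow Lite post-training quantization flow looks roughly like the sketch below. This is independent of SleepKit's export pipeline; representative_data() is a hypothetical generator you would implement to yield real calibration feature frames, and model here is the Keras instance created earlier (in practice you would load the trained model from the job directory).
import numpy as np
import tensorflow as tf

def representative_data():
    # Hypothetical calibration generator: yield real feature frames in practice.
    for _ in range(100):
        yield [np.random.rand(1, params.frame_size, 5).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_int8 = converter.convert()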
# TF dumps a lot of info to stdout, so we redirect it to /dev/null
with open(os.devnull, 'w') as devnull:
with contextlib.redirect_stdout(devnull), contextlib.redirect_stderr(devnull):
task.export(params)
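To spot-check the exported flatbuffer on the PC, you can load it with the TensorFlow Lite interpreter. The path below is an assumption; point it at wherever the export step actually writes the .tflite file.
import numpy as np
import tensorflow as tf

# Assumed output location; adjust to the actual file written by task.export().
tflite_path = params.job_dir / "model.tflite"
interpreter = tf.lite.Interpreter(model_path=str(tflite_path))
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Run a single inference on a dummy input of the expected shape and dtype.
x = np.random.rand(*inp["shape"]).astype(inp["dtype"])
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)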
Run inference demo¶
We will run a demo on the PC to verify that the model is working as expected. The demo will load the model and run inferences across a randomly selected subject. The demo will also provide the model's prediction and the corresponding class name.
task.demo(params=params)
Inference: 100%|██████████| 296/296 [00:06<00:00, 42.97it/s]
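Finally, a minimal, illustrative sketch of running the in-memory Keras model on a synthetic window and mapping its predictions back to class names (real inputs would come from the generated feature files, and a trained model should be loaded from the job directory first):
import numpy as np

# Illustrative single-window inference with the in-memory Keras model.
x = np.random.rand(1, params.frame_size, 5).astype(np.float32)  # synthetic features
y = model.predict(x, verbose=0)           # logits (use_logits=True in the architecture)
y_pred = np.argmax(y, axis=-1).squeeze()  # class indices
print([class_names[i] for i in np.atleast_1d(y_pred)][:10])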