Update README and add additional benchmarking logs
- README.md +184 -18
- logs_modchembert_classification_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_antimalarial_epochs100_batch_size32_20250925_224136.log +365 -0
- logs_modchembert_classification_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_cocrystal_epochs100_batch_size32_20250926_032547.log +349 -0
- logs_modchembert_classification_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_covid19_epochs100_batch_size32_20250925_210846.log +327 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_microsom_stab_h_epochs100_batch_size32_20250926_053842.log +359 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_microsom_stab_r_epochs100_batch_size32_20250926_061544.log +323 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_permeability_epochs100_batch_size32_20250926_070028.log +343 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_ppb_h_epochs100_batch_size32_20250926_075041.log +315 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_ppb_r_epochs100_batch_size32_20250926_080329.log +329 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_solubility_epochs100_batch_size32_20250926_081611.log +343 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_cl_epochs100_batch_size32_20250926_091752.log +317 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_logd74_epochs100_batch_size32_20250926_100505.log +407 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_ppb_epochs100_batch_size32_20250926_121436.log +327 -0
- logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_solubility_epochs100_batch_size32_20250926_131505.log +355 -0
README.md
CHANGED
| 118 |       metrics:
| 119 |       - type: rmse
| 120 |         value: 0.6820
| 121 | +   - task:
| 122 | +       type: text-classification
| 123 | +       name: Classification (ROC AUC)
| 124 | +     dataset:
| 125 | +       name: Antimalarial
| 126 | +       type: Antimalarial
| 127 | +     metrics:
| 128 | +     - type: roc_auc
| 129 | +       value: 0.8819
| 130 | +   - task:
| 131 | +       type: text-classification
| 132 | +       name: Classification (ROC AUC)
| 133 | +     dataset:
| 134 | +       name: Cocrystal
| 135 | +       type: Cocrystal
| 136 | +     metrics:
| 137 | +     - type: roc_auc
| 138 | +       value: 0.8550
| 139 | +   - task:
| 140 | +       type: text-classification
| 141 | +       name: Classification (ROC AUC)
| 142 | +     dataset:
| 143 | +       name: COVID19
| 144 | +       type: COVID19
| 145 | +     metrics:
| 146 | +     - type: roc_auc
| 147 | +       value: 0.8013
| 148 | +   - task:
| 149 | +       type: regression
| 150 | +       name: Regression (RMSE)
| 151 | +     dataset:
| 152 | +       name: ADME microsom stab human
| 153 | +       type: ADME
| 154 | +     metrics:
| 155 | +     - type: rmse
| 156 | +       value: 0.4206
| 157 | +   - task:
| 158 | +       type: regression
| 159 | +       name: Regression (RMSE)
| 160 | +     dataset:
| 161 | +       name: ADME microsom stab rat
| 162 | +       type: ADME
| 163 | +     metrics:
| 164 | +     - type: rmse
| 165 | +       value: 0.4400
| 166 | +   - task:
| 167 | +       type: regression
| 168 | +       name: Regression (RMSE)
| 169 | +     dataset:
| 170 | +       name: ADME permeability
| 171 | +       type: ADME
| 172 | +     metrics:
| 173 | +     - type: rmse
| 174 | +       value: 0.4899
| 175 | +   - task:
| 176 | +       type: regression
| 177 | +       name: Regression (RMSE)
| 178 | +     dataset:
| 179 | +       name: ADME ppb human
| 180 | +       type: ADME
| 181 | +     metrics:
| 182 | +     - type: rmse
| 183 | +       value: 0.8927
| 184 | +   - task:
| 185 | +       type: regression
| 186 | +       name: Regression (RMSE)
| 187 | +     dataset:
| 188 | +       name: ADME ppb rat
| 189 | +       type: ADME
| 190 | +     metrics:
| 191 | +     - type: rmse
| 192 | +       value: 0.6942
| 193 | +   - task:
| 194 | +       type: regression
| 195 | +       name: Regression (RMSE)
| 196 | +     dataset:
| 197 | +       name: ADME solubility
| 198 | +       type: ADME
| 199 | +     metrics:
| 200 | +     - type: rmse
| 201 | +       value: 0.4641
| 202 | +   - task:
| 203 | +       type: regression
| 204 | +       name: Regression (RMSE)
| 205 | +     dataset:
| 206 | +       name: AstraZeneca CL
| 207 | +       type: AstraZeneca
| 208 | +     metrics:
| 209 | +     - type: rmse
| 210 | +       value: 0.5022
| 211 | +   - task:
| 212 | +       type: regression
| 213 | +       name: Regression (RMSE)
| 214 | +     dataset:
| 215 | +       name: AstraZeneca LogD74
| 216 | +       type: AstraZeneca
| 217 | +     metrics:
| 218 | +     - type: rmse
| 219 | +       value: 0.7467
| 220 | +   - task:
| 221 | +       type: regression
| 222 | +       name: Regression (RMSE)
| 223 | +     dataset:
| 224 | +       name: AstraZeneca PPB
| 225 | +       type: AstraZeneca
| 226 | +     metrics:
| 227 | +     - type: rmse
| 228 | +       value: 0.1195
| 229 | +   - task:
| 230 | +       type: regression
| 231 | +       name: Regression (RMSE)
| 232 | +     dataset:
| 233 | +       name: AstraZeneca Solubility
| 234 | +       type: AstraZeneca
| 235 | +     metrics:
| 236 | +     - type: rmse
| 237 | +       value: 0.8564
| 238 |   ---
| 239 |
| 240 |   # ModChemBERT: ModernBERT as a Chemical Language Model
| 276 |   - Encoder Layers: 22
| 277 |   - Attention heads: 12
| 278 |   - Max sequence length: 256 tokens (MLM primarily trained with 128-token sequences)
| 279 | + - Tokenizer: BPE tokenizer using [MolFormer's vocab](https://github.com/emapco/ModChemBERT/blob/main/modchembert/tokenizers/molformer/vocab.json) (2362 tokens)
| 280 |
| 281 |   ## Pooling (Classifier / Regressor Head)
| 282 | + Kallergis et al. [1] demonstrated that the CLM embedding method prior to the prediction head was the strongest contributor to downstream performance among evaluated hyperparameters.
| 283 |
| 284 |   Behrendt et al. [2] noted that the last few layers contain task-specific information and that pooling methods leveraging information from multiple layers can enhance model performance. Their results further demonstrated that the `max_seq_mha` pooling method was particularly effective in low-data regimes, which is often the case for molecular property prediction tasks.
| 285 |
| 295 |   - `mean_sum`: Mean over all layers then sum tokens
| 296 |   - `max_seq_mean`: Max over last k layers then mean tokens
| 297 |
| 298 | + Note: ModChemBERT’s `max_seq_mha` differs from MaxPoolBERT [2]. MaxPoolBERT uses PyTorch’s `nn.MultiheadAttention`, whereas ModChemBERT’s `ModChemBertPoolingAttention` adapts ModernBERT’s `ModernBertAttention`.
| 299 | + On ChemBERTa-3 benchmarks this variant produced stronger validation metrics and avoided the training instabilities (sporadic zero / NaN losses and gradient norms) seen with `nn.MultiheadAttention`. Training instability with ModernBERT has been reported in the past ([discussion 1](https://huggingface.co/answerdotai/ModernBERT-base/discussions/59) and [discussion 2](https://huggingface.co/answerdotai/ModernBERT-base/discussions/63)).
| 300 | +
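As a concrete illustration of the pooling strategies above, here is a minimal, dependency-free sketch of the idea behind `max_seq_mha`: an element-wise max over the last k layers' hidden states, followed by single-head attention pooling that uses the max-pooled [CLS] vector as the query. This is only a sketch under stated assumptions, not the repository's `ModChemBertPoolingAttention` (which uses learned projections adapted from `ModernBertAttention`); the list-based "tensors" and the function name are illustrative.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def max_seq_mha_pool(hidden_states, k):
    """Illustrative max_seq_mha-style pooling.

    hidden_states: list of layers; each layer is a list of token vectors.
    Step 1: element-wise max over the last k layers.
    Step 2: attention pooling with the max-pooled [CLS] (token 0) as query.
    """
    last_k = hidden_states[-k:]
    seq_len = len(last_k[0])
    dim = len(last_k[0][0])
    # 1) max over layers, per token and per dimension
    pooled = [[max(layer[t][d] for layer in last_k) for d in range(dim)]
              for t in range(seq_len)]
    # 2) scaled dot-product attention: [CLS] attends over all tokens
    query = pooled[0]
    scale = math.sqrt(dim)
    weights = softmax([sum(q * v for q, v in zip(query, tok)) / scale
                       for tok in pooled])
    return [sum(w * tok[d] for w, tok in zip(weights, pooled))
            for d in range(dim)]
```

The output is a convex combination of the max-pooled token vectors, so it stays within each dimension's per-token range.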
| 301 |   ## Training Pipeline
| 302 |   <div align="center">
| 303 |   <img src="https://cdn-uploads.huggingface.co/production/uploads/656892962693fa22e18b5331/bxNbpgMkU8m60ypyEJoWQ.png" alt="ModChemBERT Training Pipeline" width="650"/>
| 310 |   Inspired by ModernBERT [4], JaColBERTv2.5 [5], and Llama 3.1 [6], where results show that model merging can enhance generalization or performance while mitigating overfitting to any single fine-tune or annealing checkpoint.
| 311 |
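The checkpoint-merging step referenced here can be sketched as a plain parameter average over fine-tuned checkpoints (a uniform "soup"). This is a hedged illustration with flat lists standing in for tensors and a hypothetical `merge_checkpoints` helper, not the merging code used for the released checkpoints:

```python
def merge_checkpoints(state_dicts, weights=None):
    """Average parameter values from several fine-tuned checkpoints.

    state_dicts: list of {param_name: list-of-floats} mappings (toy stand-in
    for real tensor state dicts). Uniform weights by default; pass `weights`
    for a weighted merge.
    """
    if weights is None:
        weights = [1.0 / len(state_dicts)] * len(state_dicts)
    merged = {}
    for name in state_dicts[0]:
        n = len(state_dicts[0][name])
        merged[name] = [
            sum(w * sd[name][i] for w, sd in zip(weights, state_dicts))
            for i in range(n)
        ]
    return merged
```

With real models the same idea applies per tensor, after which the averaged state dict is loaded back into the architecture.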
| 312 |   ## Datasets
| 313 | + - Pretraining: [Derify/augmented_canonical_druglike_QED_Pfizer_15M](https://huggingface.co/datasets/Derify/augmented_canonical_druglike_QED_Pfizer_15M) (canonical_smiles column)
| 314 | + - Domain Adaptive Pretraining (DAPT) & Task Adaptive Fine-tuning (TAFT): ADME (6 tasks) + AstraZeneca (4 tasks) datasets, split with DA4MT's [3] Bemis-Murcko scaffold splitter (see [domain-adaptation-molecular-transformers](https://github.com/emapco/ModChemBERT/blob/main/domain-adaptation-molecular-transformers/da4mt/splitting.py))
| 315 | + - Benchmarking:
| 316 | +   - ChemBERTa-3 [7]:
| 317 | +     - classification: BACE, BBBP, TOX21, HIV, SIDER, CLINTOX
| 318 | +     - regression: ESOL, FREESOLV, LIPO, BACE, CLEARANCE
| 319 | +   - Mswahili et al. [8] proposed additional datasets for benchmarking chemical language models:
| 320 | +     - classification: Antimalarial [9], Cocrystal [10], COVID19 [11]
| 321 | +   - DAPT/TAFT stage regression datasets:
| 322 | +     - ADME [12]: adme_microsom_stab_h, adme_microsom_stab_r, adme_permeability, adme_ppb_h, adme_ppb_r, adme_solubility
| 323 | +     - AstraZeneca: astrazeneca_CL, astrazeneca_LogD74, astrazeneca_PPB, astrazeneca_Solubility
| 324 |
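A scaffold split of the kind referenced above can be sketched as follows. The scaffold keys are assumed to be precomputed by the caller (DA4MT derives them with RDKit Bemis-Murcko scaffolds); the greedy assignment below is a simplified, dependency-free illustration, not the DA4MT splitter itself:

```python
from collections import defaultdict

def scaffold_split(smiles, scaffolds, frac_train=0.8, frac_valid=0.1):
    """Greedy scaffold split: molecules sharing a scaffold always land in the
    same subset, so test-set scaffolds are unseen during training.

    smiles: list of SMILES strings.
    scaffolds: {smiles: scaffold_key} mapping, precomputed elsewhere.
    """
    groups = defaultdict(list)
    for smi in smiles:
        groups[scaffolds[smi]].append(smi)
    # Assign the largest scaffold groups first, as is conventional
    ordered = sorted(groups.values(), key=len, reverse=True)
    n = len(smiles)
    train, valid, test = [], [], []
    for group in ordered:
        if len(train) + len(group) <= frac_train * n:
            train.extend(group)
        elif len(valid) + len(group) <= frac_valid * n:
            valid.extend(group)
        else:
            test.extend(group)
    return train, valid, test
```

Because whole scaffold groups are assigned atomically, the split fractions are approximate rather than exact.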
| 325 |   ## Benchmarking
| 326 | + Benchmarks were conducted with the ChemBERTa-3 framework. DeepChem scaffold splits were used for all datasets except Antimalarial, which used a random split. Each task was trained for 100 epochs, with results averaged across 3 random seeds.
| 327 | +
| 328 | + The complete hyperparameter configurations for these benchmarks are available here: [ChemBERTa3 configs](https://github.com/emapco/ModChemBERT/tree/main/conf/chemberta3)
| 329 |
| 330 |   ### Evaluation Methodology
| 331 | + - Classification Metric: ROC AUC
| 332 | + - Regression Metric: RMSE
| 333 |   - Aggregation: Mean ± standard deviation of the triplicate results.
| 334 | + - Input Constraints: SMILES truncated / filtered to ≤200 tokens, following ChemBERTa-3's recommendation.
| 335 |
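The aggregation step can be reproduced with the standard library. A small sketch; the choice of the sample standard deviation is an assumption (the framework may report the population value via `statistics.pstdev` instead):

```python
import statistics

def aggregate_triplicate(scores):
    """Report one dataset's triplicate results as "mean ± std".

    Assumption: sample standard deviation (n - 1 denominator); swap in
    statistics.pstdev if the population value is intended.
    """
    return f"{statistics.mean(scores):.4f} ± {statistics.stdev(scores):.4f}"
```

For example, three seed runs scoring 0.78, 0.77, and 0.79 aggregate to "0.7800 ± 0.0100".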
| 336 |   ### Results
| 337 |   <details><summary>Click to expand</summary>
| 338 |
| 339 | + #### ChemBERTa-3 Classification Datasets (ROC AUC - Higher is better)
| 340 |
| 341 |   | Model | BACE↑ | BBBP↑ | CLINTOX↑ | HIV↑ | SIDER↑ | TOX21↑ | AVG† |
| 342 |   | --- | --- | --- | --- | --- | --- | --- | --- |
| 344 |   | [ChemBERTa-100M-MLM](https://huggingface.co/DeepChem/ChemBERTa-100M-MLM)* | 0.781 ± 0.019 | 0.700 ± 0.027 | 0.979 ± 0.022 | 0.740 ± 0.013 | 0.611 ± 0.002 | 0.718 ± 0.011 | 0.7548 |
| 345 |   | [c3-MoLFormer-1.1B](https://huggingface.co/DeepChem/MoLFormer-c3-1.1B)* | 0.819 ± 0.019 | 0.735 ± 0.019 | 0.839 ± 0.013 | 0.762 ± 0.005 | 0.618 ± 0.005 | 0.723 ± 0.012 | 0.7493 |
| 346 |   | MoLFormer-LHPC* | **0.887 ± 0.004** | **0.908 ± 0.013** | 0.993 ± 0.004 | 0.750 ± 0.003 | 0.622 ± 0.007 | **0.791 ± 0.014** | 0.8252 |
| 347 | + | | | | | | | | |
| 348 |   | [MLM](https://huggingface.co/Derify/ModChemBERT-MLM) | 0.8065 ± 0.0103 | 0.7222 ± 0.0150 | 0.9709 ± 0.0227 | ***0.7800 ± 0.0133*** | 0.6419 ± 0.0113 | 0.7400 ± 0.0044 | 0.7769 |
| 349 |   | [MLM + DAPT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT) | 0.8224 ± 0.0156 | 0.7402 ± 0.0095 | 0.9820 ± 0.0138 | 0.7702 ± 0.0020 | 0.6303 ± 0.0039 | 0.7360 ± 0.0036 | 0.7802 |
| 350 |   | [MLM + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-TAFT) | 0.7924 ± 0.0155 | 0.7282 ± 0.0058 | 0.9725 ± 0.0213 | 0.7770 ± 0.0047 | 0.6542 ± 0.0128 | *0.7646 ± 0.0039* | 0.7815 |
| 351 |   | [MLM + DAPT + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT-TAFT) | 0.8213 ± 0.0051 | 0.7356 ± 0.0094 | 0.9664 ± 0.0202 | 0.7750 ± 0.0048 | 0.6415 ± 0.0094 | 0.7263 ± 0.0036 | 0.7777 |
| 352 |   | [MLM + DAPT + TAFT OPT](https://huggingface.co/Derify/ModChemBERT) | *0.8346 ± 0.0045* | *0.7573 ± 0.0120* | ***0.9938 ± 0.0017*** | 0.7737 ± 0.0034 | ***0.6600 ± 0.0061*** | 0.7518 ± 0.0047 | 0.7952 |
| 353 |
| 354 | + #### ChemBERTa-3 Regression Datasets (RMSE - Lower is better)
| 355 |
| 356 |   | Model | BACE↓ | CLEARANCE↓ | ESOL↓ | FREESOLV↓ | LIPO↓ | AVG‡ |
| 357 |   | --- | --- | --- | --- | --- | --- | --- |
| 359 |   | [ChemBERTa-100M-MLM](https://huggingface.co/DeepChem/ChemBERTa-100M-MLM)* | 1.011 ± 0.038 | 51.582 ± 3.079 | 0.920 ± 0.011 | 0.536 ± 0.016 | 0.758 ± 0.013 | 0.8063 / 10.9614 |
| 360 |   | [c3-MoLFormer-1.1B](https://huggingface.co/DeepChem/MoLFormer-c3-1.1B)* | 1.094 ± 0.126 | 52.058 ± 2.767 | 0.829 ± 0.019 | 0.572 ± 0.023 | 0.728 ± 0.016 | 0.8058 / 11.0562 |
| 361 |   | MoLFormer-LHPC* | 1.201 ± 0.100 | 45.74 ± 2.637 | 0.848 ± 0.031 | 0.683 ± 0.040 | 0.895 ± 0.080 | 0.9068 / 9.8734 |
| 362 | + | | | | | | | |
| 363 |   | [MLM](https://huggingface.co/Derify/ModChemBERT-MLM) | 1.0893 ± 0.1319 | 49.0005 ± 1.2787 | 0.8456 ± 0.0406 | 0.5491 ± 0.0134 | 0.7147 ± 0.0062 | 0.7997 / 10.4398 |
| 364 |   | [MLM + DAPT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT) | 0.9931 ± 0.0258 | 45.4951 ± 0.7112 | 0.9319 ± 0.0153 | 0.6049 ± 0.0666 | 0.6874 ± 0.0040 | 0.8043 / 9.7425 |
| 365 |   | [MLM + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-TAFT) | 1.0304 ± 0.1146 | 47.8418 ± 0.4070 | ***0.7669 ± 0.0024*** | 0.5293 ± 0.0267 | 0.6708 ± 0.0074 | 0.7493 / 10.1678 |
| 366 |   | [MLM + DAPT + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT-TAFT) | 0.9713 ± 0.0224 | ***42.8010 ± 3.3475*** | 0.8169 ± 0.0268 | 0.5445 ± 0.0257 | 0.6820 ± 0.0028 | 0.7537 / 9.1631 |
| 367 |   | [MLM + DAPT + TAFT OPT](https://huggingface.co/Derify/ModChemBERT) | ***0.9665 ± 0.0250*** | 44.0137 ± 1.1110 | 0.8158 ± 0.0115 | ***0.4979 ± 0.0158*** | ***0.6505 ± 0.0126*** | 0.7327 / 9.3889 |
| 368 |
| 369 | + #### Mswahili et al. [8] Proposed Classification Datasets (ROC AUC - Higher is better)
| 370 | +
| 371 | + | Model | Antimalarial↑ | Cocrystal↑ | COVID19↑ | AVG† |
| 372 | + | --- | --- | --- | --- | --- |
| 373 | + | **Tasks** | 1 | 1 | 1 | |
| 374 | + | [MLM](https://huggingface.co/Derify/ModChemBERT-MLM) | 0.8707 ± 0.0032 | 0.7967 ± 0.0124 | 0.8106 ± 0.0170 | 0.8260 |
| 375 | + | [MLM + DAPT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT) | 0.8756 ± 0.0056 | 0.8288 ± 0.0143 | 0.8029 ± 0.0159 | 0.8358 |
| 376 | + | [MLM + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-TAFT) | 0.8832 ± 0.0051 | 0.7866 ± 0.0204 | ***0.8308 ± 0.0026*** | 0.8335 |
| 377 | + | [MLM + DAPT + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT-TAFT) | 0.8819 ± 0.0052 | 0.8550 ± 0.0106 | 0.8013 ± 0.0118 | 0.8461 |
| 378 | + | [MLM + DAPT + TAFT OPT](https://huggingface.co/Derify/ModChemBERT) | ***0.8966 ± 0.0045*** | ***0.8654 ± 0.0080*** | 0.8132 ± 0.0195 | 0.8584 |
| 379 | +
| 380 | + #### ADME/AstraZeneca Regression Datasets (RMSE - Lower is better)
| 381 | +
| 382 | + Hyperparameter optimization for the TAFT stage appears to induce overfitting, as the `MLM + DAPT + TAFT OPT` model shows slightly degraded performance on the ADME/AstraZeneca datasets compared to the `MLM + DAPT + TAFT` model.
| 383 | + The `MLM + DAPT + TAFT` model, a merge of unoptimized TAFT checkpoints trained with `max_seq_mean` pooling, achieved the best overall performance across the ADME/AstraZeneca datasets.
| 384 | +
| 385 | + | | ADME | | | | | | AstraZeneca | | | | |
| 386 | + | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 387 | + | Model | microsom_stab_h↓ | microsom_stab_r↓ | permeability↓ | ppb_h↓ | ppb_r↓ | solubility↓ | CL↓ | LogD74↓ | PPB↓ | Solubility↓ | AVG† |
| 388 | + | | | | | | | | | | | | |
| 389 | + | **Tasks** | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | |
| 390 | + | [MLM](https://huggingface.co/Derify/ModChemBERT-MLM) | 0.4489 ± 0.0114 | 0.4685 ± 0.0225 | 0.5423 ± 0.0076 | 0.8041 ± 0.0378 | 0.7849 ± 0.0394 | 0.5191 ± 0.0147 | **0.4812 ± 0.0073** | 0.8204 ± 0.0070 | 0.1365 ± 0.0066 | 0.9614 ± 0.0189 | 0.5967 |
| 391 | + | [MLM + DAPT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT) | **0.4199 ± 0.0064** | 0.4568 ± 0.0091 | 0.5042 ± 0.0135 | 0.8376 ± 0.0629 | 0.8446 ± 0.0756 | 0.4800 ± 0.0118 | 0.5351 ± 0.0036 | 0.8191 ± 0.0066 | 0.1237 ± 0.0022 | 0.9280 ± 0.0088 | 0.5949 |
| 392 | + | [MLM + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-TAFT) | 0.4375 ± 0.0027 | 0.4542 ± 0.0024 | 0.5202 ± 0.0141 | **0.7618 ± 0.0138** | 0.7027 ± 0.0023 | 0.5023 ± 0.0107 | 0.5104 ± 0.0110 | 0.7599 ± 0.0050 | 0.1233 ± 0.0088 | 0.8730 ± 0.0112 | 0.5645 |
| 393 | + | [MLM + DAPT + TAFT](https://huggingface.co/Derify/ModChemBERT-MLM-DAPT-TAFT) | 0.4206 ± 0.0071 | **0.4400 ± 0.0039** | **0.4899 ± 0.0068** | 0.8927 ± 0.0163 | **0.6942 ± 0.0397** | 0.4641 ± 0.0082 | 0.5022 ± 0.0136 | **0.7467 ± 0.0041** | 0.1195 ± 0.0026 | **0.8564 ± 0.0265** | 0.5626 |
| 394 | + | [MLM + DAPT + TAFT OPT](https://huggingface.co/Derify/ModChemBERT) | 0.4248 ± 0.0041 | 0.4403 ± 0.0046 | 0.5025 ± 0.0029 | 0.8901 ± 0.0123 | 0.7268 ± 0.0090 | **0.4627 ± 0.0083** | 0.4932 ± 0.0079 | 0.7596 ± 0.0044 | **0.1150 ± 0.0002** | 0.8735 ± 0.0053 | 0.5689 |
| 395 | +
| 396 | +
| 397 |   **Bold** indicates the best result in the column; *italic* indicates the best result among ModChemBERT checkpoints.<br/>
| 398 |   \* Published results from the ChemBERTa-3 [7] paper for optimized chemical language models using DeepChem scaffold splits.<br/>
| 399 | + † AVG column shows the mean score across the tasks in the table.<br/>
| 400 | + ‡ AVG column shows the mean scores across regression tasks without and with the clearance score.
| 401 |
| 402 |   </details>
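The two AVG‡ figures can be reproduced directly from any row of the ChemBERTa-3 regression table; the sketch below uses the `MLM + DAPT + TAFT` row (mean RMSEs copied from the table):

```python
def avg_without_and_with(rmse_by_dataset, excluded="clearance"):
    """Compute the AVG pair: mean RMSE excluding the named dataset
    (clearance, whose scale dwarfs the others), then including it."""
    rest = [v for k, v in rmse_by_dataset.items() if k != excluded]
    everything = list(rmse_by_dataset.values())
    return sum(rest) / len(rest), sum(everything) / len(everything)

# Mean RMSEs from the MLM + DAPT + TAFT row of the regression table
row = {"bace": 0.9713, "clearance": 42.8010, "esol": 0.8169,
       "freesolv": 0.5445, "lipo": 0.6820}
avg_wo, avg_w = avg_without_and_with(row)  # table reports 0.7537 / 9.1631
```

Reporting both values keeps the high-magnitude clearance RMSE from dominating the comparison.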
| 403 |
| 437 |   | esol | 64 | sum_mean | N/A | 0.1 | 0.0 | 0.1 |
| 438 |   | freesolv | 32 | max_seq_mha | 5 | 0.1 | 0.0 | 0.0 |
| 439 |   | lipo | 32 | max_seq_mha | 3 | 0.1 | 0.1 | 0.1 |
| 440 | + | antimalarial | 16 | max_seq_mha | 3 | 0.1 | 0.1 | 0.1 |
| 441 | + | cocrystal | 16 | max_cls | 3 | 0.1 | 0.0 | 0.1 |
| 442 | + | covid19 | 16 | sum_mean | N/A | 0.1 | 0.0 | 0.1 |
| 443 |
| 444 |   </details>
| 445 |
| 473 |   ```
| 474 |
| 475 |   ## References
| 476 | + 1. Kallergis, G., Asgari, E., Empting, M., et al. "Domain adaptable language modeling of chemical compounds identifies potent pathoblockers for Pseudomonas aeruginosa." Commun Chem 8, 114 (2025). https://doi.org/10.1038/s42004-025-01484-4
| 477 |   2. Behrendt, Maike, Stefan Sylvius Wagner, and Stefan Harmeling. "MaxPoolBERT: Enhancing BERT Classification via Layer- and Token-Wise Aggregation." arXiv preprint arXiv:2505.15696 (2025).
| 478 |   3. Sultan, Afnan, et al. "Transformers for molecular property prediction: Domain adaptation efficiently improves performance." arXiv preprint arXiv:2503.03360 (2025).
| 479 |   4. Warner, Benjamin, et al. "Smarter, better, faster, longer: A modern bidirectional encoder for fast, memory efficient, and long context finetuning and inference." arXiv preprint arXiv:2412.13663 (2024).
| 480 | + 5. Clavié, Benjamin. "JaColBERTv2.5: Optimising Multi-Vector Retrievers to Create State-of-the-Art Japanese Retrievers with Constrained Resources." arXiv preprint arXiv:2407.20750 (2024).
| 481 |   6. Grattafiori, Aaron, et al. "The Llama 3 herd of models." arXiv preprint arXiv:2407.21783 (2024).
| 482 | + 7. Singh, R., Barsainyan, A. A., Irfan, R., Amorin, C. J., He, S., Davis, T., et al. "ChemBERTa-3: An Open Source Training Framework for Chemical Foundation Models." ChemRxiv preprint (2025). https://doi.org/10.26434/chemrxiv-2025-4glrl-v2
| 483 | + 8. Mswahili, M. E., Hwang, J., Rajapakse, J. C., et al. "Positional embeddings and zero-shot learning using BERT for molecular-property prediction." J Cheminform 17, 17 (2025). https://doi.org/10.1186/s13321-025-00959-9
| 484 | + 9. Mswahili, M. E.; Ndomba, G. E.; Jo, K.; Jeong, Y.-S. "Graph Neural Network and BERT Model for Antimalarial Drug Predictions Using Plasmodium Potential Targets." Applied Sciences 14(4), 1472 (2024). https://doi.org/10.3390/app14041472
| 485 | + 10. Mswahili, M. E.; Lee, M.-J.; Martin, G. L.; Kim, J.; Kim, P.; Choi, G. J.; Jeong, Y.-S. "Cocrystal Prediction Using Machine Learning Models and Descriptors." Applied Sciences 11, 1323 (2021). https://doi.org/10.3390/app11031323
| 486 | + 11. Harigua-Souiai, E.; Heinhane, M. M.; Abdelkrim, Y. Z.; Souiai, O.; Abdeljaoued-Tej, I.; Guizani, I. "Deep Learning Algorithms Achieved Satisfactory Predictions When Trained on a Novel Collection of Anticoronavirus Molecules." Frontiers in Genetics 12:744170 (2021). https://doi.org/10.3389/fgene.2021.744170
| 487 | + 12. Fang, Cheng; Wang, Ye; Grater, Richard; Kapadnis, Sudarshan; Black, Cheryl; Trapa, Patrick; Sciabola, Simone. "Prospective Validation of Machine Learning Algorithms for Absorption, Distribution, Metabolism, and Excretion Prediction: An Industrial Perspective." Journal of Chemical Information and Modeling 63(11), 3263-3274 (2023). https://doi.org/10.1021/acs.jcim.3c00160
logs_modchembert_classification_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_antimalarial_epochs100_batch_size32_20250925_224136.log
ADDED
|
@@ -0,0 +1,365 @@
2025-09-25 22:41:36,360 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Running benchmark for dataset: antimalarial
2025-09-25 22:41:36,360 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - dataset: antimalarial, tasks: ['label'], epochs: 100, learning rate: 3e-05
2025-09-25 22:41:36,365 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset antimalarial at 2025-09-25_22-41-36
2025-09-25 22:41:56,236 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5344 | Val mean-roc_auc_score: 0.7796
2025-09-25 22:41:56,237 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 120
2025-09-25 22:41:57,212 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.7796
2025-09-25 22:42:13,799 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4594 | Val mean-roc_auc_score: 0.8554
2025-09-25 22:42:13,990 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 240
2025-09-25 22:42:14,568 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8554
2025-09-25 22:42:30,961 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.4000 | Val mean-roc_auc_score: 0.8815
2025-09-25 22:42:31,159 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 360
2025-09-25 22:42:31,790 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val mean-roc_auc_score: 0.8815
2025-09-25 22:42:51,319 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3187 | Val mean-roc_auc_score: 0.8836
2025-09-25 22:42:51,522 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 480
2025-09-25 22:42:52,176 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val mean-roc_auc_score: 0.8836
2025-09-25 22:43:09,276 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2537 | Val mean-roc_auc_score: 0.8897
2025-09-25 22:43:09,508 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 600
2025-09-25 22:43:10,199 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val mean-roc_auc_score: 0.8897
2025-09-25 22:43:27,715 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1961 | Val mean-roc_auc_score: 0.8898
2025-09-25 22:43:28,200 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 720
2025-09-25 22:43:28,826 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 6 with val mean-roc_auc_score: 0.8898
2025-09-25 22:43:48,848 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1828 | Val mean-roc_auc_score: 0.8939
2025-09-25 22:43:49,046 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 840
2025-09-25 22:43:49,667 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val mean-roc_auc_score: 0.8939
2025-09-25 22:44:06,778 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1490 | Val mean-roc_auc_score: 0.8953
2025-09-25 22:44:07,000 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 960
2025-09-25 22:44:07,658 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val mean-roc_auc_score: 0.8953
2025-09-25 22:44:26,461 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1211 | Val mean-roc_auc_score: 0.8930
2025-09-25 22:44:45,288 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1200 | Val mean-roc_auc_score: 0.8913
2025-09-25 22:45:01,318 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.1125 | Val mean-roc_auc_score: 0.8972
2025-09-25 22:45:01,758 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 1320
2025-09-25 22:45:02,412 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 11 with val mean-roc_auc_score: 0.8972
2025-09-25 22:45:18,827 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0844 | Val mean-roc_auc_score: 0.8939
2025-09-25 22:45:37,205 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0797 | Val mean-roc_auc_score: 0.8939
2025-09-25 22:45:53,535 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0785 | Val mean-roc_auc_score: 0.8985
2025-09-25 22:45:53,707 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 1680
2025-09-25 22:45:54,373 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 14 with val mean-roc_auc_score: 0.8985
2025-09-25 22:46:10,088 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0600 | Val mean-roc_auc_score: 0.8973
2025-09-25 22:46:28,784 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0512 | Val mean-roc_auc_score: 0.9017
2025-09-25 22:46:29,356 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 1920
2025-09-25 22:46:30,075 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 16 with val mean-roc_auc_score: 0.9017
2025-09-25 22:46:47,678 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0369 | Val mean-roc_auc_score: 0.9029
2025-09-25 22:46:47,916 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 2040
2025-09-25 22:46:48,573 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 17 with val mean-roc_auc_score: 0.9029
2025-09-25 22:47:05,293 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0391 | Val mean-roc_auc_score: 0.9013
2025-09-25 22:47:24,201 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0342 | Val mean-roc_auc_score: 0.9008
2025-09-25 22:47:40,480 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0516 | Val mean-roc_auc_score: 0.8958
2025-09-25 22:47:56,571 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0367 | Val mean-roc_auc_score: 0.8956
2025-09-25 22:48:16,388 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0305 | Val mean-roc_auc_score: 0.9009
2025-09-25 22:48:33,414 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0612 | Val mean-roc_auc_score: 0.8944
2025-09-25 22:48:51,024 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0322 | Val mean-roc_auc_score: 0.9025
2025-09-25 22:49:11,636 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0234 | Val mean-roc_auc_score: 0.8990
2025-09-25 22:49:29,259 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0311 | Val mean-roc_auc_score: 0.8914
2025-09-25 22:49:46,794 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0363 | Val mean-roc_auc_score: 0.8948
2025-09-25 22:50:06,239 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0272 | Val mean-roc_auc_score: 0.8961
2025-09-25 22:50:23,701 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0179 | Val mean-roc_auc_score: 0.8964
2025-09-25 22:50:40,442 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0145 | Val mean-roc_auc_score: 0.8947
2025-09-25 22:51:00,554 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0363 | Val mean-roc_auc_score: 0.8961
2025-09-25 22:51:17,790 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0239 | Val mean-roc_auc_score: 0.8930
2025-09-25 22:51:34,478 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0221 | Val mean-roc_auc_score: 0.8976
2025-09-25 22:51:55,118 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0307 | Val mean-roc_auc_score: 0.8981
2025-09-25 22:52:13,260 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0148 | Val mean-roc_auc_score: 0.8987
2025-09-25 22:52:31,161 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0118 | Val mean-roc_auc_score: 0.9029
2025-09-25 22:52:31,639 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 4320
2025-09-25 22:52:32,298 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 36 with val mean-roc_auc_score: 0.9029
2025-09-25 22:52:52,902 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0121 | Val mean-roc_auc_score: 0.9003
2025-09-25 22:53:10,280 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0210 | Val mean-roc_auc_score: 0.8991
2025-09-25 22:53:27,720 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0153 | Val mean-roc_auc_score: 0.8995
2025-09-25 22:53:49,082 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0110 | Val mean-roc_auc_score: 0.8989
2025-09-25 22:54:06,209 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0127 | Val mean-roc_auc_score: 0.8984
2025-09-25 22:54:23,802 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0143 | Val mean-roc_auc_score: 0.8972
2025-09-25 22:54:43,214 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0182 | Val mean-roc_auc_score: 0.8951
2025-09-25 22:55:00,344 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0132 | Val mean-roc_auc_score: 0.8968
2025-09-25 22:55:16,925 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0142 | Val mean-roc_auc_score: 0.8939
2025-09-25 22:55:36,912 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0092 | Val mean-roc_auc_score: 0.8980
2025-09-25 22:55:54,393 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0227 | Val mean-roc_auc_score: 0.9001
2025-09-25 22:56:12,252 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0139 | Val mean-roc_auc_score: 0.8987
2025-09-25 22:56:32,690 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0105 | Val mean-roc_auc_score: 0.9004
2025-09-25 22:56:50,861 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0109 | Val mean-roc_auc_score: 0.8998
2025-09-25 22:57:08,028 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0060 | Val mean-roc_auc_score: 0.8994
2025-09-25 22:57:29,033 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0187 | Val mean-roc_auc_score: 0.8949
2025-09-25 22:57:46,256 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0106 | Val mean-roc_auc_score: 0.8986
2025-09-25 22:58:04,196 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0076 | Val mean-roc_auc_score: 0.8971
2025-09-25 22:58:24,025 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0098 | Val mean-roc_auc_score: 0.8965
2025-09-25 22:58:41,269 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0082 | Val mean-roc_auc_score: 0.8984
2025-09-25 22:58:58,388 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0058 | Val mean-roc_auc_score: 0.9000
2025-09-25 22:59:17,462 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0083 | Val mean-roc_auc_score: 0.8988
2025-09-25 22:59:35,324 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0152 | Val mean-roc_auc_score: 0.8938
2025-09-25 22:59:52,263 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0091 | Val mean-roc_auc_score: 0.8944
2025-09-25 23:00:12,237 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0064 | Val mean-roc_auc_score: 0.8937
2025-09-25 23:00:29,511 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0107 | Val mean-roc_auc_score: 0.8941
2025-09-25 23:00:45,965 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0057 | Val mean-roc_auc_score: 0.8951
2025-09-25 23:01:05,493 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0054 | Val mean-roc_auc_score: 0.8946
2025-09-25 23:01:22,010 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0060 | Val mean-roc_auc_score: 0.8943
2025-09-25 23:01:38,214 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0049 | Val mean-roc_auc_score: 0.8949
2025-09-25 23:02:00,064 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0091 | Val mean-roc_auc_score: 0.8956
2025-09-25 23:02:16,868 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0052 | Val mean-roc_auc_score: 0.8948
2025-09-25 23:02:33,051 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0059 | Val mean-roc_auc_score: 0.8940
2025-09-25 23:02:52,010 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0064 | Val mean-roc_auc_score: 0.8941
2025-09-25 23:03:07,994 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0022 | Val mean-roc_auc_score: 0.8936
2025-09-25 23:03:24,626 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0106 | Val mean-roc_auc_score: 0.8910
2025-09-25 23:03:43,643 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0049 | Val mean-roc_auc_score: 0.8934
2025-09-25 23:04:00,230 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0130 | Val mean-roc_auc_score: 0.8914
2025-09-25 23:04:17,201 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0088 | Val mean-roc_auc_score: 0.8920
2025-09-25 23:04:36,536 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0135 | Val mean-roc_auc_score: 0.8957
2025-09-25 23:04:53,299 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0101 | Val mean-roc_auc_score: 0.8943
2025-09-25 23:05:12,941 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0107 | Val mean-roc_auc_score: 0.8935
2025-09-25 23:05:29,538 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0061 | Val mean-roc_auc_score: 0.8964
2025-09-25 23:05:46,044 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0070 | Val mean-roc_auc_score: 0.8969
2025-09-25 23:06:04,840 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0098 | Val mean-roc_auc_score: 0.8968
2025-09-25 23:06:21,623 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0074 | Val mean-roc_auc_score: 0.8964
2025-09-25 23:06:37,699 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0045 | Val mean-roc_auc_score: 0.8960
2025-09-25 23:07:00,079 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.8957
2025-09-25 23:07:18,149 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0045 | Val mean-roc_auc_score: 0.8942
2025-09-25 23:07:35,773 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0081 | Val mean-roc_auc_score: 0.8944
2025-09-25 23:07:55,760 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0055 | Val mean-roc_auc_score: 0.8939
2025-09-25 23:08:12,493 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0065 | Val mean-roc_auc_score: 0.8923
2025-09-25 23:08:28,789 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0030 | Val mean-roc_auc_score: 0.8928
2025-09-25 23:08:47,794 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0047 | Val mean-roc_auc_score: 0.8925
2025-09-25 23:09:03,807 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0050 | Val mean-roc_auc_score: 0.8926
2025-09-25 23:09:21,497 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0033 | Val mean-roc_auc_score: 0.8927
2025-09-25 23:09:41,153 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0039 | Val mean-roc_auc_score: 0.8923
2025-09-25 23:09:58,040 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0057 | Val mean-roc_auc_score: 0.8918
2025-09-25 23:10:15,826 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0039 | Val mean-roc_auc_score: 0.8934
2025-09-25 23:10:34,723 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0051 | Val mean-roc_auc_score: 0.8933
2025-09-25 23:10:51,647 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0058 | Val mean-roc_auc_score: 0.8904
2025-09-25 23:11:08,316 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0064 | Val mean-roc_auc_score: 0.8928
2025-09-25 23:11:27,435 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0043 | Val mean-roc_auc_score: 0.8913
2025-09-25 23:11:46,186 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0043 | Val mean-roc_auc_score: 0.8921
2025-09-25 23:11:47,532 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8775
2025-09-25 23:11:47,909 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset antimalarial at 2025-09-25_23-11-47
2025-09-25 23:12:03,598 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5437 | Val mean-roc_auc_score: 0.8052
2025-09-25 23:12:03,598 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 120
2025-09-25 23:12:04,603 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.8052
2025-09-25 23:12:24,800 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4500 | Val mean-roc_auc_score: 0.8500
2025-09-25 23:12:25,007 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 240
2025-09-25 23:12:25,764 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8500
2025-09-25 23:12:41,650 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3812 | Val mean-roc_auc_score: 0.8866
2025-09-25 23:12:41,844 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 360
2025-09-25 23:12:42,461 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val mean-roc_auc_score: 0.8866
2025-09-25 23:12:59,097 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3078 | Val mean-roc_auc_score: 0.8830
2025-09-25 23:13:18,546 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2338 | Val mean-roc_auc_score: 0.8908
2025-09-25 23:13:18,712 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 600
2025-09-25 23:13:19,429 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val mean-roc_auc_score: 0.8908
2025-09-25 23:13:35,362 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.2500 | Val mean-roc_auc_score: 0.8826
2025-09-25 23:13:51,920 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1773 | Val mean-roc_auc_score: 0.8913
2025-09-25 23:13:52,085 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 840
2025-09-25 23:13:52,720 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val mean-roc_auc_score: 0.8913
2025-09-25 23:14:13,528 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1385 | Val mean-roc_auc_score: 0.8916
2025-09-25 23:14:13,763 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 960
2025-09-25 23:14:14,486 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val mean-roc_auc_score: 0.8916
2025-09-25 23:14:32,840 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1414 | Val mean-roc_auc_score: 0.8973
2025-09-25 23:14:33,096 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 1080
2025-09-25 23:14:33,756 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 9 with val mean-roc_auc_score: 0.8973
2025-09-25 23:14:50,148 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0981 | Val mean-roc_auc_score: 0.8956
2025-09-25 23:15:09,608 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0910 | Val mean-roc_auc_score: 0.8932
2025-09-25 23:15:26,444 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0938 | Val mean-roc_auc_score: 0.8979
2025-09-25 23:15:26,609 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 1440
2025-09-25 23:15:27,347 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 12 with val mean-roc_auc_score: 0.8979
2025-09-25 23:15:44,498 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0891 | Val mean-roc_auc_score: 0.8956
2025-09-25 23:16:03,932 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0539 | Val mean-roc_auc_score: 0.9027
2025-09-25 23:16:04,099 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 1680
2025-09-25 23:16:04,743 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 14 with val mean-roc_auc_score: 0.9027
2025-09-25 23:16:21,532 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0469 | Val mean-roc_auc_score: 0.8987
2025-09-25 23:16:37,338 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0432 | Val mean-roc_auc_score: 0.9054
2025-09-25 23:16:37,882 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 1920
2025-09-25 23:16:38,537 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 16 with val mean-roc_auc_score: 0.9054
2025-09-25 23:16:58,069 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0412 | Val mean-roc_auc_score: 0.9034
2025-09-25 23:17:14,685 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0578 | Val mean-roc_auc_score: 0.8972
2025-09-25 23:17:31,803 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0414 | Val mean-roc_auc_score: 0.9014
2025-09-25 23:17:51,614 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0348 | Val mean-roc_auc_score: 0.8943
2025-09-25 23:18:07,847 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0406 | Val mean-roc_auc_score: 0.8960
2025-09-25 23:18:25,216 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0535 | Val mean-roc_auc_score: 0.8950
2025-09-25 23:18:44,748 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0380 | Val mean-roc_auc_score: 0.8997
2025-09-25 23:19:01,458 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0230 | Val mean-roc_auc_score: 0.8997
2025-09-25 23:19:18,782 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0328 | Val mean-roc_auc_score: 0.8989
2025-09-25 23:19:37,793 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0395 | Val mean-roc_auc_score: 0.9025
2025-09-25 23:19:54,878 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0334 | Val mean-roc_auc_score: 0.8979
2025-09-25 23:20:11,500 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0329 | Val mean-roc_auc_score: 0.8990
2025-09-25 23:20:32,238 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0245 | Val mean-roc_auc_score: 0.8940
2025-09-25 23:20:50,022 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0187 | Val mean-roc_auc_score: 0.8950
2025-09-25 23:21:07,172 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0181 | Val mean-roc_auc_score: 0.8968
2025-09-25 23:21:27,569 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0120 | Val mean-roc_auc_score: 0.8959
2025-09-25 23:21:44,397 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0172 | Val mean-roc_auc_score: 0.8940
2025-09-25 23:22:01,751 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0128 | Val mean-roc_auc_score: 0.8950
2025-09-25 23:22:21,091 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0198 | Val mean-roc_auc_score: 0.8971
2025-09-25 23:22:37,467 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0173 | Val mean-roc_auc_score: 0.8949
2025-09-25 23:22:57,057 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0090 | Val mean-roc_auc_score: 0.8961
2025-09-25 23:23:13,688 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0096 | Val mean-roc_auc_score: 0.8942
2025-09-25 23:23:30,246 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0114 | Val mean-roc_auc_score: 0.8965
2025-09-25 23:23:49,358 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0120 | Val mean-roc_auc_score: 0.8953
2025-09-25 23:24:05,527 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0158 | Val mean-roc_auc_score: 0.8970
2025-09-25 23:24:24,339 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0146 | Val mean-roc_auc_score: 0.8972
2025-09-25 23:24:43,535 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0071 | Val mean-roc_auc_score: 0.8955
2025-09-25 23:25:00,242 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0147 | Val mean-roc_auc_score: 0.8985
2025-09-25 23:25:16,914 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0087 | Val mean-roc_auc_score: 0.8968
2025-09-25 23:25:36,144 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0093 | Val mean-roc_auc_score: 0.8968
2025-09-25 23:25:53,535 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0075 | Val mean-roc_auc_score: 0.8963
2025-09-25 23:26:10,397 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0086 | Val mean-roc_auc_score: 0.8959
2025-09-25 23:26:29,844 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0080 | Val mean-roc_auc_score: 0.8963
2025-09-25 23:26:47,156 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0146 | Val mean-roc_auc_score: 0.8999
2025-09-25 23:27:04,493 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0094 | Val mean-roc_auc_score: 0.8948
2025-09-25 23:27:24,760 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0147 | Val mean-roc_auc_score: 0.8916
|
| 204 |
+
2025-09-25 23:27:41,243 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0173 | Val mean-roc_auc_score: 0.8981
|
| 205 |
+
2025-09-25 23:27:57,791 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0091 | Val mean-roc_auc_score: 0.8989
|
| 206 |
+
2025-09-25 23:28:17,480 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0079 | Val mean-roc_auc_score: 0.8977
|
| 207 |
+
2025-09-25 23:28:34,864 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0046 | Val mean-roc_auc_score: 0.8988
|
| 208 |
+
2025-09-25 23:28:52,026 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0064 | Val mean-roc_auc_score: 0.8994
|
| 209 |
+
2025-09-25 23:29:11,592 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0097 | Val mean-roc_auc_score: 0.8981
|
| 210 |
+
2025-09-25 23:29:29,650 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0084 | Val mean-roc_auc_score: 0.8955
|
| 211 |
+
2025-09-25 23:29:46,235 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0080 | Val mean-roc_auc_score: 0.8954
|
| 212 |
+
2025-09-25 23:30:05,343 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0805 | Val mean-roc_auc_score: 0.8871
|
| 213 |
+
2025-09-25 23:30:22,989 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0408 | Val mean-roc_auc_score: 0.8844
|
| 214 |
+
2025-09-25 23:30:39,942 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0165 | Val mean-roc_auc_score: 0.8875
|
| 215 |
+
2025-09-25 23:30:59,815 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0222 | Val mean-roc_auc_score: 0.8955
|
| 216 |
+
2025-09-25 23:31:16,377 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0148 | Val mean-roc_auc_score: 0.8984
|
| 217 |
+
2025-09-25 23:31:32,939 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0108 | Val mean-roc_auc_score: 0.8983
|
| 218 |
+
2025-09-25 23:31:53,050 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0058 | Val mean-roc_auc_score: 0.8986
|
| 219 |
+
2025-09-25 23:32:09,789 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0081 | Val mean-roc_auc_score: 0.8989
|
| 220 |
+
2025-09-25 23:32:29,014 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0094 | Val mean-roc_auc_score: 0.9011
|
| 221 |
+
2025-09-25 23:32:45,490 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0089 | Val mean-roc_auc_score: 0.9003
|
| 222 |
+
2025-09-25 23:33:01,952 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0134 | Val mean-roc_auc_score: 0.9018
|
| 223 |
+
2025-09-25 23:33:22,203 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0122 | Val mean-roc_auc_score: 0.9008
|
| 224 |
+
2025-09-25 23:33:38,615 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0082 | Val mean-roc_auc_score: 0.9036
|
| 225 |
+
2025-09-25 23:33:55,456 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0059 | Val mean-roc_auc_score: 0.9035
|
| 226 |
+
2025-09-25 23:34:15,133 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0088 | Val mean-roc_auc_score: 0.9026
|
| 227 |
+
2025-09-25 23:34:31,586 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0044 | Val mean-roc_auc_score: 0.9018
|
| 228 |
+
2025-09-25 23:34:49,009 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0086 | Val mean-roc_auc_score: 0.8978
|
| 229 |
+
2025-09-25 23:35:08,496 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0066 | Val mean-roc_auc_score: 0.9007
|
| 230 |
+
2025-09-25 23:35:24,956 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0080 | Val mean-roc_auc_score: 0.8992
|
| 231 |
+
2025-09-25 23:35:41,063 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.9000
|
| 232 |
+
2025-09-25 23:36:00,262 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0028 | Val mean-roc_auc_score: 0.9004
|
| 233 |
+
2025-09-25 23:36:17,151 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0057 | Val mean-roc_auc_score: 0.8999
|
| 234 |
+
2025-09-25 23:36:33,382 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0067 | Val mean-roc_auc_score: 0.9020
|
| 235 |
+
2025-09-25 23:36:53,220 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0062 | Val mean-roc_auc_score: 0.9007
|
| 236 |
+
2025-09-25 23:37:09,358 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0054 | Val mean-roc_auc_score: 0.9011
|
| 237 |
+
2025-09-25 23:37:28,297 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0057 | Val mean-roc_auc_score: 0.9020
|
| 238 |
+
2025-09-25 23:37:45,152 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.9008
|
| 239 |
+
2025-09-25 23:38:03,234 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.9012
|
| 240 |
+
2025-09-25 23:38:24,430 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0047 | Val mean-roc_auc_score: 0.9004
|
| 241 |
+
2025-09-25 23:38:41,811 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0079 | Val mean-roc_auc_score: 0.9000
|
| 242 |
+
2025-09-25 23:38:59,464 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0050 | Val mean-roc_auc_score: 0.9015
|
| 243 |
+
2025-09-25 23:39:17,683 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.9043
|
| 244 |
+
2025-09-25 23:39:37,844 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0037 | Val mean-roc_auc_score: 0.9038
|
| 245 |
+
2025-09-25 23:39:55,411 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.9037
|
| 246 |
+
2025-09-25 23:40:12,307 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0061 | Val mean-roc_auc_score: 0.9020
|
| 247 |
+
2025-09-25 23:40:31,280 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0061 | Val mean-roc_auc_score: 0.8991
|
| 248 |
+
2025-09-25 23:40:48,678 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0046 | Val mean-roc_auc_score: 0.9009
|
| 249 |
+
2025-09-25 23:41:05,946 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0054 | Val mean-roc_auc_score: 0.9012
|
| 250 |
+
2025-09-25 23:41:26,295 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0103 | Val mean-roc_auc_score: 0.9011
|
| 251 |
+
2025-09-25 23:41:43,783 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0052 | Val mean-roc_auc_score: 0.9031
|
| 252 |
+
2025-09-25 23:41:44,853 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8892
|
| 253 |
+
2025-09-25 23:41:45,231 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset antimalarial at 2025-09-25_23-41-45
|
| 254 |
+
2025-09-25 23:42:01,000 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5094 | Val mean-roc_auc_score: 0.7974
|
| 255 |
+
2025-09-25 23:42:01,000 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 120
|
| 256 |
+
2025-09-25 23:42:01,893 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.7974
|
| 257 |
+
2025-09-25 23:42:21,762 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4281 | Val mean-roc_auc_score: 0.8639
|
| 258 |
+
2025-09-25 23:42:22,024 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 240
|
| 259 |
+
2025-09-25 23:42:22,928 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8639
|
| 260 |
+
2025-09-25 23:42:39,389 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3646 | Val mean-roc_auc_score: 0.8782
|
| 261 |
+
2025-09-25 23:42:39,588 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 360
|
| 262 |
+
2025-09-25 23:42:40,261 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val mean-roc_auc_score: 0.8782
|
| 263 |
+
2025-09-25 23:42:56,323 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3047 | Val mean-roc_auc_score: 0.8841
|
| 264 |
+
2025-09-25 23:42:56,540 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 480
|
| 265 |
+
2025-09-25 23:42:57,190 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val mean-roc_auc_score: 0.8841
|
| 266 |
+
2025-09-25 23:43:17,060 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2500 | Val mean-roc_auc_score: 0.8984
|
| 267 |
+
2025-09-25 23:43:17,235 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Global step of best model: 600
|
| 268 |
+
2025-09-25 23:43:17,892 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val mean-roc_auc_score: 0.8984
|
| 269 |
+
2025-09-25 23:43:34,713 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.2406 | Val mean-roc_auc_score: 0.8885
|
| 270 |
+
2025-09-25 23:43:52,620 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1344 | Val mean-roc_auc_score: 0.8958
|
| 271 |
+
2025-09-25 23:44:13,142 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1417 | Val mean-roc_auc_score: 0.8924
|
| 272 |
+
2025-09-25 23:44:31,114 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1187 | Val mean-roc_auc_score: 0.8877
|
| 273 |
+
2025-09-25 23:44:48,517 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0850 | Val mean-roc_auc_score: 0.8856
|
| 274 |
+
2025-09-25 23:45:08,781 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0934 | Val mean-roc_auc_score: 0.8899
|
| 275 |
+
2025-09-25 23:45:26,240 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0652 | Val mean-roc_auc_score: 0.8956
|
| 276 |
+
2025-09-25 23:45:43,610 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.1125 | Val mean-roc_auc_score: 0.8945
|
| 277 |
+
2025-09-25 23:46:03,412 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0727 | Val mean-roc_auc_score: 0.8825
|
| 278 |
+
2025-09-25 23:46:20,351 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0537 | Val mean-roc_auc_score: 0.8899
|
| 279 |
+
2025-09-25 23:46:37,311 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0523 | Val mean-roc_auc_score: 0.8883
|
| 280 |
+
2025-09-25 23:46:58,155 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0570 | Val mean-roc_auc_score: 0.8842
|
| 281 |
+
2025-09-25 23:47:15,023 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0479 | Val mean-roc_auc_score: 0.8905
|
| 282 |
+
2025-09-25 23:47:31,985 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0396 | Val mean-roc_auc_score: 0.8883
|
| 283 |
+
2025-09-25 23:47:51,538 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0353 | Val mean-roc_auc_score: 0.8877
|
| 284 |
+
2025-09-25 23:48:07,978 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0316 | Val mean-roc_auc_score: 0.8895
|
| 285 |
+
2025-09-25 23:48:25,508 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0361 | Val mean-roc_auc_score: 0.8875
|
| 286 |
+
2025-09-25 23:48:45,202 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0190 | Val mean-roc_auc_score: 0.8853
|
| 287 |
+
2025-09-25 23:49:02,540 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0326 | Val mean-roc_auc_score: 0.8813
|
| 288 |
+
2025-09-25 23:49:19,898 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0334 | Val mean-roc_auc_score: 0.8898
|
| 289 |
+
2025-09-25 23:49:38,845 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0459 | Val mean-roc_auc_score: 0.8904
|
| 290 |
+
2025-09-25 23:49:56,265 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0283 | Val mean-roc_auc_score: 0.8924
|
| 291 |
+
2025-09-25 23:50:12,686 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0156 | Val mean-roc_auc_score: 0.8915
|
| 292 |
+
2025-09-25 23:50:32,251 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0217 | Val mean-roc_auc_score: 0.8906
|
| 293 |
+
2025-09-25 23:50:48,946 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0244 | Val mean-roc_auc_score: 0.8907
|
| 294 |
+
2025-09-25 23:51:08,005 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0291 | Val mean-roc_auc_score: 0.8857
|
| 295 |
+
2025-09-25 23:51:25,548 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0129 | Val mean-roc_auc_score: 0.8897
|
| 296 |
+
2025-09-25 23:51:42,311 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0146 | Val mean-roc_auc_score: 0.8861
|
| 297 |
+
2025-09-25 23:52:02,049 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0273 | Val mean-roc_auc_score: 0.8869
|
| 298 |
+
2025-09-25 23:52:17,326 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0152 | Val mean-roc_auc_score: 0.8854
|
| 299 |
+
2025-09-25 23:52:35,043 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0158 | Val mean-roc_auc_score: 0.8828
|
| 300 |
+
2025-09-25 23:52:54,626 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0180 | Val mean-roc_auc_score: 0.8825
|
| 301 |
+
2025-09-25 23:53:11,386 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0139 | Val mean-roc_auc_score: 0.8858
|
| 302 |
+
2025-09-25 23:53:27,722 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0171 | Val mean-roc_auc_score: 0.8846
|
| 303 |
+
2025-09-25 23:53:47,342 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0126 | Val mean-roc_auc_score: 0.8852
|
| 304 |
+
2025-09-25 23:54:03,613 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0150 | Val mean-roc_auc_score: 0.8830
|
| 305 |
+
2025-09-25 23:54:22,435 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0137 | Val mean-roc_auc_score: 0.8825
|
| 306 |
+
2025-09-25 23:54:43,026 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0092 | Val mean-roc_auc_score: 0.8791
|
| 307 |
+
2025-09-25 23:55:01,016 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0115 | Val mean-roc_auc_score: 0.8788
|
| 308 |
+
2025-09-25 23:55:18,614 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0112 | Val mean-roc_auc_score: 0.8839
|
| 309 |
+
2025-09-25 23:55:38,493 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0084 | Val mean-roc_auc_score: 0.8815
|
| 310 |
+
2025-09-25 23:55:55,374 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0079 | Val mean-roc_auc_score: 0.8843
|
| 311 |
+
2025-09-25 23:56:12,209 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0100 | Val mean-roc_auc_score: 0.8807
|
| 312 |
+
2025-09-25 23:56:31,950 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0097 | Val mean-roc_auc_score: 0.8822
|
| 313 |
+
2025-09-25 23:56:49,737 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0073 | Val mean-roc_auc_score: 0.8830
|
| 314 |
+
2025-09-25 23:57:05,801 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0206 | Val mean-roc_auc_score: 0.8815
|
| 315 |
+
2025-09-25 23:57:25,971 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0140 | Val mean-roc_auc_score: 0.8809
|
| 316 |
+
2025-09-25 23:57:42,863 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0123 | Val mean-roc_auc_score: 0.8815
|
| 317 |
+
2025-09-25 23:57:59,969 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0140 | Val mean-roc_auc_score: 0.8807
|
| 318 |
+
2025-09-25 23:58:19,846 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0138 | Val mean-roc_auc_score: 0.8779
|
| 319 |
+
2025-09-25 23:58:35,876 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0134 | Val mean-roc_auc_score: 0.8844
|
| 320 |
+
2025-09-25 23:58:52,559 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0083 | Val mean-roc_auc_score: 0.8831
|
| 321 |
+
2025-09-25 23:59:11,811 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0073 | Val mean-roc_auc_score: 0.8852
|
| 322 |
+
2025-09-25 23:59:28,689 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0102 | Val mean-roc_auc_score: 0.8845
|
| 323 |
+
2025-09-25 23:59:45,412 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0074 | Val mean-roc_auc_score: 0.8831
|
| 324 |
+
2025-09-26 00:00:04,576 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0069 | Val mean-roc_auc_score: 0.8815
|
| 325 |
+
2025-09-26 00:00:21,459 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0070 | Val mean-roc_auc_score: 0.8837
|
| 326 |
+
2025-09-26 00:00:40,736 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0084 | Val mean-roc_auc_score: 0.8829
|
| 327 |
+
2025-09-26 00:00:57,523 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0066 | Val mean-roc_auc_score: 0.8826
|
| 328 |
+
2025-09-26 00:01:14,876 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0065 | Val mean-roc_auc_score: 0.8842
|
| 329 |
+
2025-09-26 00:01:34,486 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0070 | Val mean-roc_auc_score: 0.8832
|
| 330 |
+
2025-09-26 00:01:51,399 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0041 | Val mean-roc_auc_score: 0.8819
|
| 331 |
+
2025-09-26 00:02:08,280 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0062 | Val mean-roc_auc_score: 0.8840
|
| 332 |
+
2025-09-26 00:02:28,029 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0072 | Val mean-roc_auc_score: 0.8816
|
| 333 |
+
2025-09-26 00:02:44,976 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0082 | Val mean-roc_auc_score: 0.8829
|
| 334 |
+
2025-09-26 00:03:01,635 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0045 | Val mean-roc_auc_score: 0.8832
|
| 335 |
+
2025-09-26 00:03:21,761 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.8835
|
| 336 |
+
2025-09-26 00:03:38,919 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0059 | Val mean-roc_auc_score: 0.8810
|
| 337 |
+
2025-09-26 00:03:56,029 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0079 | Val mean-roc_auc_score: 0.8828
|
| 338 |
+
2025-09-26 00:04:16,890 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0049 | Val mean-roc_auc_score: 0.8825
|
| 339 |
+
2025-09-26 00:04:33,954 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0123 | Val mean-roc_auc_score: 0.8828
|
| 340 |
+
2025-09-26 00:04:51,673 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0084 | Val mean-roc_auc_score: 0.8838
|
| 341 |
+
2025-09-26 00:05:11,981 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0040 | Val mean-roc_auc_score: 0.8838
|
| 342 |
+
2025-09-26 00:05:28,600 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0041 | Val mean-roc_auc_score: 0.8842
|
| 343 |
+
2025-09-26 00:05:45,281 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0041 | Val mean-roc_auc_score: 0.8845
|
| 344 |
+
2025-09-26 00:06:04,720 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0103 | Val mean-roc_auc_score: 0.8819
|
| 345 |
+
2025-09-26 00:06:21,795 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0045 | Val mean-roc_auc_score: 0.8839
|
| 346 |
+
2025-09-26 00:06:38,695 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0057 | Val mean-roc_auc_score: 0.8837
|
| 347 |
+
2025-09-26 00:06:59,057 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0059 | Val mean-roc_auc_score: 0.8828
|
| 348 |
+
2025-09-26 00:07:16,131 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0056 | Val mean-roc_auc_score: 0.8823
|
| 349 |
+
2025-09-26 00:07:32,800 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0043 | Val mean-roc_auc_score: 0.8829
|
| 350 |
+
2025-09-26 00:07:52,405 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0053 | Val mean-roc_auc_score: 0.8821
|
| 351 |
+
2025-09-26 00:08:09,095 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0063 | Val mean-roc_auc_score: 0.8804
|
| 352 |
+
2025-09-26 00:08:26,231 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0081 | Val mean-roc_auc_score: 0.8840
|
| 353 |
+
2025-09-26 00:08:45,994 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0065 | Val mean-roc_auc_score: 0.8844
|
| 354 |
+
2025-09-26 00:09:04,506 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0069 | Val mean-roc_auc_score: 0.8846
|
| 355 |
+
2025-09-26 00:09:23,148 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0037 | Val mean-roc_auc_score: 0.8848
|
| 356 |
+
2025-09-26 00:09:43,008 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0044 | Val mean-roc_auc_score: 0.8850
|
| 357 |
+
2025-09-26 00:10:00,374 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0035 | Val mean-roc_auc_score: 0.8854
|
| 358 |
+
2025-09-26 00:10:18,544 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0029 | Val mean-roc_auc_score: 0.8856
|
| 359 |
+
2025-09-26 00:10:39,542 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0112 | Val mean-roc_auc_score: 0.8783
|
| 360 |
+
2025-09-26 00:10:56,911 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0091 | Val mean-roc_auc_score: 0.8789
|
| 361 |
+
2025-09-26 00:11:15,635 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0083 | Val mean-roc_auc_score: 0.8803
|
| 362 |
+
2025-09-26 00:11:34,936 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0056 | Val mean-roc_auc_score: 0.8800
|
| 363 |
+
2025-09-26 00:11:51,580 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0058 | Val mean-roc_auc_score: 0.8813
|
| 364 |
+
2025-09-26 00:11:52,283 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8792
|
| 365 |
+
2025-09-26 00:11:52,749 - logs_modchembert_antimalarial_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg mean-roc_auc_score: 0.8819, Std Dev: 0.0052
|
logs_modchembert_classification_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_cocrystal_epochs100_batch_size32_20250926_032547.log
ADDED
@@ -0,0 +1,349 @@
+2025-09-26 03:25:47,078 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Running benchmark for dataset: cocrystal
+2025-09-26 03:25:47,078 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - dataset: cocrystal, tasks: ['label'], epochs: 100, learning rate: 3e-05
+2025-09-26 03:25:47,083 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset cocrystal at 2025-09-26_03-25-47
+2025-09-26 03:25:55,449 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6520 | Val mean-roc_auc_score: 0.6901
+2025-09-26 03:25:55,449 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 37
+2025-09-26 03:25:56,355 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.6901
+2025-09-26 03:26:05,874 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4899 | Val mean-roc_auc_score: 0.8204
+2025-09-26 03:26:06,060 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 74
+2025-09-26 03:26:06,623 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8204
+2025-09-26 03:26:17,257 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.4460 | Val mean-roc_auc_score: 0.8560
+2025-09-26 03:26:17,473 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 111
+2025-09-26 03:26:18,049 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val mean-roc_auc_score: 0.8560
+2025-09-26 03:26:24,894 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.4071 | Val mean-roc_auc_score: 0.8234
+2025-09-26 03:26:35,032 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.3834 | Val mean-roc_auc_score: 0.8247
+2025-09-26 03:26:45,559 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.3438 | Val mean-roc_auc_score: 0.8507
+2025-09-26 03:26:53,944 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.3074 | Val mean-roc_auc_score: 0.8527
+2025-09-26 03:27:04,456 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.2872 | Val mean-roc_auc_score: 0.8545
+2025-09-26 03:27:14,578 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.3087 | Val mean-roc_auc_score: 0.8635
+2025-09-26 03:27:14,754 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 333
+2025-09-26 03:27:15,383 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 9 with val mean-roc_auc_score: 0.8635
|
| 21 |
+
2025-09-26 03:27:23,101 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.2365 | Val mean-roc_auc_score: 0.8800
|
| 22 |
+
2025-09-26 03:27:23,324 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 370
|
| 23 |
+
2025-09-26 03:27:24,006 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 10 with val mean-roc_auc_score: 0.8800
|
| 24 |
+
2025-09-26 03:27:34,608 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.2801 | Val mean-roc_auc_score: 0.8866
|
| 25 |
+
2025-09-26 03:27:35,164 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 407
|
| 26 |
+
2025-09-26 03:27:35,871 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 11 with val mean-roc_auc_score: 0.8866
|
| 27 |
+
2025-09-26 03:27:45,557 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.2002 | Val mean-roc_auc_score: 0.8889
|
| 28 |
+
2025-09-26 03:27:45,803 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 444
|
| 29 |
+
2025-09-26 03:27:46,547 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 12 with val mean-roc_auc_score: 0.8889
|
| 30 |
+
2025-09-26 03:27:52,944 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.1681 | Val mean-roc_auc_score: 0.8778
|
| 31 |
+
2025-09-26 03:28:01,477 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.2448 | Val mean-roc_auc_score: 0.8815
|
| 32 |
+
2025-09-26 03:28:10,720 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.1571 | Val mean-roc_auc_score: 0.8705
|
| 33 |
+
2025-09-26 03:28:20,084 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.1579 | Val mean-roc_auc_score: 0.8337
|
| 34 |
+
2025-09-26 03:28:27,131 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.1185 | Val mean-roc_auc_score: 0.8346
|
| 35 |
+
2025-09-26 03:28:36,198 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.1596 | Val mean-roc_auc_score: 0.8454
|
| 36 |
+
2025-09-26 03:28:45,042 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0566 | Val mean-roc_auc_score: 0.8218
|
| 37 |
+
2025-09-26 03:28:51,826 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.1123 | Val mean-roc_auc_score: 0.8535
|
| 38 |
+
2025-09-26 03:29:01,300 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.1043 | Val mean-roc_auc_score: 0.8158
|
| 39 |
+
2025-09-26 03:29:11,041 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.1708 | Val mean-roc_auc_score: 0.8659
|
| 40 |
+
2025-09-26 03:29:20,203 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.1149 | Val mean-roc_auc_score: 0.8631
|
| 41 |
+
2025-09-26 03:29:27,058 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0519 | Val mean-roc_auc_score: 0.8452
|
| 42 |
+
2025-09-26 03:29:36,686 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0400 | Val mean-roc_auc_score: 0.8350
|
| 43 |
+
2025-09-26 03:29:47,071 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0291 | Val mean-roc_auc_score: 0.8296
|
| 44 |
+
2025-09-26 03:29:56,852 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0513 | Val mean-roc_auc_score: 0.8117
|
| 45 |
+
2025-09-26 03:30:06,372 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0942 | Val mean-roc_auc_score: 0.8422
|
| 46 |
+
2025-09-26 03:30:15,772 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0600 | Val mean-roc_auc_score: 0.8424
|
| 47 |
+
2025-09-26 03:30:22,995 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0699 | Val mean-roc_auc_score: 0.8466
|
| 48 |
+
2025-09-26 03:30:32,631 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0382 | Val mean-roc_auc_score: 0.8422
|
| 49 |
+
2025-09-26 03:30:42,065 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0308 | Val mean-roc_auc_score: 0.8450
|
| 50 |
+
2025-09-26 03:30:48,956 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0242 | Val mean-roc_auc_score: 0.8562
|
| 51 |
+
2025-09-26 03:30:58,432 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0566 | Val mean-roc_auc_score: 0.8591
|
| 52 |
+
2025-09-26 03:31:08,522 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0253 | Val mean-roc_auc_score: 0.8500
|
| 53 |
+
2025-09-26 03:31:18,525 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0229 | Val mean-roc_auc_score: 0.8474
|
| 54 |
+
2025-09-26 03:31:25,813 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0109 | Val mean-roc_auc_score: 0.8489
|
| 55 |
+
2025-09-26 03:31:35,200 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0226 | Val mean-roc_auc_score: 0.8390
|
| 56 |
+
2025-09-26 03:31:44,551 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0125 | Val mean-roc_auc_score: 0.8424
|
| 57 |
+
2025-09-26 03:31:51,587 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0148 | Val mean-roc_auc_score: 0.8444
|
| 58 |
+
2025-09-26 03:32:01,424 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0081 | Val mean-roc_auc_score: 0.8445
|
| 59 |
+
2025-09-26 03:32:11,133 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0106 | Val mean-roc_auc_score: 0.8431
|
| 60 |
+
2025-09-26 03:32:18,012 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0076 | Val mean-roc_auc_score: 0.8390
|
| 61 |
+
2025-09-26 03:32:27,718 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0041 | Val mean-roc_auc_score: 0.8370
|
| 62 |
+
2025-09-26 03:32:37,061 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0049 | Val mean-roc_auc_score: 0.8411
|
| 63 |
+
2025-09-26 03:32:46,448 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0012 | Val mean-roc_auc_score: 0.8417
|
| 64 |
+
2025-09-26 03:32:53,557 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0081 | Val mean-roc_auc_score: 0.8360
|
| 65 |
+
2025-09-26 03:33:02,719 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0110 | Val mean-roc_auc_score: 0.8366
|
| 66 |
+
2025-09-26 03:33:11,506 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0063 | Val mean-roc_auc_score: 0.8293
|
| 67 |
+
2025-09-26 03:33:18,208 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0038 | Val mean-roc_auc_score: 0.8239
|
| 68 |
+
2025-09-26 03:33:27,308 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0037 | Val mean-roc_auc_score: 0.8203
|
| 69 |
+
2025-09-26 03:33:36,553 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0044 | Val mean-roc_auc_score: 0.8206
|
| 70 |
+
2025-09-26 03:33:46,137 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0035 | Val mean-roc_auc_score: 0.8117
|
| 71 |
+
2025-09-26 03:33:53,419 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0044 | Val mean-roc_auc_score: 0.8100
|
| 72 |
+
2025-09-26 03:34:03,411 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0424 | Val mean-roc_auc_score: 0.8705
|
| 73 |
+
2025-09-26 03:34:12,387 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0378 | Val mean-roc_auc_score: 0.8535
|
| 74 |
+
2025-09-26 03:34:19,024 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0694 | Val mean-roc_auc_score: 0.8623
|
| 75 |
+
2025-09-26 03:34:27,894 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0574 | Val mean-roc_auc_score: 0.8127
|
| 76 |
+
2025-09-26 03:34:36,580 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0266 | Val mean-roc_auc_score: 0.8252
|
| 77 |
+
2025-09-26 03:34:45,573 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0111 | Val mean-roc_auc_score: 0.8412
|
| 78 |
+
2025-09-26 03:34:52,318 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0079 | Val mean-roc_auc_score: 0.8390
|
| 79 |
+
2025-09-26 03:35:01,446 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0054 | Val mean-roc_auc_score: 0.8354
|
| 80 |
+
2025-09-26 03:35:10,546 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0068 | Val mean-roc_auc_score: 0.8350
|
| 81 |
+
2025-09-26 03:35:16,430 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0054 | Val mean-roc_auc_score: 0.8394
|
| 82 |
+
2025-09-26 03:35:25,617 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0043 | Val mean-roc_auc_score: 0.8402
|
| 83 |
+
2025-09-26 03:35:34,483 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0046 | Val mean-roc_auc_score: 0.8399
|
| 84 |
+
2025-09-26 03:35:43,560 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0045 | Val mean-roc_auc_score: 0.8387
|
| 85 |
+
2025-09-26 03:35:50,380 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0027 | Val mean-roc_auc_score: 0.8372
|
| 86 |
+
2025-09-26 03:35:58,849 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0036 | Val mean-roc_auc_score: 0.8377
|
| 87 |
+
2025-09-26 03:36:08,043 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0033 | Val mean-roc_auc_score: 0.8378
|
| 88 |
+
2025-09-26 03:36:14,712 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0025 | Val mean-roc_auc_score: 0.8378
|
| 89 |
+
2025-09-26 03:36:24,062 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0040 | Val mean-roc_auc_score: 0.8364
|
| 90 |
+
2025-09-26 03:36:32,663 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0008 | Val mean-roc_auc_score: 0.8369
|
| 91 |
+
2025-09-26 03:36:41,569 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0025 | Val mean-roc_auc_score: 0.8363
|
| 92 |
+
2025-09-26 03:36:47,953 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0031 | Val mean-roc_auc_score: 0.8310
|
| 93 |
+
2025-09-26 03:36:57,043 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0019 | Val mean-roc_auc_score: 0.8291
|
| 94 |
+
2025-09-26 03:37:06,233 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0036 | Val mean-roc_auc_score: 0.8290
|
| 95 |
+
2025-09-26 03:37:15,112 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0031 | Val mean-roc_auc_score: 0.8271
|
| 96 |
+
2025-09-26 03:37:21,205 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0028 | Val mean-roc_auc_score: 0.8259
|
| 97 |
+
2025-09-26 03:37:31,152 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0023 | Val mean-roc_auc_score: 0.8264
|
| 98 |
+
2025-09-26 03:37:41,519 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0019 | Val mean-roc_auc_score: 0.8264
|
| 99 |
+
2025-09-26 03:37:52,288 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0023 | Val mean-roc_auc_score: 0.8264
|
| 100 |
+
2025-09-26 03:38:01,137 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0022 | Val mean-roc_auc_score: 0.8268
|
| 101 |
+
2025-09-26 03:38:09,818 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0034 | Val mean-roc_auc_score: 0.8260
|
| 102 |
+
2025-09-26 03:38:15,947 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0021 | Val mean-roc_auc_score: 0.8259
|
| 103 |
+
2025-09-26 03:38:25,033 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0024 | Val mean-roc_auc_score: 0.8255
|
| 104 |
+
2025-09-26 03:38:34,570 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0037 | Val mean-roc_auc_score: 0.8252
|
| 105 |
+
2025-09-26 03:38:43,884 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0015 | Val mean-roc_auc_score: 0.8256
|
| 106 |
+
2025-09-26 03:38:50,211 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0025 | Val mean-roc_auc_score: 0.8253
|
| 107 |
+
2025-09-26 03:38:59,162 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0026 | Val mean-roc_auc_score: 0.8252
|
| 108 |
+
2025-09-26 03:39:08,145 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0016 | Val mean-roc_auc_score: 0.8252
|
| 109 |
+
2025-09-26 03:39:15,155 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0017 | Val mean-roc_auc_score: 0.8365
|
| 110 |
+
2025-09-26 03:39:24,435 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0026 | Val mean-roc_auc_score: 0.8334
|
| 111 |
+
2025-09-26 03:39:33,615 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0029 | Val mean-roc_auc_score: 0.8323
|
| 112 |
+
2025-09-26 03:39:42,886 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0250 | Val mean-roc_auc_score: 0.8307
|
| 113 |
+
2025-09-26 03:39:49,667 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0151 | Val mean-roc_auc_score: 0.8391
|
| 114 |
+
2025-09-26 03:39:59,390 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0065 | Val mean-roc_auc_score: 0.8411
|
| 115 |
+
2025-09-26 03:40:08,390 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0022 | Val mean-roc_auc_score: 0.8398
|
| 116 |
+
2025-09-26 03:40:14,587 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0033 | Val mean-roc_auc_score: 0.8354
|
| 117 |
+
2025-09-26 03:40:23,272 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0024 | Val mean-roc_auc_score: 0.8342
|
| 118 |
+
2025-09-26 03:40:24,300 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8421
|
| 119 |
+
2025-09-26 03:40:24,704 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset cocrystal at 2025-09-26_03-40-24
|
| 120 |
+
2025-09-26 03:40:32,091 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6047 | Val mean-roc_auc_score: 0.8130
|
| 121 |
+
2025-09-26 03:40:32,091 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 37
|
| 122 |
+
2025-09-26 03:40:32,748 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.8130
|
| 123 |
+
2025-09-26 03:40:41,431 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4628 | Val mean-roc_auc_score: 0.8358
|
| 124 |
+
2025-09-26 03:40:41,645 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 74
|
| 125 |
+
2025-09-26 03:40:42,302 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8358
|
| 126 |
+
2025-09-26 03:40:49,208 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.4091 | Val mean-roc_auc_score: 0.8739
|
| 127 |
+
2025-09-26 03:40:49,421 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 111
|
| 128 |
+
2025-09-26 03:40:50,135 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val mean-roc_auc_score: 0.8739
|
| 129 |
+
2025-09-26 03:40:59,245 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3530 | Val mean-roc_auc_score: 0.8620
|
| 130 |
+
2025-09-26 03:41:08,168 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.3328 | Val mean-roc_auc_score: 0.8628
|
| 131 |
+
2025-09-26 03:41:13,541 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.2969 | Val mean-roc_auc_score: 0.8998
|
| 132 |
+
2025-09-26 03:41:14,088 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 222
|
| 133 |
+
2025-09-26 03:41:14,731 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 6 with val mean-roc_auc_score: 0.8998
|
| 134 |
+
2025-09-26 03:41:23,987 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.2703 | Val mean-roc_auc_score: 0.8737
|
| 135 |
+
2025-09-26 03:41:32,720 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.2652 | Val mean-roc_auc_score: 0.8989
|
| 136 |
+
2025-09-26 03:41:41,056 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.2292 | Val mean-roc_auc_score: 0.8909
|
| 137 |
+
2025-09-26 03:41:47,858 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.2264 | Val mean-roc_auc_score: 0.9117
|
| 138 |
+
2025-09-26 03:41:48,059 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 370
|
| 139 |
+
2025-09-26 03:41:48,736 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 10 with val mean-roc_auc_score: 0.9117
|
| 140 |
+
2025-09-26 03:41:58,666 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.1942 | Val mean-roc_auc_score: 0.8883
|
| 141 |
+
2025-09-26 03:42:09,003 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.1951 | Val mean-roc_auc_score: 0.8512
|
| 142 |
+
2025-09-26 03:42:16,355 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.1681 | Val mean-roc_auc_score: 0.8758
|
| 143 |
+
2025-09-26 03:42:25,925 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.1476 | Val mean-roc_auc_score: 0.8583
|
| 144 |
+
2025-09-26 03:42:35,358 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.1191 | Val mean-roc_auc_score: 0.8650
|
| 145 |
+
2025-09-26 03:42:42,249 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.1056 | Val mean-roc_auc_score: 0.8653
|
| 146 |
+
2025-09-26 03:42:52,077 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.1606 | Val mean-roc_auc_score: 0.8701
|
| 147 |
+
2025-09-26 03:43:01,300 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.1166 | Val mean-roc_auc_score: 0.8710
|
| 148 |
+
2025-09-26 03:43:10,462 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.1387 | Val mean-roc_auc_score: 0.8614
|
| 149 |
+
2025-09-26 03:43:17,123 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0587 | Val mean-roc_auc_score: 0.8675
|
| 150 |
+
2025-09-26 03:43:26,002 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0912 | Val mean-roc_auc_score: 0.8314
|
| 151 |
+
2025-09-26 03:43:35,568 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0943 | Val mean-roc_auc_score: 0.8570
|
| 152 |
+
2025-09-26 03:43:42,422 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0490 | Val mean-roc_auc_score: 0.8587
|
| 153 |
+
2025-09-26 03:43:51,644 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0319 | Val mean-roc_auc_score: 0.8636
|
| 154 |
+
2025-09-26 03:43:59,518 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0831 | Val mean-roc_auc_score: 0.8580
|
| 155 |
+
2025-09-26 03:44:08,652 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0756 | Val mean-roc_auc_score: 0.8590
|
| 156 |
+
2025-09-26 03:44:16,831 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0785 | Val mean-roc_auc_score: 0.8375
|
| 157 |
+
2025-09-26 03:44:26,919 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0527 | Val mean-roc_auc_score: 0.8465
|
| 158 |
+
2025-09-26 03:44:36,788 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0209 | Val mean-roc_auc_score: 0.8358
|
| 159 |
+
2025-09-26 03:44:43,911 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0101 | Val mean-roc_auc_score: 0.8360
|
| 160 |
+
2025-09-26 03:44:53,523 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0210 | Val mean-roc_auc_score: 0.8456
|
| 161 |
+
2025-09-26 03:45:04,226 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0353 | Val mean-roc_auc_score: 0.8133
|
| 162 |
+
2025-09-26 03:45:12,117 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0322 | Val mean-roc_auc_score: 0.8100
|
| 163 |
+
2025-09-26 03:45:22,117 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0190 | Val mean-roc_auc_score: 0.8171
|
| 164 |
+
2025-09-26 03:45:32,198 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0583 | Val mean-roc_auc_score: 0.8155
|
| 165 |
+
2025-09-26 03:45:42,148 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.1055 | Val mean-roc_auc_score: 0.8387
|
| 166 |
+
2025-09-26 03:45:50,077 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0414 | Val mean-roc_auc_score: 0.8271
|
| 167 |
+
2025-09-26 03:45:59,874 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0083 | Val mean-roc_auc_score: 0.8202
|
| 168 |
+
2025-09-26 03:46:09,303 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0148 | Val mean-roc_auc_score: 0.8276
|
| 169 |
+
2025-09-26 03:46:16,329 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0096 | Val mean-roc_auc_score: 0.8282
|
| 170 |
+
2025-09-26 03:46:25,821 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0111 | Val mean-roc_auc_score: 0.8281
|
| 171 |
+
2025-09-26 03:46:35,628 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0063 | Val mean-roc_auc_score: 0.8288
|
| 172 |
+
2025-09-26 03:46:42,617 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0090 | Val mean-roc_auc_score: 0.8244
|
| 173 |
+
2025-09-26 03:46:51,852 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0095 | Val mean-roc_auc_score: 0.8221
|
| 174 |
+
2025-09-26 03:47:01,156 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0071 | Val mean-roc_auc_score: 0.8193
|
| 175 |
+
2025-09-26 03:47:10,446 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0532 | Val mean-roc_auc_score: 0.8269
|
| 176 |
+
2025-09-26 03:47:17,794 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0076 | Val mean-roc_auc_score: 0.8267
|
| 177 |
+
2025-09-26 03:47:27,592 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0053 | Val mean-roc_auc_score: 0.8279
|
| 178 |
+
2025-09-26 03:47:36,671 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0145 | Val mean-roc_auc_score: 0.8337
|
| 179 |
+
2025-09-26 03:47:43,401 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0066 | Val mean-roc_auc_score: 0.8389
|
| 180 |
+
2025-09-26 03:47:52,518 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0064 | Val mean-roc_auc_score: 0.8256
|
| 181 |
+
2025-09-26 03:48:00,867 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0050 | Val mean-roc_auc_score: 0.8292
|
| 182 |
+
2025-09-26 03:48:09,541 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0100 | Val mean-roc_auc_score: 0.8293
|
| 183 |
+
2025-09-26 03:48:15,582 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0053 | Val mean-roc_auc_score: 0.8287
|
| 184 |
+
2025-09-26 03:48:26,022 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0041 | Val mean-roc_auc_score: 0.8290
|
| 185 |
+
2025-09-26 03:48:35,260 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0022 | Val mean-roc_auc_score: 0.8287
|
| 186 |
+
2025-09-26 03:48:42,231 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0040 | Val mean-roc_auc_score: 0.8272
|
| 187 |
+
2025-09-26 03:48:51,306 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0027 | Val mean-roc_auc_score: 0.8252
|
| 188 |
+
2025-09-26 03:49:00,201 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0049 | Val mean-roc_auc_score: 0.8325
|
| 189 |
+
2025-09-26 03:49:09,012 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0023 | Val mean-roc_auc_score: 0.8413
|
| 190 |
+
2025-09-26 03:49:15,378 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0025 | Val mean-roc_auc_score: 0.8366
|
| 191 |
+
2025-09-26 03:49:25,018 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0075 | Val mean-roc_auc_score: 0.8469
|
| 192 |
+
2025-09-26 03:49:34,559 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0046 | Val mean-roc_auc_score: 0.8371
|
| 193 |
+
2025-09-26 03:49:41,471 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0072 | Val mean-roc_auc_score: 0.8474
2025-09-26 03:49:50,774 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0023 | Val mean-roc_auc_score: 0.8543
2025-09-26 03:50:00,156 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0044 | Val mean-roc_auc_score: 0.8485
2025-09-26 03:50:10,665 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0023 | Val mean-roc_auc_score: 0.8458
2025-09-26 03:50:17,926 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0408 | Val mean-roc_auc_score: 0.7763
2025-09-26 03:50:27,343 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0676 | Val mean-roc_auc_score: 0.8378
2025-09-26 03:50:36,936 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0315 | Val mean-roc_auc_score: 0.8197
2025-09-26 03:50:43,966 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0092 | Val mean-roc_auc_score: 0.8201
2025-09-26 03:50:54,408 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0054 | Val mean-roc_auc_score: 0.8169
2025-09-26 03:51:04,873 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0014 | Val mean-roc_auc_score: 0.8228
2025-09-26 03:51:12,684 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0063 | Val mean-roc_auc_score: 0.8065
2025-09-26 03:51:23,033 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0038 | Val mean-roc_auc_score: 0.8071
2025-09-26 03:51:33,705 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0042 | Val mean-roc_auc_score: 0.8063
2025-09-26 03:51:42,097 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0037 | Val mean-roc_auc_score: 0.8062
2025-09-26 03:51:52,589 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0027 | Val mean-roc_auc_score: 0.8056
2025-09-26 03:52:01,552 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0059 | Val mean-roc_auc_score: 0.8097
2025-09-26 03:52:08,456 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0036 | Val mean-roc_auc_score: 0.8262
2025-09-26 03:52:17,881 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0028 | Val mean-roc_auc_score: 0.8244
2025-09-26 03:52:29,341 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0022 | Val mean-roc_auc_score: 0.8239
2025-09-26 03:52:39,416 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0029 | Val mean-roc_auc_score: 0.8169
2025-09-26 03:52:47,439 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0022 | Val mean-roc_auc_score: 0.8313
2025-09-26 03:52:58,200 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0022 | Val mean-roc_auc_score: 0.8277
2025-09-26 03:53:08,603 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0028 | Val mean-roc_auc_score: 0.8262
2025-09-26 03:53:16,392 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0236 | Val mean-roc_auc_score: 0.8408
2025-09-26 03:53:26,514 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0073 | Val mean-roc_auc_score: 0.8394
2025-09-26 03:53:36,675 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0034 | Val mean-roc_auc_score: 0.8378
2025-09-26 03:53:44,508 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0038 | Val mean-roc_auc_score: 0.8341
2025-09-26 03:53:55,012 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0033 | Val mean-roc_auc_score: 0.8312
2025-09-26 03:54:05,467 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0017 | Val mean-roc_auc_score: 0.8311
2025-09-26 03:54:13,166 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0024 | Val mean-roc_auc_score: 0.8303
2025-09-26 03:54:23,546 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0028 | Val mean-roc_auc_score: 0.8323
2025-09-26 03:54:34,137 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0026 | Val mean-roc_auc_score: 0.8318
2025-09-26 03:54:42,109 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0029 | Val mean-roc_auc_score: 0.8322
2025-09-26 03:54:52,363 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0020 | Val mean-roc_auc_score: 0.8317
2025-09-26 03:55:02,710 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0019 | Val mean-roc_auc_score: 0.8317
2025-09-26 03:55:09,836 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0023 | Val mean-roc_auc_score: 0.8305
2025-09-26 03:55:18,624 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0057 | Val mean-roc_auc_score: 0.8237
2025-09-26 03:55:19,359 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8680
2025-09-26 03:55:19,712 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset cocrystal at 2025-09-26_03-55-19
2025-09-26 03:55:27,619 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5980 | Val mean-roc_auc_score: 0.7956
2025-09-26 03:55:27,619 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 37
2025-09-26 03:55:28,230 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.7956
2025-09-26 03:55:36,084 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4493 | Val mean-roc_auc_score: 0.8325
2025-09-26 03:55:36,276 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 74
2025-09-26 03:55:36,857 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8325
2025-09-26 03:55:41,990 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3864 | Val mean-roc_auc_score: 0.8562
2025-09-26 03:55:42,187 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 111
2025-09-26 03:55:42,758 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val mean-roc_auc_score: 0.8562
2025-09-26 03:55:51,158 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3868 | Val mean-roc_auc_score: 0.8600
2025-09-26 03:55:51,354 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 148
2025-09-26 03:55:51,932 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val mean-roc_auc_score: 0.8600
2025-09-26 03:56:00,168 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.3412 | Val mean-roc_auc_score: 0.8645
2025-09-26 03:56:00,363 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 185
2025-09-26 03:56:00,960 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val mean-roc_auc_score: 0.8645
2025-09-26 03:56:06,701 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.3338 | Val mean-roc_auc_score: 0.8593
2025-09-26 03:56:14,293 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.3193 | Val mean-roc_auc_score: 0.8687
2025-09-26 03:56:14,495 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 259
2025-09-26 03:56:15,108 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val mean-roc_auc_score: 0.8687
2025-09-26 03:56:22,452 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.2821 | Val mean-roc_auc_score: 0.8717
2025-09-26 03:56:22,649 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 296
2025-09-26 03:56:23,233 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val mean-roc_auc_score: 0.8717
2025-09-26 03:56:31,549 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.2519 | Val mean-roc_auc_score: 0.8677
2025-09-26 03:56:37,646 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.2297 | Val mean-roc_auc_score: 0.8820
2025-09-26 03:56:37,851 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Global step of best model: 370
2025-09-26 03:56:38,463 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Best model saved at epoch 10 with val mean-roc_auc_score: 0.8820
2025-09-26 03:56:47,833 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.2723 | Val mean-roc_auc_score: 0.8631
2025-09-26 03:56:57,951 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.2044 | Val mean-roc_auc_score: 0.8515
2025-09-26 03:57:05,156 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.1731 | Val mean-roc_auc_score: 0.8528
2025-09-26 03:57:14,178 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.1710 | Val mean-roc_auc_score: 0.8423
2025-09-26 03:57:23,414 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.1579 | Val mean-roc_auc_score: 0.8332
2025-09-26 03:57:32,812 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.1596 | Val mean-roc_auc_score: 0.8064
2025-09-26 03:57:40,459 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.1627 | Val mean-roc_auc_score: 0.8544
2025-09-26 03:57:50,316 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.1309 | Val mean-roc_auc_score: 0.8216
2025-09-26 03:57:59,787 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0684 | Val mean-roc_auc_score: 0.8104
2025-09-26 03:58:07,146 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0861 | Val mean-roc_auc_score: 0.8010
2025-09-26 03:58:16,956 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0492 | Val mean-roc_auc_score: 0.8030
2025-09-26 03:58:27,826 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.1942 | Val mean-roc_auc_score: 0.8417
2025-09-26 03:58:35,864 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.1250 | Val mean-roc_auc_score: 0.8197
2025-09-26 03:58:45,815 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0887 | Val mean-roc_auc_score: 0.8234
2025-09-26 03:58:54,619 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0666 | Val mean-roc_auc_score: 0.8505
2025-09-26 03:59:03,384 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0562 | Val mean-roc_auc_score: 0.8176
2025-09-26 03:59:11,454 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0760 | Val mean-roc_auc_score: 0.8407
2025-09-26 03:59:21,993 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0373 | Val mean-roc_auc_score: 0.8241
2025-09-26 03:59:31,845 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0263 | Val mean-roc_auc_score: 0.8206
2025-09-26 03:59:39,261 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0197 | Val mean-roc_auc_score: 0.8112
2025-09-26 03:59:49,368 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0141 | Val mean-roc_auc_score: 0.8093
2025-09-26 04:00:00,319 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0081 | Val mean-roc_auc_score: 0.8046
2025-09-26 04:00:08,211 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0124 | Val mean-roc_auc_score: 0.7984
2025-09-26 04:00:17,978 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0484 | Val mean-roc_auc_score: 0.8401
2025-09-26 04:00:27,946 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0515 | Val mean-roc_auc_score: 0.8290
2025-09-26 04:00:35,232 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0244 | Val mean-roc_auc_score: 0.8115
2025-09-26 04:00:45,789 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0187 | Val mean-roc_auc_score: 0.8072
2025-09-26 04:00:56,025 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0124 | Val mean-roc_auc_score: 0.8034
2025-09-26 04:01:03,272 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0072 | Val mean-roc_auc_score: 0.8026
2025-09-26 04:01:12,967 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0118 | Val mean-roc_auc_score: 0.8180
2025-09-26 04:01:22,669 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0183 | Val mean-roc_auc_score: 0.8210
2025-09-26 04:01:33,004 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0131 | Val mean-roc_auc_score: 0.8210
2025-09-26 04:01:40,528 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0115 | Val mean-roc_auc_score: 0.8069
2025-09-26 04:01:50,245 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0848 | Val mean-roc_auc_score: 0.7798
2025-09-26 04:02:00,090 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.1014 | Val mean-roc_auc_score: 0.8189
2025-09-26 04:02:07,515 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0247 | Val mean-roc_auc_score: 0.8259
2025-09-26 04:02:17,958 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0186 | Val mean-roc_auc_score: 0.8301
2025-09-26 04:02:28,198 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0168 | Val mean-roc_auc_score: 0.8282
2025-09-26 04:02:35,521 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0153 | Val mean-roc_auc_score: 0.8204
2025-09-26 04:02:45,541 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0094 | Val mean-roc_auc_score: 0.8209
2025-09-26 04:02:55,353 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0100 | Val mean-roc_auc_score: 0.8216
2025-09-26 04:03:04,429 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0118 | Val mean-roc_auc_score: 0.8271
2025-09-26 04:03:10,806 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0078 | Val mean-roc_auc_score: 0.8207
2025-09-26 04:03:19,175 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0051 | Val mean-roc_auc_score: 0.8188
2025-09-26 04:03:30,069 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0040 | Val mean-roc_auc_score: 0.8197
2025-09-26 04:03:37,390 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0030 | Val mean-roc_auc_score: 0.8194
2025-09-26 04:03:47,687 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0020 | Val mean-roc_auc_score: 0.8188
2025-09-26 04:03:57,909 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0053 | Val mean-roc_auc_score: 0.8203
2025-09-26 04:04:05,217 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0046 | Val mean-roc_auc_score: 0.8217
2025-09-26 04:04:15,095 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0017 | Val mean-roc_auc_score: 0.8231
2025-09-26 04:04:25,109 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0021 | Val mean-roc_auc_score: 0.8230
2025-09-26 04:04:33,412 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0027 | Val mean-roc_auc_score: 0.8234
2025-09-26 04:04:43,521 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0173 | Val mean-roc_auc_score: 0.8283
2025-09-26 04:04:53,356 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0181 | Val mean-roc_auc_score: 0.8287
2025-09-26 04:05:03,205 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0029 | Val mean-roc_auc_score: 0.8294
2025-09-26 04:05:10,849 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0079 | Val mean-roc_auc_score: 0.8259
2025-09-26 04:05:21,501 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0068 | Val mean-roc_auc_score: 0.8249
2025-09-26 04:05:31,660 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0035 | Val mean-roc_auc_score: 0.8247
2025-09-26 04:05:38,997 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0024 | Val mean-roc_auc_score: 0.8245
2025-09-26 04:05:48,872 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0151 | Val mean-roc_auc_score: 0.8242
2025-09-26 04:05:58,798 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0310 | Val mean-roc_auc_score: 0.8228
2025-09-26 04:06:06,967 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0160 | Val mean-roc_auc_score: 0.8297
2025-09-26 04:06:17,494 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0044 | Val mean-roc_auc_score: 0.8232
2025-09-26 04:06:27,334 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0045 | Val mean-roc_auc_score: 0.8224
2025-09-26 04:06:34,982 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0054 | Val mean-roc_auc_score: 0.8191
2025-09-26 04:06:45,040 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0094 | Val mean-roc_auc_score: 0.8149
2025-09-26 04:06:55,604 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0074 | Val mean-roc_auc_score: 0.8111
2025-09-26 04:07:03,413 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0100 | Val mean-roc_auc_score: 0.8155
2025-09-26 04:07:11,750 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0113 | Val mean-roc_auc_score: 0.8147
2025-09-26 04:07:20,430 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0039 | Val mean-roc_auc_score: 0.8179
2025-09-26 04:07:29,099 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0030 | Val mean-roc_auc_score: 0.8181
2025-09-26 04:07:38,022 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0048 | Val mean-roc_auc_score: 0.8242
2025-09-26 04:07:48,298 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0021 | Val mean-roc_auc_score: 0.8226
2025-09-26 04:07:58,059 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0019 | Val mean-roc_auc_score: 0.8266
2025-09-26 04:08:05,362 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0019 | Val mean-roc_auc_score: 0.8262
2025-09-26 04:08:15,319 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0015 | Val mean-roc_auc_score: 0.8259
2025-09-26 04:08:26,140 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0017 | Val mean-roc_auc_score: 0.8257
2025-09-26 04:08:34,134 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0016 | Val mean-roc_auc_score: 0.8254
2025-09-26 04:08:44,108 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0015 | Val mean-roc_auc_score: 0.8252
2025-09-26 04:08:53,966 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0057 | Val mean-roc_auc_score: 0.8162
2025-09-26 04:09:01,456 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0277 | Val mean-roc_auc_score: 0.8258
2025-09-26 04:09:12,155 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0095 | Val mean-roc_auc_score: 0.8295
2025-09-26 04:09:22,417 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0031 | Val mean-roc_auc_score: 0.8294
2025-09-26 04:09:29,690 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0024 | Val mean-roc_auc_score: 0.8306
2025-09-26 04:09:39,734 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0030 | Val mean-roc_auc_score: 0.8309
2025-09-26 04:09:49,682 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0023 | Val mean-roc_auc_score: 0.8311
2025-09-26 04:10:00,308 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0031 | Val mean-roc_auc_score: 0.8335
2025-09-26 04:10:03,328 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0024 | Val mean-roc_auc_score: 0.8331
2025-09-26 04:10:08,431 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0021 | Val mean-roc_auc_score: 0.8333
2025-09-26 04:10:13,638 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0018 | Val mean-roc_auc_score: 0.8333
2025-09-26 04:10:14,275 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8548
2025-09-26 04:10:14,589 - logs_modchembert_cocrystal_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg mean-roc_auc_score: 0.8550, Std Dev: 0.0106
logs_modchembert_classification_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_covid19_epochs100_batch_size32_20250925_210846.log
ADDED
@@ -0,0 +1,327 @@
2025-09-25 21:08:46,585 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Running benchmark for dataset: covid19
|
| 2 |
+
2025-09-25 21:08:46,585 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - dataset: covid19, tasks: ['label'], epochs: 100, learning rate: 3e-05
|
| 3 |
+
2025-09-25 21:08:46,589 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset covid19 at 2025-09-25_21-08-46
|
| 4 |
+
2025-09-25 21:08:52,856 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5308 | Val mean-roc_auc_score: 0.8266
|
| 5 |
+
2025-09-25 21:08:52,856 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 65
|
| 6 |
+
2025-09-25 21:08:53,897 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.8266
|
| 7 |
+
2025-09-25 21:09:04,157 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4146 | Val mean-roc_auc_score: 0.8431
|
| 8 |
+
2025-09-25 21:09:04,520 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 130
|
| 9 |
+
2025-09-25 21:09:05,249 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8431
|
| 10 |
+
2025-09-25 21:09:15,245 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3596 | Val mean-roc_auc_score: 0.8176
|
| 11 |
+
2025-09-25 21:09:23,797 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2958 | Val mean-roc_auc_score: 0.8365
|
| 12 |
+
2025-09-25 21:09:34,156 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2412 | Val mean-roc_auc_score: 0.8175
|
| 13 |
+
2025-09-25 21:09:42,781 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1827 | Val mean-roc_auc_score: 0.8138
|
| 14 |
+
2025-09-25 21:09:53,827 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1205 | Val mean-roc_auc_score: 0.7970
|
| 15 |
+
2025-09-25 21:10:04,973 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0941 | Val mean-roc_auc_score: 0.8017
|
| 16 |
+
2025-09-25 21:10:12,737 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1091 | Val mean-roc_auc_score: 0.8158
|
| 17 |
+
2025-09-25 21:10:23,598 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0762 | Val mean-roc_auc_score: 0.8104
|
| 18 |
+
2025-09-25 21:10:34,336 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.1193 | Val mean-roc_auc_score: 0.8119
|
| 19 |
+
2025-09-25 21:10:42,163 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0582 | Val mean-roc_auc_score: 0.8147
|
| 20 |
+
2025-09-25 21:10:53,196 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0552 | Val mean-roc_auc_score: 0.8149
|
| 21 |
+
2025-09-25 21:11:04,194 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0336 | Val mean-roc_auc_score: 0.8169
|
| 22 |
+
2025-09-25 21:11:11,078 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0406 | Val mean-roc_auc_score: 0.8218
|
| 23 |
+
2025-09-25 21:11:22,902 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0338 | Val mean-roc_auc_score: 0.8177
|
| 24 |
+
2025-09-25 21:11:30,680 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0132 | Val mean-roc_auc_score: 0.8270
|
| 25 |
+
2025-09-25 21:11:41,447 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0296 | Val mean-roc_auc_score: 0.8199
|
| 26 |
+
2025-09-25 21:11:52,485 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0278 | Val mean-roc_auc_score: 0.8185
|
| 27 |
+
2025-09-25 21:12:01,107 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0287 | Val mean-roc_auc_score: 0.8165
|
| 28 |
+
2025-09-25 21:12:12,066 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0416 | Val mean-roc_auc_score: 0.8257
|
| 29 |
+
2025-09-25 21:12:23,269 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0557 | Val mean-roc_auc_score: 0.8277
|
| 30 |
+
2025-09-25 21:12:32,205 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0582 | Val mean-roc_auc_score: 0.8083
|
| 31 |
+
2025-09-25 21:12:43,398 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0516 | Val mean-roc_auc_score: 0.7988
|
| 32 |
+
2025-09-25 21:12:54,076 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0344 | Val mean-roc_auc_score: 0.8164
|
| 33 |
+
2025-09-25 21:13:04,818 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0264 | Val mean-roc_auc_score: 0.8189
|
| 34 |
+
2025-09-25 21:13:12,333 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0226 | Val mean-roc_auc_score: 0.8232
|
| 35 |
+
2025-09-25 21:13:21,702 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0301 | Val mean-roc_auc_score: 0.8244
|
| 36 |
+
2025-09-25 21:13:32,801 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0206 | Val mean-roc_auc_score: 0.8264
|
| 37 |
+
2025-09-25 21:13:43,270 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0227 | Val mean-roc_auc_score: 0.8264
|
| 38 |
+
2025-09-25 21:13:52,042 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0207 | Val mean-roc_auc_score: 0.8330
|
| 39 |
+
2025-09-25 21:14:03,418 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0272 | Val mean-roc_auc_score: 0.8219
|
| 40 |
+
2025-09-25 21:14:13,563 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0264 | Val mean-roc_auc_score: 0.8227
|
| 41 |
+
2025-09-25 21:14:21,223 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0208 | Val mean-roc_auc_score: 0.8233
|
| 42 |
+
2025-09-25 21:14:32,047 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0206 | Val mean-roc_auc_score: 0.8250
2025-09-25 21:14:43,170 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0234 | Val mean-roc_auc_score: 0.8304
2025-09-25 21:14:52,198 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0159 | Val mean-roc_auc_score: 0.8272
2025-09-25 21:15:03,064 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0225 | Val mean-roc_auc_score: 0.8230
2025-09-25 21:15:11,449 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0231 | Val mean-roc_auc_score: 0.8230
2025-09-25 21:15:21,783 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0179 | Val mean-roc_auc_score: 0.8227
2025-09-25 21:15:32,241 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0192 | Val mean-roc_auc_score: 0.8261
2025-09-25 21:15:40,417 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0204 | Val mean-roc_auc_score: 0.8215
2025-09-25 21:15:50,375 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0179 | Val mean-roc_auc_score: 0.8272
2025-09-25 21:16:01,313 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0191 | Val mean-roc_auc_score: 0.8254
2025-09-25 21:16:09,191 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0130 | Val mean-roc_auc_score: 0.8276
2025-09-25 21:16:19,977 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0171 | Val mean-roc_auc_score: 0.8306
2025-09-25 21:16:32,410 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0192 | Val mean-roc_auc_score: 0.8274
2025-09-25 21:16:39,941 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0201 | Val mean-roc_auc_score: 0.8277
2025-09-25 21:16:51,150 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0275 | Val mean-roc_auc_score: 0.8283
2025-09-25 21:16:59,781 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0252 | Val mean-roc_auc_score: 0.8255
2025-09-25 21:17:10,886 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0411 | Val mean-roc_auc_score: 0.8425
2025-09-25 21:17:22,185 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0329 | Val mean-roc_auc_score: 0.8203
2025-09-25 21:17:29,640 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0332 | Val mean-roc_auc_score: 0.8230
2025-09-25 21:17:40,297 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0184 | Val mean-roc_auc_score: 0.8248
2025-09-25 21:17:51,283 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0207 | Val mean-roc_auc_score: 0.8217
2025-09-25 21:17:59,367 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0192 | Val mean-roc_auc_score: 0.8240
2025-09-25 21:18:11,317 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0116 | Val mean-roc_auc_score: 0.8269
2025-09-25 21:18:19,832 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0166 | Val mean-roc_auc_score: 0.8234
2025-09-25 21:18:31,107 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0148 | Val mean-roc_auc_score: 0.8275
2025-09-25 21:18:41,816 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0156 | Val mean-roc_auc_score: 0.8268
2025-09-25 21:18:50,467 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0156 | Val mean-roc_auc_score: 0.8287
2025-09-25 21:19:03,048 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0185 | Val mean-roc_auc_score: 0.8305
2025-09-25 21:19:14,297 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0180 | Val mean-roc_auc_score: 0.8294
2025-09-25 21:19:22,603 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0193 | Val mean-roc_auc_score: 0.8277
2025-09-25 21:19:33,525 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0094 | Val mean-roc_auc_score: 0.8294
2025-09-25 21:19:41,918 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0161 | Val mean-roc_auc_score: 0.8290
2025-09-25 21:19:53,230 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0190 | Val mean-roc_auc_score: 0.8341
2025-09-25 21:20:04,415 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0144 | Val mean-roc_auc_score: 0.8325
2025-09-25 21:20:12,893 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0531 | Val mean-roc_auc_score: 0.8155
2025-09-25 21:20:24,788 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0434 | Val mean-roc_auc_score: 0.8303
2025-09-25 21:20:35,793 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0419 | Val mean-roc_auc_score: 0.8360
2025-09-25 21:20:46,537 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0228 | Val mean-roc_auc_score: 0.8316
2025-09-25 21:20:58,799 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0193 | Val mean-roc_auc_score: 0.8319
2025-09-25 21:21:09,460 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0129 | Val mean-roc_auc_score: 0.8326
2025-09-25 21:21:17,291 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0175 | Val mean-roc_auc_score: 0.8312
2025-09-25 21:21:28,756 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0174 | Val mean-roc_auc_score: 0.8324
2025-09-25 21:21:40,934 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0190 | Val mean-roc_auc_score: 0.8346
2025-09-25 21:21:51,572 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0166 | Val mean-roc_auc_score: 0.8358
2025-09-25 21:22:01,760 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0190 | Val mean-roc_auc_score: 0.8352
2025-09-25 21:22:12,844 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0166 | Val mean-roc_auc_score: 0.8341
2025-09-25 21:22:20,341 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0177 | Val mean-roc_auc_score: 0.8360
2025-09-25 21:22:30,915 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0151 | Val mean-roc_auc_score: 0.8363
2025-09-25 21:22:41,542 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0167 | Val mean-roc_auc_score: 0.8355
2025-09-25 21:22:52,677 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0171 | Val mean-roc_auc_score: 0.8363
2025-09-25 21:23:00,470 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0128 | Val mean-roc_auc_score: 0.8352
2025-09-25 21:23:11,498 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0142 | Val mean-roc_auc_score: 0.8343
2025-09-25 21:23:23,015 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8348
2025-09-25 21:23:33,776 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0173 | Val mean-roc_auc_score: 0.8349
2025-09-25 21:23:40,965 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0155 | Val mean-roc_auc_score: 0.8357
2025-09-25 21:23:50,459 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0162 | Val mean-roc_auc_score: 0.8356
2025-09-25 21:24:00,661 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0132 | Val mean-roc_auc_score: 0.8352
2025-09-25 21:24:12,297 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0155 | Val mean-roc_auc_score: 0.8355
2025-09-25 21:24:24,027 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0155 | Val mean-roc_auc_score: 0.8350
2025-09-25 21:24:31,042 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0160 | Val mean-roc_auc_score: 0.8309
2025-09-25 21:24:41,886 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0183 | Val mean-roc_auc_score: 0.8366
2025-09-25 21:24:51,549 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0158 | Val mean-roc_auc_score: 0.8346
2025-09-25 21:25:02,982 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0190 | Val mean-roc_auc_score: 0.8343
2025-09-25 21:25:11,675 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0141 | Val mean-roc_auc_score: 0.8353
2025-09-25 21:25:22,796 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0157 | Val mean-roc_auc_score: 0.8343
2025-09-25 21:25:33,733 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0165 | Val mean-roc_auc_score: 0.8355
2025-09-25 21:25:34,437 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8024
2025-09-25 21:25:34,892 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset covid19 at 2025-09-25_21-25-34
2025-09-25 21:25:41,812 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5115 | Val mean-roc_auc_score: 0.8227
2025-09-25 21:25:41,812 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 65
2025-09-25 21:25:42,783 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.8227
2025-09-25 21:25:52,920 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4000 | Val mean-roc_auc_score: 0.8382
2025-09-25 21:25:53,120 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 130
2025-09-25 21:25:53,761 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8382
2025-09-25 21:26:04,344 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3519 | Val mean-roc_auc_score: 0.8351
2025-09-25 21:26:14,575 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2896 | Val mean-roc_auc_score: 0.8280
2025-09-25 21:26:21,016 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2762 | Val mean-roc_auc_score: 0.8173
2025-09-25 21:26:31,082 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1673 | Val mean-roc_auc_score: 0.8253
2025-09-25 21:26:40,744 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1267 | Val mean-roc_auc_score: 0.8241
2025-09-25 21:26:51,033 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0996 | Val mean-roc_auc_score: 0.8160
2025-09-25 21:26:58,879 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0683 | Val mean-roc_auc_score: 0.8122
2025-09-25 21:27:09,339 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0778 | Val mean-roc_auc_score: 0.8327
2025-09-25 21:27:19,940 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0740 | Val mean-roc_auc_score: 0.8076
2025-09-25 21:27:28,495 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0435 | Val mean-roc_auc_score: 0.8129
2025-09-25 21:27:39,041 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0347 | Val mean-roc_auc_score: 0.8117
2025-09-25 21:27:50,303 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0488 | Val mean-roc_auc_score: 0.8067
2025-09-25 21:28:00,116 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0358 | Val mean-roc_auc_score: 0.8042
2025-09-25 21:28:08,731 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0254 | Val mean-roc_auc_score: 0.8219
2025-09-25 21:28:20,199 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0473 | Val mean-roc_auc_score: 0.8194
2025-09-25 21:28:29,532 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0445 | Val mean-roc_auc_score: 0.8198
2025-09-25 21:28:40,212 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0429 | Val mean-roc_auc_score: 0.8091
2025-09-25 21:28:48,155 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0517 | Val mean-roc_auc_score: 0.8102
2025-09-25 21:28:58,839 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0697 | Val mean-roc_auc_score: 0.8239
2025-09-25 21:29:10,252 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0311 | Val mean-roc_auc_score: 0.8207
2025-09-25 21:29:18,525 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0268 | Val mean-roc_auc_score: 0.8243
2025-09-25 21:29:29,048 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0241 | Val mean-roc_auc_score: 0.8260
2025-09-25 21:29:39,602 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0295 | Val mean-roc_auc_score: 0.8226
2025-09-25 21:29:47,930 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0230 | Val mean-roc_auc_score: 0.8187
2025-09-25 21:29:59,334 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0213 | Val mean-roc_auc_score: 0.8237
2025-09-25 21:30:09,851 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0346 | Val mean-roc_auc_score: 0.8237
2025-09-25 21:30:19,610 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0293 | Val mean-roc_auc_score: 0.8369
2025-09-25 21:30:26,112 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0444 | Val mean-roc_auc_score: 0.8390
2025-09-25 21:30:26,284 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 1950
2025-09-25 21:30:26,977 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 30 with val mean-roc_auc_score: 0.8390
2025-09-25 21:30:38,890 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0273 | Val mean-roc_auc_score: 0.8221
2025-09-25 21:30:51,077 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0264 | Val mean-roc_auc_score: 0.8247
2025-09-25 21:31:02,018 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0363 | Val mean-roc_auc_score: 0.8235
2025-09-25 21:31:10,175 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0326 | Val mean-roc_auc_score: 0.8214
2025-09-25 21:31:21,148 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0204 | Val mean-roc_auc_score: 0.8277
2025-09-25 21:31:31,912 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0171 | Val mean-roc_auc_score: 0.8233
2025-09-25 21:31:40,265 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0161 | Val mean-roc_auc_score: 0.8256
2025-09-25 21:31:50,848 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0175 | Val mean-roc_auc_score: 0.8242
2025-09-25 21:31:58,518 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0230 | Val mean-roc_auc_score: 0.8237
2025-09-25 21:32:09,050 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0187 | Val mean-roc_auc_score: 0.8232
2025-09-25 21:32:20,363 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0171 | Val mean-roc_auc_score: 0.8246
2025-09-25 21:32:29,404 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0223 | Val mean-roc_auc_score: 0.8283
2025-09-25 21:32:40,198 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0180 | Val mean-roc_auc_score: 0.8243
2025-09-25 21:32:51,540 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0182 | Val mean-roc_auc_score: 0.8241
2025-09-25 21:32:59,012 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0126 | Val mean-roc_auc_score: 0.8236
2025-09-25 21:33:09,626 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0178 | Val mean-roc_auc_score: 0.8244
2025-09-25 21:33:21,960 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0152 | Val mean-roc_auc_score: 0.8242
2025-09-25 21:33:32,302 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0180 | Val mean-roc_auc_score: 0.8257
2025-09-25 21:33:38,763 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0167 | Val mean-roc_auc_score: 0.8255
2025-09-25 21:33:47,969 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0177 | Val mean-roc_auc_score: 0.8258
2025-09-25 21:33:58,467 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0168 | Val mean-roc_auc_score: 0.8262
2025-09-25 21:34:09,714 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0171 | Val mean-roc_auc_score: 0.8233
2025-09-25 21:34:18,363 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0171 | Val mean-roc_auc_score: 0.8244
2025-09-25 21:34:29,194 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0164 | Val mean-roc_auc_score: 0.8259
2025-09-25 21:34:40,435 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0148 | Val mean-roc_auc_score: 0.8242
2025-09-25 21:34:49,453 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0134 | Val mean-roc_auc_score: 0.8219
2025-09-25 21:35:01,530 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0226 | Val mean-roc_auc_score: 0.8226
2025-09-25 21:35:09,989 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0150 | Val mean-roc_auc_score: 0.8229
2025-09-25 21:35:20,664 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0125 | Val mean-roc_auc_score: 0.8237
2025-09-25 21:35:31,834 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0151 | Val mean-roc_auc_score: 0.8234
2025-09-25 21:35:39,760 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0145 | Val mean-roc_auc_score: 0.8234
2025-09-25 21:35:52,419 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0161 | Val mean-roc_auc_score: 0.8238
2025-09-25 21:36:03,383 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0148 | Val mean-roc_auc_score: 0.8242
2025-09-25 21:36:13,900 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0161 | Val mean-roc_auc_score: 0.8241
2025-09-25 21:36:21,689 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0216 | Val mean-roc_auc_score: 0.8263
2025-09-25 21:36:31,694 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0163 | Val mean-roc_auc_score: 0.8236
2025-09-25 21:36:43,233 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0168 | Val mean-roc_auc_score: 0.8272
2025-09-25 21:36:54,093 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0258 | Val mean-roc_auc_score: 0.8236
2025-09-25 21:37:01,679 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0210 | Val mean-roc_auc_score: 0.8168
2025-09-25 21:37:12,049 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0161 | Val mean-roc_auc_score: 0.8198
2025-09-25 21:37:22,790 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0120 | Val mean-roc_auc_score: 0.8219
2025-09-25 21:37:31,189 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0145 | Val mean-roc_auc_score: 0.8231
2025-09-25 21:37:42,170 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0143 | Val mean-roc_auc_score: 0.8228
2025-09-25 21:37:52,154 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0094 | Val mean-roc_auc_score: 0.8219
2025-09-25 21:38:00,078 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0156 | Val mean-roc_auc_score: 0.8232
2025-09-25 21:38:10,893 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0191 | Val mean-roc_auc_score: 0.8219
2025-09-25 21:38:22,916 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0150 | Val mean-roc_auc_score: 0.8208
2025-09-25 21:38:33,573 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0146 | Val mean-roc_auc_score: 0.8216
2025-09-25 21:38:40,502 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0116 | Val mean-roc_auc_score: 0.8242
2025-09-25 21:38:49,040 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0145 | Val mean-roc_auc_score: 0.8251
2025-09-25 21:39:01,514 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0151 | Val mean-roc_auc_score: 0.8266
2025-09-25 21:39:14,179 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0107 | Val mean-roc_auc_score: 0.8247
2025-09-25 21:39:23,748 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0162 | Val mean-roc_auc_score: 0.8215
2025-09-25 21:39:35,945 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8226
2025-09-25 21:39:46,333 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0209 | Val mean-roc_auc_score: 0.8233
2025-09-25 21:39:58,646 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0150 | Val mean-roc_auc_score: 0.8228
2025-09-25 21:40:12,070 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0150 | Val mean-roc_auc_score: 0.8227
2025-09-25 21:40:22,197 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0196 | Val mean-roc_auc_score: 0.8231
2025-09-25 21:40:34,453 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0168 | Val mean-roc_auc_score: 0.8243
2025-09-25 21:40:44,560 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8244
2025-09-25 21:40:55,989 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8242
2025-09-25 21:41:06,472 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0163 | Val mean-roc_auc_score: 0.8253
2025-09-25 21:41:20,841 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8234
2025-09-25 21:41:32,726 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0204 | Val mean-roc_auc_score: 0.8248
2025-09-25 21:41:44,562 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0156 | Val mean-roc_auc_score: 0.8215
2025-09-25 21:41:53,713 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0153 | Val mean-roc_auc_score: 0.8245
2025-09-25 21:42:05,830 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0037 | Val mean-roc_auc_score: 0.8255
2025-09-25 21:42:18,160 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0150 | Val mean-roc_auc_score: 0.8259
2025-09-25 21:42:26,165 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8272
2025-09-25 21:42:37,389 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0146 | Val mean-roc_auc_score: 0.8249
2025-09-25 21:42:38,359 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.8152
2025-09-25 21:42:38,860 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset covid19 at 2025-09-25_21-42-38
2025-09-25 21:42:47,959 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5231 | Val mean-roc_auc_score: 0.8278
2025-09-25 21:42:47,959 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 65
2025-09-25 21:42:49,023 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val mean-roc_auc_score: 0.8278
2025-09-25 21:42:56,770 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4437 | Val mean-roc_auc_score: 0.8320
2025-09-25 21:42:56,978 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 130
2025-09-25 21:42:57,635 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val mean-roc_auc_score: 0.8320
2025-09-25 21:43:07,984 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3635 | Val mean-roc_auc_score: 0.8377
2025-09-25 21:43:08,234 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 195
2025-09-25 21:43:08,999 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val mean-roc_auc_score: 0.8377
2025-09-25 21:43:19,927 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3000 | Val mean-roc_auc_score: 0.8271
2025-09-25 21:43:28,059 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2900 | Val mean-roc_auc_score: 0.8327
2025-09-25 21:43:38,516 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1712 | Val mean-roc_auc_score: 0.7696
2025-09-25 21:43:49,210 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1727 | Val mean-roc_auc_score: 0.8369
2025-09-25 21:43:56,662 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1586 | Val mean-roc_auc_score: 0.8394
2025-09-25 21:43:56,834 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Global step of best model: 520
2025-09-25 21:43:57,676 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val mean-roc_auc_score: 0.8394
2025-09-25 21:44:08,211 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0764 | Val mean-roc_auc_score: 0.8357
2025-09-25 21:44:16,328 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0691 | Val mean-roc_auc_score: 0.8134
2025-09-25 21:44:27,824 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0589 | Val mean-roc_auc_score: 0.8265
2025-09-25 21:44:39,953 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0459 | Val mean-roc_auc_score: 0.8259
2025-09-25 21:44:48,657 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0441 | Val mean-roc_auc_score: 0.8131
2025-09-25 21:45:00,043 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0504 | Val mean-roc_auc_score: 0.8170
2025-09-25 21:45:10,529 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0454 | Val mean-roc_auc_score: 0.8267
2025-09-25 21:45:19,840 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0406 | Val mean-roc_auc_score: 0.8243
2025-09-25 21:45:31,194 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0727 | Val mean-roc_auc_score: 0.8188
2025-09-25 21:45:39,249 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0454 | Val mean-roc_auc_score: 0.8153
|
| 244 |
+
2025-09-25 21:45:49,788 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0348 | Val mean-roc_auc_score: 0.8212
|
| 245 |
+
2025-09-25 21:46:00,742 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0292 | Val mean-roc_auc_score: 0.8183
|
| 246 |
+
2025-09-25 21:46:08,794 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0252 | Val mean-roc_auc_score: 0.8199
|
| 247 |
+
2025-09-25 21:46:20,471 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0243 | Val mean-roc_auc_score: 0.8236
|
| 248 |
+
2025-09-25 21:46:31,546 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0251 | Val mean-roc_auc_score: 0.8144
|
| 249 |
+
2025-09-25 21:46:40,027 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0286 | Val mean-roc_auc_score: 0.8025
|
| 250 |
+
2025-09-25 21:46:50,228 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0284 | Val mean-roc_auc_score: 0.8133
|
| 251 |
+
2025-09-25 21:47:01,231 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0341 | Val mean-roc_auc_score: 0.8233
|
| 252 |
+
2025-09-25 21:47:10,364 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0494 | Val mean-roc_auc_score: 0.8296
|
| 253 |
+
2025-09-25 21:47:21,237 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0239 | Val mean-roc_auc_score: 0.8296
|
| 254 |
+
2025-09-25 21:47:29,284 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0246 | Val mean-roc_auc_score: 0.8214
|
| 255 |
+
2025-09-25 21:47:39,530 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0666 | Val mean-roc_auc_score: 0.8257
|
| 256 |
+
2025-09-25 21:47:51,032 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0385 | Val mean-roc_auc_score: 0.8308
|
| 257 |
+
2025-09-25 21:47:59,706 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0231 | Val mean-roc_auc_score: 0.8210
|
| 258 |
+
2025-09-25 21:48:10,922 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0219 | Val mean-roc_auc_score: 0.8246
|
| 259 |
+
2025-09-25 21:48:21,514 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0226 | Val mean-roc_auc_score: 0.8270
|
| 260 |
+
2025-09-25 21:48:29,524 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0202 | Val mean-roc_auc_score: 0.8221
|
| 261 |
+
2025-09-25 21:48:40,793 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0237 | Val mean-roc_auc_score: 0.8316
|
| 262 |
+
2025-09-25 21:48:51,040 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0436 | Val mean-roc_auc_score: 0.8259
|
| 263 |
+
2025-09-25 21:49:02,936 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0218 | Val mean-roc_auc_score: 0.8241
|
| 264 |
+
2025-09-25 21:49:14,785 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0272 | Val mean-roc_auc_score: 0.8248
|
| 265 |
+
2025-09-25 21:49:22,715 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0302 | Val mean-roc_auc_score: 0.8090
|
| 266 |
+
2025-09-25 21:49:33,563 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0413 | Val mean-roc_auc_score: 0.8220
|
| 267 |
+
2025-09-25 21:49:44,542 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0298 | Val mean-roc_auc_score: 0.8173
|
| 268 |
+
2025-09-25 21:49:53,193 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0243 | Val mean-roc_auc_score: 0.8131
|
| 269 |
+
2025-09-25 21:50:04,117 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0246 | Val mean-roc_auc_score: 0.8202
|
| 270 |
+
2025-09-25 21:50:11,701 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0200 | Val mean-roc_auc_score: 0.8288
|
| 271 |
+
2025-09-25 21:50:22,315 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0203 | Val mean-roc_auc_score: 0.8276
|
| 272 |
+
2025-09-25 21:50:34,935 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0189 | Val mean-roc_auc_score: 0.8255
|
| 273 |
+
2025-09-25 21:50:45,121 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0143 | Val mean-roc_auc_score: 0.8243
|
| 274 |
+
2025-09-25 21:50:51,783 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0194 | Val mean-roc_auc_score: 0.8257
|
| 275 |
+
2025-09-25 21:51:01,815 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0152 | Val mean-roc_auc_score: 0.8268
|
| 276 |
+
2025-09-25 21:51:12,225 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0271 | Val mean-roc_auc_score: 0.8264
|
| 277 |
+
2025-09-25 21:51:23,705 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0179 | Val mean-roc_auc_score: 0.8220
|
| 278 |
+
2025-09-25 21:51:31,928 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0205 | Val mean-roc_auc_score: 0.8250
|
| 279 |
+
2025-09-25 21:51:42,600 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0211 | Val mean-roc_auc_score: 0.8236
|
| 280 |
+
2025-09-25 21:51:52,912 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0144 | Val mean-roc_auc_score: 0.8220
|
| 281 |
+
2025-09-25 21:52:01,294 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0186 | Val mean-roc_auc_score: 0.8247
|
| 282 |
+
2025-09-25 21:52:14,712 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0137 | Val mean-roc_auc_score: 0.8244
|
| 283 |
+
2025-09-25 21:52:26,492 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0189 | Val mean-roc_auc_score: 0.8219
|
| 284 |
+
2025-09-25 21:52:34,686 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0200 | Val mean-roc_auc_score: 0.8289
|
| 285 |
+
2025-09-25 21:52:45,281 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0202 | Val mean-roc_auc_score: 0.8277
|
| 286 |
+
2025-09-25 21:52:53,362 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8242
|
| 287 |
+
2025-09-25 21:53:07,898 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0161 | Val mean-roc_auc_score: 0.8224
|
| 288 |
+
2025-09-25 21:53:18,857 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0160 | Val mean-roc_auc_score: 0.8243
|
| 289 |
+
2025-09-25 21:53:27,378 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0184 | Val mean-roc_auc_score: 0.8205
|
| 290 |
+
2025-09-25 21:53:38,005 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0173 | Val mean-roc_auc_score: 0.8227
|
| 291 |
+
2025-09-25 21:53:48,594 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0165 | Val mean-roc_auc_score: 0.8214
|
| 292 |
+
2025-09-25 21:53:56,962 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0143 | Val mean-roc_auc_score: 0.8249
|
| 293 |
+
2025-09-25 21:54:08,374 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0184 | Val mean-roc_auc_score: 0.8236
|
| 294 |
+
2025-09-25 21:54:17,164 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0163 | Val mean-roc_auc_score: 0.8237
|
| 295 |
+
2025-09-25 21:54:28,276 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0178 | Val mean-roc_auc_score: 0.8231
|
| 296 |
+
2025-09-25 21:54:39,245 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0236 | Val mean-roc_auc_score: 0.8227
|
| 297 |
+
2025-09-25 21:54:48,047 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8215
|
| 298 |
+
2025-09-25 21:54:59,357 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0170 | Val mean-roc_auc_score: 0.8225
|
| 299 |
+
2025-09-25 21:55:10,375 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0254 | Val mean-roc_auc_score: 0.8241
|
| 300 |
+
2025-09-25 21:55:18,168 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0167 | Val mean-roc_auc_score: 0.8201
|
| 301 |
+
2025-09-25 21:55:29,336 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0141 | Val mean-roc_auc_score: 0.8243
|
| 302 |
+
2025-09-25 21:55:39,826 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0035 | Val mean-roc_auc_score: 0.8216
|
| 303 |
+
2025-09-25 21:55:50,736 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0150 | Val mean-roc_auc_score: 0.8226
|
| 304 |
+
2025-09-25 21:56:01,968 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0152 | Val mean-roc_auc_score: 0.8224
|
| 305 |
+
2025-09-25 21:56:09,685 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0141 | Val mean-roc_auc_score: 0.8239
|
| 306 |
+
2025-09-25 21:56:20,529 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0154 | Val mean-roc_auc_score: 0.8254
|
| 307 |
+
2025-09-25 21:56:31,593 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0164 | Val mean-roc_auc_score: 0.8245
|
| 308 |
+
2025-09-25 21:56:39,737 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0163 | Val mean-roc_auc_score: 0.8245
|
| 309 |
+
2025-09-25 21:56:50,526 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0163 | Val mean-roc_auc_score: 0.8238
|
| 310 |
+
2025-09-25 21:57:01,193 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0136 | Val mean-roc_auc_score: 0.8239
|
| 311 |
+
2025-09-25 21:57:09,279 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0173 | Val mean-roc_auc_score: 0.8234
|
| 312 |
+
2025-09-25 21:57:22,579 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0160 | Val mean-roc_auc_score: 0.8243
|
| 313 |
+
2025-09-25 21:57:31,857 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0173 | Val mean-roc_auc_score: 0.8260
|
| 314 |
+
2025-09-25 21:57:42,494 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8248
|
| 315 |
+
2025-09-25 21:57:52,984 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0140 | Val mean-roc_auc_score: 0.8233
|
| 316 |
+
2025-09-25 21:58:00,832 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0134 | Val mean-roc_auc_score: 0.8230
|
| 317 |
+
2025-09-25 21:58:12,527 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0151 | Val mean-roc_auc_score: 0.8235
|
| 318 |
+
2025-09-25 21:58:22,087 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0169 | Val mean-roc_auc_score: 0.8247
|
| 319 |
+
2025-09-25 21:58:32,694 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0104 | Val mean-roc_auc_score: 0.8239
|
| 320 |
+
2025-09-25 21:58:43,222 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0163 | Val mean-roc_auc_score: 0.8246
|
| 321 |
+
2025-09-25 21:58:51,271 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0153 | Val mean-roc_auc_score: 0.8226
|
| 322 |
+
2025-09-25 21:59:02,762 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0102 | Val mean-roc_auc_score: 0.8224
|
| 323 |
+
2025-09-25 21:59:13,718 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0156 | Val mean-roc_auc_score: 0.8251
|
| 324 |
+
2025-09-25 21:59:19,752 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0145 | Val mean-roc_auc_score: 0.8214
|
| 325 |
+
2025-09-25 21:59:27,995 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0141 | Val mean-roc_auc_score: 0.8208
|
| 326 |
+
2025-09-25 21:59:29,031 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Test mean-roc_auc_score: 0.7863
|
| 327 |
+
2025-09-25 21:59:29,558 - logs_modchembert_covid19_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg mean-roc_auc_score: 0.8013, Std Dev: 0.0118
|
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_microsom_stab_h_epochs100_batch_size32_20250926_053842.log
ADDED
2025-09-26 05:38:42,489 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Running benchmark for dataset: adme_microsom_stab_h
|
| 2 |
+
2025-09-26 05:38:42,489 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - dataset: adme_microsom_stab_h, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
|
| 3 |
+
2025-09-26 05:38:42,493 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset adme_microsom_stab_h at 2025-09-26_05-38-42
|
| 4 |
+
2025-09-26 05:38:49,613 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7778 | Val rms_score: 0.4035
|
| 5 |
+
2025-09-26 05:38:49,614 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 54
|
| 6 |
+
2025-09-26 05:38:50,495 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4035
|
| 7 |
+
2025-09-26 05:38:55,923 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4180 | Val rms_score: 0.3741
|
| 8 |
+
2025-09-26 05:38:56,171 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 108
|
| 9 |
+
2025-09-26 05:38:56,867 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3741
|
| 10 |
+
2025-09-26 05:39:02,420 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3819 | Val rms_score: 0.3783
|
| 11 |
+
2025-09-26 05:39:08,259 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2930 | Val rms_score: 0.3769
|
| 12 |
+
2025-09-26 05:39:12,308 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2222 | Val rms_score: 0.3734
|
| 13 |
+
2025-09-26 05:39:12,499 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 270
|
| 14 |
+
2025-09-26 05:39:13,082 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val rms_score: 0.3734
|
| 15 |
+
2025-09-26 05:39:19,273 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1413 | Val rms_score: 0.3697
|
| 16 |
+
2025-09-26 05:39:19,800 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 324
|
| 17 |
+
2025-09-26 05:39:20,551 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 6 with val rms_score: 0.3697
|
| 18 |
+
2025-09-26 05:39:28,452 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1123 | Val rms_score: 0.3841
|
| 19 |
+
2025-09-26 05:39:36,144 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0952 | Val rms_score: 0.3670
|
| 20 |
+
2025-09-26 05:39:36,359 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 432
|
| 21 |
+
2025-09-26 05:39:37,024 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val rms_score: 0.3670
|
| 22 |
+
2025-09-26 05:39:42,591 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0816 | Val rms_score: 0.3660
|
| 23 |
+
2025-09-26 05:39:42,889 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 486
|
| 24 |
+
2025-09-26 05:39:43,717 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 9 with val rms_score: 0.3660
|
| 25 |
+
2025-09-26 05:39:52,141 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0629 | Val rms_score: 0.3782
|
| 26 |
+
2025-09-26 05:40:01,138 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0556 | Val rms_score: 0.3806
|
| 27 |
+
2025-09-26 05:40:07,468 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0573 | Val rms_score: 0.3689
|
| 28 |
+
2025-09-26 05:40:15,915 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0393 | Val rms_score: 0.3719
|
| 29 |
+
2025-09-26 05:40:24,325 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0472 | Val rms_score: 0.3745
|
| 30 |
+
2025-09-26 05:40:32,740 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0400 | Val rms_score: 0.3698
|
| 31 |
+
2025-09-26 05:40:38,238 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0420 | Val rms_score: 0.3709
|
| 32 |
+
2025-09-26 05:40:46,672 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0365 | Val rms_score: 0.3755
|
| 33 |
+
2025-09-26 05:40:55,100 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0337 | Val rms_score: 0.3758
|
| 34 |
+
2025-09-26 05:41:03,889 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0325 | Val rms_score: 0.3788
|
| 35 |
+
2025-09-26 05:41:09,940 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0298 | Val rms_score: 0.3787
|
| 36 |
+
2025-09-26 05:41:18,153 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0306 | Val rms_score: 0.3715
|
| 37 |
+
2025-09-26 05:41:26,530 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0304 | Val rms_score: 0.3668
|
| 38 |
+
2025-09-26 05:41:34,795 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0281 | Val rms_score: 0.3752
|
| 39 |
+
2025-09-26 05:41:40,364 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0273 | Val rms_score: 0.3689
|
| 40 |
+
2025-09-26 05:41:48,179 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0291 | Val rms_score: 0.3720
|
| 41 |
+
2025-09-26 05:41:56,733 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0317 | Val rms_score: 0.3623
|
| 42 |
+
2025-09-26 05:41:57,330 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 1404
|
| 43 |
+
2025-09-26 05:41:57,969 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 26 with val rms_score: 0.3623
|
| 44 |
+
2025-09-26 05:42:06,815 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0271 | Val rms_score: 0.3706
|
| 45 |
+
2025-09-26 05:42:11,738 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0260 | Val rms_score: 0.3715
|
| 46 |
+
2025-09-26 05:42:19,336 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0253 | Val rms_score: 0.3721
|
| 47 |
+
2025-09-26 05:42:27,211 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0258 | Val rms_score: 0.3725
|
| 48 |
+
2025-09-26 05:42:35,013 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0249 | Val rms_score: 0.3661
|
| 49 |
+
2025-09-26 05:42:40,799 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0254 | Val rms_score: 0.3680
|
| 50 |
+
2025-09-26 05:42:49,294 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0230 | Val rms_score: 0.3724
|
| 51 |
+
2025-09-26 05:42:57,414 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0240 | Val rms_score: 0.3715
|
| 52 |
+
2025-09-26 05:43:05,824 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0224 | Val rms_score: 0.3712
|
| 53 |
+
2025-09-26 05:43:11,746 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0223 | Val rms_score: 0.3716
|
| 54 |
+
2025-09-26 05:43:20,518 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0216 | Val rms_score: 0.3702
|
| 55 |
+
2025-09-26 05:43:29,697 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0210 | Val rms_score: 0.3677
|
| 56 |
+
2025-09-26 05:43:35,370 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0212 | Val rms_score: 0.3680
|
| 57 |
+
2025-09-26 05:43:43,280 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0211 | Val rms_score: 0.3686
|
| 58 |
+
2025-09-26 05:43:51,750 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0211 | Val rms_score: 0.3739
|
| 59 |
+
2025-09-26 05:44:00,167 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0220 | Val rms_score: 0.3680
|
| 60 |
+
2025-09-26 05:44:04,779 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0181 | Val rms_score: 0.3762
|
| 61 |
+
2025-09-26 05:44:12,652 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0200 | Val rms_score: 0.3674
|
| 62 |
+
2025-09-26 05:44:20,531 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0199 | Val rms_score: 0.3780
|
| 63 |
+
2025-09-26 05:44:28,681 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0198 | Val rms_score: 0.3686
|
| 64 |
+
2025-09-26 05:44:34,876 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0182 | Val rms_score: 0.3702
|
| 65 |
+
2025-09-26 05:44:43,006 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0181 | Val rms_score: 0.3734
|
| 66 |
+
2025-09-26 05:44:51,456 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0191 | Val rms_score: 0.3738
|
| 67 |
+
2025-09-26 05:44:59,924 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0171 | Val rms_score: 0.3659
|
| 68 |
+
2025-09-26 05:45:05,344 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0201 | Val rms_score: 0.3715
|
| 69 |
+
2025-09-26 05:45:13,100 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0190 | Val rms_score: 0.3678
|
| 70 |
+
2025-09-26 05:45:21,469 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0178 | Val rms_score: 0.3700
|
| 71 |
+
2025-09-26 05:45:29,338 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0192 | Val rms_score: 0.3706
|
| 72 |
+
2025-09-26 05:45:34,802 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0171 | Val rms_score: 0.3704
|
| 73 |
+
2025-09-26 05:45:43,651 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0178 | Val rms_score: 0.3676
|
| 74 |
+
2025-09-26 05:45:51,407 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0175 | Val rms_score: 0.3693
|
| 75 |
+
2025-09-26 05:45:59,126 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0183 | Val rms_score: 0.3659
|
| 76 |
+
2025-09-26 05:46:03,913 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0166 | Val rms_score: 0.3692
|
| 77 |
+
2025-09-26 05:46:11,414 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0161 | Val rms_score: 0.3684
|
| 78 |
+
2025-09-26 05:46:19,168 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0162 | Val rms_score: 0.3695
|
| 79 |
+
2025-09-26 05:46:26,642 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0167 | Val rms_score: 0.3666
|
| 80 |
+
2025-09-26 05:46:34,581 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0138 | Val rms_score: 0.3743
|
| 81 |
+
2025-09-26 05:46:40,059 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0156 | Val rms_score: 0.3660
|
| 82 |
+
2025-09-26 05:46:48,012 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0140 | Val rms_score: 0.3705
|
| 83 |
+
2025-09-26 05:46:56,209 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0156 | Val rms_score: 0.3681
|
| 84 |
+
2025-09-26 05:47:04,539 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0139 | Val rms_score: 0.3720
|
| 85 |
+
2025-09-26 05:47:10,037 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0161 | Val rms_score: 0.3711
|
| 86 |
+
2025-09-26 05:47:18,065 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0148 | Val rms_score: 0.3719
|
| 87 |
+
2025-09-26 05:47:26,101 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0143 | Val rms_score: 0.3690
|
| 88 |
+
2025-09-26 05:47:33,911 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0149 | Val rms_score: 0.3668
|
| 89 |
+
2025-09-26 05:47:40,143 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0153 | Val rms_score: 0.3729
|
| 90 |
+
2025-09-26 05:47:47,710 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0152 | Val rms_score: 0.3678
|
| 91 |
+
2025-09-26 05:47:55,478 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0155 | Val rms_score: 0.3686
|
| 92 |
+
2025-09-26 05:48:02,231 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0155 | Val rms_score: 0.3672
|
| 93 |
+
2025-09-26 05:48:10,623 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0187 | Val rms_score: 0.3712
|
| 94 |
+
2025-09-26 05:48:18,799 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0133 | Val rms_score: 0.3691
|
| 95 |
+
2025-09-26 05:48:26,820 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0157 | Val rms_score: 0.3696
|
| 96 |
+
2025-09-26 05:48:31,967 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0144 | Val rms_score: 0.3709
|
| 97 |
+
2025-09-26 05:48:39,501 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0136 | Val rms_score: 0.3675
|
| 98 |
+
2025-09-26 05:48:46,605 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0139 | Val rms_score: 0.3682
2025-09-26 05:48:54,904 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0140 | Val rms_score: 0.3691
2025-09-26 05:49:02,414 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0140 | Val rms_score: 0.3657
2025-09-26 05:49:07,761 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0139 | Val rms_score: 0.3682
2025-09-26 05:49:15,491 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0144 | Val rms_score: 0.3635
2025-09-26 05:49:22,866 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0130 | Val rms_score: 0.3674
2025-09-26 05:49:31,099 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0143 | Val rms_score: 0.3695
2025-09-26 05:49:35,482 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0141 | Val rms_score: 0.3678
2025-09-26 05:49:43,049 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0153 | Val rms_score: 0.3674
2025-09-26 05:49:50,429 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0142 | Val rms_score: 0.3644
2025-09-26 05:49:57,970 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0172 | Val rms_score: 0.3728
2025-09-26 05:50:03,851 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0165 | Val rms_score: 0.3672
2025-09-26 05:50:12,485 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0138 | Val rms_score: 0.3694
2025-09-26 05:50:20,205 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0133 | Val rms_score: 0.3669
2025-09-26 05:50:27,884 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0124 | Val rms_score: 0.3701
2025-09-26 05:50:32,705 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0138 | Val rms_score: 0.3674
2025-09-26 05:50:41,311 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0137 | Val rms_score: 0.3653
2025-09-26 05:50:48,914 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0136 | Val rms_score: 0.3713
2025-09-26 05:50:57,228 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0120 | Val rms_score: 0.3694
2025-09-26 05:51:02,637 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0131 | Val rms_score: 0.3671
2025-09-26 05:51:03,230 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Test rms_score: 0.4236
2025-09-26 05:51:03,583 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset adme_microsom_stab_h at 2025-09-26_05-51-03
2025-09-26 05:51:10,633 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7546 | Val rms_score: 0.4018
2025-09-26 05:51:10,633 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 54
2025-09-26 05:51:11,332 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4018
2025-09-26 05:51:19,299 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.5391 | Val rms_score: 0.3981
2025-09-26 05:51:19,634 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 108
2025-09-26 05:51:20,267 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3981
2025-09-26 05:51:27,527 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3704 | Val rms_score: 0.3947
2025-09-26 05:51:27,751 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 162
2025-09-26 05:51:28,444 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.3947
2025-09-26 05:51:33,732 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2988 | Val rms_score: 0.3946
2025-09-26 05:51:33,994 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 216
2025-09-26 05:51:34,840 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.3946
2025-09-26 05:51:42,796 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2164 | Val rms_score: 0.3748
2025-09-26 05:51:43,064 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 270
2025-09-26 05:51:43,819 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val rms_score: 0.3748
2025-09-26 05:51:51,901 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1484 | Val rms_score: 0.3850
2025-09-26 05:52:01,113 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1273 | Val rms_score: 0.3702
2025-09-26 05:52:01,318 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 378
2025-09-26 05:52:01,933 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val rms_score: 0.3702
2025-09-26 05:52:07,355 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0991 | Val rms_score: 0.3715
2025-09-26 05:52:15,376 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0804 | Val rms_score: 0.3750
2025-09-26 05:52:23,312 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0730 | Val rms_score: 0.3689
2025-09-26 05:52:23,534 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 540
2025-09-26 05:52:24,217 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 10 with val rms_score: 0.3689
2025-09-26 05:52:29,802 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0608 | Val rms_score: 0.3736
2025-09-26 05:52:37,827 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0553 | Val rms_score: 0.3723
2025-09-26 05:52:45,676 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.1099 | Val rms_score: 0.3798
2025-09-26 05:52:53,383 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0524 | Val rms_score: 0.3754
2025-09-26 05:53:00,613 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0393 | Val rms_score: 0.3677
2025-09-26 05:53:00,796 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 810
2025-09-26 05:53:01,425 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 15 with val rms_score: 0.3677
2025-09-26 05:53:06,374 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0402 | Val rms_score: 0.3862
2025-09-26 05:53:14,822 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0401 | Val rms_score: 0.3784
2025-09-26 05:53:22,386 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0336 | Val rms_score: 0.3740
2025-09-26 05:53:30,494 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0332 | Val rms_score: 0.3696
2025-09-26 05:53:36,069 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0321 | Val rms_score: 0.3873
2025-09-26 05:53:43,497 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0310 | Val rms_score: 0.3694
2025-09-26 05:53:51,225 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0324 | Val rms_score: 0.3668
2025-09-26 05:53:51,407 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 1188
2025-09-26 05:53:52,066 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 22 with val rms_score: 0.3668
2025-09-26 05:53:59,793 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0264 | Val rms_score: 0.3715
2025-09-26 05:54:04,349 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0285 | Val rms_score: 0.3649
2025-09-26 05:54:04,558 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 1296
2025-09-26 05:54:05,199 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 24 with val rms_score: 0.3649
2025-09-26 05:54:12,485 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0275 | Val rms_score: 0.3665
2025-09-26 05:54:20,160 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0306 | Val rms_score: 0.3756
2025-09-26 05:54:28,060 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0259 | Val rms_score: 0.3740
2025-09-26 05:54:33,281 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0210 | Val rms_score: 0.3649
2025-09-26 05:54:33,453 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 1512
2025-09-26 05:54:34,114 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 28 with val rms_score: 0.3649
2025-09-26 05:54:41,660 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0239 | Val rms_score: 0.3658
2025-09-26 05:54:49,855 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0234 | Val rms_score: 0.3764
2025-09-26 05:54:58,279 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0213 | Val rms_score: 0.3690
2025-09-26 05:55:04,258 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0269 | Val rms_score: 0.3665
2025-09-26 05:55:12,047 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0246 | Val rms_score: 0.3688
2025-09-26 05:55:19,872 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0250 | Val rms_score: 0.3699
2025-09-26 05:55:27,576 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0224 | Val rms_score: 0.3642
2025-09-26 05:55:27,749 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 1890
2025-09-26 05:55:28,396 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 35 with val rms_score: 0.3642
2025-09-26 05:55:33,001 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0222 | Val rms_score: 0.3673
2025-09-26 05:55:41,100 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0220 | Val rms_score: 0.3729
2025-09-26 05:55:49,764 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0221 | Val rms_score: 0.3701
2025-09-26 05:55:57,169 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0215 | Val rms_score: 0.3697
2025-09-26 05:56:01,841 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0203 | Val rms_score: 0.3632
2025-09-26 05:56:02,012 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 2160
2025-09-26 05:56:02,668 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 40 with val rms_score: 0.3632
2025-09-26 05:56:10,083 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0213 | Val rms_score: 0.3735
2025-09-26 05:56:18,127 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0194 | Val rms_score: 0.3704
2025-09-26 05:56:25,233 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0211 | Val rms_score: 0.3723
2025-09-26 05:56:29,359 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0197 | Val rms_score: 0.3681
2025-09-26 05:56:36,805 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0194 | Val rms_score: 0.3685
2025-09-26 05:56:44,287 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0182 | Val rms_score: 0.3686
2025-09-26 05:56:52,393 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0189 | Val rms_score: 0.3683
2025-09-26 05:56:59,930 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0207 | Val rms_score: 0.3673
2025-09-26 05:57:05,175 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0189 | Val rms_score: 0.3700
2025-09-26 05:57:13,223 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0184 | Val rms_score: 0.3691
2025-09-26 05:57:21,240 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0172 | Val rms_score: 0.3743
2025-09-26 05:57:29,280 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0153 | Val rms_score: 0.3670
2025-09-26 05:57:34,737 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0174 | Val rms_score: 0.3710
2025-09-26 05:57:42,917 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0172 | Val rms_score: 0.3701
2025-09-26 05:57:50,607 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0163 | Val rms_score: 0.3674
2025-09-26 05:57:59,348 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0163 | Val rms_score: 0.3677
2025-09-26 05:58:04,930 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0158 | Val rms_score: 0.3701
2025-09-26 05:58:12,482 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0175 | Val rms_score: 0.3739
2025-09-26 05:58:20,218 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0171 | Val rms_score: 0.3713
2025-09-26 05:58:28,425 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0166 | Val rms_score: 0.3749
2025-09-26 05:58:33,981 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0169 | Val rms_score: 0.3708
2025-09-26 05:58:41,923 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0161 | Val rms_score: 0.3691
2025-09-26 05:58:49,555 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0189 | Val rms_score: 0.3721
2025-09-26 05:58:57,779 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0166 | Val rms_score: 0.3670
2025-09-26 05:59:03,047 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0173 | Val rms_score: 0.3708
2025-09-26 05:59:11,132 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0164 | Val rms_score: 0.3693
2025-09-26 05:59:20,062 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0189 | Val rms_score: 0.3700
2025-09-26 05:59:28,293 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0161 | Val rms_score: 0.3654
2025-09-26 05:59:34,402 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0149 | Val rms_score: 0.3713
2025-09-26 05:59:42,836 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0152 | Val rms_score: 0.3677
2025-09-26 05:59:50,862 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0155 | Val rms_score: 0.3751
2025-09-26 05:59:56,322 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0143 | Val rms_score: 0.3709
2025-09-26 06:00:03,760 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0151 | Val rms_score: 0.3669
2025-09-26 06:00:11,195 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0150 | Val rms_score: 0.3725
2025-09-26 06:00:20,485 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0141 | Val rms_score: 0.3720
2025-09-26 06:00:25,985 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0164 | Val rms_score: 0.3670
2025-09-26 06:00:34,530 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0155 | Val rms_score: 0.3709
2025-09-26 06:00:42,295 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0170 | Val rms_score: 0.3654
2025-09-26 06:00:50,005 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0143 | Val rms_score: 0.3728
2025-09-26 06:00:57,447 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0137 | Val rms_score: 0.3662
2025-09-26 06:01:02,416 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0140 | Val rms_score: 0.3729
2025-09-26 06:01:10,770 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0137 | Val rms_score: 0.3684
2025-09-26 06:01:18,749 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0138 | Val rms_score: 0.3702
2025-09-26 06:01:26,570 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0143 | Val rms_score: 0.3691
2025-09-26 06:01:31,771 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0140 | Val rms_score: 0.3720
2025-09-26 06:01:39,324 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0138 | Val rms_score: 0.3713
2025-09-26 06:01:47,270 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0145 | Val rms_score: 0.3710
2025-09-26 06:01:54,114 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0148 | Val rms_score: 0.3692
2025-09-26 06:01:59,249 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0121 | Val rms_score: 0.3738
2025-09-26 06:02:07,473 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0140 | Val rms_score: 0.3668
2025-09-26 06:02:15,319 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0151 | Val rms_score: 0.3696
2025-09-26 06:02:23,017 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0145 | Val rms_score: 0.3695
2025-09-26 06:02:29,803 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0128 | Val rms_score: 0.3719
2025-09-26 06:02:37,663 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0131 | Val rms_score: 0.3683
2025-09-26 06:02:44,971 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0142 | Val rms_score: 0.3686
2025-09-26 06:02:52,866 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0141 | Val rms_score: 0.3693
2025-09-26 06:02:58,241 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0132 | Val rms_score: 0.3683
2025-09-26 06:03:05,696 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0134 | Val rms_score: 0.3686
2025-09-26 06:03:13,406 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0129 | Val rms_score: 0.3696
2025-09-26 06:03:20,877 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0132 | Val rms_score: 0.3744
2025-09-26 06:03:21,471 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Test rms_score: 0.4274
2025-09-26 06:03:21,833 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset adme_microsom_stab_h at 2025-09-26_06-03-21
2025-09-26 06:03:26,423 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7731 | Val rms_score: 0.4020
2025-09-26 06:03:26,423 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 54
2025-09-26 06:03:27,256 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4020
2025-09-26 06:03:35,264 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4980 | Val rms_score: 0.3924
2025-09-26 06:03:35,482 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 108
2025-09-26 06:03:36,132 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3924
2025-09-26 06:03:44,067 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3634 | Val rms_score: 0.3792
2025-09-26 06:03:44,268 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 162
2025-09-26 06:03:44,908 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.3792
2025-09-26 06:03:53,505 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2520 | Val rms_score: 0.3765
2025-09-26 06:03:53,735 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 216
2025-09-26 06:03:54,398 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.3765
2025-09-26 06:03:59,431 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1944 | Val rms_score: 0.3620
2025-09-26 06:03:59,647 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Global step of best model: 270
2025-09-26 06:04:00,353 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val rms_score: 0.3620
2025-09-26 06:04:08,455 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1432 | Val rms_score: 0.3636
2025-09-26 06:04:17,060 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1175 | Val rms_score: 0.3728
2025-09-26 06:04:25,078 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0981 | Val rms_score: 0.3875
2025-09-26 06:04:29,694 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0799 | Val rms_score: 0.3794
2025-09-26 06:04:37,099 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0621 | Val rms_score: 0.3656
2025-09-26 06:04:44,658 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0532 | Val rms_score: 0.3735
2025-09-26 06:04:52,661 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0518 | Val rms_score: 0.3787
2025-09-26 06:04:57,520 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0530 | Val rms_score: 0.3801
2025-09-26 06:05:04,883 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0477 | Val rms_score: 0.3791
2025-09-26 06:05:12,799 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0490 | Val rms_score: 0.3812
2025-09-26 06:05:20,426 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0385 | Val rms_score: 0.3812
2025-09-26 06:05:26,135 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0404 | Val rms_score: 0.3811
2025-09-26 06:05:33,134 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0362 | Val rms_score: 0.3723
2025-09-26 06:05:42,068 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0344 | Val rms_score: 0.3795
2025-09-26 06:05:50,151 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0357 | Val rms_score: 0.3730
2025-09-26 06:05:55,057 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0345 | Val rms_score: 0.3738
2025-09-26 06:06:02,662 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0269 | Val rms_score: 0.3716
2025-09-26 06:06:10,352 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0292 | Val rms_score: 0.3760
2025-09-26 06:06:18,190 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0272 | Val rms_score: 0.3757
2025-09-26 06:06:24,048 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0272 | Val rms_score: 0.3789
2025-09-26 06:06:31,892 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0282 | Val rms_score: 0.3757
2025-09-26 06:06:40,302 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0275 | Val rms_score: 0.3817
2025-09-26 06:06:48,210 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0205 | Val rms_score: 0.3793
2025-09-26 06:06:53,913 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0239 | Val rms_score: 0.3736
2025-09-26 06:07:02,266 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0233 | Val rms_score: 0.3703
2025-09-26 06:07:10,476 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0247 | Val rms_score: 0.3695
2025-09-26 06:07:18,580 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0219 | Val rms_score: 0.3749
2025-09-26 06:07:23,768 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0210 | Val rms_score: 0.3739
2025-09-26 06:07:31,382 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0218 | Val rms_score: 0.3817
2025-09-26 06:07:39,426 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0204 | Val rms_score: 0.3769
2025-09-26 06:07:46,977 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0195 | Val rms_score: 0.3776
2025-09-26 06:07:52,669 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0205 | Val rms_score: 0.3745
2025-09-26 06:08:00,854 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0204 | Val rms_score: 0.3729
|
| 296 |
+
2025-09-26 06:08:08,562 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0197 | Val rms_score: 0.3749
|
| 297 |
+
2025-09-26 06:08:16,061 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0208 | Val rms_score: 0.3791
|
| 298 |
+
2025-09-26 06:08:24,218 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0197 | Val rms_score: 0.3766
|
| 299 |
+
2025-09-26 06:08:30,006 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0197 | Val rms_score: 0.3779
|
| 300 |
+
2025-09-26 06:08:37,572 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0206 | Val rms_score: 0.3705
|
| 301 |
+
2025-09-26 06:08:45,478 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0217 | Val rms_score: 0.3691
|
| 302 |
+
2025-09-26 06:08:53,679 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0184 | Val rms_score: 0.3717
|
| 303 |
+
2025-09-26 06:08:59,638 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0174 | Val rms_score: 0.3735
|
| 304 |
+
2025-09-26 06:09:08,145 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0196 | Val rms_score: 0.3771
|
| 305 |
+
2025-09-26 06:09:16,128 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0195 | Val rms_score: 0.3792
|
| 306 |
+
2025-09-26 06:09:21,886 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0213 | Val rms_score: 0.3743
|
| 307 |
+
2025-09-26 06:09:30,632 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0200 | Val rms_score: 0.3701
|
| 308 |
+
2025-09-26 06:09:38,547 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0200 | Val rms_score: 0.3738
|
| 309 |
+
2025-09-26 06:09:46,820 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0173 | Val rms_score: 0.3763
|
| 310 |
+
2025-09-26 06:09:52,385 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0166 | Val rms_score: 0.3713
|
| 311 |
+
2025-09-26 06:10:00,737 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0175 | Val rms_score: 0.3698
|
| 312 |
+
2025-09-26 06:10:08,793 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0161 | Val rms_score: 0.3726
|
| 313 |
+
2025-09-26 06:10:18,070 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0168 | Val rms_score: 0.3731
|
| 314 |
+
2025-09-26 06:10:24,250 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0175 | Val rms_score: 0.3758
|
| 315 |
+
2025-09-26 06:10:31,880 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0159 | Val rms_score: 0.3748
|
| 316 |
+
2025-09-26 06:10:39,262 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0158 | Val rms_score: 0.3737
|
| 317 |
+
2025-09-26 06:10:46,919 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0160 | Val rms_score: 0.3718
|
| 318 |
+
2025-09-26 06:10:51,649 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0159 | Val rms_score: 0.3721
|
| 319 |
+
2025-09-26 06:10:59,556 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0159 | Val rms_score: 0.3741
|
| 320 |
+
2025-09-26 06:11:07,366 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0145 | Val rms_score: 0.3739
|
| 321 |
+
2025-09-26 06:11:15,171 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0161 | Val rms_score: 0.3785
|
| 322 |
+
2025-09-26 06:11:22,545 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0165 | Val rms_score: 0.3723
|
| 323 |
+
2025-09-26 06:11:27,904 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0156 | Val rms_score: 0.3753
|
| 324 |
+
2025-09-26 06:11:35,934 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0149 | Val rms_score: 0.3768
|
| 325 |
+
2025-09-26 06:11:43,887 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0159 | Val rms_score: 0.3748
|
| 326 |
+
2025-09-26 06:11:51,636 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0161 | Val rms_score: 0.3729
|
| 327 |
+
2025-09-26 06:11:56,902 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0158 | Val rms_score: 0.3735
|
| 328 |
+
2025-09-26 06:12:04,722 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0148 | Val rms_score: 0.3792
|
| 329 |
+
2025-09-26 06:12:12,849 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0149 | Val rms_score: 0.3700
|
| 330 |
+
2025-09-26 06:12:20,574 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0152 | Val rms_score: 0.3699
|
| 331 |
+
2025-09-26 06:12:25,828 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0140 | Val rms_score: 0.3765
|
| 332 |
+
2025-09-26 06:12:34,689 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0152 | Val rms_score: 0.3718
|
| 333 |
+
2025-09-26 06:12:42,470 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0135 | Val rms_score: 0.3725
|
| 334 |
+
2025-09-26 06:12:50,186 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0143 | Val rms_score: 0.3726
|
| 335 |
+
2025-09-26 06:12:55,563 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0144 | Val rms_score: 0.3710
|
| 336 |
+
2025-09-26 06:13:03,295 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0143 | Val rms_score: 0.3725
|
| 337 |
+
2025-09-26 06:13:11,428 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0139 | Val rms_score: 0.3707
|
| 338 |
+
2025-09-26 06:13:19,767 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0150 | Val rms_score: 0.3727
|
| 339 |
+
2025-09-26 06:13:25,754 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0146 | Val rms_score: 0.3781
|
| 340 |
+
2025-09-26 06:13:34,074 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0154 | Val rms_score: 0.3748
|
| 341 |
+
2025-09-26 06:13:42,385 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0143 | Val rms_score: 0.3754
|
| 342 |
+
2025-09-26 06:13:51,193 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0140 | Val rms_score: 0.3718
|
| 343 |
+
2025-09-26 06:13:57,405 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0147 | Val rms_score: 0.3732
|
| 344 |
+
2025-09-26 06:14:06,554 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0140 | Val rms_score: 0.3707
|
| 345 |
+
2025-09-26 06:14:14,620 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0137 | Val rms_score: 0.3728
|
| 346 |
+
2025-09-26 06:14:20,536 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0130 | Val rms_score: 0.3718
|
| 347 |
+
2025-09-26 06:14:28,867 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0136 | Val rms_score: 0.3745
|
| 348 |
+
2025-09-26 06:14:37,186 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0139 | Val rms_score: 0.3754
|
| 349 |
+
2025-09-26 06:14:45,342 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0130 | Val rms_score: 0.3723
|
| 350 |
+
2025-09-26 06:14:52,061 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0128 | Val rms_score: 0.3775
|
| 351 |
+
2025-09-26 06:14:59,254 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0142 | Val rms_score: 0.3704
|
| 352 |
+
2025-09-26 06:15:06,856 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0134 | Val rms_score: 0.3705
|
| 353 |
+
2025-09-26 06:15:14,519 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0133 | Val rms_score: 0.3682
|
| 354 |
+
2025-09-26 06:15:20,225 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0133 | Val rms_score: 0.3686
|
| 355 |
+
2025-09-26 06:15:27,564 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0132 | Val rms_score: 0.3780
|
| 356 |
+
2025-09-26 06:15:35,427 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0136 | Val rms_score: 0.3728
|
| 357 |
+
2025-09-26 06:15:43,224 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0129 | Val rms_score: 0.3779
|
| 358 |
+
2025-09-26 06:15:43,841 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Test rms_score: 0.4107
|
| 359 |
+
2025-09-26 06:15:44,212 - logs_modchembert_adme_microsom_stab_h_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.4206, Std Dev: 0.0071
|
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_microsom_stab_r_epochs100_batch_size32_20250926_061544.log
ADDED
@@ -0,0 +1,323 @@
| 1 |
+
2025-09-26 06:15:44,214 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Running benchmark for dataset: adme_microsom_stab_r
|
| 2 |
+
2025-09-26 06:15:44,214 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - dataset: adme_microsom_stab_r, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
|
| 3 |
+
2025-09-26 06:15:44,220 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset adme_microsom_stab_r at 2025-09-26_06-15-44
|
| 4 |
+
2025-09-26 06:15:50,471 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6691 | Val rms_score: 0.5054
|
| 5 |
+
2025-09-26 06:15:50,471 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Global step of best model: 68
|
| 6 |
+
2025-09-26 06:15:51,223 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.5054
|
| 7 |
+
2025-09-26 06:16:00,740 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4167 | Val rms_score: 0.4756
|
| 8 |
+
2025-09-26 06:16:00,949 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Global step of best model: 136
|
| 9 |
+
2025-09-26 06:16:01,643 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.4756
|
| 10 |
+
2025-09-26 06:16:11,352 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2188 | Val rms_score: 0.4704
|
| 11 |
+
2025-09-26 06:16:11,586 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Global step of best model: 204
|
| 12 |
+
2025-09-26 06:16:12,294 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.4704
|
| 13 |
+
2025-09-26 06:16:19,153 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2105 | Val rms_score: 0.4926
|
| 14 |
+
2025-09-26 06:16:28,774 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1703 | Val rms_score: 0.4869
|
| 15 |
+
2025-09-26 06:16:38,539 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1289 | Val rms_score: 0.4818
|
| 16 |
+
2025-09-26 06:16:49,660 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1002 | Val rms_score: 0.4767
|
| 17 |
+
2025-09-26 06:16:57,028 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0902 | Val rms_score: 0.4766
|
| 18 |
+
2025-09-26 06:17:07,260 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0794 | Val rms_score: 0.4970
|
| 19 |
+
2025-09-26 06:17:17,316 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0611 | Val rms_score: 0.4942
|
| 20 |
+
2025-09-26 06:17:24,313 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0557 | Val rms_score: 0.4801
|
| 21 |
+
2025-09-26 06:17:34,824 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0527 | Val rms_score: 0.4814
|
| 22 |
+
2025-09-26 06:17:44,383 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0430 | Val rms_score: 0.4835
|
| 23 |
+
2025-09-26 06:17:51,810 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0436 | Val rms_score: 0.4868
|
| 24 |
+
2025-09-26 06:18:01,983 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0371 | Val rms_score: 0.4895
|
| 25 |
+
2025-09-26 06:18:11,813 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0363 | Val rms_score: 0.4989
|
| 26 |
+
2025-09-26 06:18:18,519 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0374 | Val rms_score: 0.4912
|
| 27 |
+
2025-09-26 06:18:27,799 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0340 | Val rms_score: 0.4864
|
| 28 |
+
2025-09-26 06:18:36,933 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0340 | Val rms_score: 0.5044
|
| 29 |
+
2025-09-26 06:18:46,227 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0339 | Val rms_score: 0.4895
|
| 30 |
+
2025-09-26 06:18:52,620 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0368 | Val rms_score: 0.4787
|
| 31 |
+
2025-09-26 06:19:02,035 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0290 | Val rms_score: 0.4849
|
| 32 |
+
2025-09-26 06:19:10,894 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0260 | Val rms_score: 0.4851
|
| 33 |
+
2025-09-26 06:19:17,839 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0275 | Val rms_score: 0.4803
|
| 34 |
+
2025-09-26 06:19:27,201 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0252 | Val rms_score: 0.4843
|
| 35 |
+
2025-09-26 06:19:36,339 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0247 | Val rms_score: 0.4898
|
| 36 |
+
2025-09-26 06:19:45,429 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0237 | Val rms_score: 0.4916
|
| 37 |
+
2025-09-26 06:19:52,480 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0206 | Val rms_score: 0.4882
|
| 38 |
+
2025-09-26 06:20:01,868 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0229 | Val rms_score: 0.4902
|
| 39 |
+
2025-09-26 06:20:12,577 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0212 | Val rms_score: 0.4867
|
| 40 |
+
2025-09-26 06:20:19,713 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0214 | Val rms_score: 0.4887
|
| 41 |
+
2025-09-26 06:20:30,167 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0216 | Val rms_score: 0.4862
|
| 42 |
+
2025-09-26 06:20:39,897 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0220 | Val rms_score: 0.4866
|
| 43 |
+
2025-09-26 06:20:47,277 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0191 | Val rms_score: 0.4853
|
| 44 |
+
2025-09-26 06:20:58,400 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0203 | Val rms_score: 0.4871
|
| 45 |
+
2025-09-26 06:21:07,622 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0189 | Val rms_score: 0.4921
|
| 46 |
+
2025-09-26 06:21:17,996 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0194 | Val rms_score: 0.4836
|
| 47 |
+
2025-09-26 06:21:25,147 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0200 | Val rms_score: 0.4860
|
| 48 |
+
2025-09-26 06:21:34,711 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0197 | Val rms_score: 0.4848
|
| 49 |
+
2025-09-26 06:21:45,081 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0195 | Val rms_score: 0.4859
|
| 50 |
+
2025-09-26 06:21:52,660 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0175 | Val rms_score: 0.4878
|
| 51 |
+
2025-09-26 06:22:02,572 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0187 | Val rms_score: 0.4882
|
| 52 |
+
2025-09-26 06:22:12,225 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0181 | Val rms_score: 0.4830
|
| 53 |
+
2025-09-26 06:22:18,916 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0176 | Val rms_score: 0.4846
|
| 54 |
+
2025-09-26 06:22:29,340 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0184 | Val rms_score: 0.4917
|
| 55 |
+
2025-09-26 06:22:38,767 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0178 | Val rms_score: 0.4855
|
| 56 |
+
2025-09-26 06:22:45,725 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0183 | Val rms_score: 0.4858
|
| 57 |
+
2025-09-26 06:22:55,888 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0170 | Val rms_score: 0.4871
|
| 58 |
+
2025-09-26 06:23:05,922 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0170 | Val rms_score: 0.4855
|
| 59 |
+
2025-09-26 06:23:15,236 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0172 | Val rms_score: 0.4923
|
| 60 |
+
2025-09-26 06:23:22,784 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0165 | Val rms_score: 0.4837
|
| 61 |
+
2025-09-26 06:23:33,013 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0165 | Val rms_score: 0.4898
|
| 62 |
+
2025-09-26 06:23:42,773 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0159 | Val rms_score: 0.4815
|
| 63 |
+
2025-09-26 06:23:50,480 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0153 | Val rms_score: 0.4823
|
| 64 |
+
2025-09-26 06:24:00,518 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0154 | Val rms_score: 0.4829
|
| 65 |
+
2025-09-26 06:24:10,327 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0157 | Val rms_score: 0.4878
|
| 66 |
+
2025-09-26 06:24:17,859 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0161 | Val rms_score: 0.4844
|
| 67 |
+
2025-09-26 06:24:27,599 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0165 | Val rms_score: 0.4861
|
| 68 |
+
2025-09-26 06:24:37,748 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0142 | Val rms_score: 0.4875
|
| 69 |
+
2025-09-26 06:24:47,077 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0149 | Val rms_score: 0.4821
|
| 70 |
+
2025-09-26 06:24:53,752 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0154 | Val rms_score: 0.4826
|
| 71 |
+
2025-09-26 06:25:03,940 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0171 | Val rms_score: 0.4880
|
| 72 |
+
2025-09-26 06:25:13,048 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0155 | Val rms_score: 0.4850
|
| 73 |
+
2025-09-26 06:25:20,189 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0155 | Val rms_score: 0.4784
|
| 74 |
+
2025-09-26 06:25:29,084 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0161 | Val rms_score: 0.4862
|
| 75 |
+
2025-09-26 06:25:38,433 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0144 | Val rms_score: 0.4867
|
| 76 |
+
2025-09-26 06:25:45,521 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0151 | Val rms_score: 0.4804
|
| 77 |
+
2025-09-26 06:25:55,610 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0139 | Val rms_score: 0.4867
|
| 78 |
+
2025-09-26 06:26:07,178 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0141 | Val rms_score: 0.4863
|
| 79 |
+
2025-09-26 06:26:16,422 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0154 | Val rms_score: 0.4826
|
| 80 |
+
2025-09-26 06:26:27,818 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0153 | Val rms_score: 0.4882
|
| 81 |
+
2025-09-26 06:26:37,503 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0143 | Val rms_score: 0.4847
|
| 82 |
+
2025-09-26 06:26:44,308 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0142 | Val rms_score: 0.4843
|
| 83 |
+
2025-09-26 06:26:54,590 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0137 | Val rms_score: 0.4862
|
| 84 |
+
2025-09-26 06:27:04,025 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0149 | Val rms_score: 0.4834
|
| 85 |
+
2025-09-26 06:27:12,942 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0147 | Val rms_score: 0.4868
|
| 86 |
+
2025-09-26 06:27:19,819 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0142 | Val rms_score: 0.4846
|
| 87 |
+
2025-09-26 06:27:28,811 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0139 | Val rms_score: 0.4832
|
| 88 |
+
2025-09-26 06:27:37,937 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0141 | Val rms_score: 0.4836
|
| 89 |
+
2025-09-26 06:27:44,744 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0143 | Val rms_score: 0.4830
|
| 90 |
+
2025-09-26 06:27:54,240 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0149 | Val rms_score: 0.4845
|
| 91 |
+
2025-09-26 06:28:03,923 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0135 | Val rms_score: 0.4847
|
| 92 |
+
2025-09-26 06:28:13,100 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0138 | Val rms_score: 0.4839
|
| 93 |
+
2025-09-26 06:28:20,043 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0132 | Val rms_score: 0.4834
|
| 94 |
+
2025-09-26 06:28:29,199 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0134 | Val rms_score: 0.4832
|
| 95 |
+
2025-09-26 06:28:38,418 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0140 | Val rms_score: 0.4830
|
| 96 |
+
2025-09-26 06:28:45,477 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0114 | Val rms_score: 0.4839
|
| 97 |
+
2025-09-26 06:28:54,699 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0139 | Val rms_score: 0.4838
|
| 98 |
+
2025-09-26 06:29:04,792 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0129 | Val rms_score: 0.4862
|
| 99 |
+
2025-09-26 06:29:13,953 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0129 | Val rms_score: 0.4862
|
| 100 |
+
2025-09-26 06:29:20,270 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0125 | Val rms_score: 0.4845
|
| 101 |
+
2025-09-26 06:29:29,794 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0131 | Val rms_score: 0.4844
|
| 102 |
+
2025-09-26 06:29:39,002 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0127 | Val rms_score: 0.4846
|
| 103 |
+
2025-09-26 06:29:45,745 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0128 | Val rms_score: 0.4869
|
| 104 |
+
2025-09-26 06:29:54,621 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0126 | Val rms_score: 0.4874
|
| 105 |
+
2025-09-26 06:30:03,612 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0122 | Val rms_score: 0.4871
|
| 106 |
+
2025-09-26 06:30:13,198 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0132 | Val rms_score: 0.4883
2025-09-26 06:30:20,012 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0123 | Val rms_score: 0.4846
2025-09-26 06:30:28,928 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0123 | Val rms_score: 0.4820
2025-09-26 06:30:37,986 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0117 | Val rms_score: 0.4869
2025-09-26 06:30:38,739 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Test rms_score: 0.4425
2025-09-26 06:30:39,124 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset adme_microsom_stab_r at 2025-09-26_06-30-39
2025-09-26 06:30:45,099 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6581 | Val rms_score: 0.5092
2025-09-26 06:30:45,099 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Global step of best model: 68
2025-09-26 06:30:45,968 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.5092
2025-09-26 06:30:55,238 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4115 | Val rms_score: 0.4714
2025-09-26 06:30:55,500 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Global step of best model: 136
2025-09-26 06:30:56,239 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.4714
2025-09-26 06:31:06,335 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2305 | Val rms_score: 0.4808
2025-09-26 06:31:13,134 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2178 | Val rms_score: 0.4859
2025-09-26 06:31:22,040 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1695 | Val rms_score: 0.4811
2025-09-26 06:31:31,078 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1367 | Val rms_score: 0.4835
2025-09-26 06:31:40,945 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1039 | Val rms_score: 0.4828
2025-09-26 06:31:47,377 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0866 | Val rms_score: 0.4884
2025-09-26 06:31:56,760 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0781 | Val rms_score: 0.4979
2025-09-26 06:32:06,070 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0634 | Val rms_score: 0.4949
2025-09-26 06:32:13,172 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0609 | Val rms_score: 0.4807
2025-09-26 06:32:22,934 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0527 | Val rms_score: 0.4882
2025-09-26 06:32:32,282 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0480 | Val rms_score: 0.4878
2025-09-26 06:32:41,678 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0445 | Val rms_score: 0.4875
2025-09-26 06:32:49,096 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0398 | Val rms_score: 0.4908
2025-09-26 06:32:58,360 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0384 | Val rms_score: 0.4876
2025-09-26 06:33:07,495 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0391 | Val rms_score: 0.4938
2025-09-26 06:33:14,105 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0409 | Val rms_score: 0.4881
2025-09-26 06:33:22,826 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0324 | Val rms_score: 0.4871
2025-09-26 06:33:32,180 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0326 | Val rms_score: 0.4882
2025-09-26 06:33:41,225 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0331 | Val rms_score: 0.4898
2025-09-26 06:33:47,885 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0284 | Val rms_score: 0.4925
2025-09-26 06:33:57,070 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0291 | Val rms_score: 0.4913
2025-09-26 06:34:06,197 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0264 | Val rms_score: 0.4958
2025-09-26 06:34:13,061 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0278 | Val rms_score: 0.4926
2025-09-26 06:34:23,302 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0245 | Val rms_score: 0.4954
2025-09-26 06:34:33,364 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0222 | Val rms_score: 0.4913
2025-09-26 06:34:40,588 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0276 | Val rms_score: 0.4893
2025-09-26 06:34:49,664 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0240 | Val rms_score: 0.4928
2025-09-26 06:34:59,704 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0256 | Val rms_score: 0.4895
2025-09-26 06:35:09,482 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0261 | Val rms_score: 0.4957
2025-09-26 06:35:16,910 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0223 | Val rms_score: 0.4904
2025-09-26 06:35:27,053 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0231 | Val rms_score: 0.4819
2025-09-26 06:35:36,710 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0208 | Val rms_score: 0.4939
2025-09-26 06:35:43,794 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0227 | Val rms_score: 0.4914
2025-09-26 06:35:52,848 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0208 | Val rms_score: 0.4902
2025-09-26 06:36:02,595 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0215 | Val rms_score: 0.4902
2025-09-26 06:36:09,999 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0198 | Val rms_score: 0.4933
2025-09-26 06:36:19,623 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0206 | Val rms_score: 0.4910
2025-09-26 06:36:29,322 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0194 | Val rms_score: 0.4882
2025-09-26 06:36:38,876 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0195 | Val rms_score: 0.4888
2025-09-26 06:36:46,379 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0205 | Val rms_score: 0.4922
2025-09-26 06:36:56,082 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0163 | Val rms_score: 0.4903
2025-09-26 06:37:05,563 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0171 | Val rms_score: 0.4840
2025-09-26 06:37:13,372 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0184 | Val rms_score: 0.4907
2025-09-26 06:37:22,889 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0188 | Val rms_score: 0.4917
2025-09-26 06:37:32,474 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0179 | Val rms_score: 0.4914
2025-09-26 06:37:39,382 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0177 | Val rms_score: 0.4909
2025-09-26 06:37:49,892 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0160 | Val rms_score: 0.4908
2025-09-26 06:37:58,975 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0175 | Val rms_score: 0.4927
2025-09-26 06:38:08,268 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0170 | Val rms_score: 0.4894
2025-09-26 06:38:15,710 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0169 | Val rms_score: 0.4900
2025-09-26 06:38:25,856 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0194 | Val rms_score: 0.4892
2025-09-26 06:38:35,526 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0161 | Val rms_score: 0.4901
2025-09-26 06:38:42,244 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0164 | Val rms_score: 0.4940
2025-09-26 06:38:51,317 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0141 | Val rms_score: 0.4881
2025-09-26 06:39:00,907 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0167 | Val rms_score: 0.4891
2025-09-26 06:39:10,339 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0162 | Val rms_score: 0.4886
2025-09-26 06:39:17,853 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0162 | Val rms_score: 0.4894
2025-09-26 06:39:26,825 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0157 | Val rms_score: 0.4872
2025-09-26 06:39:36,147 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0153 | Val rms_score: 0.4900
2025-09-26 06:39:43,224 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0171 | Val rms_score: 0.4933
2025-09-26 06:39:53,297 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0152 | Val rms_score: 0.4861
2025-09-26 06:40:02,963 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0144 | Val rms_score: 0.4903
2025-09-26 06:40:08,875 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0158 | Val rms_score: 0.4839
2025-09-26 06:40:18,365 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0148 | Val rms_score: 0.4872
2025-09-26 06:40:28,324 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0150 | Val rms_score: 0.4901
2025-09-26 06:40:37,749 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0154 | Val rms_score: 0.4889
2025-09-26 06:40:44,064 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0147 | Val rms_score: 0.4883
2025-09-26 06:40:53,433 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0152 | Val rms_score: 0.4881
2025-09-26 06:41:02,691 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0152 | Val rms_score: 0.4881
2025-09-26 06:41:09,955 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0142 | Val rms_score: 0.4889
2025-09-26 06:41:19,032 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0156 | Val rms_score: 0.4925
2025-09-26 06:41:29,631 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0152 | Val rms_score: 0.4912
2025-09-26 06:41:39,057 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0155 | Val rms_score: 0.4919
2025-09-26 06:41:45,648 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0153 | Val rms_score: 0.4881
2025-09-26 06:41:55,089 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0137 | Val rms_score: 0.4894
2025-09-26 06:42:05,203 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0142 | Val rms_score: 0.4909
2025-09-26 06:42:12,168 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0132 | Val rms_score: 0.4878
2025-09-26 06:42:21,162 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0133 | Val rms_score: 0.4903
2025-09-26 06:42:30,423 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0150 | Val rms_score: 0.4905
2025-09-26 06:42:38,349 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0141 | Val rms_score: 0.4906
2025-09-26 06:42:47,931 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0150 | Val rms_score: 0.4883
2025-09-26 06:42:57,280 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0138 | Val rms_score: 0.4894
2025-09-26 06:43:06,797 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0138 | Val rms_score: 0.4881
2025-09-26 06:43:14,036 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0134 | Val rms_score: 0.4952
2025-09-26 06:43:23,833 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0140 | Val rms_score: 0.4926
2025-09-26 06:43:33,163 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0134 | Val rms_score: 0.4883
2025-09-26 06:43:40,803 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0136 | Val rms_score: 0.4899
2025-09-26 06:43:50,515 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0118 | Val rms_score: 0.4883
2025-09-26 06:43:59,659 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0136 | Val rms_score: 0.4909
2025-09-26 06:44:07,827 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0132 | Val rms_score: 0.4912
2025-09-26 06:44:17,877 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0125 | Val rms_score: 0.4893
2025-09-26 06:44:27,678 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0125 | Val rms_score: 0.4927
2025-09-26 06:44:37,730 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0128 | Val rms_score: 0.4866
2025-09-26 06:44:45,231 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0121 | Val rms_score: 0.4896
2025-09-26 06:44:54,802 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0124 | Val rms_score: 0.4864
2025-09-26 06:45:04,251 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0121 | Val rms_score: 0.4886
2025-09-26 06:45:11,312 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0124 | Val rms_score: 0.4853
2025-09-26 06:45:20,818 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0132 | Val rms_score: 0.4885
2025-09-26 06:45:21,548 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Test rms_score: 0.4430
2025-09-26 06:45:22,089 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset adme_microsom_stab_r at 2025-09-26_06-45-22
2025-09-26 06:45:30,641 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6287 | Val rms_score: 0.4885
2025-09-26 06:45:30,641 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Global step of best model: 68
2025-09-26 06:45:31,481 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4885
2025-09-26 06:45:38,562 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.3767 | Val rms_score: 0.4630
2025-09-26 06:45:38,764 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Global step of best model: 136
2025-09-26 06:45:39,396 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.4630
2025-09-26 06:45:49,112 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2773 | Val rms_score: 0.4768
2025-09-26 06:45:58,375 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2325 | Val rms_score: 0.4692
2025-09-26 06:46:05,048 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1555 | Val rms_score: 0.4836
2025-09-26 06:46:14,529 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1357 | Val rms_score: 0.4847
2025-09-26 06:46:23,826 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1034 | Val rms_score: 0.4849
2025-09-26 06:46:33,026 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0938 | Val rms_score: 0.4945
2025-09-26 06:46:40,043 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0866 | Val rms_score: 0.4829
2025-09-26 06:46:49,513 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0671 | Val rms_score: 0.4832
2025-09-26 06:46:59,080 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0635 | Val rms_score: 0.4872
2025-09-26 06:47:06,755 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0520 | Val rms_score: 0.4965
2025-09-26 06:47:16,410 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0503 | Val rms_score: 0.4854
2025-09-26 06:47:25,919 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0496 | Val rms_score: 0.4769
2025-09-26 06:47:36,390 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0523 | Val rms_score: 0.4754
2025-09-26 06:47:43,419 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0411 | Val rms_score: 0.4915
2025-09-26 06:47:53,446 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0374 | Val rms_score: 0.4853
2025-09-26 06:48:03,983 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0329 | Val rms_score: 0.4906
2025-09-26 06:48:11,890 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0340 | Val rms_score: 0.4827
2025-09-26 06:48:21,596 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0314 | Val rms_score: 0.4890
2025-09-26 06:48:31,017 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0360 | Val rms_score: 0.4883
2025-09-26 06:48:38,531 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0299 | Val rms_score: 0.4920
2025-09-26 06:48:47,289 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0289 | Val rms_score: 0.4831
2025-09-26 06:48:56,904 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0303 | Val rms_score: 0.4888
2025-09-26 06:49:04,271 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0264 | Val rms_score: 0.4856
2025-09-26 06:49:14,549 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0242 | Val rms_score: 0.4834
2025-09-26 06:49:25,128 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0266 | Val rms_score: 0.4842
2025-09-26 06:49:35,300 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0228 | Val rms_score: 0.4940
2025-09-26 06:49:42,721 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0236 | Val rms_score: 0.4902
2025-09-26 06:49:54,544 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0234 | Val rms_score: 0.4907
2025-09-26 06:50:04,329 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0233 | Val rms_score: 0.4843
2025-09-26 06:50:12,132 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0233 | Val rms_score: 0.4860
2025-09-26 06:50:21,947 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0236 | Val rms_score: 0.4885
2025-09-26 06:50:31,661 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0207 | Val rms_score: 0.4850
2025-09-26 06:50:39,170 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0218 | Val rms_score: 0.4842
2025-09-26 06:50:48,611 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0210 | Val rms_score: 0.4884
2025-09-26 06:50:58,032 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0248 | Val rms_score: 0.4935
2025-09-26 06:51:04,437 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0209 | Val rms_score: 0.4873
2025-09-26 06:51:13,608 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0197 | Val rms_score: 0.4851
2025-09-26 06:51:22,982 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0187 | Val rms_score: 0.4882
2025-09-26 06:51:32,044 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0194 | Val rms_score: 0.4806
2025-09-26 06:51:39,483 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0191 | Val rms_score: 0.4843
2025-09-26 06:51:48,666 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0196 | Val rms_score: 0.4926
2025-09-26 06:51:58,314 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0196 | Val rms_score: 0.4870
2025-09-26 06:52:06,448 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0173 | Val rms_score: 0.4844
2025-09-26 06:52:16,091 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0158 | Val rms_score: 0.4837
2025-09-26 06:52:25,109 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0172 | Val rms_score: 0.4916
2025-09-26 06:52:34,011 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0172 | Val rms_score: 0.4874
2025-09-26 06:52:40,669 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0187 | Val rms_score: 0.4871
2025-09-26 06:52:49,843 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0170 | Val rms_score: 0.4929
2025-09-26 06:52:59,133 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0168 | Val rms_score: 0.4899
2025-09-26 06:53:07,310 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0160 | Val rms_score: 0.4920
2025-09-26 06:53:17,469 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0133 | Val rms_score: 0.4870
2025-09-26 06:53:28,022 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0164 | Val rms_score: 0.4878
2025-09-26 06:53:35,775 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0172 | Val rms_score: 0.4882
2025-09-26 06:53:46,132 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0139 | Val rms_score: 0.4873
2025-09-26 06:53:56,592 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0165 | Val rms_score: 0.4925
2025-09-26 06:54:03,227 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0164 | Val rms_score: 0.4832
2025-09-26 06:54:13,741 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0173 | Val rms_score: 0.4837
2025-09-26 06:54:23,734 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0160 | Val rms_score: 0.4909
2025-09-26 06:54:31,467 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0160 | Val rms_score: 0.4859
2025-09-26 06:54:41,577 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0145 | Val rms_score: 0.4842
2025-09-26 06:54:51,403 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0154 | Val rms_score: 0.4842
2025-09-26 06:55:00,368 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0153 | Val rms_score: 0.4851
2025-09-26 06:55:07,249 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0162 | Val rms_score: 0.4815
2025-09-26 06:55:16,662 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0149 | Val rms_score: 0.4882
2025-09-26 06:55:26,630 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0155 | Val rms_score: 0.4845
2025-09-26 06:55:33,516 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0149 | Val rms_score: 0.4862
2025-09-26 06:55:42,672 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0151 | Val rms_score: 0.4858
2025-09-26 06:55:51,902 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0141 | Val rms_score: 0.4847
2025-09-26 06:56:01,263 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0163 | Val rms_score: 0.4889
2025-09-26 06:56:08,522 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0146 | Val rms_score: 0.4879
2025-09-26 06:56:18,395 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0145 | Val rms_score: 0.4910
2025-09-26 06:56:29,723 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0151 | Val rms_score: 0.4845
2025-09-26 06:56:36,360 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0146 | Val rms_score: 0.4885
2025-09-26 06:56:45,422 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0134 | Val rms_score: 0.4860
2025-09-26 06:56:55,056 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0138 | Val rms_score: 0.4839
2025-09-26 06:57:01,889 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0141 | Val rms_score: 0.4867
2025-09-26 06:57:11,429 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0138 | Val rms_score: 0.4847
2025-09-26 06:57:21,287 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0147 | Val rms_score: 0.4865
2025-09-26 06:57:31,662 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0149 | Val rms_score: 0.4891
2025-09-26 06:57:40,926 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0144 | Val rms_score: 0.4878
2025-09-26 06:57:51,704 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0133 | Val rms_score: 0.4822
2025-09-26 06:58:01,209 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0132 | Val rms_score: 0.4824
2025-09-26 06:58:10,536 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0134 | Val rms_score: 0.4876
2025-09-26 06:58:16,506 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0126 | Val rms_score: 0.4894
2025-09-26 06:58:26,606 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0146 | Val rms_score: 0.4913
2025-09-26 06:58:36,454 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0128 | Val rms_score: 0.4894
2025-09-26 06:58:45,394 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0132 | Val rms_score: 0.4849
2025-09-26 06:58:54,875 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0134 | Val rms_score: 0.4890
2025-09-26 06:59:04,364 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0132 | Val rms_score: 0.4875
2025-09-26 06:59:13,112 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0139 | Val rms_score: 0.4877
2025-09-26 06:59:23,816 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0127 | Val rms_score: 0.4861
2025-09-26 06:59:33,203 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0130 | Val rms_score: 0.4874
2025-09-26 06:59:42,924 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0128 | Val rms_score: 0.4898
2025-09-26 06:59:50,642 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0133 | Val rms_score: 0.4865
2025-09-26 07:00:00,677 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0128 | Val rms_score: 0.4843
2025-09-26 07:00:09,970 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0135 | Val rms_score: 0.4880
2025-09-26 07:00:17,299 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0142 | Val rms_score: 0.4866
2025-09-26 07:00:27,521 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0122 | Val rms_score: 0.4853
2025-09-26 07:00:28,456 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Test rms_score: 0.4345
2025-09-26 07:00:28,853 - logs_modchembert_adme_microsom_stab_r_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.4400, Std Dev: 0.0039
|
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_permeability_epochs100_batch_size32_20250926_070028.log
ADDED
@@ -0,0 +1,343 @@
2025-09-26 07:00:28,898 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Running benchmark for dataset: adme_permeability
2025-09-26 07:00:28,899 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - dataset: adme_permeability, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
2025-09-26 07:00:28,906 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset adme_permeability at 2025-09-26_07-00-28
2025-09-26 07:00:36,324 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5784 | Val rms_score: 0.4191
2025-09-26 07:00:36,324 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 67
2025-09-26 07:00:37,089 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4191
2025-09-26 07:00:45,219 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.3364 | Val rms_score: 0.3859
2025-09-26 07:00:45,423 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 134
2025-09-26 07:00:46,177 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3859
2025-09-26 07:00:55,944 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.0527 | Val rms_score: 0.3823
2025-09-26 07:00:56,192 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 201
2025-09-26 07:00:56,825 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.3823
2025-09-26 07:01:06,131 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1978 | Val rms_score: 0.3872
2025-09-26 07:01:12,449 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1464 | Val rms_score: 0.3854
2025-09-26 07:01:21,333 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0552 | Val rms_score: 0.3822
2025-09-26 07:01:21,944 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 402
2025-09-26 07:01:22,704 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 6 with val rms_score: 0.3822
2025-09-26 07:01:31,823 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0951 | Val rms_score: 0.3758
2025-09-26 07:01:32,045 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 469
2025-09-26 07:01:32,740 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val rms_score: 0.3758
2025-09-26 07:01:41,957 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0760 | Val rms_score: 0.3790
2025-09-26 07:01:48,034 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0485 | Val rms_score: 0.3810
2025-09-26 07:01:57,327 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0606 | Val rms_score: 0.3808
2025-09-26 07:02:07,131 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0600 | Val rms_score: 0.3996
2025-09-26 07:02:14,706 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0356 | Val rms_score: 0.3847
2025-09-26 07:02:23,479 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0452 | Val rms_score: 0.3737
2025-09-26 07:02:23,648 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 871
2025-09-26 07:02:24,488 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 13 with val rms_score: 0.3737
2025-09-26 07:02:33,724 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0452 | Val rms_score: 0.4321
2025-09-26 07:02:41,716 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0672 | Val rms_score: 0.3875
2025-09-26 07:02:51,464 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0485 | Val rms_score: 0.3881
2025-09-26 07:03:01,197 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0441 | Val rms_score: 0.3868
2025-09-26 07:03:10,448 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0311 | Val rms_score: 0.3852
2025-09-26 07:03:17,292 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0310 | Val rms_score: 0.3821
2025-09-26 07:03:25,995 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0314 | Val rms_score: 0.3841
2025-09-26 07:03:35,014 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0299 | Val rms_score: 0.3852
2025-09-26 07:03:42,447 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0261 | Val rms_score: 0.3792
2025-09-26 07:03:51,344 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0278 | Val rms_score: 0.3831
2025-09-26 07:04:00,515 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0242 | Val rms_score: 0.3884
2025-09-26 07:04:10,326 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0231 | Val rms_score: 0.3852
2025-09-26 07:04:17,802 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0240 | Val rms_score: 0.3864
2025-09-26 07:04:27,686 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0234 | Val rms_score: 0.3903
2025-09-26 07:04:37,387 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0315 | Val rms_score: 0.3789
2025-09-26 07:04:45,005 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0233 | Val rms_score: 0.3843
2025-09-26 07:04:55,781 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0198 | Val rms_score: 0.3813
2025-09-26 07:05:04,874 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0208 | Val rms_score: 0.3840
2025-09-26 07:05:12,612 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0202 | Val rms_score: 0.3865
2025-09-26 07:05:21,520 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0162 | Val rms_score: 0.3848
2025-09-26 07:05:30,902 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0182 | Val rms_score: 0.3793
2025-09-26 07:05:40,547 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0175 | Val rms_score: 0.3784
2025-09-26 07:05:47,957 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0176 | Val rms_score: 0.3815
2025-09-26 07:05:58,508 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0161 | Val rms_score: 0.3763
2025-09-26 07:06:07,721 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0162 | Val rms_score: 0.3797
2025-09-26 07:06:14,482 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0186 | Val rms_score: 0.3754
2025-09-26 07:06:23,783 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0160 | Val rms_score: 0.3755
2025-09-26 07:06:33,164 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0168 | Val rms_score: 0.3810
2025-09-26 07:06:40,948 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0151 | Val rms_score: 0.3798
2025-09-26 07:06:50,701 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0149 | Val rms_score: 0.3783
2025-09-26 07:07:00,322 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0143 | Val rms_score: 0.3778
2025-09-26 07:07:11,023 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0140 | Val rms_score: 0.3853
2025-09-26 07:07:18,067 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0146 | Val rms_score: 0.3783
2025-09-26 07:07:28,174 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0139 | Val rms_score: 0.3782
2025-09-26 07:07:37,557 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0125 | Val rms_score: 0.3814
2025-09-26 07:07:44,299 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0153 | Val rms_score: 0.3826
2025-09-26 07:07:53,705 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0138 | Val rms_score: 0.3789
2025-09-26 07:08:02,864 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0140 | Val rms_score: 0.3790
2025-09-26 07:08:10,236 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0136 | Val rms_score: 0.3755
2025-09-26 07:08:19,638 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0141 | Val rms_score: 0.3753
2025-09-26 07:08:28,887 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0145 | Val rms_score: 0.3785
2025-09-26 07:08:38,047 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0166 | Val rms_score: 0.3772
2025-09-26 07:08:44,749 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0162 | Val rms_score: 0.3838
2025-09-26 07:08:54,656 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0144 | Val rms_score: 0.3795
2025-09-26 07:09:03,525 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0133 | Val rms_score: 0.3742
2025-09-26 07:09:09,712 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0138 | Val rms_score: 0.3762
2025-09-26 07:09:19,866 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0115 | Val rms_score: 0.3784
2025-09-26 07:09:29,185 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0129 | Val rms_score: 0.3775
2025-09-26 07:09:38,862 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0111 | Val rms_score: 0.3790
2025-09-26 07:09:44,588 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0123 | Val rms_score: 0.3787
2025-09-26 07:09:53,130 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0123 | Val rms_score: 0.3789
2025-09-26 07:10:02,464 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0123 | Val rms_score: 0.3813
2025-09-26 07:10:08,995 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0119 | Val rms_score: 0.3783
2025-09-26 07:10:18,876 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0121 | Val rms_score: 0.3786
2025-09-26 07:10:27,690 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0110 | Val rms_score: 0.3772
2025-09-26 07:10:36,568 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0116 | Val rms_score: 0.3730
2025-09-26 07:10:36,752 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 4623
2025-09-26 07:10:37,638 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 69 with val rms_score: 0.3730
2025-09-26 07:10:44,063 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0114 | Val rms_score: 0.3766
2025-09-26 07:10:53,203 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0118 | Val rms_score: 0.3767
2025-09-26 07:11:03,051 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0103 | Val rms_score: 0.3755
2025-09-26 07:11:09,394 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0116 | Val rms_score: 0.3751
2025-09-26 07:11:18,331 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0111 | Val rms_score: 0.3761
2025-09-26 07:11:28,488 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0110 | Val rms_score: 0.3784
2025-09-26 07:11:37,316 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0109 | Val rms_score: 0.3787
2025-09-26 07:11:44,659 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0105 | Val rms_score: 0.3774
2025-09-26 07:11:54,235 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0104 | Val rms_score: 0.3763
2025-09-26 07:12:03,140 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0110 | Val rms_score: 0.3791
2025-09-26 07:12:10,287 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0128 | Val rms_score: 0.3732
2025-09-26 07:12:19,756 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0114 | Val rms_score: 0.3749
2025-09-26 07:12:29,716 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0112 | Val rms_score: 0.3804
2025-09-26 07:12:36,491 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0108 | Val rms_score: 0.3781
2025-09-26 07:12:45,800 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0111 | Val rms_score: 0.3764
2025-09-26 07:12:55,031 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0103 | Val rms_score: 0.3765
2025-09-26 07:13:04,470 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0104 | Val rms_score: 0.3764
2025-09-26 07:13:12,066 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0102 | Val rms_score: 0.3754
2025-09-26 07:13:21,482 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0101 | Val rms_score: 0.3785
2025-09-26 07:13:30,763 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0112 | Val rms_score: 0.3795
2025-09-26 07:13:39,385 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0101 | Val rms_score: 0.3785
2025-09-26 07:13:49,506 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0108 | Val rms_score: 0.3790
2025-09-26 07:13:59,703 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0109 | Val rms_score: 0.3777
2025-09-26 07:14:06,440 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0097 | Val rms_score: 0.3762
2025-09-26 07:14:16,493 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0102 | Val rms_score: 0.3764
2025-09-26 07:14:26,104 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0099 | Val rms_score: 0.3756
2025-09-26 07:14:35,360 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0093 | Val rms_score: 0.3768
2025-09-26 07:14:42,721 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0097 | Val rms_score: 0.3770
2025-09-26 07:14:51,989 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0102 | Val rms_score: 0.3787
2025-09-26 07:15:01,303 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0105 | Val rms_score: 0.3750
2025-09-26 07:15:07,987 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0103 | Val rms_score: 0.3770
2025-09-26 07:15:08,861 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Test rms_score: 0.4956
2025-09-26 07:15:09,370 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset adme_permeability at 2025-09-26_07-15-09
2025-09-26 07:15:17,234 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5224 | Val rms_score: 0.4444
2025-09-26 07:15:17,234 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 67
2025-09-26 07:15:17,968 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4444
2025-09-26 07:15:27,428 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.3309 | Val rms_score: 0.3877
2025-09-26 07:15:27,639 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 134
2025-09-26 07:15:28,372 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3877
2025-09-26 07:15:37,452 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.0137 | Val rms_score: 0.3770
2025-09-26 07:15:35,196 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 201
2025-09-26 07:15:36,007 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.3770
2025-09-26 07:15:45,884 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1800 | Val rms_score: 0.3895
2025-09-26 07:15:54,897 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1536 | Val rms_score: 0.4444
2025-09-26 07:16:04,096 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0801 | Val rms_score: 0.3896
2025-09-26 07:16:11,766 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0993 | Val rms_score: 0.3879
2025-09-26 07:16:20,982 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0924 | Val rms_score: 0.3610
2025-09-26 07:16:21,182 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 536
2025-09-26 07:16:22,044 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val rms_score: 0.3610
2025-09-26 07:16:31,426 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0801 | Val rms_score: 0.3586
2025-09-26 07:16:31,739 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 603
2025-09-26 07:16:32,513 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 9 with val rms_score: 0.3586
2025-09-26 07:16:39,616 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0700 | Val rms_score: 0.3613
2025-09-26 07:16:48,839 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0591 | Val rms_score: 0.3916
2025-09-26 07:16:59,164 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0459 | Val rms_score: 0.3847
2025-09-26 07:17:06,305 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0525 | Val rms_score: 0.3831
2025-09-26 07:17:16,221 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0520 | Val rms_score: 0.3709
2025-09-26 07:17:27,049 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0361 | Val rms_score: 0.3748
2025-09-26 07:17:36,303 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0389 | Val rms_score: 0.3778
2025-09-26 07:17:43,265 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0383 | Val rms_score: 0.3696
2025-09-26 07:17:51,996 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0249 | Val rms_score: 0.3780
2025-09-26 07:18:01,157 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0322 | Val rms_score: 0.3696
2025-09-26 07:18:07,921 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0314 | Val rms_score: 0.3738
2025-09-26 07:18:17,668 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0312 | Val rms_score: 0.3735
2025-09-26 07:18:27,548 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0286 | Val rms_score: 0.3936
2025-09-26 07:18:34,234 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0394 | Val rms_score: 0.3756
|
| 153 |
+
2025-09-26 07:18:43,333 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0281 | Val rms_score: 0.3848
|
| 154 |
+
2025-09-26 07:18:52,636 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0279 | Val rms_score: 0.3752
|
| 155 |
+
2025-09-26 07:19:02,539 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0255 | Val rms_score: 0.3703
|
| 156 |
+
2025-09-26 07:19:10,168 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0239 | Val rms_score: 0.3705
|
| 157 |
+
2025-09-26 07:19:19,939 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0368 | Val rms_score: 0.3783
|
| 158 |
+
2025-09-26 07:19:29,388 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0263 | Val rms_score: 0.3740
|
| 159 |
+
2025-09-26 07:19:37,130 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0297 | Val rms_score: 0.3738
|
| 160 |
+
2025-09-26 07:19:46,421 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0259 | Val rms_score: 0.3746
|
| 161 |
+
2025-09-26 07:19:56,094 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0209 | Val rms_score: 0.3725
|
| 162 |
+
2025-09-26 07:20:05,098 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0188 | Val rms_score: 0.3724
|
| 163 |
+
2025-09-26 07:20:12,102 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0201 | Val rms_score: 0.3728
|
| 164 |
+
2025-09-26 07:20:21,311 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0182 | Val rms_score: 0.3729
|
| 165 |
+
2025-09-26 07:20:30,504 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0154 | Val rms_score: 0.3708
|
| 166 |
+
2025-09-26 07:20:38,464 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0166 | Val rms_score: 0.3737
|
| 167 |
+
2025-09-26 07:20:48,030 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0182 | Val rms_score: 0.3702
|
| 168 |
+
2025-09-26 07:20:57,087 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0176 | Val rms_score: 0.3735
|
| 169 |
+
2025-09-26 07:21:04,068 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0194 | Val rms_score: 0.3722
|
| 170 |
+
2025-09-26 07:21:13,941 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0168 | Val rms_score: 0.3713
|
| 171 |
+
2025-09-26 07:21:23,748 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0171 | Val rms_score: 0.3726
|
| 172 |
+
2025-09-26 07:21:33,180 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0157 | Val rms_score: 0.3707
|
| 173 |
+
2025-09-26 07:21:40,261 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0164 | Val rms_score: 0.3716
|
| 174 |
+
2025-09-26 07:21:51,746 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0152 | Val rms_score: 0.3704
|
| 175 |
+
2025-09-26 07:22:01,919 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0152 | Val rms_score: 0.3712
2025-09-26 07:22:09,195 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0148 | Val rms_score: 0.3711
2025-09-26 07:22:18,826 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0157 | Val rms_score: 0.3727
2025-09-26 07:22:28,402 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0139 | Val rms_score: 0.3710
2025-09-26 07:22:35,057 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0145 | Val rms_score: 0.3704
2025-09-26 07:22:44,727 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0149 | Val rms_score: 0.3712
2025-09-26 07:22:54,465 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0142 | Val rms_score: 0.3723
2025-09-26 07:23:03,617 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0138 | Val rms_score: 0.3743
2025-09-26 07:23:10,625 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0133 | Val rms_score: 0.3704
2025-09-26 07:23:20,913 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0131 | Val rms_score: 0.3741
2025-09-26 07:23:31,143 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0142 | Val rms_score: 0.3690
2025-09-26 07:23:38,764 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0126 | Val rms_score: 0.3701
2025-09-26 07:23:48,423 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0128 | Val rms_score: 0.3728
2025-09-26 07:23:57,991 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0147 | Val rms_score: 0.3689
2025-09-26 07:24:06,442 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0126 | Val rms_score: 0.3701
2025-09-26 07:24:16,045 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0141 | Val rms_score: 0.3742
2025-09-26 07:24:26,496 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0120 | Val rms_score: 0.3689
2025-09-26 07:24:33,240 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0126 | Val rms_score: 0.3712
2025-09-26 07:24:42,643 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0120 | Val rms_score: 0.3701
2025-09-26 07:24:52,172 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0114 | Val rms_score: 0.3700
2025-09-26 07:25:01,908 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0131 | Val rms_score: 0.3680
2025-09-26 07:25:09,663 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0127 | Val rms_score: 0.3686
2025-09-26 07:25:19,609 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0121 | Val rms_score: 0.3699
2025-09-26 07:25:29,662 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0115 | Val rms_score: 0.3712
2025-09-26 07:25:36,712 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0116 | Val rms_score: 0.3716
2025-09-26 07:25:46,698 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0120 | Val rms_score: 0.3693
2025-09-26 07:25:55,930 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0125 | Val rms_score: 0.3674
2025-09-26 07:26:02,438 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0115 | Val rms_score: 0.3722
2025-09-26 07:26:11,208 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0116 | Val rms_score: 0.3737
2025-09-26 07:26:21,779 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0108 | Val rms_score: 0.3690
2025-09-26 07:26:30,597 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0122 | Val rms_score: 0.3665
2025-09-26 07:26:37,554 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0166 | Val rms_score: 0.3732
2025-09-26 07:26:46,343 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0131 | Val rms_score: 0.3729
2025-09-26 07:26:55,688 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0138 | Val rms_score: 0.3699
2025-09-26 07:27:02,389 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0113 | Val rms_score: 0.3675
2025-09-26 07:27:11,836 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0125 | Val rms_score: 0.3661
2025-09-26 07:27:21,715 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0121 | Val rms_score: 0.3665
2025-09-26 07:27:31,063 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0124 | Val rms_score: 0.3678
2025-09-26 07:27:38,098 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0115 | Val rms_score: 0.3654
2025-09-26 07:27:47,875 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0115 | Val rms_score: 0.3669
2025-09-26 07:27:58,186 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0112 | Val rms_score: 0.3668
2025-09-26 07:28:06,158 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0105 | Val rms_score: 0.3653
2025-09-26 07:28:15,229 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0108 | Val rms_score: 0.3653
2025-09-26 07:28:24,717 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0103 | Val rms_score: 0.3672
2025-09-26 07:28:32,515 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0101 | Val rms_score: 0.3659
2025-09-26 07:28:41,609 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0101 | Val rms_score: 0.3704
2025-09-26 07:28:51,637 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0144 | Val rms_score: 0.3593
2025-09-26 07:29:01,202 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0106 | Val rms_score: 0.3629
2025-09-26 07:29:08,798 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0115 | Val rms_score: 0.3635
2025-09-26 07:29:19,080 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0106 | Val rms_score: 0.3642
2025-09-26 07:29:29,064 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0109 | Val rms_score: 0.3658
2025-09-26 07:29:36,925 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0106 | Val rms_score: 0.3667
2025-09-26 07:29:46,439 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0108 | Val rms_score: 0.3642
2025-09-26 07:29:55,880 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0104 | Val rms_score: 0.3669
2025-09-26 07:30:02,468 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0106 | Val rms_score: 0.3663
2025-09-26 07:30:03,293 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Test rms_score: 0.4804
2025-09-26 07:30:03,759 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset adme_permeability at 2025-09-26_07-30-03
2025-09-26 07:30:11,780 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.5112 | Val rms_score: 0.4157
2025-09-26 07:30:11,780 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 67
2025-09-26 07:30:12,620 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4157
2025-09-26 07:30:21,821 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.2978 | Val rms_score: 0.3889
2025-09-26 07:30:22,037 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 134
2025-09-26 07:30:22,697 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3889
2025-09-26 07:30:29,331 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.0498 | Val rms_score: 0.3880
2025-09-26 07:30:29,605 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 201
2025-09-26 07:30:30,253 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.3880
2025-09-26 07:30:40,647 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2052 | Val rms_score: 0.3793
2025-09-26 07:30:40,862 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 268
2025-09-26 07:30:41,595 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.3793
2025-09-26 07:30:50,468 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1500 | Val rms_score: 0.3810
2025-09-26 07:30:59,930 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0349 | Val rms_score: 0.3810
2025-09-26 07:31:06,775 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0970 | Val rms_score: 0.3720
2025-09-26 07:31:06,985 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Global step of best model: 469
2025-09-26 07:31:07,623 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val rms_score: 0.3720
2025-09-26 07:31:16,426 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0738 | Val rms_score: 0.3762
2025-09-26 07:31:25,814 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0690 | Val rms_score: 0.3905
2025-09-26 07:31:32,001 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0928 | Val rms_score: 0.3875
2025-09-26 07:31:40,664 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0790 | Val rms_score: 0.3970
2025-09-26 07:31:50,463 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0923 | Val rms_score: 0.3874
2025-09-26 07:31:59,196 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0784 | Val rms_score: 0.3907
2025-09-26 07:32:05,006 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0526 | Val rms_score: 0.3903
2025-09-26 07:32:14,938 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0357 | Val rms_score: 0.3945
2025-09-26 07:32:24,420 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0431 | Val rms_score: 0.3844
2025-09-26 07:32:31,204 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0421 | Val rms_score: 0.3938
2025-09-26 07:32:39,329 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0439 | Val rms_score: 0.3881
2025-09-26 07:32:48,170 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0347 | Val rms_score: 0.3890
2025-09-26 07:32:56,683 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0312 | Val rms_score: 0.3867
2025-09-26 07:33:03,503 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0213 | Val rms_score: 0.3913
2025-09-26 07:33:13,552 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0301 | Val rms_score: 0.3884
2025-09-26 07:33:22,463 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0295 | Val rms_score: 0.3937
2025-09-26 07:33:29,671 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0292 | Val rms_score: 0.3882
2025-09-26 07:33:38,873 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0259 | Val rms_score: 0.3867
2025-09-26 07:33:48,667 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0255 | Val rms_score: 0.3859
2025-09-26 07:33:58,178 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0266 | Val rms_score: 0.3873
2025-09-26 07:34:05,966 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0234 | Val rms_score: 0.3896
2025-09-26 07:34:15,360 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0217 | Val rms_score: 0.3879
2025-09-26 07:34:26,091 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0186 | Val rms_score: 0.3853
2025-09-26 07:34:33,062 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0217 | Val rms_score: 0.3874
2025-09-26 07:34:42,714 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0214 | Val rms_score: 0.3860
2025-09-26 07:34:51,457 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0178 | Val rms_score: 0.3858
2025-09-26 07:34:58,292 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0194 | Val rms_score: 0.3908
2025-09-26 07:35:07,819 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0175 | Val rms_score: 0.3935
2025-09-26 07:35:17,760 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0254 | Val rms_score: 0.3978
2025-09-26 07:35:27,530 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0303 | Val rms_score: 0.3851
2025-09-26 07:35:34,438 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0219 | Val rms_score: 0.3837
2025-09-26 07:35:43,557 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0181 | Val rms_score: 0.3869
2025-09-26 07:35:52,992 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0175 | Val rms_score: 0.3870
2025-09-26 07:35:59,255 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0179 | Val rms_score: 0.3860
2025-09-26 07:36:09,050 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0157 | Val rms_score: 0.3834
2025-09-26 07:36:18,322 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0169 | Val rms_score: 0.3864
2025-09-26 07:36:27,413 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0160 | Val rms_score: 0.3829
2025-09-26 07:36:35,547 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0165 | Val rms_score: 0.3882
2025-09-26 07:36:45,303 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0159 | Val rms_score: 0.3836
2025-09-26 07:36:55,166 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0163 | Val rms_score: 0.3843
2025-09-26 07:37:01,804 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0164 | Val rms_score: 0.3837
2025-09-26 07:37:11,427 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0162 | Val rms_score: 0.3825
2025-09-26 07:37:20,890 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0155 | Val rms_score: 0.3937
2025-09-26 07:37:28,132 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0177 | Val rms_score: 0.3855
2025-09-26 07:37:37,543 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0174 | Val rms_score: 0.3861
2025-09-26 07:37:46,594 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0230 | Val rms_score: 0.3841
2025-09-26 07:37:55,346 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0186 | Val rms_score: 0.3828
2025-09-26 07:38:02,717 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0180 | Val rms_score: 0.3844
2025-09-26 07:38:11,833 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0172 | Val rms_score: 0.3817
2025-09-26 07:38:22,373 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0182 | Val rms_score: 0.3885
2025-09-26 07:38:29,173 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0187 | Val rms_score: 0.3857
2025-09-26 07:38:38,231 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0166 | Val rms_score: 0.3848
2025-09-26 07:38:49,102 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0170 | Val rms_score: 0.3817
2025-09-26 07:38:56,129 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0153 | Val rms_score: 0.3839
2025-09-26 07:39:05,625 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0140 | Val rms_score: 0.3827
2025-09-26 07:39:14,239 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0122 | Val rms_score: 0.3801
2025-09-26 07:39:23,067 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0138 | Val rms_score: 0.3809
2025-09-26 07:39:29,703 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0144 | Val rms_score: 0.3804
2025-09-26 07:39:39,425 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0157 | Val rms_score: 0.3818
2025-09-26 07:39:49,735 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0133 | Val rms_score: 0.3809
2025-09-26 07:39:56,924 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0126 | Val rms_score: 0.3789
2025-09-26 07:40:05,910 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0112 | Val rms_score: 0.3802
2025-09-26 07:40:15,541 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0120 | Val rms_score: 0.3780
2025-09-26 07:40:24,718 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0119 | Val rms_score: 0.3801
2025-09-26 07:40:32,560 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0133 | Val rms_score: 0.3820
2025-09-26 07:40:42,183 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0120 | Val rms_score: 0.3829
2025-09-26 07:40:50,963 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0125 | Val rms_score: 0.3793
2025-09-26 07:40:58,879 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0107 | Val rms_score: 0.3815
2025-09-26 07:41:07,882 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0130 | Val rms_score: 0.3862
2025-09-26 07:41:17,326 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0253 | Val rms_score: 0.3837
2025-09-26 07:41:23,784 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0159 | Val rms_score: 0.3781
2025-09-26 07:41:32,795 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0166 | Val rms_score: 0.3748
2025-09-26 07:41:41,522 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0152 | Val rms_score: 0.3760
2025-09-26 07:41:50,219 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0126 | Val rms_score: 0.3759
2025-09-26 07:41:57,048 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0132 | Val rms_score: 0.3789
2025-09-26 07:42:05,906 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0134 | Val rms_score: 0.3776
2025-09-26 07:42:14,729 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0115 | Val rms_score: 0.3769
2025-09-26 07:42:23,806 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0121 | Val rms_score: 0.3770
2025-09-26 07:42:30,642 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0125 | Val rms_score: 0.3800
2025-09-26 07:42:40,001 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0114 | Val rms_score: 0.3786
2025-09-26 07:42:49,472 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0121 | Val rms_score: 0.3779
2025-09-26 07:42:56,003 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0120 | Val rms_score: 0.3780
2025-09-26 07:43:06,060 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0118 | Val rms_score: 0.3800
2025-09-26 07:43:15,530 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0115 | Val rms_score: 0.3783
2025-09-26 07:43:22,463 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0111 | Val rms_score: 0.3768
2025-09-26 07:43:31,992 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0126 | Val rms_score: 0.3780
2025-09-26 07:43:41,066 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0115 | Val rms_score: 0.3783
2025-09-26 07:43:50,494 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0120 | Val rms_score: 0.3791
2025-09-26 07:43:57,043 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0118 | Val rms_score: 0.3778
2025-09-26 07:44:07,183 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0149 | Val rms_score: 0.3766
2025-09-26 07:44:17,103 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0118 | Val rms_score: 0.3768
2025-09-26 07:44:24,472 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0115 | Val rms_score: 0.3782
2025-09-26 07:44:34,052 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0111 | Val rms_score: 0.3764
2025-09-26 07:44:34,794 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Test rms_score: 0.4938
2025-09-26 07:44:35,283 - logs_modchembert_adme_permeability_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.4899, Std Dev: 0.0068
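The final summary line above averages the per-run Test rms_score values over the three triplicate runs. A minimal sketch of that aggregation, assuming the logged Std Dev is a population (N-divisor) standard deviation; the first score below is a placeholder, since run 1's test score does not appear in this excerpt (only 0.4804 and 0.4938 are shown):

```python
import statistics

# Illustrative triplicate test scores; runs 2 and 3 (0.4804, 0.4938) appear
# in this log excerpt, the first value is a hypothetical stand-in for run 1.
test_scores = [0.49, 0.4804, 0.4938]

avg = statistics.mean(test_scores)    # average over the three runs
std = statistics.pstdev(test_scores)  # population (N-divisor) std dev

print(f"Final Triplicate Test Results — Avg rms_score: {avg:.4f}, Std Dev: {std:.4f}")
```

With the actual three test scores substituted in, this reproduces the Avg/Std Dev summary format used by the logger.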
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_ppb_h_epochs100_batch_size32_20250926_075041.log
ADDED
@@ -0,0 +1,315 @@
2025-09-26 07:50:41,642 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Running benchmark for dataset: adme_ppb_h
2025-09-26 07:50:41,642 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - dataset: adme_ppb_h, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
2025-09-26 07:50:41,650 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset adme_ppb_h at 2025-09-26_07-50-41
2025-09-26 07:50:49,499 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.8750 | Val rms_score: 0.5077
2025-09-26 07:50:49,499 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Global step of best model: 5
2025-09-26 07:50:50,307 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.5077
2025-09-26 07:50:52,548 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4250 | Val rms_score: 0.5747
2025-09-26 07:50:54,604 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2188 | Val rms_score: 0.6526
2025-09-26 07:50:56,788 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1781 | Val rms_score: 0.6461
2025-09-26 07:50:59,400 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1141 | Val rms_score: 0.6382
2025-09-26 07:51:01,760 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0867 | Val rms_score: 0.6298
2025-09-26 07:51:04,884 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0637 | Val rms_score: 0.6311
2025-09-26 07:51:07,805 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0443 | Val rms_score: 0.6493
2025-09-26 07:51:10,550 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0350 | Val rms_score: 0.6538
2025-09-26 07:51:13,471 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0254 | Val rms_score: 0.6713
2025-09-26 07:51:15,813 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0246 | Val rms_score: 0.6821
2025-09-26 07:51:18,893 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0170 | Val rms_score: 0.6609
2025-09-26 07:51:21,770 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0156 | Val rms_score: 0.6766
|
| 19 |
+
2025-09-26 07:51:24,731 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0141 | Val rms_score: 0.6746
|
| 20 |
+
2025-09-26 07:51:27,664 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0107 | Val rms_score: 0.6656
|
| 21 |
+
2025-09-26 07:51:30,236 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0104 | Val rms_score: 0.6706
|
| 22 |
+
2025-09-26 07:51:33,415 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0091 | Val rms_score: 0.6836
|
| 23 |
+
2025-09-26 07:51:36,156 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0082 | Val rms_score: 0.6771
|
| 24 |
+
2025-09-26 07:51:38,604 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0088 | Val rms_score: 0.6760
|
| 25 |
+
2025-09-26 07:51:41,075 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0090 | Val rms_score: 0.6835
|
| 26 |
+
2025-09-26 07:51:43,799 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0072 | Val rms_score: 0.6681
|
| 27 |
+
2025-09-26 07:51:46,697 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0074 | Val rms_score: 0.6731
|
| 28 |
+
2025-09-26 07:51:49,285 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0061 | Val rms_score: 0.6754
|
| 29 |
+
2025-09-26 07:51:52,237 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0064 | Val rms_score: 0.6774
|
| 30 |
+
2025-09-26 07:51:55,105 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0060 | Val rms_score: 0.6851
|
| 31 |
+
2025-09-26 07:51:58,060 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0057 | Val rms_score: 0.6791
|
| 32 |
+
2025-09-26 07:52:01,045 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0049 | Val rms_score: 0.6812
|
| 33 |
+
2025-09-26 07:52:03,721 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0058 | Val rms_score: 0.6800
|
| 34 |
+
2025-09-26 07:52:06,669 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0052 | Val rms_score: 0.6783
|
| 35 |
+
2025-09-26 07:52:09,321 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0054 | Val rms_score: 0.6825
|
| 36 |
+
2025-09-26 07:52:11,931 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0050 | Val rms_score: 0.6747
|
| 37 |
+
2025-09-26 07:52:14,924 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0047 | Val rms_score: 0.6773
|
| 38 |
+
2025-09-26 07:52:17,532 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0043 | Val rms_score: 0.6772
|
| 39 |
+
2025-09-26 07:52:20,132 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0039 | Val rms_score: 0.6760
|
| 40 |
+
2025-09-26 07:52:22,860 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0046 | Val rms_score: 0.6769
|
| 41 |
+
2025-09-26 07:52:25,577 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0047 | Val rms_score: 0.6786
|
| 42 |
+
2025-09-26 07:52:28,459 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0039 | Val rms_score: 0.6779
|
| 43 |
+
2025-09-26 07:52:31,163 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0049 | Val rms_score: 0.6858
|
| 44 |
+
2025-09-26 07:52:33,804 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0056 | Val rms_score: 0.6751
|
| 45 |
+
2025-09-26 07:52:36,555 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0052 | Val rms_score: 0.6795
|
| 46 |
+
2025-09-26 07:52:39,204 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0046 | Val rms_score: 0.6692
|
| 47 |
+
2025-09-26 07:52:42,304 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0050 | Val rms_score: 0.6881
|
| 48 |
+
2025-09-26 07:52:45,054 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0056 | Val rms_score: 0.6710
|
| 49 |
+
2025-09-26 07:52:47,854 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0040 | Val rms_score: 0.6727
|
| 50 |
+
2025-09-26 07:52:50,753 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0040 | Val rms_score: 0.6726
|
| 51 |
+
2025-09-26 07:52:53,385 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0042 | Val rms_score: 0.6768
|
| 52 |
+
2025-09-26 07:52:56,312 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0040 | Val rms_score: 0.6733
|
| 53 |
+
2025-09-26 07:52:58,958 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0044 | Val rms_score: 0.6802
|
| 54 |
+
2025-09-26 07:53:01,515 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0039 | Val rms_score: 0.6825
|
| 55 |
+
2025-09-26 07:53:04,217 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0041 | Val rms_score: 0.6670
|
| 56 |
+
2025-09-26 07:53:06,886 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0042 | Val rms_score: 0.6799
|
| 57 |
+
2025-09-26 07:53:09,990 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0045 | Val rms_score: 0.6702
|
| 58 |
+
2025-09-26 07:53:12,736 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0043 | Val rms_score: 0.6677
|
| 59 |
+
2025-09-26 07:53:15,639 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0031 | Val rms_score: 0.6711
|
| 60 |
+
2025-09-26 07:53:18,555 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0038 | Val rms_score: 0.6671
|
| 61 |
+
2025-09-26 07:53:21,420 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0042 | Val rms_score: 0.6658
|
| 62 |
+
2025-09-26 07:53:24,926 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0041 | Val rms_score: 0.6820
|
| 63 |
+
2025-09-26 07:53:27,707 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0042 | Val rms_score: 0.6619
|
| 64 |
+
2025-09-26 07:53:30,527 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0053 | Val rms_score: 0.6798
|
| 65 |
+
2025-09-26 07:53:33,324 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0051 | Val rms_score: 0.6677
|
| 66 |
+
2025-09-26 07:53:36,384 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0043 | Val rms_score: 0.6818
|
| 67 |
+
2025-09-26 07:53:39,460 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0054 | Val rms_score: 0.6705
|
| 68 |
+
2025-09-26 07:53:41,781 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0037 | Val rms_score: 0.6748
|
| 69 |
+
2025-09-26 07:53:44,049 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0043 | Val rms_score: 0.6715
|
| 70 |
+
2025-09-26 07:53:46,223 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0040 | Val rms_score: 0.6721
|
| 71 |
+
2025-09-26 07:53:48,641 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0034 | Val rms_score: 0.6663
|
| 72 |
+
2025-09-26 07:53:51,312 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0043 | Val rms_score: 0.6681
|
| 73 |
+
2025-09-26 07:53:53,618 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0041 | Val rms_score: 0.6735
|
| 74 |
+
2025-09-26 07:53:55,892 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0038 | Val rms_score: 0.6683
|
| 75 |
+
2025-09-26 07:53:58,391 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0040 | Val rms_score: 0.6711
|
| 76 |
+
2025-09-26 07:54:00,703 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0043 | Val rms_score: 0.6677
|
| 77 |
+
2025-09-26 07:54:03,162 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0048 | Val rms_score: 0.6687
|
| 78 |
+
2025-09-26 07:54:05,447 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0041 | Val rms_score: 0.6602
|
| 79 |
+
2025-09-26 07:54:07,656 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0048 | Val rms_score: 0.6805
|
| 80 |
+
2025-09-26 07:54:10,002 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0033 | Val rms_score: 0.6630
|
| 81 |
+
2025-09-26 07:54:12,317 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0046 | Val rms_score: 0.6723
|
| 82 |
+
2025-09-26 07:54:14,813 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0041 | Val rms_score: 0.6656
|
| 83 |
+
2025-09-26 07:54:17,158 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0038 | Val rms_score: 0.6710
|
| 84 |
+
2025-09-26 07:54:19,415 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0049 | Val rms_score: 0.6661
|
| 85 |
+
2025-09-26 07:54:21,688 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0038 | Val rms_score: 0.6675
|
| 86 |
+
2025-09-26 07:54:23,926 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0031 | Val rms_score: 0.6595
|
| 87 |
+
2025-09-26 07:54:26,440 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0034 | Val rms_score: 0.6637
|
| 88 |
+
2025-09-26 07:54:28,636 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0030 | Val rms_score: 0.6686
|
| 89 |
+
2025-09-26 07:54:30,817 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0032 | Val rms_score: 0.6618
|
| 90 |
+
2025-09-26 07:54:33,001 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0037 | Val rms_score: 0.6610
|
| 91 |
+
2025-09-26 07:54:35,208 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0039 | Val rms_score: 0.6684
|
| 92 |
+
2025-09-26 07:54:37,700 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0035 | Val rms_score: 0.6655
|
| 93 |
+
2025-09-26 07:54:39,968 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0034 | Val rms_score: 0.6678
|
| 94 |
+
2025-09-26 07:54:42,163 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0042 | Val rms_score: 0.6662
|
| 95 |
+
2025-09-26 07:54:44,462 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0034 | Val rms_score: 0.6700
|
| 96 |
+
2025-09-26 07:54:46,692 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0031 | Val rms_score: 0.6597
|
| 97 |
+
2025-09-26 07:54:49,463 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0040 | Val rms_score: 0.6778
|
| 98 |
+
2025-09-26 07:54:51,779 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0041 | Val rms_score: 0.6519
|
| 99 |
+
2025-09-26 07:54:53,986 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0034 | Val rms_score: 0.6792
|
| 100 |
+
2025-09-26 07:54:56,236 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0047 | Val rms_score: 0.6689
|
| 101 |
+
2025-09-26 07:54:58,469 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0044 | Val rms_score: 0.6667
|
| 102 |
+
2025-09-26 07:55:01,088 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0029 | Val rms_score: 0.6636
|
| 103 |
+
2025-09-26 07:55:03,294 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0040 | Val rms_score: 0.6602
|
| 104 |
+
2025-09-26 07:55:05,486 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0030 | Val rms_score: 0.6594
|
| 105 |
+
2025-09-26 07:55:07,700 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0030 | Val rms_score: 0.6537
|
| 106 |
+
2025-09-26 07:55:08,191 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Test rms_score: 0.9063
|
| 107 |
+
2025-09-26 07:55:08,474 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset adme_ppb_h at 2025-09-26_07-55-08
|
| 108 |
+
2025-09-26 07:55:10,325 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.8375 | Val rms_score: 0.5057
|
| 109 |
+
2025-09-26 07:55:10,325 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Global step of best model: 5
|
| 110 |
+
2025-09-26 07:55:10,950 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.5057
|
| 111 |
+
2025-09-26 07:55:13,398 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4156 | Val rms_score: 0.6027
|
| 112 |
+
2025-09-26 07:55:15,668 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2219 | Val rms_score: 0.6164
|
| 113 |
+
2025-09-26 07:55:17,997 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1688 | Val rms_score: 0.6507
|
| 114 |
+
2025-09-26 07:55:20,366 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1094 | Val rms_score: 0.6216
|
| 115 |
+
2025-09-26 07:55:22,728 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0805 | Val rms_score: 0.6379
|
| 116 |
+
2025-09-26 07:55:25,326 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0570 | Val rms_score: 0.6270
|
| 117 |
+
2025-09-26 07:55:27,593 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0402 | Val rms_score: 0.6639
|
| 118 |
+
2025-09-26 07:55:29,899 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0316 | Val rms_score: 0.6443
|
| 119 |
+
2025-09-26 07:55:32,098 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0252 | Val rms_score: 0.6599
|
| 120 |
+
2025-09-26 07:55:34,591 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0229 | Val rms_score: 0.6755
|
| 121 |
+
2025-09-26 07:55:37,250 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0174 | Val rms_score: 0.6510
|
| 122 |
+
2025-09-26 07:55:39,631 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0166 | Val rms_score: 0.6755
|
| 123 |
+
2025-09-26 07:55:41,928 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0126 | Val rms_score: 0.6754
|
| 124 |
+
2025-09-26 07:55:44,106 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0104 | Val rms_score: 0.6726
|
| 125 |
+
2025-09-26 07:55:46,400 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0102 | Val rms_score: 0.6797
|
| 126 |
+
2025-09-26 07:55:48,908 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0081 | Val rms_score: 0.6793
|
| 127 |
+
2025-09-26 07:55:51,086 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0088 | Val rms_score: 0.6737
|
| 128 |
+
2025-09-26 07:55:53,276 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0077 | Val rms_score: 0.6767
|
| 129 |
+
2025-09-26 07:55:55,464 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0094 | Val rms_score: 0.6727
|
| 130 |
+
2025-09-26 07:55:57,880 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0066 | Val rms_score: 0.6728
|
| 131 |
+
2025-09-26 07:56:00,557 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0060 | Val rms_score: 0.6853
|
| 132 |
+
2025-09-26 07:56:02,749 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0061 | Val rms_score: 0.6818
|
| 133 |
+
2025-09-26 07:56:05,129 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0049 | Val rms_score: 0.6724
|
| 134 |
+
2025-09-26 07:56:07,679 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0068 | Val rms_score: 0.6854
|
| 135 |
+
2025-09-26 07:56:10,372 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0050 | Val rms_score: 0.6815
|
| 136 |
+
2025-09-26 07:56:13,215 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0049 | Val rms_score: 0.6731
|
| 137 |
+
2025-09-26 07:56:15,698 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0042 | Val rms_score: 0.6774
|
| 138 |
+
2025-09-26 07:56:18,520 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0047 | Val rms_score: 0.6881
|
| 139 |
+
2025-09-26 07:56:21,449 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0056 | Val rms_score: 0.6789
|
| 140 |
+
2025-09-26 07:56:24,417 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0049 | Val rms_score: 0.6716
|
| 141 |
+
2025-09-26 07:56:27,320 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0047 | Val rms_score: 0.6794
|
| 142 |
+
2025-09-26 07:56:30,048 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0052 | Val rms_score: 0.6778
|
| 143 |
+
2025-09-26 07:56:32,822 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0046 | Val rms_score: 0.6745
|
| 144 |
+
2025-09-26 07:56:35,594 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0048 | Val rms_score: 0.6738
|
| 145 |
+
2025-09-26 07:56:38,166 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0051 | Val rms_score: 0.6832
|
| 146 |
+
2025-09-26 07:56:41,384 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0041 | Val rms_score: 0.6750
|
| 147 |
+
2025-09-26 07:56:44,079 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0051 | Val rms_score: 0.6795
|
| 148 |
+
2025-09-26 07:56:46,697 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0051 | Val rms_score: 0.6724
|
| 149 |
+
2025-09-26 07:56:49,556 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0047 | Val rms_score: 0.6742
|
| 150 |
+
2025-09-26 07:56:52,274 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0064 | Val rms_score: 0.6840
|
| 151 |
+
2025-09-26 07:56:55,432 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0069 | Val rms_score: 0.6663
|
| 152 |
+
2025-09-26 07:56:58,012 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0055 | Val rms_score: 0.6998
|
| 153 |
+
2025-09-26 07:57:00,566 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0049 | Val rms_score: 0.6657
|
| 154 |
+
2025-09-26 07:57:03,085 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0063 | Val rms_score: 0.6800
|
| 155 |
+
2025-09-26 07:57:05,971 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0049 | Val rms_score: 0.6756
|
| 156 |
+
2025-09-26 07:57:09,250 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0052 | Val rms_score: 0.6710
|
| 157 |
+
2025-09-26 07:57:11,904 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0054 | Val rms_score: 0.6765
|
| 158 |
+
2025-09-26 07:57:14,682 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0047 | Val rms_score: 0.6690
|
| 159 |
+
2025-09-26 07:57:17,544 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0044 | Val rms_score: 0.6677
|
| 160 |
+
2025-09-26 07:57:20,074 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0036 | Val rms_score: 0.6743
|
| 161 |
+
2025-09-26 07:57:23,324 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0042 | Val rms_score: 0.6765
|
| 162 |
+
2025-09-26 07:57:26,649 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0037 | Val rms_score: 0.6864
|
| 163 |
+
2025-09-26 07:57:29,103 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0043 | Val rms_score: 0.6762
|
| 164 |
+
2025-09-26 07:57:31,322 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0038 | Val rms_score: 0.6703
|
| 165 |
+
2025-09-26 07:57:33,761 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0044 | Val rms_score: 0.6710
|
| 166 |
+
2025-09-26 07:57:36,339 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0048 | Val rms_score: 0.6658
|
| 167 |
+
2025-09-26 07:57:38,583 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0043 | Val rms_score: 0.6738
|
| 168 |
+
2025-09-26 07:57:40,848 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0055 | Val rms_score: 0.6667
|
| 169 |
+
2025-09-26 07:57:43,034 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0055 | Val rms_score: 0.6811
|
| 170 |
+
2025-09-26 07:57:45,657 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0047 | Val rms_score: 0.6747
|
| 171 |
+
2025-09-26 07:57:48,314 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0038 | Val rms_score: 0.6665
|
| 172 |
+
2025-09-26 07:57:50,549 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0043 | Val rms_score: 0.6712
|
| 173 |
+
2025-09-26 07:57:52,786 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0043 | Val rms_score: 0.6694
|
| 174 |
+
2025-09-26 07:57:55,161 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0049 | Val rms_score: 0.6666
|
| 175 |
+
2025-09-26 07:57:57,826 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0038 | Val rms_score: 0.6656
|
| 176 |
+
2025-09-26 07:58:00,421 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0038 | Val rms_score: 0.6668
|
| 177 |
+
2025-09-26 07:58:02,671 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0043 | Val rms_score: 0.6662
|
| 178 |
+
2025-09-26 07:58:05,028 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0042 | Val rms_score: 0.6673
|
| 179 |
+
2025-09-26 07:58:07,448 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0046 | Val rms_score: 0.6750
|
| 180 |
+
2025-09-26 07:58:10,371 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0041 | Val rms_score: 0.6589
|
| 181 |
+
2025-09-26 07:58:13,208 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0037 | Val rms_score: 0.6693
|
| 182 |
+
2025-09-26 07:58:15,497 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0043 | Val rms_score: 0.6684
|
| 183 |
+
2025-09-26 07:58:17,809 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0037 | Val rms_score: 0.6718
|
| 184 |
+
2025-09-26 07:58:19,978 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0040 | Val rms_score: 0.6701
|
| 185 |
+
2025-09-26 07:58:22,273 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0028 | Val rms_score: 0.6573
|
| 186 |
+
2025-09-26 07:58:24,879 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0037 | Val rms_score: 0.6564
|
| 187 |
+
2025-09-26 07:58:27,204 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0029 | Val rms_score: 0.6694
|
| 188 |
+
2025-09-26 07:58:29,556 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0035 | Val rms_score: 0.6623
|
| 189 |
+
2025-09-26 07:58:31,680 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0029 | Val rms_score: 0.6669
|
| 190 |
+
2025-09-26 07:58:34,107 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0030 | Val rms_score: 0.6713
|
| 191 |
+
2025-09-26 07:58:36,635 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0034 | Val rms_score: 0.6691
|
| 192 |
+
2025-09-26 07:58:38,971 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0039 | Val rms_score: 0.6632
|
| 193 |
+
2025-09-26 07:58:41,293 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0037 | Val rms_score: 0.6691
|
| 194 |
+
2025-09-26 07:58:43,458 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0039 | Val rms_score: 0.6644
|
| 195 |
+
2025-09-26 07:58:45,882 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0032 | Val rms_score: 0.6654
|
| 196 |
+
2025-09-26 07:58:48,415 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0034 | Val rms_score: 0.6704
|
| 197 |
+
2025-09-26 07:58:50,660 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0035 | Val rms_score: 0.6576
|
| 198 |
+
2025-09-26 07:58:52,898 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0037 | Val rms_score: 0.6745
|
| 199 |
+
2025-09-26 07:58:55,084 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0034 | Val rms_score: 0.6629
|
| 200 |
+
2025-09-26 07:58:57,554 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0039 | Val rms_score: 0.6652
|
| 201 |
+
2025-09-26 07:59:00,132 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0039 | Val rms_score: 0.6549
|
| 202 |
+
2025-09-26 07:59:02,414 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0037 | Val rms_score: 0.6671
|
| 203 |
+
2025-09-26 07:59:04,658 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0038 | Val rms_score: 0.6527
|
| 204 |
+
2025-09-26 07:59:06,906 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0036 | Val rms_score: 0.6651
|
| 205 |
+
2025-09-26 07:59:09,145 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0032 | Val rms_score: 0.6599
|
| 206 |
+
2025-09-26 07:59:11,701 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0038 | Val rms_score: 0.6710
|
| 207 |
+
2025-09-26 07:59:13,893 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0040 | Val rms_score: 0.6554
|
| 208 |
+
2025-09-26 07:59:16,221 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0035 | Val rms_score: 0.6738
|
| 209 |
+
2025-09-26 07:59:18,485 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0046 | Val rms_score: 0.6602
|
| 210 |
+
2025-09-26 07:59:19,066 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Test rms_score: 0.9019
|
| 211 |
+
2025-09-26 07:59:19,401 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset adme_ppb_h at 2025-09-26_07-59-19
|
| 212 |
+
2025-09-26 07:59:21,230 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.8250 | Val rms_score: 0.4951
2025-09-26 07:59:21,230 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Global step of best model: 5
2025-09-26 07:59:21,848 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4951
2025-09-26 07:59:24,101 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4062 | Val rms_score: 0.6299
2025-09-26 07:59:26,277 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2656 | Val rms_score: 0.6407
2025-09-26 07:59:28,579 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1852 | Val rms_score: 0.6906
2025-09-26 07:59:30,839 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1313 | Val rms_score: 0.6626
2025-09-26 07:59:33,528 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0910 | Val rms_score: 0.6317
2025-09-26 07:59:36,146 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0695 | Val rms_score: 0.6445
2025-09-26 07:59:38,435 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0574 | Val rms_score: 0.6537
2025-09-26 07:59:40,961 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0422 | Val rms_score: 0.6753
2025-09-26 07:59:43,487 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0322 | Val rms_score: 0.6681
2025-09-26 07:59:46,421 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0238 | Val rms_score: 0.6641
2025-09-26 07:59:49,063 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0223 | Val rms_score: 0.6808
2025-09-26 07:59:51,436 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0182 | Val rms_score: 0.6896
2025-09-26 07:59:53,952 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0160 | Val rms_score: 0.6726
2025-09-26 07:59:56,655 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0152 | Val rms_score: 0.6825
2025-09-26 07:59:59,611 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0134 | Val rms_score: 0.6955
2025-09-26 08:00:02,464 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0104 | Val rms_score: 0.6870
2025-09-26 08:00:04,994 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0107 | Val rms_score: 0.6832
2025-09-26 08:00:07,288 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0103 | Val rms_score: 0.6883
2025-09-26 08:00:09,605 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0086 | Val rms_score: 0.6879
2025-09-26 08:00:12,317 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0082 | Val rms_score: 0.6850
2025-09-26 08:00:14,974 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0083 | Val rms_score: 0.6776
2025-09-26 08:00:17,417 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0069 | Val rms_score: 0.6841
2025-09-26 08:00:19,892 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0085 | Val rms_score: 0.6846
2025-09-26 08:00:22,188 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0069 | Val rms_score: 0.6770
2025-09-26 08:00:24,964 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0063 | Val rms_score: 0.6832
2025-09-26 08:00:27,730 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0060 | Val rms_score: 0.6843
2025-09-26 08:00:29,999 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0052 | Val rms_score: 0.6833
2025-09-26 08:00:32,466 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0049 | Val rms_score: 0.6810
2025-09-26 08:00:34,953 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0059 | Val rms_score: 0.6863
2025-09-26 08:00:37,716 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0047 | Val rms_score: 0.6877
2025-09-26 08:00:40,467 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0053 | Val rms_score: 0.6828
2025-09-26 08:00:43,128 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0052 | Val rms_score: 0.6812
2025-09-26 08:00:45,744 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0058 | Val rms_score: 0.6820
2025-09-26 08:00:48,530 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0051 | Val rms_score: 0.6865
2025-09-26 08:00:51,223 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0051 | Val rms_score: 0.6855
2025-09-26 08:00:54,101 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0048 | Val rms_score: 0.6849
2025-09-26 08:00:56,769 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0036 | Val rms_score: 0.6820
2025-09-26 08:00:59,094 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0045 | Val rms_score: 0.6797
2025-09-26 08:01:01,458 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0050 | Val rms_score: 0.6852
2025-09-26 08:01:04,202 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0036 | Val rms_score: 0.6819
2025-09-26 08:01:06,860 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0049 | Val rms_score: 0.6810
2025-09-26 08:01:09,572 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0043 | Val rms_score: 0.6850
2025-09-26 08:01:11,810 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0041 | Val rms_score: 0.6875
2025-09-26 08:01:14,283 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0040 | Val rms_score: 0.6858
2025-09-26 08:01:17,174 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0054 | Val rms_score: 0.6910
2025-09-26 08:01:19,948 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0046 | Val rms_score: 0.6892
2025-09-26 08:01:22,618 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0042 | Val rms_score: 0.6870
2025-09-26 08:01:25,107 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0045 | Val rms_score: 0.6984
2025-09-26 08:01:27,550 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0045 | Val rms_score: 0.6889
2025-09-26 08:01:30,388 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0050 | Val rms_score: 0.6823
2025-09-26 08:01:33,534 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0037 | Val rms_score: 0.6843
2025-09-26 08:01:36,120 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0041 | Val rms_score: 0.6782
2025-09-26 08:01:38,621 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0042 | Val rms_score: 0.6806
2025-09-26 08:01:40,767 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0049 | Val rms_score: 0.6815
2025-09-26 08:01:43,384 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0041 | Val rms_score: 0.6839
2025-09-26 08:01:45,995 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0037 | Val rms_score: 0.6849
2025-09-26 08:01:48,297 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0037 | Val rms_score: 0.6803
2025-09-26 08:01:50,716 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0044 | Val rms_score: 0.6787
2025-09-26 08:01:53,272 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0049 | Val rms_score: 0.6816
2025-09-26 08:01:56,089 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0038 | Val rms_score: 0.6757
2025-09-26 08:01:58,836 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0034 | Val rms_score: 0.6792
2025-09-26 08:02:01,212 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0051 | Val rms_score: 0.6834
2025-09-26 08:02:03,527 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0041 | Val rms_score: 0.6825
2025-09-26 08:02:05,814 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0045 | Val rms_score: 0.6848
2025-09-26 08:02:08,643 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0045 | Val rms_score: 0.6824
2025-09-26 08:02:11,417 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0045 | Val rms_score: 0.6686
2025-09-26 08:02:13,999 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0048 | Val rms_score: 0.6706
2025-09-26 08:02:16,253 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0032 | Val rms_score: 0.6816
2025-09-26 08:02:18,537 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0048 | Val rms_score: 0.6790
2025-09-26 08:02:20,987 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0037 | Val rms_score: 0.6778
2025-09-26 08:02:23,571 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0043 | Val rms_score: 0.6777
2025-09-26 08:02:25,835 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0040 | Val rms_score: 0.6760
2025-09-26 08:02:28,096 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0042 | Val rms_score: 0.6774
2025-09-26 08:02:30,351 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0034 | Val rms_score: 0.6787
2025-09-26 08:02:32,639 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0042 | Val rms_score: 0.6763
2025-09-26 08:02:35,360 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0041 | Val rms_score: 0.6749
2025-09-26 08:02:37,673 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0041 | Val rms_score: 0.6773
2025-09-26 08:02:39,934 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0037 | Val rms_score: 0.6792
2025-09-26 08:02:42,214 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0035 | Val rms_score: 0.6772
2025-09-26 08:02:44,483 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0037 | Val rms_score: 0.6714
2025-09-26 08:02:47,125 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0036 | Val rms_score: 0.6719
2025-09-26 08:02:49,409 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0035 | Val rms_score: 0.6711
2025-09-26 08:02:51,633 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0033 | Val rms_score: 0.6746
2025-09-26 08:02:53,856 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0030 | Val rms_score: 0.6763
2025-09-26 08:02:56,026 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0034 | Val rms_score: 0.6681
2025-09-26 08:02:58,516 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0041 | Val rms_score: 0.6713
2025-09-26 08:03:00,751 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0034 | Val rms_score: 0.6762
2025-09-26 08:03:02,914 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0037 | Val rms_score: 0.6707
2025-09-26 08:03:05,126 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0039 | Val rms_score: 0.6751
2025-09-26 08:03:07,401 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0040 | Val rms_score: 0.6781
2025-09-26 08:03:09,975 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0039 | Val rms_score: 0.6676
2025-09-26 08:03:12,218 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0032 | Val rms_score: 0.6653
2025-09-26 08:03:14,495 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0037 | Val rms_score: 0.6732
2025-09-26 08:03:16,805 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0039 | Val rms_score: 0.6680
2025-09-26 08:03:19,071 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0029 | Val rms_score: 0.6653
2025-09-26 08:03:21,693 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0031 | Val rms_score: 0.6676
2025-09-26 08:03:23,857 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0033 | Val rms_score: 0.6734
2025-09-26 08:03:26,048 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0034 | Val rms_score: 0.6703
2025-09-26 08:03:28,267 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0038 | Val rms_score: 0.6742
2025-09-26 08:03:28,795 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Test rms_score: 0.8698
2025-09-26 08:03:29,104 - logs_modchembert_adme_ppb_h_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.8927, Std Dev: 0.0163
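The triplicate summary above reports the mean and standard deviation of the test rms_score across three independent runs. As a minimal sketch of that aggregation (the three scores below are hypothetical placeholders, not the values from this log; the actual script's name and internals are not shown here):

```python
from statistics import mean, stdev

# Hypothetical per-run test rms_score values for illustration only
test_scores = [0.87, 0.89, 0.91]

avg = mean(test_scores)
sd = stdev(test_scores)  # sample standard deviation across the runs
print(f"Final Triplicate Test Results - Avg rms_score: {avg:.4f}, Std Dev: {sd:.4f}")
```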
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_ppb_r_epochs100_batch_size32_20250926_080329.log
ADDED
@@ -0,0 +1,329 @@
2025-09-26 08:03:29,106 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Running benchmark for dataset: adme_ppb_r
2025-09-26 08:03:29,106 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - dataset: adme_ppb_r, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
2025-09-26 08:03:29,143 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset adme_ppb_r at 2025-09-26_08-03-29
2025-09-26 08:03:30,979 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.8938 | Val rms_score: 0.5334
2025-09-26 08:03:30,980 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 5
2025-09-26 08:03:31,573 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.5334
2025-09-26 08:03:34,148 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4031 | Val rms_score: 0.3670
2025-09-26 08:03:34,339 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 10
2025-09-26 08:03:34,918 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3670
2025-09-26 08:03:37,296 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2266 | Val rms_score: 0.4133
2025-09-26 08:03:39,875 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2109 | Val rms_score: 0.3446
2025-09-26 08:03:40,062 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 20
2025-09-26 08:03:40,849 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.3446
2025-09-26 08:03:43,438 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1414 | Val rms_score: 0.3603
2025-09-26 08:03:45,590 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1000 | Val rms_score: 0.3732
2025-09-26 08:03:48,138 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0871 | Val rms_score: 0.4052
2025-09-26 08:03:50,376 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0566 | Val rms_score: 0.4752
2025-09-26 08:03:52,461 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0471 | Val rms_score: 0.4318
2025-09-26 08:03:55,225 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0480 | Val rms_score: 0.4543
2025-09-26 08:03:57,318 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0436 | Val rms_score: 0.4484
2025-09-26 08:03:59,817 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0359 | Val rms_score: 0.4022
2025-09-26 08:04:01,949 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0279 | Val rms_score: 0.4533
2025-09-26 08:04:04,101 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0222 | Val rms_score: 0.4545
2025-09-26 08:04:06,330 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0208 | Val rms_score: 0.4452
2025-09-26 08:04:08,509 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0180 | Val rms_score: 0.4705
2025-09-26 08:04:11,038 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0152 | Val rms_score: 0.4534
2025-09-26 08:04:13,345 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0138 | Val rms_score: 0.4527
2025-09-26 08:04:15,554 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0122 | Val rms_score: 0.4666
2025-09-26 08:04:17,743 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0094 | Val rms_score: 0.4478
2025-09-26 08:04:19,892 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0207 | Val rms_score: 0.4507
2025-09-26 08:04:22,359 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0068 | Val rms_score: 0.4592
2025-09-26 08:04:24,634 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0076 | Val rms_score: 0.4599
2025-09-26 08:04:26,858 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0066 | Val rms_score: 0.4651
2025-09-26 08:04:29,099 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0070 | Val rms_score: 0.4641
2025-09-26 08:04:31,267 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0085 | Val rms_score: 0.4541
2025-09-26 08:04:33,713 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0084 | Val rms_score: 0.4570
2025-09-26 08:04:35,957 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0065 | Val rms_score: 0.4703
2025-09-26 08:04:38,200 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0064 | Val rms_score: 0.4544
2025-09-26 08:04:40,468 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0066 | Val rms_score: 0.4527
2025-09-26 08:04:42,624 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0122 | Val rms_score: 0.4756
2025-09-26 08:04:45,244 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0086 | Val rms_score: 0.4587
2025-09-26 08:04:47,581 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0092 | Val rms_score: 0.4591
2025-09-26 08:04:49,987 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0070 | Val rms_score: 0.4768
2025-09-26 08:04:52,459 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0102 | Val rms_score: 0.4661
2025-09-26 08:04:55,102 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0074 | Val rms_score: 0.4704
2025-09-26 08:04:58,018 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0083 | Val rms_score: 0.4696
2025-09-26 08:05:00,648 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0052 | Val rms_score: 0.4598
2025-09-26 08:05:03,337 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0065 | Val rms_score: 0.4668
2025-09-26 08:05:06,052 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0068 | Val rms_score: 0.4556
2025-09-26 08:05:08,763 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0070 | Val rms_score: 0.4573
2025-09-26 08:05:11,821 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0057 | Val rms_score: 0.4903
2025-09-26 08:05:14,588 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0054 | Val rms_score: 0.4618
2025-09-26 08:05:17,028 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0062 | Val rms_score: 0.4543
2025-09-26 08:05:19,429 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0052 | Val rms_score: 0.4689
2025-09-26 08:05:21,763 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0049 | Val rms_score: 0.4575
2025-09-26 08:05:24,402 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0054 | Val rms_score: 0.4624
2025-09-26 08:05:26,677 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0050 | Val rms_score: 0.4688
2025-09-26 08:05:29,135 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0054 | Val rms_score: 0.4737
2025-09-26 08:05:31,441 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0041 | Val rms_score: 0.4682
2025-09-26 08:05:33,626 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0050 | Val rms_score: 0.4583
2025-09-26 08:05:36,295 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0039 | Val rms_score: 0.4647
2025-09-26 08:05:38,624 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0042 | Val rms_score: 0.4635
2025-09-26 08:05:40,887 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0043 | Val rms_score: 0.4627
2025-09-26 08:05:43,156 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0051 | Val rms_score: 0.4594
2025-09-26 08:05:45,369 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0043 | Val rms_score: 0.4626
2025-09-26 08:05:48,509 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0047 | Val rms_score: 0.4619
2025-09-26 08:05:50,762 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0045 | Val rms_score: 0.4678
2025-09-26 08:05:52,927 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0043 | Val rms_score: 0.4727
2025-09-26 08:05:55,321 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0043 | Val rms_score: 0.4657
2025-09-26 08:05:57,584 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0046 | Val rms_score: 0.4612
2025-09-26 08:06:00,160 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0049 | Val rms_score: 0.4644
2025-09-26 08:06:02,373 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0036 | Val rms_score: 0.4604
2025-09-26 08:06:04,617 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0049 | Val rms_score: 0.4610
2025-09-26 08:06:06,876 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0039 | Val rms_score: 0.4719
2025-09-26 08:06:09,132 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0051 | Val rms_score: 0.4568
2025-09-26 08:06:11,712 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0052 | Val rms_score: 0.4607
2025-09-26 08:06:13,990 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0054 | Val rms_score: 0.4600
2025-09-26 08:06:16,178 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0046 | Val rms_score: 0.4496
2025-09-26 08:06:18,529 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0050 | Val rms_score: 0.4562
2025-09-26 08:06:20,800 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0035 | Val rms_score: 0.4638
2025-09-26 08:06:23,478 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0041 | Val rms_score: 0.4612
2025-09-26 08:06:25,678 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0052 | Val rms_score: 0.4712
2025-09-26 08:06:27,931 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0035 | Val rms_score: 0.4775
2025-09-26 08:06:30,506 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0045 | Val rms_score: 0.4618
2025-09-26 08:06:32,696 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0040 | Val rms_score: 0.4597
2025-09-26 08:06:35,643 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0047 | Val rms_score: 0.4575
2025-09-26 08:06:38,462 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0040 | Val rms_score: 0.4488
2025-09-26 08:06:41,255 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0046 | Val rms_score: 0.4746
2025-09-26 08:06:44,027 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0047 | Val rms_score: 0.4584
2025-09-26 08:06:46,674 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0040 | Val rms_score: 0.4597
2025-09-26 08:06:49,867 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0050 | Val rms_score: 0.4663
2025-09-26 08:06:52,751 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0040 | Val rms_score: 0.4581
2025-09-26 08:06:55,817 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0041 | Val rms_score: 0.4644
2025-09-26 08:06:58,446 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0033 | Val rms_score: 0.4528
2025-09-26 08:07:01,385 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0044 | Val rms_score: 0.4574
2025-09-26 08:07:04,170 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0047 | Val rms_score: 0.4668
2025-09-26 08:07:06,589 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0047 | Val rms_score: 0.4613
2025-09-26 08:07:09,087 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0036 | Val rms_score: 0.4593
2025-09-26 08:07:11,537 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0042 | Val rms_score: 0.4680
2025-09-26 08:07:14,147 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0042 | Val rms_score: 0.4617
2025-09-26 08:07:17,150 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0040 | Val rms_score: 0.4572
2025-09-26 08:07:19,539 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0034 | Val rms_score: 0.4586
|
| 103 |
+
2025-09-26 08:07:22,203 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0040 | Val rms_score: 0.4594
|
| 104 |
+
2025-09-26 08:07:24,461 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0045 | Val rms_score: 0.4580
|
| 105 |
+
2025-09-26 08:07:26,986 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0032 | Val rms_score: 0.4638
|
| 106 |
+
2025-09-26 08:07:29,676 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0048 | Val rms_score: 0.4500
|
| 107 |
+
2025-09-26 08:07:31,947 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0041 | Val rms_score: 0.4561
|
| 108 |
+
2025-09-26 08:07:34,189 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0058 | Val rms_score: 0.4651
|
| 109 |
+
2025-09-26 08:07:36,391 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0052 | Val rms_score: 0.4451
|
| 110 |
+
2025-09-26 08:07:36,927 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Test rms_score: 0.6464
|
| 111 |
+
2025-09-26 08:07:37,255 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset adme_ppb_r at 2025-09-26_08-07-37
2025-09-26 08:07:39,125 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.9125 | Val rms_score: 0.5661
2025-09-26 08:07:39,125 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 5
2025-09-26 08:07:39,733 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.5661
2025-09-26 08:07:42,286 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.5437 | Val rms_score: 0.4625
2025-09-26 08:07:42,472 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 10
2025-09-26 08:07:43,056 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.4625
2025-09-26 08:07:45,582 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3109 | Val rms_score: 0.3648
2025-09-26 08:07:45,771 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 15
2025-09-26 08:07:46,373 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.3648
2025-09-26 08:07:48,832 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2656 | Val rms_score: 0.3246
2025-09-26 08:07:49,053 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 20
2025-09-26 08:07:49,632 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.3246
2025-09-26 08:07:51,774 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2234 | Val rms_score: 0.3580
2025-09-26 08:07:54,808 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1461 | Val rms_score: 0.3302
2025-09-26 08:07:57,711 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1141 | Val rms_score: 0.3296
2025-09-26 08:07:59,840 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0930 | Val rms_score: 0.3476
2025-09-26 08:08:02,142 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0793 | Val rms_score: 0.3682
2025-09-26 08:08:04,358 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0820 | Val rms_score: 0.3625
2025-09-26 08:08:06,457 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0531 | Val rms_score: 0.4083
2025-09-26 08:08:09,427 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0508 | Val rms_score: 0.4318
2025-09-26 08:08:11,657 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0383 | Val rms_score: 0.4383
2025-09-26 08:08:13,824 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0289 | Val rms_score: 0.4472
2025-09-26 08:08:16,012 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0273 | Val rms_score: 0.4413
2025-09-26 08:08:18,251 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0227 | Val rms_score: 0.4377
2025-09-26 08:08:20,770 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0205 | Val rms_score: 0.4353
2025-09-26 08:08:22,931 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0187 | Val rms_score: 0.4417
2025-09-26 08:08:25,197 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0168 | Val rms_score: 0.4440
2025-09-26 08:08:27,409 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0137 | Val rms_score: 0.4522
2025-09-26 08:08:29,649 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0145 | Val rms_score: 0.4595
2025-09-26 08:08:32,189 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0112 | Val rms_score: 0.4535
2025-09-26 08:08:34,427 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0100 | Val rms_score: 0.4442
2025-09-26 08:08:36,712 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0097 | Val rms_score: 0.4405
2025-09-26 08:08:38,953 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0098 | Val rms_score: 0.4410
2025-09-26 08:08:41,184 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0114 | Val rms_score: 0.4354
2025-09-26 08:08:43,774 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0090 | Val rms_score: 0.4368
2025-09-26 08:08:45,945 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0126 | Val rms_score: 0.4516
2025-09-26 08:08:48,133 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0081 | Val rms_score: 0.4594
2025-09-26 08:08:50,361 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0061 | Val rms_score: 0.4391
2025-09-26 08:08:52,719 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0076 | Val rms_score: 0.4433
2025-09-26 08:08:55,417 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0061 | Val rms_score: 0.4648
2025-09-26 08:08:57,679 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0072 | Val rms_score: 0.4595
2025-09-26 08:08:59,963 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0049 | Val rms_score: 0.4458
2025-09-26 08:09:02,113 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0052 | Val rms_score: 0.4495
2025-09-26 08:09:04,273 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0048 | Val rms_score: 0.4571
2025-09-26 08:09:06,755 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0070 | Val rms_score: 0.4545
2025-09-26 08:09:09,047 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0066 | Val rms_score: 0.4437
2025-09-26 08:09:11,242 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0062 | Val rms_score: 0.4559
2025-09-26 08:09:13,553 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0063 | Val rms_score: 0.4545
2025-09-26 08:09:15,758 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0062 | Val rms_score: 0.4377
2025-09-26 08:09:18,432 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0063 | Val rms_score: 0.4365
2025-09-26 08:09:20,700 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0049 | Val rms_score: 0.4509
2025-09-26 08:09:22,878 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0049 | Val rms_score: 0.4518
2025-09-26 08:09:25,067 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0051 | Val rms_score: 0.4484
2025-09-26 08:09:27,360 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0065 | Val rms_score: 0.4527
2025-09-26 08:09:29,893 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0048 | Val rms_score: 0.4546
2025-09-26 08:09:32,064 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0042 | Val rms_score: 0.4533
2025-09-26 08:09:34,212 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0046 | Val rms_score: 0.4396
2025-09-26 08:09:36,539 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0051 | Val rms_score: 0.4440
2025-09-26 08:09:38,772 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0050 | Val rms_score: 0.4487
2025-09-26 08:09:41,207 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0043 | Val rms_score: 0.4390
2025-09-26 08:09:43,411 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0047 | Val rms_score: 0.4418
2025-09-26 08:09:45,526 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0056 | Val rms_score: 0.4536
2025-09-26 08:09:47,770 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0055 | Val rms_score: 0.4514
2025-09-26 08:09:49,918 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0043 | Val rms_score: 0.4483
2025-09-26 08:09:52,471 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0040 | Val rms_score: 0.4460
2025-09-26 08:09:54,801 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0044 | Val rms_score: 0.4504
2025-09-26 08:09:57,108 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0052 | Val rms_score: 0.4450
2025-09-26 08:09:59,257 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0038 | Val rms_score: 0.4470
2025-09-26 08:10:01,625 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0049 | Val rms_score: 0.4454
2025-09-26 08:10:04,169 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0045 | Val rms_score: 0.4508
2025-09-26 08:10:06,476 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0040 | Val rms_score: 0.4517
2025-09-26 08:10:08,808 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0045 | Val rms_score: 0.4452
2025-09-26 08:10:11,225 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0043 | Val rms_score: 0.4453
2025-09-26 08:10:13,430 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0040 | Val rms_score: 0.4525
2025-09-26 08:10:16,236 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0040 | Val rms_score: 0.4617
2025-09-26 08:10:18,796 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0047 | Val rms_score: 0.4522
2025-09-26 08:10:21,315 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0049 | Val rms_score: 0.4424
2025-09-26 08:10:23,680 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0052 | Val rms_score: 0.4422
2025-09-26 08:10:25,899 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0044 | Val rms_score: 0.4387
2025-09-26 08:10:28,899 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0045 | Val rms_score: 0.4350
2025-09-26 08:10:31,591 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0050 | Val rms_score: 0.4463
2025-09-26 08:10:34,352 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0050 | Val rms_score: 0.4501
2025-09-26 08:10:36,793 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0049 | Val rms_score: 0.4374
2025-09-26 08:10:39,210 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0038 | Val rms_score: 0.4360
2025-09-26 08:10:42,258 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0049 | Val rms_score: 0.4363
2025-09-26 08:10:45,207 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0037 | Val rms_score: 0.4461
2025-09-26 08:10:48,055 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0041 | Val rms_score: 0.4491
2025-09-26 08:10:50,961 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0043 | Val rms_score: 0.4426
2025-09-26 08:10:53,474 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0042 | Val rms_score: 0.4382
2025-09-26 08:10:56,638 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0042 | Val rms_score: 0.4361
2025-09-26 08:10:59,539 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0040 | Val rms_score: 0.4409
2025-09-26 08:11:02,455 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0046 | Val rms_score: 0.4558
2025-09-26 08:11:05,651 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0048 | Val rms_score: 0.4433
2025-09-26 08:11:08,357 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0046 | Val rms_score: 0.4352
2025-09-26 08:11:11,015 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0039 | Val rms_score: 0.4487
2025-09-26 08:11:13,338 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0041 | Val rms_score: 0.4533
2025-09-26 08:11:15,692 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0042 | Val rms_score: 0.4422
2025-09-26 08:11:18,013 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0058 | Val rms_score: 0.4323
2025-09-26 08:11:20,729 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0045 | Val rms_score: 0.4473
2025-09-26 08:11:23,650 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0045 | Val rms_score: 0.4457
2025-09-26 08:11:26,060 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0051 | Val rms_score: 0.4483
2025-09-26 08:11:28,774 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0038 | Val rms_score: 0.4427
2025-09-26 08:11:31,494 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0043 | Val rms_score: 0.4414
2025-09-26 08:11:34,333 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0039 | Val rms_score: 0.4468
2025-09-26 08:11:37,292 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0035 | Val rms_score: 0.4433
2025-09-26 08:11:40,059 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0035 | Val rms_score: 0.4410
2025-09-26 08:11:42,668 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0041 | Val rms_score: 0.4425
2025-09-26 08:11:45,380 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0044 | Val rms_score: 0.4578
2025-09-26 08:11:45,938 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Test rms_score: 0.7436
2025-09-26 08:11:46,238 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset adme_ppb_r at 2025-09-26_08-11-46
2025-09-26 08:11:48,068 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.8688 | Val rms_score: 0.5339
2025-09-26 08:11:48,068 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 5
2025-09-26 08:11:48,628 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.5339
2025-09-26 08:11:50,964 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4500 | Val rms_score: 0.3867
2025-09-26 08:11:51,152 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 10
2025-09-26 08:11:51,724 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3867
2025-09-26 08:11:54,295 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2938 | Val rms_score: 0.4180
2025-09-26 08:11:56,601 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3000 | Val rms_score: 0.3342
2025-09-26 08:11:56,785 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Global step of best model: 20
2025-09-26 08:11:57,327 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.3342
2025-09-26 08:11:59,750 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1609 | Val rms_score: 0.3387
2025-09-26 08:12:01,877 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1703 | Val rms_score: 0.3443
2025-09-26 08:12:04,397 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0992 | Val rms_score: 0.3486
2025-09-26 08:12:06,658 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0922 | Val rms_score: 0.3677
2025-09-26 08:12:09,176 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0594 | Val rms_score: 0.4005
2025-09-26 08:12:11,601 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0516 | Val rms_score: 0.3992
2025-09-26 08:12:13,803 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0424 | Val rms_score: 0.4035
2025-09-26 08:12:16,612 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0352 | Val rms_score: 0.4121
2025-09-26 08:12:19,112 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0340 | Val rms_score: 0.4115
2025-09-26 08:12:21,473 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0301 | Val rms_score: 0.4197
2025-09-26 08:12:23,777 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0245 | Val rms_score: 0.4420
2025-09-26 08:12:25,919 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0194 | Val rms_score: 0.4504
2025-09-26 08:12:28,567 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0210 | Val rms_score: 0.4381
2025-09-26 08:12:30,815 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0159 | Val rms_score: 0.4437
2025-09-26 08:12:32,932 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0150 | Val rms_score: 0.4466
2025-09-26 08:12:35,928 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0130 | Val rms_score: 0.4490
2025-09-26 08:12:38,742 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0139 | Val rms_score: 0.4385
2025-09-26 08:12:41,738 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0131 | Val rms_score: 0.4484
2025-09-26 08:12:44,525 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0107 | Val rms_score: 0.4501
2025-09-26 08:12:47,260 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0097 | Val rms_score: 0.4466
2025-09-26 08:12:50,187 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0076 | Val rms_score: 0.4490
2025-09-26 08:12:52,927 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0091 | Val rms_score: 0.4562
2025-09-26 08:12:55,533 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0080 | Val rms_score: 0.4586
2025-09-26 08:12:58,027 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0081 | Val rms_score: 0.4542
2025-09-26 08:13:00,417 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0080 | Val rms_score: 0.4592
2025-09-26 08:13:02,915 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0075 | Val rms_score: 0.4718
2025-09-26 08:13:05,646 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0082 | Val rms_score: 0.4591
2025-09-26 08:13:08,676 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0055 | Val rms_score: 0.4479
2025-09-26 08:13:11,456 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0080 | Val rms_score: 0.4537
2025-09-26 08:13:14,231 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0074 | Val rms_score: 0.4580
2025-09-26 08:13:16,799 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0061 | Val rms_score: 0.4727
2025-09-26 08:13:19,506 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0058 | Val rms_score: 0.4694
2025-09-26 08:13:22,593 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0062 | Val rms_score: 0.4564
2025-09-26 08:13:25,432 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0052 | Val rms_score: 0.4667
2025-09-26 08:13:28,421 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0056 | Val rms_score: 0.4720
2025-09-26 08:13:31,150 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0068 | Val rms_score: 0.4579
2025-09-26 08:13:34,086 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0077 | Val rms_score: 0.4618
2025-09-26 08:13:37,015 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0051 | Val rms_score: 0.4614
2025-09-26 08:13:39,661 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0062 | Val rms_score: 0.4693
2025-09-26 08:13:42,282 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0079 | Val rms_score: 0.4579
2025-09-26 08:13:45,046 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0063 | Val rms_score: 0.4461
2025-09-26 08:13:47,897 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0057 | Val rms_score: 0.4673
2025-09-26 08:13:50,749 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0066 | Val rms_score: 0.4457
2025-09-26 08:13:53,246 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0069 | Val rms_score: 0.4663
2025-09-26 08:13:55,612 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0079 | Val rms_score: 0.4581
2025-09-26 08:13:58,031 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0052 | Val rms_score: 0.4430
2025-09-26 08:14:00,565 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0060 | Val rms_score: 0.4617
2025-09-26 08:14:03,479 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0055 | Val rms_score: 0.4516
2025-09-26 08:14:06,088 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0058 | Val rms_score: 0.4545
2025-09-26 08:14:08,989 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0053 | Val rms_score: 0.4607
2025-09-26 08:14:11,536 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0050 | Val rms_score: 0.4630
2025-09-26 08:14:14,302 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0052 | Val rms_score: 0.4548
2025-09-26 08:14:17,366 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0046 | Val rms_score: 0.4618
2025-09-26 08:14:20,133 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0048 | Val rms_score: 0.4722
2025-09-26 08:14:22,901 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0047 | Val rms_score: 0.4673
2025-09-26 08:14:25,676 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0057 | Val rms_score: 0.4556
2025-09-26 08:14:28,415 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0051 | Val rms_score: 0.4716
2025-09-26 08:14:31,367 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0053 | Val rms_score: 0.4680
2025-09-26 08:14:34,049 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0059 | Val rms_score: 0.4615
2025-09-26 08:14:36,934 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0043 | Val rms_score: 0.4595
2025-09-26 08:14:40,020 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0040 | Val rms_score: 0.4503
2025-09-26 08:14:42,917 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0055 | Val rms_score: 0.4743
2025-09-26 08:14:45,912 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0053 | Val rms_score: 0.4553
2025-09-26 08:14:48,379 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0054 | Val rms_score: 0.4543
2025-09-26 08:14:50,977 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0054 | Val rms_score: 0.4762
2025-09-26 08:14:53,401 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0044 | Val rms_score: 0.4482
2025-09-26 08:14:55,893 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0052 | Val rms_score: 0.4546
2025-09-26 08:14:58,788 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0034 | Val rms_score: 0.4678
2025-09-26 08:15:01,077 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0050 | Val rms_score: 0.4613
2025-09-26 08:15:03,470 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0042 | Val rms_score: 0.4564
2025-09-26 08:15:05,893 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0044 | Val rms_score: 0.4660
2025-09-26 08:15:08,172 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0047 | Val rms_score: 0.4680
2025-09-26 08:15:10,990 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0031 | Val rms_score: 0.4565
2025-09-26 08:15:13,545 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0032 | Val rms_score: 0.4624
2025-09-26 08:15:16,014 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0043 | Val rms_score: 0.4681
2025-09-26 08:15:18,610 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0046 | Val rms_score: 0.4568
2025-09-26 08:15:21,124 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0032 | Val rms_score: 0.4464
2025-09-26 08:15:23,845 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0037 | Val rms_score: 0.4601
2025-09-26 08:15:26,191 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0045 | Val rms_score: 0.4610
2025-09-26 08:15:28,727 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0052 | Val rms_score: 0.4595
2025-09-26 08:15:31,019 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0038 | Val rms_score: 0.4658
2025-09-26 08:15:33,436 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0039 | Val rms_score: 0.4604
2025-09-26 08:15:36,347 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0031 | Val rms_score: 0.4578
2025-09-26 08:15:38,896 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0042 | Val rms_score: 0.4578
2025-09-26 08:15:41,420 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0037 | Val rms_score: 0.4603
2025-09-26 08:15:43,973 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0038 | Val rms_score: 0.4649
2025-09-26 08:15:46,361 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0041 | Val rms_score: 0.4585
2025-09-26 08:15:49,104 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0038 | Val rms_score: 0.4585
2025-09-26 08:15:51,677 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0045 | Val rms_score: 0.4565
2025-09-26 08:15:54,133 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0048 | Val rms_score: 0.4526
2025-09-26 08:15:56,692 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0042 | Val rms_score: 0.4582
2025-09-26 08:15:59,445 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0042 | Val rms_score: 0.4445
|
| 324 |
+
2025-09-26 08:16:02,940 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0043 | Val rms_score: 0.4568
|
| 325 |
+
2025-09-26 08:16:05,517 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0056 | Val rms_score: 0.4464
|
| 326 |
+
2025-09-26 08:16:08,099 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0046 | Val rms_score: 0.4456
|
| 327 |
+
2025-09-26 08:16:10,624 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0057 | Val rms_score: 0.4581
|
| 328 |
+
2025-09-26 08:16:11,212 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Test rms_score: 0.6926
|
| 329 |
+
2025-09-26 08:16:11,503 - logs_modchembert_adme_ppb_r_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.6942, Std Dev: 0.0397
|
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_adme_solubility_epochs100_batch_size32_20250926_081611.log
ADDED
@@ -0,0 +1,343 @@
2025-09-26 08:16:11,504 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Running benchmark for dataset: adme_solubility
|
| 2 |
+
2025-09-26 08:16:11,504 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - dataset: adme_solubility, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
|
| 3 |
+
2025-09-26 08:16:11,513 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset adme_solubility at 2025-09-26_08-16-11
|
| 4 |
+
2025-09-26 08:16:19,297 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6364 | Val rms_score: 0.3772
|
| 5 |
+
2025-09-26 08:16:19,297 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 55
|
| 6 |
+
2025-09-26 08:16:19,836 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.3772
|
| 7 |
+
2025-09-26 08:16:29,891 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.3297 | Val rms_score: 0.3845
|
| 8 |
+
2025-09-26 08:16:41,052 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2739 | Val rms_score: 0.3883
|
| 9 |
+
2025-09-26 08:16:52,658 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1961 | Val rms_score: 0.4210
|
| 10 |
+
2025-09-26 08:17:03,502 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1739 | Val rms_score: 0.3759
|
| 11 |
+
2025-09-26 08:17:03,653 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 275
|
| 12 |
+
2025-09-26 08:17:04,209 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val rms_score: 0.3759
|
| 13 |
+
2025-09-26 08:17:16,004 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1417 | Val rms_score: 0.3563
|
| 14 |
+
2025-09-26 08:17:16,512 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 330
|
| 15 |
+
2025-09-26 08:17:17,095 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 6 with val rms_score: 0.3563
|
| 16 |
+
2025-09-26 08:17:27,734 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1017 | Val rms_score: 0.3809
|
| 17 |
+
2025-09-26 08:17:38,810 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0879 | Val rms_score: 0.3590
|
| 18 |
+
2025-09-26 08:17:50,732 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0744 | Val rms_score: 0.3840
|
| 19 |
+
2025-09-26 08:18:02,225 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0694 | Val rms_score: 0.3828
|
| 20 |
+
2025-09-26 08:18:14,148 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0574 | Val rms_score: 0.3713
|
| 21 |
+
2025-09-26 08:18:26,075 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0534 | Val rms_score: 0.3835
|
| 22 |
+
2025-09-26 08:18:37,746 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0440 | Val rms_score: 0.3823
|
| 23 |
+
2025-09-26 08:18:49,673 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0477 | Val rms_score: 0.3550
|
| 24 |
+
2025-09-26 08:18:49,947 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 770
|
| 25 |
+
2025-09-26 08:18:50,515 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 14 with val rms_score: 0.3550
|
| 26 |
+
2025-09-26 08:19:02,219 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0450 | Val rms_score: 0.3593
|
| 27 |
+
2025-09-26 08:19:14,103 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0403 | Val rms_score: 0.3892
|
| 28 |
+
2025-09-26 08:19:24,791 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0353 | Val rms_score: 0.3786
|
| 29 |
+
2025-09-26 08:19:36,210 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0359 | Val rms_score: 0.3674
|
| 30 |
+
2025-09-26 08:19:49,300 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0359 | Val rms_score: 0.3665
|
| 31 |
+
2025-09-26 08:19:59,630 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0347 | Val rms_score: 0.3749
|
| 32 |
+
2025-09-26 08:20:11,738 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0337 | Val rms_score: 0.3705
|
| 33 |
+
2025-09-26 08:20:23,879 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0233 | Val rms_score: 0.3733
|
| 34 |
+
2025-09-26 08:20:35,445 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0268 | Val rms_score: 0.3694
|
| 35 |
+
2025-09-26 08:20:47,441 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0291 | Val rms_score: 0.3723
|
| 36 |
+
2025-09-26 08:20:59,354 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0261 | Val rms_score: 0.3616
|
| 37 |
+
2025-09-26 08:21:11,128 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0242 | Val rms_score: 0.3652
|
| 38 |
+
2025-09-26 08:21:23,017 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0246 | Val rms_score: 0.3596
|
| 39 |
+
2025-09-26 08:21:34,254 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0262 | Val rms_score: 0.3710
|
| 40 |
+
2025-09-26 08:21:46,227 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0256 | Val rms_score: 0.3648
|
| 41 |
+
2025-09-26 08:21:58,341 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0233 | Val rms_score: 0.3613
|
| 42 |
+
2025-09-26 08:22:10,141 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0254 | Val rms_score: 0.3595
|
| 43 |
+
2025-09-26 08:22:22,298 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0217 | Val rms_score: 0.3619
|
| 44 |
+
2025-09-26 08:22:33,946 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0224 | Val rms_score: 0.3642
|
| 45 |
+
2025-09-26 08:22:45,628 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0219 | Val rms_score: 0.3601
|
| 46 |
+
2025-09-26 08:22:57,493 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0200 | Val rms_score: 0.3519
|
| 47 |
+
2025-09-26 08:22:57,650 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1925
|
| 48 |
+
2025-09-26 08:22:58,204 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 35 with val rms_score: 0.3519
|
| 49 |
+
2025-09-26 08:23:10,036 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0195 | Val rms_score: 0.3631
|
| 50 |
+
2025-09-26 08:23:23,316 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0210 | Val rms_score: 0.3724
|
| 51 |
+
2025-09-26 08:23:34,360 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0178 | Val rms_score: 0.3735
|
| 52 |
+
2025-09-26 08:23:46,224 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0187 | Val rms_score: 0.3646
|
| 53 |
+
2025-09-26 08:23:58,126 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0169 | Val rms_score: 0.3617
|
| 54 |
+
2025-09-26 08:24:10,015 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0179 | Val rms_score: 0.3613
|
| 55 |
+
2025-09-26 08:24:22,105 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0187 | Val rms_score: 0.3695
|
| 56 |
+
2025-09-26 08:24:33,499 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0168 | Val rms_score: 0.3596
|
| 57 |
+
2025-09-26 08:24:45,173 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0163 | Val rms_score: 0.3612
|
| 58 |
+
2025-09-26 08:24:57,050 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0166 | Val rms_score: 0.3623
|
| 59 |
+
2025-09-26 08:25:08,797 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0173 | Val rms_score: 0.3546
|
| 60 |
+
2025-09-26 08:25:21,018 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0156 | Val rms_score: 0.3679
|
| 61 |
+
2025-09-26 08:25:32,636 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0161 | Val rms_score: 0.3616
|
| 62 |
+
2025-09-26 08:25:44,383 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0155 | Val rms_score: 0.3632
|
| 63 |
+
2025-09-26 08:25:56,223 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0152 | Val rms_score: 0.3547
|
| 64 |
+
2025-09-26 08:26:07,753 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0163 | Val rms_score: 0.3599
|
| 65 |
+
2025-09-26 08:26:19,176 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0141 | Val rms_score: 0.3604
|
| 66 |
+
2025-09-26 08:26:30,832 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0134 | Val rms_score: 0.3613
|
| 67 |
+
2025-09-26 08:26:42,689 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0135 | Val rms_score: 0.3619
|
| 68 |
+
2025-09-26 08:26:55,590 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0134 | Val rms_score: 0.3614
|
| 69 |
+
2025-09-26 08:27:06,554 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0138 | Val rms_score: 0.3574
|
| 70 |
+
2025-09-26 08:27:18,948 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0146 | Val rms_score: 0.3592
|
| 71 |
+
2025-09-26 08:27:30,152 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0129 | Val rms_score: 0.3582
|
| 72 |
+
2025-09-26 08:27:41,992 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0130 | Val rms_score: 0.3620
|
| 73 |
+
2025-09-26 08:27:53,633 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0131 | Val rms_score: 0.3633
|
| 74 |
+
2025-09-26 08:28:05,426 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0138 | Val rms_score: 0.3608
|
| 75 |
+
2025-09-26 08:28:17,457 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0123 | Val rms_score: 0.3624
|
| 76 |
+
2025-09-26 08:28:28,836 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0126 | Val rms_score: 0.3653
|
| 77 |
+
2025-09-26 08:28:40,667 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0127 | Val rms_score: 0.3614
|
| 78 |
+
2025-09-26 08:28:52,530 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0129 | Val rms_score: 0.3674
|
| 79 |
+
2025-09-26 08:29:04,385 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0133 | Val rms_score: 0.3622
|
| 80 |
+
2025-09-26 08:29:16,489 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0124 | Val rms_score: 0.3560
|
| 81 |
+
2025-09-26 08:29:28,126 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0124 | Val rms_score: 0.3689
|
| 82 |
+
2025-09-26 08:29:39,952 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0111 | Val rms_score: 0.3601
|
| 83 |
+
2025-09-26 08:29:51,753 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0123 | Val rms_score: 0.3600
|
| 84 |
+
2025-09-26 08:30:03,727 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0128 | Val rms_score: 0.3673
|
| 85 |
+
2025-09-26 08:30:16,157 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0120 | Val rms_score: 0.3625
|
| 86 |
+
2025-09-26 08:30:28,914 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0113 | Val rms_score: 0.3627
|
| 87 |
+
2025-09-26 08:30:40,818 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0130 | Val rms_score: 0.3564
|
| 88 |
+
2025-09-26 08:30:52,958 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0109 | Val rms_score: 0.3587
|
| 89 |
+
2025-09-26 08:31:05,030 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0124 | Val rms_score: 0.3580
|
| 90 |
+
2025-09-26 08:31:17,546 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0113 | Val rms_score: 0.3570
|
| 91 |
+
2025-09-26 08:31:29,567 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0108 | Val rms_score: 0.3592
|
| 92 |
+
2025-09-26 08:31:41,499 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0114 | Val rms_score: 0.3603
|
| 93 |
+
2025-09-26 08:31:53,574 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0119 | Val rms_score: 0.3615
|
| 94 |
+
2025-09-26 08:32:05,725 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0111 | Val rms_score: 0.3604
|
| 95 |
+
2025-09-26 08:32:18,260 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0092 | Val rms_score: 0.3602
|
| 96 |
+
2025-09-26 08:32:30,437 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0104 | Val rms_score: 0.3627
|
| 97 |
+
2025-09-26 08:32:42,731 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0104 | Val rms_score: 0.3628
|
| 98 |
+
2025-09-26 08:32:54,723 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0104 | Val rms_score: 0.3620
|
| 99 |
+
2025-09-26 08:33:06,758 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0104 | Val rms_score: 0.3590
|
| 100 |
+
2025-09-26 08:33:19,209 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0102 | Val rms_score: 0.3628
|
| 101 |
+
2025-09-26 08:33:31,368 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0120 | Val rms_score: 0.3580
|
| 102 |
+
2025-09-26 08:33:43,630 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0104 | Val rms_score: 0.3587
|
| 103 |
+
2025-09-26 08:33:55,741 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0105 | Val rms_score: 0.3646
|
| 104 |
+
2025-09-26 08:34:08,951 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0107 | Val rms_score: 0.3571
|
| 105 |
+
2025-09-26 08:34:21,731 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0104 | Val rms_score: 0.3610
|
| 106 |
+
2025-09-26 08:34:33,890 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0100 | Val rms_score: 0.3633
|
| 107 |
+
2025-09-26 08:34:46,025 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0094 | Val rms_score: 0.3615
|
| 108 |
+
2025-09-26 08:34:58,119 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0097 | Val rms_score: 0.3538
|
| 109 |
+
2025-09-26 08:35:10,266 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0102 | Val rms_score: 0.3547
|
| 110 |
+
2025-09-26 08:35:22,314 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0099 | Val rms_score: 0.3560
|
| 111 |
+
2025-09-26 08:35:32,655 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0103 | Val rms_score: 0.3608
|
| 112 |
+
2025-09-26 08:35:44,507 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0093 | Val rms_score: 0.3590
|
| 113 |
+
2025-09-26 08:35:56,173 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0100 | Val rms_score: 0.3608
|
| 114 |
+
2025-09-26 08:35:57,142 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Test rms_score: 0.4718
|
| 115 |
+
2025-09-26 08:35:57,448 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset adme_solubility at 2025-09-26_08-35-57
|
| 116 |
+
2025-09-26 08:36:08,063 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6000 | Val rms_score: 0.4552
|
| 117 |
+
2025-09-26 08:36:08,063 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 55
|
| 118 |
+
2025-09-26 08:36:08,699 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4552
|
| 119 |
+
2025-09-26 08:36:20,037 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.3891 | Val rms_score: 0.3869
|
| 120 |
+
2025-09-26 08:36:20,215 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 110
|
| 121 |
+
2025-09-26 08:36:20,797 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3869
|
| 122 |
+
2025-09-26 08:36:32,366 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2977 | Val rms_score: 0.3510
|
| 123 |
+
2025-09-26 08:36:32,558 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 165
|
| 124 |
+
2025-09-26 08:36:33,125 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.3510
|
| 125 |
+
2025-09-26 08:36:44,677 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1883 | Val rms_score: 0.4121
|
| 126 |
+
2025-09-26 08:36:55,303 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1580 | Val rms_score: 0.3607
|
| 127 |
+
2025-09-26 08:37:07,271 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1203 | Val rms_score: 0.3871
|
| 128 |
+
2025-09-26 08:37:18,323 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0972 | Val rms_score: 0.3865
|
| 129 |
+
2025-09-26 08:37:29,876 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0773 | Val rms_score: 0.4085
|
| 130 |
+
2025-09-26 08:37:41,323 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0713 | Val rms_score: 0.3785
|
| 131 |
+
2025-09-26 08:37:52,972 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0659 | Val rms_score: 0.3863
|
| 132 |
+
2025-09-26 08:38:04,479 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0691 | Val rms_score: 0.3501
|
| 133 |
+
2025-09-26 08:38:04,982 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 605
|
| 134 |
+
2025-09-26 08:38:05,572 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 11 with val rms_score: 0.3501
|
| 135 |
+
2025-09-26 08:38:17,434 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0577 | Val rms_score: 0.3676
|
| 136 |
+
2025-09-26 08:38:28,737 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0474 | Val rms_score: 0.3807
|
| 137 |
+
2025-09-26 08:38:40,240 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0491 | Val rms_score: 0.3538
|
| 138 |
+
2025-09-26 08:38:52,083 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0391 | Val rms_score: 0.3665
|
| 139 |
+
2025-09-26 08:39:03,851 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0355 | Val rms_score: 0.3716
|
| 140 |
+
2025-09-26 08:39:16,135 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0388 | Val rms_score: 0.3683
|
| 141 |
+
2025-09-26 08:39:27,579 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0375 | Val rms_score: 0.3709
|
| 142 |
+
2025-09-26 08:39:41,740 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0347 | Val rms_score: 0.3634
|
| 143 |
+
2025-09-26 08:39:52,692 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0318 | Val rms_score: 0.3657
|
| 144 |
+
2025-09-26 08:40:04,327 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0301 | Val rms_score: 0.3473
|
| 145 |
+
2025-09-26 08:40:04,796 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1155
|
| 146 |
+
2025-09-26 08:40:05,368 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 21 with val rms_score: 0.3473
|
| 147 |
+
2025-09-26 08:40:17,700 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0322 | Val rms_score: 0.3666
|
| 148 |
+
2025-09-26 08:40:29,830 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0311 | Val rms_score: 0.3604
|
| 149 |
+
2025-09-26 08:40:42,267 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0273 | Val rms_score: 0.3671
|
| 150 |
+
2025-09-26 08:40:54,783 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0268 | Val rms_score: 0.3664
|
| 151 |
+
2025-09-26 08:41:07,338 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0232 | Val rms_score: 0.3579
|
| 152 |
+
2025-09-26 08:41:20,367 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0230 | Val rms_score: 0.3631
|
| 153 |
+
2025-09-26 08:41:32,746 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0246 | Val rms_score: 0.3614
|
| 154 |
+
2025-09-26 08:41:45,332 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0230 | Val rms_score: 0.3660
|
| 155 |
+
2025-09-26 08:41:57,824 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0220 | Val rms_score: 0.3640
|
| 156 |
+
2025-09-26 08:42:10,577 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0215 | Val rms_score: 0.3675
|
| 157 |
+
2025-09-26 08:42:24,280 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0199 | Val rms_score: 0.3552
2025-09-26 08:42:37,083 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0208 | Val rms_score: 0.3623
2025-09-26 08:42:50,185 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0205 | Val rms_score: 0.3496
2025-09-26 08:43:03,304 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0198 | Val rms_score: 0.3580
2025-09-26 08:43:16,222 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0197 | Val rms_score: 0.3572
2025-09-26 08:43:30,010 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0206 | Val rms_score: 0.3552
2025-09-26 08:43:41,843 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0202 | Val rms_score: 0.3617
2025-09-26 08:43:55,092 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0182 | Val rms_score: 0.3514
2025-09-26 08:44:08,464 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0183 | Val rms_score: 0.3567
2025-09-26 08:44:21,699 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0192 | Val rms_score: 0.3605
2025-09-26 08:44:34,706 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0149 | Val rms_score: 0.3535
2025-09-26 08:44:47,669 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0171 | Val rms_score: 0.3575
2025-09-26 08:45:00,703 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0147 | Val rms_score: 0.3640
2025-09-26 08:45:13,683 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0166 | Val rms_score: 0.3551
2025-09-26 08:45:26,642 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0173 | Val rms_score: 0.3624
2025-09-26 08:45:39,101 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0168 | Val rms_score: 0.3589
2025-09-26 08:45:51,767 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0161 | Val rms_score: 0.3601
2025-09-26 08:46:04,774 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0165 | Val rms_score: 0.3531
2025-09-26 08:46:17,836 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0167 | Val rms_score: 0.3583
2025-09-26 08:46:30,979 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0150 | Val rms_score: 0.3618
2025-09-26 08:46:43,761 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0148 | Val rms_score: 0.3584
2025-09-26 08:46:56,791 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0153 | Val rms_score: 0.3622
2025-09-26 08:47:10,072 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0147 | Val rms_score: 0.3564
2025-09-26 08:47:23,523 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0144 | Val rms_score: 0.3617
2025-09-26 08:47:35,654 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0145 | Val rms_score: 0.3560
2025-09-26 08:47:48,851 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0148 | Val rms_score: 0.3558
2025-09-26 08:48:02,138 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0134 | Val rms_score: 0.3635
2025-09-26 08:48:15,490 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0137 | Val rms_score: 0.3578
2025-09-26 08:48:28,443 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0140 | Val rms_score: 0.3662
2025-09-26 08:48:41,838 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0134 | Val rms_score: 0.3527
2025-09-26 08:48:55,195 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0122 | Val rms_score: 0.3626
2025-09-26 08:49:08,615 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0155 | Val rms_score: 0.3537
2025-09-26 08:49:22,009 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0134 | Val rms_score: 0.3575
2025-09-26 08:49:35,121 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0129 | Val rms_score: 0.3585
2025-09-26 08:49:48,382 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0123 | Val rms_score: 0.3565
2025-09-26 08:50:01,998 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0121 | Val rms_score: 0.3614
2025-09-26 08:50:14,757 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0130 | Val rms_score: 0.3643
2025-09-26 08:50:27,532 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0124 | Val rms_score: 0.3527
2025-09-26 08:50:40,235 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0112 | Val rms_score: 0.3600
2025-09-26 08:50:52,732 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0121 | Val rms_score: 0.3592
2025-09-26 08:51:05,927 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0113 | Val rms_score: 0.3557
2025-09-26 08:51:19,276 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0114 | Val rms_score: 0.3531
2025-09-26 08:51:31,228 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0109 | Val rms_score: 0.3557
2025-09-26 08:51:43,752 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0116 | Val rms_score: 0.3531
2025-09-26 08:51:56,306 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0116 | Val rms_score: 0.3587
2025-09-26 08:52:09,794 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0120 | Val rms_score: 0.3621
2025-09-26 08:52:22,385 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0116 | Val rms_score: 0.3577
2025-09-26 08:52:35,118 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0111 | Val rms_score: 0.3556
2025-09-26 08:52:47,866 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0105 | Val rms_score: 0.3524
2025-09-26 08:53:00,552 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0107 | Val rms_score: 0.3604
2025-09-26 08:53:14,473 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0110 | Val rms_score: 0.3534
2025-09-26 08:53:27,160 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0106 | Val rms_score: 0.3574
2025-09-26 08:53:39,921 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0102 | Val rms_score: 0.3580
2025-09-26 08:53:52,735 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0107 | Val rms_score: 0.3573
2025-09-26 08:54:05,824 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0107 | Val rms_score: 0.3539
2025-09-26 08:54:19,425 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0113 | Val rms_score: 0.3587
2025-09-26 08:54:31,729 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0109 | Val rms_score: 0.3528
2025-09-26 08:54:44,754 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0105 | Val rms_score: 0.3506
2025-09-26 08:54:57,641 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0104 | Val rms_score: 0.3542
2025-09-26 08:55:10,950 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0083 | Val rms_score: 0.3568
2025-09-26 08:55:23,477 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0091 | Val rms_score: 0.3565
2025-09-26 08:55:35,999 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0109 | Val rms_score: 0.3534
2025-09-26 08:55:49,007 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0097 | Val rms_score: 0.3575
2025-09-26 08:56:01,768 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0093 | Val rms_score: 0.3543
2025-09-26 08:56:14,447 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0099 | Val rms_score: 0.3538
2025-09-26 08:56:27,733 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0095 | Val rms_score: 0.3534
2025-09-26 08:56:40,444 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0096 | Val rms_score: 0.3581
2025-09-26 08:56:53,260 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0097 | Val rms_score: 0.3579
2025-09-26 08:57:06,154 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0090 | Val rms_score: 0.3535
2025-09-26 08:57:06,984 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Test rms_score: 0.4527
2025-09-26 08:57:07,316 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset adme_solubility at 2025-09-26_08-57-07
2025-09-26 08:57:19,548 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6273 | Val rms_score: 0.3897
2025-09-26 08:57:19,548 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 55
2025-09-26 08:57:20,173 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.3897
2025-09-26 08:57:33,928 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4000 | Val rms_score: 0.3618
2025-09-26 08:57:34,122 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 110
2025-09-26 08:57:34,711 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.3618
2025-09-26 08:57:48,254 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3000 | Val rms_score: 0.4047
2025-09-26 08:58:00,618 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1805 | Val rms_score: 0.3807
2025-09-26 08:58:13,515 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1420 | Val rms_score: 0.4054
2025-09-26 08:58:25,629 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1073 | Val rms_score: 0.4107
2025-09-26 08:58:37,953 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1017 | Val rms_score: 0.3649
2025-09-26 08:58:49,181 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0898 | Val rms_score: 0.3798
2025-09-26 08:58:59,704 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0801 | Val rms_score: 0.3762
2025-09-26 08:59:11,140 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0753 | Val rms_score: 0.3884
2025-09-26 08:59:21,811 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0777 | Val rms_score: 0.3800
2025-09-26 08:59:33,828 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0588 | Val rms_score: 0.3911
2025-09-26 08:59:45,159 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0581 | Val rms_score: 0.3806
2025-09-26 08:59:56,836 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0534 | Val rms_score: 0.3763
2025-09-26 09:00:08,582 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0491 | Val rms_score: 0.3708
2025-09-26 09:00:20,524 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0386 | Val rms_score: 0.3660
2025-09-26 09:00:32,760 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0350 | Val rms_score: 0.3830
2025-09-26 09:00:43,364 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0348 | Val rms_score: 0.3616
2025-09-26 09:00:43,518 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 990
2025-09-26 09:00:44,143 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 18 with val rms_score: 0.3616
2025-09-26 09:00:56,957 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0309 | Val rms_score: 0.3681
2025-09-26 09:01:07,169 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0315 | Val rms_score: 0.3614
2025-09-26 09:01:07,459 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1100
2025-09-26 09:01:08,035 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 20 with val rms_score: 0.3614
2025-09-26 09:01:20,062 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0321 | Val rms_score: 0.3613
2025-09-26 09:01:20,547 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1155
2025-09-26 09:01:21,175 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 21 with val rms_score: 0.3613
2025-09-26 09:01:32,951 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0363 | Val rms_score: 0.3715
2025-09-26 09:01:44,374 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0283 | Val rms_score: 0.3644
2025-09-26 09:01:56,302 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0314 | Val rms_score: 0.3575
2025-09-26 09:01:56,485 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1320
2025-09-26 09:01:57,094 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 24 with val rms_score: 0.3575
2025-09-26 09:02:09,213 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0286 | Val rms_score: 0.3649
2025-09-26 09:02:21,158 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0250 | Val rms_score: 0.3707
2025-09-26 09:02:33,613 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0244 | Val rms_score: 0.3642
2025-09-26 09:02:45,374 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0254 | Val rms_score: 0.3585
2025-09-26 09:02:56,985 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0230 | Val rms_score: 0.3704
2025-09-26 09:03:08,799 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0227 | Val rms_score: 0.3518
2025-09-26 09:03:08,956 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1650
2025-09-26 09:03:09,515 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 30 with val rms_score: 0.3518
2025-09-26 09:03:22,251 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0177 | Val rms_score: 0.3594
2025-09-26 09:03:34,537 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0203 | Val rms_score: 0.3724
2025-09-26 09:03:46,398 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0233 | Val rms_score: 0.3562
2025-09-26 09:03:58,574 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0199 | Val rms_score: 0.3659
2025-09-26 09:04:10,704 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0241 | Val rms_score: 0.3596
2025-09-26 09:04:22,728 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0203 | Val rms_score: 0.3580
2025-09-26 09:04:36,635 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0184 | Val rms_score: 0.3666
2025-09-26 09:04:48,587 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0216 | Val rms_score: 0.3789
2025-09-26 09:05:00,710 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0179 | Val rms_score: 0.3651
2025-09-26 09:05:12,979 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0182 | Val rms_score: 0.3607
2025-09-26 09:05:24,955 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0183 | Val rms_score: 0.3669
2025-09-26 09:05:37,626 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0178 | Val rms_score: 0.3564
2025-09-26 09:05:49,784 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0186 | Val rms_score: 0.3579
2025-09-26 09:06:02,004 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0143 | Val rms_score: 0.3652
2025-09-26 09:06:14,252 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0165 | Val rms_score: 0.3673
2025-09-26 09:06:26,576 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0168 | Val rms_score: 0.3559
2025-09-26 09:06:39,515 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0168 | Val rms_score: 0.3596
2025-09-26 09:06:51,369 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0161 | Val rms_score: 0.3600
2025-09-26 09:07:03,507 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0156 | Val rms_score: 0.3587
2025-09-26 09:07:15,531 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0163 | Val rms_score: 0.3713
2025-09-26 09:07:27,493 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0145 | Val rms_score: 0.3589
2025-09-26 09:07:40,051 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0151 | Val rms_score: 0.3652
2025-09-26 09:07:52,138 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0147 | Val rms_score: 0.3579
2025-09-26 09:08:04,430 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0143 | Val rms_score: 0.3647
2025-09-26 09:08:18,045 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0145 | Val rms_score: 0.3545
2025-09-26 09:08:29,830 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0143 | Val rms_score: 0.3636
2025-09-26 09:08:42,340 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0131 | Val rms_score: 0.3637
2025-09-26 09:08:54,596 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0145 | Val rms_score: 0.3609
2025-09-26 09:09:06,722 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0136 | Val rms_score: 0.3669
2025-09-26 09:09:18,514 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0131 | Val rms_score: 0.3660
2025-09-26 09:09:30,371 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0139 | Val rms_score: 0.3692
2025-09-26 09:09:43,054 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0111 | Val rms_score: 0.3631
2025-09-26 09:09:54,943 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0122 | Val rms_score: 0.3643
2025-09-26 09:10:07,009 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0128 | Val rms_score: 0.3630
2025-09-26 09:10:19,112 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0136 | Val rms_score: 0.3590
2025-09-26 09:10:31,384 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0126 | Val rms_score: 0.3661
2025-09-26 09:10:44,359 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0116 | Val rms_score: 0.3650
2025-09-26 09:10:56,425 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0125 | Val rms_score: 0.3627
2025-09-26 09:11:08,645 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0121 | Val rms_score: 0.3592
2025-09-26 09:11:21,000 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0121 | Val rms_score: 0.3638
2025-09-26 09:11:33,252 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0088 | Val rms_score: 0.3629
2025-09-26 09:11:46,312 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0112 | Val rms_score: 0.3667
2025-09-26 09:11:59,651 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0113 | Val rms_score: 0.3680
2025-09-26 09:12:11,673 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0112 | Val rms_score: 0.3660
2025-09-26 09:12:24,323 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0128 | Val rms_score: 0.3628
2025-09-26 09:12:37,108 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0115 | Val rms_score: 0.3640
2025-09-26 09:12:50,582 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0116 | Val rms_score: 0.3572
2025-09-26 09:13:03,142 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0110 | Val rms_score: 0.3622
2025-09-26 09:13:15,923 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0101 | Val rms_score: 0.3571
2025-09-26 09:13:29,042 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0116 | Val rms_score: 0.3586
2025-09-26 09:13:41,804 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0109 | Val rms_score: 0.3661
2025-09-26 09:13:55,060 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0103 | Val rms_score: 0.3589
2025-09-26 09:14:07,705 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0118 | Val rms_score: 0.3568
2025-09-26 09:14:20,527 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0103 | Val rms_score: 0.3647
2025-09-26 09:14:33,450 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0115 | Val rms_score: 0.3593
2025-09-26 09:14:46,297 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0107 | Val rms_score: 0.3603
2025-09-26 09:14:59,880 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0104 | Val rms_score: 0.3591
2025-09-26 09:15:12,691 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0102 | Val rms_score: 0.3641
2025-09-26 09:15:25,931 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0099 | Val rms_score: 0.3613
2025-09-26 09:15:38,793 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0109 | Val rms_score: 0.3649
2025-09-26 09:15:52,411 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0083 | Val rms_score: 0.3574
2025-09-26 09:16:04,493 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0104 | Val rms_score: 0.3632
2025-09-26 09:16:17,752 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0107 | Val rms_score: 0.3599
2025-09-26 09:16:30,692 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0098 | Val rms_score: 0.3631
2025-09-26 09:16:43,778 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0101 | Val rms_score: 0.3609
2025-09-26 09:16:56,849 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0096 | Val rms_score: 0.3575
2025-09-26 09:17:10,690 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0096 | Val rms_score: 0.3687
2025-09-26 09:17:24,146 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0099 | Val rms_score: 0.3580
2025-09-26 09:17:37,394 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0099 | Val rms_score: 0.3603
2025-09-26 09:17:50,720 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0091 | Val rms_score: 0.3603
2025-09-26 09:17:51,757 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Test rms_score: 0.4678
2025-09-26 09:17:52,092 - logs_modchembert_adme_solubility_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.4641, Std Dev: 0.0082
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_cl_epochs100_batch_size32_20250926_091752.log
ADDED
@@ -0,0 +1,317 @@
2025-09-26 09:17:52,093 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Running benchmark for dataset: astrazeneca_cl
|
| 2 |
+
2025-09-26 09:17:52,093 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - dataset: astrazeneca_cl, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
|
| 3 |
+
2025-09-26 09:17:52,121 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset astrazeneca_cl at 2025-09-26_09-17-52
|
| 4 |
+
2025-09-26 09:18:01,990 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7222 | Val rms_score: 0.4567
|
| 5 |
+
2025-09-26 09:18:01,990 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Global step of best model: 36
|
| 6 |
+
2025-09-26 09:18:02,667 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4567
|
| 7 |
+
2025-09-26 09:18:13,199 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4896 | Val rms_score: 0.4569
|
| 8 |
+
2025-09-26 09:18:22,845 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3750 | Val rms_score: 0.4770
|
| 9 |
+
2025-09-26 09:18:31,754 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2986 | Val rms_score: 0.4706
|
| 10 |
+
2025-09-26 09:18:41,141 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2517 | Val rms_score: 0.4658
|
| 11 |
+
2025-09-26 09:18:50,256 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1982 | Val rms_score: 0.4794
|
| 12 |
+
2025-09-26 09:18:58,946 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1658 | Val rms_score: 0.5264
|
| 13 |
+
2025-09-26 09:19:07,318 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1380 | Val rms_score: 0.4953
|
| 14 |
+
2025-09-26 09:19:16,449 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1322 | Val rms_score: 0.4942
2025-09-26 09:19:25,634 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1081 | Val rms_score: 0.4949
2025-09-26 09:19:34,810 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0929 | Val rms_score: 0.5086
2025-09-26 09:19:44,285 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0835 | Val rms_score: 0.5035
2025-09-26 09:19:53,268 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0842 | Val rms_score: 0.5044
2025-09-26 09:20:02,366 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0576 | Val rms_score: 0.4967
2025-09-26 09:20:11,850 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0690 | Val rms_score: 0.5047
2025-09-26 09:20:21,027 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0625 | Val rms_score: 0.5055
2025-09-26 09:20:30,593 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0671 | Val rms_score: 0.5109
2025-09-26 09:20:39,577 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0608 | Val rms_score: 0.5095
2025-09-26 09:20:48,677 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0556 | Val rms_score: 0.4954
2025-09-26 09:20:58,149 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0578 | Val rms_score: 0.4935
2025-09-26 09:21:07,412 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0514 | Val rms_score: 0.5016
2025-09-26 09:21:17,102 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0480 | Val rms_score: 0.5032
2025-09-26 09:21:25,982 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0477 | Val rms_score: 0.4987
2025-09-26 09:21:35,237 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0454 | Val rms_score: 0.4991
2025-09-26 09:21:44,444 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0432 | Val rms_score: 0.4936
2025-09-26 09:21:53,671 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0460 | Val rms_score: 0.4982
2025-09-26 09:22:03,421 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0477 | Val rms_score: 0.4954
2025-09-26 09:22:13,389 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0432 | Val rms_score: 0.4929
2025-09-26 09:22:22,618 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0375 | Val rms_score: 0.4982
2025-09-26 09:22:31,761 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0362 | Val rms_score: 0.4928
2025-09-26 09:22:40,815 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0427 | Val rms_score: 0.4942
2025-09-26 09:22:50,440 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0367 | Val rms_score: 0.4951
2025-09-26 09:22:59,339 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0339 | Val rms_score: 0.4986
2025-09-26 09:23:08,583 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0312 | Val rms_score: 0.4985
2025-09-26 09:23:18,235 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0336 | Val rms_score: 0.4914
2025-09-26 09:23:27,454 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0326 | Val rms_score: 0.4967
2025-09-26 09:23:36,950 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0356 | Val rms_score: 0.4948
2025-09-26 09:23:45,874 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0328 | Val rms_score: 0.5018
2025-09-26 09:23:55,280 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0251 | Val rms_score: 0.4957
2025-09-26 09:24:04,354 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0265 | Val rms_score: 0.4892
2025-09-26 09:24:13,471 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0251 | Val rms_score: 0.4917
2025-09-26 09:24:23,012 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0278 | Val rms_score: 0.4949
2025-09-26 09:24:31,993 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0347 | Val rms_score: 0.4959
2025-09-26 09:24:41,287 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0270 | Val rms_score: 0.4951
2025-09-26 09:24:50,690 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0275 | Val rms_score: 0.4888
2025-09-26 09:24:59,945 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0276 | Val rms_score: 0.4960
2025-09-26 09:25:09,596 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0260 | Val rms_score: 0.4888
2025-09-26 09:25:18,418 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0243 | Val rms_score: 0.4902
2025-09-26 09:25:27,678 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0229 | Val rms_score: 0.4919
2025-09-26 09:25:36,954 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0244 | Val rms_score: 0.4910
2025-09-26 09:25:46,133 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0240 | Val rms_score: 0.4904
2025-09-26 09:25:55,734 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0216 | Val rms_score: 0.4913
2025-09-26 09:26:04,644 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0168 | Val rms_score: 0.4873
2025-09-26 09:26:13,567 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0221 | Val rms_score: 0.4896
2025-09-26 09:26:22,608 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0202 | Val rms_score: 0.4912
2025-09-26 09:26:33,846 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0234 | Val rms_score: 0.4884
2025-09-26 09:26:43,157 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0248 | Val rms_score: 0.4953
2025-09-26 09:26:52,071 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0239 | Val rms_score: 0.4935
2025-09-26 09:27:01,436 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0218 | Val rms_score: 0.4909
2025-09-26 09:27:10,760 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0209 | Val rms_score: 0.4900
2025-09-26 09:27:19,803 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0199 | Val rms_score: 0.4905
2025-09-26 09:27:29,527 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0195 | Val rms_score: 0.4878
2025-09-26 09:27:38,464 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0196 | Val rms_score: 0.4906
2025-09-26 09:27:47,767 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0177 | Val rms_score: 0.4929
2025-09-26 09:27:57,122 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0194 | Val rms_score: 0.4857
2025-09-26 09:28:06,506 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0182 | Val rms_score: 0.4941
2025-09-26 09:28:16,021 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0185 | Val rms_score: 0.4847
2025-09-26 09:28:24,880 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0174 | Val rms_score: 0.4878
2025-09-26 09:28:34,057 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0190 | Val rms_score: 0.4912
2025-09-26 09:28:43,271 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0144 | Val rms_score: 0.4884
2025-09-26 09:28:52,557 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0163 | Val rms_score: 0.4852
2025-09-26 09:29:02,070 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0163 | Val rms_score: 0.4857
2025-09-26 09:29:10,742 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0202 | Val rms_score: 0.4932
2025-09-26 09:29:20,024 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0167 | Val rms_score: 0.4910
2025-09-26 09:29:29,203 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0181 | Val rms_score: 0.4856
2025-09-26 09:29:38,384 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0186 | Val rms_score: 0.4842
2025-09-26 09:29:48,045 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0199 | Val rms_score: 0.4863
2025-09-26 09:29:57,077 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0168 | Val rms_score: 0.4875
2025-09-26 09:30:06,243 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0155 | Val rms_score: 0.4851
2025-09-26 09:30:15,549 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0189 | Val rms_score: 0.4812
2025-09-26 09:30:24,592 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0146 | Val rms_score: 0.4875
2025-09-26 09:30:34,170 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0168 | Val rms_score: 0.4844
2025-09-26 09:30:42,952 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0187 | Val rms_score: 0.4909
2025-09-26 09:30:54,692 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0164 | Val rms_score: 0.4871
2025-09-26 09:31:02,772 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0156 | Val rms_score: 0.4843
2025-09-26 09:31:11,752 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0160 | Val rms_score: 0.4907
2025-09-26 09:31:21,270 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0151 | Val rms_score: 0.4917
2025-09-26 09:31:30,084 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0164 | Val rms_score: 0.4879
2025-09-26 09:31:39,297 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0112 | Val rms_score: 0.4877
2025-09-26 09:31:48,362 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0150 | Val rms_score: 0.4867
2025-09-26 09:31:57,518 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0140 | Val rms_score: 0.4853
2025-09-26 09:32:07,075 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0177 | Val rms_score: 0.4862
2025-09-26 09:32:15,972 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0149 | Val rms_score: 0.4874
2025-09-26 09:32:25,280 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0142 | Val rms_score: 0.4884
2025-09-26 09:32:34,335 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0154 | Val rms_score: 0.4864
2025-09-26 09:32:43,661 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0144 | Val rms_score: 0.4825
2025-09-26 09:32:52,901 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0149 | Val rms_score: 0.4845
2025-09-26 09:33:01,459 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0158 | Val rms_score: 0.4840
2025-09-26 09:33:10,575 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0143 | Val rms_score: 0.4843
2025-09-26 09:33:19,675 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0126 | Val rms_score: 0.4874
2025-09-26 09:33:20,540 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Test rms_score: 0.4976
2025-09-26 09:33:20,882 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset astrazeneca_cl at 2025-09-26_09-33-20
2025-09-26 09:33:29,060 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7188 | Val rms_score: 0.4531
2025-09-26 09:33:29,060 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Global step of best model: 36
2025-09-26 09:33:29,603 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4531
2025-09-26 09:33:38,146 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4826 | Val rms_score: 0.4851
2025-09-26 09:33:47,210 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3516 | Val rms_score: 0.4575
2025-09-26 09:33:56,385 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2830 | Val rms_score: 0.4716
2025-09-26 09:34:05,243 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2274 | Val rms_score: 0.4888
2025-09-26 09:34:15,013 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1973 | Val rms_score: 0.4898
2025-09-26 09:34:24,917 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1684 | Val rms_score: 0.4919
2025-09-26 09:34:34,160 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1372 | Val rms_score: 0.4948
2025-09-26 09:34:43,958 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1120 | Val rms_score: 0.4974
2025-09-26 09:34:53,692 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1102 | Val rms_score: 0.5046
2025-09-26 09:35:03,526 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0946 | Val rms_score: 0.4927
2025-09-26 09:35:13,354 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0835 | Val rms_score: 0.4947
2025-09-26 09:35:22,496 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0790 | Val rms_score: 0.4973
2025-09-26 09:35:32,060 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0630 | Val rms_score: 0.4955
2025-09-26 09:35:41,582 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0642 | Val rms_score: 0.4956
2025-09-26 09:35:51,138 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0629 | Val rms_score: 0.5051
2025-09-26 09:36:00,746 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0729 | Val rms_score: 0.5134
2025-09-26 09:36:10,277 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0621 | Val rms_score: 0.4993
2025-09-26 09:36:20,099 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0577 | Val rms_score: 0.4999
2025-09-26 09:36:29,989 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0637 | Val rms_score: 0.4966
2025-09-26 09:36:39,621 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0536 | Val rms_score: 0.4945
2025-09-26 09:36:49,559 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0519 | Val rms_score: 0.5062
2025-09-26 09:36:59,035 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0469 | Val rms_score: 0.4937
2025-09-26 09:37:09,222 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0445 | Val rms_score: 0.4974
2025-09-26 09:37:19,582 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0419 | Val rms_score: 0.4997
2025-09-26 09:37:29,964 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0464 | Val rms_score: 0.4848
2025-09-26 09:37:40,149 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0467 | Val rms_score: 0.4915
2025-09-26 09:37:51,547 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0481 | Val rms_score: 0.4920
2025-09-26 09:38:00,563 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0360 | Val rms_score: 0.4874
2025-09-26 09:38:09,537 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0349 | Val rms_score: 0.4894
2025-09-26 09:38:20,049 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0317 | Val rms_score: 0.4928
2025-09-26 09:38:30,503 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0356 | Val rms_score: 0.5002
2025-09-26 09:38:40,638 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0343 | Val rms_score: 0.4931
2025-09-26 09:38:51,206 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0339 | Val rms_score: 0.4821
2025-09-26 09:39:01,727 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0323 | Val rms_score: 0.4892
2025-09-26 09:39:12,301 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0321 | Val rms_score: 0.4885
2025-09-26 09:39:22,960 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0302 | Val rms_score: 0.4920
2025-09-26 09:39:33,039 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0310 | Val rms_score: 0.4921
2025-09-26 09:39:43,390 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0383 | Val rms_score: 0.4886
2025-09-26 09:39:53,749 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0308 | Val rms_score: 0.4878
2025-09-26 09:40:03,844 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0276 | Val rms_score: 0.4936
2025-09-26 09:40:14,023 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0298 | Val rms_score: 0.4901
2025-09-26 09:40:23,694 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0252 | Val rms_score: 0.4863
2025-09-26 09:40:34,048 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0278 | Val rms_score: 0.4851
2025-09-26 09:40:44,139 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0252 | Val rms_score: 0.4905
2025-09-26 09:40:54,419 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0272 | Val rms_score: 0.4888
2025-09-26 09:41:04,655 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0243 | Val rms_score: 0.4918
2025-09-26 09:41:14,264 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0262 | Val rms_score: 0.4841
2025-09-26 09:41:24,314 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0260 | Val rms_score: 0.4890
2025-09-26 09:41:34,015 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0226 | Val rms_score: 0.4933
2025-09-26 09:41:43,799 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0232 | Val rms_score: 0.4859
2025-09-26 09:41:53,512 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0252 | Val rms_score: 0.4880
2025-09-26 09:42:02,465 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0248 | Val rms_score: 0.4853
2025-09-26 09:42:12,017 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0234 | Val rms_score: 0.4895
2025-09-26 09:42:21,667 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0237 | Val rms_score: 0.4816
2025-09-26 09:42:33,016 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0242 | Val rms_score: 0.4866
2025-09-26 09:42:42,672 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0206 | Val rms_score: 0.4811
2025-09-26 09:42:51,517 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0224 | Val rms_score: 0.4903
2025-09-26 09:43:01,049 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0229 | Val rms_score: 0.4897
2025-09-26 09:43:10,212 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0200 | Val rms_score: 0.4874
2025-09-26 09:43:19,410 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0209 | Val rms_score: 0.4829
2025-09-26 09:43:28,990 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0205 | Val rms_score: 0.4812
2025-09-26 09:43:37,802 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0194 | Val rms_score: 0.4909
2025-09-26 09:43:46,867 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0291 | Val rms_score: 0.4811
2025-09-26 09:43:56,179 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0216 | Val rms_score: 0.4834
2025-09-26 09:44:05,549 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0181 | Val rms_score: 0.4829
2025-09-26 09:44:14,993 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0204 | Val rms_score: 0.4835
2025-09-26 09:44:23,631 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0200 | Val rms_score: 0.4826
2025-09-26 09:44:32,974 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0191 | Val rms_score: 0.4873
2025-09-26 09:44:42,185 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0192 | Val rms_score: 0.4863
2025-09-26 09:44:51,703 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0174 | Val rms_score: 0.4885
2025-09-26 09:45:01,521 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0170 | Val rms_score: 0.4809
2025-09-26 09:45:10,496 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0188 | Val rms_score: 0.4830
2025-09-26 09:45:19,969 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0171 | Val rms_score: 0.4789
2025-09-26 09:45:29,083 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0171 | Val rms_score: 0.4813
2025-09-26 09:45:38,413 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0166 | Val rms_score: 0.4810
2025-09-26 09:45:48,111 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0167 | Val rms_score: 0.4825
2025-09-26 09:45:57,064 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0164 | Val rms_score: 0.4802
2025-09-26 09:46:06,228 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0169 | Val rms_score: 0.4817
2025-09-26 09:46:15,497 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0161 | Val rms_score: 0.4840
2025-09-26 09:46:24,838 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0142 | Val rms_score: 0.4796
2025-09-26 09:46:34,432 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0156 | Val rms_score: 0.4795
2025-09-26 09:46:43,388 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0161 | Val rms_score: 0.4803
2025-09-26 09:46:55,111 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0172 | Val rms_score: 0.4796
2025-09-26 09:47:03,014 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0158 | Val rms_score: 0.4795
2025-09-26 09:47:12,457 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0148 | Val rms_score: 0.4793
2025-09-26 09:47:22,276 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0162 | Val rms_score: 0.4820
2025-09-26 09:47:31,486 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0155 | Val rms_score: 0.4782
2025-09-26 09:47:40,792 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0156 | Val rms_score: 0.4758
2025-09-26 09:47:50,082 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0166 | Val rms_score: 0.4801
2025-09-26 09:47:59,253 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0148 | Val rms_score: 0.4765
2025-09-26 09:48:09,020 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0144 | Val rms_score: 0.4753
2025-09-26 09:48:18,072 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0153 | Val rms_score: 0.4782
2025-09-26 09:48:27,221 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0144 | Val rms_score: 0.4794
2025-09-26 09:48:36,311 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0160 | Val rms_score: 0.4787
2025-09-26 09:48:45,714 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0164 | Val rms_score: 0.4786
2025-09-26 09:48:55,377 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0140 | Val rms_score: 0.4789
2025-09-26 09:49:04,006 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0138 | Val rms_score: 0.4835
2025-09-26 09:49:13,183 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0155 | Val rms_score: 0.4797
2025-09-26 09:49:22,289 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0156 | Val rms_score: 0.4798
2025-09-26 09:49:23,159 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Test rms_score: 0.4884
2025-09-26 09:49:23,486 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset astrazeneca_cl at 2025-09-26_09-49-23
2025-09-26 09:49:32,071 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7396 | Val rms_score: 0.4637
2025-09-26 09:49:32,072 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Global step of best model: 36
2025-09-26 09:49:32,660 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.4637
2025-09-26 09:49:41,393 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.5035 | Val rms_score: 0.4558
2025-09-26 09:49:41,596 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Global step of best model: 72
|
| 217 |
+
2025-09-26 09:49:42,138 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.4558
|
| 218 |
+
2025-09-26 09:49:51,436 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.4023 | Val rms_score: 0.4589
|
| 219 |
+
2025-09-26 09:50:00,820 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3194 | Val rms_score: 0.4624
|
| 220 |
+
2025-09-26 09:50:08,812 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2222 | Val rms_score: 0.4809
|
| 221 |
+
2025-09-26 09:50:18,092 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1777 | Val rms_score: 0.4936
|
| 222 |
+
2025-09-26 09:50:27,779 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1597 | Val rms_score: 0.4822
|
| 223 |
+
2025-09-26 09:50:36,394 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1302 | Val rms_score: 0.4859
|
| 224 |
+
2025-09-26 09:50:45,771 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1068 | Val rms_score: 0.4925
|
| 225 |
+
2025-09-26 09:50:55,052 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1024 | Val rms_score: 0.4859
|
| 226 |
+
2025-09-26 09:51:04,139 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0911 | Val rms_score: 0.4936
|
| 227 |
+
2025-09-26 09:51:13,778 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0854 | Val rms_score: 0.4898
|
| 228 |
+
2025-09-26 09:51:22,675 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0729 | Val rms_score: 0.4860
|
| 229 |
+
2025-09-26 09:51:31,938 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0654 | Val rms_score: 0.4895
|
| 230 |
+
2025-09-26 09:51:41,401 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0738 | Val rms_score: 0.4898
|
| 231 |
+
2025-09-26 09:51:50,399 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0629 | Val rms_score: 0.4942
|
| 232 |
+
2025-09-26 09:51:59,890 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0775 | Val rms_score: 0.4827
|
| 233 |
+
2025-09-26 09:52:08,792 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0551 | Val rms_score: 0.4838
|
| 234 |
+
2025-09-26 09:52:17,986 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0538 | Val rms_score: 0.4890
|
| 235 |
+
2025-09-26 09:52:27,187 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0543 | Val rms_score: 0.4826
|
| 236 |
+
2025-09-26 09:52:36,602 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0482 | Val rms_score: 0.4895
|
| 237 |
+
2025-09-26 09:52:46,121 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0473 | Val rms_score: 0.4858
|
| 238 |
+
2025-09-26 09:52:54,627 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0472 | Val rms_score: 0.4985
|
| 239 |
+
2025-09-26 09:53:04,328 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0451 | Val rms_score: 0.4820
|
| 240 |
+
2025-09-26 09:53:13,309 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0447 | Val rms_score: 0.4836
|
| 241 |
+
2025-09-26 09:53:22,767 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0399 | Val rms_score: 0.4837
|
| 242 |
+
2025-09-26 09:53:32,132 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0395 | Val rms_score: 0.4890
|
| 243 |
+
2025-09-26 09:53:42,110 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0449 | Val rms_score: 0.4878
|
| 244 |
+
2025-09-26 09:53:51,647 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0378 | Val rms_score: 0.4831
|
| 245 |
+
2025-09-26 09:54:01,001 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0380 | Val rms_score: 0.4826
|
| 246 |
+
2025-09-26 09:54:10,378 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0342 | Val rms_score: 0.4876
|
| 247 |
+
2025-09-26 09:54:20,171 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0397 | Val rms_score: 0.4814
|
| 248 |
+
2025-09-26 09:54:29,048 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0417 | Val rms_score: 0.4804
|
| 249 |
+
2025-09-26 09:54:37,948 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0355 | Val rms_score: 0.4918
|
| 250 |
+
2025-09-26 09:54:47,390 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0310 | Val rms_score: 0.4874
|
| 251 |
+
2025-09-26 09:54:56,643 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0286 | Val rms_score: 0.4845
|
| 252 |
+
2025-09-26 09:55:06,439 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0292 | Val rms_score: 0.4820
|
| 253 |
+
2025-09-26 09:55:15,358 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0275 | Val rms_score: 0.4818
|
| 254 |
+
2025-09-26 09:55:24,591 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0295 | Val rms_score: 0.4827
|
| 255 |
+
2025-09-26 09:55:33,954 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0280 | Val rms_score: 0.4750
|
| 256 |
+
2025-09-26 09:55:43,416 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0302 | Val rms_score: 0.4843
|
| 257 |
+
2025-09-26 09:55:53,255 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0265 | Val rms_score: 0.4793
|
| 258 |
+
2025-09-26 09:56:02,758 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0297 | Val rms_score: 0.4767
|
| 259 |
+
2025-09-26 09:56:12,141 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0308 | Val rms_score: 0.4795
|
| 260 |
+
2025-09-26 09:56:21,691 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0283 | Val rms_score: 0.4817
|
| 261 |
+
2025-09-26 09:56:31,224 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0239 | Val rms_score: 0.4805
|
| 262 |
+
2025-09-26 09:56:41,189 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0224 | Val rms_score: 0.4846
|
| 263 |
+
2025-09-26 09:56:50,608 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0222 | Val rms_score: 0.4841
|
| 264 |
+
2025-09-26 09:57:00,026 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0235 | Val rms_score: 0.4747
|
| 265 |
+
2025-09-26 09:57:09,587 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0221 | Val rms_score: 0.4809
|
| 266 |
+
2025-09-26 09:57:18,963 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0228 | Val rms_score: 0.4841
|
| 267 |
+
2025-09-26 09:57:29,245 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0224 | Val rms_score: 0.4822
|
| 268 |
+
2025-09-26 09:57:38,816 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0203 | Val rms_score: 0.4806
|
| 269 |
+
2025-09-26 09:57:48,497 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0216 | Val rms_score: 0.4759
|
| 270 |
+
2025-09-26 09:57:58,294 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0216 | Val rms_score: 0.4802
|
| 271 |
+
2025-09-26 09:58:09,133 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0214 | Val rms_score: 0.4805
|
| 272 |
+
2025-09-26 09:58:18,495 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0202 | Val rms_score: 0.4785
|
| 273 |
+
2025-09-26 09:58:28,167 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0207 | Val rms_score: 0.4795
|
| 274 |
+
2025-09-26 09:58:37,915 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0212 | Val rms_score: 0.4791
|
| 275 |
+
2025-09-26 09:58:47,714 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0212 | Val rms_score: 0.4800
|
| 276 |
+
2025-09-26 09:58:57,589 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0229 | Val rms_score: 0.4850
|
| 277 |
+
2025-09-26 09:59:07,946 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0206 | Val rms_score: 0.4799
|
| 278 |
+
2025-09-26 09:59:17,596 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0201 | Val rms_score: 0.4769
|
| 279 |
+
2025-09-26 09:59:27,500 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0194 | Val rms_score: 0.4799
|
| 280 |
+
2025-09-26 09:59:37,415 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0195 | Val rms_score: 0.4710
|
| 281 |
+
2025-09-26 09:59:47,111 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0178 | Val rms_score: 0.4794
|
| 282 |
+
2025-09-26 09:59:57,386 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0194 | Val rms_score: 0.4795
|
| 283 |
+
2025-09-26 10:00:06,897 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0191 | Val rms_score: 0.4757
|
| 284 |
+
2025-09-26 10:00:15,973 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0170 | Val rms_score: 0.4768
|
| 285 |
+
2025-09-26 10:00:25,269 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0198 | Val rms_score: 0.4778
|
| 286 |
+
2025-09-26 10:00:34,394 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0178 | Val rms_score: 0.4794
|
| 287 |
+
2025-09-26 10:00:44,032 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0184 | Val rms_score: 0.4822
|
| 288 |
+
2025-09-26 10:00:52,827 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0176 | Val rms_score: 0.4762
|
| 289 |
+
2025-09-26 10:01:02,283 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0177 | Val rms_score: 0.4816
|
| 290 |
+
2025-09-26 10:01:11,438 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0163 | Val rms_score: 0.4750
|
| 291 |
+
2025-09-26 10:01:20,466 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0162 | Val rms_score: 0.4780
|
| 292 |
+
2025-09-26 10:01:30,196 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0153 | Val rms_score: 0.4800
|
| 293 |
+
2025-09-26 10:01:38,960 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0178 | Val rms_score: 0.4800
|
| 294 |
+
2025-09-26 10:01:48,516 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0160 | Val rms_score: 0.4778
|
| 295 |
+
2025-09-26 10:01:57,629 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0158 | Val rms_score: 0.4784
|
| 296 |
+
2025-09-26 10:02:06,977 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0157 | Val rms_score: 0.4761
|
| 297 |
+
2025-09-26 10:02:16,702 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0161 | Val rms_score: 0.4795
|
| 298 |
+
2025-09-26 10:02:25,679 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0165 | Val rms_score: 0.4750
|
| 299 |
+
2025-09-26 10:02:36,520 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0143 | Val rms_score: 0.4738
|
| 300 |
+
2025-09-26 10:02:45,315 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0151 | Val rms_score: 0.4756
|
| 301 |
+
2025-09-26 10:02:54,436 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0166 | Val rms_score: 0.4752
|
| 302 |
+
2025-09-26 10:03:04,079 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0165 | Val rms_score: 0.4782
|
| 303 |
+
2025-09-26 10:03:13,032 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0155 | Val rms_score: 0.4776
|
| 304 |
+
2025-09-26 10:03:22,199 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0110 | Val rms_score: 0.4744
|
| 305 |
+
2025-09-26 10:03:31,651 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0156 | Val rms_score: 0.4791
|
| 306 |
+
2025-09-26 10:03:40,801 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0155 | Val rms_score: 0.4722
|
| 307 |
+
2025-09-26 10:03:50,565 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0138 | Val rms_score: 0.4734
|
| 308 |
+
2025-09-26 10:03:59,707 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0152 | Val rms_score: 0.4781
|
| 309 |
+
2025-09-26 10:04:08,616 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0144 | Val rms_score: 0.4753
|
| 310 |
+
2025-09-26 10:04:17,898 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0150 | Val rms_score: 0.4725
|
| 311 |
+
2025-09-26 10:04:27,065 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0137 | Val rms_score: 0.4712
|
| 312 |
+
2025-09-26 10:04:36,846 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0146 | Val rms_score: 0.4763
|
| 313 |
+
2025-09-26 10:04:45,563 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0132 | Val rms_score: 0.4794
|
| 314 |
+
2025-09-26 10:04:54,764 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0146 | Val rms_score: 0.4799
|
| 315 |
+
2025-09-26 10:05:03,930 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0158 | Val rms_score: 0.4755
|
| 316 |
+
2025-09-26 10:05:04,815 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Test rms_score: 0.5207
|
| 317 |
+
2025-09-26 10:05:05,185 - logs_modchembert_astrazeneca_cl_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.5022, Std Dev: 0.0136
|
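The "Final Triplicate Test Results" line aggregates the per-run test rms_scores. A minimal sketch of that aggregation, assuming the population standard deviation (NumPy's default, `ddof=0`); only runs 2 and 3 (0.4884 and 0.5207) appear in this excerpt, so 0.4975 below is the value implied by the reported 0.5022 average and is used purely for illustration:

```python
import numpy as np


def summarize(test_scores):
    """Return (avg, std) of per-run test rms_scores, rounded to 4 decimals.

    Uses the population standard deviation (ddof=0), which reproduces
    the Std Dev reported in the log line above.
    """
    scores = np.asarray(test_scores, dtype=float)
    return round(float(scores.mean()), 4), round(float(scores.std()), 4)


# Run 1's score falls outside this excerpt; 0.4975 is illustrative only.
avg, std = summarize([0.4975, 0.4884, 0.5207])
print(avg, std)  # 0.5022 0.0136
```

Note that the sample standard deviation (`ddof=1`) would give roughly 0.0167 for the same three scores, so the reported 0.0136 is consistent with the population formulation.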
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_logd74_epochs100_batch_size32_20250926_100505.log ADDED
@@ -0,0 +1,407 @@
| 1 |
+
2025-09-26 10:05:05,187 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Running benchmark for dataset: astrazeneca_logd74
|
| 2 |
+
2025-09-26 10:05:05,187 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - dataset: astrazeneca_logd74, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
|
| 3 |
+
2025-09-26 10:05:05,193 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset astrazeneca_logd74 at 2025-09-26_10-05-05
|
| 4 |
+
2025-09-26 10:05:27,132 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.3375 | Val rms_score: 0.7430
|
| 5 |
+
2025-09-26 10:05:27,132 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 105
|
| 6 |
+
2025-09-26 10:05:27,738 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.7430
|
| 7 |
+
2025-09-26 10:05:53,253 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.1547 | Val rms_score: 0.7041
|
| 8 |
+
2025-09-26 10:05:53,428 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 210
|
| 9 |
+
2025-09-26 10:05:53,960 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.7041
|
| 10 |
+
2025-09-26 10:06:18,236 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.1318 | Val rms_score: 0.6968
|
| 11 |
+
2025-09-26 10:06:18,417 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 315
|
| 12 |
+
2025-09-26 10:06:18,947 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.6968
|
| 13 |
+
2025-09-26 10:06:44,866 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1094 | Val rms_score: 0.7011
|
| 14 |
+
2025-09-26 10:07:10,577 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.0869 | Val rms_score: 0.6958
|
| 15 |
+
2025-09-26 10:07:10,732 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 525
|
| 16 |
+
2025-09-26 10:07:11,407 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val rms_score: 0.6958
|
| 17 |
+
2025-09-26 10:07:37,183 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0823 | Val rms_score: 0.7217
|
| 18 |
+
2025-09-26 10:08:02,730 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0674 | Val rms_score: 0.6919
|
| 19 |
+
2025-09-26 10:08:02,917 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 735
|
| 20 |
+
2025-09-26 10:08:03,575 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val rms_score: 0.6919
|
| 21 |
+
2025-09-26 10:08:27,499 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0609 | Val rms_score: 0.6989
|
| 22 |
+
2025-09-26 10:08:53,307 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0545 | Val rms_score: 0.7057
|
| 23 |
+
2025-09-26 10:09:20,760 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0497 | Val rms_score: 0.6837
|
| 24 |
+
2025-09-26 10:09:20,920 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 1050
|
| 25 |
+
2025-09-26 10:09:21,513 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 10 with val rms_score: 0.6837
|
| 26 |
+
2025-09-26 10:09:46,497 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0486 | Val rms_score: 0.6897
|
| 27 |
+
2025-09-26 10:10:11,329 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0427 | Val rms_score: 0.6978
|
| 28 |
+
2025-09-26 10:10:36,937 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0394 | Val rms_score: 0.6956
|
| 29 |
+
2025-09-26 10:11:02,673 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0375 | Val rms_score: 0.6808
|
| 30 |
+
2025-09-26 10:11:02,832 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 1470
|
| 31 |
+
2025-09-26 10:11:03,378 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 14 with val rms_score: 0.6808
|
| 32 |
+
2025-09-26 10:11:28,123 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0333 | Val rms_score: 0.6840
|
| 33 |
+
2025-09-26 10:11:53,960 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0312 | Val rms_score: 0.6902
|
| 34 |
+
2025-09-26 10:12:19,774 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0312 | Val rms_score: 0.6871
|
| 35 |
+
2025-09-26 10:12:45,420 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0285 | Val rms_score: 0.6797
|
| 36 |
+
2025-09-26 10:12:45,575 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 1890
|
| 37 |
+
2025-09-26 10:12:46,156 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 18 with val rms_score: 0.6797
|
| 38 |
+
2025-09-26 10:13:11,734 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0276 | Val rms_score: 0.6858
|
| 39 |
+
2025-09-26 10:13:37,474 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0269 | Val rms_score: 0.6844
|
| 40 |
+
2025-09-26 10:14:02,903 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0264 | Val rms_score: 0.6823
|
| 41 |
+
2025-09-26 10:14:28,695 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0307 | Val rms_score: 0.6854
|
| 42 |
+
2025-09-26 10:14:54,293 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0262 | Val rms_score: 0.6801
|
| 43 |
+
2025-09-26 10:15:20,001 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0250 | Val rms_score: 0.6840
|
| 44 |
+
2025-09-26 10:15:45,548 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0269 | Val rms_score: 0.6848
|
| 45 |
+
2025-09-26 10:16:11,183 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0245 | Val rms_score: 0.6821
|
| 46 |
+
2025-09-26 10:16:36,994 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0222 | Val rms_score: 0.6821
|
| 47 |
+
2025-09-26 10:17:01,401 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0208 | Val rms_score: 0.6842
|
| 48 |
+
2025-09-26 10:17:29,609 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0203 | Val rms_score: 0.6782
|
| 49 |
+
2025-09-26 10:17:29,788 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 3045
|
| 50 |
+
2025-09-26 10:17:30,341 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 29 with val rms_score: 0.6782
|
| 51 |
+
2025-09-26 10:17:54,910 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0214 | Val rms_score: 0.6852
|
| 52 |
+
2025-09-26 10:18:21,015 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0216 | Val rms_score: 0.6773
|
| 53 |
+
2025-09-26 10:18:21,502 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 3255
|
| 54 |
+
2025-09-26 10:18:22,052 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 31 with val rms_score: 0.6773
|
| 55 |
+
2025-09-26 10:18:47,268 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0204 | Val rms_score: 0.6832
|
| 56 |
+
2025-09-26 10:19:12,652 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0195 | Val rms_score: 0.6789
|
| 57 |
+
2025-09-26 10:19:38,493 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0192 | Val rms_score: 0.6752
|
| 58 |
+
2025-09-26 10:19:38,686 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 3570
|
| 59 |
+
2025-09-26 10:19:39,235 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 34 with val rms_score: 0.6752
|
| 60 |
+
2025-09-26 10:20:03,840 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0193 | Val rms_score: 0.6763
|
| 61 |
+
2025-09-26 10:20:28,494 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0181 | Val rms_score: 0.6748
|
| 62 |
+
2025-09-26 10:20:28,974 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 3780
|
| 63 |
+
2025-09-26 10:20:29,573 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 36 with val rms_score: 0.6748
|
| 64 |
+
2025-09-26 10:20:55,270 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0186 | Val rms_score: 0.6782
|
| 65 |
+
2025-09-26 10:21:20,832 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0175 | Val rms_score: 0.6792
|
| 66 |
+
2025-09-26 10:21:46,236 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0183 | Val rms_score: 0.6778
|
| 67 |
+
2025-09-26 10:22:11,960 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0177 | Val rms_score: 0.6807
|
| 68 |
+
2025-09-26 10:22:37,845 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0148 | Val rms_score: 0.6776
|
| 69 |
+
2025-09-26 10:23:03,314 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0193 | Val rms_score: 0.6694
|
| 70 |
+
2025-09-26 10:23:03,472 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 4410
|
| 71 |
+
2025-09-26 10:23:04,078 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 42 with val rms_score: 0.6694
|
| 72 |
+
2025-09-26 10:23:28,767 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0176 | Val rms_score: 0.6810
|
| 73 |
+
2025-09-26 10:23:54,473 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0158 | Val rms_score: 0.6800
|
| 74 |
+
2025-09-26 10:24:20,278 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0161 | Val rms_score: 0.6752
|
| 75 |
+
2025-09-26 10:24:46,049 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0159 | Val rms_score: 0.6769
|
| 76 |
+
2025-09-26 10:25:11,479 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0161 | Val rms_score: 0.6791
|
| 77 |
+
2025-09-26 10:25:38,836 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0149 | Val rms_score: 0.6777
|
| 78 |
+
2025-09-26 10:26:03,547 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0148 | Val rms_score: 0.6724
|
| 79 |
+
2025-09-26 10:26:29,164 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0152 | Val rms_score: 0.6874
|
| 80 |
+
2025-09-26 10:26:54,700 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0158 | Val rms_score: 0.6747
|
| 81 |
+
2025-09-26 10:27:20,562 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0149 | Val rms_score: 0.6788
|
| 82 |
+
2025-09-26 10:27:46,412 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0147 | Val rms_score: 0.6743
|
| 83 |
+
2025-09-26 10:28:11,099 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0145 | Val rms_score: 0.6692
|
| 84 |
+
2025-09-26 10:28:11,257 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 5670
2025-09-26 10:28:11,825 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 54 with val rms_score: 0.6692
2025-09-26 10:28:37,690 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0142 | Val rms_score: 0.6720
2025-09-26 10:29:03,177 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0141 | Val rms_score: 0.6731
2025-09-26 10:29:28,722 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0140 | Val rms_score: 0.6794
2025-09-26 10:29:54,087 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0134 | Val rms_score: 0.6748
2025-09-26 10:30:19,630 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0140 | Val rms_score: 0.6718
2025-09-26 10:30:45,529 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0143 | Val rms_score: 0.6715
2025-09-26 10:31:11,266 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0147 | Val rms_score: 0.6743
2025-09-26 10:31:36,544 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0126 | Val rms_score: 0.6731
2025-09-26 10:32:01,950 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0143 | Val rms_score: 0.6718
2025-09-26 10:32:27,979 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0129 | Val rms_score: 0.6788
2025-09-26 10:32:54,131 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0126 | Val rms_score: 0.6766
2025-09-26 10:33:19,877 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0137 | Val rms_score: 0.6740
2025-09-26 10:33:46,478 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0130 | Val rms_score: 0.6704
2025-09-26 10:34:10,687 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0120 | Val rms_score: 0.6761
2025-09-26 10:34:36,388 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0135 | Val rms_score: 0.6702
2025-09-26 10:35:00,928 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0119 | Val rms_score: 0.6701
2025-09-26 10:35:25,175 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0132 | Val rms_score: 0.6752
2025-09-26 10:35:51,037 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0134 | Val rms_score: 0.6705
2025-09-26 10:36:15,550 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0132 | Val rms_score: 0.6725
2025-09-26 10:36:41,175 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0137 | Val rms_score: 0.6732
2025-09-26 10:37:07,158 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0121 | Val rms_score: 0.6732
2025-09-26 10:37:32,783 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0122 | Val rms_score: 0.6683
2025-09-26 10:37:33,325 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 7980
2025-09-26 10:37:33,909 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 76 with val rms_score: 0.6683
2025-09-26 10:38:00,004 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0129 | Val rms_score: 0.6723
2025-09-26 10:38:25,495 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0124 | Val rms_score: 0.6754
2025-09-26 10:38:51,195 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0123 | Val rms_score: 0.6683
2025-09-26 10:39:15,570 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0126 | Val rms_score: 0.6708
2025-09-26 10:39:41,245 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0127 | Val rms_score: 0.6730
2025-09-26 10:40:06,603 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0119 | Val rms_score: 0.6708
2025-09-26 10:40:32,216 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0135 | Val rms_score: 0.6749
2025-09-26 10:40:58,032 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0112 | Val rms_score: 0.6723
2025-09-26 10:41:24,127 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0123 | Val rms_score: 0.6726
2025-09-26 10:41:51,181 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0121 | Val rms_score: 0.6778
2025-09-26 10:42:15,647 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0109 | Val rms_score: 0.6671
2025-09-26 10:42:15,805 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 9135
2025-09-26 10:42:16,412 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 87 with val rms_score: 0.6671
2025-09-26 10:42:41,052 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0120 | Val rms_score: 0.6712
2025-09-26 10:43:06,785 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0122 | Val rms_score: 0.6670
2025-09-26 10:43:06,947 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 9345
2025-09-26 10:43:07,539 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 89 with val rms_score: 0.6670
2025-09-26 10:43:33,160 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0118 | Val rms_score: 0.6695
2025-09-26 10:43:58,693 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0110 | Val rms_score: 0.6724
2025-09-26 10:44:24,368 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0115 | Val rms_score: 0.6649
2025-09-26 10:44:24,549 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 9660
2025-09-26 10:44:25,173 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 92 with val rms_score: 0.6649
2025-09-26 10:44:51,073 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0121 | Val rms_score: 0.6659
2025-09-26 10:45:15,711 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0114 | Val rms_score: 0.6658
2025-09-26 10:45:41,569 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0118 | Val rms_score: 0.6690
2025-09-26 10:46:07,945 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0115 | Val rms_score: 0.6690
2025-09-26 10:46:33,747 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0115 | Val rms_score: 0.6654
2025-09-26 10:46:58,261 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0107 | Val rms_score: 0.6707
2025-09-26 10:47:23,877 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0110 | Val rms_score: 0.6697
2025-09-26 10:47:49,631 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0109 | Val rms_score: 0.6683
2025-09-26 10:47:51,274 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Test rms_score: 0.7439
2025-09-26 10:47:51,711 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset astrazeneca_logd74 at 2025-09-26_10-47-51
2025-09-26 10:48:15,914 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.2266 | Val rms_score: 0.7203
2025-09-26 10:48:15,914 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 105
2025-09-26 10:48:16,671 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.7203
2025-09-26 10:48:43,356 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.1977 | Val rms_score: 0.7089
2025-09-26 10:48:43,544 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 210
2025-09-26 10:48:44,124 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.7089
2025-09-26 10:49:08,724 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.1354 | Val rms_score: 0.6826
2025-09-26 10:49:08,918 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 315
2025-09-26 10:49:09,535 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.6826
2025-09-26 10:49:34,252 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1109 | Val rms_score: 0.6945
2025-09-26 10:49:59,894 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.0944 | Val rms_score: 0.6866
2025-09-26 10:50:25,711 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0766 | Val rms_score: 0.6886
2025-09-26 10:50:51,660 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0656 | Val rms_score: 0.6902
2025-09-26 10:51:17,376 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0555 | Val rms_score: 0.6720
2025-09-26 10:51:17,538 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 840
2025-09-26 10:51:18,123 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val rms_score: 0.6720
2025-09-26 10:51:44,220 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0563 | Val rms_score: 0.6650
2025-09-26 10:51:44,378 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 945
2025-09-26 10:51:45,066 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 9 with val rms_score: 0.6650
2025-09-26 10:52:11,685 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0534 | Val rms_score: 0.6851
2025-09-26 10:52:36,183 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0455 | Val rms_score: 0.6949
2025-09-26 10:53:02,016 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0388 | Val rms_score: 0.6685
2025-09-26 10:53:26,632 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0377 | Val rms_score: 0.6787
2025-09-26 10:53:52,463 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0359 | Val rms_score: 0.6749
2025-09-26 10:54:18,654 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0323 | Val rms_score: 0.6775
2025-09-26 10:54:44,306 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0338 | Val rms_score: 0.6848
2025-09-26 10:55:10,005 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0333 | Val rms_score: 0.6672
2025-09-26 10:55:35,916 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0278 | Val rms_score: 0.6741
2025-09-26 10:56:01,724 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0278 | Val rms_score: 0.6741
2025-09-26 10:56:26,706 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0281 | Val rms_score: 0.6754
2025-09-26 10:56:52,900 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0281 | Val rms_score: 0.6728
2025-09-26 10:57:18,794 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0293 | Val rms_score: 0.6747
2025-09-26 10:57:44,521 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0238 | Val rms_score: 0.6750
2025-09-26 10:58:10,420 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0246 | Val rms_score: 0.6754
2025-09-26 10:58:35,976 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0211 | Val rms_score: 0.6772
2025-09-26 10:59:01,880 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0204 | Val rms_score: 0.6715
2025-09-26 10:59:26,313 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0215 | Val rms_score: 0.6702
2025-09-26 10:59:52,452 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0200 | Val rms_score: 0.6677
2025-09-26 11:00:19,142 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0181 | Val rms_score: 0.6726
2025-09-26 11:00:43,404 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0200 | Val rms_score: 0.6694
2025-09-26 11:01:09,075 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0217 | Val rms_score: 0.6634
2025-09-26 11:01:09,584 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 3255
2025-09-26 11:01:10,226 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 31 with val rms_score: 0.6634
2025-09-26 11:01:35,457 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0193 | Val rms_score: 0.6689
2025-09-26 11:01:59,968 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0207 | Val rms_score: 0.6668
2025-09-26 11:02:25,660 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0187 | Val rms_score: 0.6725
2025-09-26 11:02:51,638 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0183 | Val rms_score: 0.6730
2025-09-26 11:03:17,489 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0174 | Val rms_score: 0.6707
2025-09-26 11:03:43,342 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0175 | Val rms_score: 0.6645
2025-09-26 11:04:08,609 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0185 | Val rms_score: 0.6740
2025-09-26 11:04:34,729 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0179 | Val rms_score: 0.6661
2025-09-26 11:05:00,788 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0173 | Val rms_score: 0.6688
2025-09-26 11:05:26,745 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0149 | Val rms_score: 0.6779
2025-09-26 11:05:53,184 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0155 | Val rms_score: 0.6708
2025-09-26 11:06:19,391 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0177 | Val rms_score: 0.6682
2025-09-26 11:06:45,761 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0157 | Val rms_score: 0.6587
2025-09-26 11:06:45,925 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 4620
2025-09-26 11:06:46,487 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 44 with val rms_score: 0.6587
2025-09-26 11:07:12,402 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0161 | Val rms_score: 0.6649
2025-09-26 11:07:38,989 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0153 | Val rms_score: 0.6663
2025-09-26 11:08:05,814 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0157 | Val rms_score: 0.6617
2025-09-26 11:08:33,042 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0161 | Val rms_score: 0.6636
2025-09-26 11:08:58,000 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0155 | Val rms_score: 0.6648
2025-09-26 11:09:24,241 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0144 | Val rms_score: 0.6673
2025-09-26 11:09:50,198 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0144 | Val rms_score: 0.6691
2025-09-26 11:10:17,043 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0144 | Val rms_score: 0.6637
2025-09-26 11:10:43,199 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0148 | Val rms_score: 0.6716
2025-09-26 11:11:09,601 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0146 | Val rms_score: 0.6690
2025-09-26 11:11:36,083 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0138 | Val rms_score: 0.6653
2025-09-26 11:12:02,226 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0130 | Val rms_score: 0.6660
2025-09-26 11:12:28,895 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0142 | Val rms_score: 0.6692
2025-09-26 11:12:55,589 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0144 | Val rms_score: 0.6664
2025-09-26 11:13:21,579 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0132 | Val rms_score: 0.6634
2025-09-26 11:13:47,112 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0130 | Val rms_score: 0.6637
2025-09-26 11:14:14,479 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0134 | Val rms_score: 0.6679
2025-09-26 11:14:41,215 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0128 | Val rms_score: 0.6617
2025-09-26 11:15:08,393 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0138 | Val rms_score: 0.6661
2025-09-26 11:15:34,807 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0140 | Val rms_score: 0.6710
2025-09-26 11:16:01,503 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0125 | Val rms_score: 0.6643
2025-09-26 11:16:28,083 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0137 | Val rms_score: 0.6629
2025-09-26 11:16:55,222 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0129 | Val rms_score: 0.6671
2025-09-26 11:17:20,665 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0127 | Val rms_score: 0.6657
2025-09-26 11:17:46,558 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0124 | Val rms_score: 0.6630
2025-09-26 11:18:12,624 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0130 | Val rms_score: 0.6634
2025-09-26 11:18:38,518 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0126 | Val rms_score: 0.6599
2025-09-26 11:19:05,247 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0128 | Val rms_score: 0.6591
2025-09-26 11:19:31,406 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0123 | Val rms_score: 0.6625
2025-09-26 11:19:57,441 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0116 | Val rms_score: 0.6690
2025-09-26 11:20:23,830 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0118 | Val rms_score: 0.6654
2025-09-26 11:20:50,003 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0126 | Val rms_score: 0.6604
2025-09-26 11:21:17,987 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0122 | Val rms_score: 0.6634
2025-09-26 11:21:43,126 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0117 | Val rms_score: 0.6582
2025-09-26 11:21:43,279 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 8190
2025-09-26 11:21:43,902 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 78 with val rms_score: 0.6582
2025-09-26 11:22:10,383 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0118 | Val rms_score: 0.6600
2025-09-26 11:22:36,608 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0116 | Val rms_score: 0.6610
2025-09-26 11:23:02,993 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0126 | Val rms_score: 0.6591
2025-09-26 11:23:29,958 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0099 | Val rms_score: 0.6627
2025-09-26 11:23:56,306 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0119 | Val rms_score: 0.6632
2025-09-26 11:24:22,616 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0128 | Val rms_score: 0.6641
2025-09-26 11:24:48,848 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0112 | Val rms_score: 0.6673
2025-09-26 11:25:16,439 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0118 | Val rms_score: 0.6609
2025-09-26 11:25:42,575 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0116 | Val rms_score: 0.6606
2025-09-26 11:26:08,896 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0123 | Val rms_score: 0.6610
2025-09-26 11:26:35,331 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0111 | Val rms_score: 0.6600
2025-09-26 11:27:01,598 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0111 | Val rms_score: 0.6616
2025-09-26 11:27:27,873 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0112 | Val rms_score: 0.6592
2025-09-26 11:27:54,878 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0115 | Val rms_score: 0.6657
2025-09-26 11:28:21,205 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0106 | Val rms_score: 0.6615
2025-09-26 11:28:47,546 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0112 | Val rms_score: 0.6624
2025-09-26 11:29:13,796 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0106 | Val rms_score: 0.6582
2025-09-26 11:29:13,961 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 9975
2025-09-26 11:29:14,519 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 95 with val rms_score: 0.6582
2025-09-26 11:29:41,961 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0115 | Val rms_score: 0.6565
2025-09-26 11:29:42,517 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 10080
2025-09-26 11:29:43,114 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 96 with val rms_score: 0.6565
2025-09-26 11:30:07,796 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0107 | Val rms_score: 0.6613
2025-09-26 11:30:34,348 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0110 | Val rms_score: 0.6561
2025-09-26 11:30:34,514 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 10290
2025-09-26 11:30:35,116 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 98 with val rms_score: 0.6561
2025-09-26 11:31:02,019 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0113 | Val rms_score: 0.6598
2025-09-26 11:31:28,820 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0109 | Val rms_score: 0.6524
2025-09-26 11:31:28,986 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 10500
2025-09-26 11:31:29,527 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 100 with val rms_score: 0.6524
2025-09-26 11:31:31,129 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Test rms_score: 0.7436
2025-09-26 11:31:31,578 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset astrazeneca_logd74 at 2025-09-26_11-31-31
2025-09-26 11:31:55,496 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.2219 | Val rms_score: 0.7374
2025-09-26 11:31:55,497 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 105
2025-09-26 11:31:56,103 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.7374
2025-09-26 11:32:23,724 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.1789 | Val rms_score: 0.7275
2025-09-26 11:32:23,902 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 210
2025-09-26 11:32:24,539 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.7275
2025-09-26 11:32:50,560 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.1203 | Val rms_score: 0.6953
2025-09-26 11:32:50,752 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 315
2025-09-26 11:32:51,322 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.6953
2025-09-26 11:33:17,395 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.1117 | Val rms_score: 0.7053
2025-09-26 11:33:43,616 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.0931 | Val rms_score: 0.7051
2025-09-26 11:34:09,598 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.0776 | Val rms_score: 0.6957
2025-09-26 11:34:36,000 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.0674 | Val rms_score: 0.6996
2025-09-26 11:35:02,218 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.0613 | Val rms_score: 0.6896
2025-09-26 11:35:02,375 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 840
2025-09-26 11:35:02,969 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val rms_score: 0.6896
|
| 284 |
+
2025-09-26 11:35:28,214 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.0542 | Val rms_score: 0.6812
|
| 285 |
+
2025-09-26 11:35:28,404 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 945
|
| 286 |
+
2025-09-26 11:35:29,018 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 9 with val rms_score: 0.6812
|
| 287 |
+
2025-09-26 11:35:56,070 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0469 | Val rms_score: 0.6856
|
| 288 |
+
2025-09-26 11:36:20,596 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0509 | Val rms_score: 0.6738
|
| 289 |
+
2025-09-26 11:36:21,161 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 1155
|
| 290 |
+
2025-09-26 11:36:21,760 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 11 with val rms_score: 0.6738
|
| 291 |
+
2025-09-26 11:36:46,055 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0453 | Val rms_score: 0.6764
|
| 292 |
+
2025-09-26 11:37:11,717 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0406 | Val rms_score: 0.6777
|
| 293 |
+
2025-09-26 11:37:36,579 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0362 | Val rms_score: 0.6874
|
| 294 |
+
2025-09-26 11:38:02,569 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0342 | Val rms_score: 0.6708
|
| 295 |
+
2025-09-26 11:38:02,733 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 1575
|
| 296 |
+
2025-09-26 11:38:03,298 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 15 with val rms_score: 0.6708
|
| 297 |
+
2025-09-26 11:38:28,751 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0330 | Val rms_score: 0.6718
|
| 298 |
+
2025-09-26 11:38:54,602 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0305 | Val rms_score: 0.6744
|
| 299 |
+
2025-09-26 11:39:20,573 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0316 | Val rms_score: 0.6712
|
| 300 |
+
2025-09-26 11:39:46,399 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0294 | Val rms_score: 0.6779
|
| 301 |
+
2025-09-26 11:40:11,810 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0277 | Val rms_score: 0.6685
|
| 302 |
+
2025-09-26 11:40:11,965 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 2100
|
| 303 |
+
2025-09-26 11:40:12,537 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 20 with val rms_score: 0.6685
|
| 304 |
+
2025-09-26 11:40:38,423 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0225 | Val rms_score: 0.6792
|
| 305 |
+
2025-09-26 11:41:04,210 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0236 | Val rms_score: 0.6754
|
| 306 |
+
2025-09-26 11:41:29,945 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0237 | Val rms_score: 0.6787
|
| 307 |
+
2025-09-26 11:41:55,718 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0241 | Val rms_score: 0.6781
|
| 308 |
+
2025-09-26 11:42:21,544 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0247 | Val rms_score: 0.6762
|
| 309 |
+
2025-09-26 11:42:47,198 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0240 | Val rms_score: 0.6667
|
| 310 |
+
2025-09-26 11:42:47,717 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 2730
|
| 311 |
+
2025-09-26 11:42:48,277 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 26 with val rms_score: 0.6667
|
| 312 |
+
2025-09-26 11:43:12,533 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0227 | Val rms_score: 0.6686
|
| 313 |
+
2025-09-26 11:43:38,665 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0218 | Val rms_score: 0.6733
|
| 314 |
+
2025-09-26 11:44:05,405 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0220 | Val rms_score: 0.6703
|
| 315 |
+
2025-09-26 11:44:29,800 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0217 | Val rms_score: 0.6724
|
| 316 |
+
2025-09-26 11:44:55,671 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0206 | Val rms_score: 0.6791
|
| 317 |
+
2025-09-26 11:45:21,504 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0193 | Val rms_score: 0.6653
|
| 318 |
+
2025-09-26 11:45:21,663 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 3360
|
| 319 |
+
2025-09-26 11:45:22,250 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 32 with val rms_score: 0.6653
|
| 320 |
+
2025-09-26 11:45:48,065 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0196 | Val rms_score: 0.6727
|
| 321 |
+
2025-09-26 11:46:13,962 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0194 | Val rms_score: 0.6666
|
| 322 |
+
2025-09-26 11:46:38,390 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0195 | Val rms_score: 0.6701
|
| 323 |
+
2025-09-26 11:47:04,083 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0179 | Val rms_score: 0.6724
|
| 324 |
+
2025-09-26 11:47:29,559 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0183 | Val rms_score: 0.6765
|
| 325 |
+
2025-09-26 11:47:55,446 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0178 | Val rms_score: 0.6650
|
| 326 |
+
2025-09-26 11:47:55,605 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 3990
|
| 327 |
+
2025-09-26 11:47:56,216 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 38 with val rms_score: 0.6650
|
| 328 |
+
2025-09-26 11:48:20,607 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0178 | Val rms_score: 0.6701
|
| 329 |
+
2025-09-26 11:48:46,490 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0166 | Val rms_score: 0.6668
|
| 330 |
+
2025-09-26 11:49:12,434 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0209 | Val rms_score: 0.6674
|
| 331 |
+
2025-09-26 11:49:37,808 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0162 | Val rms_score: 0.6661
|
| 332 |
+
2025-09-26 11:50:03,809 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0145 | Val rms_score: 0.6691
|
| 333 |
+
2025-09-26 11:50:29,685 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0164 | Val rms_score: 0.6632
|
| 334 |
+
2025-09-26 11:50:29,872 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 4620
|
| 335 |
+
2025-09-26 11:50:30,586 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 44 with val rms_score: 0.6632
|
| 336 |
+
2025-09-26 11:50:55,258 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0161 | Val rms_score: 0.6644
|
| 337 |
+
2025-09-26 11:51:21,065 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0135 | Val rms_score: 0.6708
|
| 338 |
+
2025-09-26 11:51:46,868 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0146 | Val rms_score: 0.6723
|
| 339 |
+
2025-09-26 11:52:14,819 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0160 | Val rms_score: 0.6673
|
| 340 |
+
2025-09-26 11:52:39,244 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0167 | Val rms_score: 0.6656
|
| 341 |
+
2025-09-26 11:53:04,890 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0158 | Val rms_score: 0.6697
|
| 342 |
+
2025-09-26 11:53:30,686 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0155 | Val rms_score: 0.6643
|
| 343 |
+
2025-09-26 11:53:56,465 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0138 | Val rms_score: 0.6735
|
| 344 |
+
2025-09-26 11:54:22,244 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0145 | Val rms_score: 0.6643
|
| 345 |
+
2025-09-26 11:54:47,997 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0146 | Val rms_score: 0.6703
|
| 346 |
+
2025-09-26 11:55:13,708 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0136 | Val rms_score: 0.6648
|
| 347 |
+
2025-09-26 11:55:39,516 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0146 | Val rms_score: 0.6615
|
| 348 |
+
2025-09-26 11:55:40,190 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 5880
|
| 349 |
+
2025-09-26 11:55:40,765 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 56 with val rms_score: 0.6615
|
| 350 |
+
2025-09-26 11:56:05,033 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0146 | Val rms_score: 0.6671
|
| 351 |
+
2025-09-26 11:56:30,351 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0138 | Val rms_score: 0.6603
|
| 352 |
+
2025-09-26 11:56:30,581 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 6090
|
| 353 |
+
2025-09-26 11:56:31,454 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 58 with val rms_score: 0.6603
|
| 354 |
+
2025-09-26 11:56:56,381 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0136 | Val rms_score: 0.6673
|
| 355 |
+
2025-09-26 11:57:22,378 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0136 | Val rms_score: 0.6602
|
| 356 |
+
2025-09-26 11:57:22,534 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 6300
|
| 357 |
+
2025-09-26 11:57:23,118 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 60 with val rms_score: 0.6602
|
| 358 |
+
2025-09-26 11:57:48,042 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0150 | Val rms_score: 0.6639
|
| 359 |
+
2025-09-26 11:58:13,819 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0142 | Val rms_score: 0.6614
|
| 360 |
+
2025-09-26 11:58:39,752 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0141 | Val rms_score: 0.6680
|
| 361 |
+
2025-09-26 11:59:05,716 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0137 | Val rms_score: 0.6647
|
| 362 |
+
2025-09-26 11:59:31,664 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0144 | Val rms_score: 0.6601
|
| 363 |
+
2025-09-26 11:59:31,821 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 6825
|
| 364 |
+
2025-09-26 11:59:32,384 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 65 with val rms_score: 0.6601
|
| 365 |
+
2025-09-26 11:59:58,550 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0120 | Val rms_score: 0.6602
|
| 366 |
+
2025-09-26 12:00:25,770 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0130 | Val rms_score: 0.6634
|
| 367 |
+
2025-09-26 12:00:50,273 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0129 | Val rms_score: 0.6649
|
| 368 |
+
2025-09-26 12:01:15,970 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0128 | Val rms_score: 0.6655
|
| 369 |
+
2025-09-26 12:01:41,884 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0133 | Val rms_score: 0.6635
|
| 370 |
+
2025-09-26 12:02:07,687 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0128 | Val rms_score: 0.6667
|
| 371 |
+
2025-09-26 12:02:33,308 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0132 | Val rms_score: 0.6620
|
| 372 |
+
2025-09-26 12:02:58,905 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0129 | Val rms_score: 0.6646
|
| 373 |
+
2025-09-26 12:03:24,785 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0126 | Val rms_score: 0.6590
|
| 374 |
+
2025-09-26 12:03:24,946 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 7770
|
| 375 |
+
2025-09-26 12:03:25,556 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 74 with val rms_score: 0.6590
|
| 376 |
+
2025-09-26 12:03:51,704 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0123 | Val rms_score: 0.6573
|
| 377 |
+
2025-09-26 12:03:51,906 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 7875
|
| 378 |
+
2025-09-26 12:03:52,488 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 75 with val rms_score: 0.6573
|
| 379 |
+
2025-09-26 12:04:18,407 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0126 | Val rms_score: 0.6610
|
| 380 |
+
2025-09-26 12:04:44,418 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0129 | Val rms_score: 0.6629
|
| 381 |
+
2025-09-26 12:05:09,772 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0120 | Val rms_score: 0.6622
|
| 382 |
+
2025-09-26 12:05:35,374 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0126 | Val rms_score: 0.6609
|
| 383 |
+
2025-09-26 12:06:01,266 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0119 | Val rms_score: 0.6592
|
| 384 |
+
2025-09-26 12:06:26,913 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0123 | Val rms_score: 0.6622
|
| 385 |
+
2025-09-26 12:06:50,989 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0118 | Val rms_score: 0.6597
|
| 386 |
+
2025-09-26 12:07:16,633 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0139 | Val rms_score: 0.6630
|
| 387 |
+
2025-09-26 12:07:42,127 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0114 | Val rms_score: 0.6594
|
| 388 |
+
2025-09-26 12:08:08,117 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0126 | Val rms_score: 0.6630
|
| 389 |
+
2025-09-26 12:08:35,315 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0128 | Val rms_score: 0.6598
|
| 390 |
+
2025-09-26 12:08:59,814 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0119 | Val rms_score: 0.6664
|
| 391 |
+
2025-09-26 12:09:25,509 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0119 | Val rms_score: 0.6681
|
| 392 |
+
2025-09-26 12:09:51,443 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0110 | Val rms_score: 0.6636
|
| 393 |
+
2025-09-26 12:10:17,457 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0114 | Val rms_score: 0.6629
|
| 394 |
+
2025-09-26 12:10:43,237 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0109 | Val rms_score: 0.6608
|
| 395 |
+
2025-09-26 12:11:08,885 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0118 | Val rms_score: 0.6600
|
| 396 |
+
2025-09-26 12:11:34,619 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0116 | Val rms_score: 0.6591
|
| 397 |
+
2025-09-26 12:11:59,321 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0114 | Val rms_score: 0.6613
|
| 398 |
+
2025-09-26 12:12:24,918 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0111 | Val rms_score: 0.6584
|
| 399 |
+
2025-09-26 12:12:50,483 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0110 | Val rms_score: 0.6576
|
| 400 |
+
2025-09-26 12:13:16,061 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0112 | Val rms_score: 0.6566
|
| 401 |
+
2025-09-26 12:13:16,240 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Global step of best model: 10185
|
| 402 |
+
2025-09-26 12:13:16,833 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Best model saved at epoch 97 with val rms_score: 0.6566
|
| 403 |
+
2025-09-26 12:13:42,773 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0116 | Val rms_score: 0.6584
|
| 404 |
+
2025-09-26 12:14:08,339 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0113 | Val rms_score: 0.6616
|
| 405 |
+
2025-09-26 12:14:33,920 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0109 | Val rms_score: 0.6644
|
| 406 |
+
2025-09-26 12:14:35,545 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Test rms_score: 0.7525
|
| 407 |
+
2025-09-26 12:14:36,104 - logs_modchembert_astrazeneca_logd74_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.7467, Std Dev: 0.0041
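The final log line aggregates the test rms_score over the three replicate runs into an average and standard deviation. A minimal sketch of that aggregation, assuming a population standard deviation convention and using hypothetical per-run scores (only the third run's 0.7525 appears in this excerpt; the other two values are illustrative, chosen to be consistent with the logged aggregate):

```python
import statistics

# Hypothetical test rms_scores from three replicate runs; only 0.7525 is
# confirmed by this log excerpt, the other two are illustrative values.
scores = [0.7440, 0.7436, 0.7525]

avg = statistics.mean(scores)
std = statistics.pstdev(scores)  # population std dev; sample std (stdev) is another convention

print(f"Final Triplicate Test Results - Avg rms_score: {avg:.4f}, Std Dev: {std:.4f}")
```

Whether the benchmark script uses the population or sample standard deviation is not stated in the logs; the choice changes the reported Std Dev for only three runs.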
|
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_ppb_epochs100_batch_size32_20250926_121436.log
ADDED
@@ -0,0 +1,327 @@
| 1 |
+
2025-09-26 12:14:36,106 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Running benchmark for dataset: astrazeneca_ppb
|
| 2 |
+
2025-09-26 12:14:36,106 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - dataset: astrazeneca_ppb, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
|
| 3 |
+
2025-09-26 12:14:36,113 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset astrazeneca_ppb at 2025-09-26_12-14-36
|
| 4 |
+
2025-09-26 12:14:46,721 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6861 | Val rms_score: 0.1084
|
| 5 |
+
2025-09-26 12:14:46,721 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 45
|
| 6 |
+
2025-09-26 12:14:47,311 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.1084
|
| 7 |
+
2025-09-26 12:15:00,333 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.3972 | Val rms_score: 0.1073
|
| 8 |
+
2025-09-26 12:15:00,534 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 90
|
| 9 |
+
2025-09-26 12:15:01,128 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.1073
|
| 10 |
+
2025-09-26 12:15:12,747 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2375 | Val rms_score: 0.1065
|
| 11 |
+
2025-09-26 12:15:12,911 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 135
|
| 12 |
+
2025-09-26 12:15:13,499 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.1065
|
| 13 |
+
2025-09-26 12:15:24,132 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2125 | Val rms_score: 0.1086
|
| 14 |
+
2025-09-26 12:15:35,881 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1787 | Val rms_score: 0.1185
|
| 15 |
+
2025-09-26 12:15:47,547 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1750 | Val rms_score: 0.1050
|
| 16 |
+
2025-09-26 12:15:48,133 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 270
|
| 17 |
+
2025-09-26 12:15:48,757 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 6 with val rms_score: 0.1050
|
| 18 |
+
2025-09-26 12:15:59,992 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1583 | Val rms_score: 0.1067
|
| 19 |
+
2025-09-26 12:16:11,405 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1250 | Val rms_score: 0.1096
|
| 20 |
+
2025-09-26 12:16:22,649 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1500 | Val rms_score: 0.1073
|
| 21 |
+
2025-09-26 12:16:34,013 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0958 | Val rms_score: 0.1071
|
| 22 |
+
2025-09-26 12:16:45,705 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0882 | Val rms_score: 0.1103
|
| 23 |
+
2025-09-26 12:16:57,058 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0941 | Val rms_score: 0.1091
|
| 24 |
+
2025-09-26 12:17:08,530 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0889 | Val rms_score: 0.1090
|
| 25 |
+
2025-09-26 12:17:20,052 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0755 | Val rms_score: 0.1093
|
| 26 |
+
2025-09-26 12:17:31,662 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0708 | Val rms_score: 0.1071
|
| 27 |
+
2025-09-26 12:17:43,209 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0719 | Val rms_score: 0.1091
|
| 28 |
+
2025-09-26 12:17:54,596 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0590 | Val rms_score: 0.1084
|
| 29 |
+
2025-09-26 12:18:06,231 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0660 | Val rms_score: 0.1092
|
| 30 |
+
2025-09-26 12:18:17,493 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0507 | Val rms_score: 0.1092
|
| 31 |
+
2025-09-26 12:18:28,907 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0476 | Val rms_score: 0.1097
|
| 32 |
+
2025-09-26 12:18:40,401 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0472 | Val rms_score: 0.1095
|
| 33 |
+
2025-09-26 12:18:51,892 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0472 | Val rms_score: 0.1096
|
| 34 |
+
2025-09-26 12:19:04,384 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0455 | Val rms_score: 0.1097
|
| 35 |
+
2025-09-26 12:19:16,069 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0458 | Val rms_score: 0.1097
|
| 36 |
+
2025-09-26 12:19:27,986 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0422 | Val rms_score: 0.1100
|
| 37 |
+
2025-09-26 12:19:39,412 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0420 | Val rms_score: 0.1119
|
| 38 |
+
2025-09-26 12:19:50,914 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0370 | Val rms_score: 0.1092
|
| 39 |
+
2025-09-26 12:20:02,596 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0352 | Val rms_score: 0.1088
|
| 40 |
+
2025-09-26 12:20:14,146 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0320 | Val rms_score: 0.1099
|
| 41 |
+
2025-09-26 12:20:25,820 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0342 | Val rms_score: 0.1105
|
| 42 |
+
2025-09-26 12:20:37,397 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0385 | Val rms_score: 0.1108
|
| 43 |
+
2025-09-26 12:20:48,896 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0354 | Val rms_score: 0.1098
|
| 44 |
+
2025-09-26 12:21:00,415 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0306 | Val rms_score: 0.1083
|
| 45 |
+
2025-09-26 12:21:12,124 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0294 | Val rms_score: 0.1107
|
| 46 |
+
2025-09-26 12:21:23,728 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0280 | Val rms_score: 0.1103
|
| 47 |
+
2025-09-26 12:21:35,195 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0320 | Val rms_score: 0.1095
|
| 48 |
+
2025-09-26 12:21:46,696 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0260 | Val rms_score: 0.1101
|
| 49 |
+
2025-09-26 12:21:58,091 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0260 | Val rms_score: 0.1097
|
| 50 |
+
2025-09-26 12:22:09,789 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0255 | Val rms_score: 0.1091
|
| 51 |
+
2025-09-26 12:22:21,389 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0259 | Val rms_score: 0.1102
|
| 52 |
+
2025-09-26 12:22:32,895 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0250 | Val rms_score: 0.1097
|
| 53 |
+
2025-09-26 12:22:44,332 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0234 | Val rms_score: 0.1102
|
| 54 |
+
2025-09-26 12:22:55,884 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0242 | Val rms_score: 0.1101
|
| 55 |
+
2025-09-26 12:23:07,553 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0236 | Val rms_score: 0.1091
|
| 56 |
+
2025-09-26 12:23:20,692 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0199 | Val rms_score: 0.1089
|
| 57 |
+
2025-09-26 12:23:32,141 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0238 | Val rms_score: 0.1103
|
| 58 |
+
2025-09-26 12:23:43,776 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0258 | Val rms_score: 0.1107
|
| 59 |
+
2025-09-26 12:23:55,379 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0204 | Val rms_score: 0.1112
|
| 60 |
+
2025-09-26 12:24:06,954 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0262 | Val rms_score: 0.1111
|
| 61 |
+
2025-09-26 12:24:18,630 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0212 | Val rms_score: 0.1101
|
| 62 |
+
2025-09-26 12:24:30,126 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0220 | Val rms_score: 0.1109
|
| 63 |
+
2025-09-26 12:24:41,727 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0222 | Val rms_score: 0.1103
2025-09-26 12:24:53,427 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0229 | Val rms_score: 0.1094
2025-09-26 12:25:05,113 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0224 | Val rms_score: 0.1107
2025-09-26 12:25:16,494 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0213 | Val rms_score: 0.1125
2025-09-26 12:25:28,239 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0201 | Val rms_score: 0.1093
2025-09-26 12:25:39,290 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0181 | Val rms_score: 0.1106
2025-09-26 12:25:50,994 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0155 | Val rms_score: 0.1093
2025-09-26 12:26:02,442 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0174 | Val rms_score: 0.1108
2025-09-26 12:26:13,983 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0206 | Val rms_score: 0.1090
2025-09-26 12:26:25,416 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0192 | Val rms_score: 0.1093
2025-09-26 12:26:36,993 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0181 | Val rms_score: 0.1109
2025-09-26 12:26:48,633 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0205 | Val rms_score: 0.1104
2025-09-26 12:27:00,338 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0196 | Val rms_score: 0.1109
2025-09-26 12:27:12,006 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0158 | Val rms_score: 0.1118
2025-09-26 12:27:23,625 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0181 | Val rms_score: 0.1113
2025-09-26 12:27:37,233 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0182 | Val rms_score: 0.1105
2025-09-26 12:27:47,665 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0154 | Val rms_score: 0.1097
2025-09-26 12:27:59,168 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0155 | Val rms_score: 0.1110
2025-09-26 12:28:10,840 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0167 | Val rms_score: 0.1103
2025-09-26 12:28:22,342 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0171 | Val rms_score: 0.1092
2025-09-26 12:28:33,827 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0169 | Val rms_score: 0.1107
2025-09-26 12:28:45,294 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0155 | Val rms_score: 0.1095
2025-09-26 12:28:56,962 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0165 | Val rms_score: 0.1103
2025-09-26 12:29:08,414 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0156 | Val rms_score: 0.1095
2025-09-26 12:29:19,810 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0160 | Val rms_score: 0.1103
2025-09-26 12:29:31,370 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0156 | Val rms_score: 0.1121
2025-09-26 12:29:42,886 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0221 | Val rms_score: 0.1103
2025-09-26 12:29:54,296 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0160 | Val rms_score: 0.1105
2025-09-26 12:30:05,617 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0148 | Val rms_score: 0.1109
2025-09-26 12:30:17,105 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0154 | Val rms_score: 0.1103
2025-09-26 12:30:28,541 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0140 | Val rms_score: 0.1118
2025-09-26 12:30:40,002 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0155 | Val rms_score: 0.1094
2025-09-26 12:30:51,509 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0157 | Val rms_score: 0.1088
2025-09-26 12:31:03,116 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0123 | Val rms_score: 0.1103
2025-09-26 12:31:14,944 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0137 | Val rms_score: 0.1101
2025-09-26 12:31:26,252 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0136 | Val rms_score: 0.1100
2025-09-26 12:31:37,145 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0148 | Val rms_score: 0.1095
2025-09-26 12:31:51,201 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0180 | Val rms_score: 0.1094
2025-09-26 12:32:02,021 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0121 | Val rms_score: 0.1096
2025-09-26 12:32:14,729 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0134 | Val rms_score: 0.1108
2025-09-26 12:32:27,437 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0125 | Val rms_score: 0.1105
2025-09-26 12:32:40,497 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0120 | Val rms_score: 0.1108
2025-09-26 12:32:53,234 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0130 | Val rms_score: 0.1109
2025-09-26 12:33:05,945 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0133 | Val rms_score: 0.1106
2025-09-26 12:33:18,891 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0125 | Val rms_score: 0.1108
2025-09-26 12:33:31,392 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0123 | Val rms_score: 0.1106
2025-09-26 12:33:43,724 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0119 | Val rms_score: 0.1097
2025-09-26 12:33:56,042 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0138 | Val rms_score: 0.1100
2025-09-26 12:34:08,586 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0142 | Val rms_score: 0.1102
2025-09-26 12:34:09,610 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Test rms_score: 0.1195
2025-09-26 12:34:10,008 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset astrazeneca_ppb at 2025-09-26_12-34-10
2025-09-26 12:34:20,310 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6611 | Val rms_score: 0.1087
2025-09-26 12:34:20,310 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 45
2025-09-26 12:34:20,882 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.1087
2025-09-26 12:34:32,974 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4056 | Val rms_score: 0.1030
2025-09-26 12:34:33,153 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 90
2025-09-26 12:34:33,788 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.1030
2025-09-26 12:34:44,655 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.2696 | Val rms_score: 0.1047
2025-09-26 12:34:56,384 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2083 | Val rms_score: 0.1071
2025-09-26 12:35:09,272 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1300 | Val rms_score: 0.1055
2025-09-26 12:35:22,334 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1437 | Val rms_score: 0.1137
2025-09-26 12:35:35,384 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1328 | Val rms_score: 0.1075
2025-09-26 12:35:48,257 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1285 | Val rms_score: 0.1049
2025-09-26 12:36:01,087 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1047 | Val rms_score: 0.1106
2025-09-26 12:36:14,043 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.0924 | Val rms_score: 0.1091
2025-09-26 12:36:26,963 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0878 | Val rms_score: 0.1074
2025-09-26 12:36:39,953 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0945 | Val rms_score: 0.1125
2025-09-26 12:36:52,547 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0833 | Val rms_score: 0.1101
2025-09-26 12:37:05,397 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0859 | Val rms_score: 0.1122
2025-09-26 12:37:18,384 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0715 | Val rms_score: 0.1113
2025-09-26 12:37:31,059 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0770 | Val rms_score: 0.1083
2025-09-26 12:37:43,659 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0542 | Val rms_score: 0.1088
2025-09-26 12:37:56,872 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0559 | Val rms_score: 0.1089
2025-09-26 12:38:09,779 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0517 | Val rms_score: 0.1090
2025-09-26 12:38:22,581 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0503 | Val rms_score: 0.1098
2025-09-26 12:38:35,547 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0455 | Val rms_score: 0.1090
2025-09-26 12:38:48,570 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0382 | Val rms_score: 0.1093
2025-09-26 12:39:01,558 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0386 | Val rms_score: 0.1084
2025-09-26 12:39:13,477 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0375 | Val rms_score: 0.1088
2025-09-26 12:39:26,445 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0356 | Val rms_score: 0.1090
2025-09-26 12:39:39,423 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0385 | Val rms_score: 0.1091
2025-09-26 12:39:52,019 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0365 | Val rms_score: 0.1087
2025-09-26 12:40:05,030 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0413 | Val rms_score: 0.1098
2025-09-26 12:40:18,192 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0359 | Val rms_score: 0.1092
2025-09-26 12:40:31,187 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0408 | Val rms_score: 0.1091
2025-09-26 12:40:44,169 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0339 | Val rms_score: 0.1081
2025-09-26 12:40:57,884 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0326 | Val rms_score: 0.1078
2025-09-26 12:41:11,255 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0311 | Val rms_score: 0.1081
2025-09-26 12:41:24,497 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0299 | Val rms_score: 0.1085
2025-09-26 12:41:37,434 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0288 | Val rms_score: 0.1080
2025-09-26 12:41:50,629 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0322 | Val rms_score: 0.1098
2025-09-26 12:42:03,961 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0241 | Val rms_score: 0.1081
2025-09-26 12:42:17,432 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0230 | Val rms_score: 0.1088
2025-09-26 12:42:30,715 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0266 | Val rms_score: 0.1092
2025-09-26 12:42:44,034 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0255 | Val rms_score: 0.1090
2025-09-26 12:42:57,016 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0243 | Val rms_score: 0.1090
2025-09-26 12:43:09,760 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0245 | Val rms_score: 0.1097
2025-09-26 12:43:22,796 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0232 | Val rms_score: 0.1099
2025-09-26 12:43:35,857 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0241 | Val rms_score: 0.1103
2025-09-26 12:43:48,732 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0275 | Val rms_score: 0.1089
2025-09-26 12:43:59,814 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0253 | Val rms_score: 0.1095
2025-09-26 12:44:12,556 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0297 | Val rms_score: 0.1088
2025-09-26 12:44:25,691 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0259 | Val rms_score: 0.1104
2025-09-26 12:44:39,122 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0215 | Val rms_score: 0.1100
2025-09-26 12:44:52,117 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0233 | Val rms_score: 0.1088
2025-09-26 12:45:05,105 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0207 | Val rms_score: 0.1099
2025-09-26 12:45:17,627 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0183 | Val rms_score: 0.1080
2025-09-26 12:45:30,348 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0189 | Val rms_score: 0.1098
2025-09-26 12:45:43,080 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0219 | Val rms_score: 0.1121
2025-09-26 12:45:55,911 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0208 | Val rms_score: 0.1091
2025-09-26 12:46:08,699 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0224 | Val rms_score: 0.1079
2025-09-26 12:46:21,397 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0221 | Val rms_score: 0.1106
2025-09-26 12:46:34,384 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0241 | Val rms_score: 0.1096
2025-09-26 12:46:47,657 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0207 | Val rms_score: 0.1092
2025-09-26 12:47:00,618 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0194 | Val rms_score: 0.1088
2025-09-26 12:47:13,689 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0216 | Val rms_score: 0.1112
2025-09-26 12:47:26,539 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0224 | Val rms_score: 0.1094
2025-09-26 12:47:39,784 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0206 | Val rms_score: 0.1094
2025-09-26 12:47:52,782 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0179 | Val rms_score: 0.1088
2025-09-26 12:48:05,657 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0163 | Val rms_score: 0.1094
2025-09-26 12:48:18,336 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0167 | Val rms_score: 0.1099
2025-09-26 12:48:31,452 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0217 | Val rms_score: 0.1088
2025-09-26 12:48:42,775 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0186 | Val rms_score: 0.1078
2025-09-26 12:48:55,487 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0354 | Val rms_score: 0.1079
2025-09-26 12:49:08,139 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0183 | Val rms_score: 0.1091
2025-09-26 12:49:20,816 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0159 | Val rms_score: 0.1093
2025-09-26 12:49:33,500 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0154 | Val rms_score: 0.1094
2025-09-26 12:49:46,671 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0154 | Val rms_score: 0.1100
2025-09-26 12:49:59,551 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0171 | Val rms_score: 0.1107
2025-09-26 12:50:12,201 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0179 | Val rms_score: 0.1104
2025-09-26 12:50:24,740 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0141 | Val rms_score: 0.1092
2025-09-26 12:50:37,068 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0155 | Val rms_score: 0.1098
2025-09-26 12:50:49,975 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0155 | Val rms_score: 0.1094
2025-09-26 12:51:02,799 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0155 | Val rms_score: 0.1090
2025-09-26 12:51:15,305 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0146 | Val rms_score: 0.1096
2025-09-26 12:51:27,517 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0132 | Val rms_score: 0.1094
2025-09-26 12:51:40,012 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0146 | Val rms_score: 0.1095
2025-09-26 12:51:52,213 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0128 | Val rms_score: 0.1083
2025-09-26 12:52:04,572 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0136 | Val rms_score: 0.1099
2025-09-26 12:52:16,927 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0148 | Val rms_score: 0.1103
2025-09-26 12:52:29,086 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0174 | Val rms_score: 0.1108
2025-09-26 12:52:40,975 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0149 | Val rms_score: 0.1084
2025-09-26 12:52:53,046 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0131 | Val rms_score: 0.1095
2025-09-26 12:53:06,708 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0154 | Val rms_score: 0.1096
2025-09-26 12:53:17,394 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0129 | Val rms_score: 0.1098
2025-09-26 12:53:29,883 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0141 | Val rms_score: 0.1086
2025-09-26 12:53:42,496 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0134 | Val rms_score: 0.1092
2025-09-26 12:53:54,673 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0141 | Val rms_score: 0.1086
2025-09-26 12:54:06,789 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0137 | Val rms_score: 0.1087
2025-09-26 12:54:18,963 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0127 | Val rms_score: 0.1098
2025-09-26 12:54:31,008 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0133 | Val rms_score: 0.1093
2025-09-26 12:54:42,814 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0119 | Val rms_score: 0.1086
2025-09-26 12:54:54,519 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0168 | Val rms_score: 0.1088
2025-09-26 12:55:06,407 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0141 | Val rms_score: 0.1088
2025-09-26 12:55:18,241 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0119 | Val rms_score: 0.1087
2025-09-26 12:55:19,237 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Test rms_score: 0.1163
2025-09-26 12:55:19,670 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset astrazeneca_ppb at 2025-09-26_12-55-19
2025-09-26 12:55:29,952 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7028 | Val rms_score: 0.1287
2025-09-26 12:55:29,952 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 45
2025-09-26 12:55:30,514 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.1287
2025-09-26 12:55:42,309 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4056 | Val rms_score: 0.1099
2025-09-26 12:55:42,496 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 90
2025-09-26 12:55:43,086 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.1099
2025-09-26 12:55:53,812 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3000 | Val rms_score: 0.1025
2025-09-26 12:55:54,003 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Global step of best model: 135
2025-09-26 12:55:54,572 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.1025
2025-09-26 12:56:05,444 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2250 | Val rms_score: 0.1048
2025-09-26 12:56:17,644 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.1700 | Val rms_score: 0.1112
2025-09-26 12:56:29,336 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1444 | Val rms_score: 0.1043
2025-09-26 12:56:42,006 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1688 | Val rms_score: 0.1114
2025-09-26 12:56:54,152 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1257 | Val rms_score: 0.1065
2025-09-26 12:57:06,299 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1187 | Val rms_score: 0.1082
2025-09-26 12:57:18,516 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1042 | Val rms_score: 0.1061
2025-09-26 12:57:30,789 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.0958 | Val rms_score: 0.1092
2025-09-26 12:57:42,834 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.0852 | Val rms_score: 0.1091
2025-09-26 12:57:54,952 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0792 | Val rms_score: 0.1084
2025-09-26 12:58:07,046 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0807 | Val rms_score: 0.1087
2025-09-26 12:58:19,188 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0708 | Val rms_score: 0.1091
2025-09-26 12:58:31,150 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0668 | Val rms_score: 0.1086
2025-09-26 12:58:43,642 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0663 | Val rms_score: 0.1074
2025-09-26 12:58:55,883 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0965 | Val rms_score: 0.1119
2025-09-26 12:59:08,465 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0868 | Val rms_score: 0.1122
2025-09-26 12:59:21,000 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0549 | Val rms_score: 0.1108
2025-09-26 12:59:33,366 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0469 | Val rms_score: 0.1073
2025-09-26 12:59:46,078 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0417 | Val rms_score: 0.1086
2025-09-26 12:59:58,896 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0411 | Val rms_score: 0.1070
2025-09-26 13:00:10,762 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0399 | Val rms_score: 0.1073
2025-09-26 13:00:24,245 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0334 | Val rms_score: 0.1086
2025-09-26 13:00:37,660 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0356 | Val rms_score: 0.1078
2025-09-26 13:00:50,814 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0359 | Val rms_score: 0.1103
2025-09-26 13:01:03,818 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0339 | Val rms_score: 0.1081
2025-09-26 13:01:16,784 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0434 | Val rms_score: 0.1072
2025-09-26 13:01:29,829 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0318 | Val rms_score: 0.1077
2025-09-26 13:01:41,627 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0286 | Val rms_score: 0.1086
2025-09-26 13:01:54,829 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0283 | Val rms_score: 0.1075
2025-09-26 13:02:06,983 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0328 | Val rms_score: 0.1081
2025-09-26 13:02:19,032 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0329 | Val rms_score: 0.1076
2025-09-26 13:02:31,046 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0281 | Val rms_score: 0.1087
2025-09-26 13:02:42,066 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0244 | Val rms_score: 0.1083
2025-09-26 13:02:54,053 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0267 | Val rms_score: 0.1100
2025-09-26 13:03:05,316 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0227 | Val rms_score: 0.1084
2025-09-26 13:03:16,999 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0247 | Val rms_score: 0.1096
|
| 265 |
+
2025-09-26 13:03:28,585 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0273 | Val rms_score: 0.1082
|
| 266 |
+
2025-09-26 13:03:40,138 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0236 | Val rms_score: 0.1079
|
| 267 |
+
2025-09-26 13:03:52,526 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0250 | Val rms_score: 0.1072
|
| 268 |
+
2025-09-26 13:04:04,273 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0237 | Val rms_score: 0.1083
|
| 269 |
+
2025-09-26 13:04:15,726 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0227 | Val rms_score: 0.1077
|
| 270 |
+
2025-09-26 13:04:28,186 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0278 | Val rms_score: 0.1074
|
| 271 |
+
2025-09-26 13:04:39,559 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0250 | Val rms_score: 0.1084
|
| 272 |
+
2025-09-26 13:04:51,933 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0234 | Val rms_score: 0.1086
|
| 273 |
+
2025-09-26 13:05:03,638 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0247 | Val rms_score: 0.1081
|
| 274 |
+
2025-09-26 13:05:15,279 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0162 | Val rms_score: 0.1079
|
| 275 |
+
2025-09-26 13:05:26,726 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0218 | Val rms_score: 0.1085
|
| 276 |
+
2025-09-26 13:05:37,169 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0208 | Val rms_score: 0.1071
|
| 277 |
+
2025-09-26 13:05:48,972 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0225 | Val rms_score: 0.1084
|
| 278 |
+
2025-09-26 13:06:00,142 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0209 | Val rms_score: 0.1094
|
| 279 |
+
2025-09-26 13:06:11,670 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0201 | Val rms_score: 0.1092
|
| 280 |
+
2025-09-26 13:06:23,189 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0234 | Val rms_score: 0.1097
|
| 281 |
+
2025-09-26 13:06:34,721 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0256 | Val rms_score: 0.1072
|
| 282 |
+
2025-09-26 13:06:46,606 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0224 | Val rms_score: 0.1101
|
| 283 |
+
2025-09-26 13:06:57,536 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0208 | Val rms_score: 0.1078
|
| 284 |
+
2025-09-26 13:07:09,241 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0195 | Val rms_score: 0.1086
|
| 285 |
+
2025-09-26 13:07:19,416 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0187 | Val rms_score: 0.1085
|
| 286 |
+
2025-09-26 13:07:31,181 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0184 | Val rms_score: 0.1090
|
| 287 |
+
2025-09-26 13:07:43,377 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0201 | Val rms_score: 0.1082
|
| 288 |
+
2025-09-26 13:07:54,419 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0179 | Val rms_score: 0.1080
|
| 289 |
+
2025-09-26 13:08:05,848 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0187 | Val rms_score: 0.1086
|
| 290 |
+
2025-09-26 13:08:17,530 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0177 | Val rms_score: 0.1090
|
| 291 |
+
2025-09-26 13:08:28,975 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0183 | Val rms_score: 0.1096
|
| 292 |
+
2025-09-26 13:08:42,214 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0163 | Val rms_score: 0.1077
|
| 293 |
+
2025-09-26 13:08:53,015 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0169 | Val rms_score: 0.1073
|
| 294 |
+
2025-09-26 13:09:03,228 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0225 | Val rms_score: 0.1079
|
| 295 |
+
2025-09-26 13:09:14,868 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0178 | Val rms_score: 0.1078
|
| 296 |
+
2025-09-26 13:09:26,371 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0184 | Val rms_score: 0.1081
|
| 297 |
+
2025-09-26 13:09:37,279 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0175 | Val rms_score: 0.1087
|
| 298 |
+
2025-09-26 13:09:49,019 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0162 | Val rms_score: 0.1088
|
| 299 |
+
2025-09-26 13:10:01,116 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0156 | Val rms_score: 0.1086
|
| 300 |
+
2025-09-26 13:10:12,880 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0174 | Val rms_score: 0.1081
|
| 301 |
+
2025-09-26 13:10:24,712 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0209 | Val rms_score: 0.1089
|
| 302 |
+
2025-09-26 13:10:36,886 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0221 | Val rms_score: 0.1084
|
| 303 |
+
2025-09-26 13:10:48,202 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0227 | Val rms_score: 0.1084
|
| 304 |
+
2025-09-26 13:10:59,832 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0175 | Val rms_score: 0.1084
|
| 305 |
+
2025-09-26 13:11:11,625 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0161 | Val rms_score: 0.1090
|
| 306 |
+
2025-09-26 13:11:23,478 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0155 | Val rms_score: 0.1080
|
| 307 |
+
2025-09-26 13:11:35,475 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0165 | Val rms_score: 0.1084
|
| 308 |
+
2025-09-26 13:11:46,594 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0150 | Val rms_score: 0.1086
|
| 309 |
+
2025-09-26 13:11:58,107 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0152 | Val rms_score: 0.1088
|
| 310 |
+
2025-09-26 13:12:10,067 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0137 | Val rms_score: 0.1090
|
| 311 |
+
2025-09-26 13:12:22,008 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0150 | Val rms_score: 0.1092
|
| 312 |
+
2025-09-26 13:12:34,298 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0128 | Val rms_score: 0.1088
|
| 313 |
+
2025-09-26 13:12:45,464 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0161 | Val rms_score: 0.1098
|
| 314 |
+
2025-09-26 13:12:58,779 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0205 | Val rms_score: 0.1086
|
| 315 |
+
2025-09-26 13:13:10,398 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0141 | Val rms_score: 0.1075
|
| 316 |
+
2025-09-26 13:13:20,445 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0135 | Val rms_score: 0.1082
|
| 317 |
+
2025-09-26 13:13:32,473 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0148 | Val rms_score: 0.1084
|
| 318 |
+
2025-09-26 13:13:43,218 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0136 | Val rms_score: 0.1099
|
| 319 |
+
2025-09-26 13:13:54,863 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0131 | Val rms_score: 0.1085
|
| 320 |
+
2025-09-26 13:14:06,437 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0129 | Val rms_score: 0.1092
|
| 321 |
+
2025-09-26 13:14:17,922 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0114 | Val rms_score: 0.1088
|
| 322 |
+
2025-09-26 13:14:29,971 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0135 | Val rms_score: 0.1092
|
| 323 |
+
2025-09-26 13:14:41,124 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0106 | Val rms_score: 0.1078
|
| 324 |
+
2025-09-26 13:14:52,620 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0139 | Val rms_score: 0.1080
|
| 325 |
+
2025-09-26 13:15:04,097 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0135 | Val rms_score: 0.1079
|
| 326 |
+
2025-09-26 13:15:05,169 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Test rms_score: 0.1226
|
| 327 |
+
2025-09-26 13:15:05,575 - logs_modchembert_astrazeneca_ppb_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.1195, Std Dev: 0.0026
logs_modchembert_regression_ModChemBERT-MLM-DAPT-TAFT/modchembert_deepchem_splits_run_astrazeneca_solubility_epochs100_batch_size32_20250926_131505.log
ADDED
@@ -0,0 +1,355 @@
2025-09-26 13:15:05,577 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Running benchmark for dataset: astrazeneca_solubility
2025-09-26 13:15:05,577 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - dataset: astrazeneca_solubility, tasks: ['y'], epochs: 100, learning rate: 3e-05, transform: True
2025-09-26 13:15:05,585 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Starting triplicate run 1 for dataset astrazeneca_solubility at 2025-09-26_13-15-05
2025-09-26 13:15:15,437 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7389 | Val rms_score: 0.8587
2025-09-26 13:15:15,437 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 45
2025-09-26 13:15:15,986 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.8587
2025-09-26 13:15:26,752 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4528 | Val rms_score: 0.8449
2025-09-26 13:15:26,945 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 90
2025-09-26 13:15:27,519 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.8449
2025-09-26 13:15:38,677 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3393 | Val rms_score: 0.8458
2025-09-26 13:15:49,729 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2819 | Val rms_score: 0.8696
2025-09-26 13:15:59,678 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2612 | Val rms_score: 0.7790
2025-09-26 13:15:59,873 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 225
2025-09-26 13:16:00,446 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 5 with val rms_score: 0.7790
2025-09-26 13:16:11,717 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.2264 | Val rms_score: 0.7823
2025-09-26 13:16:22,233 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1833 | Val rms_score: 0.7599
2025-09-26 13:16:22,468 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 315
2025-09-26 13:16:23,081 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 7 with val rms_score: 0.7599
2025-09-26 13:16:32,611 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1486 | Val rms_score: 0.7730
2025-09-26 13:16:43,703 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1320 | Val rms_score: 0.7621
2025-09-26 13:16:55,276 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1278 | Val rms_score: 0.7786
2025-09-26 13:17:06,975 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.1118 | Val rms_score: 0.7555
2025-09-26 13:17:07,548 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 495
2025-09-26 13:17:08,163 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 11 with val rms_score: 0.7555
2025-09-26 13:17:19,763 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.1297 | Val rms_score: 0.7880
2025-09-26 13:17:30,598 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0993 | Val rms_score: 0.8042
2025-09-26 13:17:41,747 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0964 | Val rms_score: 0.8208
2025-09-26 13:17:52,844 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.1201 | Val rms_score: 0.8173
2025-09-26 13:18:03,982 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.1062 | Val rms_score: 0.7460
2025-09-26 13:18:04,556 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 720
2025-09-26 13:18:05,190 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 16 with val rms_score: 0.7460
2025-09-26 13:18:16,503 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0743 | Val rms_score: 0.7628
2025-09-26 13:18:25,902 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0641 | Val rms_score: 0.7632
2025-09-26 13:18:37,076 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0663 | Val rms_score: 0.7819
2025-09-26 13:18:46,983 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0840 | Val rms_score: 0.8082
2025-09-26 13:18:58,082 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.1729 | Val rms_score: 0.7605
2025-09-26 13:19:09,513 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0986 | Val rms_score: 0.8006
2025-09-26 13:19:21,303 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0924 | Val rms_score: 0.7722
2025-09-26 13:19:31,827 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0736 | Val rms_score: 0.7671
2025-09-26 13:19:43,143 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0875 | Val rms_score: 0.7677
2025-09-26 13:19:54,274 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0660 | Val rms_score: 0.7634
2025-09-26 13:20:05,823 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0714 | Val rms_score: 0.7641
2025-09-26 13:20:16,612 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0545 | Val rms_score: 0.7647
2025-09-26 13:20:27,786 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0551 | Val rms_score: 0.7654
2025-09-26 13:20:38,758 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0542 | Val rms_score: 0.7655
2025-09-26 13:20:49,985 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0556 | Val rms_score: 0.7888
2025-09-26 13:21:01,833 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0508 | Val rms_score: 0.7685
2025-09-26 13:21:12,388 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0500 | Val rms_score: 0.7674
2025-09-26 13:21:23,652 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0456 | Val rms_score: 0.7684
2025-09-26 13:21:34,939 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0443 | Val rms_score: 0.7684
2025-09-26 13:21:46,241 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0492 | Val rms_score: 0.7739
2025-09-26 13:21:57,664 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0405 | Val rms_score: 0.7761
2025-09-26 13:22:08,257 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0305 | Val rms_score: 0.7681
2025-09-26 13:22:19,446 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0373 | Val rms_score: 0.7728
2025-09-26 13:22:30,687 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0359 | Val rms_score: 0.7658
2025-09-26 13:22:41,902 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0370 | Val rms_score: 0.7706
2025-09-26 13:22:53,379 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0345 | Val rms_score: 0.7662
2025-09-26 13:23:04,068 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0324 | Val rms_score: 0.7682
2025-09-26 13:23:15,235 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0326 | Val rms_score: 0.7742
2025-09-26 13:23:28,432 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0334 | Val rms_score: 0.7614
2025-09-26 13:23:39,124 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0326 | Val rms_score: 0.7681
2025-09-26 13:23:50,783 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0315 | Val rms_score: 0.7607
2025-09-26 13:24:01,255 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0302 | Val rms_score: 0.7661
2025-09-26 13:24:12,523 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0252 | Val rms_score: 0.7694
2025-09-26 13:24:23,600 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0306 | Val rms_score: 0.7740
2025-09-26 13:24:34,752 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0306 | Val rms_score: 0.7678
2025-09-26 13:24:46,449 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0293 | Val rms_score: 0.7617
2025-09-26 13:24:57,356 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0337 | Val rms_score: 0.7679
2025-09-26 13:25:08,647 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0336 | Val rms_score: 0.7655
2025-09-26 13:25:19,771 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0486 | Val rms_score: 0.7860
2025-09-26 13:25:30,956 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0367 | Val rms_score: 0.7905
2025-09-26 13:25:42,557 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0300 | Val rms_score: 0.7772
2025-09-26 13:25:53,116 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0256 | Val rms_score: 0.7851
2025-09-26 13:26:04,290 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0281 | Val rms_score: 0.7832
2025-09-26 13:26:15,475 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0264 | Val rms_score: 0.7802
2025-09-26 13:26:26,468 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0234 | Val rms_score: 0.7767
2025-09-26 13:26:38,039 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0234 | Val rms_score: 0.7787
2025-09-26 13:26:48,572 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0276 | Val rms_score: 0.7785
2025-09-26 13:26:59,570 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0292 | Val rms_score: 0.7815
2025-09-26 13:27:10,783 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0253 | Val rms_score: 0.7859
2025-09-26 13:27:22,212 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0247 | Val rms_score: 0.7818
2025-09-26 13:27:35,042 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0288 | Val rms_score: 0.7855
2025-09-26 13:27:45,370 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0234 | Val rms_score: 0.7777
2025-09-26 13:27:56,015 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0187 | Val rms_score: 0.7758
2025-09-26 13:28:06,959 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0226 | Val rms_score: 0.7814
2025-09-26 13:28:18,093 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0220 | Val rms_score: 0.7792
2025-09-26 13:28:29,564 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0212 | Val rms_score: 0.7721
2025-09-26 13:28:39,939 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0207 | Val rms_score: 0.7766
2025-09-26 13:28:51,099 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0199 | Val rms_score: 0.7785
2025-09-26 13:29:02,353 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0195 | Val rms_score: 0.7787
2025-09-26 13:29:13,561 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0207 | Val rms_score: 0.7746
2025-09-26 13:29:25,169 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0201 | Val rms_score: 0.7733
2025-09-26 13:29:35,855 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0173 | Val rms_score: 0.7729
2025-09-26 13:29:47,058 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0198 | Val rms_score: 0.7750
2025-09-26 13:29:58,046 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0199 | Val rms_score: 0.7769
2025-09-26 13:30:09,101 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0233 | Val rms_score: 0.7795
2025-09-26 13:30:20,667 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0215 | Val rms_score: 0.7770
2025-09-26 13:30:31,249 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0190 | Val rms_score: 0.7737
2025-09-26 13:30:42,319 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0202 | Val rms_score: 0.7703
2025-09-26 13:30:53,297 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0186 | Val rms_score: 0.7765
2025-09-26 13:31:04,222 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0193 | Val rms_score: 0.7825
2025-09-26 13:31:15,897 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0177 | Val rms_score: 0.7797
2025-09-26 13:31:26,265 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0180 | Val rms_score: 0.7798
2025-09-26 13:31:38,712 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0172 | Val rms_score: 0.7695
2025-09-26 13:31:49,750 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0189 | Val rms_score: 0.7755
2025-09-26 13:31:59,490 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0167 | Val rms_score: 0.7763
2025-09-26 13:32:11,079 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0179 | Val rms_score: 0.7805
2025-09-26 13:32:21,641 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0174 | Val rms_score: 0.7773
2025-09-26 13:32:33,123 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0178 | Val rms_score: 0.7782
2025-09-26 13:32:44,114 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0170 | Val rms_score: 0.7847
2025-09-26 13:32:55,512 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0185 | Val rms_score: 0.7814
2025-09-26 13:33:07,157 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0189 | Val rms_score: 0.7690
2025-09-26 13:33:17,534 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0181 | Val rms_score: 0.7776
2025-09-26 13:33:28,881 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0166 | Val rms_score: 0.7753
2025-09-26 13:33:39,981 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0161 | Val rms_score: 0.7774
2025-09-26 13:33:40,959 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Test rms_score: 0.8220
2025-09-26 13:33:41,403 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Starting triplicate run 2 for dataset astrazeneca_solubility at 2025-09-26_13-33-41
2025-09-26 13:33:51,206 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.6750 | Val rms_score: 0.8976
2025-09-26 13:33:51,206 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 45
2025-09-26 13:33:51,792 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.8976
2025-09-26 13:34:02,321 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4278 | Val rms_score: 0.8126
2025-09-26 13:34:02,524 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 90
2025-09-26 13:34:03,215 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.8126
2025-09-26 13:34:13,932 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3482 | Val rms_score: 0.8442
2025-09-26 13:34:23,777 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.2708 | Val rms_score: 0.7978
2025-09-26 13:34:23,968 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 180
2025-09-26 13:34:24,518 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.7978
2025-09-26 13:34:35,816 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2250 | Val rms_score: 0.8127
2025-09-26 13:34:45,617 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.1889 | Val rms_score: 0.8218
2025-09-26 13:34:57,175 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1854 | Val rms_score: 0.8026
2025-09-26 13:35:07,756 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1625 | Val rms_score: 0.7942
|
| 132 |
+
2025-09-26 13:35:07,952 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 360
|
| 133 |
+
2025-09-26 13:35:08,546 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 8 with val rms_score: 0.7942
|
| 134 |
+
2025-09-26 13:35:19,712 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1578 | Val rms_score: 0.8740
|
| 135 |
+
2025-09-26 13:35:30,930 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1326 | Val rms_score: 0.8038
|
| 136 |
+
2025-09-26 13:35:42,162 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.1285 | Val rms_score: 0.8158
|
| 137 |
+
2025-09-26 13:35:53,709 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.1195 | Val rms_score: 0.7808
|
| 138 |
+
2025-09-26 13:35:53,865 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 540
|
| 139 |
+
2025-09-26 13:35:54,461 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 12 with val rms_score: 0.7808
|
| 140 |
+
2025-09-26 13:36:05,937 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.0993 | Val rms_score: 0.7845
|
| 141 |
+
2025-09-26 13:36:16,288 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.0823 | Val rms_score: 0.7906
|
| 142 |
+
2025-09-26 13:36:27,416 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0813 | Val rms_score: 0.7734
|
| 143 |
+
2025-09-26 13:36:27,573 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 675
|
| 144 |
+
2025-09-26 13:36:28,175 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 15 with val rms_score: 0.7734
|
| 145 |
+
2025-09-26 13:36:39,828 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0797 | Val rms_score: 0.7788
|
| 146 |
+
2025-09-26 13:36:51,313 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.0715 | Val rms_score: 0.7872
|
| 147 |
+
2025-09-26 13:37:01,760 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.0703 | Val rms_score: 0.8004
|
| 148 |
+
2025-09-26 13:37:12,967 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0830 | Val rms_score: 0.8070
|
| 149 |
+
2025-09-26 13:37:24,037 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.1250 | Val rms_score: 0.7810
|
| 150 |
+
2025-09-26 13:37:35,195 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0788 | Val rms_score: 0.7993
|
| 151 |
+
2025-09-26 13:37:46,069 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0688 | Val rms_score: 0.7960
|
| 152 |
+
2025-09-26 13:37:59,060 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0670 | Val rms_score: 0.8642
|
| 153 |
+
2025-09-26 13:38:09,273 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0986 | Val rms_score: 0.7798
|
| 154 |
+
2025-09-26 13:38:20,383 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0584 | Val rms_score: 0.7798
|
| 155 |
+
2025-09-26 13:38:30,246 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0535 | Val rms_score: 0.7865
|
| 156 |
+
2025-09-26 13:38:41,487 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0505 | Val rms_score: 0.7825
|
| 157 |
+
2025-09-26 13:38:52,161 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0465 | Val rms_score: 0.7745
|
| 158 |
+
2025-09-26 13:39:03,234 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0527 | Val rms_score: 0.7754
|
| 159 |
+
2025-09-26 13:39:14,485 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0434 | Val rms_score: 0.7700
|
| 160 |
+
2025-09-26 13:39:14,640 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1350
|
| 161 |
+
2025-09-26 13:39:15,575 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 30 with val rms_score: 0.7700
|
| 162 |
+
2025-09-26 13:39:27,013 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0427 | Val rms_score: 0.7868
|
| 163 |
+
2025-09-26 13:39:38,704 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0504 | Val rms_score: 0.8211
|
| 164 |
+
2025-09-26 13:39:49,008 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.1174 | Val rms_score: 0.7949
|
| 165 |
+
2025-09-26 13:40:00,443 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0503 | Val rms_score: 0.7821
|
| 166 |
+
2025-09-26 13:40:10,338 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0415 | Val rms_score: 0.7749
|
| 167 |
+
2025-09-26 13:40:20,233 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0393 | Val rms_score: 0.7799
|
| 168 |
+
2025-09-26 13:40:30,645 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0392 | Val rms_score: 0.7697
|
| 169 |
+
2025-09-26 13:40:30,803 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1665
|
| 170 |
+
2025-09-26 13:40:31,418 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 37 with val rms_score: 0.7697
|
| 171 |
+
2025-09-26 13:40:41,806 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0326 | Val rms_score: 0.7696
|
| 172 |
+
2025-09-26 13:40:41,998 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 1710
|
| 173 |
+
2025-09-26 13:40:42,581 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 38 with val rms_score: 0.7696
|
| 174 |
+
2025-09-26 13:40:52,456 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0358 | Val rms_score: 0.7790
|
| 175 |
+
2025-09-26 13:41:03,415 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0328 | Val rms_score: 0.7866
|
| 176 |
+
2025-09-26 13:41:14,521 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0340 | Val rms_score: 0.7926
|
| 177 |
+
2025-09-26 13:41:26,006 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0458 | Val rms_score: 0.7763
|
| 178 |
+
2025-09-26 13:41:36,864 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0359 | Val rms_score: 0.7821
|
| 179 |
+
2025-09-26 13:41:47,700 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0304 | Val rms_score: 0.7782
|
| 180 |
+
2025-09-26 13:42:00,697 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0311 | Val rms_score: 0.7881
|
| 181 |
+
2025-09-26 13:42:11,529 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0302 | Val rms_score: 0.7734
|
| 182 |
+
2025-09-26 13:42:23,434 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0264 | Val rms_score: 0.7851
|
| 183 |
+
2025-09-26 13:42:33,843 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0280 | Val rms_score: 0.7829
|
| 184 |
+
2025-09-26 13:42:45,118 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0330 | Val rms_score: 0.7784
|
| 185 |
+
2025-09-26 13:42:56,259 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0292 | Val rms_score: 0.7770
|
| 186 |
+
2025-09-26 13:43:07,434 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0302 | Val rms_score: 0.7775
|
| 187 |
+
2025-09-26 13:43:19,041 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0258 | Val rms_score: 0.7863
|
| 188 |
+
2025-09-26 13:43:29,558 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0248 | Val rms_score: 0.7966
|
| 189 |
+
2025-09-26 13:43:40,560 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0331 | Val rms_score: 0.7846
|
| 190 |
+
2025-09-26 13:43:51,796 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0385 | Val rms_score: 0.7851
|
| 191 |
+
2025-09-26 13:44:03,146 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0273 | Val rms_score: 0.7870
|
| 192 |
+
2025-09-26 13:44:14,701 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0252 | Val rms_score: 0.7892
|
| 193 |
+
2025-09-26 13:44:25,444 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0223 | Val rms_score: 0.7886
|
| 194 |
+
2025-09-26 13:44:36,965 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0238 | Val rms_score: 0.7753
|
| 195 |
+
2025-09-26 13:44:48,084 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0234 | Val rms_score: 0.7768
|
| 196 |
+
2025-09-26 13:44:59,363 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0236 | Val rms_score: 0.7820
|
| 197 |
+
2025-09-26 13:45:11,040 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0213 | Val rms_score: 0.7818
|
| 198 |
+
2025-09-26 13:45:21,771 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0231 | Val rms_score: 0.7804
|
| 199 |
+
2025-09-26 13:45:32,967 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0226 | Val rms_score: 0.7924
|
| 200 |
+
2025-09-26 13:45:44,041 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0214 | Val rms_score: 0.7804
|
| 201 |
+
2025-09-26 13:45:55,004 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0219 | Val rms_score: 0.7886
|
| 202 |
+
2025-09-26 13:46:07,819 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0262 | Val rms_score: 0.7843
|
| 203 |
+
2025-09-26 13:46:17,987 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0231 | Val rms_score: 0.7845
|
| 204 |
+
2025-09-26 13:46:28,473 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0224 | Val rms_score: 0.7851
|
| 205 |
+
2025-09-26 13:46:39,758 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0199 | Val rms_score: 0.7797
|
| 206 |
+
2025-09-26 13:46:50,653 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0203 | Val rms_score: 0.7780
|
| 207 |
+
2025-09-26 13:47:02,060 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0198 | Val rms_score: 0.7842
|
| 208 |
+
2025-09-26 13:47:12,653 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0222 | Val rms_score: 0.7786
|
| 209 |
+
2025-09-26 13:47:23,756 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0197 | Val rms_score: 0.7788
|
| 210 |
+
2025-09-26 13:47:34,840 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0192 | Val rms_score: 0.7753
|
| 211 |
+
2025-09-26 13:47:45,896 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0186 | Val rms_score: 0.7757
|
| 212 |
+
2025-09-26 13:47:57,402 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0184 | Val rms_score: 0.7740
|
| 213 |
+
2025-09-26 13:48:07,990 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0219 | Val rms_score: 0.7752
|
| 214 |
+
2025-09-26 13:48:19,340 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0197 | Val rms_score: 0.7811
|
| 215 |
+
2025-09-26 13:48:30,595 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0189 | Val rms_score: 0.7773
|
| 216 |
+
2025-09-26 13:48:41,984 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0194 | Val rms_score: 0.7854
|
| 217 |
+
2025-09-26 13:48:53,817 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0224 | Val rms_score: 0.7773
|
| 218 |
+
2025-09-26 13:49:04,169 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0177 | Val rms_score: 0.7791
|
| 219 |
+
2025-09-26 13:49:15,443 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0184 | Val rms_score: 0.7859
|
| 220 |
+
2025-09-26 13:49:26,512 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0198 | Val rms_score: 0.7828
|
| 221 |
+
2025-09-26 13:49:37,494 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0185 | Val rms_score: 0.7719
|
| 222 |
+
2025-09-26 13:49:48,490 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0176 | Val rms_score: 0.7784
|
| 223 |
+
2025-09-26 13:49:59,155 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0175 | Val rms_score: 0.7816
|
| 224 |
+
2025-09-26 13:50:11,471 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0192 | Val rms_score: 0.7780
|
| 225 |
+
2025-09-26 13:50:22,720 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0188 | Val rms_score: 0.7841
|
| 226 |
+
2025-09-26 13:50:32,333 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0177 | Val rms_score: 0.7847
|
| 227 |
+
2025-09-26 13:50:44,149 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0183 | Val rms_score: 0.7836
|
| 228 |
+
2025-09-26 13:50:54,649 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0175 | Val rms_score: 0.7853
|
| 229 |
+
2025-09-26 13:51:05,793 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0161 | Val rms_score: 0.7767
|
| 230 |
+
2025-09-26 13:51:16,881 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0168 | Val rms_score: 0.7793
|
| 231 |
+
2025-09-26 13:51:27,875 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0170 | Val rms_score: 0.7819
|
| 232 |
+
2025-09-26 13:51:39,607 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0165 | Val rms_score: 0.7794
|
| 233 |
+
2025-09-26 13:51:50,154 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0181 | Val rms_score: 0.7845
|
| 234 |
+
2025-09-26 13:52:01,370 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0167 | Val rms_score: 0.7827
|
| 235 |
+
2025-09-26 13:52:12,488 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0181 | Val rms_score: 0.7765
|
| 236 |
+
2025-09-26 13:52:13,459 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Test rms_score: 0.8866
|
| 237 |
+
2025-09-26 13:52:13,921 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Starting triplicate run 3 for dataset astrazeneca_solubility at 2025-09-26_13-52-13
|
| 238 |
+
2025-09-26 13:52:23,994 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 1/100 | Train Loss: 0.7111 | Val rms_score: 0.8542
|
| 239 |
+
2025-09-26 13:52:23,994 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 45
|
| 240 |
+
2025-09-26 13:52:24,546 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 1 with val rms_score: 0.8542
|
| 241 |
+
2025-09-26 13:52:35,041 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 2/100 | Train Loss: 0.4556 | Val rms_score: 0.8372
|
| 242 |
+
2025-09-26 13:52:35,222 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 90
|
| 243 |
+
2025-09-26 13:52:35,769 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 2 with val rms_score: 0.8372
|
| 244 |
+
2025-09-26 13:52:46,154 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 3/100 | Train Loss: 0.3482 | Val rms_score: 0.8297
|
| 245 |
+
2025-09-26 13:52:46,343 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 135
|
| 246 |
+
2025-09-26 13:52:46,909 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 3 with val rms_score: 0.8297
|
| 247 |
+
2025-09-26 13:52:56,741 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 4/100 | Train Loss: 0.3000 | Val rms_score: 0.8201
|
| 248 |
+
2025-09-26 13:52:56,926 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 180
|
| 249 |
+
2025-09-26 13:52:57,504 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 4 with val rms_score: 0.8201
|
| 250 |
+
2025-09-26 13:53:07,307 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 5/100 | Train Loss: 0.2662 | Val rms_score: 0.8202
|
| 251 |
+
2025-09-26 13:53:18,510 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 6/100 | Train Loss: 0.2319 | Val rms_score: 0.7968
|
| 252 |
+
2025-09-26 13:53:19,119 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 270
|
| 253 |
+
2025-09-26 13:53:19,831 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 6 with val rms_score: 0.7968
|
| 254 |
+
2025-09-26 13:53:31,485 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 7/100 | Train Loss: 0.1781 | Val rms_score: 0.8236
|
| 255 |
+
2025-09-26 13:53:41,255 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 8/100 | Train Loss: 0.1569 | Val rms_score: 0.8031
|
| 256 |
+
2025-09-26 13:53:51,080 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 9/100 | Train Loss: 0.1703 | Val rms_score: 0.7950
|
| 257 |
+
2025-09-26 13:53:51,292 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 405
|
| 258 |
+
2025-09-26 13:53:51,848 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 9 with val rms_score: 0.7950
|
| 259 |
+
2025-09-26 13:54:01,937 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 10/100 | Train Loss: 0.1639 | Val rms_score: 0.7853
|
| 260 |
+
2025-09-26 13:54:02,154 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 450
|
| 261 |
+
2025-09-26 13:54:02,760 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 10 with val rms_score: 0.7853
|
| 262 |
+
2025-09-26 13:54:12,656 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 11/100 | Train Loss: 0.1257 | Val rms_score: 0.7955
|
| 263 |
+
2025-09-26 13:54:23,038 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 12/100 | Train Loss: 0.1250 | Val rms_score: 0.8307
|
| 264 |
+
2025-09-26 13:54:33,637 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 13/100 | Train Loss: 0.1194 | Val rms_score: 0.7691
|
| 265 |
+
2025-09-26 13:54:33,822 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Global step of best model: 585
|
| 266 |
+
2025-09-26 13:54:34,395 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Best model saved at epoch 13 with val rms_score: 0.7691
|
| 267 |
+
2025-09-26 13:54:45,715 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 14/100 | Train Loss: 0.1026 | Val rms_score: 0.7885
|
| 268 |
+
2025-09-26 13:54:55,798 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 15/100 | Train Loss: 0.0924 | Val rms_score: 0.7810
|
| 269 |
+
2025-09-26 13:55:07,082 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 16/100 | Train Loss: 0.0934 | Val rms_score: 0.8110
|
| 270 |
+
2025-09-26 13:55:18,898 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 17/100 | Train Loss: 0.1111 | Val rms_score: 0.7975
|
| 271 |
+
2025-09-26 13:55:29,465 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 18/100 | Train Loss: 0.1047 | Val rms_score: 0.8331
|
| 272 |
+
2025-09-26 13:55:40,615 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 19/100 | Train Loss: 0.0944 | Val rms_score: 0.7800
|
| 273 |
+
2025-09-26 13:55:51,688 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 20/100 | Train Loss: 0.0694 | Val rms_score: 0.7858
|
| 274 |
+
2025-09-26 13:56:02,657 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 21/100 | Train Loss: 0.0635 | Val rms_score: 0.7762
|
| 275 |
+
2025-09-26 13:56:14,590 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 22/100 | Train Loss: 0.0597 | Val rms_score: 0.7937
|
| 276 |
+
2025-09-26 13:56:25,542 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 23/100 | Train Loss: 0.0594 | Val rms_score: 0.7759
|
| 277 |
+
2025-09-26 13:56:37,201 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 24/100 | Train Loss: 0.0590 | Val rms_score: 0.7825
|
| 278 |
+
2025-09-26 13:56:48,730 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 25/100 | Train Loss: 0.0716 | Val rms_score: 0.7732
|
| 279 |
+
2025-09-26 13:57:01,629 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 26/100 | Train Loss: 0.0566 | Val rms_score: 0.7848
|
| 280 |
+
2025-09-26 13:57:14,215 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 27/100 | Train Loss: 0.0534 | Val rms_score: 0.7851
|
| 281 |
+
2025-09-26 13:57:25,481 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 28/100 | Train Loss: 0.0538 | Val rms_score: 0.8081
|
| 282 |
+
2025-09-26 13:57:36,540 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 29/100 | Train Loss: 0.0695 | Val rms_score: 0.7852
|
| 283 |
+
2025-09-26 13:57:47,459 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 30/100 | Train Loss: 0.0958 | Val rms_score: 0.7963
|
| 284 |
+
2025-09-26 13:57:58,637 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 31/100 | Train Loss: 0.0722 | Val rms_score: 0.8134
|
| 285 |
+
2025-09-26 13:58:10,406 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 32/100 | Train Loss: 0.0508 | Val rms_score: 0.8020
|
| 286 |
+
2025-09-26 13:58:21,894 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 33/100 | Train Loss: 0.0476 | Val rms_score: 0.8108
|
| 287 |
+
2025-09-26 13:58:32,905 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 34/100 | Train Loss: 0.0477 | Val rms_score: 0.8176
|
| 288 |
+
2025-09-26 13:58:43,877 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 35/100 | Train Loss: 0.0415 | Val rms_score: 0.8120
|
| 289 |
+
2025-09-26 13:58:54,946 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 36/100 | Train Loss: 0.0398 | Val rms_score: 0.8058
|
| 290 |
+
2025-09-26 13:59:06,856 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 37/100 | Train Loss: 0.0465 | Val rms_score: 0.8037
|
| 291 |
+
2025-09-26 13:59:18,184 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 38/100 | Train Loss: 0.0357 | Val rms_score: 0.7942
|
| 292 |
+
2025-09-26 13:59:28,844 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 39/100 | Train Loss: 0.0384 | Val rms_score: 0.8053
|
| 293 |
+
2025-09-26 13:59:39,716 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 40/100 | Train Loss: 0.0372 | Val rms_score: 0.7993
|
| 294 |
+
2025-09-26 13:59:51,125 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 41/100 | Train Loss: 0.0337 | Val rms_score: 0.8026
|
| 295 |
+
2025-09-26 14:00:02,765 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 42/100 | Train Loss: 0.0312 | Val rms_score: 0.8014
|
| 296 |
+
2025-09-26 14:00:13,903 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 43/100 | Train Loss: 0.0321 | Val rms_score: 0.8088
|
| 297 |
+
2025-09-26 14:00:24,677 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 44/100 | Train Loss: 0.0354 | Val rms_score: 0.7943
|
| 298 |
+
2025-09-26 14:00:37,226 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 45/100 | Train Loss: 0.0358 | Val rms_score: 0.7916
|
| 299 |
+
2025-09-26 14:00:49,378 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 46/100 | Train Loss: 0.0312 | Val rms_score: 0.7927
|
| 300 |
+
2025-09-26 14:01:01,066 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 47/100 | Train Loss: 0.0293 | Val rms_score: 0.7951
|
| 301 |
+
2025-09-26 14:01:12,536 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 48/100 | Train Loss: 0.0297 | Val rms_score: 0.8016
|
| 302 |
+
2025-09-26 14:01:23,450 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 49/100 | Train Loss: 0.0256 | Val rms_score: 0.8044
|
| 303 |
+
2025-09-26 14:01:34,598 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 50/100 | Train Loss: 0.0285 | Val rms_score: 0.7897
|
| 304 |
+
2025-09-26 14:01:45,842 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 51/100 | Train Loss: 0.0286 | Val rms_score: 0.8050
|
| 305 |
+
2025-09-26 14:01:57,946 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 52/100 | Train Loss: 0.0254 | Val rms_score: 0.7957
|
| 306 |
+
2025-09-26 14:02:09,583 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 53/100 | Train Loss: 0.0286 | Val rms_score: 0.7975
|
| 307 |
+
2025-09-26 14:02:20,675 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 54/100 | Train Loss: 0.0283 | Val rms_score: 0.7920
|
| 308 |
+
2025-09-26 14:02:31,932 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 55/100 | Train Loss: 0.0266 | Val rms_score: 0.8093
|
| 309 |
+
2025-09-26 14:02:43,264 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 56/100 | Train Loss: 0.0293 | Val rms_score: 0.8071
|
| 310 |
+
2025-09-26 14:02:55,662 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 57/100 | Train Loss: 0.0255 | Val rms_score: 0.8098
|
| 311 |
+
2025-09-26 14:03:07,465 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 58/100 | Train Loss: 0.0258 | Val rms_score: 0.7965
|
| 312 |
+
2025-09-26 14:03:18,906 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 59/100 | Train Loss: 0.0259 | Val rms_score: 0.7953
2025-09-26 14:03:30,069 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 60/100 | Train Loss: 0.0255 | Val rms_score: 0.7940
2025-09-26 14:03:41,445 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 61/100 | Train Loss: 0.0241 | Val rms_score: 0.7907
2025-09-26 14:03:53,675 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 62/100 | Train Loss: 0.0250 | Val rms_score: 0.7990
2025-09-26 14:04:05,411 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 63/100 | Train Loss: 0.0254 | Val rms_score: 0.7980
2025-09-26 14:04:17,098 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 64/100 | Train Loss: 0.0247 | Val rms_score: 0.8035
2025-09-26 14:04:28,577 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 65/100 | Train Loss: 0.0252 | Val rms_score: 0.7890
2025-09-26 14:04:40,184 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 66/100 | Train Loss: 0.0240 | Val rms_score: 0.7905
2025-09-26 14:04:53,466 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 67/100 | Train Loss: 0.0251 | Val rms_score: 0.7953
2025-09-26 14:05:05,887 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 68/100 | Train Loss: 0.0250 | Val rms_score: 0.7936
2025-09-26 14:05:17,133 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 69/100 | Train Loss: 0.0195 | Val rms_score: 0.7970
2025-09-26 14:05:28,715 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 70/100 | Train Loss: 0.0220 | Val rms_score: 0.7905
2025-09-26 14:05:40,312 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 71/100 | Train Loss: 0.0222 | Val rms_score: 0.7910
2025-09-26 14:05:52,811 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 72/100 | Train Loss: 0.0222 | Val rms_score: 0.7909
2025-09-26 14:06:04,857 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 73/100 | Train Loss: 0.0226 | Val rms_score: 0.7909
2025-09-26 14:06:16,273 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 74/100 | Train Loss: 0.0232 | Val rms_score: 0.8029
2025-09-26 14:06:27,888 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 75/100 | Train Loss: 0.0227 | Val rms_score: 0.7907
2025-09-26 14:06:39,350 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 76/100 | Train Loss: 0.0190 | Val rms_score: 0.7944
2025-09-26 14:06:52,406 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 77/100 | Train Loss: 0.0214 | Val rms_score: 0.7922
2025-09-26 14:07:04,740 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 78/100 | Train Loss: 0.0252 | Val rms_score: 0.8091
2025-09-26 14:07:16,319 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 79/100 | Train Loss: 0.0208 | Val rms_score: 0.7971
2025-09-26 14:07:28,256 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 80/100 | Train Loss: 0.0207 | Val rms_score: 0.7918
2025-09-26 14:07:39,997 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 81/100 | Train Loss: 0.0214 | Val rms_score: 0.8010
2025-09-26 14:07:52,977 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 82/100 | Train Loss: 0.0238 | Val rms_score: 0.7945
2025-09-26 14:08:05,602 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 83/100 | Train Loss: 0.0209 | Val rms_score: 0.7887
2025-09-26 14:08:17,233 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 84/100 | Train Loss: 0.0198 | Val rms_score: 0.7940
2025-09-26 14:08:28,917 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 85/100 | Train Loss: 0.0259 | Val rms_score: 0.7906
2025-09-26 14:08:40,689 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 86/100 | Train Loss: 0.0255 | Val rms_score: 0.7942
2025-09-26 14:08:53,489 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 87/100 | Train Loss: 0.0197 | Val rms_score: 0.7925
2025-09-26 14:09:05,691 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 88/100 | Train Loss: 0.0185 | Val rms_score: 0.7906
2025-09-26 14:09:18,502 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 89/100 | Train Loss: 0.0210 | Val rms_score: 0.7905
2025-09-26 14:09:31,269 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 90/100 | Train Loss: 0.0183 | Val rms_score: 0.7927
2025-09-26 14:09:43,565 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 91/100 | Train Loss: 0.0210 | Val rms_score: 0.7911
2025-09-26 14:09:55,914 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 92/100 | Train Loss: 0.0191 | Val rms_score: 0.7935
2025-09-26 14:10:07,897 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 93/100 | Train Loss: 0.0193 | Val rms_score: 0.7933
2025-09-26 14:10:19,031 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 94/100 | Train Loss: 0.0223 | Val rms_score: 0.7914
2025-09-26 14:10:30,294 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 95/100 | Train Loss: 0.0213 | Val rms_score: 0.7915
2025-09-26 14:10:41,693 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 96/100 | Train Loss: 0.0199 | Val rms_score: 0.7915
2025-09-26 14:10:54,011 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 97/100 | Train Loss: 0.0194 | Val rms_score: 0.7933
2025-09-26 14:11:05,573 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 98/100 | Train Loss: 0.0178 | Val rms_score: 0.7946
2025-09-26 14:11:16,744 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 99/100 | Train Loss: 0.0227 | Val rms_score: 0.8005
2025-09-26 14:11:27,825 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Epoch 100/100 | Train Loss: 0.0218 | Val rms_score: 0.7998
2025-09-26 14:11:28,529 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Test rms_score: 0.8606
2025-09-26 14:11:28,980 - logs_modchembert_astrazeneca_solubility_epochs100_batch_size32 - INFO - Final Triplicate Test Results — Avg rms_score: 0.8564, Std Dev: 0.0265