dacunaq committed (verified)
Commit ec8a314 · Parent(s): 8b00037

Model save
README.md ADDED
@@ -0,0 +1,149 @@
+ ---
+ library_name: transformers
+ license: apache-2.0
+ base_model: google/vit-base-patch32-384
+ tags:
+ - generated_from_trainer
+ datasets:
+ - imagefolder
+ metrics:
+ - accuracy
+ model-index:
+ - name: vit-base-patch32-384-finetuned-humid-classes-nov27-18-00-f4
+   results:
+   - task:
+       name: Image Classification
+       type: image-classification
+     dataset:
+       name: imagefolder
+       type: imagefolder
+       config: default
+       split: validation
+       args: default
+     metrics:
+     - name: Accuracy
+       type: accuracy
+       value: 1.0
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # vit-base-patch32-384-finetuned-humid-classes-nov27-18-00-f4
+
+ This model is a fine-tuned version of [google/vit-base-patch32-384](https://huggingface.co/google/vit-base-patch32-384) on the imagefolder dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0012
+ - Accuracy: 1.0
+ - F1 Macro: 1.0
+ - Precision Macro: 1.0
+ - Recall Macro: 1.0
+ - Precision Dry: 1.0
+ - Recall Dry: 1.0
+ - F1 Dry: 1.0
+ - Precision Firm: 1.0
+ - Recall Firm: 1.0
+ - F1 Firm: 1.0
+ - Precision Humid: 1.0
+ - Recall Humid: 1.0
+ - F1 Humid: 1.0
+ - Precision Lump: 1.0
+ - Recall Lump: 1.0
+ - F1 Lump: 1.0
+ - Precision Moist: 1.0
+ - Recall Moist: 1.0
+ - F1 Moist: 1.0
+ - Precision Rockies: 1.0
+ - Recall Rockies: 1.0
+ - F1 Rockies: 1.0
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
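+ Since the usage sections above are still unfilled, here is a minimal inference sketch using the `transformers` `pipeline` API. The repo id below is an assumption pieced together from this card's model name and committer, and the six class labels (dry, firm, humid, lump, moist, rockies) are inferred from the per-class metrics reported above; at runtime the labels come from the checkpoint's own config.
+
+ ```python
+ from transformers import pipeline
+
+ # Assumed repo id, based on this card; adjust to the actual Hub path.
+ classifier = pipeline(
+     "image-classification",
+     model="dacunaq/vit-base-patch32-384-finetuned-humid-classes-nov27-18-00-f4",
+ )
+
+ # "sample.jpg" is a placeholder; any image path, URL, or PIL.Image works.
+ for pred in classifier("sample.jpg"):
+     print(f"{pred['label']}: {pred['score']:.4f}")
+ ```
+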
+ ## Training and evaluation data
+
+ More information needed
+
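+ The card does not document the data beyond its `imagefolder` type. As a hedged sketch, an imagefolder dataset is loaded from a directory whose subfolder names become the class labels (the `data_dir` below is a placeholder):
+
+ ```python
+ from datasets import load_dataset
+
+ # Placeholder path; expected layout is data/<class_name>/<image files>.
+ dataset = load_dataset("imagefolder", data_dir="data")
+ print(dataset)                     # DatasetDict with the detected splits
+ print(dataset["train"].features)  # includes a ClassLabel built from folder names
+ ```
+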
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
+ - learning_rate: 5e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - gradient_accumulation_steps: 4
+ - total_train_batch_size: 64
+ - optimizer: AdamW (`adamw_torch_fused`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 50
+
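+ As a non-authoritative reconstruction, the listed values map onto `TrainingArguments` roughly as below; note that the effective train batch size is 16 × 4 = 64 via gradient accumulation. The output directory is a placeholder, and anything not listed in this card is left at Trainer defaults.
+
+ ```python
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     output_dir="vit-finetuned-humid-classes",  # placeholder, not from the card
+     learning_rate=5e-5,
+     per_device_train_batch_size=16,
+     per_device_eval_batch_size=16,
+     seed=42,
+     gradient_accumulation_steps=4,  # 16 * 4 = 64 total train batch size
+     optim="adamw_torch_fused",
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+     lr_scheduler_type="linear",
+     warmup_ratio=0.1,
+     num_train_epochs=50,
+ )
+ ```
+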
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | Precision Dry | Recall Dry | F1 Dry | Precision Firm | Recall Firm | F1 Firm | Precision Humid | Recall Humid | F1 Humid | Precision Lump | Recall Lump | F1 Lump | Precision Moist | Recall Moist | F1 Moist | Precision Rockies | Recall Rockies | F1 Rockies |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:---------------:|:------------:|:-------------:|:----------:|:------:|:--------------:|:-----------:|:-------:|:---------------:|:------------:|:--------:|:--------------:|:-----------:|:-------:|:---------------:|:------------:|:--------:|:-----------------:|:--------------:|:----------:|
+ | No log | 1.0 | 3 | 1.6676 | 0.3871 | 0.2730 | 0.2315 | 0.3333 | 0.2222 | 0.3333 | 0.2667 | 0.6667 | 1.0 | 0.8 | 0.5 | 0.6667 | 0.5714 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 2.0 | 6 | 1.3592 | 0.5161 | 0.4091 | 0.4333 | 0.4556 | 0.3333 | 0.3333 | 0.3333 | 0.6667 | 1.0 | 0.8 | 0.6 | 1.0 | 0.75 | 1.0 | 0.4 | 0.5714 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 3.0 | 9 | 0.9985 | 0.6774 | 0.5548 | 0.6583 | 0.6083 | 0.6 | 1.0 | 0.75 | 0.75 | 1.0 | 0.8571 | 0.6 | 1.0 | 0.75 | 1.0 | 0.4 | 0.5714 | 1.0 | 0.25 | 0.4 | 0.0 | 0.0 | 0.0 |
+ | 1.4875 | 4.0 | 12 | 0.6967 | 0.7097 | 0.5994 | 0.6190 | 0.6417 | 0.5 | 1.0 | 0.6667 | 0.8571 | 1.0 | 0.9231 | 0.8571 | 1.0 | 0.9231 | 1.0 | 0.6 | 0.75 | 0.0 | 0.0 | 0.0 | 0.5 | 0.25 | 0.3333 |
+ | 1.4875 | 5.0 | 15 | 0.4559 | 0.9677 | 0.9687 | 0.9762 | 0.9667 | 1.0 | 1.0 | 1.0 | 0.8571 | 1.0 | 0.9231 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8889 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 1.4875 | 6.0 | 18 | 0.2447 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.5207 | 7.0 | 21 | 0.1329 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.5207 | 8.0 | 24 | 0.1024 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.5207 | 9.0 | 27 | 0.0473 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0916 | 10.0 | 30 | 0.0282 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0916 | 11.0 | 33 | 0.0175 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0916 | 12.0 | 36 | 0.0101 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0916 | 13.0 | 39 | 0.0097 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0124 | 14.0 | 42 | 0.0278 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0124 | 15.0 | 45 | 0.0061 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0124 | 16.0 | 48 | 0.0034 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0039 | 17.0 | 51 | 0.0030 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0039 | 18.0 | 54 | 0.0026 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0039 | 19.0 | 57 | 0.0023 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0021 | 20.0 | 60 | 0.0022 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0021 | 21.0 | 63 | 0.0022 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0021 | 22.0 | 66 | 0.0022 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0021 | 23.0 | 69 | 0.0022 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0013 | 24.0 | 72 | 0.0022 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0013 | 25.0 | 75 | 0.0021 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0013 | 26.0 | 78 | 0.0020 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0011 | 27.0 | 81 | 0.0018 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0011 | 28.0 | 84 | 0.0017 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0011 | 29.0 | 87 | 0.0016 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0009 | 30.0 | 90 | 0.0015 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0009 | 31.0 | 93 | 0.0015 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0009 | 32.0 | 96 | 0.0014 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0009 | 33.0 | 99 | 0.0014 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 34.0 | 102 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 35.0 | 105 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 36.0 | 108 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 37.0 | 111 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 38.0 | 114 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 39.0 | 117 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 40.0 | 120 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 41.0 | 123 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 42.0 | 126 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0008 | 43.0 | 129 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0007 | 44.0 | 132 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0007 | 45.0 | 135 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0007 | 46.0 | 138 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0007 | 47.0 | 141 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0007 | 48.0 | 144 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0007 | 49.0 | 147 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0007 | 50.0 | 150 | 0.0012 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
+
+
+ ### Framework versions
+
+ - Transformers 4.57.1
+ - PyTorch 2.9.0+cu126
+ - Datasets 4.0.0
+ - Tokenizers 0.22.0
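+
+ Matching these versions helps when reproducing the run. A quick sanity check, assuming the four packages are installed locally:
+
+ ```python
+ import datasets, tokenizers, torch, transformers
+
+ # Versions reported in this card; flag any local mismatch.
+ expected = {
+     "transformers": "4.57.1",
+     "torch": "2.9.0+cu126",
+     "datasets": "4.0.0",
+     "tokenizers": "0.22.0",
+ }
+ for name, module in [("transformers", transformers), ("torch", torch),
+                      ("datasets", datasets), ("tokenizers", tokenizers)]:
+     status = "matches card" if module.__version__ == expected[name] else "differs from card"
+     print(f"{name}: {module.__version__} ({status})")
+ ```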
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ed9fb1d1a82d85774717312ae36e8f8dc051eeadce8bda41215f9ea89ecb8dd4
+ oid sha256:84e2a167e2b3c6dd189a0e2e06af0f2c9001584de01422cdea2bccd57c050cd0
  size 350154440
runs/Nov27_17-57-43_tech/events.out.tfevents.1764284265.tech.333507.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:fcd68ba8cdb4905c400f38e06fade92aa128c4010b9d98f946a24ba487d5a4ce
- size 79864
+ oid sha256:2c67d877dd2d4628e0610978e52b48e9d1d6b365b6e14527e4f9bba191483667
+ size 81915