segformer-b5-finetuned-apple-dms-v2-run2

This model is a fine-tuned version of nvidia/segformer-b5-finetuned-ade-640-640 on the AllanK24/apple-dms-materials-v2 dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1816
  • Mean Iou: 0.5168
  • Mean Accuracy: 0.6165
  • Overall Accuracy: 0.8368
  • Accuracy Animal Skin: 0.8455
  • Iou Animal Skin: 0.6188
  • Accuracy Bone Teeth Horn: 0.0931
  • Iou Bone Teeth Horn: 0.0892
  • Accuracy Brickwork: 0.7976
  • Iou Brickwork: 0.6377
  • Accuracy Cardboard: 0.7038
  • Iou Cardboard: 0.5829
  • Accuracy Carpet Rug: 0.8714
  • Iou Carpet Rug: 0.7699
  • Accuracy Ceiling Tile: 0.8745
  • Iou Ceiling Tile: 0.7971
  • Accuracy Ceramic: 0.7644
  • Iou Ceramic: 0.6403
  • Accuracy Chalkboard Blackboard: 0.8354
  • Iou Chalkboard Blackboard: 0.7216
  • Accuracy Clutter: 0.6474
  • Iou Clutter: 0.4418
  • Accuracy Concrete: 0.5971
  • Iou Concrete: 0.4432
  • Accuracy Cork Corkboard: 0.1812
  • Iou Cork Corkboard: 0.1587
  • Accuracy Engineered Stone: 0.1564
  • Iou Engineered Stone: 0.1235
  • Accuracy Fabric Cloth: 0.9101
  • Iou Fabric Cloth: 0.8157
  • Accuracy Fiberglass Wool: 0.0
  • Iou Fiberglass Wool: 0.0
  • Accuracy Fire: 0.8079
  • Iou Fire: 0.5896
  • Accuracy Foliage: 0.9207
  • Iou Foliage: 0.8361
  • Accuracy Food: 0.9126
  • Iou Food: 0.8083
  • Accuracy Fur: 0.9457
  • Iou Fur: 0.9007
  • Accuracy Gemstone Quartz: 0.4169
  • Iou Gemstone Quartz: 0.3774
  • Accuracy Glass: 0.7759
  • Iou Glass: 0.6581
  • Accuracy Hair: 0.8560
  • Iou Hair: 0.7465
  • Accuracy Ice: 0.7561
  • Iou Ice: 0.5848
  • Accuracy Leather: 0.6776
  • Iou Leather: 0.5701
  • Accuracy Liquid Non-water: 0.3126
  • Iou Liquid Non-water: 0.2938
  • Accuracy Metal: 0.5541
  • Iou Metal: 0.4088
  • Accuracy Mirror: 0.6526
  • Iou Mirror: 0.5701
  • Accuracy Paint Plaster Enamel: 0.8851
  • Iou Paint Plaster Enamel: 0.7697
  • Accuracy Paper: 0.7511
  • Iou Paper: 0.6369
  • Accuracy Pearl: 0.0
  • Iou Pearl: 0.0
  • Accuracy Photograph Painting: 0.5312
  • Iou Photograph Painting: 0.3893
  • Accuracy Plastic Clear: 0.3886
  • Iou Plastic Clear: 0.2666
  • Accuracy Plastic Non-clear: 0.5812
  • Iou Plastic Non-clear: 0.4455
  • Accuracy Rubber Latex: 0.4111
  • Iou Rubber Latex: 0.3570
  • Accuracy Sand: 0.6490
  • Iou Sand: 0.4828
  • Accuracy Skin Lips: 0.8694
  • Iou Skin Lips: 0.7614
  • Accuracy Sky: 0.9828
  • Iou Sky: 0.9506
  • Accuracy Snow: 0.7912
  • Iou Snow: 0.7119
  • Accuracy Soap: 0.0
  • Iou Soap: 0.0
  • Accuracy Soil Mud: 0.6244
  • Iou Soil Mud: 0.4812
  • Accuracy Sponge: 0.0
  • Iou Sponge: 0.0
  • Accuracy Stone Natural: 0.7767
  • Iou Stone Natural: 0.5933
  • Accuracy Stone Polished: 0.4189
  • Iou Stone Polished: 0.3580
  • Accuracy Styrofoam: 0.0
  • Iou Styrofoam: 0.0
  • Accuracy Tile: 0.8246
  • Iou Tile: 0.6969
  • Accuracy Wallpaper: 0.6797
  • Iou Wallpaper: 0.4607
  • Accuracy Water: 0.9246
  • Iou Water: 0.8367
  • Accuracy Wax: 0.6517
  • Iou Wax: 0.5691
  • Accuracy Whiteboard: 0.8631
  • Iou Whiteboard: 0.7922
  • Accuracy Wicker: 0.5960
  • Iou Wicker: 0.5027
  • Accuracy Wood: 0.8617
  • Iou Wood: 0.7449
  • Accuracy Wood Tree: 0.4590
  • Iou Wood Tree: 0.3484
  • Accuracy Asphalt: 0.6704
  • Iou Asphalt: 0.5317
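The per-class numbers above are the standard semantic-segmentation metrics: per-class accuracy is the fraction of a class's ground-truth pixels that were predicted correctly, and IoU divides the intersection of predicted and ground-truth pixels by their union. A minimal sketch of how such values are computed (illustrative only; this is not the evaluation code behind this card):

```python
def per_class_metrics(pred, label, num_classes):
    """Per-class accuracy (recall) and IoU over flat pixel arrays.

    accuracy_c = TP_c / (TP_c + FN_c)
    iou_c      = TP_c / (TP_c + FP_c + FN_c)

    Classes absent from both prediction and label get NaN, mirroring how
    unseen classes (e.g. the 0.0 rows above) can drag down the mean.
    """
    acc, iou = [], []
    for c in range(num_classes):
        tp = sum(1 for p, l in zip(pred, label) if p == c and l == c)
        fn = sum(1 for p, l in zip(pred, label) if p != c and l == c)
        fp = sum(1 for p, l in zip(pred, label) if p == c and l != c)
        acc.append(tp / (tp + fn) if tp + fn else float("nan"))
        iou.append(tp / (tp + fp + fn) if tp + fp + fn else float("nan"))
    return acc, iou

# Tiny 4-pixel example with 2 classes:
acc, iou = per_class_metrics([0, 0, 1, 1], [0, 1, 1, 1], 2)
# class 0: TP=1, FP=1, FN=0 -> accuracy 1.0, IoU 0.5
# class 1: TP=2, FP=0, FN=1 -> accuracy 2/3, IoU 2/3
```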

Model description

More information needed

Intended uses & limitations

More information needed
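The card ships no usage snippet. Below is a minimal inference sketch under the standard transformers SegFormer API; the processor/model calls are an assumption based on the base checkpoint's family, and `run_inference` / `upsample_argmax` are hypothetical helper names, not part of this repository:

```python
def upsample_argmax(logits, out_h, out_w):
    """Nearest-neighbor upsample class logits of shape (C, h, w) to the
    input resolution, then take the per-pixel argmax -- the usual
    SegFormer post-processing step (sketched here in plain NumPy)."""
    import numpy as np
    logits = np.asarray(logits)
    c, h, w = logits.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return logits[:, rows][:, :, cols].argmax(axis=0)  # (out_h, out_w) class ids

def run_inference(image_path):
    """Hypothetical helper; not executed here, needs network access to the Hub."""
    from PIL import Image
    from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

    repo = "AllanK24/segformer-b5-finetuned-apple-dms-v2-run2"
    processor = SegformerImageProcessor.from_pretrained(repo)
    model = SegformerForSemanticSegmentation.from_pretrained(repo)

    image = Image.open(image_path)
    inputs = processor(images=image, return_tensors="pt")
    logits = model(**inputs).logits[0].detach().numpy()  # (num_labels, h/4, w/4)
    return upsample_argmax(logits, image.height, image.width)
```

In practice, `processor.post_process_semantic_segmentation(outputs, target_sizes=...)` performs this resizing (with bilinear interpolation) and is the more idiomatic path.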

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 8
  • total_train_batch_size: 256
  • total_eval_batch_size: 128
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 0.1
  • num_epochs: 40
  • label_smoothing_factor: 0.1
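The listed `lr_scheduler_warmup_steps: 0.1` reads like a warmup *ratio* rather than a step count. Under that assumption, the cosine schedule with linear warmup can be sketched as follows (illustrative only; not the trainer's exact implementation):

```python
import math

def lr_at(step, total_steps, base_lr=1e-3, warmup_ratio=0.1):
    """Linear warmup over the first warmup_ratio of training, then cosine
    decay from base_lr down to 0 over the remaining steps."""
    warmup = int(total_steps * warmup_ratio)
    if step < warmup:
        return base_lr * step / max(1, warmup)        # linear ramp to base_lr
    progress = (step - warmup) / max(1, total_steps - warmup)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

# With ~4900 training steps (see the results table): warmup ends near step 490,
# the learning rate peaks at 0.001 there, and decays toward 0 by the final step.
```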

Training results

Per-class accuracy and IoU were also logged at each evaluation step but are omitted from this table for readability; the final checkpoint's per-class values are listed at the top of this card.

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
|---|---|---|---|---|---|---|
| 2.1540 | 2.6923 | 350 | 1.3162 | 0.3595 | 0.4276 | 0.7877 |
| 1.3332 | 5.3846 | 700 | 1.2432 | 0.4272 | 0.5210 | 0.8081 |
| 1.2446 | 8.0769 | 1050 | 1.2176 | 0.4600 | 0.5508 | 0.8177 |
| 1.1962 | 10.7692 | 1400 | 1.2087 | 0.4609 | 0.5526 | 0.8209 |
| 1.1603 | 13.4615 | 1750 | 1.2011 | 0.4909 | 0.5964 | 0.8238 |
| 1.1330 | 16.1538 | 2100 | 1.1925 | 0.5014 | 0.5943 | 0.8285 |
| 1.1093 | 18.8462 | 2450 | 1.1892 | 0.5004 | 0.5979 | 0.8284 |
| 1.0900 | 21.5385 | 2800 | 1.1883 | 0.4960 | 0.5990 | 0.8312 |
| 1.0743 | 24.2308 | 3150 | 1.1888 | 0.5138 | 0.6143 | 0.8321 |
| 1.0588 | 26.9231 | 3500 | 1.1853 | 0.5067 | 0.6022 | 0.8342 |
| 1.0469 | 29.6154 | 3850 | 1.1874 | 0.5111 | 0.6127 | 0.8357 |
| 1.0386 | 32.3077 | 4200 | 1.1855 | 0.5119 | 0.6130 | 0.8361 |
| 1.0339 | 35.0 | 4550 | 1.1825 | 0.5160 | 0.6146 | 0.8369 |
| 1.0281 | 37.6923 | 4900 | 1.1816 | 0.5168 | 0.6165 | 0.8368 |

Framework versions

  • Transformers 5.0.0
  • Pytorch 2.9.1+cu128
  • Datasets 4.5.0
  • Tokenizers 0.22.2
Model size

  • 84.6M parameters (F32, Safetensors)