Bristol-Myers-Squibb-Molecular-Translation

This repository contains my code for the Bristol-Myers Squibb – Molecular Translation competition on Kaggle.

7th Place Solution for Bristol-Myers Squibb – Molecular Translation

Team: Mr_KnowNothing, Shivam Gupta, Phaedrus, Nischay Dhankhar, atfujita

  • All models (Team)
    Public LB: 0.60 (7th)
    Private LB: 0.60 (7th)

The full picture of our solution is here

Note: This repository contains only my models and their training scripts.

  • My models (3-model average)
    Public LB: 0.66
    Private LB: 0.66

Note: This repository is based on hengck23's great assets. Please check here for details.
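The 3-model averaging above is an ensemble step. The ensembling code is not part of this repository, so the following is only a minimal sketch under the assumption that each model exposes a per-token probability distribution, which is averaged before greedy token selection:

```python
def average_predictions(per_model_probs):
    """Average per-token probability distributions from several models,
    then pick the argmax token at each position (greedy decoding).

    per_model_probs: list of models, each a list of per-position
    probability distributions over the vocabulary.
    """
    n_models = len(per_model_probs)
    n_positions = len(per_model_probs[0])
    decoded = []
    for t in range(n_positions):
        vocab_size = len(per_model_probs[0][t])
        avg = [sum(m[t][v] for m in per_model_probs) / n_models
               for v in range(vocab_size)]
        decoded.append(max(range(vocab_size), key=avg.__getitem__))
    return decoded

# toy example: 3 models, 2 token positions, vocabulary of 3 tokens
m1 = [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]]
m2 = [[0.5, 0.4, 0.1], [0.1, 0.6, 0.3]]
m3 = [[0.4, 0.5, 0.1], [0.3, 0.4, 0.3]]
print(average_predictions([m1, m2, m3]))  # → [0, 1]
```

In practice the averaging may instead happen over decoder logits inside beam search; the idea of combining per-token scores across models is the same.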

My Models

ViT-based models

  • Encoder: vit_deit_base_distilled_patch16_384
  • Decoder: TransformerDecoder
  • Loss: LabelSmoothingLoss
  • Augmentation: RandomScale, Cutout

There are two ViT-based models.
The second was retrained with stronger noise injection and augmentation.
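The encoder–decoder wiring above (a timm vision backbone feeding a TransformerDecoder) can be sketched as follows. This is a minimal sketch, not the training code in this repository: the real encoder is timm's `vit_deit_base_distilled_patch16_384`, but a small linear stub stands in for it here so the sketch stays self-contained, and all dimensions are illustrative.

```python
import torch
import torch.nn as nn

class ImageCaptioner(nn.Module):
    """Sketch: patch features from a vision encoder feed a
    TransformerDecoder that emits InChI tokens autoregressively."""

    def __init__(self, vocab_size=200, d_model=256, nhead=8,
                 num_layers=3, patch_dim=768, max_len=300):
        super().__init__()
        # stand-in for timm.create_model('vit_deit_base_distilled_patch16_384')
        self.encoder_proj = nn.Linear(patch_dim, d_model)
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, patch_feats, tokens):
        # patch_feats: (B, n_patches, patch_dim), tokens: (B, T)
        memory = self.encoder_proj(patch_feats)
        T = tokens.size(1)
        pos = torch.arange(T, device=tokens.device)
        tgt = self.tok_emb(tokens) + self.pos_emb(pos)
        # causal mask: True above the diagonal blocks attention to the future
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool,
                                     device=tokens.device), diagonal=1)
        out = self.decoder(tgt, memory, tgt_mask=mask)
        return self.head(out)  # (B, T, vocab_size)

model = ImageCaptioner()
feats = torch.randn(2, 577, 768)            # 2 images, 577 patch tokens
toks = torch.randint(0, 200, (2, 10))       # 2 partial token sequences
logits = model(feats, toks)
print(logits.shape)  # torch.Size([2, 10, 200])
```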

ViT model 1

  • Public LB: 0.77 (with normalization)
  • Private LB: 0.78 (with normalization)

ViT model 2

  • Public LB: 0.76 (with normalization)
  • Private LB: 0.77 (with normalization)
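Both ViT models (and the Swin model below) train with a LabelSmoothingLoss. The repository's own implementation is not shown here; this is a minimal sketch of standard label smoothing over decoder logits, with the smoothing factor illustrative:

```python
import torch
import torch.nn.functional as F

def label_smoothing_loss(logits, target, smoothing=0.1):
    """Cross-entropy against uniformly smoothed targets.

    logits: (N, vocab_size) raw scores, target: (N,) class indices.
    The true class keeps weight (1 - smoothing); the remaining mass
    is spread uniformly over the whole vocabulary.
    """
    log_probs = F.log_softmax(logits, dim=-1)
    nll = -log_probs.gather(1, target.unsqueeze(1)).squeeze(1)
    smooth = -log_probs.mean(dim=-1)
    return ((1.0 - smoothing) * nll + smoothing * smooth).mean()

logits = torch.tensor([[4.0, 1.0, 1.0], [1.0, 4.0, 1.0]])
target = torch.tensor([0, 1])
loss = label_smoothing_loss(logits, target)
# with smoothing=0.0 this reduces exactly to F.cross_entropy
```

Smoothing penalizes over-confident predictions, which tends to help when token labels are noisy, as with rendered molecular images.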

Swin Transformer-based model

  • Encoder: swin_base_patch4_window12_384_in22k
  • Decoder: TransformerDecoder
  • Loss: LabelSmoothingLoss
  • Augmentation: RandomScale, Cutout

Swin model

  • Public LB: 0.91 (with normalization)
  • Private LB: 0.92 (with normalization)
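Both encoder configurations list RandomScale and Cutout as augmentations. The README does not say which augmentation library is used, so here is a minimal numpy sketch of Cutout (zeroing one random square patch), with the patch size an illustrative parameter:

```python
import numpy as np

def cutout(image, size=16, rng=None):
    """Zero out one random size x size square in an H x W (x C) image.

    Returns a copy; the input image is left untouched.
    """
    if rng is None:
        rng = np.random.default_rng()
    img = image.copy()
    h, w = img.shape[:2]
    y = rng.integers(0, max(1, h - size + 1))
    x = rng.integers(0, max(1, w - size + 1))
    img[y:y + size, x:x + size] = 0
    return img

img = np.ones((64, 64), dtype=np.float32)
aug = cutout(img, size=16, rng=np.random.default_rng(0))
print(int((aug == 0).sum()))  # → 256 zeroed pixels for a 16x16 patch
```

For molecular drawings, Cutout forces the decoder to infer occluded atoms and bonds from context, which pairs naturally with the noise-injection retraining mentioned above.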
