
LOSS is skyrocketing when trying to train a NER classifier with transformers #9090


I'm glad to let you know that somehow the problem disappeared.
I ran it today with my old configs, using the new versions of spacy and spacy-transformers, and voilà:

ℹ Initial learn rate: 1e-05
E    #       LOSS TRANS...  LOSS NER  ENTS_F  ENTS_P  ENTS_R  SCORE
---  ------  -------------  --------  ------  ------  ------  ------
  0       0         898.86    701.86    1.00    0.52   11.43    0.01
  1     200       12545.41  13049.79   75.57   79.44   72.05    0.76
  2     400        2371.97   2280.48   82.97   83.34   82.60    0.83
  4     600        1049.16   1168.24   84.29   85.43   83.18    0.84
  5     800         507.07    606.52   85.73   86.99   84.50    0.86
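If you want to reproduce a run like the one above from Python rather than the `spacy train` CLI, something like the sketch below should work. It is only a sketch: the config path, output directory, and corpus paths are hypothetical placeholders for whatever your project actually uses, and it assumes a spaCy v3 config with a transformer + ner pipeline like the one discussed in this thread.

```python
# Minimal sketch: re-run an existing transformer-based NER config after
# upgrading spacy and spacy-transformers. All paths are hypothetical.
from importlib.metadata import version

from spacy.cli.train import train

# Confirm which versions are actually installed before training.
print("spacy:", version("spacy"))
print("spacy-transformers:", version("spacy-transformers"))

# Train from the unchanged config.cfg; the overrides point at the corpora,
# equivalent to:
#   python -m spacy train config.cfg --output output \
#     --paths.train corpus/train.spacy --paths.dev corpus/dev.spacy
train(
    "config.cfg",           # hypothetical path to the existing config
    output_path="output",   # hypothetical output directory
    overrides={
        "paths.train": "corpus/train.spacy",  # hypothetical .spacy files
        "paths.dev": "corpus/dev.spacy",
    },
)
```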

Labels: feat / ner (Named Entity Recognizer) · lang / uk (Ukrainian language data and models) · feat / transformer (Transformer)