I know this repository is a bit old, but I'm hoping someone can take a look at this issue. I successfully created the vocabulary and trained the model using all the default script settings on two of my own .txt files. However, when I try to run the inferencer so I can use the model on a prompt, I get this shape error:
Traceback (most recent call last):
  File "E:\Prompt-Generating-AI\tf-transformerxl-language-model\run_inferencer.py", line 134, in <module>
    app.run(main)
  File "C:\Users\gjsho\AppData\Local\Programs\Python\Python310\lib\site-packages\absl\app.py", line 308, in run
    _run_main(main, args)
  File "C:\Users\gjsho\AppData\Local\Programs\Python\Python310\lib\site-packages\absl\app.py", line 254, in _run_main
    sys.exit(main(argv))
  File "E:\Prompt-Generating-AI\tf-transformerxl-language-model\run_inferencer.py", line 123, in main
    token_id_list = inferencer.infer(tf.constant([prompt_token_ids]))
  File "E:\Prompt-Generating-AI\tf-transformerxl-language-model\model_runners.py", line 251, in infer
    _, memories = self._model(
  File "C:\Users\gjsho\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "E:\Prompt-Generating-AI\tf-transformerxl-language-model\model.py", line 176, in build
    self.add_weight(name='content_bias',
ValueError: In this `tf.Variable` creation, the initial value's shape ((9, 8, 64)) is not compatible with the explicitly supplied `shape` argument ((8, 64)).
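For reference, the ValueError itself is TensorFlow's built-in shape check in `tf.Variable`: when an explicit `shape` argument is supplied alongside an initial value whose shape is incompatible, it raises exactly this message. Here is a minimal standalone sketch that reproduces the same error with the shapes from my traceback (illustrative only, not the repo's model.py code):

```python
import tensorflow as tf

# Sketch of the underlying check (not the repo's code): tf.Variable compares
# the initial value's shape against an explicitly supplied `shape` argument
# and raises a ValueError when they are incompatible.
initial_value = tf.zeros([9, 8, 64])  # shape the initial value actually has
try:
    tf.Variable(initial_value, shape=tf.TensorShape([8, 64]))  # shape the weight was declared with
except ValueError as e:
    print(e)  # "In this `tf.Variable` creation, the initial value's shape ((9, 8, 64)) is not compatible ..."
```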