Replies: 3 comments
>>> arushngpl16
[August 5, 2019, 8:36am]
Hello,
I created my own lm.binary and trie (for just the digits one to nine)
using the native binaries, and I am trying to use them in the basic
Android app.
I placed them in the same location as the output_graph.tflite and used
this code:
this._m = new DeepSpeechModel(tfliteModel, N_CEP, N_CONTEXT, alphabet, BEAM_WIDTH);
this._m.enableDecoderWithLM(alphabetFile, lmFile, trieFile, LM_ALPHA, LM_BETA);
But when I run inference, it maps to words other than the ones in my
generated lm and trie, i.e. instead of inferring only the numbers one
to nine, it prints other words as well. There are no errors in the
logs either. Is there something I am missing? Is there a way to check
which lm file is being picked up?
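One way to rule out a path mix-up before calling enableDecoderWithLM is to verify on-device that each decoder file actually exists and is non-empty, so a silent fallback to some other LM is easier to spot. This is only a sketch using plain Java file checks; the paths and the helper below are hypothetical, not part of the DeepSpeech API:

```java
import java.io.File;

public class ModelFileCheck {
    // Hypothetical helper: returns true only if the path points to an
    // existing, non-empty regular file. Run this on the exact strings
    // you pass as lmFile and trieFile before enabling the decoder.
    static boolean isUsable(String path) {
        File f = new File(path);
        return f.isFile() && f.length() > 0;
    }

    public static void main(String[] args) {
        // Example paths (assumptions; use wherever your app copies its assets)
        String[] paths = {
            "/data/local/tmp/lm.binary",
            "/data/local/tmp/trie",
        };
        for (String p : paths) {
            System.out.println(p + " usable: " + isUsable(p));
        }
    }
}
```

If both files check out but decoding still ignores them, the mismatch is more likely in how the lm.binary/trie pair was built than in which file is loaded.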
[This is an archived TTS discussion thread from discourse.mozilla.org/t/inference-on-android-using-custom-language-model-and-trie]