Pre-trained BERT models have achieved impressive accuracy on natural language processing (NLP) tasks. However, their large number of parameters hinders efficient deployment on edge ...