Publications
Dynamic-TinyBERT: Boost TinyBERT's Inference Efficiency by Dynamic Sequence Length
Dynamic sequence-length reduction for a TinyBERT model - ___[ENLSP NeurIPS Workshop 2021](https://neurips2021-nlp.github.io)___
Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search
A framework for training any transformer with length drop and using it for anytime prediction via a multi-objective evolutionary search - ___[ACL 2021](https://2021.aclweb.org)___