LM

Dynamic-TinyBERT: Boost TinyBERT's Inference Efficiency by Dynamic Sequence Length Reduction

Dynamic sequence length reduction for a TinyBERT model - ___[ENLSP NeurIPS Workshop 2021](https://neurips2021-nlp.github.io)___
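
A rough sketch of the core idea, as one might implement it (not the paper's code, and the attention-based token-importance heuristic here is my assumption): between transformer layers, drop the tokens that receive the least attention so deeper layers process shorter sequences.

```python
import torch

def reduce_sequence(hidden, attention, keep_ratio=0.5):
    """hidden: (batch, seq, dim); attention: (batch, heads, seq, seq)."""
    _, seq, dim = hidden.shape
    keep = max(1, int(seq * keep_ratio))
    # Token importance = total attention it receives, averaged over heads.
    importance = attention.mean(dim=1).sum(dim=1)                     # (batch, seq)
    kept = importance.topk(keep, dim=-1).indices.sort(dim=-1).values  # keep original order
    idx = kept.unsqueeze(-1).expand(-1, -1, dim)
    return hidden.gather(1, idx)                                      # (batch, keep, dim)

hidden = torch.randn(2, 16, 64)
attention = torch.softmax(torch.randn(2, 4, 16, 16), dim=-1)
print(reduce_sequence(hidden, attention).shape)                       # torch.Size([2, 8, 64])
```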

Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search

Framework for training any transformer with length drop and using it for anytime prediction via a multi-objective evolutionary search - ___[ACL 2021](https://2021.aclweb.org)___
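
A minimal sketch of the length-drop idea (the names and sampling rule below are illustrative assumptions, not the paper's exact procedure): each training step samples a monotonically decreasing per-layer sequence length, so a single trained model later supports many length configurations, among which an evolutionary search can pick good accuracy/efficiency trade-offs.

```python
import random

def sample_length_config(full_length, num_layers, max_drop=0.2):
    """Draw a monotonically decreasing sequence length for each transformer layer."""
    lengths, current = [], full_length
    for _ in range(num_layers):
        current = max(1, round(current * (1.0 - random.uniform(0.0, max_drop))))
        lengths.append(current)
    return lengths

random.seed(0)
print(sample_length_config(full_length=128, num_layers=6))  # per-layer lengths shrink monotonically
```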

Large Product Key Memory for Pretrained Language Models

Improving the trade-off between accuracy and speed when fine-tuning pretrained language models by using a large product key memory and mitigating catastrophic drift with initialization and residual memory - ___[Findings of EMNLP 2020](https://2020.emnlp.org/)___
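
For context, a product-key memory (Lample et al., 2019) addresses a very large value table by splitting the query in two and combining small top-k searches over two sets of sub-keys; a stripped-down lookup might look like the sketch below (illustrative shapes and names, not the paper's implementation).

```python
import torch

def pkm_lookup(query, subkeys1, subkeys2, values, topk=4):
    """query: (dim,); subkeys*: (n, dim/2); values: (n*n, v_dim)."""
    q1, q2 = query.chunk(2)
    s1, i1 = (subkeys1 @ q1).topk(topk)              # best sub-keys for the first half
    s2, i2 = (subkeys2 @ q2).topk(topk)              # best sub-keys for the second half
    scores = (s1[:, None] + s2[None, :]).flatten()   # Cartesian-product key scores
    ids = (i1[:, None] * subkeys2.size(0) + i2[None, :]).flatten()
    best = scores.topk(topk)
    weights = torch.softmax(best.values, dim=-1)
    return weights @ values[ids[best.indices]]       # weighted sum of selected memory slots

n, dim, v_dim = 512, 32, 64                          # 512 x 512 = 262k addressable slots
out = pkm_lookup(torch.randn(dim), torch.randn(n, dim // 2),
                 torch.randn(n, dim // 2), torch.randn(n * n, v_dim))
print(out.shape)                                     # torch.Size([64])
```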

Subword Language Model for Query Auto-Completion

Use of a subword language model for faster query auto-completion, with a retrace algorithm and a reranking method based on approximate marginalization - ___[EMNLP-IJCNLP 2019](https://www.emnlp-ijcnlp2019.org/)___
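
A toy illustration of the retrace idea as I read it (the vocabulary, segmentation, and names below are stand-ins, not the paper's algorithm): a user's prefix may end in the middle of a subword, so decoding backs up to the nearest position where the prefix splits into whole subwords and treats the leftover characters as a constraint on the first generated subword.

```python
def greedy_segment(text, vocab):
    """Split text into subwords from vocab by longest match; None if impossible."""
    subwords = sorted(vocab, key=len, reverse=True)
    segments = []
    while text:
        match = next((s for s in subwords if text.startswith(s)), None)
        if match is None:
            return None
        segments.append(match)
        text = text[len(match):]
    return segments

def retrace_start(prefix, vocab, max_retrace=3):
    """Back up at most max_retrace characters to a valid subword boundary."""
    for back in range(max_retrace + 1):
        head, tail = prefix[: len(prefix) - back], prefix[len(prefix) - back:]
        segments = greedy_segment(head, vocab)
        if segments is not None:
            return segments, tail   # the LM decodes continuations that must start with `tail`
    return None, prefix

vocab = {"res", "tau", "rant", "re", "st", "a"}
print(retrace_start("restaur", vocab))   # (['res', 'tau'], 'r') after retracing one character
```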

Mimicry Resilient Program Behavior Modeling with LSTM based Branch Models

Anomaly detection robust to mimicry attacks via language modeling of branch sequences - ___[S&P 2018 DLS Workshop](https://www.ieee-security.org/TC/SPW2018/DLS/)___
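
The gist, sketched under my own assumptions (a small PyTorch LSTM rather than the models used in the paper): train a language model on sequences of event IDs, here branches or system calls, from benign executions, then flag a trace as anomalous when its average negative log-likelihood is unusually high.

```python
import torch
import torch.nn as nn

class SeqLM(nn.Module):
    """Tiny LSTM language model over discrete event IDs (branches, system calls, ...)."""
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def nll(self, seq):
        """Average negative log-likelihood of a 1-D LongTensor of event IDs."""
        x, y = seq[:-1].unsqueeze(0), seq[1:].unsqueeze(0)
        hidden, _ = self.lstm(self.embed(x))
        logits = self.out(hidden).squeeze(0)
        return nn.functional.cross_entropy(logits, y.squeeze(0)).item()

model = SeqLM(vocab_size=100)                # would be trained on benign traces
trace = torch.randint(0, 100, (50,))
score = model.nll(trace)
print("anomalous" if score > 5.0 else "normal", score)   # threshold picked on benign data
```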

LSTM-Based System-Call Language Modelling and Robust Ensemble Method for Designing Host-Based Intrusion Detection System

System-call language-modeling approach for designing anomaly-based host intrusion detection systems, with a novel ensemble method to enhance precision
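
The ensemble part, again as a hedged sketch rather than the paper's exact combination rule: several independently trained sequence language models score the same trace, and the averaged, per-model-normalized score drives the alarm decision, which tends to be more precise than any single model's verdict.

```python
def ensemble_anomaly_score(trace, score_fns, normalizers, threshold=1.0):
    """Each score_fn maps a trace to a negative log-likelihood; normalize, average, compare."""
    scores = [fn(trace) / z for fn, z in zip(score_fns, normalizers)]
    mean_score = sum(scores) / len(scores)
    return mean_score > threshold, mean_score

# Stand-in scorers; in practice each would be a separately trained LSTM language model,
# and each normalizer its mean NLL on held-out benign traces.
scorers = [lambda t: 3.9, lambda t: 4.0, lambda t: 3.8]
normalizers = [4.0, 4.0, 4.0]
print(ensemble_anomaly_score(["open", "read", "write", "close"], scorers, normalizers))
```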