I am a doctoral student in the Language Technologies Institute at Carnegie Mellon University, where I am advised by Yonatan Bisk and Emma Strubell.

My research interests are in the computational efficiency of machine learning models, with a focus on applications in natural language processing and computer vision. My research is generously supported by the NSF Graduate Research Fellowship.

Previously, I received my B.S. in Computer Science and B.S. in Electrical Engineering from Northwestern University, where I worked with Doug Downey and Thrasos Pappas. I've also spent time as a Software Engineer at Google.

Email: jaredfern [at] cmu.edu

[April 2021] Awarded an NSF Graduate Research Fellowship!
[Aug. 2020] Starting as a PhD student at Carnegie Mellon University!
[Nov. 2019] Starting full-time as a Software Engineer at Google!


Adapting to Gradual Distribution Shifts with Continual Weight Averaging

Workshop on High Dimensional Learning Dynamics at ICML, 2023.

The Framework Tax: Disparities Between Inference Efficiency in Research and Deployment

To appear at Empirical Methods in Natural Language Processing (EMNLP), 2023.

CIGLI: Conditional Image Generation from Language and Image

Fourth Workshop on Closing the Loop Between Vision & Language, 2021.

Generative Data Augmentation for Commonsense Reasoning

Findings of EMNLP, 2020.

CODAH: An Adversarially Authored Question-Answer Dataset for Common Sense

Workshop on Evaluating Vector Space Representations for NLP (RepEval), 2019.

Sampling Informative Training Data for RNN Language Models

ACL Student Research Workshop (ACL-SRW), 2018.

VecShare: A Framework for Sharing Word Representation Vectors

Empirical Methods in Natural Language Processing (EMNLP), 2017.