Energy Considerations of Large Language Model Inference and Efficiency Optimizations
Jared Fernandez, Clara Na, Vashisth Tiwari, Yonatan Bisk, Sasha Luccioni, Emma Strubell
April 2025
Type: Conference paper
Energy
Efficiency
Inference
Jared Fernandez is a PhD student at CMU LTI working on ML efficiency.
Related
Hardware Scaling Trends and Diminishing Returns in Large-Scale Distributed Training
The Framework Tax: Disparities Between Inference Efficiency in Research and Deployment
Holistically Evaluating the Environmental Impact of Creating Language Models
Sampling Informative Training Data for RNN Language Models