Dynamic Tensor Rematerialization
Release: release_hhp2aqux5zc4tnqunf6rqoeut4

by Marisa Kirisame, Steven Lyubomirsky, Altan Haan, Jennifer Brennan, Mike He, Jared Roesch, Tianqi Chen, Zachary Tatlock

Released as an article.

2020  

Abstract

Checkpointing enables training larger models by freeing intermediate activations and recomputing them on demand. Previous checkpointing techniques are difficult to generalize to dynamic models because they statically plan recomputations offline. We present Dynamic Tensor Rematerialization (DTR), a greedy online algorithm for heuristically checkpointing arbitrary models. DTR is extensible and general: it is parameterized by an eviction policy and only collects lightweight metadata on tensors and operators. Though DTR has no advance knowledge of the model or training task, we prove it can train an N-layer feedforward network on an Ω(√(N)) memory budget with only O(N) tensor operations. Moreover, we identify a general eviction heuristic and show how it allows DTR to automatically provide favorable checkpointing performance across a variety of models and memory budgets.
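
To make the abstract's algorithm concrete, below is a minimal sketch of a DTR-style runtime in Python. It is illustrative only: the Tensor and DTRRuntime classes, their fields, and the simplified cost / (size × staleness) eviction score are assumptions for exposition, not the paper's implementation (the paper's full heuristic, for instance, also accounts for the recompute cost of a tensor's evicted neighborhood).

    import time


    class Tensor:
        """Bookkeeping record for one tensor (names and fields are illustrative)."""

        def __init__(self, op, inputs, size, compute_cost):
            self.op = op                       # thunk that (re)computes the value
            self.inputs = inputs               # parent Tensor records
            self.size = size                   # bytes occupied when materialized
            self.compute_cost = compute_cost   # measured cost of running op
            self.value = None                  # storage; None while evicted
            self.last_access = time.monotonic()


    class DTRRuntime:
        """Greedy online checkpointing: evict by heuristic, recompute on demand."""

        def __init__(self, budget_bytes):
            self.budget = budget_bytes
            self.used = 0
            self.pool = set()    # materialized, evictable tensors
            self.pinned = set()  # tensors that must stay resident right now

        def heuristic(self, t, now):
            # Favor evicting tensors that are cheap to recompute, large,
            # and stale -- a simplification of the paper's general heuristic.
            staleness = max(now - t.last_access, 1e-9)
            return t.compute_cost / (t.size * staleness)

        def evict_until(self, needed):
            # Greedily evict the lowest-scoring tensor until the allocation fits.
            while self.used + needed > self.budget:
                candidates = self.pool - self.pinned
                if not candidates:
                    raise MemoryError("budget too small for this operation")
                now = time.monotonic()
                victim = min(candidates, key=lambda t: self.heuristic(t, now))
                self.pool.discard(victim)
                victim.value = None
                self.used -= victim.size

        def materialize(self, t):
            """Return t's value, recursively rematerializing evicted parents."""
            if t.value is None:
                for p in t.inputs:
                    self.materialize(p)
                    self.pinned.add(p)         # keep inputs resident while t runs
                self.evict_until(t.size)
                t.value = t.op(*[p.value for p in t.inputs])
                self.used += t.size
                self.pool.add(t)
                self.pinned -= set(t.inputs)
            t.last_access = time.monotonic()
            return t.value

A leaf input can be registered as a Tensor whose op closes over its data (e.g., Tensor(lambda: data, [], size, 0.0)); calling materialize on any downstream tensor then transparently pages evicted activations back in, evicting others as needed to stay within the budget. Because the runtime only tracks per-tensor metadata (size, cost, staleness) and decides evictions online, it needs no advance knowledge of the model, which is what lets this style of checkpointing handle dynamic control flow.
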

Archived Files and Locations

application/pdf  1.3 MB
file_pvmqixpdujap7pr54s4obnrgqi
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2020-06-17
Version   v1
Language   en
arXiv  2006.09616v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: b61e2976-8e60-4303-afcd-551017c70a6b