Latest commit: Logan Adams · 6dcced1d5c · Cleanup required_torch_version code and references. (#5370) · 6 months ago

| File | Last commit | Message | Age |
| --- | --- | --- | --- |
| __init__.py | b112c99ea8 | Fix loading a universal checkpoint (#5263) | 7 months ago |
| bwc.py | 08e0733e4a | Support MoE for pipeline models (#5338) | 6 months ago |
| comms_logging.py | 0b507253e5 | fix comm logging for inference (#4043) | 1 year ago |
| debug.py | 40342055ce | Remove hooks on gradient accumulation on engine/optimizer destroy (#4858) | 9 months ago |
| exceptions.py | b361c72761 | Update DeepSpeed copyright license to Apache 2.0 (#3111) | 1 year ago |
| groups.py | 08e0733e4a | Support MoE for pipeline models (#5338) | 6 months ago |
| init_on_device.py | b361c72761 | Update DeepSpeed copyright license to Apache 2.0 (#3111) | 1 year ago |
| logging.py | b361c72761 | Update DeepSpeed copyright license to Apache 2.0 (#3111) | 1 year ago |
| mixed_precision_linkage.py | 18179807f5 | Remove optimizer step on initialization (#5104) | 8 months ago |
| numa.py | 5dadf68771 | support HBM in utils/numa.py (#3918) | 1 year ago |
| nvtx.py | b361c72761 | Update DeepSpeed copyright license to Apache 2.0 (#3111) | 1 year ago |
| tensor_fragment.py | b112c99ea8 | Fix loading a universal checkpoint (#5263) | 7 months ago |
| timer.py | 697f945a05 | Split is_synchronized_device api to multiple apis (#5026) | 8 months ago |
| torch.py | 6dcced1d5c | Cleanup required_torch_version code and references. (#5370) | 6 months ago |
| types.py | 0a61d5d664 | Hybrid Engine Refactor and Llama Inference Support (#3425) | 1 year ago |
| z3_leaf_module.py | 19e0dc39ba | Delay reduce-scatter for ZeRO3 leaf modules (#5008) | 8 months ago |
| zero_to_fp32.py | 2518cc429d | Support `exclude_frozen_parameters` for `zero_to_fp32.py` script (#4979) | 8 months ago |