Commit History

| Author | SHA-1 | Message | Date |
| --- | --- | --- | --- |
| Polisetty V R K Jyothendra Varma | ac935c7fde | assumption of torch.initial_seed function accepting seed arg in DeepSpeedAccelerator abstract class is incorrect (#5569) | 4 months ago |
| vikram singh shekhawat | fa8458b1a8 | Add getter and setter methods for compile_backend across accelerators. (#5299) | 6 months ago |
| shiyuan680 | 3f875d9519 | add device config env for the accelerator (#5396) | 6 months ago |
| BacharL | 697f945a05 | Split is_synchronized_device api to multiple apis (#5026) | 8 months ago |
| inkcherry | d5a7c1e0b4 | Capture short kernel sequences to graph (#4318) | 10 months ago |
| minchao | 6d7b44a838 | [NPU] load EXPORT_ENV based on different accelerators to support multi-node training on other devices (#4830) | 10 months ago |
| RyanInnerpeace | 7b818ee961 | improve the way to determine whether a variable is None (#4782) | 10 months ago |
| CurryRice233 | 3e70a88715 | Add NPU FusedAdam support (#4343) | 1 year ago |
| Jeff Rasley | 12aedac6ce | add available memory check to accelerators (#4508) | 1 year ago |
| Liangliang-Ma | 1760627eb9 | Zero infinity xpu support (#4130) | 1 year ago |
| stephen youn | 0e0748c579 | adds triton flash attention2 kernel (#4337) | 1 year ago |
| CurryRice233 | 60d7b0a39d | add npu support dtypes (#4223) | 1 year ago |
| Earlee | 57a27b0803 | add type checker ignore to resolve that pylance can't resolved noqa annotation (#4102) | 1 year ago |
| hipudding | 23a11a3951 | Make Ascend NPU available (#3831) | 1 year ago |
| CurryRice233 | f3c8eacafc | Add Ascend NPU accelerator support (#3595) | 1 year ago |