| name | about | labels |
| --- | --- | --- |
| Bug Report | Use this template for reporting a bug | kind/bug |
The pangu_alpha network reports interface deprecation warnings when trained in 8-device mode on Ascend 910A.
Model path: https://e.gitee.com/mind_spore/repos/mindspore/models/tree/master/official/nlp/Pangu_alpha
Hardware Environment (Ascend/GPU/CPU) (Mandatory):
/device ascend
Software Environment (Mandatory):
- MindSpore version (e.g., 1.7.0.Bxxx):
- Python version (e.g., Python 3.7.5):
- OS platform and distribution (e.g., Linux Ubuntu 16.04):
- GCC/Compiler version (if compiled from source):
Failing version: r2.3_aa16e0b8 Milan_C17/20240408
Execution Mode (PyNative/Graph) (Mandatory):
/mode pynative
/mode graph
Test case path: solution_test/cases/01frame_func/19large_model_feature_test/distributed_training_capability
Related test case: test_ms_large_model_data_parallel_004.py
1. Modify src/pangu_alpha_config.py of the Pangu script in the models repo so that, for the 2.6B configuration, args_opt.op_level_model_parallel_num is chosen at random from 1, 2, 4, or 8 (see the first sketch after this list).
2. Replace every PipelineCell in train.py with GradAccumulationCell (see the second sketch after this list).
3. Launch training: bash scripts/run_distribute_train.sh /home/workspace/mindspore_dataset/pangu-data/pangu_30_step_bs64/ /home/workspace/config/hccl_8p.json 8 fp16 2.6B 1 4 8 0 8
4. Check that the loss converges and that the logs contain no errors.
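For illustration, a minimal sketch of the step-1 change; the SimpleNamespace below is a stand-in for the real argument object in src/pangu_alpha_config.py, which is an assumption rather than the actual structure of that file:

```python
# Sketch of step 1: randomize the op-level model parallel number for the
# 2.6B config. args_opt is a placeholder for the real argument object.
import random
from types import SimpleNamespace

args_opt = SimpleNamespace(op_level_model_parallel_num=8)  # placeholder default
args_opt.op_level_model_parallel_num = random.choice([1, 2, 4, 8])
print(args_opt.op_level_model_parallel_num)  # one of 1, 2, 4, 8 on each run
```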
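Step 2 amounts to a one-line substitution in train.py. A hedged sketch, assuming GradAccumulationCell takes the same (network, micro_size) arguments as the PipelineCell it replaces; the import location and signature are not verified against the r2.3 API:

```python
# Sketch of step 2: swap PipelineCell for GradAccumulationCell when wrapping
# the loss network for micro-batch execution.
from mindspore import nn

def wrap_network(net_with_loss: nn.Cell, micro_size: int) -> nn.Cell:
    # Original script (per this issue):
    #   return nn.PipelineCell(net_with_loss, micro_size)
    # Adapted script; the location and signature of GradAccumulationCell
    # are assumptions mirroring PipelineCell.
    return nn.GradAccumulationCell(net_with_loss, micro_size)
```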
Expected behavior: the loss converges normally and the logs contain no errors.
Actual behavior: the following deprecation warnings appear in the training logs:
mindspore/ccsrc/plugin/device/ascend/hal/hardware/ascend_collective_comm_lib.cc:126] Initialize] Launch Ascend distributed job in RankTable manner. This manner will be deprecated in later version of MindSpore.
mindspore/common/_decorator.py:40] 'Parameter' is deprecated from version 2.3 and will be removed in a future version, use 'add_pipeline_stage' instead.
Routing this to 刘崇鸣.
Root cause: deprecated-interface warnings.
Fix: the corresponding scripts in the models repo have already been adapted.
UT/ST additions needed: no.
Regression version:
master_20240505061518_ba6602334 Milan_C17/20240414
Regression steps: follow the reproduction steps in this issue.
Test conclusion: regression passed.
INFO 2024-05-05 21:03:02 - test_ms_large_model_data_parallel_004 - process_handle.py:is_process_exist:173 - No residual processes need to be cleaned.
INFO 2024-05-05 21:03:02 - test_ms_large_model_data_parallel_004 - base.py:teardown:140 - The base teardown is running
=============================================================================== warnings summary ================================================================================
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:549
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:549: UserWarning: The value of the smallest subnormal for <class 'numpy.float64'> type is zero.
setattr(self, word, getattr(machar, word).flat[0])
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:89
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:89: UserWarning: The value of the smallest subnormal for <class 'numpy.float64'> type is zero.
return self._float_to_str(self.smallest_subnormal)
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:549
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:549: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
setattr(self, word, getattr(machar, word).flat[0])
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:89
/home/miniconda3/envs/feature_39/lib/python3.9/site-packages/numpy/core/getlimits.py:89: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
return self._float_to_str(self.smallest_subnormal)
-- Docs: https://docs.pytest.org/en/latest/warnings.html
=================================================================== 1 passed, 4 warnings in 811.55s (0:13:31) ===================================================================
[INFO] RUNTIME(58676,python):2024-05-05-21:03:03.401.924 [runtime.cc:1831] 58676 ~Runtime: deconstruct runtime
[INFO] RUNTIME(58676,python):2024-05-05-21:03:03.401.976 [runtime.cc:1838] 58676 ~Runtime: wait monitor success, use=0.
Regression tester: 白梦真
Regression date: 2024.05.05