
Commit 070e580

andyanwang authored and pytorchmergebot committed
[MTIA Aten Backend] Migrate _log_softmax.out / _log_softmax_backward_data.out (pytorch#156539)
# Context

See the first PR pytorch#153670

# This diff

Migrate _log_softmax.out / _log_softmax_backward_data.out to in-tree.

Differential Revision: [D77044380](https://our.internmc.facebook.com/intern/diff/D77044380/)

Pull Request resolved: pytorch#156539
Approved by: https://github.com/malfet
ghstack dependencies: pytorch#156502
1 parent 93cd165 commit 070e580

File tree

1 file changed: +2 −0 lines changed


aten/src/ATen/native/native_functions.yaml

Lines changed: 2 additions & 0 deletions
@@ -3720,6 +3720,7 @@
   dispatch:
     CPU: log_softmax_cpu_out
     CUDA: log_softmax_cuda_out
+    MTIA: log_softmax_mtia_out
     MPS: log_softmax_mps_out
 
 - func: _log_softmax_backward_data(Tensor grad_output, Tensor output, int dim, ScalarType input_dtype) -> Tensor
@@ -3730,6 +3731,7 @@
   dispatch:
     CPU: log_softmax_backward_cpu_out
     CUDA: log_softmax_backward_cuda_out
+    MTIA: log_softmax_backward_mtia_out
     MPS: log_softmax_backward_mps_out
 
 - func: _logcumsumexp(Tensor self, int dim) -> Tensor
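With these dispatch entries in place, log_softmax calls on MTIA tensors route through the in-tree kernels named above. Below is a minimal usage sketch, assuming an MTIA-enabled PyTorch build with an available MTIA device; the `device="mtia"` string and the routing comments describe the dispatch keys added in this diff, not code that is part of the commit.

```python
# Minimal usage sketch (assumes an MTIA-enabled PyTorch build and device).
import torch
import torch.nn.functional as F

x = torch.randn(4, 8, device="mtia", requires_grad=True)

# Forward: log_softmax decomposes to _log_softmax, whose out-variant
# (_log_softmax.out) dispatches to log_softmax_mtia_out on MTIA tensors.
y = F.log_softmax(x, dim=-1)

# Backward: the gradient goes through _log_softmax_backward_data, whose
# out-variant dispatches to log_softmax_backward_mtia_out.
y.sum().backward()

print(x.grad.shape)  # torch.Size([4, 8])
```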
