Fix sigmoid gradient calculation and move sigmoid into a specialized op (#2114)

* add sigmoid op

* small fix

* add as a method on `Tensor`

* implement gradient calculation for sigmoid

* add sigmoid tests

* we should have a specialized op for this

* fix clippy

* fix clippy 2

* Revert all previous commits in favor of a `CustomOp` based solution

* use `CustomOp1` implementation

* fix rustfmt

* experimental add metal impl

* add cuda kernel impl

* fix fmt

* Add a test + reduce some cuda duplication.
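The gradient fix referenced above relies on the closed-form derivative of the sigmoid, σ'(x) = σ(x)·(1 − σ(x)), which lets a specialized op compute the backward pass from the forward result alone instead of differentiating through `1 / (1 + exp(-x))`. A minimal standalone Rust sketch of the math (an illustration only, not candle's actual `CustomOp1` implementation; the function names here are hypothetical):

```rust
// Forward: sigmoid(x) = 1 / (1 + e^{-x}).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Backward: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)).
// Note the derivative reuses the forward output, so a specialized op
// can compute the gradient without re-evaluating exp().
fn sigmoid_grad(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s)
}

fn main() {
    // At x = 0 the sigmoid is 0.5 and the gradient peaks at 0.25.
    println!("{} {}", sigmoid(0.0), sigmoid_grad(0.0));
    // For large |x| the gradient vanishes (saturation).
    println!("{}", sigmoid_grad(40.0));
}
```

Reusing the forward output in the backward pass is also what makes a fused CPU/CUDA/Metal kernel worthwhile here, since the naive autodiff graph recomputes intermediate terms.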

---------

Co-authored-by: laurent <laurent.mazare@gmail.com>
Author: MilkFather
Date: 2024-04-29 17:04:43 +08:00
Committed by: GitHub
Parent: ed7b99f525
Commit: 3bbb88fcb4
6 changed files with 214 additions and 5 deletions

@@ -129,7 +129,7 @@ macro_rules! ops{
 pub mod unary {
     ops!(
         cos, sin, exp, sqr, sqrt, neg, log, gelu, abs, ceil, floor, relu, round, erf, gelu_erf,
-        tanh, recip, silu, sign
+        tanh, recip, silu, sign, sigmoid
     );
 }
 pub mod binary {