Deprecate `clamp!(A, lo, hi)` for `broadcast!(clamp, A, A, lo, hi)`? #22236
Should be `broadcast!(clamp, A, lo, A, hi)`?

> `clamp!(A, lo, hi)` for `broadcast!(clamp, A, A, lo, hi)`?

My bad; the argument order for …
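For reference (my note, not part of the thread): Julia's scalar `clamp` takes its arguments as `clamp(x, lo, hi)`, and `broadcast!` puts the destination array first, so the elementwise in-place form is `broadcast!(clamp, A, A, lo, hi)`. A minimal sketch:

```
A = [0, 5, 10]

# clamp(x, lo, hi) restricts x to the interval [lo, hi];
# the first A is the destination, the second is the broadcast argument.
broadcast!(clamp, A, A, 2, 7)
@assert A == [2, 5, 7]

# The dot syntax spells the same in-place operation:
B = [0, 5, 10]
B .= clamp.(B, 2, 7)
@assert B == [2, 5, 7]
```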
```
julia> @benchmark clamp!([1,2,3,4,5,6,7,8,9,10], 2, 7)
BenchmarkTools.Trial:
  memory estimate:  160 bytes
  allocs estimate:  1
  --------------
  minimum time:     43.174 ns (0.00% GC)
  median time:      46.867 ns (0.00% GC)
  mean time:        50.633 ns (6.12% GC)
  maximum time:     550.135 ns (90.15% GC)
  --------------
  samples:          10000
  evals/sample:     987

julia> @benchmark clamp.([1,2,3,4,5,6,7,8,9,10], 2, 7)
BenchmarkTools.Trial:
  memory estimate:  320 bytes
  allocs estimate:  2
  --------------
  minimum time:     73.643 ns (0.00% GC)
  median time:      79.063 ns (0.00% GC)
  mean time:        86.230 ns (7.28% GC)
  maximum time:     606.447 ns (85.14% GC)
  --------------
  samples:          10000
  evals/sample:     974
```

Also, for a named variable it is much faster:

```
julia> a = randn(100000);

julia> @benchmark clamp.($a, 2, 7)
BenchmarkTools.Trial:
  memory estimate:  781.33 KiB
  allocs estimate:  2
  --------------
  minimum time:     90.881 μs (0.00% GC)
  median time:      97.769 μs (0.00% GC)
  mean time:        113.206 μs (11.85% GC)
  maximum time:     678.371 μs (82.18% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia> @benchmark clamp!($a, 2, 7)
BenchmarkTools.Trial:
  memory estimate:  0 bytes
  allocs estimate:  0
  --------------
  minimum time:     36.166 μs (0.00% GC)
  median time:      37.643 μs (0.00% GC)
  mean time:        37.689 μs (0.00% GC)
  maximum time:     96.818 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1
```
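A likely explanation for the gap above (my sketch, not from the thread): `clamp.(A, lo, hi)` allocates a fresh output array, while `clamp!(A, lo, hi)` overwrites `A` with zero allocations; benchmarking a literal array also includes the cost of constructing it each evaluation. A minimal illustration of the semantic difference:

```
A = collect(1:10)

# Out-of-place: allocates and returns a new array, A is untouched.
B = clamp.(A, 2, 7)
@assert B == [2, 2, 3, 4, 5, 6, 7, 7, 7, 7]
@assert A == collect(1:10)

# In-place: no allocation, A itself is overwritten.
clamp!(A, 2, 7)
@assert A == B
```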
You should benchmark against the in-place …
Uhm, almost: you have to allocate the array first. Anyway, you're right; in that case:

```
julia> using BenchmarkTools

julia> function foo()
           a = randn(10000)
           a .= clamp.(a, -1, 1)
       end
foo (generic function with 1 method)

julia> bar() = clamp!(randn(10000), -1, 1)
bar (generic function with 1 method)

julia> @benchmark foo()
BenchmarkTools.Trial:
  memory estimate:  78.20 KiB
  allocs estimate:  2
  --------------
  minimum time:     87.944 μs (0.00% GC)
  median time:      93.500 μs (0.00% GC)
  mean time:        96.606 μs (2.87% GC)
  maximum time:     943.194 μs (86.66% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia> @benchmark bar()
BenchmarkTools.Trial:
  memory estimate:  78.20 KiB
  allocs estimate:  2
  --------------
  minimum time:     84.859 μs (0.00% GC)
  median time:      87.922 μs (0.00% GC)
  mean time:        91.157 μs (3.08% GC)
  maximum time:     950.349 μs (86.95% GC)
  --------------
  samples:          10000
  evals/sample:     1
```
I actually get that the …

```
julia> a = randn(10_000); b = randn(10_000);

julia> f(x) = (x .= clamp.(x, -1, 1)); g(x) = clamp!(x, -1, 1);

julia> @benchmark f($a)
BenchmarkTools.Trial:
  memory estimate:  0 bytes
  allocs estimate:  0
  --------------
  minimum time:     8.120 μs (0.00% GC)
  median time:      8.125 μs (0.00% GC)
  mean time:        8.572 μs (0.00% GC)
  maximum time:     30.056 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     3

julia> @benchmark g($b)
BenchmarkTools.Trial:
  memory estimate:  0 bytes
  allocs estimate:  0
  --------------
  minimum time:     9.607 μs (0.00% GC)
  median time:      9.630 μs (0.00% GC)
  mean time:        12.250 μs (0.00% GC)
  maximum time:     110.585 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1
```
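One caveat with runs like the above (my note, hedged): `f($a)` and `g($b)` mutate the interpolated arrays across samples, so after the first sample every element already lies in `[-1, 1]` and later samples clamp an already-clamped array. Because `clamp` is idempotent the results stay valid, but BenchmarkTools' `setup` keyword can give every sample a fresh input, e.g.:

```
using BenchmarkTools

# Hypothetical reruns; `setup` re-creates `x` before each sample, so the
# mutation performed by one sample does not leak into the next.
@benchmark (x .= clamp.(x, -1, 1)) setup = (x = randn(10_000)) evals = 1
@benchmark clamp!(x, -1, 1)        setup = (x = randn(10_000)) evals = 1
```

The idempotence claim is easy to check: clamping twice gives the same array as clamping once.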
Yes, but I was claiming that …
Perhaps the following alternatives are sufficient? :)

```
julia> @benchmark [clamp(randn(), -1, 1) for _ in 1:1000]
BenchmarkTools.Trial:
  memory estimate:  7.94 KiB
  allocs estimate:  1
  --------------
  minimum time:     8.782 μs (0.00% GC)
  median time:      9.878 μs (0.00% GC)
  mean time:        11.005 μs (0.00% GC)
  maximum time:     779.968 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia> @benchmark collect(clamp(randn(), -1, 1) for _ in 1:1000)
BenchmarkTools.Trial:
  memory estimate:  7.94 KiB
  allocs estimate:  1
  --------------
  minimum time:     8.842 μs (0.00% GC)
  median time:      10.252 μs (0.00% GC)
  mean time:        11.660 μs (0.00% GC)
  maximum time:     675.692 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia> @benchmark map(_ -> clamp(randn(), -1, 1), 1:1000)
BenchmarkTools.Trial:
  memory estimate:  7.94 KiB
  allocs estimate:  1
  --------------
  minimum time:     8.834 μs (0.00% GC)
  median time:      9.902 μs (0.00% GC)
  mean time:        10.910 μs (0.00% GC)
  maximum time:     651.152 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia> @benchmark (_ -> clamp(randn(), -1, 1)).(1:1000)
BenchmarkTools.Trial:
  memory estimate:  7.94 KiB
  allocs estimate:  1
  --------------
  minimum time:     8.877 μs (0.00% GC)
  median time:      10.438 μs (0.00% GC)
  mean time:        12.043 μs (0.00% GC)
  maximum time:     800.743 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia> @benchmark clamp!(randn(1000), -1, 1)
BenchmarkTools.Trial:
  memory estimate:  7.94 KiB
  allocs estimate:  1
  --------------
  minimum time:     8.462 μs (0.00% GC)
  median time:      9.914 μs (0.00% GC)
  mean time:        11.508 μs (0.00% GC)
  maximum time:     662.747 μs (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1
```
Note that `clamp!(A, lo, hi)` and `broadcast!(clamp, A, A, lo, hi)` behave exactly identically: there is no excess allocation step.
That is,
```
let A = randn(1000)
A .= clamp.(A, lo, hi)
end
```
is exactly equivalent to
```
clamp!(randn(1000), lo, hi)
```
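The equivalence claim is easy to verify directly (my sketch, with concrete `lo` and `hi` filled in for illustration):

```
lo, hi = -1.0, 1.0
A = randn(1000)
B = copy(A)

A .= clamp.(A, lo, hi)   # fused in-place broadcast; no temporary array
clamp!(B, lo, hi)        # the existing in-place method

@assert A == B
@assert all(x -> lo <= x <= hi, A)
```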
I have to confess that I don't see the urgency of removing …
Should we go ahead with this, since we do the same for …?
I'd really rather not. I find all the alternate spellings here far less convenient, or less obvious, or both.
IMO …
Isn't it a bit weird to keep this one function when we have deprecated essentially all other auto-broadcasting functions?
What's the alternative spelling again?
Note that the same argument cannot be made in the other cases: …
Fair enough; I guess we can close this issue, then.
Regrettably, vectorized …
They remain because they are vector operations, not vectorized operations.
Clarification from a Slack conversation: Stefan's argument is that (1) one can view … IIRC, this argument differs from prior arguments, which went roughly as follows: … (Also IIRC, auditing usage of …) Best!
Make …
You mean …?
No, I mean make the array-valued …
My response was a bit tongue-in-cheek :). More explicitly, I meant to imply that were you to make …
Should we deprecate `clamp!(A, lo, hi)` for `broadcast!(clamp, A, A, lo, hi)`? Note the latter can be written `A .= clamp.(A, lo, hi)`.

Related: #22235 (comment)
Part of #20402
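For concreteness, the proposed change could have been sketched with Base's `@deprecate` macro (hypothetical; the thread above leans toward keeping `clamp!`, and it remains in Julia today):

```
# Hypothetical deprecation, had the proposal been accepted:
#     @deprecate clamp!(A::AbstractArray, lo, hi) broadcast!(clamp, A, A, lo, hi)

# Callers would then migrate to the dot syntax:
A = randn(8)
A .= clamp.(A, 0, 1)   # replacement spelling for clamp!(A, 0, 1)
@assert all(x -> 0 <= x <= 1, A)
```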