Handle multithreading #114
Currently the Julia compiler doesn't run type inference on threaded code like this:

```julia
julia> code_typed(; optimize = false) do
           fetch(Threads.@spawn 1 + 2)
       end |> first
CodeInfo(
1 ─       (#34 = %new(Main.:(var"#34#36")))::Core.Const(var"#34#36"())
│   %2  = #34::Core.Const(var"#34#36"())
│         (task = Base.Threads.Task(%2))::Task
│   %4  = false::Core.Const(false)
│         Base.setproperty!(task, :sticky, %4)::Any
└──       goto #3 if not false
2 ─       Core.Const(:(Base.Threads.put!(Main.:(var"##sync#41"), task)))::Union{}
3 ┄       Base.Threads.schedule(task)::Any
│   %9  = task::Task
│   %10 = Main.fetch(%9)::Any
└──       return %10
) => Any
```

I think JuliaLang/julia#39773 is an attempt to enable various optimizations, including type inference, for multithreading contexts like this (am I right, @tkf?). That said, for the time being, I'm willing to make JET special-case some multithreading code and enable JET analysis on e.g. …
I've only ever used …
For the time being, we will add a special analysis pass to analyze multithreading code written with the `Threads.@spawn` and `Threads.@threads` macros, in which threaded code is represented as a closure. `NativeInterpreter` doesn't run type inference or optimization on the bodies of those closures when compiling threading code, but JET will try to run an additional analysis pass by recursing into the closures. NOTE: JET won't do anything other than JET analysis; e.g., it won't annotate the return type of the threaded code block, so as not to confuse the original `AbstractInterpreter` routine.
In Julia's task parallelism implementation, parallel code is represented as a closure wrapped in a `Task` object. `NativeInterpreter` doesn't run type inference or optimization on the bodies of those closures when compiling code that creates parallel tasks, but JET will try to run an additional analysis pass by recursing into the closures. NOTE: JET won't do anything other than JET analysis; e.g., it won't annotate the return type of the wrapped code block, so as not to confuse the original `AbstractInterpreter` routine.
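To make that concrete, here is a rough, hand-written sketch of what `Threads.@spawn expr` lowers to, based on the `code_typed` output above; the names and structure are illustrative, not the exact macro expansion (which varies across Julia versions):

```julia
# Rough approximation of `Threads.@spawn 1 + 2` (illustrative, not the exact expansion)
let
    f = () -> 1 + 2          # the spawned body becomes an anonymous closure
    task = Base.Task(f)      # ... which is wrapped in a `Task`
    task.sticky = false      # allow the task to migrate across threads
    schedule(task)           # the caller only ever sees the `Task` object,
    task                     # so `NativeInterpreter` never infers the closure body
end
```

Because the caller only deals with the `Task`, recursing into `f` is exactly the extra step JET has to take on its own.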
After #124:

```julia
julia> using JET

julia> foo() = fetch(Threads.@spawn 1 + "foo")
foo (generic function with 1 method)

julia> @report_call foo()
═════ 1 possible error found ═════
┌ @ threadingconstructs.jl:174 Base.Threads.Task(#3)
│┌ @ task.jl:5 #self#(f, 0)
││┌ @ threadingconstructs.jl:170 Main.+(1, "foo")
│││ no matching method found for call signature: Main.+(1, "foo")
││└──────────────────────────────
Any
```

For technical reasons, there is a limitation that JET currently runs the additional analysis pass on …
I think a challenging point here is that:

```julia
julia> f() = 1
f (generic function with 1 method)

julia> t = Task(f)
Task (runnable) @0x00007f5c77464c40

julia> f() = 2
f (generic function with 1 method)

julia> schedule(t)
Task (done) @0x00007f5c77464c40

julia> fetch(t)
2
```

If we can change how …

You can find related discussions here: …
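For context, a minimal illustration of why this is tricky (my own example, not from the discussion above): the method table keeps advancing between the moment the `Task` wraps the closure and the moment it actually runs, so any analysis done at `Task`-creation time can be invalidated by a later redefinition:

```julia
# Illustrative only: redefining `f` bumps the world-age counter, so the task
# ends up running against a newer method table than the one that existed when
# the Task (and any static analysis of it) was created.
julia> f() = 1; w1 = Base.get_world_counter();

julia> t = Task(f);

julia> f() = 2; w2 = Base.get_world_counter();

julia> w2 > w1   # the redefinition created a newer world
true
```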
We've also noticed that:

```julia
julia> function abmult(r::Int)
           if r < 0
               r = -r
           end
           # the closure assigned to `f` makes the variable `r` captured
           f = x -> x * r
           return f
       end;

julia> JET.@report_opt abmult(42)
═════ 3 possible errors found ═════
┌ @ REPL[97]:2 r = Core.Box(_7::Int64)
│ captured variable `r` detected
└──────────────
┌ @ REPL[97]:2 %7 < 0
│ runtime dispatch detected: (%7::Any < 0)::Any
└──────────────
┌ @ REPL[97]:3 -(%14)
│ runtime dispatch detected: -(%14::Any)::Any
└──────────────

julia> JET.@report_opt fetch(Threads.@spawn abmult(42))
═════ 3 possible errors found ═════
┌ @ task.jl:360 wait(t)
│┌ @ task.jl:343 Base._wait(t)
││┌ @ task.jl:301 lock(%13)
│││ runtime dispatch detected: lock(%13::Any)::Any
││└───────────────
││┌ @ task.jl:304 wait(%30)
│││ runtime dispatch detected: wait(%30::Any)::Any
││└───────────────
││┌ @ task.jl:307 unlock(%40)
│││ runtime dispatch detected: unlock(%40::Any)::Any
││└───────────────
```

Note that the boxed variable is not reported when …
@Drvi fyi you can do:

```julia
julia> g() = Threads.@spawn abmult(42)
g (generic function with 1 method)

julia> JET.@report_opt g()
═════ 3 possible errors found ═════
┌ g() @ Main ./threadingconstructs.jl:377
│┌ Task(f::var"#10#11") @ Base ./task.jl:5
││┌ (::var"#10#11")() @ Main ./threadingconstructs.jl:373
│││┌ abmult(r::Int64) @ Main ./REPL[11]:2
││││ captured variable `r` detected
│││└────────────────────
│││┌ abmult(r::Int64) @ Main ./REPL[11]:2
││││ runtime dispatch detected: (%6::Any < 0)::Any
│││└────────────────────
│││┌ abmult(r::Int64) @ Main ./REPL[11]:3
││││ runtime dispatch detected: -(%13::Any)::Any
│││└────────────────────
```
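Not shown in this thread, but since the special pass is described as covering `Threads.@threads` as well, the same wrapper-function trick should presumably also work for threaded loops; a hypothetical, untested sketch (`h` is my own name, output omitted):

```julia
# Hypothetical example: `Threads.@threads` likewise lowers its loop body to a
# closure, so wrapping the loop in a function should let JET recurse into it.
julia> function h()
           Threads.@threads for i in 1:10
               abmult(i)   # reuses `abmult` from the comment above
           end
       end
h (generic function with 1 method)

julia> JET.@report_opt h()   # output omitted; expected to surface similar reports
```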
Small example: …

Output: …