Testing NNLib / Lux / Flux #105

Open
gdalle opened this issue Mar 26, 2024 · 16 comments
Labels: downstream (Related to downstream compatibility), test (Related to the testing subpackage)

Comments


gdalle commented Mar 26, 2024

Lower-hanging fruit: NNLib.jl, because there are fewer weird structs; it's mostly arrays.

Cross-referencing:

@gdalle gdalle changed the title Testing NNLib Testing NNLib / Flux.Losses Mar 26, 2024
@gdalle gdalle changed the title Testing NNLib / Flux.Losses Testing NNLib Mar 26, 2024

gdalle commented Mar 26, 2024

Slow

  • Replace several calls to grad_test with a vector of scenarios, like in softmax.jl and then scatter.jl (a sketch of what such a scenario vector could look like follows below)

Fast
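
A hypothetical shape for such a scenario vector (illustrative names and checks only, not the actual grad_test helper or the DifferentiationInterfaceTest API):

```julia
using NNlib, Zygote, FiniteDifferences, Test

# Each scenario bundles a function and an input; a single loop then replaces
# many separate grad_test calls (softmax shown here, scatter would be similar).
scenarios = [
    (f = x -> sum(abs2, softmax(x)),    x = rand(4, 3)),
    (f = x -> sum(abs2, logsoftmax(x)), x = rand(4, 3)),
]

@testset "gradient scenarios" begin
    for sc in scenarios
        g_ad = Zygote.gradient(sc.f, sc.x)[1]                            # AD gradient
        g_fd = FiniteDifferences.grad(central_fdm(5, 1), sc.f, sc.x)[1]  # reference
        @test g_ad ≈ g_fd rtol = 1e-4
    end
end
```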

@avik-pal

If you want to be adventurous, you can change https://github.com/LuxDL/LuxTestUtils.jl, and all downstream CPU tests in Lux will be triggered (we just need to copy one of the Buildkite files from LuxLib to trigger the CUDA + AMDGPU tests).


gdalle commented Mar 27, 2024

Don't tempt me Avik

@avik-pal
Copy link

On a serious note though, I had to write it to mostly deal with arrays, or at least convert structures to arrays (https://github.com/LuxDL/LuxTestUtils.jl/blob/143a51f0d2fb4cbc75ea583c706ff5194be103d2/src/LuxTestUtils.jl#L387-L398), so that could be helpful for writing your test suite. (But this is also terribly inefficient, it only tests correctness, and definitely don't combine @test_gradients with @jet.)
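
For context, one generic way to go back and forth between nested parameter structures and flat arrays looks like this (shown with Optimisers.destructure purely as an illustration; it is not necessarily what the linked LuxTestUtils code does):

```julia
using Optimisers

ps = (weight = rand(2, 3), bias = rand(2))   # nested parameter structure
flat, re = Optimisers.destructure(ps)        # flat::Vector, re rebuilds the structure
ps_back = re(flat)                           # same nesting as ps
```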


gdalle commented Mar 27, 2024

Are the tests of LuxTestUtils already interesting to run locally, or should we wait for the Downstream CI every time?

@avik-pal

No, the tests there do practically nothing; it all happens via the downstream CI.


avik-pal commented Mar 27, 2024

But the Lux test suite doesn't take long: ~10 minutes on a nicer machine (like the Buildkite ones), while the GitHub Actions runners take longer, ~30 minutes.

If you want to test locally, set RETESTITEMS_NWORKERS and it will be much faster
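
For example, a local run could be sped up roughly like this (the worker count is illustrative; RETESTITEMS_NWORKERS is read by ReTestItems.jl):

```julia
using Pkg

ENV["RETESTITEMS_NWORKERS"] = "8"   # let ReTestItems.jl spawn 8 test workers
Pkg.test("Lux")                     # assumes Lux is a dependency of the active environment
```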


gdalle commented Mar 27, 2024

So the workflow is to:

  1. fork LuxTestUtils.jl and Lux.jl
  2. put my own gradient callers in LuxTestUtils.jl
  3. dev LuxTestUtils.jl into the test environment of Lux.jl
  4. test Lux.jl

right?
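
In Pkg terms, that would look roughly as follows (a sketch only; the paths are illustrative and the exact mechanics depend on how Lux.jl declares its test environment):

```julia
using Pkg

# steps 2-3: make Lux.jl's test environment use the locally modified LuxTestUtils.jl
Pkg.activate("Lux.jl/test")
Pkg.develop(path="LuxTestUtils.jl")   # path to the local fork, relative to the working directory

# step 4: run the Lux.jl test suite
Pkg.activate("Lux.jl")
Pkg.test()
```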

@avik-pal

If you want to test locally yes.


gdalle commented Mar 27, 2024

Any suggestions on dealing with multiple arguments? Is wrapping them in a ComponentVector always gonna work, or are there non-array structs in the mix?


gdalle commented Mar 27, 2024

DifferentiationInterface only accepts a single input


gdalle commented Mar 27, 2024

I'm thinking of using https://docs.julialang.org/en/v1/base/base/#Base.splat on a ComponentVector.
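
For reference, Base.splat turns a function of several arguments into a function of a single collection, which is the part that would bridge to a single-input API:

```julia
f(a, b) = sum(a) + sum(b)      # toy multi-argument function
g = Base.splat(f)              # g takes one collection and splats it: g((a, b)) == f(a, b)
g((rand(3), rand(2)))
```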

@avik-pal

Based on how the tests are written, for multiple arguments I assume any non-array is non-differentiable (this is a testing package, so I can assume that), so those get filtered out in https://github.com/LuxDL/LuxTestUtils.jl/blob/143a51f0d2fb4cbc75ea583c706ff5194be103d2/src/LuxTestUtils.jl#L357-L383. After that there are two possibilities: 1) the backend supports multiple arguments, in which case they are just forwarded; 2) all other cases use a ComponentArray and create a closure which unflattens the ComponentArray to provide the correct arguments.
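
A minimal sketch of that closure approach for the single-input case (illustrative only, not the actual LuxTestUtils implementation), assuming the differentiable arguments are all arrays:

```julia
using ComponentArrays, DifferentiationInterface
using ADTypes: AutoZygote
import Zygote

loss(W, b, x) = sum(abs2, W * x .+ b)   # x is treated as non-differentiable here

W, b, x = rand(2, 3), rand(2), rand(3)

θ = ComponentVector(W = W, b = b)       # pack the differentiable args into one vector
closure = θ -> loss(θ.W, θ.b, x)        # "unflatten" inside the closure

g = gradient(closure, AutoZygote(), θ)  # single-input DifferentiationInterface call
```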


gdalle commented Mar 27, 2024

I'll see what I can do once our own testing interface stabilizes. Step one would be to replace your gradient calls, but we can actually aim to replace your entire testing macro.

@avik-pal

> I'll see what I can do once our own testing interface stabilizes. Step one would be to replace your gradient calls, but we can actually aim to replace your entire testing macro.

Correct. I had planned to replace the API with something like skip = [AutoTracker(), ...] and broken = [AutoReverseDiff(), ...], but eventually we might use DI.

@gdalle gdalle added the downstream Related to downstream compatibility label Mar 28, 2024
@gdalle gdalle changed the title Testing NNLib Testing NNLib / Lux Mar 28, 2024
@gdalle gdalle added the test Related to the testing subpackage label Mar 28, 2024
@gdalle gdalle changed the title Testing NNLib / Lux Testing NNLib / Lux / Flux Jul 15, 2024
@gdalle gdalle mentioned this issue Jul 19, 2024