Neural Hybrid Differential Equations and Adjoint Sensitivity Analysis
GSoC 2021 with the NumFOCUS organization: final report.
In this project, we have implemented state-of-the-art sensitivity tools for chaotic dynamical systems, continuous adjoint sensitivity methods for hybrid differential equations, as well as a high-level API for automatic differentiation.
Possible fields of application for these tools range from model discovery with explicit dosing times in pharmacology, through accurate gradient estimates for chaotic fluid dynamics, to the control of open quantum systems. A more detailed summary is available on the GSoC page.
The following blog posts describe the work throughout the GSoC period in more detail:
- Neural Hybrid Differential Equations
- Shadowing Methods for Forward and Adjoint Sensitivity Analysis of Chaotic Systems
- Sensitivity Analysis of Hybrid Differential Equations
- AbstractDifferentiation.jl for AD-backend agnostic code
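As a small illustration of the backend-agnostic API described in the last post, here is a minimal sketch. It assumes AbstractDifferentiation.jl and ForwardDiff.jl are installed, and it follows the package's early API (names such as `ForwardDiffBackend` may differ in later releases); the test function is made up for illustration:

```julia
import AbstractDifferentiation as AD
import ForwardDiff

# Pick a concrete backend; the same high-level call works with any
# other supported backend (e.g. a reverse-mode one) without changing f.
backend = AD.ForwardDiffBackend()

f(x) = sum(abs2, x)  # simple smooth test function, f(x) = Σᵢ xᵢ²

# AD.gradient returns a tuple with one gradient per differentiated argument.
(g,) = AD.gradient(backend, f, [1.0, 2.0, 3.0])
# g ≈ [2.0, 4.0, 6.0], i.e. 2x
```

The point of the design is that downstream code only talks to the `AD.gradient`/`AD.jacobian` layer, so swapping AD backends is a one-line change.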
Documentation for the adjoint sensitivity tools will be available on the local sensitivity analysis page and on the page about controlling automatic differentiation choices.
Below is a list of PRs in the various repositories in chronological order.
- Add additive noise downstream test for DiffEqFlux
- DiscreteCallback fixes
- Allow for changes of p in callbacks
- Fix for using the correct uleft/pleft in continuous callback
- Fix broadcasting error on steady state adjoint
- Forward Least Squares Shadowing (LSS)
- Adjoint-mode for the LSS method
- concrete_solve dispatch for LSS methods
- Non-Intrusive Least Square Shadowing (NILSS)
- concrete_solve for NILSS
- Remove allocation in NILSS
- Handle additional callback case
- State-dependent Continuous Callbacks for BacksolveAdjoint
- QuadratureAdjoint() for ContinuousCallback
- More tests for Neural ODEs with callbacks for different sensitivity algorithms
- Support for PeriodicCallbacks in continuous adjoint methods
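To show how these pieces fit together, here is a hedged sketch of differentiating through a dosing event, in the spirit of the pharmacology use case mentioned above. It assumes the 2021-era OrdinaryDiffEq/DiffEqSensitivity/DiffEqCallbacks/Zygote APIs; the decay model, dose time, and loss function are made up for illustration and are not taken from any of the PRs:

```julia
using OrdinaryDiffEq, DiffEqSensitivity, DiffEqCallbacks, Zygote

# Exponential decay with a discrete dose added at t = 4 (a hybrid ODE).
f(u, p, t) = -p[1] * u
dose!(integrator) = (integrator.u .+= 1.0)  # event: bump the state
cb = PresetTimeCallback([4.0], dose!)

function loss(p)
    prob = ODEProblem(f, [1.0], (0.0, 8.0), p)
    # BacksolveAdjoint is one of the continuous adjoints that gained
    # callback support in the PRs above; other sensealg choices work too.
    sol = solve(prob, Tsit5(); callback = cb, saveat = 0.5,
                sensealg = BacksolveAdjoint())
    sum(abs2, Array(sol))
end

# Reverse-mode gradient of the loss with respect to the decay rate.
grad, = Zygote.gradient(loss, [0.5])
```

The key point is that the event handling (here via `PresetTimeCallback`) is accounted for in the adjoint pass, so the gradient correctly reflects the jump in the state at the dosing time.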
Besides implementing further shadowing methods, we are planning to
- benchmark the new adjoints,
- refine the AbstractDifferentiation.jl package and use it within DiffEqSensitivity.jl,
- add more docs and examples.
If you have any further suggestions or comments, check out our Slack/Zulip channels #sciml-bridged and #diffeq-bridged or the Julia language Discourse.
Many thanks to my mentors Chris Rackauckas, Moritz Schauer, Yingbo Ma, and Mohamed Tarek for their unique, continuous support. It was a great opportunity to be part of such an inspiring collaboration. I highly appreciate our quick and flexible meeting times. I would also like to thank Christoph Bruder, Julian Arnold, and Martin Koppenhöfer for helpful comments on my blog posts. Special thanks to Michael Poli and Stefano Massaroli for their suggestions on adjoints for hybrid differential equations. Finally, thanks to the very supportive Julia community and to Google’s open source program for funding this experience!