Add Ai2z solver #125
Conversation
Looks great! I added a couple of comments prior to merging; take a look when you get the chance.
Glad to see this moving forward 🎉 thanks to the NV team 👍
For the nonlinear activation function, the key part is the implementation of
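For context, the ReLU abstract transformer on zonotopes used in Ai2-style analyses is typically a case split on the concrete bounds [l, u] of each neuron, introducing one fresh noise term in the mixed case. The following is a minimal single-neuron Python sketch of that standard relaxation, not the NV implementation:

```python
import numpy as np

def relu_zonotope_1d(c, G):
    """Sound 1-D ReLU transformer for a zonotope projection <c, G>.

    c: scalar center of the neuron's value, G: 1-D array of generator
    coefficients. Returns (c', G'); G' may gain one fresh generator.
    This is an illustrative sketch of the usual relaxation, not NV's code.
    """
    l = c - np.sum(np.abs(G))       # concrete lower bound of the neuron
    u = c + np.sum(np.abs(G))       # concrete upper bound of the neuron
    if l >= 0:                       # ReLU acts as the identity on [l, u]
        return c, G
    if u <= 0:                       # ReLU maps all of [l, u] to 0
        return 0.0, np.zeros_like(G)
    lam = u / (u - l)                # slope of the sound linear relaxation
    mu = -lam * l / 2                # center offset = magnitude of new noise
    return lam * c + mu, np.append(lam * G, mu)

# Example: neuron value in [-1, 1] is over-approximated by center 0.25
# with generators [0.5, 0.25], i.e. the interval [-0.5, 1.0] ⊇ ReLU([-1, 1]).
c2, G2 = relu_zonotope_1d(0.0, np.array([1.0]))
```

The mixed case is where the over-approximation (and hence the precision loss discussed below) comes from.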
Good point, we will add a reference there! Speaking of
Unfortunately, the support function of a ReLU cannot be evaluated efficiently AFAIK (it was discussed at some length in this thread, and the material there could be useful to investigate further, but I don't have high hopes); hence one can't work lazily with layers, or do things like iterative refinement of a Rectification efficiently. Regarding concretization using zonotopes, as in the new Ai2z implementation: to improve precision, one could split the zonotopes along their generators; those methods are already available in LazySets, and @SebastianGuadalupe was considering applying that idea to NV (maybe worth opening an issue here?).
That's a real shame... Thanks for digging up that thread.
We'd be happy to incorporate further improvements to Ai2! On another matter, does LazySets offer any way of calculating or approximating the amount of over-approximation introduced by various operations? For example, after over-approximating with some set/algorithm, is there any way to know the percentage of over-approximation (without computing the exact set too, of course)?
There is relevant literature from the 90's (see works by Kamenev and also by Lotov), some of which we have implemented in LazySets for the 2D case; see the "Polyhedral approximation of a convex set" docs. It's available in the LazySets.Approximations.overapproximate functions that take an iterative-refinement parameter ε, which is an upper bound on the approximation error measured in the Hausdorff distance. The n-dimensional case, which we haven't implemented, is conceptually the same as in 2D, but an efficient implementation would require familiarity with the double description method (see works by Fukuda); relevant open issues are LazySets#969 and LazySets#1572.
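The 2D refinement idea above can be sketched as follows: given oracles for the support function `rho` and a support vector `sigma` of a convex set (both assumed here for illustration; this is not the LazySets implementation), recursively bisect between two directions until the gap between the inner chord and the outer supporting line, which bounds the local Hausdorff error, drops below ε:

```python
import numpy as np

def refine(sigma, rho, d1, d2, eps):
    """Kamenev-style 2D refinement between directions d1 and d2 (CCW order).
    Returns inner-approximation vertices for this arc, excluding sigma(d2)."""
    p1, p2 = sigma(d1), sigma(d2)
    e = p2 - p1
    n = np.array([e[1], -e[0]])          # outward normal of the chord p1 -> p2
    if np.allclose(n, 0):
        return [p1]
    n = n / np.linalg.norm(n)
    err = rho(n) - n @ p1                # gap chord <-> supporting hyperplane
    if err <= eps:                       # local Hausdorff error small enough
        return [p1]
    d = (d1 + d2) / np.linalg.norm(d1 + d2)   # bisect the direction arc
    return refine(sigma, rho, d1, d, eps) + refine(sigma, rho, d, d2, eps)

# Example: approximate the unit disk (rho(d) = ||d||, sigma(d) = d/||d||)
# to Hausdorff precision 0.01, starting from the four axis directions.
rho = np.linalg.norm
sigma = lambda d: d / np.linalg.norm(d)
dirs = [np.array(v, dtype=float)
        for v in ([1, 0], [0, 1], [-1, 0], [0, -1], [1, 0])]
pts = []
for a, b in zip(dirs, dirs[1:]):
    pts += refine(sigma, rho, a, b, 0.01)
```

Shrinking ε increases the number of supporting directions (and hence constraints), which is exactly the cost/precision trade-off the question is about.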
That is to say, a non-efficient implementation (in the sense that it performs extra work by naively computing the whole H-rep/V-rep at each pass) is doable, and I have it in a notebook somewhere...
This couldn't scale to higher dimensions, so while it is useful for some of our networks, it wouldn't be for most of them. Good to hear though that there are open issues on the matter already!
It still remains to integrate this with Ai2, as suggested in this comment: #87 (comment)
I will open an issue to do this in the future.