
Roadmap to merge GeometricFlux.jl and GraphNeuralNetworks.jl #132

Closed
yuehhua opened this issue Feb 14, 2022 · 3 comments


yuehhua commented Feb 14, 2022

I think we should work together and avoid redundant work. There is no need to compete within the same community.

Could you list the major differences between GraphNeuralNetworks.jl and GeometricFlux.jl?
I am curious how we could redesign GeometricFlux.jl while keeping the strengths of GraphNeuralNetworks.jl.

I am thinking of migrating GraphSignals.jl to FluxML, and you could put your design there.

@CarloLucibello
Member

GeometricFlux.jl is deeply flawed, as I have stated at length in multiple issues and rejected PRs over there. It needs a major redesign, but we already have that major redesign: it is GraphNeuralNetworks.jl. Why put a lot of work into GeometricFlux again to obtain the same result? How would that benefit the ecosystem?

I think GeometricFlux should be discontinued and moved out of the FluxML org. The readme should be updated to say that development has moved to GraphNeuralNetworks.jl. You are very welcome to contribute here: the foundations are solid and there is a lot of work to do in many different areas (see the open issues).

@yuehhua
Author

yuehhua commented Feb 14, 2022

The major redesign should concern the implementations, not the framework itself. In GeometricFlux.jl I have outlined the graph network and the message-passing neural network, where the message function and the update function can be overridden for customization. I don't see the update function in GraphNeuralNetworks.jl, nor the graph network. The point of having the graph network is to support more general neural networks, e.g. DeepSets, which do not use the message-passing scheme. I appreciate your solid foundations, but GraphNeuralNetworks.jl lacks these higher abstractions.

I am currently busy with my PhD thesis and have little time to work on GeometricFlux.jl. I expect to finish the thesis in a few weeks and then get back to GeometricFlux.jl.

@CarloLucibello
Member

A layer here can have multiple update and message functions; you can pass arbitrary closures to propagate. Any message-passing scheme can be achieved using apply_edges and propagate. This is strictly more flexible than any abstraction defined in GeometricFlux, and it allows for efficient specializations based on the type of the message function. Maybe you should take some time to explore the documentation and the codebase.
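For concreteness, a custom message-passing step with a user-supplied closure might look like the following. This is a minimal sketch assuming the keyword-argument `propagate`/`apply_edges` API of GraphNeuralNetworks.jl as documented around the time of this discussion; the graph, feature sizes, and the `message` function are illustrative choices, not code from either package:

```julia
# Sketch of custom message passing in GraphNeuralNetworks.jl
# (assumed API: propagate(f, g, aggr; xi, xj, e) and apply_edges(f, g; xi, xj, e)).
using GraphNeuralNetworks

g = rand_graph(10, 30)        # random graph with 10 nodes and 30 edges
x = rand(Float32, 4, 10)      # 4 features per node, one column per node

# Custom message: difference between source (xj) and target (xi) features.
message(xi, xj, e) = xj .- xi

# Per-edge messages only: one column per edge.
m = apply_edges(message, g; xi = x, xj = x)

# Full message passing: compute messages and sum-aggregate into each target node.
out = propagate(message, g, +; xi = x, xj = x)
```

Because `message` is an ordinary function, the same pattern covers any update rule: compute whatever you need per edge, then pick an aggregation operator (`+`, `mean`, `max`, ...) when calling `propagate`.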
