Any chance this could be wrapped in a python package? #10
Hi @Helveg, I think for this project you could use the simplest form of C++ Python wrapping, see here: https://docs.python.org/3/extending/extending.html#a-simple-example. But you would still need to split the project into three parts: a library with all the code, a console program that uses the library, and a Python wrapper that uses the library.

I would actually suggest that you simply call the console program from Python in a separate process, see here: https://docs.python.org/3.6/library/subprocess.html#subprocess.run. This function (subprocess.run) returns when the console program finishes computation (though you can also change this behaviour), and then you just read the output files packing.xyzd and packing.nfo.

Usually this is not the best option for calling C/C++ code because of the overhead of creating another process and parsing output files, but that concern is only relevant when the C/C++ code runs for 1, 20, or 200 milliseconds. This code will typically run for several minutes to several days (depending on the compression rate and the number of particles; minutes for 10 000 particles at compression rate 1e-2 with the LS algorithm), so the communication overhead is really negligible. And it is extremely easy to implement: no changes to the C++ code are required, just a few lines of Python to run a process, read a binary array with numpy, and some postprocessing to scale particle diameters as explained in the readme.

Also, maybe you can simply pre-generate the packing or packings and then read the resulting files from Python code. That can be even easier. Hope this helps!

Best,
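The subprocess approach described in the comment above can be sketched as follows. This is a hypothetical illustration: the executable name, flags, and working directory are placeholders (use the ones from this repo's readme), and it assumes packing.xyzd stores four float64 values (x, y, z, diameter) per particle.

```python
import subprocess

import numpy as np


def run_generation(exe_path, work_dir, args=("-ls",)):
    """Run the packing-generation executable and block until it finishes.

    exe_path and args are placeholders; substitute the executable name and
    flags described in the readme.
    """
    subprocess.run([str(exe_path), *args], cwd=str(work_dir), check=True)


def read_packing(xyzd_path):
    """Read a packing.xyzd file: float64 records of x, y, z, diameter."""
    data = np.fromfile(str(xyzd_path), dtype=np.float64)
    return data.reshape(-1, 4)


# Note: the diameters read from packing.xyzd still need to be rescaled using
# the porosity values reported in packing.nfo, as explained in the readme
# (not shown here).
```
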
Could you clarify the process here? I'm interested in Python bindings for ease of reuse, though I don't have much experience with them either. Also came across:
@sgbaird I have a better understanding of the process now, and I'd recommend setting up Python bindings with …
Hi @sgbaird, thank you very much for advertising the package and creating the Google Colab notebook! I didn't even know that one can compile programs there on the fly! Amazing! Would you mind if I reference it in the readme directly? I think it can be pretty helpful and makes all the steps clear. We can even put the notebook code into this repo, if you don't mind. If you like, feel free to make a pull request yourself directly with the updates to the readme and the notebook.

Speaking of Python bindings, I last did this four(?) years ago and have forgotten the details a bit, but I recall that following the documentation was enough. One has to create a C (or C++) library (a lib or dll) with some functions that follow certain conventions and accept parameters in a certain way, plus some other functions that describe what is available in your library in a way the Python interpreter understands (there should be a function with a predefined name that returns a list of all function names that Python can call, along with a description of their parameters). Then you can import the dll or lib (I don't remember which) in Python, and the interpreter will pass the calls through to your library.

But again, maybe it's just not needed, and you can call the code through an external process (how-to-run-an-exe-file-with-the-arguments-using-python), which is what you actually do in the notebook!

Best Regards,
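As a Python-side illustration of the "import the dll or lib" idea mentioned above, the standard-library `ctypes` module can load a compiled shared library and call its exported C functions directly, without writing a CPython extension module. The sketch below is hypothetical: since this project does not currently build a shared library, it loads the system math library purely for illustration; for the real project you would build the C++ code as a shared library with `extern "C"` entry points and point `CDLL` at that file.

```python
import ctypes
import ctypes.util

# Load a shared library. "m" (the C math library) stands in here for a
# hypothetical library built from this project's C++ code.
libm = ctypes.CDLL(ctypes.util.find_library("m") or None)

# Declare the C signature so ctypes converts arguments and results correctly.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(9.0))  # 3.0
```

The drawback, as noted in the thread, is that you must declare each function's signature by hand; tools like pybind11 or SWIG generate this glue for you.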
@Helveg, thanks for your readiness to help (even though we don't know each other in person)!
@VasiliBaranov for sure! Happy to make a PR with the notebooks and README update.
I might give the Python bindings a try another time. For now, I'll probably stick with the notebook functionality 👍
Exploring the Python bindings option again. Any thoughts on https://github.com/JeremyBYU/cpp-pybind-skel?
The end user-friendliness I'm looking for is something like the following (starting from scratch):

```shell
conda create -n packing_generation python==3.9.*
conda activate packing_generation
pip install packing_generation
```

```python
import numpy as np
from packing_generation import simulate

X = np.repeat(1.0, num_particles)
packing_fraction = simulate(X)
```

cc @MrJasonBear
Here's what ChatGPT suggests for "I want to expose some C++ code as a Python package and host it on PyPI. What tools should I use to make this happen?" 😛
The skeleton looks good. I'd never heard of SWIG before. I'd advise the pybind11 Getting Started guide: https://pybind11.readthedocs.io/en/stable/basics.html It shows how simple defining a Python-importable module and its functions can be, which may already provide pretty much everything we need for this project. If you define the C++ part with pybind11, I can add fully functioning wheels for the project :)
Compare this to the verbose boilerplate that plain Python actually requires you to define:
Hi all, if you need my input: some years ago I used the default Python way, and it was indeed a bit verbose but actually pretty straightforward; no additional libraries or tricks were needed, and it worked without problems for me. SWIG and pybind11 definitely look interesting, but I am not sure they are worth it for the API surface of this package (a couple of function calls).

Best Regards,
I'd love to use your packing generator as a way to place different neuronal cell types in a brain simulation by representing them as packed polydisperse particles. However, this project would need to be easily deployable in Python. I've heard of automatic wrappers such as SIP that allow C/C++ software to be made into a Python package, but I have no experience with them. I was wondering if you'd be interested in assisting me, as I haven't been able to find equally elegant software in the Python package ecosystem.