- refactoring template formatting for `input_spec`
- fixing issues with input fields that have an extension (and using them in templates)
- adding simple validators to input spec (using `attr.validator`)
- adding `create_dotfile` for workflows, which creates graphs as dotfiles (and can convert them to other formats if `dot` is available); see the usage sketch after this list
- adding a simple user guide with `input_spec` description
- expanding docstrings for `State`, `audit` and `messenger`
- updating syntax to newer Python
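A minimal sketch of generating a workflow graph with `create_dotfile`; the `type` and `export` arguments shown here are assumptions based on typical usage, so check the current API for the exact signature.

```python
import pydra

@pydra.mark.task
def add_two(x):
    return x + 2

wf = pydra.Workflow(name="wf", input_spec=["x"], x=5)
wf.add(add_two(name="add_two", x=wf.lzin.x))
wf.set_output([("out", wf.add_two.lzout.out)])

# write a Graphviz dotfile for the workflow graph; the "type" and "export"
# arguments are assumptions -- exporting to png only works if "dot" is installed
wf.create_dotfile(type="simple", export=["png"])
```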
- refactoring the error handling by Pydra: improving raised errors, removing nodes from the workflow graph that can't be run
- refactoring of the `input_spec`: adapting better to the nipype interfaces
- switching from `pkg_resources.declare_namespace` to the stdlib `pkgutil.extend_path` (see the snippet after this list)
- moving the README to reST format
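For reference, the stdlib pattern that replaces `pkg_resources.declare_namespace` in a namespace package's `__init__.py` looks like this (shown here for a `pydra.tasks`-style package):

```python
# pydra/tasks/__init__.py -- declare pydra.tasks as a pkgutil-style namespace package,
# so separately installed distributions can all contribute modules under pydra.tasks
from pkgutil import extend_path

__path__ = extend_path(__path__, __name__)
```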
- Use `pkgutil` to declare `pydra.tasks` as a namespace package, ensuring better support for editable mode.
- Add `pydra.tasks` namespace package to enable separate packages of `Task`s to be installed into `pydra.tasks`.
- Raise an error when a task or workflow name conflicts with names of attributes, methods, or other tasks already added to the workflow
- Mention `requirements.txt` in README
- moving the tutorial to a separate repo
- adding Windows tests to codecov
- accepting `None` as a valid output from a `FunctionTask`, also for functions that return multiple values
- fixing Slurm error files
- adding `wf._connection` to `checksum`
- allowing for updates of `wf._connections`
- editing output so that it works with `numpy` arrays
- removing `to_job` and pickling the task instead (workers read the tasks and set the proper input, so multiple copies of the input are not kept in memory)
- adding standalone function `load_and_run` that can load and run a task from a pickle file (see the sketch after this list)
- removing `create_pyscript` and simplifying the Slurm worker
- improving error reports in error files
- fixing `make_class` so the `Output` is properly formatted
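A minimal sketch of the standalone `load_and_run` helper; the import path, the cloudpickle round-trip, and the `rerun` keyword are assumptions from this description, so verify them against the installed version.

```python
from pathlib import Path

import cloudpickle
import pydra
from pydra.engine.helpers import load_and_run  # import location is an assumption

@pydra.mark.task
def double(x):
    return 2 * x

task = double(name="double", x=21)

# pickle the fully-specified task to a file (pydra itself relies on cloudpickle)
task_pkl = Path("double_task.pklz")
task_pkl.write_bytes(cloudpickle.dumps(task))

# load the task back from the pickle file and execute it;
# rerun=True would force re-execution even if a cached result exists
load_and_run(task_pkl, rerun=False)
```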
- fixing `hash_dir` function
- adding `get_available_cpus` to get the number of CPUs available to the current process or available on the system (see the sketch after this list)
- adding a simple implementation for `BoshTask` that uses a Boutiques descriptor
- adding Azure to CI
- fixing code for Windows
- etelemetry updates
- adding more verbose output for task `result`: returns values or indices for input fields
- adding an experimental implementation of a Dask worker (limited testing with CI)
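A quick sketch of querying CPU availability with the new helper; the import location is an assumption, so adjust it to wherever the function actually lives in your version.

```python
# import location is an assumption; the helper may live in a different module
from pydra.engine.helpers import get_available_cpus

# returns the CPUs usable by the current process (respecting affinity/cgroup limits
# where the platform exposes them), falling back to the system CPU count
n_procs = get_available_cpus()
print(f"running with {n_procs} worker processes")
```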
- reorganization of the `State` class, fixing small issues with the class
- fixing some path issues on Windows
- adding macOS and Windows to the Travis runs (right now allowing failures for Windows)
- adding `PydraStateError` for exceptions in the `State` class
- small fixes to the hashing functions, adding more tests
- adding `hash_dir` to calculate the hash for the `Directory` type (see the sketch after this list)
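A sketch of hashing a directory's contents with `hash_dir`; the import path is an assumption about where pydra keeps its file-hashing helpers.

```python
from pathlib import Path

# import path is an assumption; check where the hashing helpers live in your version
from pydra.engine.helpers_file import hash_dir

data_dir = Path("data")        # hypothetical directory used as a Directory-type input
checksum = hash_dir(data_dir)  # deterministic hash over the directory's file contents
print(checksum)
```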
- passing `wf.cache_locations` to the task
- using `rerun` from the submitter for all tasks
- adding `test_rerun` and `propagate_rerun` for workflows
- fixing tasks with a full combiner
- adding `cont_dim` to specify the dimensionality of input variables (how deeply the input is nested); see the sketch after this list
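A minimal sketch of marking a nested input as two-dimensional with `cont_dim`; passing it through `split()` as shown is an assumption about where the argument is accepted, so confirm the exact entry point in the docs.

```python
import pydra

@pydra.mark.task
def add(a, b):
    return a + b

# "a" is a list of lists; cont_dim={"a": 2} (assumed here to be a split() argument)
# tells the state machinery to treat it as a 2-dimensional container rather than
# splitting only over the outer list
task = add(name="add", a=[[1, 2], [3, 4]], b=10)
task.split("a", cont_dim={"a": 2})
```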
- adding Sphinx documentation
- moving from `dataclasses` to `attrs`
- adding `container` flag to the `ShellCommandTask`
- fixing `cmdline`, `command_args` and `container_args` for tasks with states
- adding `CONTRIBUTING.md`
- fixing hash calculations for inputs with a list of files
- using `attr.NOTHING` for input that is not set (see the example after this list)
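A small illustration of the `attr.NOTHING` sentinel for unset inputs; the task definition here is made up, but checking an unset field against `attr.NOTHING` reflects how the attrs-based specs mark missing values.

```python
import attr
import pydra

@pydra.mark.task
def add(a, b):
    return a + b

task = add(name="add", a=1)  # "b" deliberately left unset

# unset inputs are represented by the attrs sentinel rather than None,
# so None remains a legitimate user-supplied value
print(task.inputs.b is attr.NOTHING)  # expected: True
```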
- supporting tuple as a single element of an input
- fixing: nodes with states whose input fields (from the splitter) are empty were failing
- big changes in `ShellTask`, `DockerTask` and `SingularityTask`
  - customized input specification and output specification for `Task`s (see the sketch after this list)
  - adding Singularity checks to Travis CI
  - binding all input files to the container
- changes in `Workflow`
  - passing all outputs to the next node: `lzout.all_`
  - fixing inner splitter
- allowing for `splitter` and `combiner` updates
- adding `etelemetry` support
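A sketch of the customized input specification pattern for shell-based tasks, assuming the `SpecInfo`/`ShellSpec` classes and the metadata keys shown here; treat the field metadata as illustrative rather than exhaustive.

```python
import attr
from pydra.engine import ShellCommandTask
from pydra.engine.specs import ShellSpec, SpecInfo

# a custom input spec: one positional file argument appended to the command line
my_input_spec = SpecInfo(
    name="Input",
    fields=[
        (
            "in_file",
            attr.ib(
                type=str,
                metadata={"help_string": "file to print", "position": 1, "argstr": ""},
            ),
        ),
    ],
    bases=(ShellSpec,),
)

task = ShellCommandTask(
    name="cat", executable="cat", in_file="notes.txt", input_spec=my_input_spec
)
print(task.cmdline)  # expected to render roughly: "cat notes.txt"
```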
- Core dataflow creation and management API (illustrated in the sketch at the end of this section)
- Distributed workers:
  - concurrent futures
  - SLURM
- Notebooks for Pydra concepts
Initial Pydra Dataflow Engine release.
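To make the core dataflow API concrete, here is a minimal workflow run with the concurrent-futures worker; it follows the public `Workflow`/`Submitter` interface, though the task names and splitter values are invented for the example.

```python
import pydra

@pydra.mark.task
def add_two(x):
    return x + 2

@pydra.mark.task
def square(y):
    return y * y

if __name__ == "__main__":
    # two-node workflow mapped over a list of inputs via a splitter
    wf = pydra.Workflow(name="demo", input_spec=["x"])
    wf.split("x", x=[1, 2, 3])
    wf.add(add_two(name="add_two", x=wf.lzin.x))
    wf.add(square(name="square", y=wf.add_two.lzout.out))
    wf.set_output([("out", wf.square.lzout.out)])

    # run with the concurrent-futures worker (plugin="cf")
    with pydra.Submitter(plugin="cf") as sub:
        sub(wf)

    print(wf.result())  # one Result per element of the split input
```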