All notable changes to this project will be documented (for humans) in this file.
The format is based on Keep a Changelog and this project adheres to Semantic Versioning.
Just a quick re-release for PyPI ensuring (manually) that setuptools_scm is present in the environment while running `sdist`, so that all auxiliary files such as `setup_tools.py` are included in the distribution.
- "A typical workflow" section in README.md based on an AWS EC2 HTCondor cluster
- EC2: easier handling of keys for the user; use the NITRC-CE AMI by default
- EC2: ability to log in to an ec2-condor cluster
Minor feature and bug fix release
- use of etelemetry
- initial support for the LSF submitter
- testing of venv tracking
- setting of `DATALAD_SSH_IDENTITYFILE` earlier (in `__init__`) for the DataLad orchestrator (see the sketch below)
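As a usage note added here for illustration (not part of the original entry), a minimal sketch of setting that variable from Python before any DataLad-backed orchestration; the key path is hypothetical:

```python
# Minimal sketch (illustrative only): export DATALAD_SSH_IDENTITYFILE so that
# subsequent DataLad/ReproMan SSH connections use a specific private key.
# The path below is hypothetical; point it at your actual identity file.
import os

os.environ["DATALAD_SSH_IDENTITYFILE"] = os.path.expanduser(
    "~/.ssh/id_ed25519_cluster"
)
```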
Feature and bug fix release after a long release silence.

- `run` - support for SLURM submitter
- `full-except-datalad` setuptools installation "extra_requires" scheme
- `AwsCondor` resource type to assist in establishing a simple HPC cluster with condor as submitter in AWS
- S3, `put`, `get` of implemented resources got a recursive mode
- `run` - switch to use GNU parallel instead of the one from moreutils for local execution
- fixed up/improved documentation (in particular for `run`)
- use `docker>=3` instead of `docker-py`
- a wide variety of fixes
- python 3.5 support
Yarik needed to do a quick release to absorb changes to `run` functionality.
Major rename - NICEMAN grows into ReproMan. Too many changes to summarize; among them, `reproman run` to execute computation on a local or remote resource, with the possibility to submit computation to PBS and Condor.
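Purely as an illustrative sketch (not part of the original notes), one way such an invocation could be driven from Python; the resource name, submitter choice, script, and exact option names are assumptions and should be checked against `reproman run --help`:

```python
# Illustrative sketch only: invoke `reproman run` through its CLI.
# "my-condor" (a previously created resource), the submitter choice, and
# analyze.py are all hypothetical; verify option names with
# `reproman run --help` for your installed version.
import subprocess

subprocess.run(
    [
        "reproman", "run",
        "--resource", "my-condor",
        "--submitter", "condor",   # e.g. condor or pbs, per this entry
        "python", "analyze.py",
    ],
    check=True,
)
```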
Largely bug fixes and small enhancements. Major work is ongoing in PRs to provide new functionality (such as remote execution and environment comparisons).
- Tracing RPM-based (RedHat, CentOS) environments
- Tracing Singularity images
- A variety of fixes and enhancements in tracing details of git, conda, etc. resources
- interactive ssh session fixes through use of the `fabric` module instead of custom code
- Refactored handling of resource parameters to avoid code duplication/boilerplate
Enhancements and fixes primarily targeting better tracing (collecting information about) of the computational components

- tracing of docker images
- `diff` command to provide summary of differences between two specs
- conda environments could be regenerated from the environments
- relative paths could be provided to the `retrace` command (see the sketch after this list)
- tracing of Debian packages and Git repositories should be more robust to directories
- handling of older `conda` environments
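As an illustration only (not from the original notes), a sketch of handing relative paths to the retrace command; the file names are hypothetical, the available options may differ, and releases from this era still used the pre-rename `niceman` entry point:

```python
# Illustrative sketch only: retrace files specified by relative paths.
# File names are hypothetical; see `reproman retrace --help` (or
# `niceman retrace --help` for pre-rename releases) for actual options.
import subprocess

subprocess.run(
    ["reproman", "retrace", "scripts/analysis.py", "data/input.csv"],
    check=True,
)
```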
Minor release with a few fixes and performance enhancements
- Create apt .sources files pointing to APT snapshot repositories
- Batch command invocations in the Debian tracer to significantly speed up retracing
- Output of the (re)traced spec into a file
A minor release to demonstrate `retrace` functionality
Just a template for future records:
TODO Summary