Releases: datalad/datalad-metalad
Bug fix release
- ensure pipeline definitions are installed
Fix pipeline installation
- Fix a bug in the installation specification that prevented pipelines from being installed
0.3.4
Bugfix release
- Disable result renderer in subdataset in core extractor calls by @christian-monch in #258
- Fix traverser-error on unresolved top_level_dir path argument by @christian-monch in #259
- Update minimal datalad metadata model version to 0.3.1 (this fixes a bug in fetching and pushing metadata)
- Add an automated PyPI release action
Bug fix release
This release fixes one bug:
- Issue #253: `top-level-dir` argument of `DatasetTraverser` does not handle relative paths correctly.
Bug fix release
Fix missing requirements
Minimal release to add the missing "six" requirement
Metadata handling, the next generation
This is a new suite of datalad commands to handle metadata. It includes a new storage backend that does not interfere (and does not inter-operate) with the metadata storage system used in versions < 0.3.0.
New commands:
The following commands are new or have changed syntax and semantics (a usage sketch follows the list):
- `meta-add`: add metadata to a repository
- `meta-dump`: dump metadata stored in a repository
- `meta-aggregate`: aggregate metadata stored in sub-datasets and store it in the super-dataset
- `meta-extract`: run extractors to create metadata from repository content
- `meta-filter`: run filters over all or parts of the metadata stored in a repository
- `meta-conduct`: execute pipelines that process metadata, e.g. extract metadata from all files in a dataset and add it to the repository. A few pipelines are provided with the release.
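To give a feel for how these commands combine, here is a minimal sketch of one round trip: extract a metadata record, add it to the repository, and dump it back out. The extractor name `metalad_core`, the dataset path, and the use of `-` to read the record from stdin are assumptions based on typical usage, not part of these notes; check `datalad meta-extract --help` and `datalad meta-add --help` for the exact interface of your installed version.

```sh
# Minimal sketch, assuming a DataLad dataset at /path/to/dataset and an
# extractor registered as "metalad_core" (the name is an assumption -- list
# the extractors available in your installation to confirm it).
cd /path/to/dataset

# Run an extractor and pipe the resulting metadata record into meta-add;
# "-" is assumed to make meta-add read the record from stdin.
datalad meta-extract -d . metalad_core | datalad meta-add -d . -

# Show the metadata now stored in the repository.
datalad meta-dump -d .
```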
New extractors and indexers:
- `studyminimeta`-extractor: creates metadata from studyminimeta files, i.e. `<dataset-root>/.studyminimeta.yaml` (a sketch follows below)
- `studyminimeta`-indexer: creates key-value sets from `studyminimeta` metadata that are optimized for `datalad search`
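As a hedged illustration, the studyminimeta extractor can be run like any other extractor; the registered name `metalad_studyminimeta` used below is an assumption, not confirmed by these notes.

```sh
# Illustrative sketch only: the extractor name "metalad_studyminimeta" is an
# assumption -- consult the extension documentation for the registered name.
# The extractor reads <dataset-root>/.studyminimeta.yaml, so that file must
# already exist in the dataset before the command is run.
datalad meta-extract -d . metalad_studyminimeta | datalad meta-add -d . -
```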
Note
This new suite of metadata commands is more or less a complete rewrite of the previous version. The concept has evolved, the syntax and semantics of the commands have changed, and the documentation is still growing.
At the moment there is no tool to import old metadata into the new system (although that is possible by reading the old metadata, converting each entry to a metadata record, and adding it to the new system via `meta-add`). The next release is intended to provide a tool for that.
Full Changelog: 0.2.1...0.3.0
Fixing the worst
- Minor release to rectify a few general things before development restarts.
- Discontinue support for Python 2
- `custom` extractor now reports file IDs
- Various internal changes to keep pace with DataLad development
- Prevent needless re-extraction of metadata (thanks to @adswa and @bpoldrack for analysis and fix)
Settling in
- New `runprov` metadata extractor for `datalad run`; it records metadata as W3C PROV
- `meta_report()` was renamed to `meta_dump()` to better reflect its purpose and capabilities
- Various smallish changes for more robust and streamlined behavior