Others out there, such as:

 - DataVerse: https://dataverse.org
- Omeka: https://omeka.org/s/
- also many internally developed archival systems
For us it mostly comes down to the granularity of the metadata, and whether metadata fields support e.g. ISO standards for validation (which is why it's difficult for a single metadata standard to rule them all, unless all you need is free-text descriptions). Another need is a way to batch ingest.
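As a rough illustration of what I mean by ISO-standard validation, here is a minimal sketch (field names and the tiny ISO 639-1 subset are made up for the example; a real system would load the full code lists):

    from datetime import date

    # Illustrative subset of ISO 639-1 language codes; not a complete list.
    ISO_639_1 = {"en", "de", "fr", "sv", "fi"}

    def validate_record(record: dict) -> list[str]:
        """Return a list of validation errors for a metadata record."""
        errors = []
        try:
            # ISO 8601 calendar date, e.g. "2023-07-01"
            date.fromisoformat(str(record.get("date_created", "")))
        except ValueError:
            errors.append("date_created is not a valid ISO 8601 date")
        if record.get("language") not in ISO_639_1:
            errors.append("language is not a known ISO 639-1 code")
        return errors

    # e.g. validate_record({"date_created": "2023-07-01", "language": "sv"}) -> []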
> Compatible with hundreds of formats
I'm becoming increasingly critical of file format verification as a gatekeeper for digital preservation, but perhaps someone can explain why I'm wrong. Striving towards open specifications is of course a must for long-term preservation (cave paintings persevere, while digital data is fragile in so many ways), but if conversion incurs data loss, parallel archiving should be an option.

Worse, we've had issues with old systems that deny archiving data altogether because of an automated format checker standing in the way (we'll use FITS [0] in the future, but that just uses a bunch of other type checkers under the hood and doesn't seem bomb-proof from the little testing I've done). Formats such as MP4 also seem like a nightmare to validate (granted, lots of cameras out there ignore the specifications). But archiving nothing at all, because of a few proprietary formats every now and then, is a horrible outcome (see the experience above). At the very least, it must be possible to override automated format checking if necessary (see the sketch below).

[0]: https://projects.iq.harvard.edu/fits
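To make the override point concrete, here is a minimal sketch of an ingest step where format identification is advisory rather than blocking. It uses the common `file` CLI instead of FITS, and the names (PREFERRED_MIME_TYPES, ingest, needs_review.log) are invented for the example: an unrecognised or non-preferred format still gets archived, it is just flagged for curator review.

    import subprocess
    from pathlib import Path

    # Hypothetical allow-list of formats we consider "preferred" for preservation.
    PREFERRED_MIME_TYPES = {"application/pdf", "image/tiff", "text/plain"}

    def identify_mime_type(path: Path) -> str | None:
        """Best-effort identification via the `file` CLI; never blocks ingest on failure."""
        try:
            result = subprocess.run(
                ["file", "--brief", "--mime-type", str(path)],
                capture_output=True, text=True, check=True,
            )
            return result.stdout.strip()
        except (OSError, subprocess.CalledProcessError):
            return None  # identification failed -> treat as unknown, not as a rejection

    def ingest(path: Path, archive_dir: Path) -> None:
        mime = identify_mime_type(path)
        # Archive unconditionally: format checking informs, it does not gatekeep.
        (archive_dir / path.name).write_bytes(path.read_bytes())
        if mime is None or mime not in PREFERRED_MIME_TYPES:
            # Flag for manual review (e.g. later migration or parallel archiving
            # of a converted copy) instead of refusing to archive at all.
            with (archive_dir / "needs_review.log").open("a") as log:
                log.write(f"{path.name}\t{mime}\n")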