mets:file URL handling: keep remote links #323
Oh, BTW, this would also offer a chance to write the original remote URL into PAGE's …
That's a great proposal and would also be an option to keep … The idea behind the … Apart from that, I'm open to the idea, but it will take some time, because we have to change file handling in a few places for this (much like your AlternativeImage work, with additional checks and new possible points of failure in the logic).
Yes, the ZVDD guidelines are pretty restrictive. FLocat isn't repeatable. But it also says …
We could also implement the `local_filename` handling as an additional FLocat, as you propose, and have a processor that strips the METS down to ZVDD requirements.
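Such a stripping processor could look roughly like this. This is a minimal sketch using Python's stdlib `xml.etree`; the function name and the heuristic of treating every non-http(s) `xlink:href` as a local copy are assumptions for illustration, not the eventual implementation:

```python
import xml.etree.ElementTree as ET

METS = 'http://www.loc.gov/METS/'
XLINK = 'http://www.w3.org/1999/xlink'
NS = {'mets': METS, 'xlink': XLINK}

def strip_local_flocats(root):
    """For every mets:file that has at least one remote FLocat,
    drop the FLocat entries pointing at local copies, so only the
    remote (publishable) location remains."""
    for file_el in root.iter(f'{{{METS}}}file'):
        flocats = file_el.findall('mets:FLocat', NS)
        remote = [f for f in flocats
                  if f.get(f'{{{XLINK}}}href', '').startswith(('http://', 'https://'))]
        if remote:  # only strip if a remote location survives
            for flocat in flocats:
                if flocat not in remote:
                    file_el.remove(flocat)
    return root
```

A real processor would of course read and re-serialize the whole METS, and might key on `LOCTYPE`/`OTHERLOCTYPE` instead of the URL scheme.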
Sounds good to me. Stripping down or publishing non-persistable parts (and probably ingesting provenance data) would always be one necessary last processor (and probably an institution-specific one), right?
Should be revisited now that the OLA-HD client has arrived.
This has since been implemented in #1079 and released in v2.54.0.
Currently with workspaces we can either keep images on the remote side by using HTTP URLs in `mets:file/mets:FLocat/@xlink:href` (which means they have to be downloaded again and again during processing), or get local filesystem copies with relative paths by cloning with `download=True` or by bagging and spilling (but then the source information is lost forever).

When processing is finished and I want to make my workspace public, I now have to upload my shiny new results in addition to the original images – which I might not even have the rights to publish myself. It would be much better if the original remote URLs could be used again for that – even if I used local copies in between.
METS XML allows that: a `mets:FLocat` has `maxOccurs="unbounded"` within `mets:file`, with the following documented semantic: …

So why don't we keep 2 `FLocat` elements in that case – one relative path for local processing and one remote URL for provenance/bookkeeping? When making results public, the local copies could be disposed of again, e.g. when bagging with `--manifestation-depth=partial`.
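A consumer of such a dual-`FLocat` file could then prefer the local copy during processing and fall back to the remote URL. The following is a sketch under assumed conventions (function name and scheme-based local/remote detection are hypothetical; this is not the API eventually merged in #1079):

```python
import os
import xml.etree.ElementTree as ET

NS = {'mets': 'http://www.loc.gov/METS/',
      'xlink': 'http://www.w3.org/1999/xlink'}
XLINK_HREF = '{http://www.w3.org/1999/xlink}href'

def resolve_flocat(file_el, workspace_dir='.'):
    """Return a usable location for a mets:file: the local relative
    path if it actually exists on disk, otherwise the remote URL."""
    local = remote = None
    for flocat in file_el.findall('mets:FLocat', NS):
        href = flocat.get(XLINK_HREF, '')
        if href.startswith(('http://', 'https://')):
            remote = remote or href
        else:
            local = local or href
    if local:
        path = os.path.join(workspace_dir, local)
        if os.path.exists(path):
            return path
    return remote  # local copy disposed of (or never downloaded)
```

With `--manifestation-depth=partial`, the local files would be gone, and resolution would transparently fall back to the remote side.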