Over the last few months I have been using CoW intensively for the transformation of the ANDB datasets (working together with @wouterbeek). Some of the issues we encountered might be of interest to the CoW developer community. I am not sure whether everything qualifies as an issue, so I will start by presenting them as a list. If anything here draws your attention, please do not hesitate to contact me.
- It is not possible to create literals with datatype `xsd:anyURI`.
- To get a language tag you have to combine `lang:nl` and `xsd:string` in your CoW script, but that combination is not allowed in RDF, where a language-tagged literal implicitly has datatype `rdf:langString` (not necessarily a problem, but confusing; see the first sketch after this list).
- The transformation of a large dataset can take a very long time (more than 2 hours for the ANDB dataset).
- It is not always clear what the purpose of the different metadata graphs is. Do you always want to publish them? (This happens when uploading to Druid.)
- Some triples appear to describe properties of the transformation rather than of the contents of the dataset.
- SHACL validation is not supported. Since the complete dataset cannot be validated within CoW, validation has to be performed as a separate step in the ETL (see the second sketch below).
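To make the first two points about literals concrete, here is a minimal rdflib sketch, independent of CoW itself; the `https://example.org/` names are made up for illustration:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import XSD

EX = Namespace("https://example.org/")  # hypothetical namespace, for illustration only
g = Graph()

# The kind of literal we would like to produce: a plain xsd:anyURI literal.
g.add((EX.record1, EX.scan, Literal("https://example.org/scan/1.jpg", datatype=XSD.anyURI)))

# A language-tagged literal; in RDF 1.1 its datatype is implicitly rdf:langString.
g.add((EX.record1, EX.label, Literal("fiets", lang="nl")))

# Combining a language tag with an explicit xsd:string datatype, which is what the
# CoW script seems to ask for, is not valid RDF and is rejected by rdflib:
try:
    Literal("fiets", lang="nl", datatype=XSD.string)
except TypeError as err:
    print(err)

print(g.serialize(format="turtle"))
```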
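For the SHACL point, a minimal sketch of what a post-conversion validation step in the ETL could look like, using pySHACL; the filenames and the shapes file are placeholders, and pySHACL is just one possible validator:

```python
from pyshacl import validate
from rdflib import Graph

# Placeholders: the converted CoW output and a hand-written shapes file.
data_graph = Graph().parse("andb-output.nt", format="ntriples")
shapes_graph = Graph().parse("andb-shapes.ttl", format="turtle")

conforms, report_graph, report_text = validate(
    data_graph,
    shacl_graph=shapes_graph,
    inference="rdfs",  # optional; only needed if the shapes rely on RDFS inferencing
)

print("Conforms:", conforms)
if not conforms:
    print(report_text)
```

Running something like this after CoW finishes would at least catch structural problems before the dataset is uploaded to Druid.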