Hey there :)
The crawler saves all generated files locally; the saving location can be configured, of course. The aim would be to commit all RDFs to a dedicated repository. This repository specifically :) (since we are hosting LOSH)
So the data is available as RDF independently of Wikibase.
@hoijui I'm struggling with copying data with bash :) I got the zip file with data in the following structure:
My intention (which may be wrong) is to copy all TTLs into the
RDF folder of this repo, sorted by their data source, with meaningful file names (project designation + version + okh.ttl).
So e.g. the folder
github includes this file:
Could you help me copy all the other files into this repository with that naming scheme?
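One possible approach, as a sketch: this assumes the zip unpacks to `data/<source>/<project>/<version>/okh.ttl` and that the target layout is `RDF/<source>/<project>_<version>_okh.ttl` (the `data`/`RDF` paths and the `MyProject`/`1.0.0` names below are placeholders, not the real structure):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Demonstration tree only -- replace with the real unzipped data.
mkdir -p data/github/MyProject/1.0.0
touch data/github/MyProject/1.0.0/okh.ttl

src_root="data"   # where the zip was unpacked (assumption)
dst_root="RDF"    # target folder in this repo

find "$src_root" -type f -name '*.ttl' | while read -r f; do
  rel="${f#"$src_root"/}"                  # github/MyProject/1.0.0/okh.ttl
  source_dir="${rel%%/*}"                  # github
  rest="${rel#*/}"                         # MyProject/1.0.0/okh.ttl
  project="${rest%%/*}"                    # MyProject
  version="$(basename "$(dirname "$f")")"  # 1.0.0
  mkdir -p "$dst_root/$source_dir"
  cp "$f" "$dst_root/$source_dir/${project}_${version}_okh.ttl"
done
```

If the real tree nests differently (e.g. no version directory), only the parameter expansions that split `rel` need adjusting.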