Compare commits
4 commits: 87dcd4d65c...34754cc68a

| Author | SHA1 | Date |
|---|---|---|
| | 34754cc68a | |
| | cd85a66c46 | |
| | d2481d6e80 | |
| | da22d312a9 | |
34 changed files with 552962 additions and 2 deletions
Cargo.lock (generated, new file, 1902 lines)
File diff suppressed because it is too large.
Cargo.toml (new file, 18 lines)

@@ -0,0 +1,18 @@
```toml
[package]
name = "mapping"
version = "0.1.0"
edition = "2021"

[[bin]]
name = "step-01"
path = "src/map/step_01.rs"

[[bin]]
name = "step-02"
path = "src/map/step_02.rs"

[dependencies]
sophia = "0.9"
oxigraph = "*"
mysql = "*"
urlencoding = "*"
```
Dockerfile (new file, 8 lines)

@@ -0,0 +1,8 @@
```dockerfile
FROM mariadb:10.11

ENV MARIADB_ROOT_PASSWORD=root
ENV MARIADB_DATABASE=migrants
ENV MARIADB_USER=migrants
ENV MARIADB_PASSWORD=1234

COPY teatre-migrants.sql /docker-entrypoint-initdb.d/
```
README.md (+143 −3)

@@ -1,3 +1,143 @@
# Theatre Migrants

To generate a knowledge graph about migrants in the theatre in Europe.

## Running the scripts

The mapping scripts have been reimplemented in Rust for faster execution. Both scripts must be run from this directory (`mapping/`).

**Prerequisites:** Start the MariaDB container before running step 1:

```sh
docker compose up -d
```

**Step 1** — Direct Mapping from MariaDB to RDF (`data/graph-01.ttl`):

```sh
cargo run --release --bin step-01
```

**Step 2** — Apply SPARQL UPDATE queries (`data/graph-02.ttl`):

```sh
cargo run --release --bin step-02
```

Alternatively, after installing with `cargo install --path .`:

```sh
step-01
step-02
```

## Generating the ontology

The following steps describe how the migrants RDF graph is generated.

### Step 1 - Loading the input data into a relational database

#### Task

The file `teatre-migrants.sql` contains the dump of a MariaDB database. The tables involved in this schema are described in the file `db_schema.md`. We will load this data into MariaDB so that it can be queried with SQL. To this end:

1. Create a Dockerfile that builds a Docker container for MariaDB.
2. Load the dump into a database in the container.
3. Create a Ruby script `map/step-01.rb` that uses the `sequel` gem to connect to the database. This script should produce a file called `graph-01.ttl` containing all the data from the loaded tables, using the direct mapping from relational databases to RDF.

#### Summary

The `Dockerfile` creates a MariaDB 10.11 container that automatically loads `teatre-migrants.sql` on first start. The `docker-compose.yml` exposes the database on port 3306 with a healthcheck.

The script `map/step-01.rb` connects to the database via `sequel` and implements the [W3C Direct Mapping](https://www.w3.org/TR/rdb-direct-mapping/) for all 9 tables (`location`, `migration_table`, `organisation`, `person`, `person_profession`, `personnames`, `relationship`, `religions`, `work`). Each table row becomes an RDF resource identified by its primary key, each column becomes a datatype property, and each foreign key becomes an object property linking to the referenced row. The output file `graph-01.ttl` contains 162,029 triples.

To run:

```sh
docker compose up -d
bundle exec ruby map/step-01.rb
```
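The direct mapping described above can be illustrated in a few lines of plain Ruby. This is a hypothetical sketch (the row values are invented), not the actual `map/step-01.rb` implementation; it emits N-Triples-style strings instead of using an RDF library:

```ruby
# Sketch of the W3C Direct Mapping for a single row (illustrative values only).
BASE = 'http://example.org/migrants/'

# table/pk_col identify the row; each column becomes one triple.
def direct_map(table, pk_col, row)
  subject = "<#{BASE}#{table}/#{row[pk_col]}>"
  triples = ["#{subject} <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <#{BASE}#{table}> ."]
  row.each do |col, val|
    triples << "#{subject} <#{BASE}#{table}##{col}> \"#{val}\" ."
  end
  triples
end

row = { IDLocation: 'ARG-BahBlanca-00', City: 'Bahia Blanca', Country: 'Argentina' }
puts direct_map('location', :IDLocation, row)
```

Foreign-key columns would instead produce an object property pointing at the referenced row's IRI, as the summary above explains.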
### Step 2 - Generate Objects

Continents and countries should be objects instead of literals. To this end, we can transform the following data:

```
base:location\/ARG-BahBlanca-00 a base:location;
    base:location\#City "Bahia Blanca";
    base:location\#Continent "South America";
    base:location\#Country "Argentina";
    base:location\#GeoNamesID "3865086";
    base:location\#IDLocation "ARG-BahBlanca-00";
    base:location\#latitude -3.87253e1;
    base:location\#longitude -6.22742e1;
    base:location\#wikidata "Q54108";
    base:location\#wikipedia "https://en.wikipedia.org/wiki/Bah%C3%ADa_Blanca" .
```

Into the following data:

```
base:location\/ARG-BahBlanca-00 a base:location;
    base:location\#City base:City-BahiaBlanca;
    base:location\#Continent base:Continent-SouthAmerica;
    base:location\#Country base:Country-Argentina;
    base:location\#GeoNamesID "3865086";
    base:location\#IDLocation "ARG-BahBlanca-00";
    base:location\#latitude -3.87253e1;
    base:location\#longitude -6.22742e1;
    base:location\#wikidata "Q54108";
    base:location\#wikipedia "https://en.wikipedia.org/wiki/Bah%C3%ADa_Blanca" .

base:City-BahiaBlanca a base:City;
    rdfs:label "Bahia Blanca"@en .

base:Continent-SouthAmerica a base:Continent;
    rdfs:label "South America"@en .

base:Country-Argentina a base:Country;
    rdfs:label "Argentina"@en .
```

Notice that all `rdfs:label` values are tagged as English.

Generate a SPARQL UPDATE query that performs this transformation for all elements of the table, and save it in a new folder called `updates`. Do the same with the other tables, proposing which columns should be defined as objects. Define a separate SPARQL UPDATE query for every table and save it in the `updates` folder. Number the generated queries with a prefix such as 001, 002, 003, and so on.

After generating the update queries, generate a Ruby script that executes the updates on the RDF graph generated in the previous step and saves the resulting graph as `data/graph-02.ttl`.

#### Summary

19 SPARQL UPDATE queries in `updates/` transform literal values into typed objects across all tables:

| Query | Table | Column | Object type |
|-------|-------|--------|-------------|
| 001 | location | Continent | Continent |
| 002 | location | Country | Country |
| 003 | location | State | State |
| 004 | location | City | City |
| 005 | migration_table | reason | MigrationReason |
| 006 | migration_table | reason2 | MigrationReason |
| 007 | organisation | InstType | InstitutionType |
| 008 | person | gender | Gender |
| 009 | person | Nametype | Nametype |
| 010 | person | Importsource | ImportSource |
| 011 | person_profession | Eprofession | Profession |
| 012 | personnames | Nametype | Nametype |
| 013 | relationship | Relationshiptype | RelationshipType |
| 014 | relationship | relationshiptype_precise | RelationshipTypePrecise |
| 015 | religions | religion | Religion |
| 016 | work | Profession | Profession |
| 017 | work | Profession2 | Profession |
| 018 | work | Profession3 | Profession |
| 019 | work | EmploymentType | EmploymentType |

Each query replaces a literal value with an object reference and creates the object with `rdf:type` and `rdfs:label` (in English). The script `map/step-02.rb` loads `data/graph-01.ttl`, applies all queries in order, and writes `data/graph-02.ttl` (164,632 triples).

To run:

```sh
bundle exec ruby map/step-02.rb
```
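The object IRIs in the Step 2 transformation follow a simple minting convention: the class name plus the label with spaces removed, with any remaining unsafe characters percent-encoded (mirroring the `ENCODE_FOR_URI(REPLACE(?val, " ", ""))` pattern in the update queries). A minimal Ruby sketch of that rule, for illustration only:

```ruby
require 'erb' # ERB::Util.url_encode is in the Ruby stdlib

BASE = 'http://example.org/migrants/'

# Mint an object IRI for a label, e.g. "South America" under class
# Continent -> http://example.org/migrants/Continent-SouthAmerica.
def mint_object_iri(klass, label)
  "#{BASE}#{klass}-#{ERB::Util.url_encode(label.delete(' '))}"
end

puts mint_object_iri('Continent', 'South America')
# -> http://example.org/migrants/Continent-SouthAmerica
```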
data/graph-01.ttl (new file, 182044 lines)
File diff suppressed because it is too large.

data/graph-02.ttl (new file, 185949 lines)
File diff suppressed because it is too large.
docker-compose.yml (new file, 10 lines)

@@ -0,0 +1,10 @@
```yaml
services:
  db:
    build: .
    ports:
      - "3306:3306"
    healthcheck:
      test: ["CMD", "healthcheck.sh", "--connect", "--innodb_initialized"]
      interval: 5s
      timeout: 5s
      retries: 10
```
graph-01.ttl (new file, 182045 lines)
File diff suppressed because it is too large.
map/step-01.rb (new file, 139 lines)

@@ -0,0 +1,139 @@
```ruby
#!/usr/bin/env ruby
# frozen_string_literal: true

# Step 1: Direct Mapping from relational database to RDF
# Implements the W3C Direct Mapping (https://www.w3.org/TR/rdb-direct-mapping/)

require 'date'
require 'uri'
require 'sequel'
require 'rdf'
require 'rdf/turtle'

BASE_IRI = 'http://example.org/migrants/'

DB = Sequel.mysql2(host: '127.0.0.1', port: 3306, user: 'migrants', database: 'migrants', password: '1234')

# Foreign key definitions: table -> { column -> [referenced_table, referenced_column] }
FOREIGN_KEYS = {
  migration_table: {
    IDPerson: [:person, :IDPerson],
    IDStartPlace: [:location, :IDLocation],
    IDDestPlace: [:location, :IDLocation]
  },
  organisation: {
    IDLocation: [:location, :IDLocation]
  },
  person: {
    IDBirthPlace: [:location, :IDLocation],
    IDDeathPlace: [:location, :IDLocation]
  },
  personnames: {
    IDPerson: [:person, :IDPerson]
  },
  person_profession: {
    IDPerson: [:person, :IDPerson]
  },
  relationship: {
    IDPerson_active: [:person, :IDPerson],
    IDPerson_passive: [:person, :IDPerson],
    IDLocation: [:location, :IDLocation],
    IDOrganisation: [:organisation, :IDOrganisation]
  },
  religions: {
    IDPerson: [:person, :IDPerson]
  },
  work: {
    IDPerson: [:person, :IDPerson],
    IDLocation: [:location, :IDLocation],
    IDOrganisation: [:organisation, :IDOrganisation],
    IDOrganisation2: [:organisation, :IDOrganisation]
  }
}.freeze

# Primary keys for each table
PRIMARY_KEYS = {
  location: :IDLocation,
  migration_table: :IDMig,
  organisation: :IDOrganisation,
  person: :IDPerson,
  person_profession: :IDProfPerson,
  personnames: :IDPersonname,
  relationship: :IDRel,
  religions: :IDReligion,
  work: :IDWork
}.freeze

def row_iri(table, pk_value)
  RDF::URI.new("#{BASE_IRI}#{table}/#{URI.encode_www_form_component(pk_value.to_s)}")
end

def sanitize_name(name)
  name.to_s.gsub(/[^a-zA-Z0-9_-]/, '_').gsub(/_+/, '_').gsub(/\A_+|_+\z/, '')
end

def column_iri(table, column)
  RDF::URI.new("#{BASE_IRI}#{table}##{sanitize_name(column)}")
end

def class_iri(table)
  RDF::URI.new("#{BASE_IRI}#{table}")
end

def ref_iri(table, fk_col)
  RDF::URI.new("#{BASE_IRI}#{table}#ref-#{sanitize_name(fk_col)}")
end

def to_rdf_literal(value)
  case value
  when Integer
    RDF::Literal.new(value, datatype: RDF::XSD.integer)
  when Float
    RDF::Literal.new(value, datatype: RDF::XSD.double)
  when Time, DateTime
    # DateTime is a subclass of Date, so it must be matched before Date
    RDF::Literal.new(value.to_s, datatype: RDF::XSD.dateTime)
  when Date
    RDF::Literal.new(value.to_s, datatype: RDF::XSD.date)
  when TrueClass, FalseClass
    RDF::Literal.new(value, datatype: RDF::XSD.boolean)
  else
    RDF::Literal.new(value.to_s)
  end
end

graph = RDF::Graph.new

PRIMARY_KEYS.each do |table, pk_col|
  fk_defs = FOREIGN_KEYS.fetch(table, {})

  DB[table].each do |row|
    pk_value = row[pk_col]
    subject = row_iri(table, pk_value)

    # rdf:type
    graph << [subject, RDF.type, class_iri(table)]

    row.each do |col, value|
      next if value.nil?

      col_sym = col.to_sym

      if fk_defs.key?(col_sym)
        # Foreign key -> object property linking to referenced row
        ref_table, _ref_col = fk_defs[col_sym]
        graph << [subject, ref_iri(table, col), row_iri(ref_table, value)]
      else
        # Regular column -> datatype property
        graph << [subject, column_iri(table, col), to_rdf_literal(value)]
      end
    end
  end
end

output_path = File.expand_path('../data/graph-01.ttl', __dir__)
RDF::Turtle::Writer.open(output_path, prefixes: {
  rdf: RDF.to_uri,
  xsd: RDF::XSD.to_uri
}) do |writer|
  graph.each_statement { |stmt| writer << stmt }
end

puts "Wrote #{graph.count} triples to #{output_path}"
```
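The `sanitize_name` helper above collapses any character outside `[a-zA-Z0-9_-]` into underscores before a column name is used as an IRI fragment. A standalone check of that behavior (the second column name is invented for illustration):

```ruby
# Copy of the sanitize_name helper from map/step-01.rb.
def sanitize_name(name)
  name.to_s.gsub(/[^a-zA-Z0-9_-]/, '_').gsub(/_+/, '_').gsub(/\A_+|_+\z/, '')
end

puts sanitize_name('relationshiptype_precise') # unchanged: relationshiptype_precise
puts sanitize_name('Start Date (approx.)')     # -> Start_Date_approx
```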
map/step-02.rb (new file, 38 lines)

@@ -0,0 +1,38 @@
```ruby
#!/usr/bin/env ruby
# frozen_string_literal: true

# Step 2: Transform literal values into RDF objects using SPARQL UPDATE queries

require 'rdf'
require 'rdf/turtle'
require 'sparql'

input_path = File.expand_path('../data/graph-01.ttl', __dir__)
output_path = File.expand_path('../data/graph-02.ttl', __dir__)
updates_dir = File.expand_path('../updates', __dir__)

puts "Loading graph from #{input_path}..."
graph = RDF::Graph.load(input_path)
puts "Loaded #{graph.count} triples."

Dir.glob(File.join(updates_dir, '*.rq')).sort.each do |query_file|
  query = File.read(query_file)
  name = File.basename(query_file)

  before = graph.count
  SPARQL.execute(query, graph, update: true)
  after = graph.count
  puts "Applied #{name}: #{before} -> #{after} triples (#{after - before >= 0 ? '+' : ''}#{after - before})"
end

puts "Writing #{graph.count} triples to #{output_path}..."
RDF::Turtle::Writer.open(output_path, prefixes: {
  rdf: RDF.to_uri,
  rdfs: RDF::RDFS.to_uri,
  xsd: RDF::XSD.to_uri,
  base: RDF::URI.new('http://example.org/migrants/')
}) do |writer|
  graph.each_statement { |stmt| writer << stmt }
end

puts "Done."
```
queries/list_classes.sparql (new file, 4 lines)

@@ -0,0 +1,4 @@
```sparql
SELECT DISTINCT ?class WHERE {
  ?s a ?class .
}
ORDER BY ?class
```
queries/list_properties.sparql (new file, 5 lines)

@@ -0,0 +1,5 @@
```sparql
SELECT DISTINCT ?property WHERE {
  ?s ?property ?o .
  FILTER(?property != <http://www.w3.org/1999/02/22-rdf-syntax-ns#type>)
}
ORDER BY ?property
```
src/map/step_01.rs (new file, 233 lines)

@@ -0,0 +1,233 @@
```rust
//! Step 1: Direct Mapping from relational database to RDF.
//!
//! Connects to MariaDB and produces `data/graph-01.ttl` following the
//! W3C Direct Mapping specification.
//!
//! Usage: Run from the mapping project directory:
//!     cargo run --release --bin step-01

use std::collections::HashMap;
use std::fs;
use std::io::BufWriter;

use mysql::prelude::*;
use mysql::*;
use sophia::api::ns::{rdf, xsd};
use sophia::api::prelude::*;
use sophia::api::prefix::{Prefix, PrefixMapPair};
use sophia::api::term::SimpleTerm;
use sophia::turtle::serializer::turtle::{TurtleConfig, TurtleSerializer};

type MyGraph = Vec<[SimpleTerm<'static>; 3]>;

const BASE_IRI: &str = "http://example.org/migrants/";

/// Primary keys for each table.
const PRIMARY_KEYS: &[(&str, &str)] = &[
    ("location", "IDLocation"),
    ("migration_table", "IDMig"),
    ("organisation", "IDOrganisation"),
    ("person", "IDPerson"),
    ("person_profession", "IDProfPerson"),
    ("personnames", "IDPersonname"),
    ("relationship", "IDRel"),
    ("religions", "IDReligion"),
    ("work", "IDWork"),
];

/// Foreign key definitions: (table, column, referenced_table).
const FOREIGN_KEYS: &[(&str, &str, &str)] = &[
    ("migration_table", "IDPerson", "person"),
    ("migration_table", "IDStartPlace", "location"),
    ("migration_table", "IDDestPlace", "location"),
    ("organisation", "IDLocation", "location"),
    ("person", "IDBirthPlace", "location"),
    ("person", "IDDeathPlace", "location"),
    ("personnames", "IDPerson", "person"),
    ("person_profession", "IDPerson", "person"),
    ("relationship", "IDPerson_active", "person"),
    ("relationship", "IDPerson_passive", "person"),
    ("relationship", "IDLocation", "location"),
    ("relationship", "IDOrganisation", "organisation"),
    ("religions", "IDPerson", "person"),
    ("work", "IDPerson", "person"),
    ("work", "IDLocation", "location"),
    ("work", "IDOrganisation", "organisation"),
    ("work", "IDOrganisation2", "organisation"),
];

fn build_fk_map() -> HashMap<(&'static str, &'static str), &'static str> {
    let mut map = HashMap::new();
    for &(table, col, ref_table) in FOREIGN_KEYS {
        map.insert((table, col), ref_table);
    }
    map
}

fn row_iri(table: &str, pk_value: &str) -> SimpleTerm<'static> {
    let encoded = urlencoding::encode(pk_value);
    SimpleTerm::Iri(sophia::api::term::IriRef::new_unchecked(
        format!("{}{}/{}", BASE_IRI, table, encoded).into(),
    ))
}

fn class_iri(table: &str) -> SimpleTerm<'static> {
    SimpleTerm::Iri(sophia::api::term::IriRef::new_unchecked(
        format!("{}{}", BASE_IRI, table).into(),
    ))
}

fn column_iri(table: &str, column: &str) -> SimpleTerm<'static> {
    SimpleTerm::Iri(sophia::api::term::IriRef::new_unchecked(
        format!("{}{}#{}", BASE_IRI, table, column).into(),
    ))
}

fn ref_iri(table: &str, fk_col: &str) -> SimpleTerm<'static> {
    SimpleTerm::Iri(sophia::api::term::IriRef::new_unchecked(
        format!("{}{}#ref-{}", BASE_IRI, table, fk_col).into(),
    ))
}

fn rdf_type_term() -> SimpleTerm<'static> {
    SimpleTerm::Iri(sophia::api::term::IriRef::new_unchecked(
        rdf::type_.iri().unwrap().as_str().to_string().into(),
    ))
}

fn to_rdf_literal(value: &Value) -> Option<SimpleTerm<'static>> {
    match value {
        Value::NULL => None,
        Value::Int(i) => Some(SimpleTerm::LiteralDatatype(
            i.to_string().into(),
            sophia::api::term::IriRef::new_unchecked(
                xsd::integer.iri().unwrap().as_str().to_string().into(),
            ),
        )),
        Value::UInt(u) => Some(SimpleTerm::LiteralDatatype(
            u.to_string().into(),
            sophia::api::term::IriRef::new_unchecked(
                xsd::integer.iri().unwrap().as_str().to_string().into(),
            ),
        )),
        Value::Float(f) => Some(SimpleTerm::LiteralDatatype(
            f.to_string().into(),
            sophia::api::term::IriRef::new_unchecked(
                xsd::double.iri().unwrap().as_str().to_string().into(),
            ),
        )),
        Value::Double(d) => Some(SimpleTerm::LiteralDatatype(
            d.to_string().into(),
            sophia::api::term::IriRef::new_unchecked(
                xsd::double.iri().unwrap().as_str().to_string().into(),
            ),
        )),
        Value::Date(year, month, day, _, _, _, _) => Some(SimpleTerm::LiteralDatatype(
            format!("{:04}-{:02}-{:02}", year, month, day).into(),
            sophia::api::term::IriRef::new_unchecked(
                xsd::date.iri().unwrap().as_str().to_string().into(),
            ),
        )),
        Value::Bytes(b) => {
            let s = String::from_utf8_lossy(b).into_owned();
            Some(SimpleTerm::LiteralDatatype(
                s.into(),
                sophia::api::term::IriRef::new_unchecked(
                    xsd::string.iri().unwrap().as_str().to_string().into(),
                ),
            ))
        }
        _ => {
            // Treat everything else as a plain string literal
            let s: String = from_value(value.clone());
            Some(SimpleTerm::LiteralDatatype(
                s.into(),
                sophia::api::term::IriRef::new_unchecked(
                    xsd::string.iri().unwrap().as_str().to_string().into(),
                ),
            ))
        }
    }
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let fk_map = build_fk_map();

    let url = "mysql://migrants:1234@127.0.0.1:3306/migrants";
    let pool = Pool::new(url)?;
    let mut conn = pool.get_conn()?;

    let mut graph = MyGraph::new();
    let rdf_type = rdf_type_term();

    for &(table, pk_col) in PRIMARY_KEYS {
        let query = format!("SELECT * FROM `{}`", table);
        let rows: Vec<Row> = conn.query(query)?;

        for row in &rows {
            let pk_value: String = row
                .get::<Value, _>(pk_col)
                .map(|v| from_value::<String>(v))
                .unwrap_or_default();
            let subject = row_iri(table, &pk_value);

            // rdf:type
            graph.push([subject.clone(), rdf_type.clone(), class_iri(table)]);

            let columns = row.columns_ref();
            for (i, col_def) in columns.iter().enumerate() {
                let col_name = col_def.name_str().to_string();
                let value: Value = row.get::<Value, _>(i).unwrap_or(Value::NULL);

                if value == Value::NULL {
                    continue;
                }

                if let Some(&ref_table) = fk_map.get(&(table, col_name.as_str())) {
                    // Foreign key -> object property
                    let fk_value: String = from_value(value);
                    graph.push([
                        subject.clone(),
                        ref_iri(table, &col_name),
                        row_iri(ref_table, &fk_value),
                    ]);
                } else {
                    // Regular column -> datatype property
                    if let Some(literal) = to_rdf_literal(&value) {
                        graph.push([subject.clone(), column_iri(table, &col_name), literal]);
                    }
                }
            }
        }
    }

    let output_path = "data/graph-01.ttl";
    fs::create_dir_all("data")?;
    let file = fs::File::create(output_path)?;
    let writer = BufWriter::new(file);

    let mut prefix_map: Vec<PrefixMapPair> = TurtleConfig::default_prefix_map();
    prefix_map.push((
        Prefix::new_unchecked("rdf".into()),
        Iri::new_unchecked("http://www.w3.org/1999/02/22-rdf-syntax-ns#".into()),
    ));
    prefix_map.push((
        Prefix::new_unchecked("xsd".into()),
        Iri::new_unchecked("http://www.w3.org/2001/XMLSchema#".into()),
    ));
    let config = TurtleConfig::new().with_own_prefix_map(prefix_map);

    let mut serializer = TurtleSerializer::new_with_config(writer, config);
    serializer.serialize_graph(&graph)?;

    let count = graph.len();
    eprintln!("Wrote {} triples to {}", count, output_path);

    Ok(())
}
```
src/map/step_02.rs (new file, 84 lines)

@@ -0,0 +1,84 @@
```rust
//! Step 2: Apply SPARQL UPDATE queries to transform the RDF graph.
//!
//! Loads `data/graph-01.ttl`, applies all SPARQL UPDATE queries from the
//! `updates/` directory (sorted alphabetically), and writes the result
//! to `data/graph-02.ttl`.
//!
//! Usage: Run from the mapping project directory:
//!     cargo run --release --bin step-02

use std::fs;

use oxigraph::io::{RdfFormat, RdfParser};
use oxigraph::model::GraphNameRef;
use oxigraph::store::Store;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let input_path = "data/graph-01.ttl";
    let output_path = "data/graph-02.ttl";
    let updates_dir = "updates";

    // Create in-memory store and load input graph
    let store = Store::new()?;

    eprintln!("Loading graph from {}...", input_path);
    let input = fs::File::open(input_path)?;
    let reader = std::io::BufReader::new(input);
    let parser = RdfParser::from_format(RdfFormat::Turtle)
        .without_named_graphs()
        .with_default_graph(GraphNameRef::DefaultGraph);
    store.load_from_reader(parser, reader)?;

    let initial_count = count_triples(&store);
    eprintln!("Loaded {} triples.", initial_count);

    // Read and sort SPARQL UPDATE files
    let mut update_files: Vec<_> = fs::read_dir(updates_dir)?
        .filter_map(|e| e.ok())
        .map(|e| e.path())
        .filter(|p| {
            p.extension()
                .and_then(|e| e.to_str())
                .map_or(false, |e| e == "rq")
        })
        .collect();
    update_files.sort();

    // Apply each SPARQL UPDATE query
    for query_file in &update_files {
        let query = fs::read_to_string(query_file)?;
        let name = query_file
            .file_name()
            .and_then(|n| n.to_str())
            .unwrap_or("unknown");

        let before = count_triples(&store);
        store.update(&query)?;
        let after = count_triples(&store);

        let diff = after as i64 - before as i64;
        let sign = if diff >= 0 { "+" } else { "" };
        eprintln!(
            "Applied {}: {} -> {} triples ({}{})",
            name, before, after, sign, diff
        );
    }

    let final_count = count_triples(&store);
    eprintln!("Writing {} triples to {}...", final_count, output_path);

    // Dump store to Turtle
    fs::create_dir_all("data")?;
    let output = fs::File::create(output_path)?;
    let writer = std::io::BufWriter::new(output);
    store.dump_graph_to_writer(GraphNameRef::DefaultGraph, RdfFormat::Turtle, writer)?;

    eprintln!("Done.");
    Ok(())
}

fn count_triples(store: &Store) -> usize {
    store.quads_for_pattern(None, None, None, None).count()
}
```
src/schema.rb (new executable file, 40 lines)

@@ -0,0 +1,40 @@
```ruby
#!/usr/bin/env ruby
# frozen_string_literal: true

# Usage: ruby src/schema.rb TYPE [--graph FILE]
#   TYPE: classes | properties
#   FILE: path to an RDF/Turtle file (default: last data/graph-*.ttl)

require 'bundler/setup'
require 'rdf'
require 'rdf/turtle'
require 'sparql'

type = ARGV.shift
if !%w[classes properties].include?(type)
  abort "Usage: ruby src/schema.rb TYPE [--graph FILE]\n  TYPE: classes | properties"
end

graph_file = nil
if ARGV[0] == '--graph'
  ARGV.shift
  graph_file = ARGV.shift
end

graph_file ||= Dir.glob(File.join('data', 'graph-*.ttl')).sort.last

abort "No graph file found." if graph_file.nil? || !File.exist?(graph_file)

query_file = File.join(__dir__, '..', 'queries', "list_#{type}.sparql")
query = File.read(query_file)

$stderr.puts "Loading #{graph_file}..."
graph = RDF::Graph.load(graph_file)
$stderr.puts "Loaded #{graph.count} triples."

solutions = SPARQL.execute(query, graph)

key = type == 'classes' ? :class : :property
solutions.each do |solution|
  puts solution[key].to_s
end
```
updates/001-location-continent.rq (new file, 16 lines)

@@ -0,0 +1,16 @@
```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/location#Continent> ?val .
}
INSERT {
  ?s <http://example.org/migrants/location#Continent> ?obj .
  ?obj a <http://example.org/migrants/Continent> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/location#Continent> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Continent-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
```
updates/002-location-country.rq (new file, 16 lines)

@@ -0,0 +1,16 @@
```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/location#Country> ?val .
}
INSERT {
  ?s <http://example.org/migrants/location#Country> ?obj .
  ?obj a <http://example.org/migrants/Country> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/location#Country> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Country-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
```
updates/003-location-state.rq (new file, 16 lines)

@@ -0,0 +1,16 @@
```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/location#State> ?val .
}
INSERT {
  ?s <http://example.org/migrants/location#State> ?obj .
  ?obj a <http://example.org/migrants/State> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/location#State> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/State-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
```
updates/004-location-city.rq (new file, 16 lines)

@@ -0,0 +1,16 @@
```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/location#City> ?val .
}
INSERT {
  ?s <http://example.org/migrants/location#City> ?obj .
  ?obj a <http://example.org/migrants/City> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/location#City> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/City-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
```
updates/005-migration_table-reason.rq (new file, 16 lines)

@@ -0,0 +1,16 @@
```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/migration_table#reason> ?val .
}
INSERT {
  ?s <http://example.org/migrants/migration_table#reason> ?obj .
  ?obj a <http://example.org/migrants/MigrationReason> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/migration_table#reason> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/MigrationReason-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
```
updates/006-migration_table-reason2.rq (new file, 15 lines)

@@ -0,0 +1,15 @@
```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/migration_table#reason2> ?val .
}
INSERT {
  ?s <http://example.org/migrants/migration_table#reason2> ?obj .
  ?obj a <http://example.org/migrants/MigrationReason> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/migration_table#reason2> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/MigrationReason-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
```
updates/007-organisation-insttype.rq (new file, 16 lines)

@@ -0,0 +1,16 @@
```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/organisation#InstType> ?val .
}
INSERT {
  ?s <http://example.org/migrants/organisation#InstType> ?obj .
  ?obj a <http://example.org/migrants/InstitutionType> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/organisation#InstType> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/InstitutionType-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
```
16
updates/008-person-gender.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/person#gender> ?val .
}
INSERT {
  ?s <http://example.org/migrants/person#gender> ?obj .
  ?obj a <http://example.org/migrants/Gender> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/person#gender> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Gender-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/009-person-nametype.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/person#Nametype> ?val .
}
INSERT {
  ?s <http://example.org/migrants/person#Nametype> ?obj .
  ?obj a <http://example.org/migrants/Nametype> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/person#Nametype> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Nametype-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/010-person-importsource.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/person#Importsource> ?val .
}
INSERT {
  ?s <http://example.org/migrants/person#Importsource> ?obj .
  ?obj a <http://example.org/migrants/ImportSource> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/person#Importsource> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/ImportSource-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/011-person_profession-eprofession.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/person_profession#Eprofession> ?val .
}
INSERT {
  ?s <http://example.org/migrants/person_profession#Eprofession> ?obj .
  ?obj a <http://example.org/migrants/Profession> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/person_profession#Eprofession> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Profession-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/012-personnames-nametype.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/personnames#Nametype> ?val .
}
INSERT {
  ?s <http://example.org/migrants/personnames#Nametype> ?obj .
  ?obj a <http://example.org/migrants/Nametype> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/personnames#Nametype> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Nametype-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/013-relationship-relationshiptype.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/relationship#Relationshiptype> ?val .
}
INSERT {
  ?s <http://example.org/migrants/relationship#Relationshiptype> ?obj .
  ?obj a <http://example.org/migrants/RelationshipType> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/relationship#Relationshiptype> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/RelationshipType-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/014-relationship-relationshiptype_precise.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/relationship#relationshiptype_precise> ?val .
}
INSERT {
  ?s <http://example.org/migrants/relationship#relationshiptype_precise> ?obj .
  ?obj a <http://example.org/migrants/RelationshipTypePrecise> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/relationship#relationshiptype_precise> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/RelationshipTypePrecise-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/015-religions-religion.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/religions#religion> ?val .
}
INSERT {
  ?s <http://example.org/migrants/religions#religion> ?obj .
  ?obj a <http://example.org/migrants/Religion> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/religions#religion> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Religion-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/016-work-profession.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/work#Profession> ?val .
}
INSERT {
  ?s <http://example.org/migrants/work#Profession> ?obj .
  ?obj a <http://example.org/migrants/Profession> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/work#Profession> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Profession-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
15
updates/017-work-profession2.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/work#Profession2> ?val .
}
INSERT {
  ?s <http://example.org/migrants/work#Profession2> ?obj .
  ?obj a <http://example.org/migrants/Profession> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/work#Profession2> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Profession-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
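Note that `work#Profession2` mints its IRIs with the same `Profession-` prefix and class as `work#Profession` (and `work#Profession3` below), so values repeated across the three columns converge on a single shared resource. A hypothetical illustration (the subject IRI and the value `"actor"` are made up for the example):

```turtle
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

# Before (illustrative): the same value appears in two columns.
<http://example.org/migrants/work-7>
    <http://example.org/migrants/work#Profession>  "actor" ;
    <http://example.org/migrants/work#Profession2> "actor" .

# After updates 016 and 017: both predicates point at one shared resource.
<http://example.org/migrants/work-7>
    <http://example.org/migrants/work#Profession>  <http://example.org/migrants/Profession-actor> ;
    <http://example.org/migrants/work#Profession2> <http://example.org/migrants/Profession-actor> .
<http://example.org/migrants/Profession-actor>
    a <http://example.org/migrants/Profession> ;
    rdfs:label "actor"@en .
```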
15
updates/018-work-profession3.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/work#Profession3> ?val .
}
INSERT {
  ?s <http://example.org/migrants/work#Profession3> ?obj .
  ?obj a <http://example.org/migrants/Profession> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/work#Profession3> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/Profession-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}
16
updates/019-work-employmenttype.rq
Normal file

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

DELETE {
  ?s <http://example.org/migrants/work#EmploymentType> ?val .
}
INSERT {
  ?s <http://example.org/migrants/work#EmploymentType> ?obj .
  ?obj a <http://example.org/migrants/EmploymentType> .
  ?obj rdfs:label ?label .
}
WHERE {
  ?s <http://example.org/migrants/work#EmploymentType> ?val .
  BIND(IRI(CONCAT("http://example.org/migrants/EmploymentType-", ENCODE_FOR_URI(REPLACE(?val, " ", "")))) AS ?obj)
  BIND(STRLANG(STR(?val), "en") AS ?label)
}