Glossary

Term Definitions 

(Simpl-)Agent: An agent serves as a local gateway to the data space services. 

Access Policy: Access policies are configuration entities that represent how the provider wants to control access to its service. The access policy allows the provider to configure operations, permissions, access rules, and selection rules that restrict or allow access to a service. 
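
As a minimal sketch of what such a configuration entity could look like, the following Python structure is purely illustrative; the field names (operations, permissions, access_rules, selection_rules) mirror the wording above and are not taken from any official Simpl schema.

```python
# Hypothetical access policy sketch; field names are illustrative only
# and do not correspond to any official Simpl schema.
access_policy = {
    "service": "weather-data-api",          # service the policy protects
    "operations": ["read", "search"],       # operations the policy covers
    "permissions": ["consumer-tier-1"],     # roles/tiers granted access
    "access_rules": [
        # allow only participants of a given data space
        {"attribute": "dataspace_membership", "operator": "equals", "value": "example-dataspace"},
    ],
    "selection_rules": [
        # restrict access to onboarded participants only
        {"attribute": "onboarding_status", "operator": "equals", "value": "approved"},
    ],
}
```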

Data Quality Rules: Data quality rules can be defined as a set of guidelines, standards, or criteria used to assess and ensure the quality of data within a system or organization. 

Explanatory text:   
Data quality rules are a formal and structured definition of the required quality of the self-description. These rules cover completeness, consistency, correctness, and other quality aspects, e.g., those defined in ISO 25012. The checks go beyond semantic conformance, for example by identifying inconsistencies between data instances, detecting redundant information, or verifying data integrity. Data quality rules can be classified into mandatory rules and recommended rules. Mandatory rules need to be fulfilled completely for all instances, e.g., a field needs to comply with a specific format. For recommended rules, a threshold can be defined for how many instances need to fulfil the rule, e.g., at least 6 of 10 fields need to exist in the self-description. 
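
The following is a minimal sketch of how a mandatory rule and a recommended rule with a threshold could be evaluated; the self-description fields, rule functions, and threshold value are illustrative assumptions, not part of the Simpl specification.

```python
import re

# Illustrative self-description instance; field names are hypothetical.
self_description = {
    "title": "Weather data service",
    "contact_email": "ops@example.org",
    "licence": "CC-BY-4.0",
}

# Mandatory rule: must hold for every instance,
# e.g. a field needs to comply with a specific format.
def check_mandatory_email_format(instance):
    value = instance.get("contact_email", "")
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", value) is not None

# Recommended rule: a threshold defines how many of the listed
# fields must be present, e.g. at least 6 out of 10 fields.
def check_recommended_fields(instance, fields, threshold):
    present = sum(1 for field in fields if field in instance)
    return present >= threshold

recommended_fields = ["title", "description", "licence", "keywords",
                      "publisher", "contact_email", "version",
                      "language", "spatial_coverage", "temporal_coverage"]

print(check_mandatory_email_format(self_description))                     # True
print(check_recommended_fields(self_description, recommended_fields, 6))  # False: only 3 of 10 fields exist
```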

Data space catalogue: The catalogue enables the discovery of services within the data space.  

Dataspace: A distributed system defined by a governance framework that enables secure and trustworthy data transactions between participants while supporting trust and data sovereignty. A data space is implemented by one or more infrastructures and enables one or more use cases. 

Local preferred authentication mechanism: A local preferred authentication mechanism is a method or process chosen by a system or organization for verifying the identity of users, devices, or entities within a specific, localized environment. 

Ontology: An ontology is a formalized and structured representation of knowledge within a specific domain. It includes the concepts (also called the vocabulary) as well as the relationships between the concepts. For automatic validation, the ontology should be provided as an RDF Schema (RDFS) or Web Ontology Language (OWL) specification. 
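
As an illustration only, the sketch below loads a tiny invented RDFS ontology with the Python rdflib library and lists its classes; the ontology content and the example.org namespace are made up, and rdflib is used merely as one possible tool, not as part of the Simpl stack.

```python
from rdflib import Graph, RDF, RDFS

# A tiny invented ontology in Turtle: one class and one property,
# purely for illustration.
ontology_ttl = """
@prefix ex:   <http://example.org/vocab#> .
@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

ex:DataService a rdfs:Class ;
    rdfs:label "Data Service" .

ex:providedBy a rdf:Property ;
    rdfs:domain ex:DataService ;
    rdfs:label "provided by" .
"""

g = Graph()
g.parse(data=ontology_ttl, format="turtle")

# List the concepts (classes) defined in the ontology.
for cls in g.subjects(RDF.type, RDFS.Class):
    print("class:", cls)
```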

Quality validation: Quality Validation assesses the completeness, accuracy, consistency, and reliability of the resource description. It ensures that the data meets predefined quality standards and is suitable for its intended use. 

Resource Description: Resource description refers to the metadata records that describe various resources, allowing identification, searchability, access management, preservation, and prediction of resource behaviour. These resources can include data, applications, or infrastructure. 
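
A minimal, hypothetical example of such a metadata record is sketched below; the fields are loosely inspired by common dataset metadata vocabularies (e.g., DCAT) and are illustrative only.

```python
# Hypothetical resource description (metadata record); fields are
# illustrative only and not taken from any specific Simpl schema.
resource_description = {
    "identifier": "urn:uuid:00000000-0000-0000-0000-000000000000",  # identification
    "title": "Hourly weather observations",
    "keywords": ["weather", "observations", "hourly"],              # searchability
    "access_rights": "restricted",                                  # access management
    "format": "application/json",
    "issued": "2024-01-15",                                         # preservation context
    "update_frequency": "hourly",                                   # expected behaviour
}
```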

Security credentials: Security credentials are the authentication details used to verify the identity and access rights of a user or system within a data space (e.g. passwords). 

Semantic validation: Semantic validation involves verifying that the resource description not only conforms to the correct syntax but also makes logical sense and adheres to business rules and constraints. It checks for meaningfulness and correctness within the data context. 
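
As a sketch of the difference from syntax checking, the example below flags values that are syntactically valid but violate simple business rules; the field names and rules are hypothetical.

```python
from datetime import date

# Hypothetical resource description fragment; field names are illustrative.
description = {
    "available_from": date(2024, 6, 1),
    "available_until": date(2024, 1, 31),   # syntactically valid date, semantically wrong
    "max_consumers": -5,                    # syntactically an integer, but meaningless
}

def semantic_checks(desc):
    """Return a list of violated business rules (illustrative only)."""
    violations = []
    if desc["available_until"] < desc["available_from"]:
        violations.append("availability period ends before it starts")
    if desc["max_consumers"] < 0:
        violations.append("maximum number of consumers cannot be negative")
    return violations

print(semantic_checks(description))
```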

Syntax validation: Syntax validation refers to the process of checking if the data or code conforms to the defined syntactical rules and formats. It ensures that the input is correctly structured according to the required grammar and patterns. 

Example: Ensuring that a JSON file is properly formatted with matching brackets and correct key-value pairs. 
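
A minimal sketch of this JSON example using only the Python standard library (no Simpl-specific code is implied):

```python
import json

# Malformed JSON: the closing bracket is missing, as in the example above.
candidate = '{"name": "weather-data-api", "version": "1.0"'

try:
    json.loads(candidate)
    print("syntactically valid JSON")
except json.JSONDecodeError as err:
    # Syntax validation fails before any semantic or quality checks run.
    print(f"syntax error: {err}")
```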

Usage Policy: Usage policies are configuration entities that represent how the provider wants to control the usage behaviour of the consumed service. The usage policy allows the provider to configure how the service's content may be processed, transferred, and repurposed. 
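
A minimal, hypothetical sketch of such a usage policy is shown below; the field names (content_processing, transfer, repurposing) echo the wording above and are not taken from any official Simpl schema.

```python
# Hypothetical usage policy sketch; field names are illustrative only
# and do not correspond to any official Simpl schema.
usage_policy = {
    "service": "weather-data-api",
    "content_processing": {"allowed": True, "anonymisation_required": True},
    "transfer": {"allowed": False},          # content may not be passed on to third parties
    "repurposing": {"allowed": True, "purposes": ["research", "internal-analytics"]},
    "valid_until": "2025-12-31",
}
```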

Vocabulary Provider: The vocabulary provider supplies the ontologies and vocabularies that are standardised in the data space.