Semantic-Driven Interoperability, Embedded AI, and Hardware/Software Integration in SMARTY
Service Mediation via Semantic Web Mappings
The SMARTY Project addresses interoperability challenges through declarative, semantic-web-based service mediation, resolving both syntactic and semantic mismatches across systems. Traditional any-to-any mappings require a pairwise translation between every two systems, so integration effort grows quadratically with the number of systems. Instead, SMARTY adopts an any-to-one approach, where a central RDF-based reference model serves as a semantic pivot: each system needs only one mapping, to and from the pivot.
- Lifting & Lowering: Data from heterogeneous formats (e.g., JSON, XML) is transformed into RDF (lifting) using standardized vocabularies, then converted back to target formats (lowering) while preserving semantics.
- Toolchain: The open-source Chimera framework (built on Apache Camel) implements this mediation, supporting protocols such as HTTP, MQTT, and WebSocket. Key components include:
  - Graph Component: Executes SPARQL queries and RDF validation (SHACL).
  - RML/MTL Components: Map data to and from RDF using RML or the optimized Mapping Template Language (MTL).
This approach ensures seamless integration across partners’ systems, critical for use cases like secure edge communications.

Embedded AI and NLP for Cooperative Perception
SMARTY integrates knowledge graphs (KGs) with embedded AI to enable context-aware, multimodal perception in edge environments (e.g., smart mobility). Key innovations include:
- Ontology Alignment: Unifying standards like SOSA (sensors), VSS (vehicular signals), and OpenXOntology to represent real-time driving scenes.
- Symbolic + LLM Reasoning: Combines OWL rules and SPARQL queries with LLMs for tasks such as:
  - Generating natural-language event summaries (e.g., “vehicle approaching from blind spot”).
  - Enhancing Retrieval-Augmented Generation (RAG) via graph embeddings (e.g., ULTRA).
- Validation: Scenarios are simulated in CARLA with multimodal datasets, evaluating metrics such as LLM grounding accuracy and system responsiveness.
This fusion of symbolic AI and LLMs bridges perception with decision-making, enabling safer autonomous systems.

Hardware/Software Integration via Ontologies
SMARTY’s semantic layer ensures interoperability across hardware accelerators and software stacks:
- Quantum-Resistant Secure Element (QR-SE):
  - Hardware: Describes interfaces (I2C, SPI), algorithms (KYBER, DILITHIUM), and power metrics.
  - Software: Captures APIs, toolchains, and licensing for seamless integration.
- Ultra-Low Power Edge Processor:
  - Tracks firmware versions, cryptographic functionalities, and driver compatibility to optimize PQC/Edge-AI workloads.
- PQC Accelerator (BSC):
  - Uses ontologies to model register states, memory regions, and execution pipelines, enabling automated security checks and OTA updates.
- SafeSU Integration:
  - Monitors AXI-4 bus contention in the SELENE SoC, enforcing quotas to guarantee timing for critical applications.

Conclusions
The D3.1 report establishes a foundation for secure, interoperable communication systems by combining hardware accelerators with semantic web technologies. Key outcomes include:
- Standardized Mediation: The any-to-one RDF model reduces integration complexity.
- AI-Ready KGs: Unified ontologies enable cooperative perception and LLM-enhanced reasoning.
- Observable Hardware: Semantic descriptions improve system verification and resource management.
As SMARTY progresses, these methodologies will guide scalable deployments in defense, IoT, and smart mobility, setting benchmarks for quantum-resistant, low-latency networks.
