Welcome to the NeSyAI4 Tutorial for Industry 4.0 Repository at the International Semantic Web Conference!
This repository serves as a resource for attendees of our hands-on tutorial on Neuro-Symbolic AI for Industry 4.0. In this tutorial, we explore the fusion of semantic web technology and machine learning, demonstrating practical applications in the context of Industry 4.0.
During the hands-on session at the International Semantic Web Conference, participants will gain practical experience in applying Neuro-Symbolic AI to Industry 4.0. They'll explore semantic technology, ontologies, and deep learning to address real-world manufacturing challenges, using Bosch scenarios as examples.
This tutorial is organized by Robert Bosch researchers in the field of Neuro-Symbolic AI and Industry 4.0:
- Diego Rincon-Yanez
  Affiliation: Bosch Center for AI
  Email: fixed-term.diego.rincon [at] de.bosch.com, rinconyanezd+iswc [at] gmail.com
- Irlan Grangel-Gonzalez
  Affiliation: Bosch Corporate Research
  Email: irlan.grangelgonzalez [at] de.bosch.com
- Mohamed H. Gad-elrab
  Affiliation: Bosch Corporate Research
  Email: Mohamed.Gad-Elrab [at] de.bosch.com
- Yuqicheng Zhu
  Affiliation: Bosch Center for AI
  Email: Yuqicheng.Zhu [at] de.bosch.com
- Evgeny Kharlamov
  Affiliation: Bosch Corporate Research
  Email: Evgeny.Kharlamov [at] de.bosch.com
Feel free to reach out to us with any questions or concerns regarding the tutorial. We look forward to your participation!
Make sure you have the following installed software before starting the hands-on session:
- Git Client
- Text Editor
- Docker Engine Instance
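To confirm the tooling is available before the session, a minimal shell sketch (it checks only git and docker, since the choice of text editor varies):

```shell
# Sketch: report which required command-line tools are missing from PATH.
check() {
  command -v "$1" >/dev/null 2>&1
}

for tool in git docker; do
  if check "$tool"; then
    echo "found: $tool"
  else
    echo "missing: $tool"
  fi
done
```

Note that the commands below assume Docker Compose V2, i.e. the compose subcommand of docker.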
Download the prepared compose.yaml file by cloning the repository (which also contains the example files), then start the services with Docker Compose:
git clone https://github.com/d1egoprog/ISWC23-I40Tutorial.git
docker compose -p iswc23-i40 up -d
The previous lines download the repository (in particular the example files) and automatically run the following Docker Compose file, whose services execute in a pipeline fashion:
version: '3.7'
services:
  mapping:
    build:
      context: ./1.${COMPONENT_1}/.
      args:
        - VERSION=${VERSION}
        - COMPONENT_NAME=${COMPONENT_1}
    environment:
      - ONTOLOGY=${ONTOLOGY}
      - OUTPUT_KG=${OUTPUT_KG}
      - INPUT_MAPPING=${INPUT_MAPPING}
    volumes:
      - ./files/datasources:/opt/${COMPONENT_1}/datasources
      - ./files/ontologies:/opt/${COMPONENT_1}/ontologies
      - ./files/mappings:/opt/${COMPONENT_1}/mappings
      - ./files/output:/opt/${COMPONENT_1}/output
    image: ${PROJECT_PREFIX}-${COMPONENT_1}:${VERSION}
  querying:
    environment:
      - INPUT_KG=${OUTPUT_KG}_merged.ttl
    volumes:
      - ./files/output:/opt/jena-fuseki/output
      - ./files/logs:/opt/jena-fuseki/logs
    ports:
      - 3030:3030
    image: d1egoprog/jena-fuseki:4.9.0-web
    depends_on:
      mapping:
        condition: service_completed_successfully
  visualization:
    image: d1egoprog/webvowl:1.1.7
    ports:
      - 8080:8080
    restart: always
    depends_on:
      mapping:
        condition: service_completed_successfully
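The ${...} substitutions above are read from an environment file in the repository. The values below are illustrative placeholders only, sketched to show how the variables fit together; the actual values are defined in the repository itself:

```shell
# Illustrative placeholders only -- the repository ships the actual values.
PROJECT_PREFIX=iswc23-i40
VERSION=1.0
COMPONENT_1=mapping
ONTOLOGY=example.owl
OUTPUT_KG=example_kg
INPUT_MAPPING=example.rml
```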
To run on Apple Silicon (M1 and M2) machines, the parameter platform: linux/amd64 needs to be added to the service in the Docker Compose file, at the same level as the build statement:
services:
  mapping:
    platform: linux/amd64
    build:
      context: ./1.${COMPONENT_1}/.
      args:
        - VERSION=${VERSION}
        - COMPONENT_NAME=${COMPONENT_1}
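Once the stack is up, the querying service exposes a Jena Fuseki SPARQL endpoint on port 3030 (and WebVOWL on port 8080). A minimal Python sketch for sending a query from your own scripts; the dataset name ds is an assumption, so check the Fuseki UI at http://localhost:3030 for the dataset actually created by the stack:

```python
import json
import urllib.parse
import urllib.request

# The dataset name 'ds' is an assumption; check the Fuseki UI at
# http://localhost:3030 for the dataset actually created by the stack.
FUSEKI_URL = "http://localhost:3030/ds/sparql"

def build_request(endpoint, sparql):
    """Encode a SPARQL query as a form-urlencoded POST request."""
    data = urllib.parse.urlencode({"query": sparql}).encode("utf-8")
    return urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )

def run_query(endpoint, sparql):
    """Send the query and return the JSON result bindings."""
    with urllib.request.urlopen(build_request(endpoint, sparql)) as resp:
        return json.load(resp)["results"]["bindings"]

# Requires the compose stack to be running:
# for b in run_query(FUSEKI_URL, "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 5"):
#     print(b["s"]["value"], b["p"]["value"], b["o"]["value"])
```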
We welcome your feedback and contributions to this repository. If you have suggestions, improvements, or additional resources to share, please feel free to open issues, submit pull requests, or reach out to the tutorial organizers.
Enjoy your learning journey with Neuro-Symbolic AI for Industry 4.0!