Published in

Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication - UbiComp '13 Adjunct

DOI: 10.1145/2494091.2499217

Rapid prototyping of semantic applications in smart spaces with a visual rule language

This paper is available in a repository.


Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

One of the major limitations of Ambient Intelligence systems today is the lack of semantic models of human behavior and of the environment that would allow the system to recognize the specific activity being performed by the users and act accordingly. In this context, we address the general problem of knowledge representation in Smart Spaces. In order to monitor and act on human behavior in intelligent environments, we design a visual language that is simple and flexible enough to be managed by non-expert users, thus facilitating the programming of the environment. The prototype of the visual language is used to represent rules about human behavior, making the Smart Space more usable. These rules can be mapped into SPARQL queries and rule subscriptions. In addition, we add support for representing imprecise and fuzzy knowledge. The proposed general-domain language can help with resource allocation, assistance for people with special needs, remote monitoring and other domains.
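To give a rough sense of the mapping the abstract describes, the sketch below shows how a single behavior rule (for example, "a person is in the kitchen while the stove has been on for more than 30 minutes") might be expressed as a SPARQL query over an RDF model of the smart space, using Python and rdflib. The namespace, the properties (ss:locatedIn, ss:stoveOnMinutes) and the toy facts are hypothetical illustrations under assumed vocabulary, not the ontology or implementation used in the paper.

# A minimal sketch (not the paper's implementation) of mapping a visual
# behavior rule onto a SPARQL query with rdflib. The namespace and the
# properties (ss:locatedIn, ss:stoveOnMinutes) are hypothetical.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF

SS = Namespace("http://example.org/smartspace#")

g = Graph()
# Toy facts standing in for sensor-derived knowledge about the space.
g.add((SS.alice, RDF.type, SS.Person))
g.add((SS.alice, SS.locatedIn, SS.kitchen))
g.add((SS.kitchen, RDF.type, SS.Kitchen))
g.add((SS.stove1, SS.locatedIn, SS.kitchen))
g.add((SS.stove1, SS.stoveOnMinutes, Literal(45)))

# Rule: "person in the kitchen while the stove has been on > 30 minutes".
query = """
PREFIX ss: <http://example.org/smartspace#>
SELECT ?person ?stove
WHERE {
    ?person a ss:Person ;
            ss:locatedIn ?room .
    ?room a ss:Kitchen .
    ?stove ss:locatedIn ?room ;
           ss:stoveOnMinutes ?mins .
    FILTER (?mins > 30)
}
"""

for person, stove in g.query(query):
    print(f"Alert: {person} near {stove} with the stove on too long")

In a deployed system, a query like this would presumably be registered as a subscription so the rule fires whenever new sensor facts make the pattern match; the fuzzy extensions mentioned in the abstract would relax crisp thresholds such as the 30-minute cut-off.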