Vincenzo Di Lecce
Role
Associate Professor
Organization
Politecnico di Bari
Department
Dipartimento di Ingegneria Elettrica e dell'Informazione
Scientific Area
Area 09 - Industrial and Information Engineering
Scientific Disciplinary Sector
ING-INF/05 - Information Processing Systems
ERC Sector, 1st Level
PE - Physical sciences and engineering
ERC Sector, 2nd Level
PE6 Computer Science and Informatics: Informatics and information systems, computer science, scientific computing, intelligent systems
ERC Sector, 3rd Level
PE6_1 Computer architecture, pervasive computing, ubiquitous computing
The aim of this work is to present a model of an intelligent short-term demand side management (DSM) system based on a distributed measurement and management data system. The system is designed to improve the profitability of modern self-production energy plants by reducing power consumption while maintaining the same comfort level for users. The DSM problem is modeled as an auction-based multi-agent system. The proposed system is composed of a sensor network and a central processing unit. Each network node is handled by an agent and is able to regulate the power consumption of a single environment (in this work, a room of a public building). Each agent reacts to a new critical condition by entering into competition with the others to gain access to a shared, limited resource. The competition is regulated by an auction-based system. As the first experimental results show, the proposed system can be the consumer’s key to maximizing the profitability of self-production energy plants.
This work presents an intelligent demand side management (DSM) system modeled as an auction-based multi-agent system (MAS). The system is designed to improve the sustainability of energy self-production systems through energy-saving features while guaranteeing the user's desired comfort level. The proposed system is composed of a sensor network and a central processing unit. Each network node is handled by an agent and is able to regulate the power consumption of a single environment (e.g., a room). The first live tests were carried out within a public building. Results seem promising for maximizing the sustainability as well as the profitability of self-production energy systems.
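The auction mechanism described in the two abstracts above can be illustrated with a minimal sketch: each room agent submits a bid reflecting its urgency, and the shared power budget is granted to the highest bidders first. All names, the bidding rule, and the numbers are illustrative assumptions, not details from the papers.

```python
from dataclasses import dataclass

@dataclass
class RoomAgent:
    """Hypothetical agent controlling one room's load."""
    name: str
    requested_kw: float
    bid: float  # urgency-derived bid, e.g. distance from the comfort setpoint

def run_auction(agents, available_kw):
    """Grant the shared, limited power budget to the highest bidders first."""
    granted = {}
    for agent in sorted(agents, key=lambda a: a.bid, reverse=True):
        take = min(agent.requested_kw, available_kw)
        granted[agent.name] = take
        available_kw -= take
    return granted

rooms = [RoomAgent("room-101", 2.0, bid=0.9),
         RoomAgent("room-102", 1.5, bid=0.4),
         RoomAgent("room-103", 1.0, bid=0.7)]
print(run_auction(rooms, available_kw=3.0))
```

With a 3 kW budget, the two most urgent rooms are served and the least urgent one is deferred; more elaborate pricing or partial-allocation rules would slot into `run_auction` without changing the agent interface.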
In this paper, we spotlight the issue of treating precision and accuracy as concepts that apply to the interpretation of measurements, and not only to the measurements themselves. Apart from microscale dimensions, where the Heisenberg principle can be a matter of concern, when we measure a physical quantity we normally assume that our instrumental equipment plus the measurement context do not significantly affect the observed measurands. Then, by means of repeated experiments, we make use of precision and accuracy assessments to provide an estimate of the quality of the measure, i.e., its closeness to the true value. In this context, one issue that deserves attention is the influence the context has on the interpretation of the values of the measurands. This problem is strictly related to information processing at the application level. Consciously far from claiming to provide a complete picture, we propose some reflections drawn from experience gained in the field of computational models for enhancing selectivity in highly cross-sensitive sensors. The main point is that, hidden behind every form of interpretation, there is the need for a knowledge model sufficiently robust to encapsulate the multi-level granular context variables affecting the measures. Our prediction is that next-generation smart sensor technologies will have to deal with such knowledge-related issues on a wider scale. Since the market is moving towards ever more “intelligent” measures, a preliminary assessment of these problems ought to be pursued.
This work presents ongoing research aimed at interpreting the responses of non-selective gas sensors (such as metal oxide resistive ones) in terms of simple IF-THEN rules. In particular, it is shown how a logical combination of the outputs of three extremely low-cost sensors, namely the MQ131, MQ136 and TGS2602, can be arranged to produce IF-THEN inferences able to discriminate among CO, SO2 and NH3 emissions. The outcome is quantitatively similar to that obtained with highly selective and costly chemical sensors. The experimental results, albeit grounded in an empirical base, seem to support the idea that smart compositions of low-cost sensors can manifest surprising discrimination abilities.
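The rule-based fusion idea above can be sketched as a plain IF-THEN cascade over the three normalized sensor responses. The thresholds and the specific rule patterns below are hypothetical placeholders; the paper's actual rules would come from its empirical calibration.

```python
def classify_gas(mq131, mq136, tgs2602, thr=0.5):
    """Illustrative IF-THEN fusion of three normalized (0..1) sensor
    responses. Thresholds and rule shapes are assumptions, not the
    calibrated rules from the paper."""
    if mq131 > thr and mq136 <= thr and tgs2602 > thr:
        return "CO"
    if mq136 > thr and tgs2602 <= thr:
        return "SO2"
    if tgs2602 > thr and mq131 <= thr:
        return "NH3"
    return "unknown"

print(classify_gas(0.8, 0.2, 0.7))  # pattern assigned to CO above
```

The point of the sketch is structural: each target gas is identified not by any single non-selective sensor but by the joint pattern of all three responses.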
The collaboration of several people with different cultures and knowledge backgrounds leads to well-known problems related to the terminology used among them and, more generally, to the reference ontology. The living labs approach can also be used to produce a convergence vocabulary. Hereinafter, the experience related to our activity is presented; it was made possible thanks to public funding for the Puglia tremor project aimed at territorial monitoring.
In this paper, building on the results of previous studies, we present an information processing architecture that aims to lessen the distance between sensors and applications in as unsupervised a way as possible. With the impressive growth of today’s computational and miniaturization technologies, it is likely that the distance between sensors and applications could be covered by one single device or system only. In practice, however, experts such as system integrators or engineers are often needed to fill the gap. The proposed computational model uses the notion of ‘holon’, a concept that is well suited for multi-level information processing, from the raw data level up to the knowledge-oriented information level. For test purposes, a data analysis platform called H-GIS, based on the current proposal, has been employed and a case study is discussed.
A granule is any atomic element that is not distinguishable from its peers by manifest features but only by the fact that it represents a singleton (possibly overarching a subset of elements) among other singletons. The importance of the granule in Computational Intelligence (CI) is testified by the recent development of Granular Computing (GrC), whose aim is to provide computational methodologies and tools to properly handle information processing at different granularity levels. One important aspect, sometimes overlooked by mainstream research in GrC, is the way interpretations are hidden in observational data at multiple granule scales. It is often the case, in fact, that certain patterns showing coarse statistical evidence at a given observation level have a number of well-defined rules of interpretation at a finer granule level. Currently available CI tools seem to be lacking on this point. This work reports on the experience gained in developing a CI tool for data analysis named H-GIS (Holonic-Granularity Inference System). The tool is specifically conceived to focus on measurement data interpretation at multiple granularity scales by employing the modeling framework of so-called holonic systems.
In this work, we present M-DUST, a novel low-cost, real-time smart monitoring sensor for Particulate Matter (PM) emission measurement. It is based on the Tyndall scattering process to measure particle concentration. A comparison of different methods for evaluating particle concentration is discussed. A mechanical filter is used to select the particulate matter with the appropriate cut-off aerodynamic diameter. The presented device is an intelligent sensor thanks to features such as self-diagnosis, self-adaptation and transparency of the communication interface. Tests were carried out in the Italian city of Taranto using non-toxic substances and an analysis chamber.
The holon is a powerful metaphor which captures the recursive structure of biological systems and the organization of their decision processes arranged at various granularity abstraction levels. From a computational intelligence perspective, a holon can be conceived as a goal-oriented community of lower-level holons led by more specific targets. Sub-holons co-operate on sub-problems that represent the source problem at a lower knowledge abstraction level. Such a (recursive) hierarchical organization constitutes the so-called holarchy. Holonic thinking is hence particularly suited for modeling complex and intelligent systems: in particular, its success has been proved in the field of Intelligent Manufacturing. Nevertheless, although hierarchical and granular thinking are two fundamental prerequisites in Software Engineering, the use of holonic thinking as a software paradigm is still lacking in the literature at the moment. In this regard, the paper introduces the concept of ‘holonic granule’ as a novel software building block for modeling complex granular systems. Prospective applications of holonic granule-based software models are then discussed, with particular emphasis on industrial automation and environmental monitoring settings.
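The part/whole duality of the holon described above lends itself to a small recursive sketch: each node is simultaneously a whole (aggregating its sub-holons) and a part (answering to its parent). The class name, the tree, and the mean-based aggregation rule are illustrative assumptions, not the paper's 'holonic granule' design.

```python
class Holon:
    """Minimal sketch of a holarchy node: a leaf carries its own value,
    while a composite delegates to its parts and aggregates the results."""
    def __init__(self, name, value=None):
        self.name = name
        self.value = value
        self.parts = []

    def add(self, part):
        """Attach a sub-holon and return it, to ease tree building."""
        self.parts.append(part)
        return part

    def resolve(self):
        """Recursively resolve the holarchy: leaves answer directly,
        composites aggregate their parts (here, by the mean)."""
        if not self.parts:
            return self.value
        results = [p.resolve() for p in self.parts]
        return sum(results) / len(results)

plant = Holon("plant")
line = plant.add(Holon("line-1"))
line.add(Holon("sensor-a", 10.0))
line.add(Holon("sensor-b", 20.0))
print(plant.resolve())
```

Because every level exposes the same `resolve()` interface, a node can be treated as an atomic granule from above while remaining a full community of sub-holons from within, which is the recursive property the abstract emphasizes.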
This work introduces a novel perspective in the study of smart sensor technology. The final aim is to develop a new methodology that supports the conception, design and implementation of complex sensor-based systems in a more structured and information-oriented way. A smart sensor can be considered a hardware/software transducer able to bring the measured physical signal(s) to the application level. However, when viewed through the lens of artificial intelligence, sensor ‘smartness’ appears to sit between mere transduction and complex post-processing, with the boundary purposely left blurry and undetermined. Thanks to recent literature findings on the so-called ‘holonic systems’, a more precise characterization and modeling of the smart sensor is provided. A ‘holon’ is a bio-inspired conceptual and computational entity that, like a cell in a living organism, plays the two roles of a part and a whole at the same time. To provide evidence of the advantages of the holonic approach, an example smart application and a related prototype implementation for the disambiguation of low-cost gas sensor responses are shown. The proposed approach unravels the inherent complexity of the disambiguation problem by means of a scalable architecture entirely based on holonic-inspired criteria. Furthermore, the overall setup is economically competitive with other highly selective (hence high-cost) sensor-based solutions.
Rapid developments in smart sensor technologies envisage a new era where information handling and knowledge sharing will play a crucial role. Traditional sensors were conceived as simple hardware transducers of physical quantities into measurable signals, possibly requiring an analogue/digital conversion to make data available to software applications. The IEEE 1451 family of standards has added to mere transduction some architectural prescriptions, mainly to address the issue of connection transparency, a desirable property virtually making any sensor a plug-and-play device. Our view is that next-generation smart sensor-based systems will have to face another challenge: the need to endow devices with the ability to process application-level bits of knowledge to best accomplish their informative goals. As a result, unexpected proactive and dialogue-oriented behaviors will have to be taken into account, thus reducing the gap between what we commonly refer to as smart sensors and intelligent agents. In order to support this view, a semantic-driven sensor-based system architecture is introduced and an example proof-of-concept case study is discussed.
Advances in remote sensing technology are now providing tools to support geospatial mapping of soil properties for application to the management of agriculture and the environment. In this paper, results of visible and near-IR spectral reflectance are presented and discussed. A difficulty in the reliable evaluation of organic matter in the soil is the absence of a specific signature; this concept arose out of an observation widely shared by the scientific community in this regard. The obtained results show that a morphologic approach based on an experimental distance model is an appropriate and efficient method to deal with this matter.