Roberto Bellotti
Role
Full Professor
Organization
Università degli Studi di Bari Aldo Moro
Department
DIPARTIMENTO INTERATENEO DI FISICA
Scientific Area
AREA 02 - Physical sciences
Scientific Disciplinary Sector
FIS/07 - Applied Physics (to Cultural Heritage, the Environment, Biology and Medicine)
ERC Sector, 1st level
Not available
ERC Sector, 2nd level
Not available
ERC Sector, 3rd level
Not available
We propose a dynamical model for the estimation of operational risk in banking institutions. Operational risk is the risk that a financial loss occurs as the result of failed processes. Examples of operational losses are losses generated by internal fraud, human error and failed transactions. In order to encompass the most heterogeneous set of processes, in our approach the losses of each process are generated by the interplay among random noise, interactions with other processes and the efforts the bank makes to avoid losses. We show how some relevant parameters of the model can be estimated from a database of historical operational losses, validate the estimation procedure and test the forecasting power of the model. Some advantages of our approach over the traditional statistical techniques are that it allows us to follow the whole time evolution of the losses and to take into account different-time correlations among the processes.
A novel dynamical model for the study of operational risk in banks, suitable for the calculation of the Value at Risk (VaR), is proposed. The equation of motion takes into account the interactions among the bank's different processes, the spontaneous generation of losses via a noise term and the efforts made by the bank to avoid their occurrence. Since the model is very general, it can be tailored to the internal organizational structure of a specific bank by estimating some of its parameters from historical operational losses. The model is exactly solved in the case in which there are no causal loops in the matrix of couplings, and it is shown how the solution can be exploited to also estimate the parameters of the noise. The forecasting power of the model is investigated by using a fraction f of simulated data to estimate the parameters, showing that for f = 0.75 the VaR can be forecast with an error ≃ 10⁻³.
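To make the mechanism concrete, a minimal sketch of such a simulation is given below. It assumes a generic discrete-time loss process with illustrative couplings, noise amplitude and effort parameter; it is not the model or the parameter values used in the papers above, only an illustration of reading off a VaR from simulated aggregate losses.

```python
import numpy as np

# Minimal sketch, NOT the published model: coupled loss processes driven by
# noise, mutual couplings and a loss-avoidance effort, with the Value at Risk
# (VaR) read off as a high quantile of the simulated aggregate loss.
# All parameter values below are illustrative assumptions.

rng = np.random.default_rng(0)
n_proc, n_steps, n_runs = 5, 250, 2000

J = 0.05 * rng.random((n_proc, n_proc))   # couplings between processes (assumed)
np.fill_diagonal(J, 0.0)                  # no self-coupling in this toy version
effort = 0.02                             # bank's loss-avoidance effort (assumed)
sigma = 0.10                              # noise amplitude (assumed)

total_losses = np.empty(n_runs)
for r in range(n_runs):
    x = np.zeros(n_proc)                  # latent loss drivers of each process
    cumulated = 0.0
    for _ in range(n_steps):
        noise = sigma * rng.standard_normal(n_proc)
        x = J @ np.maximum(x, 0.0) + noise - effort
        cumulated += np.maximum(x, 0.0).sum()   # only positive drivers produce losses
    total_losses[r] = cumulated

var_99 = np.quantile(total_losses, 0.99)  # 99% VaR of the simulated aggregate loss
print(f"Estimated 99% VaR: {var_99:.2f}")
```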
We evaluate the exposure during nadir observations with JEM-EUSO, the Extreme Universe Space Observatory, on-board the Japanese Experiment Module of the International Space Station. Designed as a mission to explore the extreme energy Universe from space, JEM-EUSO will monitor the Earth's nighttime atmosphere to record the ultraviolet light from tracks generated by extensive air showers initiated by ultra-high energy cosmic rays. In the present work, we discuss the particularities of space-based observation and we compute the annual exposure in nadir observation. The results are based on studies of the expected trigger aperture and observational duty cycle, as well as on investigations of the effects of clouds and different types of background light. We show that the annual exposure is about one order of magnitude higher than those of the presently operating ground-based observatories.
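As a rough illustration of how an annual exposure combines the quantities mentioned above, the sketch below multiplies an instantaneous aperture by a duty cycle, a cloud-related efficiency and the observation time; every number is a placeholder, not a JEM-EUSO estimate.

```python
# Back-of-the-envelope sketch: annual exposure as the product of an
# instantaneous aperture, an observational duty cycle, a cloud-related
# efficiency and the observation time. All numbers are placeholders,
# not the values obtained in the JEM-EUSO study.

aperture_km2_sr = 4.0e5      # instantaneous aperture at the highest energies (assumed)
duty_cycle = 0.20            # fraction of time with a dark enough atmosphere (assumed)
cloud_efficiency = 0.70      # fraction of events still usable with clouds (assumed)
seconds_per_year = 3.156e7

annual_exposure = aperture_km2_sr * duty_cycle * cloud_efficiency * seconds_per_year
print(f"Annual exposure ~ {annual_exposure:.2e} km^2 sr s")
```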
Background: In the framework of the clinical validation of research tools, this investigation presents a validation study of an automatic medial temporal lobe atrophy measure that is applied to a naturalistic population sampled from memory clinic patients across Europe. Methods: The procedure was developed on 1.5-T magnetic resonance images from the Alzheimer's Disease Neuroimaging Initiative database, and it was validated on an independent data set coming from the DESCRIPA study. All images underwent an automatic processing procedure to assess tissue atrophy that was targeted at the hippocampal region. For each subject, the procedure returns a classification index. Once provided with the clinical assessment at baseline and follow-up, subjects were grouped into cohorts to assess classification performance. Each cohort was divided into converters (co) and nonconverters (nc) depending on the clinical outcome at the follow-up visit. Results: We found that the area under the receiver operating characteristic curve (AUC) was 0.81 for all co versus nc subjects, and the AUC was 0.90 for subjective memory complaint (SMCnc) versus all co subjects. Furthermore, when training on mild cognitive impairment (MCI-nc/MCI-co), the classification performance generally exceeds that found when training on controls versus Alzheimer's disease (CTRL/AD). Conclusions: Automatic magnetic resonance imaging analysis may assist clinical classification of subjects in a memory clinic setting even when images are not specifically acquired for automatic analysis.
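A minimal sketch of the evaluation step described above (per-subject classification index summarized by the AUC over converters versus nonconverters) is given below; the indices and group sizes are synthetic placeholders, not values from the ADNI or DESCRIPA data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Illustrative sketch: each subject gets a classification index, subjects are
# labelled converter (1) or nonconverter (0) from the follow-up visit, and the
# AUC summarizes the separation. Synthetic placeholder data, not ADNI/DESCRIPA.

rng = np.random.default_rng(1)
index_nc = rng.normal(0.0, 1.0, 80)    # nonconverters: lower index on average (assumed)
index_co = rng.normal(1.0, 1.0, 40)    # converters: higher index on average (assumed)

scores = np.concatenate([index_nc, index_co])
labels = np.concatenate([np.zeros(80), np.ones(40)])

print(f"AUC = {roc_auc_score(labels, scores):.2f}")
```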
Numerous publications and commercial systems are available that deal with automatic detection of pulmonary nodules in thoracic computed tomography scans, but a comparative study where many systems are applied to the same data set has not yet been performed. This paper introduces ANODE09 (http://anode09.isi.uu.nl), a database of 55 scans from a lung cancer screening program and a web-based framework for objective evaluation of nodule detection algorithms. Any team can upload results to facilitate benchmarking. The performance of the six algorithms for which results are available is compared: five from academic groups and one commercially available system. A method to combine the output of multiple systems is proposed. Results show a substantial performance difference between algorithms, and demonstrate that combining the output of algorithms leads to marked performance improvements.
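One simple way to combine candidate lists from several detection systems is sketched below: scores are normalized per system, candidates falling within a merge radius are grouped, and each group keeps the mean position and the summed score. This is an illustrative fusion rule under assumed parameters, not necessarily the combination method evaluated in ANODE09.

```python
import numpy as np

def fuse_candidates(systems, merge_radius_mm=5.0):
    """Fuse nodule candidates from several CAD systems (illustrative rule).

    systems: list of (N_i, 4) arrays with columns (x, y, z, score).
    Returns an (M, 4) array of fused candidates.
    """
    pooled = []
    for cand in systems:
        s = cand[:, 3]
        s = (s - s.min()) / (s.max() - s.min() + 1e-9)   # per-system score normalization
        pooled.append(np.column_stack([cand[:, :3], s]))
    pool = np.vstack(pooled)

    fused, used = [], np.zeros(len(pool), dtype=bool)
    for i in np.argsort(-pool[:, 3]):                    # start from strongest candidates
        if used[i]:
            continue
        dist = np.linalg.norm(pool[:, :3] - pool[i, :3], axis=1)
        group = (dist <= merge_radius_mm) & ~used        # nearby, not yet merged candidates
        used |= group
        fused.append(np.append(pool[group, :3].mean(axis=0), pool[group, 3].sum()))
    return np.array(fused)
```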
Precision measurements of the positron component in the cosmic radiation provide important information about the propagation of cosmic rays and the nature of particle sources in our Galaxy. The satellite-borne experiment PAMELA has been used to make a new measurement of the cosmic-ray positron flux and fraction that extends previously published measurements up to 300 GeV in kinetic energy. The combined measurements of the cosmic-ray positron energy spectrum and fraction provide a unique tool to constrain interpretation models. During the recent solar minimum activity period from July 2006 to December 2009, approximately 24 500 positrons were observed. The results cannot be easily reconciled with purely secondary production, and additional sources of either astrophysical or exotic origin may be required.
Precision measurements of the electron component in the cosmic radiation provide important information about the origin and propagation of cosmic rays in the Galaxy. Here we present new results regarding negatively charged electrons between 1 and 625 GeV performed by the satellite-borne experiment PAMELA. This is the first time that cosmic-ray e
X-ray imaging with grating interferometry has previously been regarded as a technique providing information only in direct space. It delivers absorption, phase, and dark-field contrast, which can be viewed as parameters of the underlying but unresolved scattering distribution. Here, we present a method that provides the ultrasmall-angle x-ray scattering distribution and, thus, allows simultaneous access to direct and reciprocal space information.
The north-south asymmetry for cosmic-ray particles was measured with one instrument of the PAMELA satellite-borne experiment in the period June 2006-May 2009. The analysis was performed by two independent methods: by comparing the count rates in regions with identical geomagnetic conditions and by comparing the experimental distribution of particle directions with the simulated distribution that would be expected for an isotropic particle flux. The dependence of the asymmetry on the energy release in the PAMELA calorimeter and on time has been determined. The asymmetry (N_n − N_s)/(N_n + N_s) is 0.06 ± 0.004 at the threshold energy release in the calorimeter and gradually decreases with increasing energy release. The observed effect is shown to be produced by electrons in the energy range 10-100 GeV.
Protons and helium nuclei are the most abundant components of the cosmic radiation. Precise measurements of their fluxes are needed to understand the acceleration and subsequent propagation of cosmic rays in our Galaxy. We report precision measurements of the proton and helium spectra in the rigidity range 1 gigavolt to 1.2 teravolts performed by the satellite-borne experiment PAMELA (payload for antimatter matter exploration and light-nuclei astrophysics). We find that the spectral shapes of these two species are different and cannot be described well by a single power law. These data challenge the current paradigm of cosmic-ray acceleration in supernova remnants followed by diffusive propagation in the Galaxy. More complex processes of acceleration and propagation of cosmic rays are required to explain the spectral structures observed in our data.
The energy spectra of galactic cosmic rays carry fundamental information regarding their origin and propagation. These spectra, when measured near Earth, are significantly affected by the solar magnetic field. A comprehensive description of the cosmic radiation must therefore include the transport and modulation of cosmic rays inside the heliosphere. During the end of the last decade the Sun underwent a peculiarly long quiet phase well suited to study modulation processes. In this paper we present proton spectra measured from July 2006 to December 2009 by PAMELA. The large collected statistics of protons allowed the time variation to be followed on a nearly monthly basis down to 400 MV. Data are compared with a state-of-the-art three-dimensional model of solar modulation.
Event-related potentials (ERPs) are usually obtained by averaging, thus neglecting the trial-to-trial latency variability in cognitive electroencephalography (EEG) responses. As a consequence, the shape and the peak amplitude of the averaged ERP are smeared and reduced, respectively, when the single-trial latencies show a relevant variability. To date, the majority of the methodologies for single-trial latency inference are iterative schemes providing suboptimal solutions, the most commonly used being Woody's algorithm.
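For reference, a minimal sketch of a Woody-style iterative scheme is given below: each trial is aligned to the running average by maximizing the cross-correlation within a lag window, and the template is re-averaged on the aligned trials. The parameter values and the wrap-around shift are simplifications for illustration, not the procedure developed in the paper.

```python
import numpy as np

def woody_latencies(trials, max_lag=30, n_iter=10):
    """Estimate single-trial latencies with a Woody-style iterative scheme.

    trials: (n_trials, n_samples) array of single-trial epochs.
    Returns integer latency shifts (in samples) for each trial.
    """
    n_trials, _ = trials.shape
    lags = np.zeros(n_trials, dtype=int)
    template = trials.mean(axis=0)                 # initial template: plain average
    for _ in range(n_iter):
        for i, trial in enumerate(trials):
            # Pick the lag maximizing the correlation with the current template.
            best_lag, best_corr = 0, -np.inf
            for lag in range(-max_lag, max_lag + 1):
                c = np.dot(np.roll(trial, -lag), template)   # wrap-around shift (simplification)
                if c > best_corr:
                    best_corr, best_lag = c, lag
            lags[i] = best_lag
        # Re-average the aligned trials to refine the template.
        template = np.mean([np.roll(t, -l) for t, l in zip(trials, lags)], axis=0)
    return lags
```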