
List of accepted submissions

A Dual Measure of Uncertainty: The Deng Extropy

Lad, Sanfilippo and Agrò (2015) introduced the extropy as the dual concept of entropy. This measure of uncertainty has attracted the interest of researchers, and several versions of the extropy have been studied in the literature. Moreover, in the context of the Dempster--Shafer theory of evidence, Deng introduced a new measure of discrimination, named the Deng entropy. In this talk, we define the Deng extropy, study its relation to the Deng entropy, and propose examples to compare them. The behaviour of the Deng extropy is studied under changes of focal elements. A characterization result is given for the maximum Deng extropy and, finally, a numerical example in pattern recognition is discussed to highlight the relevance of the new measure.

[1] Buono, F.; Longobardi, M. A Dual Measure of Uncertainty: The Deng Extropy. Entropy 2020, 22, 582.
[2] Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549-553.
[3] Lad, F.; Sanfilippo, G.; Agrò, G. Extropy: complementary dual of entropy. Stat. Sci. 2015, 30, 40-58.
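As a rough illustration of the two ingredients being combined (the Deng extropy itself is defined in [1] and is not reproduced here), a minimal Python sketch of the classical extropy of Lad et al. [3] and of the Deng entropy [2] over a basic probability assignment:

```python
import math

def extropy(p):
    """Extropy of a discrete distribution (Lad, Sanfilippo and Agro, 2015):
    J(p) = -sum_i (1 - p_i) * ln(1 - p_i)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

def deng_entropy(m):
    """Deng entropy of a basic probability assignment `m`, a dict mapping
    focal elements (frozensets) to masses:
    E_d(m) = -sum_A m(A) * log2( m(A) / (2**|A| - 1) )."""
    return -sum(mass * math.log2(mass / (2 ** len(A) - 1))
                for A, mass in m.items() if mass > 0)

# When every focal element is a singleton, the Deng entropy reduces to
# the Shannon entropy of the corresponding probability distribution.
m = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
print(deng_entropy(m))        # 1.0 bit, the Shannon entropy of (0.5, 0.5)
print(extropy([0.5, 0.5]))    # ln(2): for two outcomes, extropy equals entropy
```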

Generalized inference for the efficient reconstruction of weighted networks

Introduction. Network reconstruction is an active field of research. Among the methods proposed so far, some assume that the binary and weighted constraints jointly determine the reconstruction output; others consider the weight-estimation step as completely unrelated to the binary one. Among the former, the Enhanced Configuration Model deserves a special mention; the algorithms of the second group, instead, iteratively adjust the link weights on top of some previously determined topology.

Methods and Results. Here we develop a theoretical framework that provides an analytical, unbiased procedure to estimate the weighted structure of a network once its topology has been determined, thus extending the Exponential Random Graph (ERG) recipe. Our approach treats the information about the topological structure as a priori knowledge; together with the proper weighted constraints, it represents the input of our generalized reconstruction procedure. The probability distribution describing the link weights is then determined by maximizing the key quantity of our algorithm, i.e. the conditional entropy, under a properly defined set of constraints. This algorithm returns a conditional probability distribution depending on a vector of unknown parameters; in line with previous results, their estimation is carried out by considering a generalized likelihood function. In our work, we compare two possible specifications of this framework.
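As a hedged illustration of the weight-estimation step (a deliberately simplified, single-constraint version, not the authors' full specification), fixing the topology and imposing only an expected-total-weight constraint makes the conditional-entropy-maximizing weight distribution an i.i.d. exponential over the existing links, whose maximum-likelihood rate has a closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed network: a fixed binary topology plus link weights.
n = 50
adj = np.triu(rng.random((n, n)) < 0.1, k=1)        # known topology (a priori input)
true_beta = 2.0
weights = np.where(adj, rng.exponential(1 / true_beta, (n, n)), 0.0)

# With the topology fixed and a single expected-total-weight constraint,
# conditional-entropy maximization gives exponential link weights
# P(w | a_ij = 1) = beta * exp(-beta * w), and the likelihood is maximized by
# beta_hat = (number of links) / (total observed weight).
L = adj.sum()
W = weights.sum()
beta_hat = L / W
print(beta_hat)   # close to true_beta for a large enough network
```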

Conclusions. Knowledge of the structure of a financial network gives valuable information for estimating systemic risk. However, since financial data are typically subject to confidentiality, network reconstruction techniques become necessary to infer both the presence of connections and their intensity. Recently, several "horse races" have been conducted to compare the performance of these methods. Here, we establish a generalized likelihood approach to rigorously define and compare methods for reconstructing weighted networks: the best one is obtained by "dressing" the best-performing available binary method with an exponential distribution of weights.

Constraint choice and model selection in the generalized maximum entropy principle

The maximum entropy principle (MEP) is a powerful statistical inference tool that provides a rigorous way to guess the probability distribution of the states of a system which is known only through partial information.

Generalizing Shore and Johnson's axioms, Jos Uffink (1995) proved that the functionals suitable for use in the MEP belong to a one-parameter family, of which the Shannon entropy is a member.
The resulting probability distributions are generalized exponentials, of which the Boltzmann distribution is a special case.
It has been argued (P. Jizba and J. Korbel, 2019) that this generalized approach is suitable for studying systems which do not respect standard hypotheses such as ergodicity, short-range interactions or exponential growth of the sample space: the resulting probability distributions take into account correlations that may not have been observed.

In this presentation, the maximum likelihood method to evaluate the parameters of such distributions and to perform model selection starting from empirical data will be discussed.

In particular, it will be shown that the maximum likelihood approach to estimating the Lagrange multipliers leads to an equation that justifies the use of the q-generalized moments as constraints in the entropy maximization.
Moreover, it will be shown that the likelihood function spontaneously emerges from the maximization of entropy: in particular, it will be proved that the log-likelihood is equal to minus the entropy once the Lagrange multipliers are fixed to satisfy the maximum likelihood condition.

Lastly, simple examples based on synthetic data will be presented to show that this approach provides accurate estimations of the parameters.
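A minimal sketch of the kind of synthetic-data experiment described above, assuming (as a hypothetical choice) a q-exponential (Tsallis) distribution with ground truth q = 1.3, λ = 1 on x ≥ 0, sampled by inverse-CDF and fitted by a simple grid-search maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(1)

def q_exp_logpdf(x, q, lam):
    # q-exponential density on x >= 0 for 1 < q < 2:
    # p(x) = (2 - q) * lam * (1 + (q - 1) * lam * x) ** (-1 / (q - 1))
    return np.log((2 - q) * lam) - np.log1p((q - 1) * lam * x) / (q - 1)

# Synthetic data via inverse-CDF sampling (assumed ground truth q0, lam0).
q0, lam0 = 1.3, 1.0
u = rng.random(20000)
x = ((1 - u) ** (-(q0 - 1) / (2 - q0)) - 1) / ((q0 - 1) * lam0)

# Maximum-likelihood fit by grid search over (q, lam).
qs = np.linspace(1.05, 1.6, 56)      # step 0.01
lams = np.linspace(0.5, 2.0, 151)    # step 0.01
ll = np.array([[q_exp_logpdf(x, q, l).sum() for l in lams] for q in qs])
iq, il = np.unravel_index(ll.argmax(), ll.shape)
print(qs[iq], lams[il])   # should be close to (1.3, 1.0)
```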

Fisher information of Landau states and relative information against the lowest level

An electron in a constant magnetic field has the energy levels known as the Landau levels.
One can obtain the corresponding radial wave function in cylindrical polar coordinates
(e.g., the textbook of Landau & Lifshitz). This system has so far not been explored from an
information-theoretic point of view. Here we focus on the Fisher information associated
with these Landau states specified by the two quantum numbers. Fisher information provides
a useful measure of the electronic structure in quantum systems such as hydrogen-like atoms [1,2]
and molecules under Morse potentials [3]. We numerically evaluate the generalized Laguerre
polynomials contained in the radial wave functions. We report that Fisher information increases
linearly with the quantum number n that specifies energy levels, but decreases monotonically
with the quantum number m (i.e., the index of the generalized Laguerre polynomial).

Also, we present relative Fisher information of the Landau states by setting the
lowest Landau state as a reference density. The analytical form is just 4n, which does not
depend on the other quantum number m.


1. T. Yamano, Relative Fisher information of hydrogen-like atoms, Chem. Phys. Lett. 691 (2018) 196
2. T. Yamano, Fisher information of radial wavefunctions for relativistic hydrogenic atoms, Chem. Phys. Lett. 731 (2019) 136618
3. T. Yamano, Relative Fisher information for Morse potential and isotropic quantum oscillators, J. Phys. Commun. 2 (2018) 085018
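The Landau-state computation itself requires the radial wave functions; as a self-contained illustration of the quantity involved, here is a sketch of a direct numerical estimator of relative Fisher information, checked against the Gaussian case (same variance, shifted means), where the closed form is (μ₁ − μ₂)²/σ⁴:

```python
import numpy as np

def relative_fisher_information(p, q, x):
    """Numerically integrate I(p||q) = ∫ p(x) (d/dx ln(p(x)/q(x)))**2 dx
    on a uniform grid `x`, with the densities sampled as arrays `p`, `q`."""
    dlr = np.gradient(np.log(p) - np.log(q), x)   # derivative of the log-ratio
    return float(np.sum(p * dlr ** 2) * (x[1] - x[0]))

# Sanity check on Gaussians with the same variance, for which the
# closed form is I = (mu1 - mu2)**2 / sigma**4.
x = np.linspace(-10, 10, 20001)
sigma, mu1, mu2 = 1.0, 0.0, 1.5
gauss = lambda m: np.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
I = relative_fisher_information(gauss(mu1), gauss(mu2), x)
print(I)   # ≈ 1.5**2 = 2.25
```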

Preliminary study of entropy-based indicators to discriminate cancer-related characteristics.

Selected entropy-based indicators (such as Kolmogorov Complexity, Shannon Information Entropy and the Index of Regularity) have been used in this preliminary study to classify genes, with acceptable results. This need for classification is driven by the interest of the scientific community in determining whether a given gene possesses or lacks cancer-related characteristics. A subset of genes was chosen, based on previous studies and on random selection. These genes were represented by their DNA sub-sequence and divided into two groups: those related to cancer (that is, they either cause cancer, as in oncogenes, or are tumor suppressors) and those not related to cancer (i.e., normal genes). Initially, eleven classifiers were used and compared, some of which achieved an accuracy of over 70%. This accuracy is the percentage of correct predictions (cancer-related or not) within a test set of genes. These results shed some light on the fact that oncogenes and normal genes do have different patterns and structures, which can potentially be used to predict the status of novel genes and features. This exploratory study also analyzes non-classic classifiers and evaluates the prospects of clustering and advanced machine-learning algorithms to determine significant patterns within DNA sequences.
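As a minimal illustration of one such indicator (the actual gene sequences and classifiers used in the study are not reproduced here), the Shannon entropy of the k-mer frequency distribution of a DNA sub-sequence can serve as a simple feature:

```python
from collections import Counter
from math import log2

def sequence_entropy(seq, k=1):
    """Shannon entropy (bits) of the k-mer frequency distribution of a DNA sequence."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(kmers)
    n = len(kmers)
    return -sum(c / n * log2(c / n) for c in counts.values())

# A maximally regular sequence has zero entropy; a balanced sequence over
# the four bases approaches the maximum of 2 bits per base (for k = 1).
print(sequence_entropy("AAAAAAAA"))        # 0.0
print(sequence_entropy("ACGTACGTACGT"))    # 2.0
```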

Hellinger Entropy Concept: multidisciplinary applications.

The use of a metric to assess the distance between probability densities is an important practical problem, arising for instance in artificial intelligence and recommendation systems. The generalized α-formalisms introduced by Rényi and Tsallis are the basis of well-known entropies and divergence models. A particular α-divergence was presented in a previous work by the co-authors; in our perspective, this divergence was already essentially defined by Hellinger. The concept of Hellinger entropy makes it possible, through a maximum-entropy syllogism, to state a bound for the Hellinger metric. The square root of the divergence is a metric, and its nonparametric estimator has information-theoretic bounds that can be computed directly from the data. Information-theoretic bounds for the Hellinger distance are developed in this work. The asymptotic behavior allows this metric to be used in competitive scenarios with three or more densities, such as clustering. Since the bound can be computed directly from the data, the method is also suitable for streaming data.
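A minimal sketch of the discrete Hellinger metric, computed directly from (here, hypothetical) probability vectors:

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions:
    H(p, q) = sqrt(1 - sum_i sqrt(p_i * q_i)).
    It is a genuine metric and takes values in [0, 1]."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    # max() guards against tiny negative values from floating-point round-off.
    return np.sqrt(max(0.0, 1.0 - np.sqrt(p * q).sum()))

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(hellinger(p, q))   # sqrt(1 - 0.5) ≈ 0.7071
print(hellinger(p, p))   # 0.0 (identical densities)
```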

Statistical entropy opens a new way to assess the recyclability of products

Statistical entropy is applied to assess various treatment technologies in waste management. It measures the effect of a treatment on waste flows, and thus the mixing or concentrating of materials and substances. The stronger the mixing, the higher the produced statistical entropy (in accordance with the second law of thermodynamics), and vice versa. For example, recyclers aim to generate outputs of concentrated target materials out of a mixed waste input, which corresponds to a decrease in statistical entropy: the lower the statistical entropy of the outputs, the higher the recycling effectiveness. Besides the assessment of processes, statistical entropy can also be used to assess individual products and their material distribution. Complex products that consist of manifold materials show an increased material distribution/mixing, which again translates into high statistical entropy. Since recycling effort increases with product complexity, it seems feasible to assess the inherent recyclability of a product by statistical entropy: the lower the statistical entropy of the product, the higher its recyclability. Because material concentrations can vary substantially between the different product components, information on the product structure needs to be considered too. Thus, the developed statistical entropy approach is based on material concentrations and on information about the product assembly. To demonstrate this new application of statistical entropy, a case study is presented in which the recyclability of a typical smartphone is evaluated. The results show that statistical entropy is an appropriate metric to describe the recyclability of products and enables important insights into the design of products. It could act as a planning tool for product designers and manufacturers to promote products of higher recyclability. Further, the new statistical entropy approach could be relevant for the implementation of the European Union’s Circular Economy Package.
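As a strongly simplified sketch of the idea (the full approach also incorporates the product assembly structure, which is omitted here), a normalized Shannon-type statistical entropy over hypothetical material mass fractions already captures the mixing/concentration contrast:

```python
from math import log2

def relative_statistical_entropy(mass_fractions):
    """Simplified statistical entropy of a product's material composition,
    normalized to [0, 1]: H = -sum_i c_i * log2(c_i) / log2(N)."""
    cs = [c for c in mass_fractions if c > 0]
    h = -sum(c * log2(c) for c in cs)
    return h / log2(len(cs)) if len(cs) > 1 else 0.0

# Hypothetical compositions: a concentrated product vs. a strongly mixed one.
mono  = [0.97, 0.02, 0.01]       # nearly a single material: low entropy, easy to recycle
mixed = [1 / 3, 1 / 3, 1 / 3]    # fully mixed: maximum normalized entropy
print(relative_statistical_entropy(mono))
print(relative_statistical_entropy(mixed))   # 1.0
```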

Detection of internal defects in concrete and evaluation of a healthy part of concrete by non-contact acoustic inspection using normalized spectral entropy

In recent years, deterioration of concrete structures has become a social problem, and there is a demand for a non-contact, non-destructive method for inspecting internal defects in concrete structures. In our non-contact acoustic inspection, a target surface of the concrete is vibrated with strong aerial sound waves, and the vibration velocity distribution is measured two-dimensionally using a scanning laser Doppler vibrometer. Then, after a time-frequency gate process, acoustic feature quantities (the vibrational energy ratio and the spectral entropy) are calculated. Analysis using them has made it possible to detect and visualize internal defects from a long distance (5-30 m). Traditional spectral entropy poses no problem when evaluating the fluctuation of the spectral entropy value within the same measurement condition or measurement plane. However, spectral entropy values could not be compared directly between different measurement conditions and different objects, because the meaning of their upper and lower limits is then ambiguous. To solve this, we introduce the normalized spectral entropy. It makes it possible to compare values from different measurement conditions and different objects on the same scale, and to statistically evaluate a healthy part of the concrete as well as detect defects.
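A minimal sketch of a normalized spectral entropy, which rescales the spectral entropy by log N so that values from spectra of different lengths share the same [0, 1] scale (synthetic signals here, not the laser-Doppler data):

```python
import numpy as np

def normalized_spectral_entropy(signal):
    """Spectral entropy of a signal normalized by log(N), so the result lies
    in [0, 1] regardless of spectrum length: a single spectral line gives a
    value near 0, a flat (broadband) spectrum a value near 1."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    psd = psd[psd > 0]
    p = psd / psd.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(p)))

fs = 1000
t = np.arange(0, 1, 1 / fs)
tone = np.sin(2 * np.pi * 50 * t)                          # narrowband: low entropy
noise = np.random.default_rng(0).standard_normal(t.size)   # broadband: high entropy
print(normalized_spectral_entropy(tone))
print(normalized_spectral_entropy(noise))
```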

ECG and EDA information transfer on emotion evaluation

Emotions are behind decision-making, perception and learning. Studying emotions and their responses allows us to understand people’s preferences and their strategies to adapt across contexts. Both the peripheral and the central nervous system are activated by emotions, which translates into behavioural and physiological alterations.

Data were collected from 4 healthy volunteers, who came to the lab three times. Each session was intended to induce one emotion among happy, fear and neutral. At the beginning, participants rested for 4 minutes to collect baseline data. Afterwards, they watched intense movies associated with each condition for 25 minutes. In this work, we target two physiological signals in the happy condition: the electrocardiogram (ECG) and the electrodermal activity (EDA).

This work studies the information transfer between signals. For the baseline and the happy condition, information dynamics based on the linear Gaussian approximation were computed to characterize information storage and transfer between the ECG and EDA signals.
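A minimal sketch of transfer entropy under the linear Gaussian approximation (one lag, synthetic signals rather than the ECG/EDA recordings; in this approximation transfer entropy equals half the Granger-causality log ratio):

```python
import numpy as np

def linear_te(x, y):
    """Transfer entropy x -> y (one lag) under the linear Gaussian approximation:
    TE = 0.5 * ln( var(residual of y_t | y_{t-1}) / var(residual of y_t | y_{t-1}, x_{t-1}) )."""
    yt, yp, xp = y[1:], y[:-1], x[:-1]

    def resid_var(target, *regressors):
        # Ordinary least squares with an intercept; return residual variance.
        A = np.column_stack([np.ones_like(target)] + list(regressors))
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        return np.var(target - A @ coef)

    return 0.5 * np.log(resid_var(yt, yp) / resid_var(yt, yp, xp))

# Synthetic example: y is partly driven by past x, but not vice versa.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.empty_like(x)
y[0] = 0.0
for t in range(1, len(x)):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
print(linear_te(x, y))   # clearly positive: x transfers information to y
print(linear_te(y, x))   # near zero: no feedback from y to x
```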

The main idea is to identify the signals that are most feasible for data collection in emotion quantification. The ecological validity of such studies is truly compromised when the experimenter affects the participants’ usual daily routine by electrode placement, which may ultimately influence the participants’ reactions to stimuli and thereby mask the quantitative signal evaluation.

Considering the self-entropy of each signal, EDA and ECG present similar values. It was observed that in all participants the ECG transfers information to the EDA, indicating that ECG information may persist in the system. The literature indicates that the EDA is one of the first signals in the emotion response; nevertheless, this signal is highly noisy. Hence, this and the result achieved in this study may indicate that the ECG is a stronger signal when we want to evaluate emotions in real contexts (with high ecological validity).

Design of high-C Cr-Co-Ni medium entropy alloy for tribological applications

Medium and high entropy alloys (MEAs/HEAs), which typically have three or more main elements, were initially designed to have a high configurational entropy (Sconf) stabilizing a simple, single-phase solid solution (SS) over multi-phase alloys. Having multiple main elements does indeed increase the Sconf of simple SSs, but it may also decrease the enthalpy and increase, to a lesser degree, the Sconf of other, more complex phases, leading to multi-phase microstructures. Therefore, although the initial idea behind MEAs/HEAs was refuted, the interest remains, and the focus has shifted to exploring their vast compositional field, which includes an almost infinite number of alloys, some of them with potentially better properties than those existing today. The Cantor alloy (equiatomic CrMnFeCoNi) and the equiatomic CrCoNi alloy, both single-phase with a face-centered cubic (FCC) structure, are among the toughest materials ever reported. In this work, computational thermodynamic calculations (CALPHAD method) predicted that C additions to the Cr40Co40Ni20 MEA favor the formation of a multi-phase microstructure promising for wear applications. A large amount of C was incorporated into the alloy by melting in a graphite crucible; this process allowed C saturation in the melt, which, as will be shown, can be well controlled by carefully selecting the casting temperature. In this case, 24 at% C was achieved by melting at 1600 °C. The resulting Cr30.4Co30.4Ni15.2C24 MEA was characterized by X-ray diffraction (XRD) and scanning electron microscopy (SEM) equipped with energy-dispersive spectroscopy (EDS). Experimental results revealed a microstructure composed of graphite flakes, hard primary Cr-rich carbides, and a tough eutectic matrix of FCC phase and carbides, in good agreement with the thermodynamic calculations. These findings highlight the great flexibility and potential of MEA/HEA design, making it possible to obtain microstructures and sets of properties that are beneficial for a given application.
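The configurational entropy mentioned above has a simple closed form for an ideal solid solution; a minimal sketch comparing equiatomic CrCoNi with the Cr40Co40Ni20 base alloy studied here:

```python
from math import log

R = 8.314  # gas constant, J/(mol·K)

def s_conf(fractions):
    """Ideal configurational entropy of mixing: S_conf = -R * sum_i x_i * ln(x_i)."""
    return -R * sum(x * log(x) for x in fractions if x > 0)

# Equiatomic CrCoNi: S_conf = R * ln(3), the usual "medium entropy" range.
print(s_conf([1 / 3, 1 / 3, 1 / 3]) / R)   # ln(3) ≈ 1.0986
# The Cr40Co40Ni20 base composition studied in this work:
print(s_conf([0.4, 0.4, 0.2]) / R)         # ≈ 1.0549
```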