
List of accepted submissions

Bi-dimensional colored fuzzy entropy applied to melanoma dermoscopic images

Recently, a bi-dimensional fuzzy entropy measure was proposed for image texture evaluation. Herein, a new bi-dimensional fuzzy entropy, FuzEnC2D, is proposed to process color images. Unlike the definition of dos Santos et al. (2018), our algorithm evaluates each color channel individually while taking both global and local characteristics into account. We propose to apply it to the characterization of melanoma dermoscopic images. In this work, FuzEnC2D is tested by evaluating its sensitivity to parameter changes, its sensitivity to rotation, its ability to detect the irregularity introduced by shuffling pixels, and its consistency across different image sizes. For these purposes, white noise and colored Brodatz textures are used. The algorithm is also applied to dermoscopic images from the public PH2 dataset to evaluate its performance in distinguishing common nevi, atypical nevi, and melanoma lesions. The results reveal a relative decrease of at most 29.97% in FuzEnC2D values when different parameter values are considered. On the other hand, the consistency and low rotation sensitivity of the algorithm are revealed by analyzing the same texture at different sizes (maximum relative difference of 4.34%) and by comparing the entropy of an image upon rotation (maximum relative difference of 0.36%). Besides, after shuffling the pixels of an image, the FuzEnC2D values of the shuffled images increase up to 8.9 times the original values. Moreover, using the red channel's entropy, common nevi are statistically different from atypical nevi (p = 0.004 with the Kruskal-Wallis test). For the green channel, a statistical difference (p = 0.034) is observed between atypical nevi and melanoma. Also, differentiating common nevi from lesions diagnosed as melanoma is possible regardless of the RGB channel used. Finally, FuzEnC2D appears to be a promising entropy-based algorithm for analyzing the texture of color images.
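As context for the measure described above, here is a minimal single-channel sketch of a bi-dimensional fuzzy entropy in Python/NumPy. It follows the common FuzzyEn2D recipe (exponential fuzzy similarity between mean-removed m x m templates, entropy as the negative log-ratio of the similarity statistics at sizes m and m+1); the defaults (r = 0.2 times the image standard deviation, fuzzy power n = 2) are illustrative assumptions, not the exact FuzEnC2D definition. For a color image, the abstract's approach would apply this to each channel individually.

```python
import numpy as np

def fuzzy_en_2d(img, m=2, r=None, n=2):
    """Bi-dimensional fuzzy entropy of a single image channel (sketch).

    m x m templates are compared with an exponential fuzzy similarity
    function; the entropy is -log(phi_{m+1} / phi_m).
    """
    img = np.asarray(img, dtype=float)
    if r is None:
        r = 0.2 * img.std()  # common heuristic tolerance (assumption)

    def phi(size):
        h, w = img.shape
        # collect all (size x size) templates, with their mean removed
        pats = np.array([
            (img[i:i + size, j:j + size]
             - img[i:i + size, j:j + size].mean()).ravel()
            for i in range(h - size + 1) for j in range(w - size + 1)
        ])
        # Chebyshev distance between every pair of templates
        d = np.abs(pats[:, None, :] - pats[None, :, :]).max(axis=2)
        sim = np.exp(-(d ** n) / r)            # fuzzy membership degree
        iu = np.triu_indices(len(pats), k=1)   # exclude self-matches
        return sim[iu].mean()

    return -np.log(phi(m + 1) / phi(m))
```

A smooth gradient image (highly regular texture) yields an entropy near zero, while random noise yields a clearly positive value, which is the behaviour the pixel-shuffling experiment in the abstract exploits.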

Estimation of Relative Entropy Measures based on Quantile Regression

The estimation of relative entropy measures such as mutual information, conditional and joint entropy, or transfer entropy requires estimating conditional and joint densities. When the data are continuous, a multivariate kernel density estimation or a discretization scheme is usually applied. The problem with the discretization approach is that, for mutual information as well as transfer entropy, the resulting measure does not converge monotonically to the true value as the number of discrete bins increases. In the absence of a distribution theory, hypothesis testing is only possible by means of bootstrapping. We propose to estimate the necessary joint and conditional frequencies by means of quantile regression, which allows us to avoid arbitrary binning and all of its associated problems. Moreover, due to the semi-parametric nature of this approach, the computational burden is decisively reduced compared with multivariate kernel density estimation. We show that quantile regressions can be used flexibly to estimate the densities needed to calculate joint, conditional, and transfer entropy as well as mutual information. Furthermore, by casting our estimator into a Generalized Method of Moments framework, we develop the asymptotic theory needed to conduct inference on relative entropy measures for multiple variables.
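As a minimal illustration of the core idea, recovering densities from quantiles instead of bins, the sketch below estimates differential entropy from the empirical quantile function: on an equiprobable grid of quantile levels, the density is the derivative dτ/dq, so H ≈ -mean(log f) over the grid. The paper's approach replaces the unconditional `np.quantile` with quantile regressions to obtain conditional densities; this numpy-only version illustrates only the binning-free density step.

```python
import numpy as np

def entropy_from_quantiles(sample, n_grid=99):
    """Differential entropy via quantile-based density estimation.

    On an equally spaced grid of quantile levels, the density at the
    tau-quantile is dF/dq, estimated by central differences; since the
    grid points are equiprobable, H is approximated by -mean(log f).
    Tails beyond the outermost levels are truncated (a known bias).
    """
    taus = np.linspace(0.01, 0.99, n_grid)
    q = np.quantile(sample, taus)
    # density at interior grid quantiles: f = d(tau)/d(quantile)
    f = (taus[2:] - taus[:-2]) / (q[2:] - q[:-2])
    return -np.mean(np.log(f))
```

For a standard normal sample this returns a value close to (but, because of tail truncation, slightly below) the true differential entropy of about 1.419 nats; for a uniform sample on [0, 1] it is close to zero.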

A stepwise assessment of parsimony and entropy in species distribution modelling

Entropy is an intrinsic characteristic of the geographical distribution of a biological species. A species distribution with higher entropy involves more uncertainty, i.e., is more gradually constrained by the environment. Species distribution modelling tries to yield models with low uncertainty, but normally has to produce them by increasing their complexity, which is detrimental to another desirable property of the models, parsimony. By modelling the distribution of 18 vertebrate species in mainland Spain, we show that entropy may be computed along the forward-backward stepwise selection of variables in Generalized Linear Models to check whether uncertainty is reduced at each step. This allows selecting the model that best combines the complementary characteristics of certainty and parsimony. It also allows disentangling the entropy due to the intrinsic uncertainty of the species distribution from that due to failure in the model specification. A reduction of entropy was produced asymptotically at each step of the model, with some exceptions. This asymptote could be used to distinguish the entropy attributable to the species distribution from that attributable to model misspecification. We discuss the differential suitability of Shannon and fuzzy entropy for this purpose. The use of Shannon entropy in distribution modelling makes no biogeographical sense, because it computes the probability of presence as if the species were present in only one cell of the study area. Fuzzy entropy has no such restriction and always takes values between zero and one, which produces results that are commensurable between species and study areas. Fuzzy entropy is also more correlated with AUC values. Using a stepwise approach and fuzzy entropy may be helpful to counterbalance the uncertainty and the complexity of the models. The model yielded at the step with the lowest entropy combines reduction of uncertainty with parsimony, which results in high efficiency.
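The two entropy notions contrasted above can be sketched as follows; here `p` is a vector of modelled presence values, one per grid cell, and the normalisations shown (renormalising over cells for Shannon, dividing by log 2 for fuzzy) are one plausible reading of the abstract rather than the authors' exact formulas.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy after normalising the cell values to a single
    probability distribution over the study area (sums to 1), i.e.
    treating the species as if present in exactly one cell."""
    q = p / p.sum()
    q = q[q > 0]
    return -(q * np.log(q)).sum()

def fuzzy_entropy(p):
    """De Luca & Termini-style fuzzy entropy, normalised to [0, 1]:
    the mean binary entropy of the per-cell membership values."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    h = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    return h.mean() / np.log(2)
```

Fuzzy entropy is maximal (1) when every cell is maximally ambiguous (p = 0.5) and near zero when the model assigns crisp presences and absences, independently of the number of cells, which is the commensurability property the abstract highlights.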

Case Studies of Statistical Entropy Analysis in Recycling Processes: A Tool in Support of a Circular Economy

Statistical entropy (SE) has been used alongside material flow analysis (MFA) to aid studies of resource efficiency and waste management. Because statistical entropy operates on principles of information theory, it can be used to quantitatively describe systems with mixed material flows, including the overall distribution of their components throughout transformational stages, e.g., mechanical pre-treatment stages in recycling processes. In the present work, we present two cases in which our research group has used statistical entropy analysis as an analytical tool for recycling processes, with the aim of supporting the transition towards a circular economy.

First, two lithium-ion battery recycling processes were compared through the combination of material flow analysis and statistical entropy. In this manner, an efficiency weight at the systemic level is given to the mechanical processing stages, which is reflected in the entropy values at the final stages. In other words, while both systems obtained recycled materials with similar characteristics, the system with the more efficient pre-processing stage (i.e., a lower statistical entropy value) also presented a lower entropy value for the substances at the final stage. Secondly, statistical entropy and material flow analysis were used to aid in the design of mechanical processing stages for thermoelectric devices (TEDs). A total of 106 thermoelectric devices were mechanically processed by different comminution methods, physical and chemical characterizations were carried out, and the entropy values were evaluated at each stage. The statistical entropy results were used to design a process with a fast and efficient entropy reduction across the entire throughput, which entailed a total of five stages and yielded fractions of high purity and treatability.
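A minimal sketch of the kind of bookkeeping involved: a normalised Shannon-type statistical entropy of how a substance is split over the output flows of a process stage. The full SE formulation used in MFA studies (e.g., after Rechberger and Brunner) additionally weights by flow masses and substance concentrations; the version below is a deliberately simplified stand-in that still shows why "lower entropy" means "better concentrated, easier to recover".

```python
import numpy as np

def relative_statistical_entropy(substance_per_flow):
    """Shannon entropy of how a substance is distributed over the
    output flows of a stage, normalised to [0, 1] (simplified sketch).

    0 -> the substance is fully concentrated in a single flow;
    1 -> it is evenly diluted over all flows.
    """
    x = np.asarray(substance_per_flow, dtype=float)
    p = x / x.sum()          # share of the substance in each flow
    p = p[p > 0]             # 0 * log 0 := 0
    h = -(p * np.log2(p)).sum()
    return h / np.log2(len(x))
```

A pre-treatment stage that routes most of a target substance into one fraction (e.g., 8 : 1 : 1) therefore scores a lower entropy than one producing a diluted split (4 : 3 : 3), matching the comparison between the two battery recycling systems above.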

New Explanation for the Mpemba Effect
Published: 05 May 2021 by MDPI in Entropy 2021: The Scientific Tool of the 21st Century, session Thermodynamics

The purpose of this study is to examine the involvement of entropy in the Mpemba effect. Given that preheating the water reduces the cooling duration, we theoretically show that water gains more entropy when warmed and re-cooled to the original temperature. Water molecules are oriented dipoles joined by hydrogen bonds. When water is heated, this structure collapses (i.e., the entropy increases). When water is re-cooled to a lower temperature, the previous structure does not re-form immediately. Sometimes, when the re-cooling is performed in a freezer, there is not enough time for the structure to re-form because of the high cooling rate. The entropy reduction curve as a function of temperature, S = f(T), shows retardation (a lag) relative to the entropy growth curve. Water that has been heated and re-cooled to the initial temperature therefore has greater entropy than before it was heated. This means that, while its molecules now have the same kinetic energy, their thermal motion after heating is less oriented with respect to the structure mentioned above. After re-cooling, random collisions are more likely, and owing to this, the temperature decreases more quickly.

Quasistatic and quantum-adiabatic Otto engine for a two-dimensional material: The case of a graphene quantum dot
Published: 05 May 2021 by MDPI in Entropy 2021: The Scientific Tool of the 21st Century, session Thermodynamics

The concept of quantum heat engines (QHEs) was introduced by Scovil and Schulz-DuBois in Ref. [1], in which they demonstrated that a three-level maser can be described as a heat engine operating under a Carnot cycle. This important research gave rise to the study of quantum systems implemented as the working substances of heat machines, with the goal of realizing efficient nanoscale devices. These devices are characterized by the structure of their working substance, the thermodynamic cycle of operation, and the dynamics that govern the cycle [2,3]. In this study, we analyze the performance of a quasi-static and quantum-adiabatic magnetic Otto cycle for a two-dimensional material: the case of a graphene quantum dot [4]. For graphene quantum dots [5,6], the low-energy approach using the Dirac equation with boundary conditions is an excellent approximation. By modulating an external perpendicular magnetic field, in the quasi-static approach we find behaviors in the total work extracted that are not present in the quantum-adiabatic formulation. Additionally, we find that, in the quasi-static approach, the engine yields a higher performance in terms of total work extracted and efficiency than its quantum-adiabatic counterpart. In the quasi-static case, this is because the working substance is in thermal equilibrium at each point of the cycle, maximizing the energy extracted in the adiabatic strokes.

[1] H. E. D. Scovil and E. O. Schulz-DuBois, Phys. Rev. Lett. 2, 262 (1959).

[2] C. M. Bender, D. C. Brody, and B. K. Meister, Proc. R. Soc. London A 458, 1519 (2002).

[3] T. Feldmann, E. Geva, and R. Kosloff, Am. J. Phys. 64, 485 (1996).

[4] F. J. Peña, D. Zambrano, O. Negrete, G. De Chiara, P. A. Orellana, and P. Vargas, Phys. Rev. E 101, 012116 (2020).

[5] M. Grujic, M. Zarenia, A. Chaves, M. Tadic, G. A. Farias, and F. M. Peeters, Phys. Rev. B 84, 205441 (2011).

[6] R. Thomsen and T. G. Pedersen, Phys. Rev. B 95, 235427 (2017).
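To make the cycle bookkeeping above concrete, here is a sketch of a quasi-static quantum Otto cycle for the simplest possible working substance: a two-level system whose energy gap plays the role of the external field (units with k_B = 1). This is not the graphene-quantum-dot spectrum of Ref. [4]; it only illustrates how the total work and the efficiency follow from the equilibrium populations reached at the ends of the two isochoric strokes.

```python
import numpy as np

def otto_two_level(gap_c, gap_h, T_c, T_h):
    """Quasi-static quantum Otto cycle for a two-level system (sketch).

    Strokes: adiabatic compression gap_c -> gap_h, hot isochore at T_h,
    adiabatic expansion gap_h -> gap_c, cold isochore at T_c.
    Returns (work_extracted, efficiency).
    """
    def p_exc(gap, T):
        # thermal excited-state population of a two-level system
        return 1.0 / (np.exp(gap / T) + 1.0)

    n_h = p_exc(gap_h, T_h)               # after the hot isochore
    n_c = p_exc(gap_c, T_c)               # after the cold isochore
    q_hot = gap_h * (n_h - n_c)           # heat absorbed from the hot bath
    work = (gap_h - gap_c) * (n_h - n_c)  # net work extracted per cycle
    eta = work / q_hot if q_hot > 0 else 0.0  # Otto efficiency 1 - gap_c/gap_h
    return work, eta
```

The engine condition is gap_h / gap_c < T_h / T_c, and the resulting Otto efficiency 1 - gap_c/gap_h is always below the Carnot bound 1 - T_c/T_h, as in the quasi-static analysis discussed above.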

Dissipative Extension of Electrodynamics
Published: 05 May 2021 by MDPI in Entropy 2021: The Scientific Tool of the 21st Century, session Thermodynamics

In nonequilibrium thermodynamics, electrodynamic interactions and electrodynamic forces appear as non-dissipative, external phenomena. Irreversibility is due to Ohm's law and polarisation. However, the theoretical approaches to polarisation and thermal couplings do not apply to the Lorentz force and electromagnetic stresses. The choice of state variables is also problematic. Thermodynamic stability cannot be valid for para- and diamagnetic materials at the same time when choosing either the magnetisation or the magnetic field strength, or the corresponding four-quantities in a special relativistic framework, as a state variable. Moreover, any particular choice leads to shape-dependent homogeneous thermodynamic bodies, and therefore the extensivity condition of thermodynamic state variables cannot be introduced without further ado.
In the presentation, I survey the problems of the thermodynamic compatibility of electrodynamics and suggest some explanations. The main ideas originate in a novel approach to gravity in the framework of nonequilibrium thermodynamics, in which the gravitational potential is a thermodynamic state variable and the balances of mass, momentum and energy are constraints on the entropy inequality. Naturally, for electrodynamics, special relativity is a necessary background.

Entropy-Based ECG Biometric Identification

There is great interest nowadays in using the ECG for biometric identification, due to some of its intrinsic properties. There are different proposals in the literature to solve this task. Most of them are based on feature extraction with machine learning methods applied to those features; more recently, others are based on deep learning algorithms applied directly to the signals, providing better results but requiring a much higher computational cost. In this study, we aim to show how an approach based on compression methods can be used to perform this task. Our approach uses the notion of relative compression to provide a measure of similarity between a new signal and the different participants present in the database. This approach relies heavily on the entropy of the data sources, as the compressors used to perform the relative compression are based on finite-context models (similar to Markov models). We aim to show that it is possible to perform biometric identification from ECG signals using this method, even without applying any kind of fiducial point detection on the ECG, making the method, in theory, more general than this specific task or type of signal. Our results show that this approach is feasible in practical terms and provides competitive results, achieving an accuracy of around 89.3% on a publicly available database with 25 participants.
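A compact sketch of compression-based identification in this spirit: the normalized compression distance (NCD) below uses zlib as a stand-in compressor (the paper uses finite-context-model compressors and a relative-compression measure, not zlib), and a query signal is assigned to the participant whose reference signal it compresses best against.

```python
import zlib

def c(x: bytes) -> int:
    """Compressed length of x, used as a computable entropy proxy."""
    return len(zlib.compress(x, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte sequences:
    small when one sequence helps compress the other."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

def identify(query: bytes, reference_signals: dict) -> str:
    """Assign the query to the participant whose reference signal
    yields the smallest compression-based distance."""
    return min(reference_signals,
               key=lambda pid: ncd(query, reference_signals[pid]))
```

In practice the signals would be quantised ECG samples serialised to bytes; no fiducial point detection is required, which is the generality argument made above.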

Entropy of Vostok Ice Core Data and Kalman Filter Harmonic Bank Climate Predictor

The entropy of Vostok ice core data, together with our notion of a Kalman Filter Harmonic Bank (KFHB) Climate Prediction Engine (CPE), is introduced in this paper. In particular, we examine the CO2 Cycle 1 data (the most recent data cycle) and analyze the so-called spectral entropy of the CO2 harmonics obtained by standard Fast Fourier Transform (FFT) analysis. We also introduce the treatment of the Vostok data as a sample from a corresponding non-stationary stochastic process, for which, instead of the FFT, we can use the Karhunen-Loeve Expansion (KLE) for a set of discrete data values and the corresponding autocorrelation matrix, defining representation entropy as a broader concept than the spectral entropy of the FFT. Initial results for spectral entropy are presented as a measure of the informational effectiveness of amplitude and energy analysis, which determines the set of signal harmonics implemented in the form of a KFHB, where each harmonic is generated by a two-state Kalman filter. The total signal is then represented as the sum of a set of amplitude- or energy-significant harmonics (hence the name Kalman Filter Harmonic Bank). Spectral entropy calculations point to a suitable number of FFT-generated harmonics to be used for signal synthesis by harmonic truncation. We also analyze the use of amplitude versus energy (amplitude squared) as the basis for the entropy calculations. Similarly, in the case of the KLE, representation entropy would play the same role. Ultimately, we are working to implement this approach in an effective machine learning short- and long-term CPE. It is critical to perform very detailed time- and frequency-domain data analysis as a solid base for the CPE methodology for modelling variations in climate.
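Spectral entropy as used above can be sketched in a few lines: normalise the FFT amplitude or energy spectrum to a probability distribution and take its Shannon entropy. The `use_energy` switch mirrors the amplitude-versus-energy comparison mentioned in the abstract; the normalisation by log2 of the number of bins is a common convention, assumed here for illustration.

```python
import numpy as np

def spectral_entropy(signal, normalise=True, use_energy=True):
    """Shannon entropy of the normalised FFT spectrum of a signal.

    Low values: spectral content concentrated in few harmonics (a few
    harmonics suffice for synthesis); values near 1: broadband content.
    """
    # one-sided spectrum of the mean-removed signal, DC bin dropped
    spec = np.abs(np.fft.rfft(signal - np.mean(signal)))[1:]
    w = spec ** 2 if use_energy else spec   # energy vs amplitude weighting
    p = w / w.sum()                          # spectrum as a distribution
    p = p[p > 0]
    h = -(p * np.log2(p)).sum()
    return h / np.log2(len(w)) if normalise else h
```

A pure sinusoid gives a spectral entropy near zero, while white noise gives a value near one, which is why a low spectral entropy justifies truncating the harmonic bank to a small number of Kalman-filtered harmonics.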

Measuring Functional Connectivity of Human Intra-Cortex Regions with Total Correlation

The economy of brain organization allows the primate brain to consume little energy while remaining efficient. Neurons are densely wired to each other through both anatomical structural connectivity and functional connectivity. Here, I describe only functional connectivity, using Functional Magnetic Resonance Imaging (fMRI) data. A key question is how to quantitatively measure the information shared or separated among functional brain regions; what is worse, fMRI data suffer from high dimensionality, the so-called "curse of dimensionality" [1]. The multivariate total correlation method, however, can address these problems. In this paper, two things are measured with the information-theoretic technique of total correlation [2, 3, 4]. First and foremost, we quantitatively measure, from an information-theoretic view, how intra-cortex regions depend on, or are independent from, each other. Second, quantitative measures of intra-cortex functional connectivity play a crucial role in clinical mental diagnosis.

The brain is sensitive to the perceptual environment and adapts and responds to the outside world. Information integration and separation happen in the brain while consuming little energy [5, 6]. In other words, the brain can be treated as an energetic, entropic, physically complex system; it is a naturally perfect stochastic complex system. Mathematically, brain function can be denoted as R = S(X), where X represents stimuli from the outside world, S refers to an unknown nonlinear function that describes the brain's information processing, and R denotes the response. In this paper, we use functional brain atlases to extract time series and then estimate the information with total correlation, which can capture non-linear and even non-monotonic relationships in high-dimensional data, in contrast to other approaches such as Spearman's ρ [7] and Kendall's τ [8].

In summary, this paper presents estimates of the functional connectivity of intra-cortex brain regions and addresses non-linear relationships in brain signals through total correlation.
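For reference, total correlation is the sum of the marginal entropies minus the joint entropy, TC(X1, ..., Xn) = sum_i H(Xi) - H(X1, ..., Xn). The sketch below evaluates the Gaussian closed form from the correlation matrix of the region time series; note that this baseline only captures linear dependence, whereas the estimators discussed above target general non-linear relationships.

```python
import numpy as np

def gaussian_total_correlation(ts):
    """Total correlation under a Gaussian approximation (in nats).

    For jointly Gaussian variables, TC = -0.5 * log(det(R)), where R is
    the correlation matrix of the time series (rows = time points,
    columns = regions). Zero iff the regions are uncorrelated.
    """
    r = np.corrcoef(np.asarray(ts, dtype=float), rowvar=False)
    sign, logdet = np.linalg.slogdet(r)  # numerically stable log-det
    return -0.5 * logdet
```

For region time series extracted with a functional atlas, a larger value indicates more redundant (shared) information among the regions; independent regions give a value near zero.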