Energy, entropy, and information potential for neural computation


Authors: Xu D.

Edition: PhD Thesis

Size: 1 MB (1177122 bytes)

Pages: 206

The major goal of this research is to develop general nonparametric methods for estimating entropy and mutual information, providing a unifying point of view for their use in signal processing and neural computation. In many real-world problems, the information is carried solely by data samples, without any other a priori knowledge. The central issue of "learning from examples" is therefore to estimate the energy, entropy, or mutual information of a variable from its samples alone, and to adapt the system parameters by optimizing a criterion based on that estimate.
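The "information potential" in the title points to one well-known route to such sample-only estimation: plugging a Parzen-window (Gaussian kernel) density estimate into Rényi's quadratic entropy reduces the entropy to a double sum of kernels over sample pairs. The sketch below illustrates that estimator in Python; the kernel width sigma, the function names, and the toy Gaussian data are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np

def information_potential(samples, sigma=0.5):
    """Parzen-window estimate of the 'information potential' V(X):
    the mean Gaussian kernel evaluated over all sample pairs."""
    x = np.asarray(samples, dtype=float).reshape(len(samples), -1)
    d = x.shape[1]
    # Pairwise squared distances between every pair of samples
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    # Convolving two Gaussian Parzen kernels of width sigma yields one of variance 2*sigma^2
    norm = (4.0 * np.pi * sigma ** 2) ** (-d / 2.0)
    return float(np.mean(norm * np.exp(-sq_dists / (4.0 * sigma ** 2))))

def renyi_quadratic_entropy(samples, sigma=0.5):
    """Renyi's quadratic entropy H2(X) = -log V(X), computed from samples only."""
    return -np.log(information_potential(samples, sigma))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    narrow = rng.normal(0.0, 0.5, size=200)  # tightly clustered samples -> lower entropy
    wide = rng.normal(0.0, 2.0, size=200)    # spread-out samples -> higher entropy
    print(renyi_quadratic_entropy(narrow), renyi_quadratic_entropy(wide))
```

In an adaptive setting, this pairwise kernel sum is differentiable with respect to the system outputs, so its gradient can drive parameter updates directly, without ever fitting a parametric density model.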
