
Minimising the Kullback–Leibler Divergence for Model Selection in Distributed Nonlinear Systems

The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theor...
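For context, the KL divergence referred to in the abstract has the standard textbook definition shown below for discrete distributions $p$ and $q$; this is the general formula, not the paper's specific decomposition for distributed nonlinear systems, which the truncated abstract does not spell out.

\[
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}
\]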

Full Description

Bibliographic Details
Published in: Entropy (Basel)
Main Authors: Cliff, Oliver M., Prokopenko, Mikhail, Fitch, Robert
Format: Article
Language: English
Published: MDPI 2018
Online Access: https://ncbi.nlm.nih.gov/pmc/articles/PMC7512642/
https://ncbi.nlm.nih.gov/pubmed/33265171
http://dx.doi.org/10.3390/e20020051