
Minimising the Kullback–Leibler Divergence for Model Selection in Distributed Nonlinear Systems

The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theoretic measures […]
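For context, a minimal sketch of the standard textbook definition of the KL divergence between two discrete distributions p and q over a common alphabet (the paper's specific decomposition of this quantity into component measures appears only in the full text, not in the truncated abstract above):

    \[
      D_{\mathrm{KL}}(p \,\|\, q) \;=\; \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)},
    \]

with the usual conventions that 0 log 0 = 0, and that the divergence is +∞ whenever q(x) = 0 for some x with p(x) > 0.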


Bibliographic Details
Published in: Entropy (Basel)
Main Authors: Cliff, Oliver M., Prokopenko, Mikhail, Fitch, Robert
Format: Article
Language: English
Published: MDPI 2018
Online access: https://ncbi.nlm.nih.gov/pmc/articles/PMC7512642/
https://ncbi.nlm.nih.gov/pubmed/33265171
http://dx.doi.org/10.3390/e20020051