
A Kullback-Leibler Divergence for Bayesian Model Diagnostics

This paper considers a Kullback-Leibler divergence (KLD) which is asymptotically equivalent to the KLD of Goutis and Robert [1] when the reference model (in comparison with a competing fitted model) is correctly specified and certain regularity conditions hold (cf. Akaike [2]). We derive the...
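As background to the abstract (this is the standard Kullback-Leibler divergence, not the paper's specific diagnostic statistic), a minimal sketch comparing the closed-form KLD between two univariate Gaussians with a direct numerical integration of \( \int p(x)\log\{p(x)/q(x)\}\,dx \); all function names here are illustrative, not from the paper:

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) ) for univariate Gaussians."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def kl_numeric(mu1, s1, mu2, s2, lo=-20.0, hi=20.0, n=20000):
    """Midpoint-rule approximation of the integral defining KL(p || q)."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = math.exp(-(x - mu1)**2 / (2 * s1**2)) / (s1 * math.sqrt(2 * math.pi))
        q = math.exp(-(x - mu2)**2 / (2 * s2**2)) / (s2 * math.sqrt(2 * math.pi))
        if p > 0.0:
            total += p * math.log(p / q) * dx
    return total
```

The divergence is zero when the two models coincide and grows as the fitted model departs from the reference model, which is the property the paper's diagnostic exploits.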

Bibliographic Details
Published in: Open J Stat
Main Authors: Wang, Chen-Pin, Ghosh, Malay
Format: Article
Language: English
Published: 2011
Subjects:
Online Access: https://ncbi.nlm.nih.gov/pmc/articles/PMC4235748/
https://ncbi.nlm.nih.gov/pubmed/25414801
http://dx.doi.org/10.4236/ojs.2011.13021