We are pleased to announce the release of PRISM 2.1, which is available for
download at:

  http://sato-www.cs.titech.ac.jp/prism/

This is a major update, enhanced by three new inference methods (VT, VB-VT,
and MCMC) in addition to the already available ones (EM, MAP-EM, VB-EM, and
DA-EM).

PRISM is a logic-based probabilistic modeling language offering a declarative
interface between users and machine learning tasks. As a probabilistic
extension of Prolog, it is Turing-complete and covers a wide range of known
models (BNs, HMMs, PCFGs, etc.) as well as unexplored models that you define
as PRISM programs. PRISM eases the pain of probabilistic modeling by combining
high-level logical expressions with high-level built-in predicates for the
machine learning tasks listed below.
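To give a taste of the modeling style, here is a minimal sketch in the spirit
of the classic coin example from the PRISM manual (the predicate and switch
names are illustrative, not part of any shipped program):

  % A biased coin modeled with a probabilistic switch.
  % values/2 declares the outcome space of switch 'coin';
  % msw/2 samples an outcome according to the switch's parameters.
  values(coin, [head, tail]).

  direction(D) :-
      msw(coin, Face),
      ( Face == head -> D = left ; D = right ).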

[1] Exact probability computation: efficient dynamic programming used
[2] Sampling:
     forward sampling, MCMC (Metropolis-Hastings style) for Bayesian inference
[3] Parameter learning:
     EM, DA-EM (deterministic annealing EM), MAP-EM, VT (Viterbi training,
     hard EM)
[4] Approximate Bayesian inference: VB (variational Bayes)-EM, VB-VT
[5] Viterbi inference:
     by parameters or by posterior distributions obtained from MCMC
[6] Model score computation:
     BIC, Cheeseman-Stutz score, VFE (variational free energy),
     log marginal likelihood via MCMC
[7] Computing standard statistics
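The built-in predicates behind these tasks are invoked as ordinary Prolog
goals. For a toy coin program, a session might look like the following sketch
(predicate names follow the PRISM manual, but check the manual of your
version for exact usage):

  ?- set_sw(coin, [0.7, 0.3]).       % set switch parameters by hand
  ?- sample(direction(D)).           % [2] forward sampling
  ?- prob(direction(left), P).       % [1] exact probability computation
  ?- learn([direction(left), direction(right)]).  % [3] EM learning from goals
  ?- viterbif(direction(left)).      % [5] most probable explanation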

You can try these functionalities immediately with the many test programs
that come with the self-contained user manual in the PRISM package. PRISM
helps you identify and develop the best (Bayesian or non-Bayesian) model for
your problem.


With Best Regards,

Taisuke Sato, Neng-Fa Zhou, Yoshitaka Kameya
_______________________________________________
uai mailing list
uai@ENGR.ORST.EDU
https://secure.engr.oregonstate.edu/mailman/listinfo/uai