I'm familiar with the frequentist/Bayesian debate. In my view, Bayesian estimation is only useful for regularization, which is only necessary when you have a small amount of data, or the parameter optimization is underdetermined or multimodal. The goal is always to estimate the (parameters of the) distribution of the phenomenon of interest, or to make a decision (optimize some function of random quantities) with prior estimates on the relevant distributions. It is my understanding that Bayesians also believe in fixed distributions for given phenomena. It's simply the case that the true parameters are unknown and must be estimated from limited data. It's not the case that the distribution is in some nebulous state with the parameters assuming new values with each realization. Having imprecise knowledge about probabilities is different from having imprecise probabilities, whatever the latter may be defined to be.
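To make that distinction concrete with a minimal sketch (the counts and the uniform prior below are just illustrative choices): a Beta posterior over a coin's unknown bias is imprecise knowledge about a probability, yet the predictive probability of the next flip, obtained by marginalizing the bias out, is a single ordinary number.

import numpy as np

# Coin with unknown bias theta.  Imprecise *knowledge* about the
# probability of heads = a posterior over theta, here Beta(1+k, 1+n-k)
# after observing k heads in n flips under a uniform Beta(1, 1) prior.
k, n = 3, 10
a, b = 1 + k, 1 + (n - k)

# Marginalizing theta out still yields one ordinary probability for the
# next flip: E[theta] = a / (a + b).
print(a / (a + b))                             # 0.333..., a precise number

# Monte Carlo check of the marginalization: draw theta from the
# posterior, then flip the coin with that theta.
rng = np.random.default_rng(0)
thetas = rng.beta(a, b, size=200_000)
print((rng.random(200_000) < thetas).mean())   # ~ 0.333 as well

The uncertainty lives entirely in the parameter, not in the probability of the observable event.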
The main point I wanted to make, which got lost in the preliminary philosophical commentary, was that it doesn't make sense to specify a probability distribution individually for each probability in a discrete distribution. If you want to regularize your estimate of the discrete distribution in some way, then you can put a prior on the set of discrete values as a whole, e.g. requiring adjacent probabilities to be close in value. But you can't put priors on the individual probabilities without regard for the joint distribution, since the marginal distributions of the probabilities must be consistent with the fact that each realization of the set must sum to 1.
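As a minimal sketch of what I mean (the Beta(2, 8) and Dirichlet parameters below are arbitrary illustrative choices): independent priors on the individual probabilities ignore the simplex, whereas a joint prior such as the Dirichlet respects it by construction, and its Beta marginals are then automatically consistent with the sum-to-1 constraint.

import numpy as np

rng = np.random.default_rng(0)

# Independent priors on each probability ignore the simplex constraint:
# a draw of (p1, ..., p5) from five independent Betas essentially never
# sums to 1, so these "marginals" cannot belong to any joint prior on
# the simplex.
indep = rng.beta(2.0, 8.0, size=(100_000, 5))
print(np.isclose(indep.sum(axis=1), 1.0).mean())   # ~ 0.0

# A joint prior on the whole probability vector, e.g. Dirichlet(alpha),
# respects the constraint by construction; its marginals are
# Beta(alpha_i, alpha.sum() - alpha_i), consistent with summing to 1.
alpha = np.full(5, 2.0)
joint = rng.dirichlet(alpha, size=100_000)
print(np.allclose(joint.sum(axis=1), 1.0))         # True
print(joint[:, 0].mean())                          # ~ alpha[0] / alpha.sum() = 0.2

A requirement of the kind mentioned above, that adjacent probabilities be close in value, would likewise have to be imposed on the joint prior, e.g. through a logistic-normal prior with a correlated covariance, not entry by entry.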
Jason

-----Original Message-----
From: Arvey, Aaron [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 17, 2005 10:12 AM
To: Jason Palmer; Lotfi Zadeh; uai@engr.orst.edu
Subject: RE: [UAI] Imprecise Probabilities--A simple and yet computationally nontrivial problem

See below for response.

> -----Original Message-----
> From: [EMAIL PROTECTED] on behalf of Jason Palmer
> Sent: Mon 11/14/2005 3:42 AM
> To: Lotfi Zadeh; uai@engr.orst.edu
> Subject: Re: [UAI] Imprecise Probabilities--A simple and yet computationally nontrivial problem
>
> Dear Prof. Zadeh (and others),
>
> My view is that the notion of imprecise probabilities is not well-defined.
> It seems that you are imagining something like: before the experiment is
> performed, the probability distribution is drawn randomly from some set of
> distributions, and then the experiment is performed based on that
> distribution. But it seems to me that these intermediary "probabilities"
> aren't really probabilities then; they are model parameters. There will
> still be a final distribution over the values of the outcome of the
> experiment, which in your model is found by marginalizing over the
> "random probabilities".
>
> To me, the essence of probability is that some event occurs according to a
> fixed distribution. A probability distribution can't apply to only one
> experiment. Whatever event is being observed either will or will not occur
> with some stationary distribution. If the distribution itself is random,
> then it is impossible to distinguish any particular generating sequence of
> random variables: any model that yields the same final distribution when
> marginalized is possible. You might be able to talk about the distribution
> evolving in time.

Perhaps the misunderstanding is in (somewhat philosophical) statistics. A frequentist statistician would likely agree with you and say that a maximum likelihood point estimate should be used. However, a Bayesian statistician would likely state that parameters of a model should also be modeled probabilistically if they are unknown. In the distant past, mathematical Bayesians were mocked for their imprecision; however, the Bayesian view has now been largely accepted in both the mathematical statistics community and the computational community. A cursory Google search for "comparison of bayesian and frequentist" turned up the following link to a presentation by a statistician: http://www.pnl.gov/bayesian/Berry/index.htm

I hope that this has shed some light on the situation.

Regards,
Aaron

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Lotfi Zadeh
Sent: Wednesday, November 09, 2005 5:15 PM
To: uai@ENGR.ORST.EDU
Cc: [EMAIL PROTECTED]
Subject: [UAI] Imprecise Probabilities--A simple and yet computationally nontrivial problem

Most real-world probabilities are imprecise. For this reason, as we move further into the age of machine intelligence and mechanized decision-making, the problem of how to deal with imprecise probabilities is certain to grow in visibility and importance.

A major contribution to the theory of imprecise probabilities was Peter Walley's 1991 book "Statistical Reasoning with Imprecise Probabilities," London: Chapman and Hall. Since then, considerable progress has been made. And yet, it is clear that the problem of computation with imprecise probabilities is intrinsically complex and far from a definitive solution.

As a case in point, I posted to the UAI list (September 22, 2005) a seemingly simple problem which does not have a simple solution. In the following, a broadened version of the problem is presented. The simplest version is Problem (a). My perception is that even this simple problem is computationally nontrivial. Do you have a simple solution? Do you have any solutions to Problems (b), (c) and (d)?

Problem: X and Y are random variables taking values in the set {1, 2, ..., n}. The entries in the joint probability matrix, P, are of the form "approximately aij," where the aij take values in the unit interval and add up to unity. What is the marginal probability distribution of X?

Four special cases:
(a) "approximately aij" is interpreted as an interval centering on aij;
(b) "approximately aij" is interpreted as a triangular fuzzy number centering on aij;
(c) "approximately aij" is interpreted as a uniform probability distribution over an interval centering on aij; and
(d) "approximately aij" is interpreted as a triangular probability density function centering on aij.

With warm regards to all,

Lotfi

--
Lotfi A. Zadeh
Professor in the Graduate School, Computer Science Division
Department of Electrical Engineering and Computer Sciences
University of California
Berkeley, CA 94720-1776
Director, Berkeley Initiative in Soft Computing (BISC)
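For concreteness, under one simple reading of Problem (a), with "approximately aij" taken as the interval [aij - eps, aij + eps] clipped to [0, 1] and the entries jointly constrained to sum to 1, each marginal probability of X is itself an interval whose tight endpoints have a closed form (the general statement is a small linear program). A minimal sketch of that reading follows; the half-width eps and the example matrix are illustrative choices, not part of the original problem.

import numpy as np

def marginal_bounds(A, eps):
    """Tight bounds on P(X = i) = sum_j p_ij when each p_ij is only
    known to lie in [a_ij - eps, a_ij + eps] (clipped to [0, 1]) and
    the whole matrix must sum to 1.  Assumes the constraints are
    feasible: sum of lower bounds <= 1 <= sum of upper bounds."""
    lo = np.clip(A - eps, 0.0, 1.0)        # entrywise lower bounds
    hi = np.clip(A + eps, 0.0, 1.0)        # entrywise upper bounds
    L, U = lo.sum(axis=1), hi.sum(axis=1)  # row-wise bound sums
    # Row i can be no smaller than what remains after every other row
    # is pushed to its upper bound, and no larger than what remains
    # after every other row is pushed to its lower bound.
    lower = np.maximum(L, 1.0 - (U.sum() - U))
    upper = np.minimum(U, 1.0 - (L.sum() - L))
    return lower, upper

A = np.array([[0.10, 0.20],
              [0.30, 0.40]])               # nominal a_ij, summing to 1
print(marginal_bounds(A, eps=0.05))        # P(X=1) in [0.2, 0.4], P(X=2) in [0.6, 0.8]

Cases (b), (c) and (d) replace the interval with a fuzzy number or a probability distribution on each entry; the same sum-to-1 coupling applies, and they are not addressed by this sketch.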