Commit

Final max commit
Dhruvanshu-Joshi committed Jul 4, 2023
1 parent 8125d08 commit 9759720
Showing 1 changed file with 1 addition and 99 deletions.
100 changes: 1 addition & 99 deletions pymc/logprob/order.py
@@ -111,105 +111,7 @@ def find_measurable_max(fgraph: FunctionGraph, node: Node) -> Optional[List[Tens

@_logprob.register(MeasurableMax)
def max_logprob(op, values, base_rv, **kwargs):
r"""Compute the log-likelihood graph for the `Max` operation.
Parameters
----------
op : Max-Op
values : tensor_like
rv : TensorVariable
Returns
-------
logprob : TensorVariable
Examples
--------
It is often desirable to find the log-probability of the maximum of i.i.d. random variables.
The "max of i.i.d. random variables" refers to finding the maximum value among a collection of random variables that are independent and identically distributed.
The example below illustrates how to find the Maximum from the distribution of random variables.
.. code-block:: python

    import pytensor.tensor as pt

    x = pt.random.normal(0, 1, size=(3,))
    x.name = "x"
    print(x.eval())
    # [0.61748772 1.08723759 0.98970957]

    x_max = pt.max(x, axis=None)
    print(x_max.eval())
    # 1.087237592696084
The log-probability of the maximum of :math:`n` i.i.d. random variables quantifies how likely
it is to observe a specific maximum value in such a collection. The formula used here is

.. math::

    \ln f_{(n)}(x) = \ln(n) + (n - 1) \ln F(x) + \ln f(x)

where :math:`f(x)` is the p.d.f. and :math:`F(x)` is the c.d.f. of the common distribution.
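For context (a standard derivation, not part of the original docstring), this formula follows from the c.d.f. of the maximum of i.i.d. variables:

```latex
% For i.i.d. X_1, ..., X_n, the c.d.f. of the maximum factorizes:
F_{(n)}(x) = P\left(\max_i X_i \le x\right) = \prod_{i=1}^{n} P(X_i \le x) = F(x)^n
% Differentiating with respect to x gives the density of the maximum:
f_{(n)}(x) = n \, F(x)^{n-1} f(x)
% Taking logarithms recovers the expression used in max_logprob:
\ln f_{(n)}(x) = \ln(n) + (n - 1) \ln F(x) + \ln f(x)
```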
An example corresponding to this is illustrated below:

.. code-block:: python

    import pytensor.tensor as pt
    from pymc import logp

    x = pt.random.uniform(0, 1, size=(3,))
    x.name = "x"
    # [0.09081509 0.84761712 0.59030273]

    x_max = pt.max(x, axis=-1)
    # 0.8476171198716373

    x_max_value = pt.scalar("x_max_value")
    x_max_logprob = logp(x_max, x_max_value)

    test_value = x_max.eval()
    x_max_logprob.eval({x_max_value: test_value})
    # 0.7679597791946853
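As a quick sanity check (added here, not from the docstring), the closed form can be evaluated directly for the Uniform(0, 1) case, where :math:`F(x) = x` and :math:`f(x) = 1`:

```python
import math

# For n i.i.d. Uniform(0, 1) variables, F(x) = x and f(x) = 1, so
# the formula reduces to: ln f_(n)(x) = ln(n) + (n - 1) * ln(x)
n = 3
x = 0.8476171198716373  # the observed maximum from the example above
logprob = math.log(n) + (n - 1) * math.log(x)
print(logprob)  # agrees with the ~0.768 value returned by pymc's logp above
```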
The current implementation has a few limitations. First, only pure random variables are
supported; `logp` fails for expressions derived from them:
.. code-block:: python

    import pytensor.tensor as pt
    from pymc import logp

    x = pt.exp(pt.random.beta(0, 1, size=(3,)))
    x.name = "x"
    x_max = pt.max(x, axis=-1)
    x_max_value = pt.vector("x_max_value")
    x_max_logprob = logp(x_max, x_max_value)

The code above raises a runtime error stating that the logprob method was not implemented,
because `x` here is not a pure random variable. A pure random variable in PyMC represents an
unknown quantity in a Bayesian model; it is associated with a prior distribution that is
combined with the likelihood of the observed data to obtain the posterior distribution
through Bayesian inference.
Second, only univariate distributions are assumed, since for multivariate variables the
concept of ordering is ambiguous and would require a "depth function".
Third, only independent and identically distributed random variables are considered for now.
In probability theory and statistics, a collection of random variables is i.i.d. if each
variable has the same probability distribution as the others and all are mutually independent.
.. code-block:: python

    import pymc as pm
    import pytensor.tensor as pt
    from pymc import logp

    x = pm.Normal.dist([0, 1, 2, 3, 4], 1, shape=(5,))
    x.name = "x"
    x_max = pt.max(x, axis=-1)
    x_max_value = pt.vector("x_max_value")
    x_max_logprob = logp(x_max, x_max_value)

The code above raises a runtime error stating that the logprob method was not implemented,
because `x` here is not an i.i.d. distribution: its components have different means.
Note: We use a rather loose definition of i.i.d. here: a collection of RVs is treated as
i.i.d. as long as the RVs do not have different stochastic ancestors.
"""
r"""Compute the log-likelihood graph for the `Max` operation."""
(value,) = values

logprob = _logprob_helper(base_rv, value)
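The remainder of the function body is collapsed by the diff viewer. Independently of the PyMC graph machinery, the same quantity can be cross-checked with SciPy; the helper below is a sketch added for illustration (it is not part of the commit), implementing the formula from the docstring for any frozen SciPy distribution:

```python
import numpy as np
from scipy import stats

def max_logpdf(x, dist, n):
    """Log-density of the maximum of n i.i.d. draws from `dist`:
    ln(n) + (n - 1) * logcdf(x) + logpdf(x)."""
    return np.log(n) + (n - 1) * dist.logcdf(x) + dist.logpdf(x)

# e.g. the log-density of the maximum of 3 standard normals, evaluated at 1.0
print(max_logpdf(1.0, stats.norm(), 3))
```

Because `max_logpdf` is a proper log-density, exponentiating and integrating it over the support should give 1, which makes it easy to validate numerically.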
