I'm used to doing Gaussian process regression in GPflow, which lets you solve for the posterior analytically:

    import gpflow as gp
    from gpflow.kernels import RBF, White, Periodic, Linear

    k = RBF(x.shape[1]) + White(x.shape[1])
    m = gp.models.GPR(x, y, k)
    self.model = m
    m.compile()
    opt = gp.train.ScipyOptimizer()
    opt.minimize(m)

I've recently moved to PyMC3 and am trying to accomplish the same thing as above. I've found this bit of code in the documentation (https://docs.pymc.io/notebooks/GP-slice-sampling.html#Examine-actual-posterior-distribu…)…

Is it possible to write decision-making models in either Stan or PyMC3? By that I mean: we define not only the distributions of the random variables, but also decision and utility variables, and then determine the decisions that maximize expected utility.

My understanding is that Stan is more of a general optimizer than PyMC3, which suggests decision models would be more directly implemented there, but I would like to hear what people have to say.

Edit: While it is possible to enumerate all decisions and compute their corresponding expecte…
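One common workaround (not a built-in feature of either library) is to separate inference from decision-making: draw posterior samples first, then average a utility function over them for each enumerated decision. A numpy sketch with a hypothetical utility function and synthetic stand-in "posterior" samples:

```python
import numpy as np

rng = np.random.default_rng(0)
# stand-in for posterior samples of an unknown demand parameter theta
theta_samples = rng.normal(loc=10.0, scale=2.0, size=5000)

def utility(decision, theta):
    # hypothetical newsvendor-style utility: revenue 3 per unit sold, cost 1 per unit stocked
    return 3.0 * np.minimum(decision, theta) - 1.0 * decision

decisions = np.arange(0, 21)
# Monte Carlo estimate of expected utility for each enumerated decision
expected_utility = [utility(d, theta_samples).mean() for d in decisions]
best = int(decisions[np.argmax(expected_utility)])
```

The same pattern works with a real PyMC3 trace (or Stan draws) in place of theta_samples; the inference engine never needs to know about the utility.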

The code is in PyMC3, but this is a general problem. I want to find which matrix (combination of variables) gives me the highest probability. Taking the mean of the trace of each element is meaningless because the elements depend on each other. Here is a simple case; the code uses a vector rather than a matrix for simplicity. The goal is to find a vector of length 2, where each value is between 0 and 1 and the values sum to 1.

    import numpy as np
    import theano
    import theano.tensor as tt
    import pymc3 as mc

    # define a theano Op for our likelihood function
    clas…

I am trying to define a pymc3.Normal variable with the following as mu:

    import numpy as np
    import pymc3 as pm

    mx = np.array([[0.25, 0.5,   0.75, 1.  ],
                   [0.25, 0.333, 0.25, 0.  ],
                   [0.25, 0.167, 0.,   0.  ],
                   [0.25, 0.,    0.,   0.  ]])

    epsilon = pm.Gamma('epsilon', alpha=10, beta=10)
    p_ = pm.Normal('p_', mu=mx, shape=mx.shape, sd=epsilon)

The problem is that all random variables in p_ get the same sd (epsilon). I would like the first row to use epsilon1, the second row epsilon2, etc. How can…

I'm new to PyMC3 and I'm trying to create the categorical mixture model shown in https://en.wikipedia.org/wiki/Mixture_model#Categorical_mixture_model . I'm having difficulty hooking up the 'x' variable. I think it's because I have to make the z variable Deterministic, but I'm getting an error message at the line where 'x' is assigned: "ValueError: We expected 3 inputs but got 2." It looks like the p function only accepts 2 inputs, so I'm stuck. I've tried a bunch of different things, but haven't been able to get this to work yet.

    import nu…

I am working with a simple bivariate normal model with a somewhat unconventional prior. The main issue I have is that my posteriors are inconsistent from one run to the next, which I'm guessing is related to high dependence between consecutive samples. Here are my specific questions.

What is the best way to get N independent samples? At the moment, I've been calling sample() to get a big chain (e.g. length 10,000) and then taking every 100th sample starting at 1,000. But looking now at an autocorrelation profile of one of the par…
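Rather than thinning by a guessed stride, one can measure how correlated the chain actually is at a given lag. A numpy-only sketch of the lag-k autocorrelation, using a synthetic AR(1) chain as a stand-in for one parameter's trace:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic AR(1) chain mimicking an MCMC trace with lag-1 autocorrelation ~0.9
chain = np.empty(10_000)
chain[0] = 0.0
for i in range(1, len(chain)):
    chain[i] = 0.9 * chain[i - 1] + rng.normal()

def autocorr(x, lag):
    # sample autocorrelation at the given lag
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

r1 = autocorr(chain, 1)    # high: consecutive draws are strongly dependent
r50 = autocorr(chain, 50)  # ~0.9**50, near zero: every 50th draw is roughly independent
```

Picking the thinning stride where this curve drops to roughly zero gives approximately independent draws; PyMC3's summary output also reports an effective sample size that packages the same idea (recent versions delegate the computation to ArviZ).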

I am trying to build a model for the likelihood function of a particular outcome of a Langevin equation (a Brownian particle in a harmonic potential). Here is my model in pymc2, which seems to work: https://github.com/hstrey/BayesianAnalysis/blob/master/Langevin%20simulation.ipynb

    # define the model/function to be fitted
    def model(x):
        t = pm.Uniform('t', 0.1, 20, value=2.0)
        A = pm.Uniform('A', 0.1, 10, value=1.0)

        @pm.deterministic(plot=False)
        def S(t=t):
            return 1 - np.exp(-4 * delta_t / t)

        @pm.deterministic(plot=False)
        def s(t=t):
            …

I am replicating in pymc3 some of the examples presented in "Think Bayes" by Allen Downey. His great book provides introductory examples of Bayesian methods, implemented with Allen's own library. There is the "train problem", where you need to estimate how many trains a company has based on the numbers you see painted on them (the trains are numbered from 1 to N). The likelihood for this problem is basically:

    def likelihood(self, data, hypo):
        if data > hypo:
            return 0
        return 1 / hypo

    for data in stream:
        for hypo in hypo…

In the PyMC3 examples, priors and likelihood are defined inside a with statement, but nothing explicitly marks which is which. How do I define them? In the following example code, alpha and beta are priors and y_obs is the likelihood (as the PyMC3 examples state). My question is: how does PyMC3's internal code find out whether a distribution is a prior or a likelihood? There should be some explicit parameter to tell PyMC3 the type of the distribution (prior/likelihood). I know y_obs is the likelihood, but I could define more: y_obs1, y_obs2. How PyMC…

I am trying to use PyMC3 to fit a model to some observed data. The model is based on external code (interfaced via theano.ops.as_op) and depends on multiple parameters that should be fit by the MCMC process. Since the gradient of the external code cannot be determined, I use the Metropolis-Hastings sampler. I have established Uniform priors for my inputs and generate a model using my custom code. However, I want to compare the simulated data to my observations (a 3D np.ndarray) using the chi-squared statistic (the sum of the squares of data − model…

What are the values returned by find_MAP in pymc3? It seems that pymc3.Normal and pymc3.Uniform variables are not treated the same: for pymc3.Normal variables, find_MAP returns a value that looks like the maximum a posteriori probability. But for pymc3.Uniform variables, I get an '_interval' suffix added to the name of the variable, and I find nowhere in the doc the meaning of the returned value (which may seem absurd, not even within the physical limits). For example:

    import numpy as np
    import pymc3 as pm3

    # create basic data such as obs =…
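The '_interval' entry is the variable on PyMC3's unconstrained scale: a bounded variable on (a, b) is internally mapped to the whole real line by the logit-style interval transform y = log((x − a)/(b − x)), which is why the reported number can lie outside the physical limits. The back-transform is (a numpy sketch, with assumed bounds):

```python
import numpy as np

def interval_backward(y, a, b):
    # inverse of y = log((x - a) / (b - x)); maps any real y back into (a, b)
    return a + (b - a) / (1.0 + np.exp(-y))

# e.g. a Uniform(0, 10) whose find_MAP output was x_interval__ = -2.0
x = interval_backward(-2.0, 0.0, 10.0)  # a value strictly inside (0, 10)
```

Recent pymc3 versions include both the transformed ('x_interval__') and back-transformed ('x') entries in the dict returned by find_MAP, so the untransformed value is usually already there.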

I failed to fit a method belonging to an instance of a class as a Deterministic function with PyMC3. Can you show me how to do that? For simplicity, my case is summarised below with a simple example. In reality, my constraint is that everything is done through a GUI, and actions like 'find_MAP' should live inside methods linked to pyqt buttons. I want to fit the function 'FunctionIWantToFit' to the data points. The problem is that the following code:

    import numpy as np
    import pymc3 as pm3
    from scipy.interpolate import interp1d
    import theano.tensor as tt
    import…

Can anyone tell me what's wrong in my code below? I am a casual user of pymc2, generally for solving physical equations, and I am having trouble adapting a fit to pymc3; the documentation seems unclear to me. I also did not recognize my problem on the forums, probably because I don't know what my problem is. I use the find_MAP method to get a first guess of the fitted values, but this first guess is completely wrong (not even inside the physical limits), and a warning tells me there are discrete variables (which is wrong), implying the gradient is not availabl…

Does anyone know how I can see the final acceptance rate in PyMC3 (Metropolis-Hastings)? Or, in general, how can I see all the information that pymc3.sample() returns? Thanks…

I have a list of observed data score and a list of indices ind. Every element of ind is either 0, 1, or 2. score and ind have the same length, and ind partitions score into three sets: if ind[i] is k, then score[i] is in set k. I would like to fit three normal distributions to the data, one normal for set 0, one normal for set 1, and one normal for set 2. My PyMC3 code to set up the model is:

    with pm.Model():
        mean = pm.Uniform('mean', 0, 1, shape=3)
        sd = pm.Uniform('sd', 0, 1, shape=3)
        mean_i = pm.Deterministic('mean_i', mean[ind])
        …