MIT License
ParaMonte: plain powerful parallel Monte Carlo library.
Copyright (C) 2012-present, The Computational Data Science Lab
https://github.com/cdslaborg/paramonte

Simple power-law (or linear!) regression via the ParaDRAM sampler

NOTE
If you are viewing an HTML version of this MATLAB live script on the web, you can download the corresponding MATLAB live script (*.mlx) file for this HTML page at,
https://github.com/cdslaborg/paramontex/raw/main/MATLAB/mlx/regression_powerlaw_data_paradram/regression_powerlaw_data_paradram.mlx
This code, along with other MATLAB live scripts, is located at,
https://github.com/cdslaborg/paramontex/tree/main/MATLAB/mlx
Once you download the file, open it in MATLAB to view and interact with its contents, which are identical to what you see on this page.
NOTE
This ParaDRAM sampler example is also available in Python as a Jupyter Notebook, on this page,
https://nbviewer.jupyter.org/github/cdslaborg/paramontex/blob/main/Python/Jupyter/regression_powerlaw_data_paradram/regression_powerlaw_data_paradram.ipynb
This Jupyter Notebook, along with other Python Jupyter Notebooks, is located at,
https://github.com/cdslaborg/paramontex/tree/main/Python/Jupyter
IMPORTANT - macOS users
If you are a macOS (Darwin) user and you have downloaded the prebuilt ParaMonte libraries from the GitHub release page to use on your macOS system, read this troubleshooting page before continuing.

Setting up the path to the ParaMonte::MATLAB library

================================================================
First, we will clean up the MATLAB environment and make sure the path to the ParaMonte library is in MATLAB's path list.
clc;
clear all;
close all;
format compact; format long;
%%%%%%%%%%%% IMPORTANT %%%%%%%%%%%%%
% Set the path to the ParaMonte library:
% Change the following path to the ParaMonte library root directory,
% otherwise, make sure the path to the ParaMonte library is already added
% to MATLAB's path list.
pmlibRootDir = './';
addpath(genpath(pmlibRootDir),"-begin");
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% change MATLAB's working directory to the folder containing this script
% if MATLAB Live Scripts did not create a temporary folder, we would not
% have all of these problems!
try
    setwdFilePath = websave("setwd.m","https://github.com/cdslaborg/paramontex/raw/main/MATLAB/mlx/setwd.m");
    run(setwdFilePath); % a MATLAB script that changes the working directory; downloadable from the GitHub location given above.
catch % alas, we will have to run the simulations in MATLAB Live Script's temporary folder
    [currentDir,~,~] = fileparts(mfilename('fullpath'));
    cd(currentDir); % change the working directory to the source code directory.
end

Simple power-law (or linear!) regression

===============================================
Suppose we have collected a dataset comprising the following paired observations,

    X:  0.5   2.4    3.2    4.9    6.5    7.8
    Y:  0.8   9.3   37.9   68.2   155    198
and we would like to model the dependence of Y on X. The first thing to do in such problems is to explore the data and visualize it in search of any potential patterns, as we do below.
X = [0.5, 2.4, 3.2, 4.9, 6.5, 7.8];
Y = [0.8, 9.3, 37.9, 68.2, 155, 198];
figure; hold on; box on;
plot(X,Y,".", "markerSize", 30, "color","red");
xlabel("X");
ylabel("Y");
set(gca,"xscale","linear","yscale","linear");
hold off;
Obviously, the relationship between X and Y in this dataset is nonlinear. A fairly experienced data scientist could immediately guess the underlying relationship in this dataset: a power law, which can be readily confirmed by checking the relationship on a log-log plot. This works because if,

    Y = a * X^b ,

then,

    log(Y) = log(a) + b * log(X) ,

that is, a straight line in log-log space.
figure; hold on; box on;
plot(X,Y,".", "markerSize", 30, "color","red");
xlabel("X");
ylabel("Y");
set(gca,"xscale","log","yscale","log");
hold off
However, a simple line does not perfectly fit the (log-transformed) data either. Therefore, we must also take into account the inherent variability in our data around the linear fit. Although not always correct, the simplest and most popular assumption about the noise in data is that Y is distributed according to a Normal distribution around the best-fit line to our log-transformed data,

    log(Y) ~ Normal( μ(X), σ ) ,    with    μ(X) = intercept + slope * log(X) .

The unknown parameters of this Normal distribution, the mean (μ) and the standard deviation (σ), would then have to be determined by fitting. Fortunately, the mean of the Normal distribution can simply be taken to be the prediction of our best linear fit. Therefore, the mean is a function of the x value and of the slope and intercept of our linear fit to the log-transformed data. The standard deviation, however, must be fit and constrained separately.
Now that we have constructed our model and the noise, we can proceed to use the principle of Maximum Likelihood to constrain the unknown parameters of the model using the Markov Chain Monte Carlo approach that is available from the ParaMonte library via the ParaDRAM() sampler class.
To summarize, the unknown parameters are the slope and intercept of the linear fit and the standard deviation sigma of the Normal distribution.
The probability of observing a single data point is given by the above Normal Probability Density Function (PDF). Therefore, the probability of observing our entire dataset, assuming the observations are independent and identically distributed (i.i.d.), is simply the product of the individual probabilities,

    L( intercept, slope, σ ) = ∏_{i=1}^{n} π_Normal( log(y_i) | μ(x_i), σ ) .

The likelihood is difficult to work with directly because its value is often so tiny that it underflows in finite-precision computer arithmetic. Therefore, we prefer to work with the log-transformation of the likelihood, which does not suffer from the same problem,

    log L = − (n/2) log(2π) − n log(σ) − ∑_{i=1}^{n} ( log(y_i) − μ(x_i) )² / ( 2σ² ) .
We now go on to implement this log-likelihood as a MATLAB function. However, before doing so, we shall take care of an extremely important issue: The possible range of values of the parameters.
If you pay attention to the parameters, you will notice that the intercept and slope can in general be any real value, from negative infinity to positive infinity. However, the third parameter, the standard deviation of the Normal distribution, is a strictly positive-valued number. This range-asymmetry can cause instabilities in our Markov Chain sampler.
How can we solve this and symmetrize the range of sigma? The answer is simple: we will sample log(σ) instead of σ itself. So, our third sampling parameter will be logSigma instead of sigma, which has a symmetric range from negative infinity to positive infinity. Then, inside our likelihood function, we will convert logSigma back to sigma via sigma = exp(logSigma).
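The actual getLogLike() implementation appears at the end of this live script. As a rough sketch of what such a function can look like (this sketch is not the script's implementation; it hard-codes the X and Y data for self-containment), the log-likelihood derived above translates to:

```matlab
function logLike = getLogLike(param)
    % param(1): intercept, param(2): slope, param(3): logSigma.
    % The dataset is hard-coded here for self-containment.
    X = [0.5, 2.4, 3.2, 4.9, 6.5, 7.8];
    Y = [0.8, 9.3, 37.9, 68.2, 155, 198];
    mu = param(1) + param(2) * log(X(:)); % mean of log(Y), predicted by the linear fit
    sigma = exp(param(3));                % convert logSigma back to sigma
    resid = log(Y(:)) - mu;               % residuals in log space
    n = length(X);
    logLike = - n/2 * log(2*pi) ...       % Normal log-PDF, summed over all i.i.d. points
              - n * param(3) ...
              - sum(resid.^2) / (2 * sigma^2);
end
```

Evaluating this sketch at the test point [0.8, 2, 0.1] reproduces the value shown below.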
The getLogLike() function implementation is provided at the end of this live script. Now let's test this likelihood function with a test input value of the parameters to ensure it works. We will pick our best guess (obtained simply by visual inspection of the log-log plot of the data).
getLogLike([0.8,2,0.1])
ans = -6.501444916760644
NOTE
Since the mathematical objective functions (e.g., probability density functions) can take extremely small or large values, we often work with their natural logarithms instead. This is the reason behind the naming convention used in the ParaMonte library for the user's objective functions: getLogFunc, implying that the user must provide a function that returns the natural logarithm of the target objective function.
See the corresponding MATLAB live script for an in-depth discussion of why we need to work with the logarithm of mathematical objective functions in optimization and sampling problems.
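As a quick, hypothetical illustration of the underflow problem (not part of the original script): a product of many small probabilities vanishes in double precision, whereas the sum of their logarithms remains perfectly representable,

```matlab
% A density value far out in the tail underflows double-precision arithmetic:
exp(-1000)     % evaluates to 0: the information is lost
log(realmin)   % about -708.4: the most negative argument exp() can still resolve
% Working directly with the log-value (here, -1000) avoids the underflow entirely:
% products of probabilities become sums of log-probabilities,
%   log(p1 * p2 * ... * pn) = log(p1) + log(p2) + ... + log(pn)
```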
IMPORTANT
Note that MATLAB stores arrays in column-major order, and the input value, point, passed to getLogFunc() is a column vector. This means that the size of point is (ndim,1). Thus, all other vectors that interact with point must also be of size (ndim,1).
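For instance (a toy illustration with hypothetical variable names, not taken from this script), any auxiliary vector combined element-wise with point must share its column shape:

```matlab
ndim = 3;
point = [0.8; 2.0; 0.1]; % a (ndim,1) column vector, as the sampler passes to getLogFunc()
shift = zeros(ndim, 1);  % auxiliary vectors must also be (ndim,1) to conform
shifted = point + shift; % element-wise operations now work as intended
size(shifted)            % returns [3 1]
```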

Running a ParaDRAM simulation on a single processor

================================================================
We can now proceed to set up a ParaDRAM Markov Chain Monte Carlo sampler to constrain the three unknown parameters of the model.
Note: To run the ParaMonte samplers in parallel, visit the ParaMonte library's documentation website.
We will first create an instance of the paramonte class,
pm = paramonte();

Checking the ParaMonte library's version

==========================================
Before running any simulations, it is good to check which version of the ParaMonte library you have and are working with,
pm.version.interface.get() % get the ParaMonte::MATLAB (interface) version
ans = "ParaMonte MATLAB Interface Version 2.5.0"
pm.version.interface.dump()
ans = "2.5.0"
pm.version.kernel.get() % the ParaMonte::Kernel version (the version of the ParaDRAM sampler)
ans = "ParaMonte MATLAB Kernel Version 1.5.1"
pm.version.kernel.dump()
ans = "1.5.1"
We can also verify the existence of the essential components of the current installation of the ParaMonte library on the system,
pm.verify();
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

    [ParaMonte ASCII-art logo]

    ParaMonte
    plain powerful parallel
    Monte Carlo library
    Version 2.5.0

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

ParaMonte - NOTE: Intel MPI library for 64-bit architecture detected at:
ParaMonte - NOTE:
ParaMonte - NOTE: "C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2020.1.216\windows\mpi\intel64\bin"
ParaMonte - NOTE:
ParaMonte - NOTE: To perform ParaMonte simulations in parallel on a single node, run the
ParaMonte - NOTE: following two commands, in the form and order specified, on a MATLAB-aware,
ParaMonte - NOTE: mpiexec-aware command-line interface, such as the Intel Parallel Studio's command-prompt:
ParaMonte - NOTE:
ParaMonte - NOTE: "C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2020.1.216\windows\mpi\intel64\bin\mpivars.bat"
ParaMonte - NOTE:
ParaMonte - NOTE: mpiexec -localonly -n 3 matlab -batch "main_mpi"
ParaMonte - NOTE:
ParaMonte - NOTE: where,
ParaMonte - NOTE:
ParaMonte - NOTE: 0. the first command defines the essential environment variables and,
ParaMonte - NOTE:    the second command runs in the simulation in parallel, in which,
ParaMonte - NOTE: 1. you should replace the default number 3 with the number of
ParaMonte - NOTE:    processors you wish to assign to your simulation task,
ParaMonte - NOTE: 2. main_mpi.m is the MATLAB file which serves as the entry point to
ParaMonte - NOTE:    your simulation, where you call the ParaMonte sampler routines.
ParaMonte - NOTE:    ATTN: Notice the MATLAB filename must appear without the file extension.
ParaMonte - NOTE:    ATTN: Notice the MATLAB filename must be enclosed with quotation marks.
ParaMonte - NOTE: 3. -localonly indicates a parallel simulation on only a single node (this
ParaMonte - NOTE:    flag will obviate the need for MPI library credentials registration).
ParaMonte - NOTE:    For more information, visit:
ParaMonte - NOTE:
ParaMonte - NOTE: https://software.intel.com/en-us/get-started-with-mpi-for-windows
ParaMonte - NOTE:
ParaMonte - NOTE: Note that the above two commands must be executed on a command-line that recognizes
ParaMonte - NOTE: both MATLAB and mpiexec applications, such as the Intel Parallel Studio's command-prompt.
ParaMonte - NOTE: For more information, in particular, on how to register to run Hydra services
ParaMonte - NOTE: for multi-node simulations on Windows servers, visit:
ParaMonte - NOTE:
ParaMonte - NOTE: https://www.cdslab.org/paramonte
ParaMonte - NOTE: To check for the ParaMonte or the MPI library installations status or,
ParaMonte - NOTE: to display the above messages in the future, use the following
ParaMonte - NOTE: commands on the MATLAB command-line:
ParaMonte - NOTE:
ParaMonte - NOTE: pm = paramonte();
ParaMonte - NOTE: pm.verify();

Setting up the ParaDRAM simulation specifications

====================================================
Now, we create an instance of the ParaDRAM sampler class,
pmpd = pm.ParaDRAM();
The simplest scenario is to run the simulation with the default specifications that are appropriately determined by the ParaDRAM sampler. However, for the purpose of clarity and reproducibility, we will specify a few simulation specs,
For a complete list of all ParaDRAM simulation specifications, visit the ParaDRAM simulation specifications webpage.

The simulation output file names and storage path

We will store all output files in a directory with the same name as this MATLAB Live Script's name.
By default, all ParaDRAM simulation output files are named properly (see this page for a review of ParaDRAM simulation output files). In general, it is a good habit to specify an output file prefix for any ParaDRAM simulation. This way, restarting an incomplete simulation, should it fail for some reason, is seamless: simply rerun the simulation, without any change of any sort to any part of the code.
We will prefix all simulation output files with the same name as the filename of this script,
pmpd.spec.outputFileName = "./out/regression_powerlaw"; % specify the output file prefixes.
The above outputFileName will result in the generation of a folder named ./out/, which will hold the simulation output files, all of which are prefixed with regression_powerlaw.
NOTE
If the specified prefix ends with a forward slash / on Linux/macOS or with a backslash \ on Windows, then the prefix provided will serve as the directory in which the output files will be stored.

Ensuring the reproducibility of the results

We will also initialize the random seed to a fixed number to ensure the reproducibility of the simulation results in the future,
pmpd.spec.randomSeed = 12345; % set the random seed for the sake of reproducibility.
For the sake of clarity, let us also explicitly set a few more simulation specifications,
pmpd.spec.overwriteRequested = true; % overwrite the existing output files just in case they already exist.
pmpd.spec.variableNameList = ["intercept", "slope", "logSigma"]; % set the output names of the parameters.
pmpd.spec.chainSize = 20000; % set the number of uniquely sampled points from the likelihood function.
Since we have three unknown parameters to estimate (the intercept and slope of the linear fit, as well as the dispersion of the data around the fit, as measured by the standard deviation of the Normal distribution, σ), we will define NDIM = 3.
NDIM = 3; % The number of unknown parameters to constrain.

The default domain of the objective function

The default starting point of the sampler for exploring the objective function is the center of the domain of the objective function. The default domain of the objective function chosen by the sampler is [-1.e300, +1.e300] along each dimension of the objective function, effectively spanning negative to positive infinity. This is a fine default for this particular simulation and for most other scenarios. If needed, you can change these limits via the following attributes,
pmpd.spec.domainLowerLimitVec = -1.e300 * ones(NDIM,1); % set the lower limits of the domain of the objective function to negative infinity (this is the default value)
pmpd.spec.domainUpperLimitVec = +1.e300 * ones(NDIM,1); % set the upper limits of the domain of the objective function to positive infinity (this is the default value)
The ParaDRAM sampler is guaranteed not to sample any points outside the hypercube defined by the above limits.

Calling the ParaDRAM sampler

===============================
Note that none of the above specification settings were necessary, but it is in general a good habit to define them prior to the simulation.
We now call the ParaDRAM sampler routine to run the sampling simulation. The sampler method takes only two required arguments: the number of dimensions of the objective function and a handle to the objective function. We can call it as follows,
pmpd.runSampler ( NDIM ... This is the number of dimensions of the objective function
, @getLogLike ... This is the objective function
);
ParaDRAM - NOTE: Running the ParaDRAM sampler in serial mode...
ParaDRAM - NOTE: To run the ParaDRAM sampler in parallel visit:
ParaDRAM - NOTE:
ParaDRAM - NOTE: https://www.cdslab.org/paramonte

************************************************************************************
****
****    [ParaMonte ASCII-art banner]
****
****    ParaMonte
****    Plain Powerful Parallel
****    Monte Carlo Library
****
****    Version 1.5.1
****    Build: Sun Jan 03 05:10:49 2021
****
****    Department of Physics
****    Computational & Data Science Lab
****    Data Science Program, College of Science
****    The University of Texas at Arlington
****
****    originally developed at
****
****    Multiscale Modeling Group
****    Center for Computational Oncology (CCO)
****    Oden Institute for Computational Engineering and Sciences
****    Department of Aerospace Engineering and Engineering Mechanics
****    Department of Neurology, Dell-Seton Medical School
****    Department of Biomedical Engineering
****    The University of Texas at Austin
****
****    For questions and further information, please contact:
****
****    Amir Shahmoradi
****
****    shahmoradi@utexas.edu
****    amir.shahmoradi@uta.edu
****    ashahmoradi@gmail.com
****
****    cdslab.org/pm
****    https://www.cdslab.org/paramonte/
****
************************************************************************************

************************************************************************************
****
****    Setting up the ParaDRAM simulation environment
****
************************************************************************************

ParaDRAM - NOTE: Interfacing MATLAB with ParaDRAM...
ParaDRAM - NOTE: Variable outputFileName detected among the input variables to ParaDRAM:
ParaDRAM - NOTE: ./out/regression_powerlaw
ParaDRAM - NOTE:
ParaDRAM - NOTE: Absolute path to the current working directory:
ParaDRAM - NOTE: D:\Dropbox\Projects\20180101_ParaMonte\git\paramontex\MATLAB\mlx\regression_powerlaw_data_paradram
ParaDRAM - NOTE:
ParaDRAM - NOTE: Generating the requested directory for the ParaDRAM output files:
ParaDRAM - NOTE: .\out\
ParaDRAM - NOTE:
ParaDRAM - NOTE: ParaDRAM output files will be prefixed with:
ParaDRAM - NOTE: .\out\regression_powerlaw
ParaDRAM - NOTE: Searching for previous runs of ParaDRAM...
ParaDRAM - NOTE: No pre-existing ParaDRAM run detected.
ParaDRAM - NOTE: Starting a fresh ParaDRAM run...
ParaDRAM - NOTE: Generating the output report file:
ParaDRAM - NOTE: .\out\regression_powerlaw_process_1_report.txt
ParaDRAM - NOTE: Running the simulation in serial on 1 process.
ParaDRAM - NOTE: Please see the output report and progress files for further realtime simulation details.

Accepted/Total Func. Call   Dynamic/Overall Acc. Rate   Elapsed/Remained Time [s]
=========================   =========================   =========================
   77 /   999               0.0769 / 0.0770             0.0400 / 10.3496
  195 /  1999               0.1195 / 0.0983             0.0870 /  8.8361
  340 /  2999               0.1417 / 0.1127             0.1260 /  7.2858
  520 /  3999               0.1792 / 0.1294             0.1710 /  6.4059
  688 /  4999               0.1769 / 0.1389             0.2100 /  5.8947
  861 /  5999               0.1695 / 0.1440             0.2350 /  5.2238
 1045 /  6999               0.1754 / 0.1485             0.2650 /  4.8068
 1269 /  7999               0.2177 / 0.1571             0.3010 /  4.4429
 1485 /  8999               0.2090 / 0.1629             0.3300 /  4.1144
 1699 /  9999               0.2099 / 0.1676             0.3610 /  3.8886
 1937 / 10999               0.2264 / 0.1729             0.3890 /  3.6275
 2125 / 11999               0.1890 / 0.1743             0.4200 /  3.5329
 2342 / 12999               0.2141 / 0.1773             0.4530 /  3.4155
 2608 / 13999               0.2581 / 0.1831             0.4860 /  3.2410
 2847 / 14999               0.2315 / 0.1863             0.5190 /  3.1269
 3111 / 15999               0.2484 / 0.1902             0.5560 /  3.0184
 3350 / 16999               0.2306 / 0.1926             0.5890 /  2.9274
 3540 / 17999               0.1983 / 0.1929             0.6330 /  2.9433
 3758 / 18999               0.2175 / 0.1942             0.6650 /  2.8741
 3979 / 19999               0.2121 / 0.1951             0.6940 /  2.7943
 4240 / 20999               0.2504 / 0.1977             0.7250 /  2.6948
 4427 / 21999               0.1816 / 0.1970             0.7570 /  2.6629
 4620 / 22999               0.2041 / 0.1973             0.7950 /  2.6466
 4828 / 23999               0.2151 / 0.1980             0.8240 /  2.5894
 5089 / 24999               0.2475 / 0.2000             0.8590 /  2.5169
 5298 / 25999               0.2151 / 0.2006             0.9020 /  2.5031
 5553 / 26999               0.2507 / 0.2025             0.9480 /  2.4664
 5771 / 27999               0.2285 / 0.2034             1.0310 /  2.5420
 6002 / 28999               0.2342 / 0.2045             1.0650 /  2.4838
 6242 / 29999               0.2323 / 0.2054             1.0950 /  2.4135
 6477 / 30999               0.2428 / 0.2066             1.1270 /  2.3530
 6705 / 31999               0.2269 / 0.2072             1.1640 /  2.3080
 6962 / 32999               0.2464 / 0.2084             1.2040 /  2.2548
 7279 / 33999               0.2941 / 0.2109             1.2540 /  2.1915
 7542 / 34999               0.2683 / 0.2126             1.2910 /  2.1325
 7770 / 35999               0.2300 / 0.2131             1.3290 /  2.0918
 8011 / 36999               0.2427 / 0.2139             1.3600 /  2.0353
 8281 / 37999               0.2714 / 0.2154             1.3930 /  1.9713
 8527 / 38999               0.2407 / 0.2160             1.4300 /  1.9241
 8797 / 39999               0.2609 / 0.2171             1.4640 /  1.8644
 9039 / 40999               0.2523 / 0.2180             1.5020 /  1.8214
 9332 / 41999               0.2792 / 0.2195             1.5400 /  1.7605
 9611 / 42999               0.2742 / 0.2207             1.5820 /  1.7101
 9842 / 43999               0.2395 / 0.2212             1.6190 /  1.6710
10126 / 44999               0.2831 / 0.2225             1.6540 /  1.6128
10381 / 45999               0.2575 / 0.2233             1.6830 /  1.5595
10664 / 46999               0.2742 / 0.2244             1.7180 /  1.5041
10935 / 47999               0.2640 / 0.2252             1.7530 /  1.4532
11218 / 48999               0.2719 / 0.2262             1.7950 /  1.4052
11467 / 49999               0.2489 / 0.2266             1.8250 /  1.3580
11747 / 50999               0.2772 / 0.2276             1.8610 /  1.3075
11992 / 51999               0.2480 / 0.2280             1.9090 /  1.2748
12247 / 52999               0.2585 / 0.2286             1.9430 /  1.2300
12494 / 53999               0.2573 / 0.2291             1.9880 /  1.1943
12745 / 54999               0.2554 / 0.2296             2.0200 /  1.1499
13014 / 55999               0.2693 / 0.2303             2.0600 /  1.1058
13254 / 56999               0.2350 / 0.2304             2.1100 /  1.0739
13477 / 57999               0.2358 / 0.2305             2.1540 /  1.0426
13738 / 58999               0.2593 / 0.2310             2.1940 /  1.0001
13979 / 59999               0.2392 / 0.2311             2.2290 /  0.9601
14201 / 60999               0.2333 / 0.2311             2.2630 /  0.9241
14479 / 61999               0.2801 / 0.2319             2.3060 /  0.8793
14719 / 62999               0.2481 / 0.2322             2.3400 /  0.8396
14998 / 63999               0.2685 / 0.2327             2.3710 /  0.7908
15263 / 64999               0.2571 / 0.2331             2.4120 /  0.7486
15535 / 65999               0.2752 / 0.2338             2.4450 /  0.7027
15781 / 66999               0.2538 / 0.2341             2.4940 /  0.6668
16017 / 67999               0.2314 / 0.2340             2.5380 /  0.6311
16286 / 68999               0.2655 / 0.2345             2.5770 /  0.5877
16576 / 69999               0.2821 / 0.2352             2.6150 /  0.5402
16848 / 70999               0.2705 / 0.2357             2.6510 /  0.4960
17080 / 71999               0.2429 / 0.2358             2.6800 /  0.4582
17357 / 72999               0.2752 / 0.2363             2.7130 /  0.4131
17644 / 73999               0.2751 / 0.2368             2.7500 /  0.3672
17950 / 74999               0.2980 / 0.2376             2.7930 /  0.3190
18195 / 75999               0.2430 / 0.2377             2.8260 /  0.2803
18459 / 76999               0.2667 / 0.2381             2.8590 /  0.2387
18737 / 77999               0.2672 / 0.2385             2.8960 /  0.1952
19020 / 78999               0.2808 / 0.2390             2.9370 /  0.1513
19300 / 79999               0.2729 / 0.2394             2.9740 /  0.1079
19573 / 80999               0.2736 / 0.2398             3.0190 /  0.0659
19835 / 81999               0.2567 / 0.2400             3.0640 /  0.0255
20000 / 82708               0.2515 / 0.2401             3.0880 /  0.0000

ParaDRAM - NOTE: Computing the statistical properties of the Markov chain...
ParaDRAM - NOTE: Computing the final decorrelated sample size...
ParaDRAM - NOTE: Generating the output sample file:
ParaDRAM - NOTE: .\out\regression_powerlaw_process_1_sample.txt
ParaDRAM - NOTE: Computing the statistical properties of the final refined sample...
ParaDRAM - NOTE: Mission Accomplished.
ParaDRAM - NOTE: To read the generated output files, try the following:
ParaDRAM - NOTE:
ParaDRAM - NOTE: pmpd.readReport()      % to read the summary report from the output report file.
ParaDRAM - NOTE: pmpd.readSample()      % to read the final i.i.d. sample from the output sample file.
ParaDRAM - NOTE: pmpd.readChain()       % to read the uniquely-accepted points from the output chain file.
ParaDRAM - NOTE: pmpd.readMarkovChain() % to read the Markov Chain. NOT recommended for extremely-large chains.
ParaDRAM - NOTE: pmpd.readRestart()     % to read the contents of an ASCII-format output restart file.
ParaDRAM - NOTE: pmpd.readProgress()    % to read the contents of an output progress file.
ParaDRAM - NOTE:
ParaDRAM - NOTE: For more information and examples on the usage, visit:
ParaDRAM - NOTE:
ParaDRAM - NOTE: https://www.cdslab.org/paramonte
Done. We can now read the generated sample to analyze and visualize the results. Before doing so, however, we will verify that there is no lack of convergence in the MCMC chain by inspecting the adaptation measure of the chain, which is available in the output chain file.
chainList = pmpd.readChain();
ParaDRAM - WARNING: The ParaDRAM input simulation specification `pmpd.spec.outputDelimiter` is not set.
ParaDRAM - WARNING: This information is essential for successful reading of the requested chain file(s).
ParaDRAM - WARNING: Proceeding with the default assumption of comma-delimited chain file contents...
ParaDRAM - NOTE: 1 files detected matching the pattern:
ParaDRAM - NOTE: "D:\Dropbox\Projects\20180101_ParaMonte\git\paramontex\MATLAB\mlx\regression_powerlaw_data_paradram\out\regression_powerlaw*"
ParaDRAM - NOTE: processing file: "D:\Dropbox\Projects\20180101_ParaMonte\git\paramontex\MATLAB\mlx\regression_powerlaw_data_paradram\out\regression_powerlaw_process_1_chain.txt"
ParaDRAM - NOTE: reading the file contents...
ParaDRAM - NOTE: done in 0.707380 seconds.
ParaDRAM - NOTE: ndim = 3, count = 20000
ParaDRAM - NOTE: computing the sample correlation matrix...
ParaDRAM - NOTE: creating the heatmap plot object from scratch...
ParaDRAM - NOTE: done in 0.316870 seconds.
ParaDRAM - NOTE: computing the sample covariance matrix...
ParaDRAM - NOTE: creating the heatmap plot object from scratch...
ParaDRAM - NOTE: done in 0.126480 seconds.
ParaDRAM - NOTE: computing the sample autocorrelation...
ParaDRAM - NOTE: creating the line plot object from scratch...
ParaDRAM - NOTE: creating the scatter plot object from scratch...
ParaDRAM - NOTE: creating the lineScatter plot object from scratch...
ParaDRAM - NOTE: done in 0.538260 seconds.
ParaDRAM - NOTE: adding the graphics tools...
ParaDRAM - NOTE: creating the line plot object from scratch...
ParaDRAM - NOTE: creating the scatter plot object from scratch...
ParaDRAM - NOTE: creating the lineScatter plot object from scratch...
ParaDRAM - NOTE: creating the line3 plot object from scratch...
ParaDRAM - NOTE: creating the scatter3 plot object from scratch...
ParaDRAM - NOTE: creating the lineScatter3 plot object from scratch...
ParaDRAM - NOTE: creating the histogram plot object from scratch...
ParaDRAM - NOTE: creating the histogram2 plot object from scratch...
ParaDRAM - NOTE: creating the histfit plot object from scratch...
ParaDRAM - NOTE: creating the contour plot object from scratch...
ParaDRAM - NOTE: creating the contourf plot object from scratch...
ParaDRAM - NOTE: creating the contour3 plot object from scratch...
ParaDRAM - NOTE: creating the grid plot object from scratch...
ParaDRAM - NOTE: The processed chain files are now stored in the output variable as a cell array. For example, to
ParaDRAM - NOTE: access the contents of the first (or the only) chain file stored in an output variable named
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY, try:
ParaDRAM - NOTE:
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.df
ParaDRAM - NOTE:
ParaDRAM - NOTE: To access the plotting tools, try:
ParaDRAM - NOTE:
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.<PRESS TAB TO SEE THE LIST OF PLOTS>
ParaDRAM - NOTE:
ParaDRAM - NOTE: For example,
ParaDRAM - NOTE:
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.line.make()         % to make 2D line plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.scatter.make()      % to make 2D scatter plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.lineScatter.make()  % to make 2D line-scatter plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.line3.make()        % to make 3D line plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.scatter3.make()     % to make 3D scatter plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.lineScatter3.make() % to make 3D line-scatter plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.contour3.make()     % to make 3D kernel-density contour plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.contourf.make()     % to make 2D kernel-density filled-contour plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.contour.make()      % to make 2D kernel-density plots.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.histogram2.make()   % to make 2D histograms.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.histogram.make()    % to make 1D histograms.
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.plot.grid.make()         % to make GridPlot
ParaDRAM - NOTE:
ParaDRAM - NOTE: To plot or inspect the variable autocorrelations or the correlation/covariance matrices, try:
ParaDRAM - NOTE:
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY{1}.stats.<PRESS TAB TO SEE THE LIST OF COMPONENTS>
ParaDRAM - NOTE:
ParaDRAM - NOTE: For more information and examples on the usage, visit:
ParaDRAM - NOTE:
ParaDRAM - NOTE: https://www.cdslab.org/paramonte
chain = chainList{1}; % Choose the contents of the first read chain file (in this case, there is only one file).
chain.plot.scatter.ycolumns = "AdaptationMeasure"; % choose the AdaptationMeasure column of the output chain file to plot.
chain.plot.scatter.ccolumns = []; % disable the default coloring of the points by the logLikelihood values.
chain.plot.scatter.make();
set(gca,"yscale","log");
xlabel("MCMC Sample Count");
ylabel("Proposal Adaptation Measure");
The power-law decay of the adaptation measure tells us that everything has gone well with our simulation. If there were any signs of a lack of convergence, we would not see this power-law decaying behavior. The decay implies that the proposal adaptation diminished consistently throughout the entire simulation, as desired.
We can now visualize the resulting refined sample by reading and plotting it from the output sample file.
sampleList = pmpd.readSample();
ParaDRAM - WARNING: The ParaDRAM input simulation specification `pmpd.spec.outputDelimiter` is not set.
ParaDRAM - WARNING: This information is essential for successful reading of the requested sample file(s).
ParaDRAM - WARNING: Proceeding with the default assumption of comma-delimited sample file contents...
ParaDRAM - NOTE: 1 files detected matching the pattern:
ParaDRAM - NOTE: "D:\Dropbox\Projects\20180101_ParaMonte\git\paramontex\MATLAB\mlx\regression_powerlaw_data_paradram\out\regression_powerlaw*"
ParaDRAM - NOTE: processing file: "D:\Dropbox\Projects\20180101_ParaMonte\git\paramontex\MATLAB\mlx\regression_powerlaw_data_paradram\out\regression_powerlaw_process_1_sample.txt"
ParaDRAM - NOTE: reading the file contents... done in 0.041740 seconds.
ParaDRAM - NOTE: ndim = 3, count = 3033
ParaDRAM - NOTE: computing the sample correlation matrix, covariance matrix, and autocorrelation...
ParaDRAM - NOTE: adding the graphics tools...
ParaDRAM - NOTE: creating the line, scatter, lineScatter, line3, scatter3, lineScatter3, histogram,
ParaDRAM - NOTE: histogram2, histfit, contour, contourf, contour3, and grid plot objects from scratch...
ParaDRAM - NOTE: The processed sample files are now stored in the output variable as a cell array. For example, to
ParaDRAM - NOTE: access the contents of the first (or the only) sample file stored in an output variable named
ParaDRAM - NOTE: OUTPUT_CELL_ARRAY, try:
ParaDRAM - NOTE:
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.df
ParaDRAM - NOTE:
ParaDRAM - NOTE: To access the plotting tools, try:
ParaDRAM - NOTE:
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.<PRESS TAB TO SEE THE LIST OF PLOTS>
ParaDRAM - NOTE:
ParaDRAM - NOTE: For example,
ParaDRAM - NOTE:
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.line.make()         % to make 2D line plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.scatter.make()      % to make 2D scatter plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.lineScatter.make()  % to make 2D line-scatter plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.line3.make()        % to make 3D line plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.scatter3.make()     % to make 3D scatter plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.lineScatter3.make() % to make 3D line-scatter plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.contour3.make()     % to make 3D kernel-density contour plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.contourf.make()     % to make 2D kernel-density filled-contour plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.contour.make()      % to make 2D kernel-density plots.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.histogram2.make()   % to make 2D histograms.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.histogram.make()    % to make 1D histograms.
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.plot.grid.make()         % to make grid plots.
ParaDRAM - NOTE:
ParaDRAM - NOTE: To plot or inspect the variable autocorrelations or the correlation/covariance matrices, try:
ParaDRAM - NOTE:
ParaDRAM - NOTE:     OUTPUT_CELL_ARRAY{1}.stats.<PRESS TAB TO SEE THE LIST OF COMPONENTS>
ParaDRAM - NOTE:
ParaDRAM - NOTE: For more information and examples on the usage, visit:
ParaDRAM - NOTE:
ParaDRAM - NOTE:     https://www.cdslab.org/paramonte
sample = sampleList{1}; % Choose only the contents of the first read file.
% Plot the trace of the sampled variables.
for colname = sample.df.Properties.VariableNames(2:end)
sample.plot.line.colormap.enabled = true;
sample.plot.line.colormap.values = "autumn";
sample.plot.line.make("ycolumns",colname);
xlabel("MCMC Count");
ylabel(colname);
end
% Plot the histogram of the sampled variables.
for colname = sample.df.Properties.VariableNames(2:end)
sample.plot.histogram.make("xcolumns",colname);
xlabel(colname);
ylabel("Count");
end

General Guidelines on the ParaMonte visualization tools

===================================================================
In the following sections, we will discuss some of the postprocessing and visualization tools that automatically ship with the ParaMonte library. However, this is just the tip of the iceberg. For more detailed instructions on how to use the individual visualization tools of the ParaMonte library, visit the following MATLAB live scripts,
line / scatter plots
density plots
other plots
Finally, let's take a look at the mean inferred values of the parameters.
sample.df.Properties.VariableNames(2:4)
ans = 1×3 cell
'intercept' 'slope' 'logSigma'
mean(sample.df{:,2:4})
ans = 1×3
1.022976101134784 2.048782241941972 -0.937985291135638
std(sample.df{:,2:4})
ans = 1×3
0.323266011393424 0.219215523337699 0.401223654160306
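Beyond the mean and standard deviation, the full posterior sample also yields credible intervals directly. The following sketch (assuming the same `sample` object read above, and the Statistics and Machine Learning Toolbox for `quantile`, which the earlier use of `normpdf` already requires) computes the 90% equal-tailed credible interval of each parameter:

```matlab
% Sketch: 90% equal-tailed credible intervals for the three sampled
% parameters (intercept, slope, logSigma), computed directly from the
% posterior sample stored in `sample.df`.
paramNames = sample.df.Properties.VariableNames(2:4);
credInt = quantile(sample.df{:,2:4}, [0.05, 0.95]); % 2x3: rows = [5th; 95th] percentiles
for ip = 1:numel(paramNames)
    disp(paramNames{ip} + ": 90% credible interval = [" ...
        + num2str(credInt(1,ip)) + ", " + num2str(credInt(2,ip)) + "]");
end
```

Unlike the standard deviation, these intervals make no symmetry assumption about the marginal posterior distributions.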
Our initial guess for the parameter values was not too far from the best-fit values! As for the value of sigma, all we need to do is exponentiate the mean value of logSigma.
sigma = exp(mean(sample.df.logSigma));
disp("sigma (i.e., normal dispersion of data around the mean expected values) = " + num2str(sigma))
sigma (i.e., normal dispersion of data around the mean expected values) = 0.39142
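Note that exp(mean(logSigma)) is the geometric mean of the sampled sigma values, which differs slightly from the arithmetic posterior mean of sigma itself. A quick comparison (a sketch assuming the same `sample` object as above):

```matlab
% Compare two summaries of sigma: the exponentiated mean of logSigma
% (the geometric mean of sigma) versus the arithmetic posterior mean.
sigmaGeometric = exp(mean(sample.df.logSigma));
sigmaArithmetic = mean(exp(sample.df.logSigma));
disp("geometric mean of sigma  = " + num2str(sigmaGeometric));
disp("arithmetic mean of sigma = " + num2str(sigmaArithmetic));
% By Jensen's inequality, the arithmetic mean is always >= the geometric mean.
```

Either summary is defensible here; just be explicit about which one you report.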
Now, let's make a grid plot of all of the sampled parameters together,
sample.plot.grid.reset("hard"); % reset the grid plot, just in case...
ParaDRAM - NOTE: creating the grid plot object from scratch...
sample.plot.grid.plotType.upper.enabled = false; % disable upper triangle.
sample.plot.grid.plotType.lower.value = "lineScatter"; % let lower triangle be lineScatter plot.
sample.plot.grid.columns = [2,3,4]; % corresponding to "intercept", "slope", "logSigma".
sample.plot.grid.make(); % make grid plot of all parameters.
generating subplot #1: (1,1) out of 9
skipping subplot #2: (1,2) out of 9
skipping subplot #3: (1,3) out of 9
generating subplot #4: (2,1) out of 9
generating subplot #5: (2,2) out of 9
skipping subplot #6: (2,3) out of 9
generating subplot #7: (3,1) out of 9
generating subplot #8: (3,2) out of 9
generating subplot #9: (3,3) out of 9
Now, let's visualize the best-fit line and its associated uncertainties.
sample.df
ans = 3033×4 table

          SampleLogFunc    intercept      slope        logSigma
     1     -5.5367637      1.2335464     1.5782421    -0.48679264
     2     -5.9586445      1.4059460     1.5276523    -0.80792746
     3     -2.8590012      1.4249843     1.8319132    -0.77420169
     4     -2.6755268      0.97486079    2.1082163    -0.61071463
     5     -2.6755268      0.97486079    2.1082163    -0.61071463
     6     -4.2269535      0.46175127    2.3244250    -0.47535268
     7     -4.2269535      0.46175127    2.3244250    -0.47535268
     8     -2.6696650      0.82724718    2.2745967    -0.75437939
     9     -2.6696650      0.82724718    2.2745967    -0.75437939
    10     -2.8487771      0.72546523    2.3272384    -0.79151349
     ⋮          ⋮              ⋮             ⋮             ⋮

(display truncated; the table holds 3033 rows in total)
xvalues = 0:0.1:10;
yvalues = exp(mean(sample.df.intercept)) * xvalues .^ mean(sample.df.slope);
figure; hold on; box on;
plot(xvalues, yvalues, "color", "blue", "linewidth", 3)
plot(X,Y,".", "markerSize", 30, "color","red");
xlabel("X");
ylabel("Y");
set(gca,"xscale","linear","yscale","linear");
hold off;
This plot may already be enough for many people. But the joy of MCMC lies in uncertainty quantification and the ability to visualize the uncertainty in our knowledge. This example is no exception.
What we have shown above is just the expected best-fit line to our data. But for each sampled parameter set in the output sample file, there exists a possible regression line, each with a different probability of being correct. We can visualize them all together to understand the state of uncertainty in our best-fit regression.
% pick only a subset of sampled parameters
xvalues = 0:0.05:10;
figure("color","white"); hold on; box on;
first = 1;
last = 300;
slopes = sample.df.slope(first:last);
intercepts = sample.df.intercept(first:last);
for i = 1:last-first+1
slope = slopes(i);
intercept = intercepts(i);
yvalues = exp(intercept) * xvalues .^ slope;
p = plot(xvalues, yvalues, "color", "blue", "lineWidth", 0.05);
p.Color = [0,0,1,0.1]; % set the transparency of the lines.
end
% Now plot the data on top of the prediction lines.
plot(X,Y,".", "markerSize", 30, "color","red");
xlabel("X");
ylabel("Y");
set(gca,"xscale","linear","yscale","linear");
hold off;
Unfortunately, line transparency in MATLAB live scripts does not work as expected as of MATLAB R2021a. However, you can copy and paste the above code snippet directly into the MATLAB command line to get a nice transparent probabilistic plot of the prediction lines.
[Figure: regression_powerlaw_data_paradram_prediction_mc.png]
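If transparency is problematic in your environment, an alternative is to summarize the spread of the prediction lines with a shaded quantile band. The sketch below makes the same assumptions as the snippet above (the `sample` object and the data vectors `X` and `Y` defined earlier in this script, plus `quantile` from the Statistics and Machine Learning Toolbox):

```matlab
% Sketch: instead of overplotting transparent lines, compute the 5th/95th
% percentiles of the predicted curves across all sampled parameter sets
% and draw them as a shaded 90% prediction band via `fill`.
xvalues = 0.05:0.05:10;
nsample = numel(sample.df.slope);
ypred = zeros(nsample, numel(xvalues));
for i = 1:nsample
    ypred(i,:) = exp(sample.df.intercept(i)) * xvalues .^ sample.df.slope(i);
end
yband = quantile(ypred, [0.05, 0.95]); % 2 x npoint: lower and upper band edges
figure("color","white"); hold on; box on;
fill([xvalues, fliplr(xvalues)], [yband(1,:), fliplr(yband(2,:))], ...
     [0.7, 0.8, 1], "EdgeColor", "none"); % shaded 90% band
plot(xvalues, exp(mean(sample.df.intercept)) * xvalues .^ mean(sample.df.slope), ...
     "color", "blue", "linewidth", 2); % mean prediction curve
plot(X, Y, ".", "markerSize", 30, "color", "red"); % the observed data
xlabel("X"); ylabel("Y");
hold off;
```

This renders reliably in live scripts because it avoids per-line alpha values entirely.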

Running a single-chain ParaDRAM simulation in parallel on multiple processors

===============================================================================================
For large-scale, time-intensive problems, you can also run the ParaDRAM sampler in parallel on multiple processors, with as many processors as needed. One unique feature of the parallelization capabilities of the ParaMonte library is that you do not need to write any parallel code in your objective function. You do not even need the MATLAB Parallel Computing Toolbox installed on your system. All parallelization and interprocess communication is handled automatically by the ParaMonte library routines. For instructions on how to sample your expensive objective functions in parallel, visit the ParaMonte library's documentation website.
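As a rough illustration of what a parallel run looks like, the following hypothetical sketch enables MPI-based sampling; the exact specification names and `mpiexec` flags may differ on your platform, so consult the ParaMonte documentation before using it:

```matlab
% Hypothetical sketch of a parallel ParaDRAM run. Save this as a plain
% script (e.g., main_mpi.m), since live scripts cannot be launched via
% mpiexec from the system shell.
pm = paramonte();
pmpd = pm.ParaDRAM();
pmpd.mpiEnabled = true;          % request MPI-parallel sampling
pmpd.runSampler(3, @getLogLike); % 3 parameters: intercept, slope, logSigma
% Then launch from the system shell (not the MATLAB command line), e.g.:
%   mpiexec -n 3 matlab -batch main_mpi
```

The objective function `getLogLike` itself is unchanged from the serial run; ParaMonte distributes the proposal evaluations across processes for you.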

Final Remarks

=================
There are many more functionalities and features of the ParaMonte library that were neither explored nor mentioned in this example live script. You can explore them by inspecting the components of each attribute of the ParaDRAM sampler class and by visiting the ParaMonte library's documentation website.
function logLike = getLogLike(param)
% =======================================================================
%
% Return the natural logarithm of the likelihood of observing the (X,Y) dataset defined
% globally outside of this function given the input parameter vector ``param``.
%
% Parameters
% ==========
%
% param
% A 64-bit real vector containing the parameters of the model in the following order:
% intercept, slope, log(sigma) (standard deviation of the Normal distribution).
%
% Returns
% =======
%
% logLike
% The natural logarithm of the likelihood of observing the dataset (X,Y) given ``param``.
%
% =======================================================================
% Declare the data.
% Outside MATLAB live script files, one can use the `global` attribute to
% declare data globally in MATLAB and use the global data for
% likelihood evaluation inside the likelihood function.
% Global declaration of data, however, is impossible
% within MATLAB live scripts as of MATLAB R2021a.
% See this page for a script/function implementation
% of this example in main.m and getLogFunc.m files
% using global data:
% https://github.com/cdslaborg/paramontex/raw/main/MATLAB/mlx/regression_powerlaw_data_paradram/
X = [0.5, 2.4, 3.2, 4.9, 6.5, 7.8];
Y = [0.8, 9.3, 37.9, 68.2, 155, 198];
logX = log(X);
logY = log(Y);
% Compute the expected value of y, given each value of x.
% This is the mean of the Normal distribution given the x values.
%expectedLogY = getExpectedLogY(logX,param(1:2));
expectedLogY = param(1) + param(2) * logX;
% Compute the log-PDFs of observing each data point given the input parameters.
logProbDensities = log(normpdf(logY, expectedLogY, exp(param(3)))); % Note that param(3) is logSigma.
% Compute and return the log-likelihood.
logLike = sum(logProbDensities);
end
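As a quick sanity check of the likelihood function (a sketch to be run from the command window or a separate script, assuming `getLogLike` above is on the path), the posterior-mean parameters reported earlier should yield a markedly higher log-likelihood than an arbitrary poor guess:

```matlab
% Sanity check of getLogLike: the approximate posterior means found above
% should fit the data much better than an arbitrary bad guess.
goodParam = [1.023, 2.049, -0.938]; % approximately the posterior means
badParam  = [0.0, 0.5, 2.0];        % an arbitrary poor guess
disp("logLike(good) = " + num2str(getLogLike(goodParam)));
disp("logLike(bad)  = " + num2str(getLogLike(badParam)));
% Expect logLike(good) > logLike(bad).
```

Checks like this catch sign errors and misordered parameters before launching a long sampling run.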
