Releases: google-deepmind/PGMax
v0.6.1
What has changed
- Fixed the installation of setup requirements
- Removed use of the deprecated `jnp.NINF`
- Removed use of the deprecated `np.product`
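The removed aliases map directly onto their modern replacements. A minimal sketch, using NumPy (whose deprecated aliases `jnp` mirrored):

```python
import numpy as np

# np.product was a deprecated alias for np.prod.
total = np.prod(np.array([2, 3, 4]))  # 24

# jnp.NINF (like np.NINF) was a deprecated alias for negative infinity;
# the replacement is simply -inf.
neg_inf = -np.inf
```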
v0.6.0
What has changed
Deprecation notice
- `bp.run_bp` is deprecated in favor of `bp.run`. `bp.run_bp` will be removed in a future release.
- Dropped support for Python 3.7.
New features
- Added an SDLP solver, which solves the smoothed dual of the LP relaxation of the MAP problem by accelerated gradient descent.
  - An alternative to Belief Propagation with convergence guarantees.
  - Based on the Smooth Dual LP-MAP problem introduced in this paper.
  - Leverages the existing PGMax infrastructure to implement the underlying message-passing algorithm.
- Added a solver for the primal of the LP relaxation, based on the LP solver from `cvxpy`.
Changes
- Implemented a unified interface for the BP and SDLP solvers.
  - Solvers can be created via `inferer = infer.build_inferer(fg.bp_state, backend=BACKEND)`, where `BACKEND` can be `"bp"` or `"sdlp"`.
- Added a `run_with_diffs` function to the BP solver to monitor BP convergence.
- Improved the stability of message updates at low temperatures for logical factors and pool factors.
- Improved the numerical stability of the clipping of messages and potentials.
- Improved the handling of various corner cases, including variables with a single state, empty variable groups, energies with infinite potentials, etc.
Examples
- Updated all example colabs to use the new unified interface.
- Added a colab notebook which uses the SDLP solver to:
  - Run inference on an Ising model and compare its results with BP.
  - Extract sparse feature activations from visually complex binary scenes.
v0.5.1
What has changed
- Added a new module to compute the energy of a MAP decoding.
- Defined inference arguments for each factor type, which split all the 2D wiring arrays of the different factor types into contiguous 1D arrays, leading to 5x compile-time speedups.
- Stored new information in `var_states_for_edges`, which makes inference faster.
- Added the PGMax documentation to ReadTheDocs.
- Added a nicer representation of `NDVarArray`.
v0.5.0
PGMax is now part of DeepMind 🎉!
Following the acquisition of Vicarious by Alphabet in 2022, PGMax development has moved from Vicarious to DeepMind.
What has changed
- Added PoolFactors with an optimized inference procedure.
- Changed the license from the MIT License to the Apache 2.0 License.
- Adopted `pyink` as the new auto-formatter.