
Optimization and Mathematical Programming

Maintainer: Hans W. Borchers
Contact: hwb at mailbox.org
Version: 2024-07-05
Web page: Optimization and Mathematical Programming

This page has been forked from the CRAN Task View "Optimization and Mathematical Programming", version end of 2021. For the current CRAN Task View see here. This page will be developed further in a different direction, with the aim of giving users more extensive and useful information. It still contains a list of R packages that offer facilities for solving numerical and combinatorial optimization problems, including statistical regression tasks modeled as optimization problems.

Contents

Packages in this view are roughly structured according to these topics. (See also the "Related links" section at the end of the task view.) Please note that many packages provide functionality for more than one class of optimization problems. Suggestions and improvements for this task view are welcome and can be made through issues or pull requests on GitHub or via e-mail to the maintainer address.

Optimization Infrastructure Packages

  • The optimx package provides a replacement and extension of the optim() function in Base R with a call to several function minimization codes in R in a single statement. These methods handle smooth, possibly box-constrained functions of several or many parameters. Function optimr() in this package extends the optim() function with the same syntax but more 'method' choices. Function opm() applies several solvers to a selected optimization task and returns a data frame of results for easy comparison.

  • The R Optimization Infrastructure (ROI) package provides a framework for handling optimization problems in R. It uses an object-oriented approach to define and solve various optimization tasks from different problem classes (e.g., linear, quadratic, non-linear programming problems). This makes optimization transparent for the user as the corresponding workflow is abstracted from the underlying solver. The approach allows for easy switching between solvers and thus enhances comparability. For more information see the ROI home page.

  • The package CVXR provides an object-oriented modeling language for Disciplined Convex Programming (DCP). It allows the user to formulate convex optimization problems in a natural way following mathematical convention and DCP rules. The system analyzes the problem, verifies its convexity, converts it into a canonical form, and hands it off to an appropriate solver such as ECOS or SCS to obtain the solution. For more information see the CVXR home page. (A minimal usage sketch follows this list.)
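
To give a flavor of the DCP workflow, here is a minimal CVXR sketch of a small constrained least-squares problem; the data and the non-negativity constraint are invented for illustration.

```r
# Minimal CVXR sketch (toy data; non-negativity constraint assumed for illustration)
library(CVXR)
set.seed(1)
A <- matrix(rnorm(20), nrow = 10, ncol = 2)
b <- A %*% c(1, 2) + rnorm(10, sd = 0.1)

x <- Variable(2)                                  # decision variables
objective <- Minimize(sum_squares(A %*% x - b))   # convex, DCP-compliant objective
problem <- Problem(objective, list(x >= 0))       # constraints passed as a list
result <- solve(problem)                          # handed off to ECOS/SCS internally
result$getValue(x)
```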

General Purpose Continuous Solvers

Package stats offers several general-purpose optimization routines. For one-dimensional unconstrained function optimization there is optimize() which searches an interval for a minimum or maximum. Function optim() provides implementations of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, bounded BFGS (L-BFGS-B), conjugate gradient (CG), Nelder-Mead, and simulated annealing (SANN). It utilizes gradients, if provided, for faster convergence. Typically it is used for unconstrained optimization but includes an option for box-constrained optimization.

Additionally, for minimizing a function subject to linear inequality constraints, stats contains the routine constrOptim(). Then there is nlm() for solving nonlinear unconstrained minimization problems. nlminb() offers box-constrained optimization using the PORT routines.
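
For illustration, the classic Rosenbrock test function (not taken from this page) shows the typical optim() calling pattern, with and without box constraints:

```r
# Sketch: minimizing the Rosenbrock function with optim() (standard test problem)
fr  <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
grr <- function(x) c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
                      200 * (x[2] - x[1]^2))

optim(c(-1.2, 1), fr, grr, method = "BFGS")          # gradient-based, unconstrained
optim(c(-1.2, 1), fr, grr, method = "L-BFGS-B",      # box-constrained variant
      lower = c(-2, -2), upper = c(2, 2))
```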

  • Package lbfgs wraps the libLBFGS C library by Okazaki and Morales (converted from Nocedal's L-BFGS-B 3.0 Fortran code), interfacing both the L-BFGS and the OWL-QN algorithm, the latter being particularly suited for higher-dimensional problems.
  • lbfgsb3c interfaces J. Nocedal's L-BFGS-B 3.0 Fortran code, a limited memory BFGS minimizer that allows bound constraints and is applicable to higher-dimensional problems. It has an 'optim'-like interface based on 'Rcpp'.
  • Package roptim provides a unified wrapper to call C++ functions of the algorithms underlying the optim() solver; and optimParallel provides a parallel version of the L-BFGS-B method of optim(); using these packages can significantly reduce the optimization time.
  • RcppNumerical is a collection of open-source libraries for numerical computing and their integration with 'Rcpp'. It provides a wrapper for the L-BFGS algorithm, based on the LBFGS++ library (based on code of N. Okazaki).
  • Package ucminf implements an algorithm of quasi-Newton type for nonlinear unconstrained optimization, combining a trust region with line search approaches. The interface of ucminf() is designed for easy interchange with optim().
  • marqLevAlg implements a parallelized version of the Marquardt-Levenberg algorithm. It is particularly suited for complex problems and when starting from points very far from the final optimum. The package is designed to be used for unconstrained local optimization.
  • mize implements optimization algorithms in pure R, including conjugate gradient (CG), Broyden-Fletcher-Goldfarb-Shanno (BFGS) and limited memory BFGS (L-BFGS) methods. Most internal parameters can be set through the calling interface.
  • stochQN provides implementations of stochastic, limited-memory quasi-Newton optimizers, similar in spirit to the LBFGS. It includes an implementation of online LBFGS, stochastic quasi-Newton and adaptive quasi-Newton.
  • nonneg.cg realizes a conjugate-gradient based method to minimize functions subject to all variables being non-negative.
  • Package dfoptim, for derivative-free optimization procedures, contains quite efficient R implementations of the Nelder-Mead and Hooke-Jeeves algorithms (unconstrained and with bounds constraints).
  • Package nloptr provides access to NLopt, an LGPL licensed library of various nonlinear optimization algorithms. It includes local derivative-free (COBYLA, Nelder-Mead, Subplex) and gradient-based (e.g., BFGS) methods, and also the augmented Lagrangian approach for nonlinear constraints.
  • Package alabama provides an implementation of the Augmented Lagrange Barrier minimization algorithm for optimizing smooth nonlinear objective functions with (nonlinear) equality and inequality constraints. (A small sketch follows this list.)
  • Package Rsolnp provides an implementation of the Augmented Lagrange Multiplier method for solving nonlinear optimization problems with equality and inequality constraints (based on code by Y. Ye).
  • NlcOptim solves nonlinear optimization problems with linear and nonlinear equality and inequality constraints, implementing a Sequential Quadratic Programming (SQP) method; accepts the input parameters as a constrained matrix.
  • In package Rdonlp2 (see the rmetrics project on R-Forge), function donlp2(), a wrapper for the DONLP2 solver, offers the minimization of smooth nonlinear functions and constraints. DONLP2 can be used freely for any kind of research purposes; otherwise it requires licensing.
  • psqn provides quasi-Newton methods to minimize partially separable functions; the methods are largely described in "Numerical Optimization" by Nocedal and Wright (2006).
  • clue contains the function sumt() for solving constrained optimization problems via the sequential unconstrained minimization technique (SUMT).
  • BB contains the function spg() providing a spectral projected gradient method for large-scale optimization with simple constraints. It takes a nonlinear objective function as an argument as well as basic constraints.
  • ManifoldOptim is an R interface to the 'ROPTLIB' optimization library. It optimizes real-valued functions over manifolds such as Stiefel, Grassmann, and Symmetric Positive Definite matrices.
  • Several derivative-free optimization algorithms are provided with package minqa; e.g., the functions bobyqa(), newuoa(), and uobyqa() allow minimizing a function of many variables by a trust region method that forms quadratic models by interpolation. bobyqa() additionally permits box constraints (bounds) on the parameters.
  • subplex provides unconstrained function optimization based on a subspace searching simplex method.
  • In package trust, a routine with the same name offers local optimization based on the "trust region" approach.
  • trustOptim implements "trust region" for unconstrained nonlinear optimization. The algorithm is optimized for objective functions with sparse Hessians.
  • Package quantreg contains variations of simplex and of interior point routines (nlrq(), crq()). It provides an interface to L1 regression in the R code of function rq().
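
As noted in the alabama entry above, here is a hedged sketch of auglag() on a small nonlinearly constrained problem; the objective and constraints are made up for the example.

```r
# Sketch: alabama::auglag() on an invented problem
# minimize (x1 - 2)^2 + (x2 - 1)^2  subject to  x1 + x2 <= 2  and  x1^2 <= x2
library(alabama)
fn  <- function(x) (x[1] - 2)^2 + (x[2] - 1)^2
hin <- function(x) c(2 - x[1] - x[2],   # inequality constraints, written so that hin(x) >= 0
                     x[2] - x[1]^2)
auglag(par = c(0, 0), fn = fn, hin = hin)$par
```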

Quadratic Optimization

  • In package quadprog, solve.QP() solves quadratic programming problems with linear equality and inequality constraints. (The matrix of the quadratic term has to be positive definite.) quadprogXT extends this with absolute value constraints and absolute values in the objective function. (A small solve.QP() sketch follows this list.)
  • osqp provides bindings to OSQP, the 'Operator Splitting QP' solver from the University of Oxford Control Group; it solves sparse convex quadratic programming problems with optional equality and inequality constraints efficiently.
  • Package piqp implements an interface to the Proximal Interior Point Quadratic Programming solver, cf. PIQP from EPFL; it combines an infeasible interior point method with the proximal method of multipliers.
  • qpmadr interfaces the 'qpmad' software and solves quadratic programming (QP) problems with linear inequality, equality and bound constraints, using the method by Goldfarb and Idnani.
  • kernlab contains the function ipop for solving quadratic programming problems using interior point methods. (The matrix can be positive semidefinite.)
  • Dykstra solves quadratic programming problems using R. L. Dykstra's cyclic projection algorithm for positive definite and semidefinite matrices. The routine allows for a combination of equality and inequality constraints.
  • coneproj contains routines for cone projection and quadratic programming, estimation, and inference for constrained parametric regression, and shape-restricted regression problems.
  • LowRankQP solves quadratic programming problems where the Hessian is represented as the product of two matrices. It implements a primal-dual interior point method (for semidefinite quadratic forms).
  • The COIN-OR project 'qpOASES' implements a reliable QP solver, even when tackling semi-definite or degenerated QP problems; it is particularly suited for model predictive control (MPC) applications; the ROI plugin ROI.plugin.qpoases makes it accessible for R users.
  • mixsqp implements the "mix-SQP" algorithm, based on sequential quadratic programming (SQP), for maximum likelihood estimations in finite mixture models.
  • limSolve offers to solve linear or quadratic optimization functions, subject to equality and/or inequality constraints.
  • CGNM finds multiple solutions of nonlinear least-squares problems, without assuming uniqueness of the solution.
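
As noted in the quadprog entry above, a minimal solve.QP() sketch on a toy problem (the matrices are invented for illustration):

```r
# Sketch: quadprog::solve.QP() on a toy QP
# minimize 1/2 * x' D x - d' x   subject to   A' x >= b0
library(quadprog)
Dmat <- diag(2)                  # positive definite quadratic term
dvec <- c(1, 2)                  # linear term
Amat <- cbind(c(1, 1), c(1, 0))  # columns are constraints: x1 + x2 >= 2,  x1 >= 0
bvec <- c(2, 0)
solve.QP(Dmat, dvec, Amat, bvec, meq = 0)$solution
```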

Test and Benchmarking Collections

  • Objective functions for benchmarking the performance of global optimization algorithms can be found in globalOptTests.
  • smoof has generators for a number of both single- and multi-objective test functions that are frequently used for benchmarking optimization algorithms; offers a set of convenient functions to generate, plot, and work with objective functions.
  • flacco contains tools and features used for an Exploratory Landscape Analysis (ELA) of continuous optimization problems, capable of quantifying rather complex properties, such as the global structure, separability, etc., of the optimization problems.
  • Package funconstrain (on GitHub: jlmelville/funconstrain) implements 35 of the test functions by Moré, Garbow, and Hillstrom, useful for testing unconstrained optimization methods.

Least-Squares Problems

Function solve.qr() (resp. qr.solve()) handles over- and under-determined systems of linear equations, returning least-squares solutions if possible. And package stats provides nls() to determine least-squares estimates of the parameters of a nonlinear model. nls2 enhances function nls() with brute force or grid-based searches, to avoid being dependent on starting parameters or getting stuck in local solutions.
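
For example, a small nls() sketch on simulated exponential-decay data (model and data are assumptions made for this illustration):

```r
# Sketch: nonlinear least squares with nls() on simulated data
set.seed(42)
x <- seq(0, 5, length.out = 50)
y <- 3 * exp(-0.7 * x) + rnorm(50, sd = 0.05)
fit <- nls(y ~ a * exp(-b * x), start = list(a = 1, b = 1))
coef(fit)
```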

  • Package nlsr provides tools for working with nonlinear least-squares problems. Functions nlfb and nlxb are intended to eventually supersede the 'nls()' function in Base R, applying a variant of the Marquardt procedure for nonlinear least-squares, with bounds constraints and, optionally, a Jacobian described as an R function.
  • Package minpack.lm provides a function nls.lm() for solving nonlinear least-squares problems by a modification of the Levenberg-Marquardt algorithm, with support for lower and upper parameter bounds, as found in MINPACK.
  • Package onls (archived) implements orthogonal nonlinear least-squares regression (ONLS, a.k.a. orthogonal distance regression, ODR) for two-dimensional data, using a Levenberg-Marquardt-type minimization algorithm based on the ODRPACK Fortran library; it provides functionality for fit diagnostics and plotting and is an option when one encounters "error in variables" problems.
  • Package nnls interfaces the Lawson-Hanson implementation of an algorithm for non-negative least-squares, allowing the combination of non-negative and non-positive constraints. (A small sketch follows this list.)
  • Package lsei contains functions that solve least-squares linear regression problems under linear equality/inequality constraints. Functions for solving quadratic programming problems are also available, which transform such problems into least squares ones first. (Based on Fortran programs of Lawson and Hanson.)
  • Package gslnls provides an interface to nonlinear least-squares optimization methods from the GNU Scientific Library (GSL). The available trust region methods include the Levenberg-Marquardt algorithm with and without geodesic acceleration, and several variants of Powell's dogleg algorithm.
  • Package bvls interfaces the Stark-Parker implementation of an algorithm for least-squares with upper and lower bounded variables.
  • colf performs least squares constrained optimization on a linear objective function. It contains a number of algorithms to choose from and offers a formula syntax similar to lm().
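
As noted in the nnls entry above, a minimal sketch on toy data (matrix and coefficients invented for illustration):

```r
# Sketch: non-negative least squares with nnls::nnls() on toy data
library(nnls)
set.seed(7)
A <- matrix(runif(30), nrow = 10, ncol = 3)
b <- drop(A %*% c(0.5, 0, 2)) + rnorm(10, sd = 0.01)
fit <- nnls(A, b)
fit$x        # coefficients constrained to be >= 0
```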

Semidefinite and Convex Solvers

  • Package ECOSolveR provides an interface to the Embedded COnic Solver (ECOS), a well-known, efficient, and robust C library for convex problems. Conic and equality constraints can be specified in addition to integer and boolean variable constraints for mixed-integer problems.
  • Package scs applies operator splitting to solve linear programs (LPs), second-order cone programs (SOCPs), semidefinite programs (SDPs), exponential cone programs (ECPs), and power cone programs (PCPs), or problems with any combination of those cones.
  • Package clarabel provides an interior point numerical solver for convex optimization problems using a novel homogeneous embedding; it solves linear programs (LPs), quadratic programs (QPs), second-order cone programs (SOCPs), semidefinite programs (SDPs), and problems with exponential and power cone constraints. (See the Clarabel Docs.)
  • sdpt3r solves general semidefinite linear programming problems, using an R implementation of the MATLAB toolbox SDPT3. Included are problems such as the nearest correlation matrix, D-optimal experimental design, Distance Weighted Discrimination, and the maximum cut problem.
  • cccp contains routines for solving cone-constrained convex problems by means of interior-point methods. The implemented algorithms are partially ported from CVXOPT, a Python module for convex optimization.
  • CSDP is a library of routines that implements a primal-dual barrier method for solving semidefinite programming problems; it is interfaced in the Rcsdp package.
  • The DSDP library implements an interior-point method for semidefinite programming with primal and dual solutions; it is interfaced in package Rdsdp.

Global and Stochastic Optimization

  • Package DEoptim provides a global optimizer based on the Differential Evolution algorithm; a usage sketch follows this list. RcppDE provides a C++ implementation (using Rcpp) of the same DEoptim() function.
  • DEoptimR provides an implementation of the jDE variant of the differential evolution stochastic algorithm for nonlinear programming problems. (It allows handling constraints in a flexible manner.)
  • The CEoptim package implements a cross-entropy optimization technique that can be applied to continuous, discrete, mixed, and constrained optimization problems.
  • GenSA is a package providing a function for generalized Simulated Annealing which can be used to search for the global minimum of a quite complex non-linear objective function with a large number of optima.
  • GA provides functions for optimization using Genetic Algorithms in both the continuous and the discrete case. The package allows running corresponding optimization tasks in parallel.
  • In package gafit gafit() uses a genetic algorithm approach to find the minimum of a one-dimensional function.
  • Package genalg contains rbga(), an implementation of a genetic algorithm for multi-dimensional function optimization.
  • Package rgenoud offers genoud(), a routine which is capable of solving complex function minimization/maximization problems by combining evolutionary algorithms with a derivative-based (quasi-Newtonian) approach.
  • Machine coded genetic algorithm (MCGA) provided by package mcga is a tool that solves optimization problems based on byte representation of variables.
  • A particle swarm optimizer (PSO) is implemented in package pso, and also in psoptim. Another (parallelized) implementation of the PSO algorithm can be found in package ppso, available from rforge.net/ppso.
  • Package hydroPSO implements the Standard Particle Swarm Optimization (SPSO) algorithm; it is parallel-capable and includes several fine-tuning options and post-processing functions.
  • r github("floybix/hydromad") (on Github) contains the SCEoptim function for Shuffled Compex Evolution (SCE) optimization, an evolutionary algorithm, combined with a simplex method.
  • Package ABCoptim implements the Artificial Bee Colony (ABC) optimization approach.
  • Package metaheuristicOpt contains implementations of several evolutionary optimization algorithms, such as particle swarm, dragonfly, firefly, and sine cosine algorithms, among many others.
  • Package ecr provides a framework for building evolutionary algorithms for single- and multi-objective continuous or discrete optimization problems. And emoa has a collection of building blocks for the design and analysis of evolutionary multiobjective optimization algorithms.
  • CMA-ES by N. Hansen, a global optimization procedure using a covariance matrix adapting evolutionary strategy, is implemented in several packages: in packages cmaes and cmaesr, in parma as cmaes, in adagio as pureCMAES, and in rCMA as cmaOptimDP, the latter interfacing Hansen's own Java implementation.
  • Package Rmalschains implements an algorithm family for continuous optimization called memetic algorithms with local search chains (MA-LS-Chains).
  • An R implementation of the Self-Organising Migrating Algorithm (SOMA) is available in package soma. This stochastic optimization method is somewhat similar to genetic algorithms.
  • nloptr supports several global optimization routines, such as DIRECT, controlled random search (CRS), multi-level single-linkage (MLSL), improved stochastic ranking (ISR-ES), or stochastic global optimization (StoGO).
  • The NMOF package provides implementations of differential evolution, particle swarm optimization, local search and threshold accepting (a variant of simulated annealing). The latter two methods also work for discrete optimization problems, as does the implementation of a genetic algorithm that is included in the package.
  • OOR implements optimistic optimization methods for global optimization of deterministic or stochastic functions.
  • RCEIM implements a stochastic heuristic method for performing multi-dimensional function optimization.
  • Package graDiEnt implements the Stochastic Quasi-Gradient Differential Evolution (SQG-DE) optimization algorithm; being derivative-free, it combines the robustness of the population-based "Differential Evolution" with the efficiency of gradient-based optimization.
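
As noted in the DEoptim entry above, a minimal sketch on the two-dimensional Rastrigin function, a standard multimodal test problem chosen for illustration:

```r
# Sketch: global optimization of the 2-D Rastrigin function with DEoptim
library(DEoptim)
rastrigin <- function(x) 10 * length(x) + sum(x^2 - 10 * cos(2 * pi * x))
set.seed(1)
res <- DEoptim(rastrigin, lower = c(-5, -5), upper = c(5, 5),
               control = DEoptim.control(NP = 40, itermax = 200, trace = FALSE))
res$optim$bestmem   # best parameter vector found
```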

Mathematical Programming Solvers

This section provides an overview of open source as well as commercial optimizers.

  • Package ompr is an optimization modeling package to model and solve Mixed Integer Linear Programs in an algebraic way directly in R. The models are solver-independent and thus offer the possibility to solve models with different solvers. (Inspired by Julia's JuMP project.)
  • linprog solves linear programming problems using the function solveLP() (the solver is based on lpSolve) and can read model files in MPS format.
  • In the boot package there is a routine called simplex() which realizes the two-phase tableau simplex method for (relatively small) linear programming problems.
  • rcdd offers the function lpcdd() for solving linear programs with exact arithmetic using the GNU Multiple Precision (GMP) library.
  • The NEOS Server for Optimization provides online access to state-of-the-art optimization problem solvers. The packages rneos and ROI.plugin.neos enable the user to pass optimization problems to NEOS and retrieve results within R.

Interfaces to Open Source Optimizers

  • Package lpSolve contains the routine lp() to solve LPs and MILPs by calling the freely available solver lp_solve; a small lp() sketch follows this list. This solver is based on the revised simplex method and a branch-and-bound (B&B) approach. It supports semi-continuous variables and Special Ordered Sets (SOS). Furthermore, lp.assign() and lp.transport() are aimed at solving assignment problems and transportation problems, respectively. Additionally, there is the package lpSolveAPI which provides an R interface to the low-level API routines of lp_solve (see also the lpsolve project on R-Forge). lpSolveAPI supports reading linear programs from files in lp and MPS format.
  • Packages glpkAPI and Rglpk provide interfaces to the GNU Linear Programming Kit (GLPK). Whereas the former provides access to low-level routines, the latter provides a routine Rglpk_solve_LP() to solve MILPs using GLPK. Both packages offer the possibility to use models formulated in the MPS format.
  • Rsymphony has the routine Rsymphony_solve_LP() that interfaces the SYMPHONY solver for mixed-integer linear programs. (SYMPHONY is part of the Computational Infrastructure for Operations Research (COIN-OR) project.) Package lpsymphony in Bioconductor provides a similar interface to SYMPHONY which is easier to install.
  • The NOMAD solver is implemented in the crs package for solving mixed integer programming problems. This algorithm is accessible via the snomadr() function and is primarily designed for constrained optimization of black box functions.
  • 'Clp' and 'Cbc' are open-source solvers from the COIN-OR suite. 'Clp' solves linear programs with continuous objective variables and is available through ROI.plugin.clp. 'Cbc' is a powerful mixed integer linear programming solver (based on 'Clp'); package 'rcbc' can be installed from GitHub (dirkschumacher/rcbc).
  • Package highs is an R interface to the HiGHS solver. HiGHS is currently among the best open-source mixed-integer linear programming solvers. Furthermore, it can be used to solve quadratic optimization problems (without mixed integer constraints).
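
As noted in the lpSolve entry above, a minimal lp() sketch on a toy linear program (all coefficients invented for illustration):

```r
# Sketch: a toy linear program with lpSolve::lp()
# maximize 3 x1 + 2 x2  subject to  x1 + x2 <= 4,  x1 + 3 x2 <= 6,  x >= 0
library(lpSolve)
f.obj <- c(3, 2)
f.con <- matrix(c(1, 1,
                  1, 3), nrow = 2, byrow = TRUE)
f.dir <- c("<=", "<=")
f.rhs <- c(4, 6)
res <- lp("max", f.obj, f.con, f.dir, f.rhs)
res$solution   # optimal values of x1, x2
```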

Interfaces to Commercial Optimizers

This section surveys interfaces to commercial solvers. Typically, the corresponding libraries have to be installed separately.

  • Package Rcplex provides an interface to the IBM CPLEX Optimizer. CPLEX provides dual/primal simplex optimizers as well as a barrier optimizer for solving large-scale linear and quadratic programs. It offers a mixed integer optimizer to solve difficult mixed integer programs including (possibly non-convex) MIQCP. Note that CPLEX is not free and you have to get a license. Academics will receive a free license upon request.
  • Package Rmosek provides an interface to the (commercial) MOSEK optimization library for large-scale LP, QP, and MIP problems, with emphasis on (nonlinear) conic, semidefinite, and convex tasks. The solver can handle SOCP and quadratically constrained quadratic programming (QCQP) tasks and can solve difficult mixed integer programs. (Academic licenses are available free of charge. An article on Rmosek appeared in the JSS special issue on Optimization with R, see below.)
  • 'Gurobi Optimization' ships an R package with its software that allows for calling its solvers from R. Gurobi provides powerful solvers for LP, MIP, QP, MIQP, SOCP, and MISOCP models. See their website for more details. (Academic licenses are available on request.)

Some other commercial vendors, e.g., 'LocalSolver', 'Artelys Knitro', or 'FICO Xpress Optimization', provide R interfaces that are installed along with their software. Trial licenses are available; see the corresponding websites for more information.

Combinatorial Optimization

  • Package adagio provides R functions for single and multiple knapsack and bin packing problems, solves subset sum, maximal sum subarray, empty rectangle and set cover problems, and finds Hamiltonian paths in graphs.
  • In package clue, solve_LSAP() enables the user to solve the linear sum assignment problem (LSAP) using an efficient C implementation of the Hungarian algorithm; a small sketch follows this list. And function LAPJV() from package TreeDist implements the Jonker-Volgenant algorithm to solve the LSAP even faster.
  • Package qap solves Quadratic Assignment Problems (QAP) applying a simulated annealing heuristic (other approaches will follow).
  • igraph, a package for graph and network analysis, uses the very fast igraph C library. It can be used to calculate shortest paths, maximal network flows, minimum spanning trees, etc.
  • mknapsack solves multiple knapsack problems, based on LP solvers such as 'lpSolve' or 'CBC'; it assigns items to knapsacks in such a way that the value of the top knapsacks is as large as possible.
  • Package 'knapsack' (see the R-Forge project 'optimist') provides routines from the book "Knapsack Problems" by Martello and Toth. There are functions for (multiple) knapsack, subset sum, and binpacking problems. (Use of the Fortran codes is restricted to personal research and academic purposes only.)
  • nilde provides routines for enumerating all integer solutions of linear Diophantine equations, resp. all solutions of knapsack, subset sum, and additive partitioning problems (based on a generating functions approach).
  • matchingR implements the Gale-Shapley algorithm for the stable marriage and college admissions problems, as well as the stable roommates and house allocation problems.
  • Package optmatch provides routines for solving matching problems by translating them into minimum-cost flow problems, which are then solved optimally by the RELAX-IV codes of Bertsekas and Tseng (free for research).
  • Package TSP provides basic infrastructure for handling and solving the traveling salesperson problem (TSP). The main routine solve_TSP() solves the TSP through several heuristics. In addition, it provides an interface to the Concorde TSP Solver, which has to be downloaded separately.
  • rminizinc provides an interface to the open-source constraint modeling language and system MiniZinc (to be downloaded separately). R users can apply the package to solve combinatorial optimization problems by modifying existing 'MiniZinc' models, and also by creating their own models.
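
As noted in the clue entry above, a minimal solve_LSAP() sketch on a random cost matrix (invented for illustration):

```r
# Sketch: linear sum assignment with clue::solve_LSAP() on a random cost matrix
library(clue)
set.seed(3)
costs <- matrix(runif(25), nrow = 5)   # cost of assigning worker i to task j
assignment <- solve_LSAP(costs)        # minimizes total cost by default
assignment                             # task assigned to each worker
sum(costs[cbind(seq_len(5), as.vector(assignment))])   # total cost of the optimal assignment
```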

Multi Objective Optimization

  • Function caRamel in package caRamel is a multi-objective optimizer, applying a combination of the multi-objective evolutionary annealing-simplex (MEAS) method and the non-dominated sorting genetic algorithm (NSGA-II); it was initially developed for the calibration of hydrological models.
  • Multi-criteria optimization problems can be solved using package mco which implements genetic algorithms.
  • GPareto provides multi-objective optimization algorithms for expensive black-box functions and uncertainty quantification methods.
  • The rmoo package is a framework for multi- and many-objective optimization, allowing one to work with representations of real numbers, permutations, and binaries, and offering a wide range of configurations.

Miscellaneous

  • The data cloning algorithm is a global optimization approach and a variant of simulated annealing which has been implemented in package dclone. The package provides low level functions for implementing maximum likelihood estimating procedures for complex models.
  • The irace package implements automatic configuration procedures for optimizing the parameters of other optimization algorithms, that is (offline) tuning their parameters by finding the most appropriate settings given a set of optimization problems.
  • Package kofnGA uses a genetic algorithm to choose a subset of a fixed size k from the integers 1:n, such that a user-supplied objective function is minimized at that subset.
  • copulaedas provides a platform where 'estimation of distribution algorithms' (EDA) based on copulas can be implemented and studied; the package offers various EDAs, and newly developed EDAs can be integrated by extending an S4 class.
  • tabuSearch implements a tabu search algorithm for optimizing binary strings, maximizing a user-defined target function, and returns the best (i.e. maximizing) binary configuration found.
  • Besides functionality for solving general isotone regression problems, package isotone provides a framework of active set methods for isotone optimization problems with arbitrary order restrictions.
  • mlrMBO is a flexible and comprehensive R toolbox for model-based optimization ('MBO'), also known as Bayesian optimization. And rBayesianOptimization is an implementation of Bayesian global optimization with Gaussian Processes, for parameter tuning and optimization of hyperparameters.
  • The desirability package contains S3 classes for multivariate optimization using the desirability function approach of Harrington (1965).
  • Package sna contains the function lab.optimize() which is the front-end to a set of heuristic routines for optimizing some bivariate graph statistics.
  • maxLik adds a likelihood-specific layer on top of a number of maximization routines like Berndt-Hall-Hall-Hausman (BHHH) and Newton-Raphson, among others. It includes summary and print methods that extract standard errors based on the Hessian matrix, and it allows easy swapping of maximization algorithms. (A small sketch follows.)
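
A hedged illustration of the maxLik interface follows; the normal-sample log-likelihood and data are made up for this example.

```r
# Sketch: ML estimation of a normal mean/sd with maxLik (toy data)
library(maxLik)
set.seed(5)
x <- rnorm(100, mean = 1, sd = 2)
loglik <- function(theta) sum(dnorm(x, mean = theta[1], sd = theta[2], log = TRUE))
fit <- maxLik(logLik = loglik, start = c(mu = 0, sigma = 1), method = "NR")
summary(fit)   # estimates with Hessian-based standard errors
```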

References

Core packages

Related links

Other resources
