
[Not for Merge] New interface for Evolutionary.jl #26

Closed
GiggleLiu wants to merge 2 commits into wildart:master from GiggleLiu:master

Conversation

GiggleLiu
Contributor

Suggestions for an interface change.

@coveralls

Coverage Status

Coverage decreased (-58.0%) to 42.039% when pulling 77d9109 on GiggleLiu:master into c7fb224 on wildart:master.

@wildart
Owner

wildart commented Nov 28, 2018

I find the AbstractOptProblem interface superficial. Can you elaborate more on its usage? Usually, it's possible to design an optimized "loss" without complex type definitions. What is the benefit of such an interface?

@GiggleLiu
Contributor Author

GiggleLiu commented Nov 28, 2018

It's not about the interfaces themselves, but about the design.

  • Currently the program accepts only a limited set of input types like Float and Vector. Isn't `eltype(::AbstractOptProblem)` a more natural design? Also, the current version of the `getIndividual` function is not satisfactory; isn't `getIndividual(::AbstractOptProblem)` better?
  • In general, one needs to specify multiple functions (loss, jacobian, and hessian), so dispatch is a natural way of specifying them.
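For illustration, the two points above could be sketched roughly like this (a hypothetical sketch only; `AbstractOptProblem`, `RosenbrockProblem`, and `loss` as written here are not part of Evolutionary.jl):

```julia
# Hypothetical sketch: a problem type that carries its own element type and
# loss via dispatch, instead of the optimizer branching on Float/Vector inputs.
abstract type AbstractOptProblem end

struct RosenbrockProblem{T<:Real} <: AbstractOptProblem
    x0::Vector{T}   # initial point; its element type defines the problem's eltype
end

# The element type falls out of the problem definition via dispatch.
Base.eltype(::RosenbrockProblem{T}) where {T} = T

# The loss is just another method on the problem type; a jacobian or hessian
# could be attached the same way for solvers that need them.
loss(::RosenbrockProblem, x) =
    sum(100 .* (x[2:end] .- x[1:end-1].^2).^2 .+ (1 .- x[1:end-1]).^2)

p = RosenbrockProblem([0.5, 0.5])
eltype(p)            # → Float64
loss(p, [1.0, 1.0])  # → 0.0 (global minimum of the Rosenbrock function)
```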

Since all these features can be achieved with more optional parameters, this design does not by itself make the interfaces easier to use. But I believe it would be a good intermediate layer towards a unified optimizer framework.

Never mind if you don't like it; this is not a necessary feature.

@wildart
Owner

wildart commented Nov 30, 2018

I like the API from Optim.jl, where the optimize function is called with multiple parameters, which is similar to what the individual methods of this package already take. BTW, the Optim-style API supports gradients and Hessians.

I think a similar API can be crafted:

N = 2
optimize(f, N, CMAES(μ = 3, λ = 12))

x0 = [.5, .5] # initPopulation
optimize(f, x0, ES(
    initStrategy = strategy(σ = 1.0),
    recombination = average,
    mutation = isotropic,
    μ = 15, ρ = 5, λ = 100
))

optimize(f, N, GA(
    populationSize = 100,
    ɛ = 0.1,
    selection = sus,
    crossover = intermediate(0.25),
    mutation = domainrange(fill(0.5, N))
))

and in addition have a similar interface for iterators

ci = Optimize(f, N, CMAES(μ = 3, λ = 12))
for (count, cr) in enumerate(ci)
    if count == 1_000 || cr.σ < 1e-10 break end
    println("BEST: $(cr.fitpop[1]): $(cr.σ)")
end

> Also, the current version of the `getIndividual` function is not satisfactory; isn't `getIndividual(::AbstractOptProblem)` better?

Definitely, getIndividual should be rewritten in a multiple-dispatch style, which will allow extending it to different types of individual encoding.
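One way such a multiple-dispatch version might look (a hypothetical sketch; the problem types `BitStringProblem` and `BoxProblem` are illustrative and not the package's actual API):

```julia
# Hypothetical sketch: one getIndividual method per encoding type, rather than
# branching on Float/Vector inside a single function. New encodings extend the
# function by adding a method instead of modifying it.
abstract type AbstractOptProblem end

struct BitStringProblem <: AbstractOptProblem
    n::Int                    # number of bits in an individual
end

struct BoxProblem <: AbstractOptProblem
    lower::Vector{Float64}    # per-dimension lower bounds
    upper::Vector{Float64}    # per-dimension upper bounds
end

# Binary encoding: a random bit string of length n.
getIndividual(p::BitStringProblem) = rand(Bool, p.n)

# Real-vector encoding: a random point inside the box constraints.
getIndividual(p::BoxProblem) =
    p.lower .+ rand(length(p.lower)) .* (p.upper .- p.lower)

getIndividual(BitStringProblem(8))                    # 8-element Bool vector
getIndividual(BoxProblem([0.0, 0.0], [1.0, 1.0]))     # point in the unit square
```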

@wildart wildart mentioned this pull request Mar 18, 2020
@wildart
Owner

wildart commented Apr 23, 2020

See #49 for the new API implementation.

@wildart wildart closed this Apr 23, 2020