
Optimize copy of a SparseMatrixCOO to a dense matrix #33

Merged (1 commit) on May 23, 2021

Conversation

@frapac (Collaborator) commented May 20, 2021

When solving dense nonlinear optimization problems, copying a SparseMatrixCOO to a dense array accounts for up to 11% of the total running time in MadNLP.optimize!. This PR adds an @inbounds macro to optimize the function _copyto!.
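For context, the kind of kernel this PR targets can be sketched as below. This is a hedged illustration, not MadNLP's actual code: the struct layout (fields I, J, V) and the duplicate-summing convention are assumptions. The point is that once the COO index vectors are known to be valid for the destination, `@inbounds` lets Julia skip per-element bounds checks in the hot loop.

```julia
# Hypothetical minimal COO container; field names I, J, V are
# assumptions and may differ from MadNLP's SparseMatrixCOO layout.
struct SparseMatrixCOO{T}
    m::Int
    n::Int
    I::Vector{Int}
    J::Vector{Int}
    V::Vector{T}
end

# Copy COO entries into a dense matrix, summing duplicate entries.
# @inbounds elides bounds checking inside the loop, which is safe
# only if every (I[k], J[k]) pair is a valid index of `dense`.
function _copyto!(dense::Matrix{T}, coo::SparseMatrixCOO{T}) where T
    fill!(dense, zero(T))
    @inbounds for k in 1:length(coo.V)
        dense[coo.I[k], coo.J[k]] += coo.V[k]
    end
    return dense
end

coo = SparseMatrixCOO(2, 2, [1, 2, 1], [1, 2, 2], [1.0, 2.0, 3.0])
dense = Matrix{Float64}(undef, 2, 2)
_copyto!(dense, coo)
# dense == [1.0 3.0; 0.0 2.0]
```

Note that `@inbounds` is an unsafe annotation: if the index vectors can contain out-of-range entries, removing the checks turns an error into memory corruption, so it belongs only in internal functions whose inputs are validated upstream.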

@frapac frapac requested a review from sshin23 May 20, 2021 20:17
@frapac (Collaborator, Author) commented May 22, 2021

Tests are breaking because of IterativeSolvers. See PR #34

@codecov bot commented May 22, 2021

Codecov Report

Merging #33 (6315a38) into master (698a009) will not change coverage.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master      #33   +/-   ##
=======================================
  Coverage   86.33%   86.33%           
=======================================
  Files          19       19           
  Lines        2634     2634           
=======================================
  Hits         2274     2274           
  Misses        360      360           
Impacted Files Coverage Δ
src/matrixtools.jl 100.00% <100.00%> (ø)

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 698a009...6315a38.

@sshin23 sshin23 merged commit 2ce1a30 into MadNLP:master May 23, 2021
@sshin23 (Member) commented May 23, 2021

@frapac Thanks for working on this! I think later we should separately define a dense nonlinear program type and natively support dense NLPs, but for now this looks good.
