
genericize CGLS #53

Closed
wants to merge 2 commits into from

Conversation

@marcrasi (Collaborator) commented May 12, 2020

This PR introduces a DecomposedAffineFunction protocol abstracting the concept of an affine function whose linear and bias components are separately accessible, and whose linear component can be applied in either the forward or the adjoint direction.

This PR also changes CGLS to work in terms of an abstract DecomposedAffineFunction. Finally, this PR conforms GaussianFactorGraph to DecomposedAffineFunction so that we can run CGLS on it.

This allows us to run CGLS on anything, as long as it supports the required operations. For example:

  • GaussianFactorGraph as it is today.
  • We can experiment with other representations of Gaussian factor graphs that involve fewer heap allocations, without modifying the implementation of CGLS and without touching the existing GaussianFactorGraph!
  • Matrix-free Jacobians.
  • In principle, it should be possible to write a DistributedGaussianFactorGraph that distributes its state over multiple computers, and run the existing CGLS algorithm without any modifications. (I haven't thought too hard about this, so there may be some practical obstacles.)
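As a sketch of the shape this abstraction might take (the protocol name comes from this PR, but the member names and the toy conformance below are assumptions for illustration, not the PR's actual API):

```swift
/// An affine function f(x) = A(x) + b whose linear component A and bias
/// component b are separately accessible. A can be applied in the forward
/// or the adjoint direction.
protocol DecomposedAffineFunction {
  associatedtype Input
  associatedtype Output

  /// Applies the linear component: A(x).
  func applyLinearForward(_ x: Input) -> Output

  /// Applies the adjoint of the linear component: Aᵀ(y).
  func applyLinearAdjoint(_ y: Output) -> Input

  /// The bias component b.
  var bias: Output { get }
}

/// A toy conformance over scalars: f(x) = a * x + b.
struct ScalarAffine: DecomposedAffineFunction {
  var a: Double
  var b: Double
  func applyLinearForward(_ x: Double) -> Double { a * x }
  // The adjoint of multiplication by a scalar is multiplication by the
  // same scalar.
  func applyLinearAdjoint(_ y: Double) -> Double { a * y }
  var bias: Double { b }
}

// CGLS can then be written once, generically, against the protocol:
//   func cgls<F: DecomposedAffineFunction>(_ f: F, ...) -> F.Input
print(ScalarAffine(a: 2, b: 1).applyLinearForward(3))  // 6.0
```

Anything satisfying these requirements, a factor graph, a matrix-free operator, or a distributed structure, would then be a valid CGLS input.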

I wrote more justification for why genericizing is good in #42 (comment) and #42 (comment).

Examples/Pose2SLAMG2O/main.swift (outdated)
}
}

/// Errors + scalar
static func + (_ lhs: Self, _ rhs: Double) -> Self {
Collaborator

Could you explain briefly how these are automatically implemented? Curious on the workings...

@marcrasi (Collaborator, Author)

Yeah! We have automatic synthesis for AdditiveArithmetic and VectorProtocol that looks at all the fields of the struct. If all the fields conform to AdditiveArithmetic or VectorProtocol, it automatically implements the requirements by applying the functions to all the members.
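To make that concrete, here is a hand-written version of what the synthesized conformance is equivalent to, for a hypothetical two-field struct (Point is just an illustration; the real implementations are compiler-generated):

```swift
struct Point: AdditiveArithmetic {
  var x: Double
  var y: Double

  // The synthesis applies each requirement member-wise across the stored
  // properties, which is equivalent to writing:
  static var zero: Point { Point(x: .zero, y: .zero) }
  static func + (lhs: Point, rhs: Point) -> Point {
    Point(x: lhs.x + rhs.x, y: lhs.y + rhs.y)
  }
  static func - (lhs: Point, rhs: Point) -> Point {
    Point(x: lhs.x - rhs.x, y: lhs.y - rhs.y)
  }
}

let p = Point(x: 1, y: 2) + Point(x: 3, y: 4)
print(p.x, p.y)  // 4.0 6.0
```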

@marcrasi (Collaborator, Author)

The single-field Array<Error>.DifferentiableView conforms to AdditiveArithmetic and VectorProtocol because:

  1. Error (aka Vector) conforms to them.
  2. Array.DifferentiableView conforms to AdditiveArithmetic when its Element does: https://github.com/apple/swift/blob/4bc72aedb2b32c33ed8e2ec241615fc890b60002/stdlib/public/Differentiation/ArrayDifferentiation.swift#L98
  3. Array.DifferentiableView conforms to VectorProtocol when its Element does: https://github.com/tensorflow/swift-apis/blob/a8a24c46e478ce50c1c9a7718a41eec453e5b670/Sources/TensorFlow/StdlibExtensions.swift#L224
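The conditional-conformance pattern in points 2 and 3 looks like this in general (Boxed is a hypothetical wrapper standing in for Array.DifferentiableView; the real conformances are at the links above):

```swift
/// A hypothetical single-field wrapper, standing in for
/// Array.DifferentiableView.
struct Boxed<Element> {
  var values: [Element]
}

extension Boxed: Equatable where Element: Equatable {}

/// Conditional conformance: Boxed is AdditiveArithmetic exactly when its
/// Element is, with the operations applied element-wise.
extension Boxed: AdditiveArithmetic where Element: AdditiveArithmetic {
  // Note: for brevity this sketch assumes operands have equal counts; the
  // real DifferentiableView also treats `zero` as an identity element.
  static var zero: Boxed { Boxed(values: []) }
  static func + (lhs: Boxed, rhs: Boxed) -> Boxed {
    Boxed(values: zip(lhs.values, rhs.values).map { pair in pair.0 + pair.1 })
  }
  static func - (lhs: Boxed, rhs: Boxed) -> Boxed {
    Boxed(values: zip(lhs.values, rhs.values).map { pair in pair.0 - pair.1 })
  }
}

let s = Boxed(values: [1.0, 2.0]) + Boxed(values: [10.0, 20.0])
print(s.values)  // [11.0, 22.0]
```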

Sources/SwiftFusion/Optimizers/CGLS.swift (outdated)
@dellaert (Member)

See my comment in #42 - might impact above

@marcrasi (Collaborator, Author)

Based on discussions in #42 and with Dave, I have improved the protocol naming.

@dellaert (Member)

We really need to talk :-) I sent an invite for a hangout at 11am your time.


// MARK: - Additional EuclideanVectorSpace requirements.

public var squaredNorm: Double {
Collaborator

Do we still need the .differentiableMap methods?

@marcrasi (Collaborator, Author)

.differentiableMap is still necessary for writing a differentiable function that does a map over an array.

This function currently isn't declared @differentiable, so it does not need to use .differentiableMap.

I like to avoid adding the @differentiable annotation until we actually need to use a function in a differentiable way, to keep things as simple as possible.
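For reference, the distinction looks roughly like this on a toolchain with differentiable Swift (a minimal sketch; squareAll and squaredNorm here are made-up examples, and the attribute spelling is the current `@differentiable(reverse)` form rather than the plain `@differentiable` in use when this PR was written):

```swift
import _Differentiation

// `map` has no registered derivative, so a function that must remain
// differentiable through an array transformation uses `differentiableMap`:
@differentiable(reverse)
func squareAll(_ xs: [Double]) -> [Double] {
  xs.differentiableMap { $0 * $0 }
}

// A function that is never differentiated can use plain `map` and skip
// the @differentiable annotation entirely:
func squaredNorm(_ xs: [Double]) -> Double {
  xs.map { $0 * $0 }.reduce(0, +)
}
```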

@marcrasi (Collaborator, Author) commented Jun 2, 2020

Closing this to clean up, since it won't be relevant with the new generic factor graph data structure.

@marcrasi marcrasi closed this Jun 2, 2020
@marcrasi marcrasi deleted the feature/generic_cgls branch June 2, 2020 17:16