
first attempts at writing the serializers for the new types PhylogeneticModel and GroupBasedPhylogeneticModel #4010

Draft · wants to merge 15 commits into master

Conversation

mariusneu
Contributor

I am currently trying to serialize the new data types introduced in the algebraic statistics (sub)project on algebraic phylogenetics by @marinagarrote et al. (cf. #3812). Following the documentation and writing the code as naively as possible, I ran into problems when testing the functions: trying to save an instance of GroupBasedPhylogeneticModel fails with the error message: Unsupported type 'GroupBasedPhylogeneticModel' for encoding.
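In Oscar's serialization framework this error usually means the type has not yet been registered for serialization. A minimal sketch of the registration step (an assumption about where the fix would go, not code quoted from the PR):

# register the new types with Oscar's serializer so save_object/load_object
# can dispatch on them; without this, encoding fails with the error above
@register_serialization_type PhylogeneticModel
@register_serialization_type GroupBasedPhylogeneticModel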


codecov bot commented Aug 14, 2024

Codecov Report

Attention: Patch coverage is 17.07317% with 34 lines in your changes missing coverage. Please review.

Project coverage is 84.10%. Comparing base (6f856d1) to head (fdb6a0b).
Report is 62 commits behind head on master.

Files with missing lines Patch % Lines
...erimental/AlgebraicStatistics/src/serialization.jl 0.00% 26 Missing ⚠️
src/Serialization/Combinatorics.jl 0.00% 6 Missing ⚠️
src/Serialization/containers.jl 75.00% 2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #4010      +/-   ##
==========================================
- Coverage   84.58%   84.10%   -0.48%     
==========================================
  Files         596      613      +17     
  Lines       81969    83133    +1164     
==========================================
+ Hits        69336    69923     +587     
- Misses      12633    13210     +577     
Files with missing lines Coverage Δ
...ntal/AlgebraicStatistics/src/PhylogeneticModels.jl 100.00% <100.00%> (ø)
src/Serialization/containers.jl 88.16% <75.00%> (+0.41%) ⬆️
src/Serialization/Combinatorics.jl 74.41% <0.00%> (-12.07%) ⬇️
...erimental/AlgebraicStatistics/src/serialization.jl 0.00% <0.00%> (ø)

... and 108 files with indirect coverage changes

save_object(s, graph(pm), :tree) # already implemented
save_object(s, number_states(pm), :states) # already implemented
save_object(s, probability_ring(pm), :probability_ring) # already implemented
save_object(s, root_distribution(pm), :root_distribution) # needs to be implemented, or we change the entry type to QQFieldElem
Collaborator

You can use save_typed_object here if the type isn't always known. If you're using a Vector{Any} because not all entries are the same type, you'll need to use a tuple.
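A minimal sketch of that suggestion applied to the root distribution entry from the snippet above (the key name :root_distribution is taken from the existing save_object call):

# save_typed_object also records the type of the value, so a Vector{Any} with
# mixed entry types can be read back without knowing the types in advance
save_typed_object(s, root_distribution(pm), :root_distribution)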

@@ -19,6 +20,19 @@ function load_object(s::DeserializerState, g::Type{Graph{T}}) where T <: Union{D
return g(smallobj)
end

function save_object(s::SerializerState, e::Edge)
save_data_dict(s) do
Collaborator

This can be done using save_data_array instead; it'll make the file sizes smaller.
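A sketch of what the array-based version could look like, assuming Oscar's src/dst accessors for Edge (this spells out the reviewer's suggestion and is not code from the PR):

# store the edge as a plain two-element array [src, dst] instead of a data dict;
# save_data_array writes a flat array, which keeps the files smaller
function save_object(s::SerializerState, e::Edge)
  save_data_array(s) do
    save_object(s, src(e))
    save_object(s, dst(e))
  end
end

The matching load_object would then read the two entries back in the same order.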

antonydellavecchia and others added 2 commits August 21, 2024 17:58
…zation of dicts to catch the case where the key is not serialized with params


function load_object(s::DeserializerState, g::Type{PhylogeneticModel})
graph = load_object(s, Graph{Directed}, :tree)
Collaborator

Here you would need to use:

Suggested change
graph = load_object(s, Graph{Directed}, :tree)
graph = load_typed_object(s, :tree)
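Putting the review comments together, the loader body might end up looking roughly like this; the keys mirror the save_object calls quoted earlier, while the :transition_matrices key and the five-argument PhylogeneticModel constructor are assumptions, not taken from the PR:

function load_object(s::DeserializerState, ::Type{PhylogeneticModel})
  graph = load_typed_object(s, :tree)                    # type stored with the data
  n_states = load_object(s, Int, :states)                # type known statically
  ring = load_typed_object(s, :probability_ring)
  root_distr = load_typed_object(s, :root_distribution)
  matrices = load_typed_object(s, :transition_matrices)  # assumed key name
  return PhylogeneticModel(graph, n_states, ring, root_distr, matrices)  # assumed constructor
end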
