Create ZoneOffsetCalculator
#5540
Conversation
Good work! Unsure on some of the details in datagen but the rest looks great.
```rust
/// The default mapping between periods and offsets. The second-level key is a wall-clock time,
/// represented as the number of minutes since the local Unix epoch, at which the offsets stopped
/// being used.
```
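The end-keyed layout this doc comment describes can be sketched with a plain `BTreeMap` (a hypothetical simplification, not the actual zero-copy struct): looking up the offset for a given wall-clock minute means finding the first period whose end time is after that minute.

```rust
use std::collections::BTreeMap;

/// Hypothetical end-keyed lookup: keys are the wall-clock minute (since the
/// local Unix epoch) at which the associated offset stops applying.
fn offset_at(periods: &BTreeMap<i32, i32>, minutes: i32) -> i32 {
    // The offset in force at `minutes` is the one in the first period whose
    // end time is strictly greater than `minutes`.
    *periods
        .range((minutes + 1)..)
        .next()
        .expect("map must contain an i32::MAX sentinel")
        .1
}

fn main() {
    let mut periods = BTreeMap::new();
    periods.insert(20_930_520, 64); // this offset applied until minute 20_930_520
    periods.insert(21_129_240, 88);
    periods.insert(i32::MAX, 64); // sentinel: the currently active offset

    assert_eq!(offset_at(&periods, 0), 64);
    assert_eq!(offset_at(&periods, 21_000_000), 88);
}
```

Whether the boundary minute itself belongs to the old or the new period is an assumption here; the real data struct defines that convention.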
Issue: This is the opposite of `MetazonePeriodV1`, and in your data structs you end up with a bunch of `i32::MAX` instead of `0`, which will make the two data structs more difficult to merge. Is there a reason you did it this way?
TZDB defines its data like this.
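For reference, TZDB zone lines each carry an optional UNTIL column, so every row states when its offsets *stop* applying rather than when they start. A made-up excerpt for illustration (the zone name and values below are not real TZDB data):

```
# Zone  NAME          STDOFF   RULES  FORMAT  [UNTIL]
Zone    Example/City  0:12:12  -      LMT     1891 Mar 16
                      1:00     -      CET     1940 Jun 14
                      2:00     EU     EE%sT
```

The final continuation line has no UNTIL, meaning it is still in force, which is where the `i32::MAX` sentinel comes from.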
If you can do it in 20-30 minutes, I think I'd prefer storing them the same way as metazones. If you can't do it in that amount of time, we'll just leave it as a follow-up issue.
I tried and it's very messy. The rules are not guaranteed to be ordered, so you have to assemble everything first and then convert end times to start times. For that conversion you need to keep track of the offset at that point. Not a 20-minute thing.
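In outline, the conversion might look like the following (a hypothetical sketch; the real TZDB rules also require tracking the offset in force at each boundary, which this omits):

```rust
/// Hypothetical conversion: turn unordered (end_time, offset) records into
/// start-keyed (start_time, offset) records. The first period is assumed to
/// start at 0 (the local Unix epoch); the last record uses an i32::MAX end
/// sentinel for the currently active offset.
fn ends_to_starts(mut records: Vec<(i32, i32)>) -> Vec<(i32, i32)> {
    // The rules are not guaranteed to be ordered, so sort by end time first.
    records.sort_by_key(|&(end, _)| end);
    let mut out = Vec::with_capacity(records.len());
    let mut start = 0;
    for (end, offset) in records {
        // The offset in force *until* `end` is the one that applies
        // *from* the previous boundary.
        out.push((start, offset));
        start = end;
    }
    out
}

fn main() {
    let end_keyed = vec![(i32::MAX, 88), (100, 64), (200, 72)];
    assert_eq!(ends_to_starts(end_keyed), vec![(0, 64), (100, 72), (200, 88)]);
}
```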
OK, let's file a follow-up issue to merge the two markers; then we'll have to do this when we merge them.
```rust
#[cfg_attr(feature = "datagen", databake(path = icu_timezone::provider))]
#[cfg_attr(feature = "serde", derive(serde::Deserialize))]
#[yoke(prove_covariance_manually)]
pub struct ZoneOffsetPeriodV1<'data>(
```
Observation: `MetazonePeriodV1` has

```json
"aqcas": {
  "0": "auwe",
  "20930040": "case",
  "21128580": "auwe",
  "21995640": "case",
  "22164060": "auwe",
  "24617760": "case",
  "25345020": "auwe",
  "25647600": "case",
  "25879200": "auwe",
  "26168820": "case",
  "26393280": "auwe",
  "26695681": "case",
  "26927340": "auwe",
  "27219841": "case",
  "27451500": "auwe",
  "27744001": "case",
  "27971520": "auwe"
},
```
And `ZoneOffsetPeriodV1` has

```json
"aqcas": {
  "20930520": [64, 0],
  "21129240": [88, 0],
  "21996120": [64, 0],
  "22164060": [88, 0],
  "24618240": [64, 0],
  "25345680": [88, 0],
  "25648080": [64, 0],
  "25879860": [88, 0],
  "26169300": [64, 0],
  "26393940": [88, 0],
  "26696161": [64, 0],
  "26928000": [88, 0],
  "27220321": [64, 0],
  "27452160": [88, 0],
  "27744481": [64, 0],
  "27972180": [88, 0],
  "2147483647": [64, 0]
},
```
which seem highly correlated. There are some that match exactly, such as `22164060`. There are others that are virtually the same, such as `21128580` and `21129240`, differing by 11 hours, which seems more likely to be a rounding / calculation / data error than an actual difference in when the two events occurred (metazone switch and offset switch).
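The 11-hour figure checks out, since the keys in both data structs are minutes:

```rust
fn main() {
    // Keys in both data structs are minutes since the local Unix epoch.
    let metazone_switch = 21_128_580; // from MetazonePeriodV1
    let offset_switch = 21_129_240; // from ZoneOffsetPeriodV1
    let diff_minutes = offset_switch - metazone_switch;
    assert_eq!(diff_minutes, 660);
    assert_eq!(diff_minutes / 60, 11); // 11 hours
}
```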
@nekevss and @nordzilla may want to review.
```
# tzdb data for Africa and environs
```
Now that you are downloading the TZDB from the GitHub tag, you don't need the tests/data/tzdb directory, right?
Now I need it, unless you want `cargo test` to download it.
Oh, it's needed for the same reason that we have a subset of CLDR JSON in the repo?
yes
#5466
Supersedes #5515