Fix problems computing feature density for overviews creation #258
Conversation
Fixes #257. The limit is 23 because 23 = 32 bits per integer - 1 sign bit - 8 bits px/tile.
* The initial iteration of the recursive feature density query shouldn't start beyond the maximum level
* Correct the off-by-one limit of the rest of the iterations
Note that _CDB_Feature_Density_Ref_Z_Strategy returns the first level for which overviews should not be used, and that in some cases _CDB_Feature_Density should look beyond the max level to compute a feature density estimate.
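The bit-budget arithmetic behind the 23-level limit can be sketched as follows (a minimal check, assuming 256-pixel tiles and signed 32-bit tile-pixel coordinates, as described above):

```python
# At zoom z, a global pixel coordinate spans 2**z tiles * 256 px/tile,
# i.e. it needs z + 8 bits. A signed 32-bit integer offers 31 value bits,
# so the highest representable zoom is 31 - 8 = 23.
INT_BITS = 32
SIGN_BITS = 1
PX_BITS_PER_TILE = 8  # 256 px/tile = 2**8

max_zoom = INT_BITS - SIGN_BITS - PX_BITS_PER_TILE
print(max_zoom)  # 23

# Sanity check: the largest pixel coordinate at zoom 23 still fits in a
# signed 32-bit integer, while zoom 24 would overflow it.
assert (2 ** max_zoom) * 256 - 1 <= 2 ** 31 - 1
assert (2 ** (max_zoom + 1)) * 256 - 1 > 2 ** 31 - 1
```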
So, if we create overviews for level 23, the base table will never be used by Mapnik (images and utfgrid). Is this a feature or a bug?
If the bug side wins, we could make the maximum overview level 22, or extend
The thing is, why don't we use a real math function? https://github.com/mapbox/postgis-vt-util/blob/master/src/Z.sql
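The linked Z.sql derives a zoom level from a scale denominator with a logarithm instead of iterating. A hedged Python sketch of the same idea (the function name, the zoom-0 scale constant, and the rounding choice are assumptions for illustration, not taken from this repo):

```python
import math

# Approximate web-mercator scale denominator at zoom 0 (256 px tiles);
# assumed constant, chosen only to illustrate the closed-form approach.
ZOOM0_SCALE = 559082264.028

def zoom_for_scale(scale_denominator: float) -> int:
    """Return the integer zoom whose scale is closest to the given denominator.

    Each zoom level halves the scale denominator, so the zoom is simply
    log2 of the ratio to the zoom-0 scale, rounded to the nearest integer.
    """
    return round(math.log2(ZOOM0_SCALE / scale_denominator))

print(zoom_for_scale(559082264.028))            # 0
print(zoom_for_scale(559082264.028 / 2 ** 14))  # 14
```

A closed-form function like this avoids the recursive query entirely, but note it answers a different question (zoom for a rendering scale) than the feature density estimate discussed here.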
If the table had x and/or y columns, they were picked by an inner select instead of the recursive arguments. Fixes #256
Regarding using a formula for Z: there's a PR for that: #197 (but this doesn't return zoom levels). I'll take a look to see if it might have any implications (especially returning values greater than 23).
I finally found the cause of the floating point arguments error! (#256) The recursive query introduces names which are used (unqualified), among other places, in an inner subquery from the table. In this case, if the table has columns with the same names, those references must be qualified.
Good catch!
The maximum
@rafatower please review |
@@ -1,3 +1,29 @@
-- Maximum zoom level for which overviews may be created
CREATE OR REPLACE FUNCTION _CDB_MaxOverviewLevel()
Why not change the internal representation of coordinates to bigint, to be able to hold them up to z = 31? Functions such as CDB_XYZ_Extent and others could have bigint counterparts. Both can coexist, with the integer versions marked as "deprecated" and made to issue a warning.
By using bigints we could raise the maximum overview level from 23 to 29 or 30 (it'd be pointless to support higher levels because using such levels would cause problems elsewhere).
I hadn't thought of that approach, basically because I think having up to 23 overview levels should be enough for most cases, but I can do a quick check to see if it (a) is easy to implement and (b) doesn't have a performance impact (the feature density query is already quite slow).
We still have to determine if the fix for #257 fixes #256 too (in all the cases I've tried, the error is like the first problem: integer too large, and not like the second: cast to float8).