Table row statistics not stable for hash partitioned table #33920

Closed
mjonss opened this issue Apr 13, 2022 · 2 comments
Labels: severity/moderate, sig/planner, type/bug

Comments


mjonss commented Apr 13, 2022

Bug Report

Please answer these questions before submitting your issue. Thanks!

1. Minimal reproduce step (Required)

Looping over this test will eventually result in different partition row counts for table t1 (a shortened re-check loop is sketched after the SQL below):

DROP DATABASE IF EXISTS TICASE_2577;
CREATE DATABASE TICASE_2577;
USE TICASE_2577;

create table t1 (id int) partition by hash(id) partitions 10; 
create table t2 (id int) partition by range(id) (
partition p0 values less than (10),
partition p1 values less than (20),
partition p2 values less than (30),
partition p3 values less than (maxvalue));
create table t3 (id int) partition by range columns (id) (
partition p0 values less than (10),
partition p1 values less than (20),
partition p2 values less than (30),
partition p3 values less than (maxvalue));

insert into t1 values (1),(2),(3),(7),(11),(22),(23),(34),(45),(66),(72),(89),(97),(100);
insert into t2 values (1),(2),(3),(7),(11),(22),(23),(34),(45),(66),(72),(89),(97),(100);
insert into t3 values (1),(2),(3),(7),(11),(22),(23),(34),(45),(66),(72),(89),(97),(100);

analyze table t1; 
analyze table t2; 
analyze table t3; 

select PARTITION_NAME,PARTITION_METHOD,PARTITION_EXPRESSION,TABLE_ROWS from information_schema.partitions where table_name = 't1' and table_schema = 'TICASE_2577';
select PARTITION_NAME,PARTITION_METHOD,PARTITION_EXPRESSION,TABLE_ROWS from information_schema.partitions where table_name = 't2' and table_schema = 'TICASE_2577';
select PARTITION_NAME,PARTITION_METHOD,PARTITION_EXPRESSION,TABLE_ROWS from information_schema.partitions where table_name = 't3' and table_schema = 'TICASE_2577';
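
To watch for the drift without recreating the schema each time, it may be enough to repeat only the analyze/check pair below against the already-loaded data. This is a sketch of a shortened loop, not the exact loop used in the report, and it has not been verified to trigger the instability on its own:

-- Re-collect statistics and re-read the reported row counts for t1.
-- If the bug triggers, TABLE_ROWS for one or more partitions changes between
-- iterations (e.g. roughly doubling, as shown in section 3 below).
analyze table t1;
select PARTITION_NAME, TABLE_ROWS
  from information_schema.partitions
 where table_name = 't1' and table_schema = 'TICASE_2577';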

2. What did you expect to see? (Required)

select PARTITION_NAME,PARTITION_METHOD,PARTITION_EXPRESSION,TABLE_ROWS from information_schema.partitions where table_name = 't1' and table_schema = 'TICASE_2577';
PARTITION_NAME	PARTITION_METHOD	PARTITION_EXPRESSION	TABLE_ROWS
p0	HASH	`id`	1
p1	HASH	`id`	2
p2	HASH	`id`	3
p3	HASH	`id`	2
p4	HASH	`id`	1
p5	HASH	`id`	1
p6	HASH	`id`	1
p7	HASH	`id`	2
p8	HASH	`id`	0
p9	HASH	`id`	1

3. What did you see instead (Required)

select PARTITION_NAME,PARTITION_METHOD,PARTITION_EXPRESSION,TABLE_ROWS from information_schema.partitions where table_name = 't1' and table_schema = 'TICASE_2577';
PARTITION_NAME	PARTITION_METHOD	PARTITION_EXPRESSION	TABLE_ROWS
p0	HASH	`id`	2
p1	HASH	`id`	4
p2	HASH	`id`	6
p3	HASH	`id`	4
p4	HASH	`id`	2
p5	HASH	`id`	2
p6	HASH	`id`	2
p7	HASH	`id`	4
p8	HASH	`id`	0
p9	HASH	`id`	2

We have also seen this in our automated tests:

select PARTITION_NAME,PARTITION_METHOD,PARTITION_EXPRESSION,TABLE_ROWS from information_schema.partitions where table_name = 't1' and table_schema = 'TICASE_2577';
PARTITION_NAME	PARTITION_METHOD	PARTITION_EXPRESSION	TABLE_ROWS
p0	HASH	`id`	1
p1	HASH	`id`	4
p2	HASH	`id`	6
p3	HASH	`id`	4
p4	HASH	`id`	1
p5	HASH	`id`	1
p6	HASH	`id`	1
p7	HASH	`id`	4
p8	HASH	`id`	0
p9	HASH	`id`	1
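
In TiDB, TABLE_ROWS in information_schema.partitions is an estimate derived from the table statistics, so the inflated values can also be cross-checked against the statistics metadata directly. A minimal sketch (assuming the default SHOW STATS_META output columns, which can be filtered in the WHERE clause):

-- Per-partition row counts as stored in the statistics subsystem; if these match the
-- inflated TABLE_ROWS above, the problem is in the collected stats rather than in the
-- information_schema presentation layer.
show stats_meta where db_name = 'TICASE_2577' and table_name = 't1';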

4. What is your TiDB version? (Required)

tidb_version(): Release Version: v6.1.0-alpha-173-g32b9c14779
Edition: Community
Git Commit Hash: 32b9c14779c2a7dd73003667d81bb42f67a33385
Git Branch: HEAD
UTC Build Time: 2022-04-12 23:40:29
GoVersion: go1.18
Race Enabled: false
TiKV Min Version: v3.0.0-60965b006877ca7234adaced7890d7b029ed1306
Check Table Before Drop: false
mjonss added the type/bug label on Apr 13, 2022
xuyifangreeneyes (Contributor) commented:

The bug is known. #22934 explains the root cause.


mjonss commented Apr 20, 2022

Closing as duplicate of #22934.

mjonss closed this as completed on Apr 20, 2022