Having discussed the importance of real-time statistics, and how they help the optimizer produce better estimates, in the previous blog post, it is time to look more deeply into the feature. In this blog post we will see how real-time statistics behave with respect to the base statistics in the data dictionary.
demo@PDB19> create table t(
2 n1 number primary key,
3 n2 number ,
4 n3 varchar2(80) );
Table created.
demo@PDB19> insert into t(n1,n2,n3)
2 with rws as ( select rownum n from dual connect by level <= 1000 )
3 select n, ceil(sqrt(n)),
4 to_char(to_date(n,'j'),'jsp')
5 from rws;
1000 rows created.
demo@PDB19> commit;
Commit complete.
So we have created a table and loaded it with a sample data set using conventional-path DML.
demo@PDB19> select num_rows,blocks,stale_stats,notes
2 from user_tab_statistics
3 where table_name ='T';
  NUM_ROWS     BLOCKS STALE_S NOTES
---------- ---------- ------- -------------------------
demo@PDB19> select column_name,num_nulls,num_distinct,low_value,high_value,notes
2 from user_tab_col_statistics
3 where table_name ='T';
no rows selected
demo@PDB19>
We don't have base statistics in the data dictionary, since the load above was neither a direct-path load nor did we make any calls to the dbms_stats package.
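For contrast, a direct-path insert into an empty segment would have triggered online statistics gathering (the 12c feature) and populated base statistics at load time. A minimal sketch, assuming we had started from an empty table:

```sql
-- Hypothetical contrast: a direct-path load into an EMPTY table
-- triggers online statistics gathering, unlike the
-- conventional-path insert above.
insert /*+ APPEND */ into t(n1,n2,n3)
with rws as ( select rownum n from dual connect by level <= 1000 )
select n, ceil(sqrt(n)),
       to_char(to_date(n,'j'),'jsp')
from rws;

commit;

-- Base statistics would then appear, typically flagged
-- with NOTES = 'STATS_ON_LOAD'
select num_rows, blocks, notes
from   user_tab_statistics
where  table_name = 'T';
```

Since our table already contains rows, that path is not available here, which is exactly what makes this demo interesting.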
Now let's do a few more loads into this table.
demo@PDB19> insert into t(n1,n2,n3)
2 with rws as ( select rownum+1000 n from dual connect by level <= 10000 )
3 select n, ceil(sqrt(n)),
4 to_char(to_date(n,'j'),'jsp')
5 from rws;
10000 rows created.
demo@PDB19> commit;
Commit complete.
And the explain plan for such a conventional-path insert shows this:
demo@PDB19> explain plan for
2 insert into t(n1,n2,n3)
3 with rws as ( select rownum+1000 n from dual connect by level <= 1000 )
4 select n, ceil(sqrt(n)),
5 to_char(to_date(n,'j'),'jsp')
6 from rws;
Explained.
demo@PDB19> @xplan
PLAN_TABLE_OUTPUT
-----------------------------------------------------------------------------------------
Plan hash value: 761049541
-----------------------------------------------------------------------------------------
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
-----------------------------------------------------------------------------------------
| 0 | INSERT STATEMENT | | 1 | 13 | 2 (0)| 00:00:01 |
| 1 | LOAD TABLE CONVENTIONAL | T | | | | |
| 2 | OPTIMIZER STATISTICS GATHERING | | 1 | 13 | 2 (0)| 00:00:01 |
| 3 | VIEW | | 1 | 13 | 2 (0)| 00:00:01 |
| 4 | COUNT | | | | | |
|* 5 | CONNECT BY WITHOUT FILTERING| | | | | |
| 6 | FAST DUAL | | 1 | | 2 (0)| 00:00:01 |
-----------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
5 - filter(LEVEL<=1000)
18 rows selected.
The OPTIMIZER STATISTICS GATHERING step in the plan above shows that real-time statistics are being collected during conventional DML, yet we still don't see any real-time statistics in the dictionary:
demo@PDB19> select num_rows,blocks,stale_stats,notes
2 from user_tab_statistics
3 where table_name ='T';
  NUM_ROWS     BLOCKS STALE_S NOTES
---------- ---------- ------- -------------------------
demo@PDB19> select column_name,num_nulls,num_distinct,low_value,high_value,notes
2 from user_tab_col_statistics
3 where table_name ='T';
no rows selected
demo@PDB19>
Even after flushing the monitoring info explicitly:
demo@PDB19> exec dbms_stats.flush_database_monitoring_info;
PL/SQL procedure successfully completed.
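The flush does record the DML activity itself; it just doesn't turn it into dictionary statistics. A quick way to see that (a sketch; the exact counts depend on your session) is to check the DML monitoring view:

```sql
-- After flush_database_monitoring_info, the monitored DML for
-- our inserts should be visible here, even though no real-time
-- statistics were stored in user_tab_statistics.
select table_name, inserts, updates, deletes
from   user_tab_modifications
where  table_name = 'T';
```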
demo@PDB19> select num_rows,blocks,stale_stats,notes
2 from user_tab_statistics
3 where table_name ='T';
  NUM_ROWS     BLOCKS STALE_S NOTES
---------- ---------- ------- -------------------------
demo@PDB19> select column_name,num_nulls,num_distinct,low_value,high_value,notes
2 from user_tab_col_statistics
3 where table_name ='T';
no rows selected
demo@PDB19>
So just because statistics are gathered during the DML doesn't mean they are stored. If there are no base statistics on the underlying table, real-time statistics never kick in.
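To confirm that, we can seed base statistics with dbms_stats and then repeat a conventional-path load; subsequent DML should then produce real-time statistics, flagged in the NOTES column. A sketch of that follow-up (in 19c, STATS_ON_CONVENTIONAL_DML is the NOTES value reported for real-time statistics):

```sql
-- Gather base statistics first ...
exec dbms_stats.gather_table_stats(user, 'T');

-- ... then another conventional-path load
-- (rownum+11000 keeps n1 clear of the primary key values already loaded)
insert into t(n1,n2,n3)
with rws as ( select rownum+11000 n from dual connect by level <= 1000 )
select n, ceil(sqrt(n)),
       to_char(to_date(n,'j'),'jsp')
from rws;

commit;

exec dbms_stats.flush_database_monitoring_info;

-- Real-time statistics should now show up as an additional row
-- with NOTES = 'STATS_ON_CONVENTIONAL_DML'
select num_rows, blocks, notes
from   user_tab_statistics
where  table_name = 'T';
```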