Wednesday, October 21, 2020

How to transfer SQL Plan baselines from one database to another

A customer called with a request to transfer a specific SQL plan from production to test. The query was found in the shared pool:
SELECT TO_CHAR(LAST_ACTIVE_TIME, 'dd.mm.yyyy hh24:mi') "last active",
       SQL_ID,
       PLAN_HASH_VALUE,
       SQL_PLAN_BASELINE "BASELINE", 
       SQL_PROFILE,
       IS_RESOLVED_ADAPTIVE_PLAN "AdaptivePlan",
       CHILD_NUMBER AS "Child Num",
       ELAPSED_TIME,
       EXECUTIONS,
       ROUND(ELAPSED_TIME/1000000) "duration (sec)"
       ,CASE WHEN EXECUTIONS > 0 THEN
        ROUND( (ELAPSED_TIME/EXECUTIONS)/1000000, 1)
        ELSE
            NULL
       END "sec per exe"
FROM V$SQL 
WHERE SQL_ID in('5cbuqgt5m5961')
ORDER BY LAST_ACTIVE_TIME DESC;
last active       SQL_ID         PLAN_HASH_VALUE  BASELINE                        SQL_PROFILE                   AdaptivePlan  Child Num  ELAPSED_TIME  EXECUTIONS  duration (sec)  sec per exe
21.10.2020 13:34  5cbuqgt5m5961  3389950398       SQL_PLAN_ds58srfg86uf4f5a21fb1  SYS_SQLPROF_014fb0ca24980001                1          210182320     1402169     210             0

In the test database, the same query has neither a plan in the SMB nor a SQL Profile. Running the query above against the test database gave this result:
last active       SQL_ID         PLAN_HASH_VALUE  BASELINE  SQL_PROFILE  AdaptivePlan  Child Num  ELAPSED_TIME  EXECUTIONS  duration (sec)  sec per exe
21.10.2020 10:47  5cbuqgt5m5961  1355560190                                            0          110508458900  27032       110508          4,1
We can see that the query needs both the SQL Plan Baseline and the SQL Profile to perform well.
Oracle provides a way to export the metadata needed to stabilize the query plan and import it into another database.
SQL Plan Baselines are exported and imported using the package dbms_spm.
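Before exporting, it can be useful to look at the plan actually stored in the baseline. A minimal sketch, using the plan name found above and DBMS_XPLAN.DISPLAY_SQL_PLAN_BASELINE:
SELECT *
FROM   TABLE(DBMS_XPLAN.DISPLAY_SQL_PLAN_BASELINE(
               plan_name => 'SQL_PLAN_ds58srfg86uf4f5a21fb1',
               format    => 'TYPICAL'));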

Here is what I did to transfer the SQL Plan from one database to another.
I am connecting to the databases as a privileged user called dbadmin throughout this post. In part two of this post, I will show you how I transferred the SQL Profile from one database to another.

1. On the source database, create a staging table for SQL Plan Baselines:
vi 1.cre_stagetab.sql
Add the following:
connect dbadmin
BEGIN
  DBMS_SPM.CREATE_STGTAB_BASELINE (
    table_name => 'MYSTAGETAB');
END;
/
Execute the file:
sqlplus /nolog @1.cre_stagetab.sql
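If you want to confirm that the staging table was created in the dbadmin schema, a quick check against USER_TABLES will do:
SELECT table_name
FROM   user_tables
WHERE  table_name = 'MYSTAGETAB';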
Still in the source database, check who the owner of the baseline is:
SELECT plan_name,
         sql_handle,
         creator,
         origin,
         TO_CHAR (created, 'dd.mm.yyyy hh24:mi') "created",
         TO_CHAR (last_executed, 'dd.mm.yyyy hh24:mi') "lastexecuted",
         enabled,
         accepted,
         fixed,
         autopurge,
         module,
         ROUND (ELAPSED_TIME / 1000000) "duration in seconds",
         cpu_time,
         buffer_gets,
         disk_reads
    FROM DBA_SQL_PLAN_BASELINES
where plan_name='SQL_PLAN_ds58srfg86uf4f5a21fb1';
The output shows that the owner is SYS:
PLAN_NAME                       SQL_HANDLE            CREATOR  ORIGIN                created           lastexecuted      ENABLED  ACCEPTED  FIXED  AUTOPURGE  MODULE            duration in seconds  CPU_TIME  BUFFER_GETS  DISK_READS
SQL_PLAN_ds58srfg86uf4f5a21fb1  SQL_dc1518bb9e8369c4  SYS      MANUAL-LOAD-FROM-STS  30.03.2020 15:13  18.10.2020 03:13  YES      YES       YES    YES        JDBC Thin Client  8                    3897100   597270       715

2. Create the file 2.pack_stagetab.sql:
vi 2.pack_stagetab.sql
Add the following:
conn dbadmin
alter session set nls_language='american';

DECLARE
  v_plan_cnt NUMBER;
BEGIN
  v_plan_cnt := DBMS_SPM.PACK_STGTAB_BASELINE (
    table_name => 'MYSTAGETAB'
,   enabled    => 'yes'
,   creator    => 'SYS'
,   plan_name  => 'SQL_PLAN_ds58srfg86uf4f5a21fb1'
);
END;
/
exit
Run the file:
sqlplus /nolog @2.pack_stagetab.sql
Let's verify that the staging table has some rows in it:
connect dbadmin
select count(*)
from  MYSTAGETAB;
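If you want more detail than a row count, and assuming the standard staging-table layout (the exact column names can vary slightly between versions), a sketch like this lists which plans were packed:
SELECT sql_handle, obj_name
FROM   MYSTAGETAB;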
3. Create an export parameter file:
vi 3.expdp.par
Add the following. Make sure the directory DP exists and is writable (see the check after the parameter file):
userid=dbadmin
tables=MYSTAGETAB
logfile=expdp_MYSTAGETAB.log
dumpfile=MYSTAGETAB.dmp
job_name=exp_MYSTAGETAB
directory=DP
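If you are unsure whether the directory object DP exists and where it points, a quick check against DBA_DIRECTORIES before running the export:
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DP';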
Run the export:
expdp parfile=3.expdp.par
When the export is done, transfer the file to the destination server using ssh.
At this point, log on to the destination server.

4. Create an import parameter file:
vi 4.impdp.par
Add the following. Make sure the directory DP exists and is writable:
userid=dbadmin
tables=MYSTAGETAB
logfile=impdp_MYSTAGETAB.log
dumpfile=MYSTAGETAB.dmp
job_name=imp_MYSTAGETAB
directory=DP
Run the import:
impdp parfile=4.impdp.par
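Before unpacking, verify that the staging table arrived with its rows intact; the same count used on the source database works here:
connect dbadmin
select count(*)
from  MYSTAGETAB;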
5. Create the file for unpacking the SQL Plans to the local SMB:
vi 5.unpack.sql
Add the following:
conn dbadmin
alter session set nls_language='american';
set serveroutput on
DECLARE
  v_plan_cnt NUMBER;
BEGIN
  v_plan_cnt := DBMS_SPM.UNPACK_STGTAB_BASELINE (
    table_name => 'MYSTAGETAB'
    ,plan_name  => 'SQL_PLAN_ds58srfg86uf4f5a21fb1'
);
  DBMS_OUTPUT.put_line('Number of plans loaded: ' || v_plan_cnt);
END;
/
exit
Run the file:
sqlplus /nolog @5.unpack.sql
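To confirm that the baseline is now present in the destination SMB, run the same query against DBA_SQL_PLAN_BASELINES that was used on the source; a minimal version:
SELECT plan_name, sql_handle, creator, origin, enabled, accepted, fixed
FROM   DBA_SQL_PLAN_BASELINES
WHERE  plan_name = 'SQL_PLAN_ds58srfg86uf4f5a21fb1';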
You should now be able to see a different execution plan when executing the query at the top of this article:
last active       SQL_ID         PLAN_HASH_VALUE  BASELINE  SQL_PROFILE  AdaptivePlan  Child Num  ELAPSED_TIME  EXECUTIONS  duration (sec)  sec per exe
21.10.2020 13:19  5cbuqgt5m5961  1355560190                                            1          20850937263   1488        20851           14
21.10.2020 12:55  5cbuqgt5m5961  1355560190                                            0          112190741129  27134       112191          4,1
The performance actually got worse! The reason is that we are still missing the SQL Profile. In the next post, I will show how I transferred the SQL Profile to the same server.
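As a side note, a quick way to see whether the optimizer actually picks up the transferred baseline (and, after the next post, the SQL Profile) is the Note section of DBMS_XPLAN.DISPLAY_CURSOR; a sketch for the cursor currently in the shared pool:
SELECT *
FROM   TABLE(DBMS_XPLAN.DISPLAY_CURSOR('5cbuqgt5m5961', NULL, 'TYPICAL'));
The Note section at the bottom of the output states whether a SQL plan baseline or a SQL profile was used for the statement.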
