You may need to partition an existing table to improve its performance.
Need for Table Partitioning:
1. Improve maintainability.
2. Improve the performance of both DML and SELECT operations.
Methods for Partitioning an existing table:
There are various methods for partitioning an existing table.
1. DBMS_REDEFINITION
2. Export/Import
Here we will cover the Export/Import method.
Export and Import is one of the oldest techniques for partitioning an existing table.
The steps are simple; let's walk through an example.
Situation:
There is a huge table, big_table, which frequently undergoes updates, direct reads, and inserts.
As the table grows larger and larger, performance degrades, because updating or selecting even a small subset of the data forces a scan of the whole table.
This is where partitioning comes in: split the huge table into chunks (partitions) and fetch the required data from the relevant chunk instead of scanning the whole table.
Step 1: Export the huge table (let's say it is about 30 GB):
expdp apps/apps directory=EXP_IMP dumpfile=big_table.dmp logfile=big_table_exp.log Tables=big_table
Step 2: Make a note of BIG_TABLE's structure, as we need to recreate it.
The structure includes the constraints, grants, indexes built on it, etc.
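One way to capture the structure is with the DBMS_METADATA package, which returns the DDL as a CLOB. A minimal sketch (schema and table names here match the example; adjust to your environment):

```sql
-- Capture the table DDL so it can be recreated later with partitioning added.
SET LONG 100000 PAGESIZE 0

SELECT DBMS_METADATA.GET_DDL('TABLE', 'BIG_TABLE', 'APPS') FROM dual;

-- Capture the DDL of the indexes built on the table.
SELECT DBMS_METADATA.GET_DEPENDENT_DDL('INDEX', 'BIG_TABLE', 'APPS') FROM dual;
```

Save this output; you will edit the CREATE TABLE statement in Step 4 to add the partitioning clause.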
Step 3: Drop the table BIG_TABLE:
drop table apps.big_table cascade constraints;
Step 4: Recreate the table with the required partitions.
Use whichever partitioning strategy suits your access patterns: single-level (range, list, or hash) or composite.
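As an illustration, a range-partitioned version of the table might look like the following. The column list here is hypothetical; the real statement must reproduce BIG_TABLE's actual columns, data types, and constraints from Step 2:

```sql
-- Hypothetical column list: replace with the real BIG_TABLE structure.
-- Range partitioning on a date column is a common choice for large,
-- steadily growing tables.
CREATE TABLE apps.big_table (
  id           NUMBER         NOT NULL,
  created_date DATE           NOT NULL,
  status       VARCHAR2(20),
  payload      VARCHAR2(4000)
)
PARTITION BY RANGE (created_date) (
  PARTITION p_2010 VALUES LESS THAN (TO_DATE('01-JAN-2011','DD-MON-YYYY')),
  PARTITION p_2011 VALUES LESS THAN (TO_DATE('01-JAN-2012','DD-MON-YYYY')),
  PARTITION p_max  VALUES LESS THAN (MAXVALUE)
);
```

Queries that filter on created_date can then be satisfied by scanning only the matching partitions (partition pruning) instead of the whole table.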
Step 5: Recreate all the indexes that were built on top of BIG_TABLE.
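On a partitioned table you can often recreate the indexes as LOCAL, so each index partition covers exactly one table partition and can be maintained independently. A sketch, assuming the hypothetical columns above:

```sql
-- LOCAL index: one index partition per table partition,
-- maintained independently per partition.
CREATE INDEX apps.big_table_created_idx
  ON apps.big_table (created_date) LOCAL;

-- Recreate the constraints noted in Step 2 as well.
ALTER TABLE apps.big_table
  ADD CONSTRAINT big_table_pk PRIMARY KEY (id);
```

Whether each index should be LOCAL or GLOBAL depends on your queries; unique indexes that do not include the partition key generally have to remain GLOBAL.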
Step 6: Import the table BIG_TABLE from the dump:
impdp apps/apps directory=EXP_IMP dumpfile=big_table.dmp logfile=big_table_imp.log Tables=big_table Table_exists_action=APPEND
The import will complete with one warning stating that the table already exists. This can be ignored, since we created the table ourselves.
Note:
If the TABLE_EXISTS_ACTION=APPEND option is not given, Data Pump falls back to its default of SKIP: the existing table is skipped and no rows are inserted.
APPEND instructs Oracle to load the data into the table even though the object already exists.
For more information on the Export/Import utility you can follow the below link:
http://www.oracle-base.com/articles/10g/OracleDataPump10g.php
After partitioning BIG_TABLE, you should see a clear improvement in both DML and read performance, provided your queries filter on the partition key so that partition pruning can kick in.
Best practices:
1. Never do this directly in your production environment.
2. Test in a DEV/TEST environment with roughly the same volume of data.
3. Record how long each step takes; this is useful for planning the production run.
4. Measure the performance before and after partitioning.