Suppose you have a huge table containing a column with very low cardinality,
and one value of that column accounts for most of the rows (let's say 99%).
For example:
CREATE TABLE missions
(
  id     NUMBER,
  name   VARCHAR2(100),
  status NUMBER,
  <more columns>
);
SELECT status, COUNT(1)
FROM missions
GROUP BY status;

STATUS   COUNT(1)
======   ========
     1      20000
     2      30000
     3   50000000
The application wants to select all the missions with status 1,
so it's obvious that we need an index with status as the leading column.
However, a normal index on (status, id, name) indexes every row of the table, even though we will NEVER use it to select status = 3. That wastes storage, and maintaining such a heavy index also hurts performance.
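For contrast, the heavyweight index described above would be created like this (the index name is illustrative):

```sql
-- Indexes all 50M+ rows, including the ~99% with status = 3
-- that we will never query through this index.
CREATE INDEX missions_sts_all ON missions (status, id, name);
```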
The solution
Using a function-based index, together with the fact that Oracle does not create an index entry when all the indexed expressions are NULL, we can create this nice index:
CREATE INDEX missions_sts_1 ON missions
(
  DECODE(status, 1, 1, NULL),
  DECODE(status, 1, id, NULL),
  DECODE(status, 1, name, NULL)
);
(and a very similar index for status=2).
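The companion index for status = 2 mentioned above would look the same with 2 substituted for 1 (the index name is illustrative):

```sql
-- Only rows with status = 2 produce non-NULL key values,
-- so only those ~30,000 rows are stored in the index.
CREATE INDEX missions_sts_2 ON missions
(
  DECODE(status, 2, 2, NULL),
  DECODE(status, 2, id, NULL),
  DECODE(status, 2, name, NULL)
);
```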
This index will weigh about 1% of the original index, and that's a huge difference.
Just don't forget to use in your queries the exact expressions as they are written in the index definition:
SELECT DECODE(status, 1, id, NULL)
FROM missions
WHERE DECODE(status, 1, 1, NULL) = 1;
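To confirm that Oracle actually picks the function-based index, you can inspect the execution plan; a typical check looks like this (the exact plan output will vary by environment and statistics):

```sql
-- Generate and display the plan for the query above;
-- you would expect to see an access path using MISSIONS_STS_1.
EXPLAIN PLAN FOR
SELECT DECODE(status, 1, id, NULL)
FROM missions
WHERE DECODE(status, 1, 1, NULL) = 1;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```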
Good luck!