I was working with a customer today who had brought in a new Oracle source into their OBIEE physical layer. The source had several tables with date columns that used the DATE datatype, and these tables were then used to create a simple dimensional model, like the one below.
Everything worked well until they created a request with a filter on the date column. Remember that the column in Oracle is a DATE datatype, yet the filter added by Answers, shown below, adds a TIMESTAMP to the predicate.
Now this becomes an issue if you’ve got an index defined on the date column. If you let Answers generate the query as it is, you end up with SQL looking like this:
```
-------------------- Logical Request (before navigation):

RqList  distinct
    Items.Order ID as c1 GB,
    Items.Order Date as c2 GB
DetailFilter: Items.Order Date = TIMESTAMP '2007-04-12 00:00:00.000'
OrderBy: c1 asc, c2 asc

+++Administrator:480000:48000b:----2009/03/06 21:27:26

-------------------- Sending query to database named orcl2 (id: <<309826>>):

select distinct T6622.ORDID as c1,
     T6631.ORDERDATE as c2
from
     ITEMS T6622,
     ORDERS T6631
where  ( T6622.ORDID = T6631.ORDID
     and T6631.ORDERDATE = TIMESTAMP '2007-04-12 00:00:00' )
order by c1, c2
```
which means that your index won’t get used unless you create a corresponding function-based index on the date column. It also defeats partition elimination, which can seriously impact the response time of your query.
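To sketch what’s going on: the table and column names (ORDERS, ORDERDATE) below are taken from the query log above, while the index names themselves are made up, and the function-based index is just one possible form of the workaround mentioned:

```sql
-- A plain b-tree index on the DATE column:
CREATE INDEX orders_orderdate_ix ON orders (orderdate);

-- Because the predicate compares ORDERDATE to a TIMESTAMP literal,
-- Oracle implicitly converts the column rather than the literal, so
-- the index above gets skipped. A corresponding function-based index
-- on the column cast to TIMESTAMP is one way to match the converted
-- predicate:
CREATE INDEX orders_orderdate_fbix ON orders (CAST(orderdate AS TIMESTAMP));
```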
The answer to this problem is to go back to the physical model in Oracle BI Administrator, and change the DATETIME datatype that OBIEE assigns to Oracle DATE datatypes by default, to an OBIEE DATE datatype, like this:
This altered datatype then propagates through to the business model and mapping layer, and then to the presentation layer of the model. After restarting the Oracle BI Server just to be on the safe side and reloading the server metadata in Oracle BI Answers, I went to create the same report. To my surprise, though, the filter prompt still added a timestamp to my choice of date.
Cue several hours of fiddling with column formulas in Answers, wrapping the filter value in a CAST function and so on, all of which only partially solved the problem and usually ended up with either the predicate or the table column being wrapped in a function call, which again cancelled out the index. In the end, we simply tried dropping the column from the business model and mapping layer, and then adding it back in again from the physical layer.
Now, after adding the new column to the presentation layer, saving the repository and then reloading the server metadata in Answers, when I create a new request in Answers the filter prompt just uses the date on its own.
Then, if I take a look at the SQL in the query log, we can see that the predicate is just a date. Checking the execution plan afterwards shows that the index is indeed being used.
```
-------------------- Logical Request (before navigation):

RqList  distinct
    Items.Order ID as c1 GB,
    Items.ORDERDATE as c2 GB
DetailFilter: Items.ORDERDATE = DATE '2007-04-12'
OrderBy: c1 asc, c2 asc

+++Administrator:300000:300005:----2009/03/06 21:46:19

-------------------- Sending query to database named orcl2 (id: <<12826>>):

select distinct T6622.ORDID as c1,
     T6631.ORDERDATE as c2
from
     ITEMS T6622,
     ORDERS T6631
where  ( T6622.ORDID = T6631.ORDID
     and T6631.ORDERDATE = TO_DATE('2007-04-12' , 'YYYY-MM-DD') )
order by c1, c2
```
So the moral of the story is: if you’re bringing in Oracle data with DATE datatypes, make sure you correct the DATETIME datatypes that the import process assigns to them in the physical model, otherwise you’ll hit this same problem with superfluous TIMESTAMPs. If you’ve already built your logical model before you find this issue, make sure you delete and then re-add the date columns after you correct the physical model datatype, otherwise Answers will carry on using TIMESTAMPs even if you change the repository model and refresh the Answers metadata. Simple once you know how, but it took us a good few hours to sort out.
The story didn’t end there, though. The customer had also set up a session variable that held, for each user, yesterday’s date. They populated this within an initialization block that selected against the SYSDATE function, truncating it and subtracting 1 to return yesterday’s date.
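The initialization block SQL isn’t shown here, but based on that description it would look something like this (DUAL being Oracle’s one-row dummy table):

```sql
-- TRUNC(SYSDATE) strips the time portion, leaving midnight today;
-- subtracting 1 then gives yesterday's date.
SELECT TRUNC(SYSDATE) - 1
FROM   dual;
```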
As you can see, again by default OBIEE assigns it a DATETIME datatype, which will cause the same problem as before if we try and use it in a filter. No amount of TO_DATE or CAST as DATE around the SQL would turn it into a plain old DATE datatype, so in the end we wrapped a TO_CHAR around it and converted it into the correct CHAR format for our database’s NLS_DATE_FORMAT setting, like this:
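A sketch of that TO_CHAR workaround follows; the DD-MON-YYYY format mask is an assumption based on the '06-MAR-2009' literal that appears in the query log further down:

```sql
-- Convert yesterday's date to a character string whose format matches
-- the target database's default date format, so that Oracle can
-- implicitly convert it back to a DATE in the generated predicate.
SELECT TO_CHAR(TRUNC(SYSDATE) - 1, 'DD-MON-YYYY')
FROM   dual;
```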
Now, if we include the session variable in a filter, like this:
The resulting SQL generated by the BI Server includes this CHAR version of the date directly in the query, which Oracle then implicitly converts to a date datatype because we’re using the correct date format for the database’s NLS_DATE_FORMAT setting.
```
-------------------- Logical Request (before navigation):

RqList  distinct
    Items.Order ID as c1 GB,
    Items.ORDERDATE as c2 GB
DetailFilter: Items.ORDERDATE = '06-MAR-2009'
OrderBy: c1 asc, c2 asc

+++Administrator:300000:300006:----2009/03/06 21:58:47

-------------------- Sending query to database named orcl2 (id: <<12934>>):

select distinct T6622.ORDID as c1,
     T6631.ORDERDATE as c2
from
     ITEMS T6622,
     ORDERS T6631
where  ( T6622.ORDID = T6631.ORDID
     and T6631.ORDERDATE = '06-MAR-2009' )
order by c1, c2
```
Checking the execution plan for this new query shows it is indeed using the index we created. Now obviously this is a “dangerous thing”, as we’re assuming that our database, or more correctly the database session, will always keep using this same NLS_DATE_FORMAT setting, but given that I can’t see any other way of getting the BI Server to store the date session variable as a date rather than a date and time, it’s a working solution that solved the immediate problem.
So, a day of fiddling around with dates, and thanks to John Minkjan, Venkat and Christian Berg who chipped in with some advice. If anyone else faces the same issue, hopefully this posting will be of use.