
Simplify Aggregate Management with the Aggregate Persistence Wizard

In my previous post, Using Aggregates and Combining Data Sources in OBIEE, I explained how to add aggregates to the physical layer and how to include them in the business model. In general that process is quite straightforward, but what if there are many aggregates that have to be brought into your repository? In that case, using the Aggregate Persistence Wizard to create the aggregates might be a better idea.

So let's explore this tool, which is available in the BI Administration Tool.

Before running the Aggregate Persistence Wizard, you might want to run a Global Consistency Check and a model check to verify that there are no errors or inconsistencies that could affect the creation of the aggregates. There shouldn't be any.

Once you have run both checks and resolved any issues, you can run the Aggregate Persistence Wizard. To start the six-step process, navigate to the Tools menu in the BI Administration Tool and select Utilities.

When the Utilities pop-up window opens, select Aggregate Persistence from the list of available utilities.



Click Execute to start the wizard.



In the first step, the script name and location have to be specified. The final result of the Aggregate Persistence Wizard is a script that has to be executed using the command-line utility nqcmd.



In the second step, you have to select the measures that you would like to include in your aggregated fact table. You can select the whole fact table or individual measures.



The third step is about selecting the logical dimensions and levels of aggregation. In the example above, the measures selected in the second step are to be aggregated by Customer Segment of the H Customers dimension, by Year of the H Time dimension, and so on.



The fourth step is actually very important: this is where you specify where the aggregated data will be stored. Aggregates can be stored in a different database from the source. For example, you might want to store them in the Oracle in-memory database on Exalytics and obtain super fast responses to your queries. There are practically no limitations on where you can put your aggregates, which I believe is one of Oracle Business Intelligence's strongest points.

To complete the wizard, the generated script has to be reviewed and confirmed.



Now you can continue with the definition of other aggregates, or you can simply finish the process and review and run the generated script. For our simple example, the following script has been generated:

create aggregates
"ag_Fact_Sales"
for "SIOUG Business Model PAW"."Fact Sales"
   ("Sales Revenue", "Units Sold", "Fixed Costs", "Variable Costs", "# of Orders")
at levels (
   "SIOUG Business Model PAW"."H Customers"."Customer Segment",
   "SIOUG Business Model PAW"."H Offices"."Company",
   "SIOUG Business Model PAW"."H Products"."Brand",
   "SIOUG Business Model PAW"."H Time"."Year")
using connection pool "SIOUG"."Demo Conn Pool"
in "SIOUG".."SIOUG";

As mentioned above, the generated create aggregates script has to be executed using the nqcmd command-line utility.

The nqcmd command-line utility can be found at this location:



The command that executes the script on the BI Server should look like this:

nqcmd -d training_obiee -u weblogic -p Admin123 -s create_agg_w_apw.sql -utf16
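In this command, -d is the BI Server DSN, -u and -p are the user name and password, -s is the script to execute, and -utf16 sets the character encoding of the script file. When troubleshooting, it can also help to capture the output in a file with the -o option; the log file name below is just an illustrative choice:

nqcmd -d training_obiee -u weblogic -p Admin123 -s create_agg_w_apw.sql -o create_agg_w_apw.log -utf16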

After successful completion, there should be several changes in your repository and database. There should be new aggregate tables in your database schema, as specified in the connection pool step of the wizard, and there should be changes in the RPD, where all required objects are created in both the physical layer and the business model.

In the database schema you specified, you can check the new tables that were generated and populated with aggregated data. You can observe that the table and column names were auto-generated, but are still understandable.



There are changes in the RPD as well. You can review them in the physical layer, where all the newly created tables have been imported and modelled.



And all the required objects (logical dimensions, logical table sources, logical levels) have also been created in the business model.



In other words, everything that previously had to be done manually has now been done automatically.

Now that everything is created, it should work fine. But what about refreshing the data every night, or at some other interval?

There is a way to set up an incremental refresh process using the BI tools alone (if there is no other way). Using Job Manager, you can set up jobs that are triggered at the required interval and append new data to the aggregates.

Navigate to this folder first:



Open nqquery.log and search for BEGIN: Create Aggregates:
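If you prefer the command line on a Unix-like system, grep can find the section quickly; this assumes your shell is already in the directory that contains nqquery.log:

grep -n "BEGIN: Create Aggregates" nqquery.log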



The Create Aggregates section in the log is rather long. The part you should find looks like this:



If you take that whole part of the log and modify it a bit, you can use it in an ETL process. For example, like this:

SET VARIABLE DISABLE_CACHE_HIT=1, DISABLE_CACHE_SEED=1,
 DISABLE_SUMMARY_STATS_LOGGING=1, INACTIVE_SCHEMAS='':
populate "ag_Fact_Sales" mode (append table connection pool "SIOUG"."Demo Conn Pool") as
select_business_model
   "SIOUG Business Model PAW"."Dim Customers"."Customer Segment ID" as "Customer_S000015B4",
   "SIOUG Business Model PAW"."Dim Offices"."Company ID" as "Company_ID000015B7",
   "SIOUG Business Model PAW"."Dim Products"."Brand ID" as "Brand_ID000015C0",
   "SIOUG Business Model PAW"."Dim Time"."Year" as "Year000015C8",
   "SIOUG Business Model PAW"."Fact Sales"."Sales Revenue" as "Sales_Reve000015CF",
   "SIOUG Business Model PAW"."Fact Sales"."Units Sold" as "Units_Sold000015D0",
   "SIOUG Business Model PAW"."Fact Sales"."Fixed Costs" as "Fixed_Cost000015D1",
   "SIOUG Business Model PAW"."Fact Sales"."Variable Costs" as "Variable_C000015D2",
   "SIOUG Business Model PAW"."Fact Sales"."# of Orders" as "Z_of_Order000015D3"
from "SIOUG Business Model PAW"
where "SIOUG Business Model PAW"."Dim Time"."Year" = '2015';

Of course, this is just an example which you need to adapt to your specific requirements, but that's the idea of how you could set up an incremental refresh; for scheduling it, you would need a tool like OBIEE's Job Manager.
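One refinement you might consider is to avoid hardcoding the year in the WHERE clause. The sketch below is the same populate statement, but it references a dynamic repository variable via VALUEOF; the variable name CURRENT_YEAR is hypothetical and would have to be defined in your RPD:

SET VARIABLE DISABLE_CACHE_HIT=1, DISABLE_CACHE_SEED=1,
 DISABLE_SUMMARY_STATS_LOGGING=1, INACTIVE_SCHEMAS='':
populate "ag_Fact_Sales" mode (append table connection pool "SIOUG"."Demo Conn Pool") as
select_business_model
   "SIOUG Business Model PAW"."Dim Customers"."Customer Segment ID" as "Customer_S000015B4",
   "SIOUG Business Model PAW"."Dim Offices"."Company ID" as "Company_ID000015B7",
   "SIOUG Business Model PAW"."Dim Products"."Brand ID" as "Brand_ID000015C0",
   "SIOUG Business Model PAW"."Dim Time"."Year" as "Year000015C8",
   "SIOUG Business Model PAW"."Fact Sales"."Sales Revenue" as "Sales_Reve000015CF",
   "SIOUG Business Model PAW"."Fact Sales"."Units Sold" as "Units_Sold000015D0",
   "SIOUG Business Model PAW"."Fact Sales"."Fixed Costs" as "Fixed_Cost000015D1",
   "SIOUG Business Model PAW"."Fact Sales"."Variable Costs" as "Variable_C000015D2",
   "SIOUG Business Model PAW"."Fact Sales"."# of Orders" as "Z_of_Order000015D3"
from "SIOUG Business Model PAW"
where "SIOUG Business Model PAW"."Dim Time"."Year" = VALUEOF("CURRENT_YEAR");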

Finally, let's check and confirm that everything works fine.

Create the following query:



When run, this query produces the following (partial) log entry:

WITH SAWITH0 AS (select distinct
     T5907.Units_Sold000015D0 as c1,
     T5907.Sales_Reve000015CF as c2,
     T5876.Customer_S000015B3 as c3,
     T5883.Company000015B6 as c4,
     T5890.Brand000015BF as c5,
     T5897.Year000015C8 as c6,
     T5876.Customer_S000015B4 as c7,
     T5883.Company_ID000015B7 as c8,
     T5890.Brand_ID000015C0 as c9
from SA_Year000015F6 T5897,
     SA_Brand000015EF T5890,
     SA_Company000015EA T5883,
     SA_Custome000015E5 T5876,
     ag_Fact_Sales T5907
where ( T5883.Company_ID000015B7 = T5907.Company_ID000015B7
    and T5876.Customer_S000015B4 = T5907.Customer_S000015B4
    and T5890.Brand_ID000015C0 = T5907.Brand_ID000015C0
    and T5897.Year000015C8 = T5907.Year000015C8 ) ),...

As you can verify, the tables generated by the Aggregate Persistence Wizard are used.


And if you try to drill down on any of the columns, the base tables are used instead:


WITH SAWITH0 AS (select sum(T5346.UNITS) as c1, 
     sum(T5346.REVENUE) as c2,     
     T5337.SEGMENT_DESC as c3,     
     T5340.COMPANY as c4,     
     T5340.ORGANIZATION as c5,    
     T5343.BRAND as c6,     
     T5355.PER_NAME_YEAR as c7,
     T5337.SEGMENT_KEY as c8,
     T5340.ORG_KEY as c9,
     T5343.BRAND_KEY as c10
from
     SAMP_TIME_DAY_D_AGG T5355 /* D_Time_AGG */,
     SAMP_PRODUCTS_D_AGG T5343 /* D_Products_AGG */ , 
     SAMP_OFFICES_D_AGG T5340 /* D_Offices_AGG */ ,   
     SAMP_CUSTOMERS_D_AGG T5337 /* D_Customers_AGG */ ,     
     SAMP_REVENUE_F_AGG T5346 /* F_Revenue_AGG */ 
where ...

As you can see, the end user is not aware of any of the data source combinations and mashups that actually happen behind the scenes, and all reports will run unchanged. Just faster.

In my next blog post I'm going to explore how other data sources can be used in combination with relational ones. In particular, we will look into Essbase, a true MOLAP source, and combine data from an Oracle database with data coming from Oracle Essbase.

