Thursday, August 25, 2011

OBIEE Publisher 11g - Performance Monitoring and User Auditing

BI Publisher performance monitoring enables you to monitor the performance of queries, reports, and document generation and to analyze the details provided. User auditing provides information about which users logged in, when, how many times, what reports they accessed, and other actions they took within the application.


Enabling Monitoring and Viewing the Audit Log


To enable monitoring:
1. Update properties in the BI Publisher server configuration file.
2. Copy the component_events.xml file to your Middleware Home.
3. Configure the Audit Policy Settings with Fusion Middleware Control (Enterprise Manager).
4. Restart WebLogic Server.

To view the audit log:
5. Configure the Audit Repository.
6. Create a Data Source in WebLogic Server.
7. Register the Audit-Storing Database to your Domain.
8. Create Auditing Reports.

1. Update Properties in the BI Publisher Server Configuration File

Three properties in the configuration file ‘xmlp-server-config.xml’ need to be updated. The default location of the file is “config/bipublisher/repository/Admin/Configuration/xmlp-server-config.xml”.
Set MONITORING_ENABLED = “true” and AUDIT_ENABLED = “true”, and add a new property, AUDIT_JPS_INTEGRATION = “true”. Below is a sample xmlp-server-config.xml file, before and after the update.
Before update:
<xmlpConfig xmlns="http://xmlns.oracle.com/oxp/xmlp">
   <property name="SAW_SERVER" value=""/>
   <property name="SAW_SESSION_TIMEOUT" value="90"/>
   <property name="DEBUG_LEVEL" value="exception"/>
   <property name="SAW_PORT" value=""/>
   <property name="SAW_PASSWORD" value=""/>
   <property name="SAW_PROTOCOL" value="http"/>
   <property name="SAW_VERSION" value="v4"/>
   <property name="SAW_USERNAME" value=""/>
   <property name="MONITORING_ENABLED" value="false"/>
   <property name="AUDIT_ENABLED" value="false"/>
</xmlpConfig>
After update:
<xmlpConfig xmlns="http://xmlns.oracle.com/oxp/xmlp">
   <property name="SAW_SERVER" value=""/>
   <property name="SAW_SESSION_TIMEOUT" value="90"/>
   <property name="DEBUG_LEVEL" value="exception"/>
   <property name="SAW_PORT" value=""/>
   <property name="SAW_PASSWORD" value=""/>
   <property name="SAW_PROTOCOL" value="http"/>
   <property name="SAW_VERSION" value="v4"/>
   <property name="SAW_USERNAME" value=""/>
   <property name="MONITORING_ENABLED" value="true"/>
   <property name="AUDIT_ENABLED" value="true"/>
   <property name="AUDIT_JPS_INTEGRATION" value="true"/>
</xmlpConfig>
2. Copy the component_events.xml file to your Middleware Home
a. Create a new directory “xmlpserver” at MIDDLEWARE_HOME/oracle_common/modules/oracle.iau_11.1.1/components
b. Copy the existing file “component_events.xml” from MIDDLEWARE_HOME/user_projects/domains/bifoundation_domain/config/bipublisher/repository/Admin/Audit to the new MIDDLEWARE_HOME/oracle_common/modules/oracle.iau_11.1.1/components/xmlpserver directory.
3. Configure the Audit Policy Settings with Fusion Middleware Control
a. Log in to Oracle Fusion Middleware Control.
b. Under WebLogic Domain, right-click bifoundation_domain, choose Security, and click Audit Policy.
c. To set the Audit Level for BI Publisher, choose Medium from the Audit Level drop-down menu, or choose Custom to enable individual components. For this exercise we choose Medium.
4. Restart WebLogic Server
Restart the WebLogic Server using Fusion Middleware Control.
After the restart, the audit information will be available in /AdminServer/logs/auditlogs/xmlpserver/audit.log. Run a few reports and notice the audit entries in the file. Next, we can create and configure an audit repository to store the logs in database tables instead of log files; monitoring reports can then be built over these tables.
5. Configure Audit Repository
a. Create the audit schema using RCU (Repository Creation Utility, a separate package required along with the OBIEE 11.x package).
   i. Run rcu.bat from RCU_HOME/bin.
   ii. Follow the installation wizard and select Audit Services as shown below.
   iii. Continue the wizard and click Finish.
   iv. As shown below, three schemas will be created for audit by the installation:
1. OBI_IAU
2. OBI_IAU_APPEND
3. OBI_IAU_VIEWER
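To confirm that RCU created the schemas, a quick check such as the following can be run as a DBA user (a minimal sketch; adjust the schema names if you chose a prefix other than OBI):
-- Verify that the RCU-created audit schemas exist
select username, created
from dba_users
where username in ('OBI_IAU', 'OBI_IAU_APPEND', 'OBI_IAU_VIEWER');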
6. Create a Data Source in WebLogic Server
To access the database schemas created above, we need to create a JDBC data source on the WebLogic Server so that the Audit Framework can reach the schemas. Following are the steps to create a JDBC connection using the Oracle WebLogic Server Administration Console:
a. Log in to the WebLogic Server Administration Console.
b. Under Services, click Data Sources.
c. Click the Lock & Edit button to enable editing.
d. On the Summary of JDBC Data Sources page, click New and click Generic Data Source.
e. Enter the following details for the new data source:
   i. Name (example: BIP_AuditDB)
   ii. JNDI Name (example: jdbc/AuditDB)
   iii. Database Type (example: Oracle)
f. Click Next and select the database driver “Oracle's Driver (Thin XA) Versions: 9.0.1 or later” for the Oracle database.
g. Enter the required connection properties such as the database name, host name, port, database user name (for our exercise it is OBI_IAU), and password.
h. Click Next, accept the default settings, and then click the Test Configuration button as shown below.
i. If the connection is successful, click Activate Changes to make the changes available.
7. Register the Audit-Storing Database to your Domain
a. Log in to Fusion Middleware Control.
b. Navigate to the WebLogic Domain, right-click bifoundation_domain, then select Security, then Audit Store.
c. Click Search Data Sources. From the Select Data Source dialog, select the data source you created and click OK.
d. Click Apply and restart the WebLogic Server.
Note: After the restart, BI Publisher will store all auditing data in the database table called "IAU_BASE". To verify this procedure, try logging in to BI Publisher and opening a few reports. You should see the activity audited in the "IAU_BASE" table. If not, check the log file, which is located at: $BI_HOME/user_projects/domains/bifoundation_domain/servers/AdminServer/logs/AdminServer-diagnostic.log.
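As an additional sanity check, a query along the following lines (a sketch, assuming the OBI_IAU schema and select privileges on it) should show events accumulating as you open reports:
-- Recent BI Publisher audit events recorded by the Audit Framework
select IAU_EVENTTYPE, IAU_INITIATOR, IAU_TSTZORIGINATING
from OBI_IAU.IAU_BASE
where IAU_COMPONENTTYPE = 'xmlpserver'
order by IAU_TSTZORIGINATING desc;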
With the above data in IAU_BASE, you can design auditing reports using BI Publisher.
8. Create Auditing Reports
Following are the steps in brief, assuming the user already knows how to create reports in BI Publisher.
  a. Create data source in BI Publisher
      Register the audit data source (the JNDI/JDBC connection you created in the previous procedure) as a JNDI data source in BI Publisher. Because the JDBC connection is already registered in JNDI, you do not need to create a new JDBC connection by typing the connection URL, username/password, and so on. You can simply register it using the JNDI name (for example: jdbc/AuditDB).
           i. Log in to BI Publisher with administrator privileges and click the Administration link.
           ii. Under Data Sources, click JNDI Connection, and then click Add Data Source.
           iii. Enter the Data Source Name and JNDI Name. The JNDI Name is the name you provided in the WebLogic Console as the auditing data source (for example: jdbc/AuditDB).
           iv. Click Test Connection to ensure that the data source connection works.
           v. Add the appropriate roles to the data source so that the report developers and consumers can view the reports built on this data source. Click Apply to save.
b. Create (advanced) auditing reports for OBIEE version 11.1.1.5
With version 11.1.1.5, we can use the xmlpserver_audit.sql script to create a new table, XMLPSERVER. Following are the steps:
   i. Locate the table creation script xmlpserver_audit.sql at /user_projects/domains/bifoundation_domain/config/bipublisher/repository/Admin/Audit.
   ii. Edit the PERMISSIONS and SYNONYMS sections of the script as shown below.
   (Before and after views of the edited script sections are shown in screenshots, not reproduced here.)
   iii. Execute the updated script to create the new table in the OBI_IAU schema that you created earlier. You can log in as SYS and alter the session to execute against the OBI_IAU schema (see the sketch after this list).
   iv. Restart the WebLogic Server.
   v. Create a few sample reports in BI Publisher and notice the auditing information in the new XMLPSERVER table, which will be used for building advanced BI Publisher auditing reports.
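For step iii above, a minimal SQL*Plus sketch might look like this (assuming the default script location from step i; adjust the path for your installation):
-- Run as SYS or another DBA account
alter session set current_schema = OBI_IAU;
-- Execute the edited table-creation script
@/user_projects/domains/bifoundation_domain/config/bipublisher/repository/Admin/Audit/xmlpserver_audit.sql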
c. Create a data model
   i. Create a new data model with JNDI as the default data source.
   ii. Build or apply the following query using the IAU_BASE table from the OBI_IAU schema.
select
IAU_BASE.IAU_COMPONENTTYPE as IAU_COMPONENTTYPE,
IAU_BASE.IAU_EVENTTYPE as IAU_EVENTTYPE,
IAU_BASE.IAU_EVENTCATEGORY as IAU_EVENTCATEGORY,
IAU_BASE.IAU_TSTZORIGINATING as IAU_TSTZORIGINATING,
to_char(IAU_TSTZORIGINATING, 'YYYY-MM-DD') as IAU_DATE,
to_char(IAU_TSTZORIGINATING, 'DAY') as IAU_DAY,
to_char(IAU_TSTZORIGINATING, 'HH24') as IAU_HH24,
to_char(IAU_TSTZORIGINATING, 'WW') as IAU_WEEK_OF_YEAR,
IAU_BASE.IAU_INITIATOR as IAU_INITIATOR,
IAU_BASE.IAU_RESOURCE as IAU_RESOURCE,
IAU_BASE.IAU_TARGET as IAU_TARGET,
IAU_BASE.IAU_MESSAGETEXT as IAU_MESSAGETEXT,
IAU_BASE.IAU_FAILURECODE as IAU_FAILURECODE,
IAU_BASE.IAU_REMOTEIP as IAU_REMOTEIP
from
OBI_IAU.IAU_BASE IAU_BASE
where
IAU_BASE.IAU_COMPONENTTYPE = 'xmlpserver'
To create a data model that contains only BI Publisher data, filter on the IAU_COMPONENTTYPE column, which holds the product name; for BI Publisher, the value is "xmlpserver", as in the WHERE clause above.
   iii. Test the sample with Get XML Output and save the XML to your data model.
   iv. Save the data model.
d. Create the reports
Use known techniques for creating BI Publisher layouts; detailed layout creation is out of scope for this exercise.
   i. Using the layout options, create a new BI Publisher layout.
   ii. Select the data model you created in the previous procedure.
   iii. Click Add New Layout and click Base Templates. Following is a sample report from the layout editor (from Oracle).
   iv. Below are a few sample auditing reports with the data from the newly created schemas. Use both the IAU_BASE and XMLPSERVER tables to build advanced, detailed reports.
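As a starting point for such advanced reports, the two tables can be joined. The sketch below assumes the component-specific XMLPSERVER table links to IAU_BASE through an IAU_ID key; verify the actual columns (for example with DESCRIBE) before building reports on it:
-- Illustrative join of the common and component-specific audit tables
select b.IAU_TSTZORIGINATING,
       b.IAU_INITIATOR,
       b.IAU_EVENTTYPE,
       x.*
from OBI_IAU.IAU_BASE b
join OBI_IAU.XMLPSERVER x on x.IAU_ID = b.IAU_ID
where b.IAU_COMPONENTTYPE = 'xmlpserver';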

Friday, June 24, 2011

DAC Reporting in OBIEE

The purpose of this blog is to show how to consume data in the DAC repository via OBIEE reports.

What is DAC?

Oracle Business Intelligence Data Warehouse Administration Console (Oracle DAC) provides a centralized console for schema management, configuration, administration, loading, and monitoring of the Oracle Business Analytics Warehouse. DAC is used to create and schedule Informatica ETL workflows. It manages the following data warehouse load processes:
  • Dynamic generation of subject areas and execution plans
  • Dynamic settings for parallelism and load balancing
  • Intelligent task queue engine based on user-defined and computed scores
  • Index management for ETL and query performance
  • Embedded high-performance OLTP change capture techniques
  • Restart from any point of failure
DAC / ETL Statistics
The metadata for ETL is stored in the DAC repository, which is composed of approximately 120 tables. These tables store ETL metadata such as tasks, task steps, execution plan details, connection details, run history, etc.
In this exercise we are only interested in the run history table, W_ETL_DEFN_RUN. This table holds the ETL execution details for every execution plan that was run through DAC (see screenshot below).

We will be walking you through the steps for creating a simple solution that will allow you to analyze the metadata in W_ETL_DEFN_RUN in OBIEE.
RPD
Below is a simple RPD model that is based on the single W_ETL_DEFN_RUN table from the DAC repository.
Reports
Using the subject area (Statistics – ETL) shown above, you can build useful reports to analyze the execution history (for each execution plan type). A sample starting query follows the list below.
1. Current month summary view with duration in minutes & hours.
2. Snapshot of run duration for each month.
Notice the failed steps in orange below.
3. Report showing Error / Status description with number of steps – Total, Failed, and Success.
4. Compare ETL runs over a longer period (2010 and 2011).
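As a starting point for reports like these, a simple query over W_ETL_DEFN_RUN can expose run durations and statuses. The column names below (NAME, START_TS, END_TS, STATUS) are assumptions based on typical DAC repositories; verify them against your repository version before use:
-- Run history with duration in minutes (column names are assumed)
select NAME,
       START_TS,
       END_TS,
       round((cast(END_TS as date) - cast(START_TS as date)) * 24 * 60) as DURATION_MIN,
       STATUS
from W_ETL_DEFN_RUN
order by START_TS desc;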
By adding other repository tables into your RPD you will be able to develop more sophisticated reports. You can even have error reports delivered to you with iBots whenever ETL errors occur.

Friday, June 17, 2011

Informatica - Beyond ETL

Informatica has long been one of the business intelligence / data warehouse industry leaders in data integration. Its PowerCenter and PowerExchange products have consistently been positioned in the leader quadrant for data integration by Gartner. However, Informatica is not resting on its laurels. Since 2009, Informatica has gone on an acquisitions rampage, buying up best-in-breed companies with similar data-centric DNA while at the same time continuing to innovate and broaden its base products. The Informatica Platform now enables you to integrate data within the traditional enterprise (Enterprise Data Integration), control data in the cloud (Cloud Data Integration), share data with your partners (B2B Data Exchange), archive data for efficiency and eDiscovery (Application ILM), and deliver data at zero latency (Ultra Messaging). To unlock the business value of all this data, the platform enables you to cleanse data and gain trustworthy data (Data Quality), to detect and act on operational intelligence (Complex Event Processing), and to gain a 360-degree view of your strategic, authoritative master data assets (Master Data Management).

Informatica’s CEO stated that “only Informatica delivers the comprehensive, unified and open data integration platform to empower the data-centric enterprise.”


Clearly, Informatica has moved beyond ETL to embrace these key data-centric technologies.

Acquisitions since January 2009:
  • Applimation (Jan. 2009) – Leader in Application Information Life Cycle Management. Helping Customers manage their data assets from cradle to grave:
      • Data Archive for storing inactive and rarely accessed data on a lower-cost storage medium.
      • Data Privacy for masking identifying and identity data to remain in compliance with today’s stringent personal information and privacy laws.
      • Data Subset for creating purpose-built data sets for environment (DEV/QA/SYSTEM) or functional (CRM/ERP/FINANCE/HR) based testing.
  • Address Doctor (Mid-2009) – Leader in Worldwide Name and Address Validation for Over 200 Countries. Providing Customers with the highest quality name and address content for name and address standardization and validation.
  • Agent Logic (Oct. 2009) – Leader in Complex Event Processing and Operational Intelligence. Helping Customers institutionalize event detection patterns and human analysis patterns into automated opportunistic action and response alerts.
  • Siperian (Jan. 2010) – Leader in Master Data Management. Helping Customers achieve mastery of a single view of ‘X’ across all mission critical data domains.

Wednesday, June 8, 2011

Data Modeling: Schema Generation Issue with ERwin Data Modeler 7.3

We are using Computer Associates' ERwin Data Modeler 7.3 for data modeling. In one of our engagements, we are pushing data model changes to the physical database (Oracle 11g R2) via ERwin's Forward Engineering / Schema Generation functionality.



The Forward Engineering / Schema Generation wizard offers the "Physical Order" option under "Column". Checking this box is supposed to force the ERwin DDL generation engine to preserve the column order (i.e., the column order in the physical database must match the column order in the logical/physical model).

Problem: Even though our data modelers pay very close attention to the column order in the logical and physical model in ERwin, the Forward Engineering / Schema Generation process seems to lose the physical column order when generating the DDL (even though the "Physical Order" option is checked). Thus, the column order in the Oracle database never matches with the column order in the ERwin data model.

Solution / Work-Around: This is a bug; the behavior of the option appears to be inverted. If you want to preserve the column order, the "Physical Order" option must be un-checked.
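To illustrate what preserving physical order means, consider a hypothetical CUSTOMER table modeled with the column sequence below; with the work-around applied, the generated DDL keeps exactly that sequence:
-- Hypothetical example: the generated DDL should match the modeled column order
create table CUSTOMER (
  CUSTOMER_ID  number(10) not null,
  FIRST_NAME   varchar2(50),
  LAST_NAME    varchar2(50),
  CREATED_DT   date
);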