Disclaimer

All of the topics discussed in this blog come from my real-life encounters. They serve as references for future research. All of the data, content, and information presented in my entries have been altered and edited to protect the confidentiality and privacy of the clients.


Tuesday, August 20, 2013

Oracle DAC interview questions and answers


Oracle DAC is an essential part of BI Apps, yet it is seldom introduced in a systematic training course even though we use it all the time. There can be quite a lot to ask about when it comes to working with DAC, especially during interviews for BI Apps related projects. So I am going to gather some of the common interview questions regarding DAC.

1. Name some of the DAC source system parameters:
$$TYPE2_FLG, $$GLOBAL1_CURR_CODE, $$INITIAL_EXTRACT_DATE, etc. (The goal is just to name a few; of course, nobody remembers the exact spelling.)

2. To configure for an initial full load, what are the things that need to be done?
A. In DAC, set the value for $$INITIAL_EXTRACT_DATE to avoid loading far too much data into the target.
B. To load the base table W_DAY_D, nullify all of the refresh dates to enable a full load. Do the same for all other aggregated time tables like W_WEEK_D, etc. At each task where the day dimension is involved (such as SIL_DayDimension), set the $$START_DATE and $$END_DATE parameter values at the task level to determine how long a period your day dimension should store.
C. If your company has multiple currencies, then you need to configure currency in DAC by assigning currency codes and exchange rate types to DAC parameters like $$GLOBAL1_CURR_CODE and $$GLOBAL1_RATE_TYPE (and their GLOBAL2/GLOBAL3 counterparts). BI Apps supports up to three global currencies.
D. Configure the GL hierarchy so the information is stored in W_HIERARCHY_D. No DAC configuration is needed.
E. DATASOURCE_NUM_ID is a DAC setting that determines which source system the extraction takes place from. In the Physical Data Sources tab under the Setup view, this field can be edited with an integer from 1 to 10 to represent different database sources.

3. Explain how to set up metadata in DAC to load data into the target.

For a basic intro on how DAC works in terms of executing tasks, find out here.

4. How to configure incremental loading in DAC:
A. The refresh dates under the physical data source store the last ETL run time. DAC runs a full load if a refresh date is null; otherwise it runs an incremental load based on the refresh date value.
B. Each task also carries separate commands for full load and incremental load; by setting these at the task level, you can control whether a full or incremental load runs regardless of the refresh date.


-------------------------------------------------------------------------------

Below is a list of questions about DAC found through googling. Since these questions were NOT provided with answers, I have written my own; feel free to read them for your reference:

1. Overall architecture of DAC?
DAC Server and DAC Client. They must be co-located with the Informatica Integration Service, Repository Service, and the Informatica repository.


2. Why should we use DAC rather than control all execution through Informatica?
For better performance management, such as creating indexes, dropping indexes, and truncating tables before load. Without DAC, a custom ETL process would be needed, which would have to survive upgrades.

3. Can we run multiple execution plans at the same time in DAC?
Yes, but only if the execution plans are not loading into the same target table or using the same physical source table.

4. Explain DAC export/import.
A way to import or export DAC repository metadata for upgrade or backup. Logical, system, and run-time objects can be imported/exported.

5. Have you changed any of the DAC parameters? If so, which ones and why?
You have to understand what the DAC parameters are and the purpose of each. For example, $$INITIAL_EXTRACT_DATE can be modified when configuring for the initial full load; its value is used to filter out records from the source that are older than this date.

6. How do you determine the Informatica Server Maximum Sessions parameter setting in DAC?
Once you register the Informatica server in the DAC Client (in the Setup view), you can set the Maximum Sessions value there; it controls how many workflows DAC will run on that server in parallel.

7. Can DAC send an email in case of any failures?
Yes. In the DAC Client, click Email Recipients in the toolbar; then, under Tools --> DAC Server Setup, configure the email settings.

8. Can you execute a SQL script through DAC? If yes, how?

Yes. At the task level, set the Execution Type to SQL File. As a bonus to this answer, this article explains how to run stored procedures in DAC.

9. In DAC, how can you disable table indexes before loading and enable the indexes once the load is complete?
DAC does this by dropping the indexes before the load and recreating them after the load completes (this behavior can be tuned with index actions; see question 22).

10. Let's say you are running the normal incremental load, but just for today you want to extract data from AP_INVOICES_ALL from 12/12/2011. How can you achieve this?

Modify the refresh date on that table to 12/12/2011.


11. How does DAC determine the order of task execution within an execution plan?
Based on each task's source/target tables, its task phase (extract dimension, load fact, etc.), and its 'Truncate Always' property. To force tasks to run in a particular order, create a task group.


12. What are Micro ETL execution plans? How can you build and run them?

According to the Oracle documentation:
Micro ETL execution plans are ETL processes that you schedule at very frequent intervals, such as hourly or half-hourly. They usually handle small subject areas or subsets of larger subject areas. The DAC tracks refresh dates for tables in micro ETL execution plans separately from other execution plans and uses these refresh dates in the change capture process.

In the Design view, Subject Areas tab, create a copy of the subject area, inactivate the unwanted tasks, and create a new execution plan for this subject area.

13. From your past experience, explain a scenario where Micro ETL execution plans produced wrong results on reports.

According to the Oracle documentation:
CAUTION:  Micro ETL processes can cause issues with data inconsistencies, data availability, and additional load on the transactional database. Therefore, you should consider the following factors before implementing a micro ETL process:

For related star schemas, if one schema is omitted from a micro ETL execution plan, the cross-star reports may be inaccurate. For example, if the Person fact table is refreshed more frequently than the Revenue fact table, a report that spans the Person and Revenue dimensional schemas may produce inconsistent results.
If you omit dimension tables from a micro ETL execution plan, the foreign keys for the fact tables will point to Unspecified rows for the new dimension records. The foreign key references will be resolved when the Complete ETL execution plan is run, but users of the reports should be aware of such inconsistencies.
If you do not include aggregate tables in micro ETL execution plans, the reports that use data from these tables will be inconsistent with the reports that use data from the detailed fact tables. However, if aggregate tables are included in the micro ETL execution plan, the aggregate calculations are performed for each ETL process, which will take a constant amount of time and may be inefficient to perform at such frequent intervals.
Hierarchy tables are rebuilt during every ETL execution plan by querying the base dimension tables. This operation takes a constant amount of time. If the base tables are big, this operation may take a long time and may be inefficient if the micro ETL execution plan runs several times a day. However, if you avoid populating the hierarchy tables during micro ETL processes, data inconsistencies will occur.
With micro ETL execution plans, caching will occur more frequently, which may have performance implications.
Micro ETL execution plans will put more load on the transactional database because of the frequent extracts.


14. Let's say you cannot use the DAC scheduler to schedule your execution plan. What other options do you have? How can you achieve this?

Use the Informatica scheduler, or trigger the execution plan externally through the DAC command line (dacCmdLine) from a third-party scheduler such as cron.
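
As a rough illustration of the second option, the ETL kickoff can be scripted and scheduled externally. Note that the dacCmdLine invocation details (its directory, and whether StartETL needs the execution plan name as an argument) vary by DAC version, so treat the line below as an assumption to verify against your documentation:

# Hypothetical crontab entry: kick off the ETL at 1:00 AM every day.
# /path/to/dac is a stand-in for the DAC server's installation directory.
0 1 * * * cd /path/to/dac && ./dacCmdLine StartETL >> /tmp/dac_etl.log 2>&1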

15. Does DAC keep track of refresh dates for all the source/target tables?

According to the Oracle documentation:
Refresh dates are tracked only for tables that are either a primary source or a primary target on tasks in a completed run of an execution plan. The DAC runs the full load command for tasks on which a table is a primary source or target if the refresh date against the table is null. When there are multiple primary sources, the earliest of the refresh dates will trigger a full load or an incremental load. If any one of the primary source tables has no refresh date, then the DAC will run the full load command.


16. Consider the scenario below for task T1:
The primary source has a non-null last refresh date.
The primary target has a null last refresh date.
Will task T1 execute in full or incremental mode?

Based on the answer provided for question 15, what do you think? (Hint: since one of the refresh dates is null, DAC will run the full load command.)


17. Explain the upgrade/merge options for DAC 7.8.4 and below versus newer versions.

Use the Upgrade/Merge Wizard:
1. Repository Upgrade (DAC 784) --- upgrades a DAC repository from release 7.8.4 or lower.
2. Refresh Base --- for upgrading BI Apps to a new release.
3. Simplified Refresh From Base --- similar to the Refresh Base option; it allows you to upgrade the DAC Repository from an older release of Oracle BI Applications to a new release without comparing repositories and creating a Difference Report.
4. Replace Base --- upgrade when phasing out an older transaction system for a newer one.
5. Peer to Peer Merge --- merge different instances of the DAC repository.

18. Using the DAC command line, write a script to check whether the Informatica services are up or not.

Use dacCmdLine InformaticaStatus. (The original post reproduced Oracle's full list of dacCmdLine methods here.)
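
Here is a minimal sketch of such a script, assuming it runs from the DAC server's directory and that the InformaticaStatus output (which varies by DAC version) contains a recognizable failure string; adjust the path and the grep pattern for your environment:

#!/bin/sh
# Rough check of the Informatica services registered in DAC.
cd /path/to/dac                 # hypothetical DAC server directory; adjust
if ./dacCmdLine InformaticaStatus | grep -qi "fail"; then
    echo "One or more Informatica services appear to be DOWN"
    exit 1
else
    echo "Informatica services appear to be UP"
fi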




19. Can we have two DAC servers on the same machine?
Yes, you can run two DAC servers on the same machine as long as they listen on different ports and point to two different repositories.

20. Explain briefly what kinds of DAC repository objects are held in source system containers.

Subject areas -- A logical grouping of tables related to a particular subject or application context, together with the tasks associated with those tables and the tasks required to load them. Subject areas are assigned to execution plans, which can be scheduled for full or incremental loads.

Tables -- Physical tables in the database.

Indexes -- Just like your physical database indexes.

Tasks -- Units of work for loading tables.

Task groups -- Groupings of tasks that can be bundled to run as a group.

Execution plans -- Data transformation plans defined on subject areas that need to be transformed at certain frequencies.

Schedules -- Determine how often an execution plan runs.


21. What is an authentication file? If you have the DAC Client installed, can you access the DAC repository without an authentication file?

According to the Oracle documentation:
When you configure a connection to the DAC Repository, the configuration process includes creating a new authentication file or selecting an existing authentication file. The authentication file authenticates the database in which the repository resides. If you create a new authentication file, you will specify the table owner and password for the database.

So the answer to the second part is no: even with the DAC Client installed, you cannot access the DAC repository without a valid authentication file.

22. Explain index, table, and task actions in DAC.

According to the Oracle documentation:
Index action: overrides the default behavior for dropping and creating indexes.

Table action: overrides the default behavior for truncating and analyzing tables.

Task action: adds new functionality to task behavior, such as a preceding action, success action, failure action, or upon-failure restart.



23. How does DAC handle parameters at runtime?

According to the Oracle documentation:
During an ETL execution, DAC reads and evaluates all parameters associated with that ETL run, including static and runtime parameters defined in DAC, parameters held in flat files, and parameters defined externally to DAC. DAC consolidates all the parameters for the ETL run, deduplicates any redundant parameters, and then creates an individual parameter file for each Informatica session. This file contains the evaluated name-value pairs for all parameters, both static and runtime, for each workflow that DAC executes. The parameter file contains a section for each session under a workflow. DAC determines the sessions under a workflow during runtime by using the Informatica pmrep function ListObjectDependencies.

The naming convention for the parameter file is

....txt

DAC writes this file to a location specified in the DAC system property InformaticaParameterFileLocation. The location specified by the property InformaticaParameterFileLocation must be the same as the location specified by the Informatica parameter property $PMSourcefileDir.

24. How does DAC determine the tasks required for any given subject area?

According to the Oracle documentation:
You define a subject area by specifying a fact table or set of fact tables to be the central table or tables in the subject area. When a subject area is defined, DAC performs the following process to determine the relevant tasks:

DAC identifies the dimension tables associated with the facts and adds these tables to the subject area.

DAC identifies the related tables, such as aggregates, associated with the fact or dimension tables and adds them to the subject area definition.

DAC identifies the tasks for which the dimension and fact tables listed in the two processes above are target tables and adds these tasks to the subject area.

Tasks that DAC automatically assigns to a subject area are indicated with the Autogenerated flag (in the Tasks subtab of the Subject Areas tab).

You can inactivate a task from participating in a subject area by selecting the Inactive check box (in the Tasks subtab of the Subject Areas tab). When the Inactive check box is selected, the task remains inactive even if you reassemble the subject area.

You can also remove a task from a subject area using the Add/Remove command in the Tasks subtab of the subject Areas tab, but when you remove a task it is only removed from the subject area until you reassemble the subject area.

DAC identifies the source tables for the tasks identified in the previous process and adds these tables to the subject area.

DAC performs this process recursively until all necessary tasks have been added to the subject area. A task is added to the subject area only once, even if it is associated with several tables in the subject area. DAC then expands or trims the total number of tasks based on the configuration rules, which are defined as configuration tags. This process can be resource intensive because DAC loads all of the objects in the source system container into memory before parsing.


25. What is the difference between homogeneous and heterogeneous execution plans?

According to the Oracle documentation:

Homogeneous

This type of execution plan extracts data from multiple instances of the same source system. For example, a business might have an instance of Oracle EBS 11i in one location and time zone and another instance of Oracle EBS 11i in another location and time zone. In such cases, the timing of data extraction from the different instances can be staggered to meet your business requirements.

Heterogeneous

This type of execution plan extracts data from one or more instances of dissimilar source systems. For example, a business might have an instance of Siebel 7.8 in one location, an instance of Oracle EBS 11i in another location, and a second instance of Oracle EBS 11i in yet a third location. You can also stagger the timing of data extraction when you use this type of execution plan.

Wednesday, July 31, 2013

Understanding the implementation of value based hierarchy (parent-child) in OBIEE 11G

Hello

As we all know, from 11g onwards, the feature of implementing value-based hierarchies is available. There are several great articles out there that explain the steps of such an implementation in 11g, so I am not going to repeat them. What I want to talk about is more the concept behind value-based hierarchy implementation, which will hopefully answer a question that a lot of beginners reading about this topic may have: "How do I know the right table structure for this type of hierarchy?"

As we know, the employee dimension is a very typical example of a value-based hierarchy, but I don't want you to stop right there. Many other tables can be designed to support value-based hierarchies, for reasons we will touch upon.

Conceptually, we all know what a value-based hierarchy is from looking at sample diagrams of hierarchical trees where branches are labelled with employee names. Now, if you look at a physical table that supports this type of hierarchy, it will look like this:

Employee        Manager        Level (or another name)    Other attributes
Me :)           (none)         8                          phone number, address, etc.
Superlek        Me             7                          ......
Attachai        Superlek       6                          ......
Sombat          Attachai       5                          ......
Dekkers         Sombat         4                          ......
Orono           Dekkers        3                          ......
Yodsanan        Orono          2                          ......
Mick            Yodsanan       1                          ......
Gina            Mick           0                          ......
Decharwin       Me             7                          ......


From the above example, the table clearly stores each employee in relation to his or her immediate manager, in a value-based hierarchical fashion. The level column (whatever it may be named) keeps an integer that indicates the person's level in the hierarchy. This table can have other attributes that handle changes and updates to employees, depending on how the business wants to keep track of historical records. But the idea is pretty clear here.

Now, if you want to redesign this table to store a structure-based hierarchy similar to the time dimension, the table structure will look like this:

Employee    Manager    Sr Manager    Director    ......    CEO
Gina        Mick       Yodsanan      Orono                 Me :)

The reason a structure-based hierarchy isn't the best fit for the employee dimension is that it is very rigid. All of the columns are fixed, and the levels are pretty much hard-coded. In the future, if someone gets promoted, demoted, retires, or is given a new rank that didn't exist before, it becomes very difficult to maintain such records in a structure-based hierarchy. The reason the time dimension is great for a structure-based hierarchy is that the calendar system we use in our daily life doesn't change. In other words, 'February' will always be in the 'Month' column; it will never get 'promoted' to 'Quarter' nor demoted to 'Week'.

Based on this simple concept, you should be able to think of other dimensions that are good candidates for either a structure-based or a value-based hierarchy. The choice can depend on the business process of each individual company or industry, or on determining factors such as pricing-based or promotion-based hierarchies, and so on.

Therefore, when it comes to determining what type of hierarchy should be implemented in OBIEE, the decision should be made well before it gets to OBIEE. It should be made during the design of the project roadmap.

Thanks

Until next time

                    

Saturday, July 20, 2013

Navigating between reports and dashboards when there are no common fields to pass parameters

Hello All

Today I want to talk about navigating between reports and dashboards. As we all know, in 11g this can be done through the action framework: it allows us to click on any particular value of a column with an action link defined and navigate to another dashboard page (BI Content, if we choose that in the action link), with the value we clicked carried over as a filter.

This is a pretty standard feature that has been part of the OBIEE product since it was still called Siebel Analytics. However, for this feature to work as straightforwardly as described, there is a condition: a common field must be used in both reports. In other words, if you want to navigate from report 1 to report 2 by clicking on the values of a column, that column needs to exist in both reports.

This sometimes becomes an issue when a column is technically not the same column, yet it is the same column from the business perspective. We have all used aliases in our data modeling; they give us a lot more flexibility in creating the kinds of reports we want. Aliases of the same physical table can be used in different star and snowflake schemas without causing conflicts. However, these alias columns, although referring to the same physical table, aren't treated as the same column by OBIEE.

Here is the case:

We have a dashboard here where I am going to click on a value of 'Vlan Interface'; the page should then navigate to another dashboard with only that 'Vlan Interface' value filtered.


However, it is not working. The destination report is successfully fetched, but the filter box is empty. This means the value we clicked was passed from the first page but was not received on the destination page.



The reason is that, although the Vlan Interface column and the Interface prompt have the same data and come from the same interface column in the physical table, they are different columns according to OBIEE. They come from different folders and have different names; they simply exist in different places.




I wish OBIEE had the intelligence, or an added feature, to let users define action links that include or exclude navigation based on the actual physical column. Well, until that happens, we just have to work around it.

To make the destination dashboard receive the values passed from the first dashboard, there must be a common field between the two. Since we want to click on 'Vlan Interface' and go to the second dashboard, we will create a dashboard prompt using the 'Vlan Interface' column and place it on the destination dashboard:


In this prompt, we will set the default selection to a SQL statement:

SELECT "Node Interface"."Node Logical Interface" FROM table WHERE RCOUNT("Node Interface"."Node Logical Interface") = 2

This is nothing but setting a default value, so that if you do come to this dashboard by direct access, the prompt has some default value to run against all of the reports there. If you access this page by clicking on a 'Vlan Interface' value from the other dashboard, then the value you clicked will be shown in this prompt after the navigation. That is basically the idea. I chose RCOUNT = 2 simply because I know the first value in my table is 'N/A'.

Set a presentation variable here; we will pass this variable to the existing 'Interface' prompt on this page later.



Now that we have created this prompt, we will put it on the destination dashboard.

The prompt is now on the dashboard, but I want it to work behind the scenes only. I don't want users to see this prompt and the 'Interface' prompt at the same time, so it's just a matter of hiding this section all the time.

Simply create a report that always returns some data and use it as the condition:




Define a condition on the dashboard section (under Condition):

'True if row count = 0' means this section will display only when the row count of this report is 0, which is always going to be false. This implies that the section will always be hidden.



Now, we need to modify the existing 'Interface' prompt so that it is affected by the 'Vlan Interface' prompt we just created.

Here, the idea is to change the default selection of this prompt so that its default value is whatever the 'Vlan Interface' prompt passes:

SELECT "Node Interface"."Node Logical Interface" FROM table WHERE "Node Interface"."Node Logical Interface" = '@{V}'

Here, the presentation variable 'V' holds either the value where RCOUNT = 2 in the physical table, or whatever Vlan Interface value the user clicked on the first page:



Now, let's give it a test. First, we go to this dashboard page by direct access. We see that the 'Vlan Interface' prompt is hidden and the 'Interface' prompt is showing a default interface value; this is good.



Now, let's go back to the first dashboard page to test the navigation part.

Click on Vlan interface 'ETH Vlan 1410 U1/M1/IF0':

We arrive at the destination dashboard with 'ETH Vlan 1410 U1/M1/IF0' being passed successfully:



Everything looks as if nothing extra has been done, which is perfect.

Thanks

Until next time

Saturday, July 6, 2013

Calling a JavaScript function from an OBIEE dashboard

Hello again

This time I am going to show how to call a JavaScript function from an OBIEE dashboard. The script can be written and saved in the common.js file located at:

OBIEE_Folder/user_projects/domains/bifoundation_domain/servers/bi_server1/tmp/_WL_user/analytics_11.1.1/7dezjl/war/res/b_mozilla/common.js

I have written a simple script as follows and saved it at the bottom of the common.js file:

// My stuff: a trivial function that pops up an alert, to verify the custom script is loaded
function obi_popup()
{
    alert("Hello World");
}

So as you can see, the function is named 'obi_popup', and it is supposed to pop up an alert saying 'Hello World' when called.

Once this is done, restart the OBIEE OPMN service.
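
From the command line, that can be done with the same opmnctl commands described in my restart post below:

cd $Your OBIEE folder/instances/instance1/bin
./opmnctl stopall
./opmnctl startall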

Go to the dashboard and add a text section. In it, add markup that calls the function. (The original post showed the exact text in a screenshot; it is essentially an HTML button, something like: <input type="button" value="hi there!" onclick="obi_popup()">.)

Check the 'Contains HTML Markup' box.

This will display a button labelled 'hi there!'; by clicking on it, the JavaScript function 'obi_popup' should be called.


Click and the result:



Thanks,

until next time

Saturday, June 22, 2013

Restarting all OBIEE 11G domain services in Linux and common issues

Hello

Although there are a lot of articles out there about how to restart OBIEE 11G, I find they tend to be unclear to beginners. Trust me, if you have just started using 11G, you will find that there are some common issues you run into when restarting your OBIEE system. So today, I want to get these things straightened out.

Let's start by assuming that your OBIEE is already up and running, and that you have made some configuration-related changes or have to go through some other system maintenance that requires you to shut down OBIEE and start it again. Now, I am not going to explain the why, or how the OBIEE architecture works, as I find that unhelpful to a lot of people, so let's start with the steps.

To Stop OBIEE 11G domain components:
(Duh! restarting means stop, then start)

1. Stop OPMN

In your Linux terminal window, go to the following path:
$Your OBIEE folder/instances/instance1/bin/

Then type:
./opmnctl stopall

2. Stop Managed Server

Go to the following path:

$Your OBIEE folder/user_projects/domains/bifoundation_domain/bin/

Then type
./stopManagedWebLogic.sh bi_server1

Enter the credentials for the WebLogic administrator (in my case it is weblogic/weblogic123) when prompted.

Let it run and it will be done eventually. (Duh!)

3. Stop Node Manager

Since there is no script that stops the Node Manager, you will have to kill it, so type the following in your terminal:

ps -ef |grep Node|grep nodemanager |cut -c10-15

It will show the Node Manager process ID (PID). Note it down and then type:

kill <PID>

After having done so, you can re-execute: ps -ef |grep Node|grep nodemanager |cut -c10-15

This time it shouldn't return any results, which means the Node Manager has been stopped.
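
If you prefer a one-liner, the same two steps can be combined; this is just the command above with its output fed to kill via xargs:

ps -ef |grep Node|grep nodemanager |cut -c10-15 | xargs kill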

4. Stop WebLogic Domain

In the same Terminal go to the following path:

$Your OBIEE folder/user_projects/domains/bifoundation_domain/bin/

Then type:
./stopWebLogic.sh

Let it execute; eventually it will complete.
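
For convenience, the four stop steps can be chained in one rough script. This is only a sketch under this post's assumptions: OBIEE_HOME stands in for your actual OBIEE folder, and stopManagedWebLogic.sh will still prompt for the WebLogic credentials unless boot.properties is in place (see Error 2 below):

#!/bin/sh
# Sketch: stop all OBIEE 11G domain components in the order described above.
OBIEE_HOME=/path/to/obiee        # stand-in; adjust to your installation
DOMAIN_BIN=$OBIEE_HOME/user_projects/domains/bifoundation_domain/bin

$OBIEE_HOME/instances/instance1/bin/opmnctl stopall           # 1. stop OPMN
$DOMAIN_BIN/stopManagedWebLogic.sh bi_server1                 # 2. stop Managed Server
ps -ef |grep Node|grep nodemanager |cut -c10-15 | xargs kill  # 3. kill Node Manager
$DOMAIN_BIN/stopWebLogic.sh                                   # 4. stop WebLogic domain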

Now, starting all OBIEE domain components:
You will notice that the sequence for starting each component is pretty much the reverse of the stopping sequence.

A. Start WebLogic Domain

In the following folder:

$Your OBIEE folder/user_projects/domains/bifoundation_domain/bin/

Type:
./startWebLogic.sh

If you want to have it run in the background, which means the component remains started after you close the terminal session (you pretty much have to do it this way), type this:

./startWebLogic.sh &

Now, if you are restarting, it is likely you will run into some issues at this step, so scroll down to the error handling section below.

B. Start Node Manager

Go to the following path:

$Your OBIEE folder/wlserver_10.3/server/bin/

Type:
./startNodeManager.sh &
(for running in the background)

Let it run and keep the window open for a while.

C. Start Managed Server

Go to the following path:

$Your OBIEE folder/user_projects/domains/bifoundation_domain/bin

Type:
./startManagedWebLogic.sh bi_server1 &
(for running in the background)

Just like step A, it is likely you will run into some common issues. If you do, please scroll to the error handling section of this post to see whether the errors I mention apply to you.

D. Start Oracle BI System Components (OPMN)

go to the following path:

$Your OBIEE Folder/instances/instance1/bin

Type:
./opmnctl startall &
(again the '&' is for running it in the background)
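
And the mirror-image start script, again just a sketch under the same assumptions. In real life you should let each component finish starting before launching the next, so treat the plain sequence below as an outline rather than something to fire blindly:

#!/bin/sh
# Sketch: start all OBIEE 11G domain components in the order described above.
OBIEE_HOME=/path/to/obiee        # stand-in; adjust to your installation
DOMAIN_BIN=$OBIEE_HOME/user_projects/domains/bifoundation_domain/bin

$DOMAIN_BIN/startWebLogic.sh &                                # A. WebLogic domain
$OBIEE_HOME/wlserver_10.3/server/bin/startNodeManager.sh &    # B. Node Manager
$DOMAIN_BIN/startManagedWebLogic.sh bi_server1 &              # C. Managed Server
$OBIEE_HOME/instances/instance1/bin/opmnctl startall          # D. BI system components (OPMN)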



Error Handling:

Error 1:
There are 1 nested errors:

weblogic.management.ManagementException: Unable to obtain lock on /path......../AdminServer.lok. Server may already be running

It is pretty common that you will run into this error; it looks like this on my system:



When you see this error, just go to the path highlighted there, where you will find the AdminServer.lok file or bi_server1.lok file; delete it:


Basically, AdminServer.lok and bi_server1.lok can be found here:
YourOBIEEFolder/user_projects/domains/bifoundation_domain/servers/AdminServer/tmp
YourOBIEEFolder/user_projects/domains/bifoundation_domain/servers/bi_server1/tmp

If AdminServer.lok exists, it will cause errors when you start WebLogic (in step A).
If bi_server1.lok exists, it will cause errors when you start the Managed Server (in step C).

Simply delete these files and execute the start script again; the errors will go away.
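
In shell terms, the fix is simply (using the paths given above, with YourOBIEEFolder as the stand-in):

# Remove the stale lock files left over from an unclean shutdown.
rm YourOBIEEFolder/user_projects/domains/bifoundation_domain/servers/AdminServer/tmp/AdminServer.lok
rm YourOBIEEFolder/user_projects/domains/bifoundation_domain/servers/bi_server1/tmp/bi_server1.lok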

Error 2:

When you run these start scripts in the background as indicated in steps A and C, the system prompts you to enter the username; when you do, you end up getting 'weblogic: command not found' and the system doesn't get started. When you run the script without '&', everything works normally. The behavior looked like this on my system:


When you see behavior like this, it means the boot.properties file was not found in the following directories:

/user_projects/domains/bifoundation_domain/servers/AdminServer/security --- this will cause the above error when you run step A

/user_projects/domains/bifoundation_domain/servers/bi_server1/security/ -- this will cause the above error when you run step C

Now you can either create your own boot.properties file in both directories, or you can find the existing boot.properties file in the following directory:

/user_projects/domains/bifoundation_domain/servers/bi_server1/data/nodemanager, as shown:





If you open this file, you will see that the WebLogic admin user password has been encrypted. Simply copy the boot.properties file from '/user_projects/domains/bifoundation_domain/servers/bi_server1/data/nodemanager' to the following two directories:

/user_projects/domains/bifoundation_domain/servers/AdminServer/security --- this will take care of the above error when you run step A

/user_projects/domains/bifoundation_domain/servers/bi_server1/security/ -- this will take care of the above error when you run step C
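
In shell terms, the copy looks like this (DOMAIN is a stand-in for your full domain path):

# Copy the boot.properties with the encrypted password into both security
# directories, so the start scripts no longer prompt for credentials.
DOMAIN=YourOBIEEFolder/user_projects/domains/bifoundation_domain
cp $DOMAIN/servers/bi_server1/data/nodemanager/boot.properties $DOMAIN/servers/AdminServer/security/
cp $DOMAIN/servers/bi_server1/data/nodemanager/boot.properties $DOMAIN/servers/bi_server1/security/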

In Summary:

When you have these two errors handled, your restart of OBIEE will be smooth 90% of the time.

Thanks

Until next time

