Disclaimer

All of the topics discussed in this blog come from my real-life encounters. They serve as references for future research. All of the data, content and information presented in my entries have been altered and edited to protect the confidentiality and privacy of the clients.

Various scenarios of designing RPD and data modeling

Find the easiest and most straightforward way of designing RPD and data models that are dynamic and robust

Countless examples of dashboard and report design cases

Making the dashboard truly interactive

The concept of Business Intelligence

The most important concept you ever need to understand to implement any successful OBIEE project

Making it easy for beginners and business users

The perfect place for beginners to learn and get educated with Oracle Business Intelligence

Sunday, October 21, 2012

Basic external table authentication with password encryption

Hello

Today, I want to talk about some real basic stuff in OBIEE 10G: external table authentication. I know there are many similar posts out there on the internet, but I still want to post my own version. This is not to show off or anything; it's just so I don't have to memorize these things in the future if the article is kept in my own space. I know that we are all talking about 11G nowadays, and everybody is talking about the cool things they can do with 11G: Google Maps, unstructured data and so on. I'd like to get into those things too; in fact, I have. But I still believe that understanding the basic security approach in OBIEE, the usage of session variables and the standard process of user management will benefit you in the long run.

So without further digression: external table authentication is a way to manage users and user groups in external DB tables that OBIEE uses to authenticate a user's access to the application. LDAP authentication, on the other hand, stores user information in an LDAP server. For external table authentication, the definition of the tables isn't always the same from company to company, but the basics are the same. You will typically have 3 tables:

The first is a user table that stores the user ID and other user info;



The second is a user group table that stores user group information:


Now, because 1 user can belong to many groups and 1 group can have many users, they naturally form an n:n relationship. Therefore, we have a third table that maps each user to a user group:


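Putting the three structures together, here is a minimal DDL sketch of what such tables could look like. This is only an illustration; the column names are assumptions inferred from the initialization block queries shown later in this post:

-- Hypothetical structures, inferred from the queries used later in this post
CREATE TABLE IS_SEC_USER (
  ID           NUMBER PRIMARY KEY,
  USERNAME     VARCHAR2(50),
  DISPLAYNAME  VARCHAR2(100),
  LOGLEVEL     NUMBER,
  EMAIL        VARCHAR2(100),
  PASSWORD     VARCHAR2(2000)   -- stores the MD5 hash, not the clear-text password
);

CREATE TABLE IS_SEC_USERGROUP (
  ID         NUMBER PRIMARY KEY,
  GROUPNAME  VARCHAR2(100)
);

-- Bridge table resolving the n:n relationship between users and groups
CREATE TABLE IS_SEC_USER_USERGROUPS (
  IS_SEC_USER  NUMBER REFERENCES IS_SEC_USER (ID),
  USERGROUPS   NUMBER REFERENCES IS_SEC_USERGROUP (ID)
);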
Now that we have the structures ready, we will implement external table authentication in OBIEE by creating session variables. My first initialization block is 'ISAuthentication', which is associated with several session variables:
USER
DISPLAYNAME
LOGLEVEL
EMAIL
PASSWORD

The following screenshot is misleading; it should have said 'create 5 session variables for this block':



The initialization string for this block is:
(The text in bold are the variable names we just created)



select USERNAME, DISPLAYNAME, LOGLEVEL, EMAIL, PASSWORD from IS_SEC_USER
where UPPER(USERNAME) = UPPER(':USER') and PASSWORD = IS_GET_MD5_VAL(':PASSWORD')

This query basically goes into the user table and finds every record where column USERNAME equals the value of the 'USER' variable and column PASSWORD equals the value of the 'PASSWORD' variable. Due to the nature of session variables, these values are assigned when a user initiates a session by attempting to log in. The test result of this initialization block is shown below:



Now what is 'IS_GET_MD5_VAL'? It is a DB function that hashes the password stored in the user table:

This is just an FYI; you don't have to worry about it:


-- Returns the MD5 hash of the input string as a hex string
create or replace FUNCTION IS_GET_MD5_VAL (p_in VARCHAR2)
   RETURN VARCHAR2
IS
   l_hash   VARCHAR2 (2000);
BEGIN
   l_hash := RAWTOHEX(UTL_RAW.cast_to_raw(DBMS_OBFUSCATION_TOOLKIT.md5(input_string => p_in)));
   RETURN l_hash;
END;
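For example, seeding the dummy 'Admin' user mentioned later could look something like this. This is a hypothetical sketch; the ID, display name, log level and email values are made up for illustration:

-- Hypothetical example: store the MD5 hash of the password, never the clear text
INSERT INTO IS_SEC_USER (ID, USERNAME, DISPLAYNAME, LOGLEVEL, EMAIL, PASSWORD)
VALUES (1, 'Admin', 'Administrator', 2, 'admin@example.com', IS_GET_MD5_VAL('Admin'));
COMMIT;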

So the password in the user table looks like this:


Anyway, let's move on to the next initialization block we will create, which in my case handles authorization.
The purpose of this block is to get all of the user groups that the user belongs to. The previous initialization block gets the user from the user table; therefore, it should take precedence over this one.

Note that row-wise initialization is needed here. It will return all of the rows (groups) that the user belongs to, instead of the single value that a variable typically holds.




The initialization string for 'ISAuthorization' is:


select 'GROUP', UG.GROUPNAME
from IS_SEC_USER_USERGROUPS FFF, IS_SEC_USER U, IS_SEC_USERGROUP UG
where UG.ID = FFF.USERGROUPS
and U.ID = FFF.IS_SEC_USER
and UPPER(U.USERNAME) = UPPER(':USER')

Clearly, this query joins all 3 tables and finds the groups that a given user belongs to. It is executed after the previous initialization block.

So the idea here is that when a user logs on, the first initialization block gets the user's ID and password from the user table if they exist; the second then joins to the group tables and finds all of the groups this user belongs to. So not only is the user able to log into OBIEE, but their privileges will also be defined based on the group privileges. We can define user/group privileges in OBIEE later.

From the previous screenshot we know there is a dummy user called 'Admin'; although the password shown in the table is hashed, we created its password as 'Admin'.

Now let's log on to OBIEE with Admin/Admin:





Now the result is:





Thanks

Until next time

Thursday, October 18, 2012

OBIEE: Using a sub-report to filter another report

Hello guys,

When creating reports in OBIEE, we all know that we can use one report's result set as a filter condition for another report. This is called a sub-report; the equivalent SQL statement in the log is a nested query. Today, I want to talk about one situation where this technique can be very helpful.

Now look at the report below. This report lists a lot of PWE IDs on the left.



Now, I want to create another report. The second report should only show the PWE IDs that are related to the ones from the first report when the user specifies a PWE in the prompt. In other words, when the user picks PWE '10083_HSPA_CD_100005', the first report will return only that PWE, but the second report that we are about to create should return '10083_HSPA_CD_100006' instead of '10083_HSPA_CD_100005' because, according to our rules, they are related.

So this is a perfect example of using a sub-report, knowing that the relation is determined by the prefix of the PWEs up to the last '_'. In other words, if the prefixes before the last 6 digits are the same, the PWEs are related.

To start, we will go back to the first report and add a new column called 'initial' with the following expression:
SUBSTRING(Pseudowire."PWE ID", 1, (LENGTH(Pseudowire."PWE ID")-6))

This takes the substring starting at position 1 with a length equal to the full length of the text minus 6, so in our case the last 6 digits of each row are chopped off, as shown in the example and screenshot below:
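For example, '10083_HSPA_CD_100005' is 20 characters long, so the expression keeps characters 1 through 14:

SUBSTRING('10083_HSPA_CD_100005', 1, 20-6)   -- returns '10083_HSPA_CD_'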





Now let's build the second report by adding all of the desired fields, including 'initial' with the same expression as above:



Now that we have the second report, which we call 'related Pseudowires', we will start creating filters for it. The idea is to take the column 'initial' from this report and match it against the same column from the first report with the 'equal' operator. This way, the second report has to return the PWEs that share the same initial as determined by the first report. Then we will nest the report again by creating another filter, this time on PWE ID: the PWE ID from the second report should NOT equal the PWE ID from the first report. When both conditions are satisfied, the only records that come back from the second report will be the related PWEs. The nested query this produces is sketched below.
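Conceptually, the physical SQL that OBIEE generates for this setup is a nested query along these lines. This is only a rough sketch with made-up physical table and column names; the actual generated SQL depends on your model:

-- Illustrative sketch of the nested query behind the sub-report filter
SELECT t.pwe_id
FROM   pseudowire t
WHERE  SUBSTR(t.pwe_id, 1, LENGTH(t.pwe_id) - 6) IN
       (SELECT SUBSTR(s.pwe_id, 1, LENGTH(s.pwe_id) - 6)
        FROM   pseudowire s
        WHERE  s.pwe_id = '10083_HSPA_CD_100005')   -- PWE selected in the prompt
AND    t.pwe_id <> '10083_HSPA_CD_100005'           -- exclude the selected PWE itself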



Now, put these reports on the dashboard. As an option, I can also define guided navigation for section 2, where the second report is held. The guided navigation acts as a condition: when the second report returns at least 1 row, the section appears on the dashboard; otherwise, it is hidden.



Now, let's see how this works on the dashboard. First, we don't specify the prompt value, so the first report comes back with all PWEs and the second report is nowhere to be seen:



Now, select 1 PWE from the prompt and rerun the dashboard:


Now the second report appears, displaying only the PWE related to the one selected in the first report.

Thanks,

Until next time.

Thursday, October 11, 2012

How to report data dynamically based on user's personal time zone setting -- Part 3


Last time I took a step further toward solving the issue of reporting based on time zone; although the result wasn't 100% correct, at least it opened up a few options. The conclusion was that we need to find a way to dynamically choose the time zone offset value and use it for our time conversion. Where I got stuck was the daylight saving changes that happen all over the world.

Therefore, I have decided to introduce a time zone table that stores all of the time zones and their corresponding offset values. The table structure is shown below, and I have manually added 2 rows, both for the GMT-6 time zone. However, one of the time zones currently has DST in effect while the other doesn't:


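A minimal sketch of such a table, with hypothetical physical names (the logical columns 'Time Zone' and 'Offset Value' used in the prompts below map onto them), could look like this:

-- Hypothetical structure for the time zone lookup table
CREATE TABLE TIME_ZONE (
  TIME_ZONE     VARCHAR2(100),  -- locale name, exactly as OBIEE's timeZone variable returns it
  OFFSET_VALUE  NUMBER          -- current offset from GMT in hours, adjusted for DST
);

-- Two sample rows, both nominally GMT-6: one currently observing DST, one not
INSERT INTO TIME_ZONE VALUES ('(GMT-06:00) Central America', -6);
INSERT INTO TIME_ZONE VALUES ('(GMT-06:00) Central Time (US & Canada)', -5);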
Then, bring this table into OBIEE's model:



Now, let's create some dashboard prompts using these 2 columns:



Here, on the Time Zone prompt, I am applying the following expression to show the content:

SELECT case when "Time Zone"."Time Zone" = '@{session.timeZone}' then "Time Zone"."Time Zone" end FROM "Capacity Management" where case when "Time Zone"."Time Zone" = '@{session.timeZone}' then "Time Zone"."Time Zone" end is not null

The previously discovered session variable 'timeZone' comes in truly handy here. The idea is that when the time zone column equals the locale name generated by the user session, the prompt returns that exact locale name, which is also stored in the time zone table. The where clause eliminates all the other records, which would be null in this case. Also, the default value of this prompt will be the session variable.

Now create the second prompt on offset value:


The second prompt also comes from the same time zone table; it should capture the offset value that matches the locale name in the table. To do so, in the 'SQL result' setting I entered the following expression:

SELECT "Time Zone"."Offset Value" FROM "Time Zone" where  "Time Zone"."Time Zone" = '@{session.timeZone}'

Also, I have set a presentation variable which will pass this value into any filters or tables that need conversions. I call this variable 'Offset'.

So let's test this prompt on the dashboard with the first locale setting, which is Central America time:


The result looks great. These fields are automatically populated.

Now change the locale setting to Central Time (US & Canada):


Then log in again, refresh, and we see the values automatically update:



Now, the rest is pretty straightforward. Remember in my previous post how I was forced to use the following expression to convert my time stamp in the timestampadd function?

Timestampadd(sql_tsi_day, cast(Substring(case when '@{session.timeZone}' = '(GMT) Casablanca, Monrovia' then '(GMT+00:00) Casablanca, Monrovia' when '@{session.timeZone}' = '(GMT) Greenwhich Mean Time: Dublin, Edinburgh, Lisbon, London' then '(GMT+00:00) Greenwhich Mean Time: Dublin, Edinburgh, Lisbon, London' when '@{session.timeZone}' = '' then '(GMT+00:00) Default' else '@{session.timeZone}' end From 5 for 3) as int)/24, Date."Hour Date")

Now I can simply replace all of that code with the following:

Timestampadd(sql_tsi_day, @{Offset}/24, Date."Hour Date")

This will take care of all scenarios.

From OBIEE's point of view, we are all done. Of course, we still have to maintain our time zone table: we have to figure out the offset value for each time zone and when its DST starts and ends during the year. I went to Wikipedia and it was pretty eye-opening and fun to learn all this stuff.

So then, I will load the entire table with all of the time zone names and offset values. Some script will have to run on specific days to update the offset values for DST. Although this may sound complicated, in real life it is not. Think about it: most of the time we will be supporting business in Asia, Europe, North America, South America or Africa. A lot of those time zones never observe, or no longer observe, DST. The main countries that do are in Europe or North America, and most of their DST periods run from March to October or April to November. Then we have a few exceptions in Australia, but even then, there aren't that many to deal with. Most of the time, we won't cover them all unless your BI platform is so big that it truly needs to cover users from the Marshall Islands, Baghdad or New Guinea, which I highly doubt. I would say this approach takes care of most cases and should be a good solution. A sketch of such a DST update script follows.
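For example, when DST begins for the US/Canada zones in March, a scheduled script could shift those rows forward by an hour. This is a hypothetical sketch against the TIME_ZONE table sketched earlier:

-- Hypothetical DST adjustment: run when US/Canada DST begins in March
UPDATE TIME_ZONE
SET    OFFSET_VALUE = OFFSET_VALUE + 1
WHERE  TIME_ZONE = '(GMT-06:00) Central Time (US & Canada)';
COMMIT;

A matching script would subtract the hour again when DST ends in November.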

Thanks, until next time 


Tuesday, October 9, 2012

How to report data dynamically based on user's personal time zone setting -- Part 2


The previous entry concluded with the problem of implementing reports based on different time zones; this time we will proceed with a possible workaround.

One thing we have realized is that the time stamp displayed on the report changes according to the user locale, meaning this happens per user session. We should definitely take advantage of this feature; now we have to figure out a way to capture the time zone offset value for each user locale setting and use it to do our data conversion.

Luckily, OBIEE provides a list of pre-determined system variables. A lot of you may already be familiar with the likes of 'current_date', which returns the current time stamp. Here are a few other variables, specifically for the time zone locales:


So, let's try using the 'timeZone' variable and see what I can get out of it. The proper syntax for calling this variable is '@{session.timeZone}', which can be used in column formulas, filter values and prompts. For now, I am just creating a new column in the report with this variable; it should return the time zone name in text format, and the values will be exactly what you choose in your locale drop-down list.



My idea is that, since most of the locale names are text strings that spell out the offset value, such as '(GMT-07:00) Arizona', I will just take this value and substring it so that it keeps only the 3-character signed offset (e.g. '-07'), then convert the result into an integer. That gives me my offset value. The good thing about this is that the value changes dynamically based on the user's locale setting, so I save myself from having to mess with session authentication. The only catch is that there are 2 locale names, '(GMT) Casablanca, Monrovia' and '(GMT) Greenwhich Mean Time: Dublin, Edinburgh, Lisbon, London', that do not contain any numeric offset. In these 2 cases, I will just change the text 'GMT' to 'GMT+00:00' with a case statement. A small worked example follows.
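For example, in '(GMT-07:00) Arizona' the 3 characters starting at position 5 are '-07', so:

cast(Substring('(GMT-07:00) Arizona' From 5 for 3) as int)   -- evaluates to -7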

So putting all these thoughts together into a formula, I then create a new column that returns the offset value with the following statement:

cast(Substring(case when '@{session.timeZone}' = '(GMT) Casablanca, Monrovia' then '(GMT+00:00) Casablanca, Monrovia' when '@{session.timeZone}' = '(GMT) Greenwhich Mean Time: Dublin, Edinburgh, Lisbon, London' then '(GMT+00:00) Greenwhich Mean Time: Dublin, Edinburgh, Lisbon, London' when '@{session.timeZone}' = '' then '(GMT+00:00) Default' else '@{session.timeZone}' end From 5 for 3) as int)



The latter part, '@{session.timeZone}' = '' then '(GMT+00:00) Default', was simply my attempt to handle the case where the user locale is set to 'Default'. This, however, has been proven not to work. But for now, leave it as it is.

Moving on, let's set the locale to be Arizona time:






And run the report:


As you can see, the offset value and time zone columns now both reflect the selected time zone locale. In my case, the offset value is -7.

So I need to apply this offset value to my date and hour filters in order to take Arizona time into consideration. In my case, that means editing the filter column formula so that the original number divided by 24 is replaced by the offset value. In the screenshot below, you will see a big cast statement divided by 24 within the timestampadd function. The exact reason for such a setup can be read here if you care, but I digress.
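In essence, the filter formula has this shape, where <offset expression> stands for the big case/cast statement shown above (a sketch only; the full expression is quoted in Part 3 of this series):

Timestampadd(sql_tsi_day, <offset expression>/24, Date."Hour Date")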






Like I said in the previous post, this is a workaround that doesn't quite get there 100%. One of the main issues with this solution is that it doesn't take daylight saving changes into consideration. This can be a big problem in the long run, as daylight saving is one of the most complex things to tackle. Nevertheless, having the timeZone session variable is really handy; it opens up more possibilities. Now, with the tools we have, I am exploring other solutions that might be able to deal with this requirement more efficiently.

I think I have found one! My first round of testing looks very promising.


Stay tuned!




Until next time