Catching Up with OBIEE News

My mate has been fired from his job at a dress alteration company.

Apparently he didn’t turn up enough.

OK, let's keep up the pace this year. So much to learn in the OBIEE world, so little time.

Let's talk about size. How big is your OBIEE? I was recently at an OBIEE presentation where the speaker described a 'large' OBIEE user base of 50 users. Sorry, but this is tiny, not even small. Should OBIEE really be used for fewer than 50 users? How about fewer than 100? Maybe, maybe not, but I do know it can handle millions of users. Seriously. MILLIONS.
Obviously not all at the same time; how many concurrent users you can support depends on your BI Server hardware. You will obviously have to use Exadata for the database, but should you use a cluster? Well, you could. In fact most do, but what's best for a larger user base is ONE machine. Seriously, ONE.
One Exalytics machine, using as many of the processors as possible, will give you a simple environment that is fast enough for thousands of concurrent users. When you have more than 4,000 users per day, forget the cluster and buy the Exalytics. If your client/company won't buy Exalytics, they will spend more money building and maintaining a cluster of slower machines. If you do go for a cluster, aim for one machine per 1,000 daily users.

BTW, just how do you define concurrent users? The number of people logged in at any one time? The number of people logging in per hour? Per day? Is the number of users relevant when some are running more queries than others? Is the number of queries relevant when one person runs a load-intensive query and 100 people run quick ones?

My advice? Ask the users about their experience with the system. Use the APEX Survey app (you have installed that, right?) and monitor averages, total server time, total database time, number of queries per 10 minutes, number of user logins, etc. Develop a rich picture of server stats and make sure your users can easily let you know when performance is unacceptable.
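To make "number of queries per 10 minutes" concrete, here is a minimal sketch of the bucketing logic, assuming you have pulled usage-tracking rows (e.g. from OBIEE's S_NQ_ACCT table) into (timestamp, user) pairs. The in-line sample data and names here are hypothetical, not from any real system:

```python
import csv
from collections import defaultdict
from datetime import datetime

def bucket_stats(rows, bucket_minutes=10):
    """Group (timestamp, user) rows into fixed windows and count
    queries and distinct users per window."""
    buckets = defaultdict(lambda: {"queries": 0, "users": set()})
    for ts, user in rows:
        # Truncate the timestamp to the start of its window,
        # e.g. 09:07 falls into the 09:00 bucket.
        key = ts.replace(minute=(ts.minute // bucket_minutes) * bucket_minutes,
                         second=0, microsecond=0)
        buckets[key]["queries"] += 1
        buckets[key]["users"].add(user)
    # Distinct-user sets collapse to counts for reporting.
    return {k: {"queries": v["queries"], "users": len(v["users"])}
            for k, v in sorted(buckets.items())}

# Hypothetical sample data in place of a real usage-tracking export:
rows = [
    (datetime(2017, 1, 9, 9, 2), "alice"),
    (datetime(2017, 1, 9, 9, 7), "bob"),
    (datetime(2017, 1, 9, 9, 14), "alice"),
]
for window, s in bucket_stats(rows).items():
    print(window, "-", s["queries"], "queries,", s["users"], "distinct users")
```

Tracking distinct users per window alongside raw query counts is what lets you tell the "one heavy user" case apart from the "100 quick queries" case described above.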


Anyway, read these.


Blogs of the week

1. Uploading a file to Oracle storage cloud service using REST API

Ketan Rout writes the second part of a two-part article which demonstrates how to upload data in near-real time from an on-premises Oracle database to Oracle Storage Cloud Service.

The first part of the article can be found here.

2. APEX 5.2 early outlook

Scott Wesley writes a blog post about a slide that recently appeared at a conference.


3. Oracle’s Database as a Service in action

Porus Homi Havewala shares a preview lecture of the Udemy course Oracle Private Database Cloud. It can be found here.

4. Oracle Conferences

Neil Chandler reviews a few of the conferences he has been able to attend this year.

5. Data Modeler Reports: Templates and Configurations

Heli writes: “I do not understand why there are configurations and templates related to reports in Data Modeler. What is the difference and when are they used?”

6. PL/SQL Security Coding Practices. Introduction to a better architecture part 1.

Robert Lockard says, “I have been seeing this database architecture for over thirty years and it’s high time we stopped using it. Before I go too far, let me tell you I get it, you have pressure to get the application out the door and working in a defined timeframe. I still design and develop systems and the pressure to take shortcuts can be great. This short cut is a security killer.”

7. Oracle Data Visualization Desktop Connecting to Essbase

Wayne D. Van Sluys talks us through how to do this.

8. Oracle 12cR2 – Howto setup Active DataGuard with Temporary Tablespace Groups

William Sescu blogs: “Temporary Tablespaces Groups exist for quite a while now (since 10gR2), but they are, for whatever reason not so often used. Personally, I think they are quite cool. Very easy to setup, and especially in big environments with a lot of parallel processing very useful. But this blog will not be about Temporary Tablespace Groups. They are already explained in the 12.2 Admin Guide.”

9. Best Practices – Data movement between Oracle Storage Cloud Service and HDFS

Ketan Rout writes, “Oracle Storage Cloud Service should be the central place for persisting raw data produced from another PaaS services and also the entry point for data that is uploaded from the customer’s data center. Big Data Cloud Service ( BDCS ) supports data transfers between Oracle Storage Cloud Service and HDFS. Both Hadoop and Oracle provides various tools and Oracle engineered solutions for the data movement. This document outlines various tools and describes the best practices to improve data transfer usability between Oracle Storage Cloud Service and HDFS.”

10. Loading Data into Oracle BI Cloud Service using BI Publisher Reports and REST Web Services

Dayne Carley says, “This post details a method of loading data that has been extracted from Oracle Business Intelligence Publisher (BIP) into the Oracle Business Intelligence Cloud Service (BICS). The BIP instance may either be Cloud-Based or On-Premise. It builds upon the A-Team post Extracting Data from Oracle Business Intelligence 12c Using the BI Publisher REST API. This post uses REST web services to extract data from an XML-formatted BIP report.”

This week on Twitter

UKOUG tweeted UKOUG Conferences 2016 – it’s a wrap and Ria’s journey to the cloud with new Human Capital Management Tool

Rebecca Wagner posted The Rittman Mead Open Source Project

Michael Vickers tweeted this photo from UKOUG_Tech16


This week on LinkedIn

Benjamin Perez-Goytia shared Integrating Big Data Preparation (BDP) Cloud Service with Business Intelligence Cloud Service (BICS)

Kanthikiran Kalagoni posted Repository Upload Procedure in OBIEE 11G and 12C

Tanya Heise shared Business Analytics Monthly Index – October 2016



Videos such as A Lookback at #Kscope16 – ODTUG Took Chicago By Storm

and Schema Wars Episode 1: The Unstructured Menace