Friday, December 20, 2013

Improving WebCenter Content User Experience with Information Architecture

Information Architecture (IA) is the "structural design of shared information environments". It goes hand in hand with User Experience Management (UEM), as it makes your information more usable and your business users more productive.
Before engaging in the IA exercise, make sure that you understand how your information is used, the types of information assets you'll be managing, and the functionality you'll be making available.
The diagram below shows the main principles of Information Architecture. 

Now let's review each of them so you'll be able to use them when designing your sites and your web applications.


Design your site or your app in such a way that it will still function well when your content multiplies by a factor of ten or more.
The cost of storage continues to go down, and a well-designed site will continue to attract visitors and contributors alike. This means that more content will most likely be coming in than going out through expiry and other mechanisms. This is an important consideration to keep in mind when looking at all the other principles we review here.


Your content is a living thing. It has its attributes and its lifecycle. When you define the different types of content that live on your site or your web app, think of the relationships each type will have with other types of content, of its lifecycle, and of its relevancy over time.
News stories age quickly, then move to an archive and eventually expire. Winter road tips become relevant in the winter and need to be taken out of view for the summer.


If I feel like getting a spicy sauce at the store and the Condiments section has 3 types of sauce, I'll most likely pick one and walk away happy. Now if I see 15 different types of sauce, with each brand represented by 5 varieties, and I've only got 5 minutes to pick one, I'll most likely leave the section empty-handed.
A large number of choices makes it hard for us to decide, so you should keep the list of choices for each particular scenario small.
This applies to your navigation, to the list of options available for an operation, and to the number of matches you return from a typical search. Small, focused lists also save the time it takes to make a selection, and people will be less likely to pick up the phone instead of using your intranet site!


Only show enough information for people to understand what comes next and make a decision. Think in layers, and start with a list of short descriptions or snippets before showing an item. This is what Google does in its search results. It's not showing you full web pages, and it's not just showing you the titles either. It shows you just enough information to decide whether you'd like to browse to that page or not.


Dan Brown suggests that we "Describe the content of categories by showing examples of content". This is a much more effective approach than describing the category. Examples may also double as quick-access links to familiar and popular content. This is because we as humans internally describe our categories as networks of familiar examples.
For instance, the Forms section of your intranet may have an Expense Report, a Contractor Timesheet and an Equipment Request form. Showing 2-3 items in a category, with a link to see more, will work a lot better than a category name by itself or a category description.


Do not assume that everybody will always use your home page to enter your site. People will bookmark your pages, come in from search engines and external links, and so on, so a large percentage of your visitors will arrive through an arbitrary page instead of your home page.
This really means two things. First, your site template must have all the required navigation elements to add value to the actual piece of content displayed on the page, to show what other content is available on the site, and to take you there.
Second, your home page should not be a do-it-all page. Instead, it should focus on welcoming new users and properly introducing them to the site.


Consider using multiple different classifications on your site or app. People look for information in different ways. This is why Gmail has moved away from folders and allows you to apply many labels to any given email.
An online store may let you find clothing by size, color and type. A corporate intranet may let you browse by topic (like benefits or standards), by type (like form or guideline), and by department.
Diverse classification is a good thing, but taken too far it may overwhelm and confuse your users, and it will also take more effort to maintain.
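To make the idea concrete, here is a minimal sketch (in Python, with made-up item names and facet values) of how one content item can be reached through several independent classifications at once:

```python
# Each content item carries several independent facets (all names hypothetical).
intranet_items = [
    {"title": "Expense Report", "type": "form", "department": "Finance", "topic": "expenses"},
    {"title": "Travel Guideline", "type": "guideline", "department": "HR", "topic": "expenses"},
    {"title": "Equipment Request", "type": "form", "department": "IT", "topic": "hardware"},
]

def browse(items, **facets):
    """Return the items matching every requested facet value."""
    return [i for i in items if all(i.get(k) == v for k, v in facets.items())]

# The same item surfaces whether the user browses by type or by topic.
print([i["title"] for i in browse(intranet_items, type="form")])
print([i["title"] for i in browse(intranet_items, topic="expenses")])
```

The point of the sketch: a single classification forces every item into exactly one bucket, while facets let the same Expense Report show up under "forms", under "Finance" and under "expenses", matching however the user happens to look for it.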


The names most shops use for their navigation blocks, like 'Top Nav' and 'Left Nav', make you focus on the block's location instead of the actual type of navigation the block offers! This is extremely important. When you start to think of your 'Top Nav' as a 'Department Nav' and of your left nav as a 'Topic Nav', you will be well under way in your diverse classification department. With location-based names, by contrast, developers get confused and simply duplicate parts of their left nav on the top, and so on.


User Experience Management is a bigger topic that includes IA and your design. It also includes the planning of your information flows and usability testing. Many times, if your IA is solid and you have a good set of use cases, you'll get great results by just focusing on your information flows and your most important usage scenarios.
We'll wrap up this whitepaper by looking at a case study that shows how a study of user behavior and top use cases allowed us to improve the user experience of Oracle Content Server itself.


The Content Server UI is a frequent target of end-user complaints. Let's see how applying just a few of the IA principles transformed the user experience of one of our clients.
First, we learned how users interact with Content Server. Essentially, they were following the pattern of Gradual Disclosure: running a search, looking at Content Information, proceeding to Document Preview on selected items, and then downloading and updating a few selected documents.

The main problem we found was latency in the system - and it was not related to performance of the server!

Our most popular user path in the system, from browsing and search to content information to document preview, entails a lot of waiting. To get Content Info or a preview, you have to repeat these steps for each content item:
  • Click on the link and wait
  • Then go back and wait for results to load again
  • Find the row you just clicked on and click on the next one

Another thing is that the use of screen real estate can be greatly improved:

Out of the box, Search Results fit 20 rows per screen... and nothing else. Green on the screenshot above shows the useful space.
Another thing they complained about was pagination. Pagination may work well on a web site, but in a web app it can slow things down. Going to the next page involves waiting. Mass update operations that span multiple pages have to be performed multiple times (once for each page), which means:
    • More waiting
    • Risk of error, as you have to repeat the update instructions for every page

So here's the UI prototype that we built:

It fits almost twice as many search results on the screen without feeling crowded. The scrolling is seamless, using the lazy load pattern. The UI fetches more records as you continue to scroll, so there's never a need to click the 'Next Page' button and wait.
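The lazy load pattern behind this can be sketched in a few lines. This is Python pseudocode of the idea only, not the prototype's actual UI code; `fetch_batch` stands in for a real server call:

```python
def fetch_batch(offset, size):
    """Stand-in for a server call; returns the next slice of the result set."""
    all_results = [f"doc-{n}" for n in range(95)]  # pretend result set of 95 hits
    return all_results[offset:offset + size]

def lazy_results(batch_size=20):
    """Yield results one at a time, fetching a new batch only when needed."""
    offset = 0
    while True:
        batch = fetch_batch(offset, batch_size)
        if not batch:
            return  # server has no more results
        yield from batch
        offset += batch_size

# The consumer (the scrolling UI) just keeps pulling; there is no 'Next Page'.
results = lazy_results()
first_screen = [next(results) for _ in range(20)]
```

The design choice worth noticing: the consumer never knows or cares about pages. Each scroll simply pulls more items, and the next batch is requested transparently when the current one runs out.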

It also offers seamless Document Preview – without leaving Search Results... 

and it lets you resize panes, hide and show as needed.

You can do multiple selection the Windows way: Shift-click and Ctrl-click select multiple rows.
Right-click delivers a context-sensitive menu, just like Word or Excel. And different menus are delivered for single and multiple selection...

And here's another screen that gets more mileage out of looking at content as an object: the search screen.

Refine the search criteria and see the results update every time you hit Enter (or the Run Search button). There is no need to leave the search form or wait for a "Search Within" form to load. This increases productivity because users no longer lose focus while waiting for screens to load.
Overall, these improvements allow users to perform tasks faster, without requiring faster hardware or a faster database. You also see a lot more, without having to get a bigger or dual monitor or having to switch browser windows and tabs.


This article introduced you to the major components of modern User Experience Management and Information Architecture as they apply to Oracle WebCenter products. We then reviewed how a simple exercise of analyzing use cases allowed us to transform the WebCenter Content user experience.

Sunday, November 17, 2013

This handy Trace Log Trick Makes Troubleshooting Content Server EASY!

This article will introduce an indispensable tool that you've probably been using for a while... but chances are, you (just like me) have been using it the WRONG way!

Don't you just wish there was more to the Trace Log than 1-15 minutes' worth of Content Server output printed on a web page? Well, I bet this is what you have used up to now, with no notion of just how much more useful it can be if you used it correctly!

The great news is that you can be in command of the situation instead of just scratching the surface! You can log every single command, every service call and nearly every internal Content Server process that happens during normal use or the initialization stage. Need I say that you will never miss a heartbeat, including any configuration or custom component issue, once you use the tool this way?

How to Make the Most of the Tool

Under the System Audit Information page (in the Administration Tray) there is a treasure trove of sections, each of which can give you just the information you're looking for at that moment. The tool gives you a long list of sections that you can monitor.

Now, the nice thing about it is that it also allows you to use wildcards. For instance, if you're going after an indexer issue, you can simply put "indexer*" in the Active Sections box instead of manually specifying multiple sections. Sweet! No longer are you forced to scan through a list of sections.

And this is how I've been using it for years, assuming that, oh well, it's far from perfect, but it's still an awesome tool and you really can't do without it.

Now for a taste of novelty in the use of the Trace Log! Up to now, I have been wandering in the admin desert, resigned to the fact that I'm stuck with the dumb web page output that is limited to just a small part of the trace. How wrong I have been in my delusion!... Until last week, when I came across this article by Kyle Hatlestad.

Indeed, here you can learn how to get the trace output written to a real log file instead of just the web page. Here is how:

What Kyle (and the folks in the comments section) have pointed out is that Unix Content Server installations have been writing the trace log output to a file for eons (in the Internet calendar, of course!). So if you're running a 10g Content Server on Linux or Solaris, check out the etc directory under your Content Server installation (see screenshot below).

Now, if you're running on Windows, you first need to enable "Output Redirection" as shown below.

And hurray, the IdcServerNT.log pops up for you to use and analyze with your favourite tools! All the output is now yours too, not just the few minutes' worth of the most recent output.
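Once the trace output lands in a real file, any scripting tool will do. Here is a small Python sketch that tallies trace lines per section; the assumption that each line starts with the section name in parentheses is mine, so adjust the pattern to your actual log format:

```python
import re
from collections import Counter

def section_counts(lines):
    """Tally trace lines by section, assuming lines look like '(indexer) ...'."""
    counts = Counter()
    for line in lines:
        m = re.match(r"\((\w+)\)", line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Hypothetical sample lines standing in for real trace output:
sample = [
    "(indexer) starting indexing cycle",
    "(indexer) 42 items queued",
    "(searchquery) QueryText=dDocTitle ...",
]
print(section_counts(sample))
# In real use, feed it the whole file:
#   with open("IdcServerNT.log") as f:
#       print(section_counts(f).most_common(10))
```

A tally like this quickly shows which sections are the noisiest, which is handy when deciding what to put in the Active Sections box next time.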

And should I mention that none of this is an issue for 11g and newer systems running on the WebLogic application server? The trace logs are automatically collected for you under the data/trace directory of your Content Server instance directory (see screenshot below).

Well, it is time to close shop. Check it out and let me know how working with a true log file (or files) in your favourite analyzer tool makes you feel, compared with that crippled grumpy admin staring at the last 15 minutes' worth of output in a browser window!

Thursday, May 2, 2013

WebCenter Content - Is your system performance on target?

Quick question: how are your WebCenter Content response times in terms of making your business users happy? Now, if you do hear complaints, do you blame them on the number of records? I mean, wow. Approaching the million-content-items mark per repository still makes application owners and system administrators feel anxious, especially if the business expects rapid growth, with more and more content coming in day after day. "How will my system perform when I get over a million items? Will it slow down to a crawl, and will I be losing data?"

First off, let me start by acknowledging these fears. Yes, a million records is a lot... when you store them in an Access database back in 1995. Now fly back to 2013 and put them in an enterprise-level database, like Oracle or one of the other databases supported by WebCenter Content 11g (such as SQL Server or DB2), and a million records is not very much at all. WebCenter Content is designed to handle extremely high volumes of content - up to one hundred million content items per day!

Yes, I'm not kidding. A hundred million items coming in each and every day - when you invest in some high-end hardware. But even if you don't, WebCenter Content can still check in over 10 million items each and every day on 'middle shelf' commodity hardware!

Just read this section, especially if you're still thinking that a million records is a lot. It will blow your mind:

WebCenter Content 11g benchmarks on commodity hardware

The following benchmarks were conducted in the Oracle lab on a cheap 'commodity' server with dual 2.33 GHz Xeon CPUs and 16 GB RAM, running a single-node Content Server.


Multiple tests were conducted, checking in a variety of file sizes, from 4 KB to 250 KB, and various types: text, MS Office and PDF. The table below shows the results of a test run that took a full 24 hours to complete and showed some staggering results:

Anywhere from 11 to over 23 million content items checked in per day on commodity hardware.

Table from the Oracle White Paper: "Oracle Enterprise Content Management Suite Extreme Performance, Extreme Scalability"


Site Studio for External Applications has delivered 124 pages per second, over 446,400 pages an hour, at 89% CPU.

The reason most clients don't see that kind of performance lies in all the other 'stuff' that sits outside of WebCenter Content. That is the kind of stuff we will be addressing in this whitepaper.

Oracle WebCenter Content is an I/O limited application, which means that overall performance is restricted by the speed of your hard disks, and the bandwidth in your network, not by the software itself.

Adding Exadata

A single node of Oracle UCM 11g can ingest over 91 million files per day with an Oracle Exadata Database Machine Quarter Rack (which includes Oracle Database 11g Release 2 and Oracle Exadata Storage Server software), and almost 179 million files per day with an Oracle Exadata Database Machine Half Rack.

So how do you feel now, that you're convinced that WebCenter Content can handle it?

Seriously speaking, I agree: a lab environment may behave better than a real-life system where you have slow networks, other apps competing for resources, and third-party vendor customizations to support. So let me give you some simple guidelines to see whether your system is pumping out the numbers it was designed to deliver, or whether the whining and moaning of your business folks really has some ground and you should be looking at improving performance.

System performance targets

Here are a couple of numbers to use as a guideline. By no means must your system match these exactly, as every configuration is different; these are just ballpark figures to help you see where you are.

Read-only requests, like page loads, should come in at about 20 requests per second per GHz of CPU, times the number of CPUs. A dual 2 GHz box should pump out 80 pages per second.

For raw requests such as content information, calls to GET_FILE, or check-ins (for small files, say, around 200 KB), you should target 4 requests per second per GHz of CPU, times the number of CPUs in the system.
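These two rules of thumb are easy to turn into a quick calculator. This is just a sketch of the arithmetic; the factors of 20 and 4 requests per second per GHz are the guideline numbers from above:

```python
def page_load_target(ghz_per_cpu, cpu_count):
    """Read-only page loads: ~20 requests/sec per GHz, times the CPU count."""
    return 20 * ghz_per_cpu * cpu_count

def raw_request_target(ghz_per_cpu, cpu_count):
    """Content info / GET_FILE / small check-ins: ~4 requests/sec per GHz."""
    return 4 * ghz_per_cpu * cpu_count

# The dual 2 GHz box from the example above:
print(page_load_target(2, 2))    # 80 pages per second
print(raw_request_target(2, 2))  # 16 raw requests per second
```

Plug in your own clock speed and CPU count to get the ballpark your system should be hitting.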

I hope you're coming in right at the ballpark, and that you now have some proven numbers to back up your system performance. But even if you don't, there's no reason to panic.

Next Steps

Have you checked my Quick Hits article from a couple of weeks ago? There are some solid tips to apply. And even if that didn't do it for you, we're here to help. Drop us a quick line and we will help. We routinely run comprehensive scans to diagnose sub-optimal performance and provide detailed reports of our findings.

Stand by for more tips on performance tuning and diagnosis here on this blog.

Thursday, April 18, 2013

Quick Hits for Improving Performance

If you missed my talk at Collaborate in Denver this year, here come some quick tips on diagnosing sub-standard performance.

Many times, people tend to balk at performance problems. They freeze in hesitation and dwell on the problem, while often performance can be rapidly improved by simply going down a list of 'quick hits'. So here it comes. Roll up your sleeves and check:

  1. Virus scanner configuration. The number one reason for slow performance on many Windows deployments. Disable your anti-virus and see if that makes a difference.
  2. Network problems. Unnecessary firewalls slow down your network connections. Examine the firewall usage and eliminate or downgrade those that aren't critical to system security. The same is true for other unnecessary network layers, routers, hubs and switches. Your best bet is to place all the components of your WebCenter Content installation on the same subnet. Another important consideration here is the bandwidth between the Content Server and its file system (when not using JDBC storage). Most deployments store files on remote file systems (SAN or NAS), yet system administrators only look at the pipe going to the database.
  3. Other applications competing for resources - file system, database or web server. See if you have other applications sharing the same physical servers. This point is especially true when running on a virtual server. If too many applications are running on one physical server, response time will suffer for all of them. Examine your server resources and adjust them accordingly.
  4. Bad timing of batch loading, backups and archives. Make sure that all resource-intensive activities are performed at your off-peak times, when few users are on the system.
  5. Overly detailed logging levels - WebLogic, Content Server and external loggers.
  6. Insufficient memory - Content Server and database. Low memory causes extensive disk activity when the system tries to compensate for the lack of RAM by using its swap file.
  7. Database indexes that can be easily tuned to quickly boost your most common queries. Be sure to check both the content and indexing database instances if they are separate.
  8. Check the search queries that you are running, and watch for overly complex and sub-optimal ones. Are you running on Oracle but don't have the Oracle Query Optimizer component installed?
  9. Running outdated software. Many times you can get a tangible boost by simply installing the latest patch or upgrading to the new version of WebCenter Content.
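Before and after each quick hit, measure. A tiny Python timer like the one below gives you a repeatable number to compare against the targets; the URL at the bottom is a placeholder, so point it at your own Content Server page:

```python
import time
from urllib.request import urlopen

def median_ms(timings):
    """Median of a list of millisecond timings."""
    return sorted(timings)[len(timings) // 2]

def probe(url, samples=5):
    """Request the page several times; return the median response time in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        urlopen(url).read()
        timings.append((time.perf_counter() - start) * 1000)
    return median_ms(timings)

# Hypothetical URL - substitute your own instance and a page your users hit often:
# print(probe("http://ucm.example.com/cs/idcplg?IdcService=GET_SEARCH_RESULTS"))
```

The median is a deliberate choice over the average: a single slow outlier (a cold cache, a GC pause) won't skew the number between runs.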

Stand by for more tips on troubleshooting and improving your WebCenter Content performance.

Friday, April 5, 2013

Come see me speak live at Collaborate 13 in Denver:
  • Session #650 - Integrating Oracle Content Server into larger architectures... in 30 minutes or less! (Wed, Apr 10, 2013, 12:15 PM - 12:45 PM) : Mile High Ballroom 4C
  • Session #611 - Surpassing a million content items: Expert tips for maintaining an awesome response time (Wed, Apr 10, 2013, 01:00 PM - 02:00 PM) : Mile High Ballroom 4D

I'll see you there