Cognos has several valuable features. In this review, I will concentrate on automating cube building and deployment without any manual intervention. Even while a user is running reports against a cube, you will not run into a 'Cube file locked; unable to build or deploy' situation.
IBM Cognos Transformer cubes are one of the most widely used OLAP data sources. Normally, a PowerCube contains calculated and aggregated data organized as dimensions and measures, which can be viewed and analyzed in Analysis Studio and Report Studio (version 10). Users find it easy to use and quick to access aggregated summary data, which helps them analyze it better. IBM Cognos PowerCube data is static, so building a PowerCube naturally becomes a repetitive process. At my workplace, cube data is refreshed on a daily, weekly, and monthly basis. Since the data volume and the number of dimensions and measures in the cube are large, the build process takes 3-4 hours to complete. To get the most benefit from the cube, a scheduled and automated cube build is essential.
Deployment and Activation is a feature introduced in version 8.4 that allows the cube file to be automatically copied to a specified local or network location, then activated and/or archived, depending on the requirements.
Basically, you need to set the Deploy properties on the .mdl file using Transformer:
– Deployment Strategy: select the “Copy to available locations, then activate” option. This copies the cube .mdc file to the target server location.
– After building the cube: select “Automatically copy and activate”.
– Deployment location: add the path(s) where the .mdc file should be deployed. You can select multiple locations; for instance, the same cube may be deployed to both the Development and Production servers.
– Check “Enable automatic PowerCubes deletion”
Once the changes are made, use the Cognos Transformer command line, which is capable of performing certain modeling and cube-building tasks on the Windows, UNIX, or Linux platforms.
The general syntax for the Windows command line is as follows:
cogtr -n -lcognostr10=
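As an illustration only, a fuller invocation typically uses -n to generate the cubes and -m to point at the model file; the model path below is a hypothetical example, and the exact flags should be verified against the cogtr documentation for your installed Transformer version:

REM Hypothetical example; verify flags against your cogtr documentation
REM -n  builds the PowerCubes defined in the model
REM -m  specifies the model file to open
cogtr -n -m"D:\Cognos\Models\SalesCube.mdl"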
Notice that after the cube build completes, Transformer automatically deploys and activates the newly generated cube. It doesn’t require any changes to the data source connection. The live cube swap is effective immediately.
With just a few more steps (in a Windows environment), you can schedule the cube build via a job scheduler application, such as Windows Scheduled Tasks. In our setup, once the ETL process completes, a flag file is sent to the Transformer box to trigger the cube builds (a sketch of such a script follows below). You no longer need to manually rebuild or deploy the cube; everything is done automatically on schedule.
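Here is a minimal batch sketch of that flag-file trigger. The flag file, model, and log paths are hypothetical, and the cogtr flags should again be checked against your installed version:

@echo off
REM Hypothetical paths; adjust to your environment
set FLAG=D:\ETL\cube_build_ready.flag
set MODEL=D:\Cognos\Models\SalesCube.mdl
set LOG=D:\Cognos\Logs\cube_build.log

REM Only build when the ETL process has dropped the flag file
if not exist "%FLAG%" (
    echo %date% %time% - no flag file, skipping build >> "%LOG%"
    exit /b 0
)

REM Build the cube; Transformer then deploys and activates it
REM according to the Deploy properties saved in the model
cogtr -n -m"%MODEL%" >> "%LOG%" 2>&1

REM Remove the flag so the next scheduled run waits for fresh ETL
del "%FLAG%"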
- Needs improvement in visualization.
I've used Cognos in some form or another for the past 10+ years.
Yes, since we were implementing a VB Script batch file process on the Transformer server, we had to do some testing to ensure success. Afterwards, it worked like a charm.
Cognos is extremely stable, unless someone develops and tries to run a huge data set. It's all about educating users.
Customer Service:
Customer Service is good and quick to respond.
Technical Support:
Yes, Technical Support is very good, to some extent. They first try the initial knowledge base solutions (which we would have already tried), but they do work along with us to solve issues. Sometimes it is quick; sometimes it takes time to reproduce the issue and resolve it.
I have always used different versions of Cognos, right from Impromptu.
In-house. I have implemented most of the solutions specific to user requirements; one of them is the automated cube build and deployment.
This is a relatively short and good review, though Cognos cube security, and specifically row-level security, is missing. Using the standard approach offered by Cognos makes security implementation for cubes a nightmare; I found a different, simpler solution and will publish it later. One more comment on the VB Script batch process: my personal opinion is that a standard batch file using system and Cognos commands is completely sufficient for any kind of task required; there is no need to use a VB script for any reason at all.
If you want to trigger a batch file execution after your ETL finishes, just call it as the last step of your job. :) If you need to run it on a schedule instead, use the Windows scheduler, for example (see the sketch below).
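For instance, a scheduled task could be registered with the built-in schtasks utility; the task name, time, and script path below are hypothetical:

REM Run the (hypothetical) cube build script every day at 2:00 AM
schtasks /create /tn "NightlyCubeBuild" /tr "D:\Cognos\Scripts\build_cube.bat" /sc daily /st 02:00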
To reduce the cube size when it is really huge and there are some big dimensions, I would recommend cleansing those big dimensions (or using only records flagged as "active" that actually have data) in order to avoid bringing records with no data into the cubes. Thanks.