
Thursday, May 16, 2013

Load testing MicroStrategy Web Reports and Documents


In almost every project, the architects will need to publish the response time of their BI systems. Getting the response time from the statistics is easy. However, we need a system that can simulate a real-world scenario, so that the actual capacity or response time of the BI system can be estimated. A system that performs quite well for a single user could fail when 100 such users access it at the same time. In this article we will see how to quickly set up a performance test for your web reports and dashboards. We will use JMeter, an open-source graphical testing tool.

JMeter setup:

I have shown below the minimal settings you will need to stress test your system. You can follow the documentation for advanced settings.

  • Provide a test plan name.



  • Add a Thread Group. This is where you set the number of concurrent users (threads), the ramp-up period, and the loop count.


  • Add an HTTP Sampler. This is where you define the hostname, port, and the URL path to test.




Once you provide the hostname, protocol, and port, you need to add the path for the report / document execution. Add the login credentials to the URL, e.g. &uid=username&pwd=password

This is needed to automatically log in and execute the report. Remember to set your login mode to Standard for the duration of the test.
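
As an illustration, the full sampler path could look something like the line below. This is a hypothetical example: the event ID, report GUID, and parameter names come from the MicroStrategy URL API, and the exact values depend on your version and on whether you are executing a report or a document, so check the URL API documentation for your installation.

/MicroStrategy/servlet/mstrWeb?evt=4001&src=mstrWeb.4001&reportID=YOUR_REPORT_GUID&uid=username&pwd=password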

Now, to view the results, add any of the listener outputs, like the summary view or tree view. I am using View Results in Table. You are now set to run the test. Let's look at the output.




The average response time is about 597 ms. Enable statistics from your Web Admin page and compare with the results there.


It shows 500 ms. So with this test data, you can say that the report's average response time is about 600 ms, with a deviation of about 140 ms for 10 concurrent accesses.
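
If you want to sanity-check these numbers outside JMeter, a small script of the same shape can fire concurrent requests and compute the same summary figures. Below is a minimal Python sketch; the URL, report GUID, and credentials are placeholders for a hypothetical URL API call, and the thread count mirrors the 10 concurrent users used in the test, so adjust all of them to your environment.

# Minimal concurrency sketch: fire N simultaneous requests at a report URL
# and report the mean response time and deviation, similar to JMeter's summary.
# The URL and credentials below are placeholders, not real values.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = ("http://yourserver:8080/MicroStrategy/servlet/mstrWeb"
       "?evt=4001&src=mstrWeb.4001&reportID=YOUR_REPORT_GUID"
       "&uid=username&pwd=password")          # hypothetical URL API call
USERS = 10                                    # concurrent virtual users

def timed_request(_):
    start = time.time()
    with urlopen(URL) as response:
        response.read()                       # drain the full response body
    return (time.time() - start) * 1000       # elapsed time in milliseconds

with ThreadPoolExecutor(max_workers=USERS) as pool:
    samples = list(pool.map(timed_request, range(USERS)))

print("average  : %.0f ms" % statistics.mean(samples))
print("deviation: %.0f ms" % statistics.stdev(samples))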






Friday, May 3, 2013

Web or Intelligence Server - Identify where to start applying the optimization


Performance Optimization

Invariably, this is one area where every architect and developer will spend a lot of time: identifying the bottleneck in the system, or which areas to apply optimization to. How do they do that? We do not know the internals of MSTR, and the only approach we can take is to follow the suggestions provided by MicroStrategy. And there are many. Before you start following them, you must know which of these suggestions to apply; otherwise you will end up with more trial and error than necessary, and many times tasks that could be completed quickly will take much longer.

So how do I quickly identify where to apply the optimization?

  1. Is it my web server taking longer to display the result?
  2. Is it my Intelligence Server taking longer to process the result?
  3. Is it my network causing the delay?
These are the basic questions you could start asking, and once you identify the source of the delay, you can then apply the relevant suggestions from MicroStrategy.

Web Stats

MicroStrategy has an option in the Web Admin page called Statistics. You can enable this and choose either screen / file / both. When you choose the screen option, you will see some statistics on every page displayed. Let's look at how to use these statistics. For the purpose of the demo, I created a dashboard with three reports. I have disabled report caching at the project level, since I don't want the results to come from a cache while doing my testing.

On executing the dashboard, you can see the statistics below displayed under the dashboard content.

Total Web Server + IServer processing time : 3140 milliseconds
Web server processing time : 630 milliseconds (20%)
Transmission Time : 0 milliseconds
IServer processing Time : 2510 milliseconds (80%)
   IServer API time : 220 milliseconds (7%)
   IServer polling time : 2290 milliseconds (73%)
Number of Web API polling calls : 22
Number of Web API calls : 27
Bytes sent/received : 31504 63643
Start Time (millis) : 1367477655056
End Time (millis) : 1367477658196
 

Let's look at each of these lines; a small script after the list shows how the percentage breakdown is computed.

  1. The first line tells me that the total time it took to display the dashboard is approximately 3 seconds.
  2. The second line shows the total time taken by the web server; this is the rendering time the web server takes to process the XML results from the IServer: (630/3140) * 100 = 20%.
  3. The third line is the transmission time; since the IServer and web server are on the same machine, it is zero.
  4. The fourth line shows the time taken by the Intelligence Server to process the results, as well as the polling time: (2510/3140) * 100 = 80%.
  5. The fifth line shows the time to execute XML API calls and retrieve the data.
  6. The sixth line shows the amount of time the web server spends polling the IServer for data.
  7. The seventh line shows the number of polling calls made; polling is done every 110 milliseconds.
  8. The eighth line shows the number of API calls made by web; this includes calls to the IServer and local calls to access caches, user rights, etc.
  9. The ninth line shows the data sent and received in bytes; it can vary each time because of the number of calls made.
  10. The tenth line shows the start time.
  11. The eleventh line shows the end time.
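
To make the arithmetic in items 2 and 4 concrete, here is a small Python sketch. The millisecond figures are the ones from the statistics block above; only the calculation itself is added here.

# Break the total processing time into web server and IServer shares,
# using the millisecond figures reported in the statistics block above.
total_ms   = 3140   # Total Web Server + IServer processing time
web_ms     = 630    # Web server processing time
iserver_ms = 2510   # IServer processing time

web_pct     = web_ms / total_ms * 100       # (630/3140)*100  -> ~20%
iserver_pct = iserver_ms / total_ms * 100   # (2510/3140)*100 -> ~80%

print("Web server : %.0f%%" % web_pct)      # start here only if this share dominates
print("IServer    : %.0f%%" % iserver_pct)  # otherwise optimize the IServer side first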

So in this example, you can easily say that 80% of the time is spent in the Intelligence Server. So where should I start applying the optimization? The web server? But that hardly took 20% of the time. I need to look at how to optimize that 80% of the time taken by the IServer.

The reports are not using any prompts, so I decided to enable caching for them. Let's look at the stats now.

Total Web Server + IServer processing time : 860 milliseconds
Web server processing time : 450 milliseconds (52%)
Transmission Time : 0 milliseconds
IServer processing Time : 410 milliseconds (48%)
   IServer API time : 110 milliseconds (13%)
   IServer polling time : 300 milliseconds (35%)
Number of Web API polling calls : 3
Number of Web API calls : 14
Bytes sent/received : 7646 71394
Start Time (millis) : 1367477719529
End Time (millis) : 1367477720389

See how the dashboard is now displayed within a second. From three seconds to one second is my performance gain, and all of this was done quickly. Depending on your case, the time it takes to find an optimization method might vary. The critical factor is to identify where to apply the optimization and which areas need it. Without knowing that, you are going to take a longer route to an easy solution.