Results Summary Area
Once you have started executing a test, information about the reports or documents being tested appears in the Results Summary area of the main Integrity Manager window. This area lists all the selected reports/documents by name and path.
Selecting a report/document displays the details of that report in the Report Data area. To view a specific comparison, double-click the report/document's entry in the desired column (Data, SQL/MDX, Graph, Excel, or PDF). At the bottom of the Results Summary area, above the Report Data area, is a status bar showing the progress of the test:
- Processing object <ObjectName> is displayed when you start the test. <ObjectName> is one of the objects selected in the Select Objects from the Base Project page or the Select Reports to Test page. Integrity Manager processes the list of folders, reports, documents, and search objects selected during test setup and composes a list of all the reports to be tested.
- Running reports is displayed once the list of reports/documents to be tested has been generated and appears in the main portion of the Results Summary area. A progress bar at the top of the Results Summary area tracks the percentage of reports/documents that have run.
- Ready is displayed when all reports/documents have finished executing, with or without errors.
The status bar also gives the total number of reports/documents in the test, and the number that are Completed, Error, and Pending. Reports/documents that have timed out or are not supported are included in the Error count.
Column Display
- To change which columns are displayed in the Results Summary area, right-click any of the column headers. A list of columns opens, with the currently visible columns checked. Select a column that you want to show or hide.
- To reorder columns, click and drag the header of the column that you want to move to its new location.
- To sort the list of reports/documents by a column, click the header of that column. To reverse the order of the sort, click the header of that column again. For more information, see Sorting the results of a test.
The columns are listed and described below:
- Sequence: A number indicating the order in which the reports/documents are executed.
- Object: The type of object that is being tested:
- Grid: Grid report
- Graph: Graph report
- Grid/graph: Grid/graph report
- Document: Report Services document
- SQL: Report that has been saved as SQL
- Name (Base): The name of the base report/document.
- Path (Base): The location of the base report/document in the MicroStrategy project.
- Modification Time (Base): The date and time at which the base report/document was last modified.
- Name (Target): The name of the target report/document in a comparative integrity test. This column is hidden by default. To see this column, right-click anywhere in the column headers and select Name (Target).
- Path (Target): The location of the target report/document in the MicroStrategy project in a comparative integrity test. This column is hidden by default. To see this column, right-click anywhere in the column headers and select Path (Target).
- Modification Time (Target): The date and time at which the target report/document was last modified in a comparative integrity test. This column is hidden by default. To see this column, right-click anywhere in the column headers and select Modification Time (Target).
- Status: The execution status of the report/document. Double-clicking Status opens the Detail tab in the Report Data area. This tab contains details about the report/document execution, such as the cause of an error or the specific prompt that is not supported. The different statuses are as follows:
- Pending reports/documents have not yet begun to execute.
- Running reports/documents are in the process of executing.
In a performance test, this status appears as Running (#/#). The first number is the current execution cycle. The second number is the number of times that the report/document will be executed in the test.
- Paused (#/#) reports/documents, in a performance test, have executed some but not all of their specified number of cycles when the test execution is paused. The first number is the number of cycles that have been executed. The second number is the number of times that the report/document will be executed in the test.
- Completed reports/documents have finished their execution without errors.
- Timed Out reports/documents did not finish executing in the time specified in the Max Timeout field in the Select Execution Settings page. These reports/documents have been canceled by Integrity Manager and will not be executed during this run of the test.
- Error indicates that an error has prevented the report/document from executing correctly. To view the error, double-click the Status.
- Not Supported reports contain one or more prompts for which an answer could not be generated. These reports will not be executed during this run of the test. To view information about the prompt, double-click the report's status.
For more information about how Integrity Manager answers prompts, see Executing prompted reports in Integrity Manager.
- Personal Prompt Answer Name: The name of the personal prompt answer used in this report execution.
- SQL/MDX, Data, Graph, Excel, and PDF: In a comparative integrity test, these columns indicate whether or not the reports/documents from the two projects have identical results for the specified format:
- Matched indicates that the results from the two projects are identical for that report/document. If no query details are returned for a document/dashboard, it is marked as Matched.
- Not Matched indicates that a discrepancy exists between the two projects for that report/document. To view a comparison of the reports/documents from each project in the Report Data area, select the report/document.
- Not Compared indicates that Integrity Manager was unable to compare the reports/documents for that type of analysis. This may be because the report/document was not found in the target, because one or more prompts for the report are not supported by Integrity Manager, or because an error prevented one or both reports from executing.
- Not Available indicates that this type of analysis was not selected on the Select Processing Options page. If only the Graph column is marked as Not Available, the report has not been saved as a Graph or Grid/Graph.
- Custom Tags (base) and Custom Tags (target): These columns list any custom tags that have been applied to error messages in the report/document. For information about custom tags and error messages, see Sorting reports and documents by error messages.
- Base (Min), Base (Max), and Base (Avg): The minimum and maximum amount of time that a cycle took to execute the report/document in the base project, and the average execution time for all cycles in the base project.
- Target (Min), Target (Max), and Target (Avg): The minimum and maximum amount of time that a cycle took to execute the report/document in the target project, and the average execution time for all cycles in the target project.
- Difference (Min), Difference (Max), and Difference (Avg): The difference between the base report/document and target report/document in the minimum, maximum, and average execution times.
- Authentication Mode (base) and Authentication Mode (target): The authentication mode for the login used to run this report/document. For more information about running reports/documents with different users, see Executing a test under multiple MicroStrategy user accounts.
- Login (base) and Login (target): The login used to run this report/document.
Enhanced Integrity Manager Memory Usage and Performance
Integrity Manager can now handle larger reports, dashboards, and documents through improved data thresholds. The threshold values scale linearly with the maximum memory, the execution mode, and the number of concurrent jobs. For example, single project mode with 1 GB of memory and one concurrent job gives reports a threshold of 5,000,000, and documents and dashboards a threshold of 625,000. You can adjust these factors so that an object passes the memory check: increase the threshold by increasing the maximum memory or by decreasing the number of concurrent jobs. If you previously ran in PVP or BVP mode, try single project mode, as it doubles the threshold. A rough sketch of this scaling appears below.
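As a rough illustration of that scaling, the following sketch estimates a threshold from the three factors, anchored at the documented baseline (single project mode, 1 GB, one job). The exact formula Integrity Manager uses is not published; the class, method, and the linear formula itself are assumptions for illustration only.

```java
// Hypothetical estimate of Integrity Manager's data thresholds, assuming
// pure linear scaling from the documented baseline: single project mode,
// 1 GB of memory, 1 concurrent job -> 5,000,000 (reports) / 625,000
// (documents and dashboards). PVP/BVP mode halves the single-mode value.
public final class ThresholdEstimate {

    private static final long REPORT_BASE = 5_000_000L;  // baseline for reports
    private static final long DOCUMENT_BASE = 625_000L;  // baseline for documents/dashboards

    static long estimate(long base, double maxMemoryGb, int concurrentJobs, boolean singleMode) {
        double modeFactor = singleMode ? 1.0 : 0.5;      // single mode doubles PVP/BVP
        return (long) (base * maxMemoryGb * modeFactor / concurrentJobs);
    }

    public static void main(String[] args) {
        // 2 GB of memory, 4 concurrent jobs, PVP/BVP mode:
        System.out.println(estimate(REPORT_BASE, 2.0, 4, false));   // 1250000
        System.out.println(estimate(DOCUMENT_BASE, 2.0, 4, false)); // 156250
    }
}
```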
If an object does not pass the memory check, the execution thread stops. Because Integrity Manager disables incremental fetching for RW objects, at most 10,000 rows are collected at one time for each visualization. For the check, Integrity Manager uses the maximum memory returned by Runtime.getRuntime().maxMemory() rather than the value set by -Xmx; the maximum memory, in bytes, and the total data volume are printed in the Config.log for every object.
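For reference, the snippet below shows how the JVM reports that ceiling. It is a standalone illustration, not part of Integrity Manager; if you run it under the same -Xmx setting you give Integrity Manager, the printed value is the one the memory check would see, and it typically differs slightly from the raw -Xmx figure.

```java
// Prints the JVM's effective memory ceiling, i.e. the value that
// Runtime.getRuntime().maxMemory() returns under the current -Xmx setting.
public class MaxMemoryCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory(); // ceiling in bytes
        System.out.printf("Max memory: %d bytes (%.2f GB)%n",
                maxBytes, maxBytes / (1024.0 * 1024.0 * 1024.0));
    }
}
```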
Related Topics
The Main Integrity Manager Window