Testing recommendations for SAS BI Dashboard & SAS Web Report Studio

With limited time and budget, you can still thoroughly test SAS reports built in SAS BI Dashboard and SAS Web Report Studio by considering how all the tools were used to build the end report. Understanding the functionality at each step of the chain from raw data to final report helps you build test cases that effectively verify accuracy without clicking every single button on the final report.

  1. Raw data to summarized data table
    This step is often not considered at all, but the ETL scripts written to create the summarized data are worth at least a peer code review. Another option is to use SAS Enterprise Guide to generate PROC SUMMARY or PROC FREQ results that compare the raw data to the summarized table. The point here is that instead of verifying that the data in each web report is correct, you verify that the numbers in the summary table itself are accurate.
  2. Metadata and Information Maps
    For subsequent steps in creating the data source, metadata should be confirmed (especially SAS formats, checking for any improper truncation of values), and custom columns in SAS Information Maps should be tested. Prompts and filters built in SAS Information Map Studio can be verified at this point. To get the most value from the time spent, the tester needs to understand how each prompt or filter was built. I recommend more testing when multiple filters are combined (expressed in Boolean AND/OR logic), and error-insert testing for the simple prompts/filters (one variable used in the selection). Error inserting means entering prompted values that you know will NOT return results, to ensure that the end user either cannot enter incorrect values or that the results are presented in the way the end user has requested or is expecting.
  3. BI Dashboard and Web Reports
    Understanding the design of these reports is critical for the tester. What content uses which information maps, what ranges are leveraged in which indicators, and where interactions exist between indicators are all required to build successful test cases. If any measures are defined in the web report itself, these should be verified as accurate - but all other data elements pulled straight from the table or information map should already have been validated in one of the two sections above.
  4. Links between BI Dashboard and Web Reports
    Beyond the design of the individual reports, the interaction between the tools needs to be verified. Note that testing the linkage with one combination of prompted values is typically enough to verify that the functionality works as expected. If different data sources are used to create the dashboard and the Web Report Studio prompt, it is much more efficient to verify that the data values are identical using PROC FREQ or PROC SQL than to click your way through tens or even hundreds of possible values in the user interface.
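As a sketch of the first recommendation, you can independently re-summarize the raw data and let PROC COMPARE flag any differences against the ETL-produced table. The library references, data set names, and columns here (rawlib.sales, dmlib.sales_summary, region, product, revenue) are hypothetical placeholders for your own:

```sas
/* Independently re-summarize the raw data (hypothetical names). */
proc summary data=rawlib.sales nway;
   class region product;
   var revenue;
   output out=work.check_summary (drop=_type_ _freq_) sum=revenue;
run;

/* Compare the independent summary against the ETL-produced table.   */
/* Both data sets must be sorted by the ID variables; CRITERION      */
/* allows a small tolerance for rounding differences.                */
proc compare base=dmlib.sales_summary compare=work.check_summary
             criterion=0.0001;
   id region product;
run;
```

Any mismatched rows or values in the PROC COMPARE output point you back to the ETL logic before a single report is ever opened.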
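For the fourth recommendation, a quick way to confirm that two data sources carry identical prompt values is a PROC SQL EXCEPT query, which returns only the values present in one source but missing from the other. Again, the data set and column names below are hypothetical:

```sas
/* Values in the dashboard source missing from the     */
/* Web Report Studio source (hypothetical names) ...   */
proc sql;
   select distinct region from dashlib.dash_source
   except
   select distinct region from wrslib.wrs_source;

   /* ... and the reverse direction. */
   select distinct region from wrslib.wrs_source
   except
   select distinct region from dashlib.dash_source;
quit;
```

If both queries return zero rows, the prompt values match and there is no need to click through each one in the user interface.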

Of course, the most important thing is to do SOME testing of your reports before handing them off to your user community. The last thing anyone wants is an email from an executive questioning the accuracy or complaining that the report simply doesn't work. Losing their trust will just make your job harder in the future.

What other recommendations do you have for testing SAS BI Dashboard and SAS Web Report Studio reports? Please share so we can all produce accurate & functional results!

tags: BI Dashboard, Web Report Studio

One Comment

  1. Posted August 8, 2012 at 2:02 pm | Permalink

    Related to point #3: the biggest area of concern in the projects I've worked on surrounds how the data is subset or grouped in the web report, i.e. the grand total numbers may add up properly in the same report or in other more summarized reports, but when looking at subsets or breakdowns of categorical data, the numbers do NOT add up. I've seen these errors come from either the misuse of a filter OR an incorrect expression defined in the information map (relating back to point #2).

    One way I try to mitigate confusion in production reporting is to encourage report writers to document the purpose of the report and any special considerations taken into account for filtering. This documentation is useful in either the report header or footer.
