Alternative Method for Accessing SAS Framework Data Server Content

Back in 2011, I wrote about the all-new SAS Framework Data Server in 9.3 and how it needs to be running for certain web applications to function properly. It is actually used as the data storage mechanism for multiple SAS components, including WebDAV content, alerts, comments, and workflows. During configuration you can choose to use another RDBMS, but by default SAS uses a .FDB data structure stored as SAS-config-dir\Lev1\FrameworkServer\Content\SHAREDSERVICES.FDB.

So ... what if you have a default installation and need to access this data for other purposes? Think of a custom SAS Stored Process that lists all comments saved about a specific BI Dashboard report. On initial research, the only way to access this .FDB appeared to be a JDBC Data Source connection. But then I got really lucky, not enough to win Wednesday night's Powerball but still lucky! SAS has a hidden SASTS library engine. Included is the framework (geek joke intended) of the connection string:

libname fdsdb sasts server=<SAS-server-host> port=22031 protocol=bridge
   user='sasadm@saspw' password=<password>;

After updating the string with your connection information and running in SAS, you will then be able to review the data tables. Note that a better practice would be to copy the data tables into your personal area to avoid locking them in production.

proc datasets library=fdsdb nolist;
   copy out=work;
run;
quit;

It's probably good that I didn't win the Powerball lottery anyway, I would have a difficult time sharing with the two other winners. At least I can win by finding these wonderful tidbits and sharing them with you.

Sync everything up, at least between EG & metadata

From an Enterprise Guide user's perspective, a SAS library is a library. Whether it was defined in the autoexec, in the metadata, or by magic, it is there for them to use with no issues. However, metadata-defined libraries do not behave in the same manner. In some circumstances (say, when allowing users to create data in the library using metaout=data or assignmode=2), you either rely on the user to complete the second step of updating the metadata themselves or risk having everything out of sync. Changes to a table by an EG user will not be visible in metadata and, worse yet, could confuse a Web Report attempting to use the same table.
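For context, a metadata-defined library assigned with the METAOUT=DATA behavior looks something like this. This is a minimal sketch: the server host, credentials, and library name are placeholder assumptions, not values from a real deployment.

```sas
/* Connect to the metadata server (all values are placeholders) */
options metaserver="meta.example.com" metaport=8561
        metauser="sasdemo" metapass="password"
        metarepository="Foundation";

/* Assign a metadata-defined library with the META engine.
   METAOUT=DATA lets users create and update tables directly,
   which is exactly the scenario that can drift out of sync
   with the metadata definitions. */
libname mydata meta library="My Data Library" metaout=data;
```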

Might I suggest an alternative? Create PROC METALIB code to update all the libraries the user has access to change, and place it in the workspace server autoexec (SAS-config-dir\Lev1\SASApp\WorkspaceServer\). Included is a sample that loops through all the libraries, since each run of PROC METALIB must point to a single library.

/*retrieve all metadata library names*/
data metadata_libraries;
  length liburi $256 name $128;
  keep name;
  call missing(liburi,name);
  nlibobj=1;
  librc=metadata_getnobj("omsobj:SASLibrary?@Id contains '.'",nlibobj,liburi);
  do while (librc>0);
     /* Get the library name and output it */
     rc=metadata_getattr(liburi,'Name',name);
     output;
     /* Look for another library */
     nlibobj+1;
     librc=metadata_getnobj("omsobj:SASLibrary?@Id contains '.'",nlibobj,liburi);
  end; /* do while (librc>0) */
run;

/*create a macro for processing metadata updates*/
%macro reg(name);
proc metalib;
  omr (library="&name");
run;
%mend reg;

/*now run the loop via call execute*/
data _null_;
  set metadata_libraries;
  call execute('%reg('||trim(name)||');');
run;
Make sure the user has WriteMetadata access to the library location in the metadata folder structure. Also verify that they do not have WriteMetadata access to any libraries that should not be modified by the users. In this scenario, I have given all SASUSERS WriteMetadata access to everything in the Data folder.

Also, as with any autoexec file change, restart the object spawner to see the new code.

I had asked my SAS Admin to register the Best Movies data table into metadata. Typical of my enthusiasm for getting the task done, I realized that the year was included in the name and I really wanted them separated. Without calling the SAS Admin late in the evening, I was able to register the table automatically when I opened EG back up the next time.

Make a commitment and swim, bike, run with it

Over the past six months, my free time was devoted to training. This impacted my blog posting frequency but the training helped me successfully complete the inaugural Raleigh Ironman 70.3 in June. When thinking back through how this happened, I noticed many parallels to successfully implementing software for organizations.

  1. If you don't commit, it will not happen

    When you begin to ponder whether you should implement a software solution in your organization, you consider resources, costs, benefits, etc. This can be debated forever, but at some point you must make the commitment, buy the software and begin. The same holds true (for me at least) with exercise. If I didn't sign up for this event, the exercise time for my health would have been limited.
  2. Make progress

    There are debates on whether to implement reporting software using a top-down or bottom-up approach. Top-down is when you start with executive support, build top-level reports, and work down to operational-level detail. Bottom-up is building out quick wins at the lowest level, generating buy-in from more individuals in your organization, and then moving up to encapsulate everyone in the company. Either way, you have to make progress. Each day, move forward. Choose and complete a task: for the Ironman training I chose to either swim, bike, or run. For a reporting solution, build a new report, include new data, train a new user.
  3. Sharing progress made can increase success

    Each time you reach a goal, you should celebrate. This does not mean a party every day, but share implementation team successes. When completing a set of reports that reduces manual work, document the improved productivity gains. If a return on the investment is realized, let the team know and copy their managers. This helps generate excitement and buy-in to achieve greater success later.

    It seemed like bragging to share the completion of the Ironman 70.3 with others. But I achieved each goal: I finished, I did not injure myself in the process, and I was not absolutely last. Sharing this success with others has actually created some excitement. I have noticed several friends realizing that average people can do this race and considering doing something (a 5k, an open water swim, a half marathon, something) themselves!

  4. Continue forward

    You can never stop; the work never ends, it only changes. Your organization will continue to mature. Using reports to show current state will no longer be sufficient, and users will request forecasts and other reports that require advanced analytic techniques. You must continue to expand the solution to incorporate the evolving and growing needs.

    I haven't stopped: I have already signed up for next year's Raleigh Ironman 70.3. And for SAS, I have signed up to co-author with Stephen Overton to share our SAS OLAP tips and tricks. So check back as I post more about our OLAP writing progress over the coming six months!

Maintaining prompt selections in multiple Web Report Sections (v4.3)

SAS Technical Support has a wonderful how-to guide for using one set of prompt values across multiple sections of a Web Report Studio report. This works great in 4.2; however, there is one additional item that I discovered in a 4.3/9.3 install. With the addition of PROC STP, SAS Information Maps now use this new proc when calling the stored process, and the prompt values in a stored process are not forwarded automatically to the information map (or any other mechanism using PROC STP) without the OUTPUTPARM option. Since Information Map Studio creates the PROC STP code automatically using the metadata available for the stored process, you must define the output parameters in the prompt/parameter definition area.

For the SetDate parameter definition (step 7), you also need to select the 'New' button in the Output Parameters area at the bottom of the Parameters screen. Then create two output parameters for Year and Month.

  1. In Step 7 of the original instructions, note that at the bottom of the Parameter page is the button for creating a new output parameter. [Screenshot: Click New for Output Parameters]
  2. Select the New button twice and create two output parameters, one for Month and one for Year. [Screenshot: Enter parameter information for both Month and Year]
  3. The end result is that the bottom of the Parameter page looks like this: [Screenshot: Results from output parameter creation]

For the GetDate stored process, the original instructions said you could skip step 7, but in version 4.3 you will need to define the output parameters just as you did above so that these two values are passed.

You can see the result in the "Prdsale with Year Month Prompt" information map if you select Test, pick the Month Year Filter and choose 'View SQL'. You will then see the following code:

PROC STP program='/Reports/SetDate(StoredProcess)';
    outputParam month year;
run;

More detail on how PROC STP works is provided in my 2nd book The 50 Keys to Learning SAS Stored Processes.

Cleaner OLAP cube physical folder structures

Each spring and fall, I start clearing out the kids' small clothes, old toys, etc. It can sometimes go too far and reach into my "digital pack-rat" issues. (Wait, what does that mean? Well, just ask the person who last week requested details on a project I worked on 3 years ago. I keep everything, hence my self-coined term "digital pack-rat".) My OLAP folder was getting messy because when OLAP cubes are refreshed in place and users are still accessing the cube, a second generation of the cube is created.

For illustration purposes, I have included below a sample folder structure of a simple OLAP cube created from SASHELP.SHOES. The cube was created at 11:07am, then updated in place at 11:15am. But at 11:18am, when I ran the update-in-place code again, I was still viewing the cube in SAS Enterprise Guide. Therefore, since the gen0001 folder was locked for viewing, a gen0002 folder was created. This is the back-end magic that allows update in place to work correctly.

You cannot just delete the old generational folder (gen0000) when everyone is done using it. Per the SAS OLAP documentation on archiving and deleting prior generations, a cube's newer generations typically look through the prior generational folders. And cleaning up this folder isn't just about saving space or reducing a pack-rat's digital footprint: because each version of the cube remains completely viable, the cube can become less efficient over time.

To combine everything back together, you will need to run a full refresh of the cube during off-hours (which adds its own complexity), or use the COALESCE_AGGREGATIONS option in PROC OLAP as shown below.

   PROC OLAP
      CUBE                   = "/Projects/Cubes/Shoes"
      COALESCE_AGGREGATIONS;
   RUN;

Included is what my new clean (and "digital pack-rat" busting) folder looks like:

More information on how to update cubes in place using SAS code is available in my prior post on OLAP refresh best practices.

So many autoexecs; three tips for BI administrators

BASE SAS users are already familiar with the autoexec file. This is a .sas file that typically resides in the installation folder of the SAS executable. Instructions for setting it up in UNIX, Windows, and other environments are readily available on the SAS website. In SAS BI configurations, there are autoexec files scattered throughout as well, but which one(s) should be edited? Three tips:

  1. Avoid future conflicts: SAS recommends that, no matter where they are located, you should edit the "usermods" versions of these files to avoid conflicts when installing future upgrades/updates.
  2. Where do you need the information: You can preassign SAS libraries, set default options, point to macro and format locations, or create global macro variables with these autoexec files, just choose wisely where you need the information available.
  3. Precedence is important: There is also a main file for the entire SAS installation at C:\Program Files\SASHome\SASFoundation\9.3\. This is the granddaddy file that drives everything else. Of course, an option set here could be overwritten by an option downstream in another file, so understand the order of precedence.

Each autoexec location below is relative to SAS-config-dir\Lev1; next to each is the scope of its effect:

   \SASApp                        Affects all servers within the SASApp application server context
   \SASApp\BatchServer            Affects only BatchServer jobs (the SAS Schedule Server can execute jobs on the BatchServer; changes to this file only affect those jobs)
   \SASApp\ConnectServer          Only the Connect Server
   \SASApp\OLAPServer             Only the OLAP Server
   \SASApp\PooledWorkspaceServer  Only the Pooled Workspace Server
   \SASApp\StoredProcessServer    Only the Stored Process Server
   \SASApp\WorkspaceServer        Only the Workspace Server
   \SASMeta                       Affects all services within the SASMeta application server context (recommended for use by SAS Administrators managing SAS applications)
   \SASMeta\BatchServer           SASMeta Batch Server only
   \SASMeta\WorkspaceServer       SASMeta Workspace Server only
   \SchedulingServer              Scheduling Server only
   \ShareServer                   Share Server only
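To make the precedence point concrete, here is a minimal sketch. The option values and which files they sit in are illustrative assumptions, not recommendations:

```sas
/* Imagine this line sits in the installation-wide autoexec
   (the 'granddaddy' file), which runs first for every session: */
options compress=no obs=max;

/* ...and this line sits in the WorkspaceServer autoexec, which
   runs later and therefore wins for workspace server sessions: */
options compress=yes;

/* Net effect for a workspace session: compress=yes, obs=max.
   Any option not overridden downstream keeps its upstream value. */
```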

Verifying a SAS BI installation

After installing SAS Business Intelligence, I would also suggest verifying that all is working appropriately before adding users or importing content from another machine. (My template checklist is included below.) The act of creating the content is an extremely important test (to immediately rectify issues with accessing data, or saving to the SAS Content Server).

For installers who want to be super fast about this, a few items can be built in advance and run from BASE SAS or Enterprise Guide with relatively little tweaking in any environment. PROC INFOMAPS will create an information map for your use, and PROC OLAP creates OLAP cubes without going through the clicks of SAS OLAP Cube Studio. That can quickly set up the content needed to build/test the remaining items.

Sample OLAP Cube Code (v9.3)
Sample Proc INFOMAPS code
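As a flavor of what that pre-built verification code can look like, here is a minimal PROC INFOMAPS sketch. The metadata connection values, map name, folder path, and server name are placeholder assumptions for illustration:

```sas
/* Connect to metadata (placeholder values) and build a simple
   information map over SASHELP.SHOES for verification testing */
options metaserver="meta.example.com" metaport=8561
        metauser="sasadm@saspw" metapass="password";

proc infomaps;
   new infomap "Shoes Verification Map" mappath="/Shared Data/Maps";
   /* Register the table and add data items for all of its columns */
   insert datasource sasserver="SASApp"
          table="SASHELP"."SHOES" id="shoes" _all_;
   save;
quit;
```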

Checklist for Verification

   #  Interface                            Function
   1  RDC onto server (SAS Administrator)  Ability to stop and restart all SAS services
   2  SAS Management Console               Import tables from an access library (Oracle, SQL, etc.)
   3  SAS Information Map Studio           Create sample information map
   4  SAS Web Report Studio                Create and save sample report
   5  SAS OLAP Cube Studio                 Create sample OLAP cube
   6  SAS Enterprise Guide                 Open sample OLAP cube
   7  SAS Enterprise Guide                 Create sample stored process
   8  SAS Enterprise Guide                 Open SAS data table into project
   9  SAS Portal                           View sample stored process
  10  SAS Portal                           View sample Web Report Studio report
  11  SAS Portal                           Ability to change user preferences
  12  SAS BI Dashboard                     The BI Dashboard interface opens successfully
  13  SAS BI Dashboard                     Create indicator data from an information map
  14  SAS Add-in to MS Office (Excel)      Open sample OLAP cube in pivot table
      SAS Add-in to MS Office (Word)       Create sample graph report off data table
      SAS Add-in to MS Office (any)        Execute sample stored process
  15  SAS DI Studio                        Ability to view data and execute a sample ETL process
  16  SAS DI Studio                        Ability to schedule a task

Go on a date with SAS BI Dashboard, part 2

Last week I provided the steps for how to go on a date, at least by using dynamic prompts in BI Dashboard. Now that you have seen how to create them, let's discuss making them smarter. For this example, the customer is asking for two prompts, a start date and an end date, to affect a single chart. If the user selects an end date that is before the start date, the report will fail to generate results: for example, data starting in June 2012 but ending in October 2011. Or, in code terms, a nonsensical WHERE clause such as:

"01jun2012"d <= date <= "01oct2011"d.

Here is an example of the BI Dashboard smart dating in action. Notice that when June is selected as the start date, only October through July appear in the end-date list.

You too can create this to avoid possible user "errors". Included are the steps:

Step 1. Edit the stored process

In order to use the same stored process for both prompts, I will need to make some minor changes to the code.

%global sdate; /* globally define the new parameter sdate (start date) */

%macro dt;
data test;
  format date monyy5.;
  %do i=0 %to 11;
    date=intnx('month', today(), -&i);
    /* Test for sdate's existence using %length; if it exists, only
       output end dates greater than the selected start date.
       IMPORTANT TIP: date values are passed from BI Dashboard as
       text and resemble the mmddyy10. format. */
    %if %length(&sdate) > 0 %then %do;
      if date > input("&sdate", mmddyy10.) then output;
    %end;
    %else %do;
      output;
    %end;
  %end;
run;
%mend dt;
%dt

Everything else is the same. Refer back to the rest of the code in the Part 1 post.

Step 2. Add a prompt to the stored process metadata

This will be used in BI Dashboard to define the relationship between the indicators. The prompt is optional, as the code will work with or without a selected value, so I did not select the prompt option 'Requires a non-blank value'. As you see below, I named the prompt sdate to match &sdate in the code above.

Step 3. Create the second (end date) indicator

Since the data source itself does not change, I do not need to edit the indicator data object. A new indicator for the end date prompt needs to be created.

Extra TIP: Dynamic prompts assume that you will be leveraging a range initially. For these prompts, I select any range I have defined, but when editing the properties of the prompt I change the prompt type from 'Combo Box with gauge' to just a 'Combo Box'. Then all the range questions disappear.

Step 4. Define interactions between the start date and the end date indicators

In the dashboard edit screen in BI Dashboard, add both the start date and end date indicators.
Select the start date indicator and choose the 'Interactions' task off the menu bar.

Select the interaction type 'Filter data on remote server' and choose source data 'date' with target data 'sdate' - as seen below.

Go on a date with SAS BI Dashboard, part 1

Don't give up on the dating game. Although there is no prompt type specifically for dates, it can still be accomplished with some stored process matchmaking. BI Dashboard offers a dynamic prompt indicator, which provides the ultimate flexibility in developing prompt-based reports and dashboards.

The dynamic prompt indicator allows you to generate a drop down box of values dynamically from a specified indicator data element. In this post (part 1), I will explain how to create a drop down box of the last 12 months with no required monthly maintenance (adding months, running ETL scripts to populate a source table, etc). Next week, in Part 2 I will describe how to define interactions between multiple date indicators so your dates are smarter.

Step 1. Develop a stored process to create a data source for the indicator data

Included is the code to create a temporary data table (work.test) and publish it to a package, 'DynamicDateList', for BI Dashboard to pick up.
Note that this code dynamically creates a list of the last twelve months (including the current month) in descending order. Some customers require that a default selection be set up, reducing the number of clicks to get the most often requested report. In BI Dashboard, the dynamic prompt displays the first record in the table by default. For this example, the code creates months in descending order so the current month will appear as the default selection.

%macro dt;
data work.test;
  format date monyy5.;
  %do i=0 %to 11;
    date=intnx('month', today(), -&i);
    output;
  %end;
run;
%mend dt;
%dt

%let temploc = %sysfunc(pathname(work));
data _null_;
  length path $32767;
  rc = 0;
  pid = 0;
  description = '0';
  name = '';
  call package_begin(pid, description, name, rc);
  call insert_dataset(pid, "WORK", "test", "dynamic date list", '', rc);
  call package_publish(pid, "TO_ARCHIVE", rc,
       "archive_path, archive_name, archive_fullpath",
       "&temploc", "DynamicDateList", path);
  call symput('_ARCHIVE_FULLPATH', path);
  call package_end(pid, rc);
run;

Important Note: The stored process must be registered to generate a package result.

Step 2. Setup the indicator data

Choose 'Stored Process' as the data source and select the stored process created in Step 1. If the published data set name ('test' in this example) doesn't appear, there is an error in either the stored process or the metadata registration. To troubleshoot, I recommend moving to the SAS Stored Process web application (http://server:port/SASStoredProcess) to run the same stored process and review the log message there.

Step 3. Create a dynamic prompt indicator leveraging the indicator data

After saving the indicator data from Step 2, create a dynamic prompt indicator using this new indicator data. As you can see the default selection is October (Oct12), the first record in the test data table.

Check back next week on how to make the date prompts smarter in part 2!

And for another example of using a stored process to dynamically generate a data result for a dashboard indicator, check out my 2nd book The 50 Keys to Learning SAS Stored Processes.

Always learning something new, two awesome Enterprise Guide tricks

Who says anyone is an expert at something? I definitely do not claim to be when it comes to SAS (well, anything for that matter). Each time I sit through a presentation at a conference, I learn something new. During the SAS Western Users Conference, fondly known as WUSS, this month I learned two new and awesome tricks for using SAS Enterprise Guide.

Trick #1 - Jump through the log

Ok, ok, I should have known about this already, given that my co-author Tricia Aanderud posted about this trick in April. But as most of you know, it is difficult to keep up with everything out there.
Note in the example below that I have 13 pages of log that I would otherwise need to jump through to see the first issue. All WARNINGs and ERRORs are accessed via the up/down arrows.

Trick #2 - Create additional process flows to organize your work

When organizing projects, multiple process flows allow you to select areas of the project to run or schedule. I've had some awfully large SAS Enterprise Guide projects in my past. Using this trick would have saved me the manual time of selecting individual items that I wanted to run in sequence.

Select File > New > Process Flow to create additional process flows

In the example below, I have 3 process flows ('Initial Review', 'Analysis', and 'Report').

Multiple process flows allow you to run select areas of project

If you don't think you can learn something new, I challenge you to attend a regional SAS User Group conference this fall. Bet you will discover some new awesome idea or trick to implement back at your workplace!

  • About this blog

    Angela Hall manages a team of SAS Fraud Framework implementers within the SAS Solutions On-Demand organization. Angela is co-author of the books Building Business Intelligence Using SAS: Content Development Examples and The 50 Keys to Learning SAS Stored Processes.