What to do when all your boss wants is a spreadsheet

Most SAS programmers have been here. Someone just wants a handful of numbers to add to a graph or PowerPoint presentation that is due tomorrow. You have the data files, you have a job to summarize them, and you have a dilemma: how do I get my data where the boss wants it, into Excel?

Transferring data between SAS and Microsoft Excel may be easier than you think.

I do not know how many times I have “Googled” something and gotten a cryptic answer that was marginally effective or even useless. You know that something somewhere will tell you how to do this, but where is it? Then you remember that the company that wrote the software has information online that will tell you how to do everything that your software can perform. But if you do not know the name of the procedure to use, how do you find the documentation about it?

My new book, Exchanging Data between SAS and Microsoft Excel: Tips and Techniques to Transfer and Manage Data More Efficiently, is designed to help solve that problem by distilling information from the SAS manuals and my personal experience into a single volume that shows you how to transfer data between SAS and Excel. In this first article on the subject, I will show you a simple way to transfer data to Excel with very little effort on your part. It is done with a “Right Click” of your mouse.

When viewing your SAS datasets in the SAS Explorer window, each dataset is represented by an icon. A “Right Click” on the dataset icon brings up a menu with an option entitled “View in Excel”. Selecting this option creates an HTML file that Excel can open and use to view the data. In fact, SAS actually invokes Excel to open the HTML output file so you can use the data in Excel. The file will typically have a name similar to “#LNxxxxxx.xls”. The three-byte extension (xls) allows versions of Excel prior to 2007 to open the file without hesitation. The newer versions check the contents of the file, and if the file name ends in .xls but the file contains HTML- or XML-formatted commands for Excel, a message is displayed asking you to verify that you want to proceed.  Select “Yes” and Excel opens and your data appears.
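If you prefer a programmatic route to the same effect, a minimal sketch is to route ODS HTML output to a file with an .xls extension, which Excel will then open. (The path, file name, and choice of SASHELP.CLASS here are mine, for illustration only.)

```sas
/* A sketch: create an HTML file with an .xls extension that Excel can open. */
/* The path and data set used here are hypothetical examples.                */
ods html file="C:\temp\class.xls";

proc print data=sashelp.class noobs;
run;

ods html close;
```

As with the right-click method, Excel 2007 and later will warn that the contents do not match the extension before opening the file.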

SAS author’s tip: Macro language timing is everything

This SAS tip is from Robert Virgile and his book “SAS Macro Language Magic: Discovering Advanced Techniques”.

We hope you find this tip useful. You can also read an excerpt from Virgile’s book.

In macro language, as in life, timing is everything.  Macro language students need to learn the timing of the DATA step, the timing of macro language, and the relationship between the two.

Let’s begin with the DATA step.  All DATA steps operate in two separate phases:

  1. The compilation phase. In a nutshell, the software checks the syntax of the DATA step statements, and sets up storage space in memory to hold each variable.
  2. The execution phase. Given that there are no syntax errors, the software executes the DATA step … reading data, performing calculations, outputting results.

Macro language statements may have an impact on step 1, the compilation phase.  The resolution of macro variables affects the statements within the DATA step:

%let dataset=MALES;
data &dataset;
   set everyone;
   if gender='M';
run;

During the compilation phase of the DATA step, &DATASET resolves into MALES.  Therefore, the name of the output data set becomes MALES.  However, macro language statements impact only the compilation phase, not the execution phase of the DATA step.  This concept forms a frequent stumbling block when learning macro language.  To illustrate, consider this DATA step (before the programmer complicated it by adding macro language):

data males females;
   set everyone;
   if gender='M' then output MALES;
   else if gender='F' then output FEMALES;
run;

Perhaps the programmer was trying to learn macro language, and using this as an experiment.  Perhaps the programmer sought job security.  But the simple DATA step above morphed into this nonworking version:

data males females;
   set everyone;
   if gender='M' then do;
      %let dataset=MALES;
   end;
   else if gender='F' then do;
      %let dataset=FEMALES;
   end;
   output &dataset;
run;

Mistakenly, the programmer believed that %LET statements could execute as part of the DATA step.  That is just never true.  %LET statements execute immediately … in this case before the compilation phase of the DATA step completes.  So the order of execution of these statements is:

%let dataset=MALES;
%let dataset=FEMALES;
   set everyone;
   if gender='M' then do;
   end;
   else if gender='F' then do;
   end;
   output FEMALES;

Clearly, the program revisions alter the outcome, forcing every observation into a single data set.  Remember these basics:

  • %LET statements are never part of a DATA step. Macro language statements execute immediately, and do not wait for the DATA step to begin executing.
  • If you need to control macro variables (either assigning or retrieving a value) while the DATA step executes, tools exist. But they are DATA step tools, not macro language tools.  The primary ones, CALL SYMPUT and SYMGET, will become the subject of a future article.

Let’s consider another example that both illustrates timing and illustrates a basic use of CALL SYMPUT.  Once again, improper use of macro language complicates the program.  Here is the original version, without macro language:

data percentages;
   do until (last.state);
      set cities;
      by state;
      state_pop + city_pop;
   end;
   do until (last.state);
      set cities;
      by state;
      percent_pop = city_pop / state_pop;
      output;
   end;
run;

For each STATE:

  • The top DO loop computes STATE_POP (the total population for the STATE).
  • The bottom DO loop reads the same observations, computes PERCENT_POP for each, and outputs the result.

Now a macro language student might attempt a slightly different, nonworking variation:

data percentages;
   do until (last.state);
      set cities;
      by state;
      state_pop + city_pop;
   end;
   call symputx ('denom', state_pop);
   do until (last.state);
      set cities;
      by state;
      percent_pop = city_pop / &denom;
      output;
   end;
run;

Bad timing is the critical issue:

  • Before the DATA step runs, &DENOM does not exist.
  • The software doesn’t begin to run the DATA step until it encounters the RUN statement.
  • By that time, the reference to &DENOM has already been encountered, generating an error.

There are many ways to introduce timing errors.  The remedy begins with understanding the relationship between macro language statements, DATA step compilation, and DATA step execution.  Most importantly, macro language statements execute immediately, and are never part of DATA step execution.
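To see the timing work in your favor, consider a sketch of my own (not from the book) that simplifies the problem to each city's share of the overall population. CALL SYMPUTX assigns the macro variable while the first step executes; the second step is not compiled until after the first step has finished, so the macro variable exists by the time it is referenced:

```sas
/* A sketch (not from the book): the first step creates &TOTAL during  */
/* execution; the second step is compiled afterward, so &TOTAL resolves. */
data _null_;
   set cities end=done;
   total_pop + city_pop;
   if done then call symputx('total', total_pop);
run;

data percentages;
   set cities;
   percent_pop = city_pop / &total;
run;
```

The per-state version requires DATA step tools such as SYMGET to retrieve the value during execution, which is exactly the subject the author defers to a later discussion.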

For more information about the macro language and the magic you can create with it, check out Robert Virgile’s book “SAS Macro Language Magic: Discovering Advanced Techniques”.

SAS author’s tip: Bayesian analysis of item response theory models

This SAS tip comes from Clement A. Stone and Xiaowen Zhu, authors of Bayesian Analysis of Item Response Theory Models using SAS.

Item response theory (IRT) models are the models of choice for analyzing item responses from assessments in the educational, psychological, health, social, and behavioral sciences. SAS PROC MCMC can be used in all types of assessment applications to investigate how particular characteristics of items and of persons affect item performance. Use of the SAS system for Bayesian analysis of IRT models has several significant advantages over other available programs: (1) It is commonly used by researchers across disciplines; (2) it provides a robust programming language that extends the capability of the program—in particular, the capability for model checking; and (3) it shows increased performance and efficiency through the use of parallel processing.

Our book Bayesian Analysis of Item Response Theory Models using SAS provides step-by-step instructions for using SAS PROC MCMC to analyze various IRT models. Working through the examples in the book or with some prior knowledge of IRT models and Bayesian methods, you can…

Estimate simple as well as complex IRT models using PROC MCMC. It is a straightforward task in PROC MCMC to implement Bayesian estimation of a variety of simple and more complex IRT models. All you need to do is express the response probability function or likelihood for your particular model, declare the model parameters, and specify prior probability distributions for these parameters. PROC MCMC may be particularly useful for applications investigating multidimensionality or heterogeneity in item responses due to, for example, differential item functioning, content related processes (shared context or word orientation), or response related processes (solution strategies, response styles, response sets).
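As a rough illustration of those three ingredients (likelihood, parameter declarations, priors), here is a sketch of a two-parameter logistic (2PL) model. This is not code from the book; the data set ITEMRESP (one row per person-item response, with variables PERSON, ITEM, and Y), the five-item setup, and the priors are all hypothetical choices of mine:

```sas
/* A sketch (not from the book) of a 2PL IRT model in PROC MCMC.      */
/* ITEMRESP, PERSON, ITEM, Y, and all priors are hypothetical.        */
proc mcmc data=itemresp nbi=5000 nmc=20000 seed=1234;
   array a[5];                                 /* item discriminations */
   array b[5];                                 /* item difficulties    */
   parms a: 1 b: 0;
   prior a: ~ lognormal(0, var=1);             /* keeps discriminations positive */
   prior b: ~ normal(0, var=1);
   random theta ~ normal(0, var=1) subject=person;   /* person ability */
   p = logistic(a[item] * (theta - b[item]));  /* response probability */
   model y ~ binary(p);
run;
```

The pattern generalizes: swapping in a different response probability function and priors is what lets the same procedure estimate the more complex models the book covers.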

Evaluate the estimation of the model. Because the Markov Chain Monte Carlo (MCMC) method is a simulation-based approach, you should determine whether the simulated draws have converged to the target posterior distributions for model parameters. PROC MCMC includes a number of tools and statistics for evaluating the convergence of the sampling process to the posterior distributions for model parameters. These include history and autocorrelation plots as well as various diagnostic tests and statistics: Gelman-Rubin, Geweke, Heidelberger-Welch (stationary and half-width tests), Raftery-Lewis, and effective sample size.

Compare competing models and evaluate model fit. In many applications, different models may be estimated that reflect competing theoretical perspectives or competing formulations of the item and person characteristics that are modeled. PROC MCMC and the SAS system provide the tools for choosing among competing models. The Posterior Predictive Model Checking (PPMC) method is a commonly used Bayesian model checking tool and has proved useful for evaluating the fit of models. PPMC can be implemented using the robust programming language in the SAS system, and a variety of different plots can also be obtained to display results.

In conclusion, PROC MCMC makes estimating and model checking of IRT models in a Bayesian paradigm more accessible to researchers, scale developers, and measurement practitioners.

We hope you find this blog informative and invite you to read a free chapter from the book here.

Raiders of the lost spreadsheet

Have you ever peered intently into an unfamiliar data delivery directory, realized what was in it, rolled over onto your side, stared blankly into the distance, and dejectedly uttered something akin to:

"Spreadsheets! Why did it have to be spreadsheets?"

If so, then we are definitely on the same page. Why does it always have to be spreadsheets?

The answer to that question is actually pretty obvious when you think about it. The popularity of Microsoft Office has made Excel one of the most popular mediums for storing data. It is used extensively in grade schools, middle schools, high schools, and colleges. People with home businesses use it; office administrators use it; clerical staff use it; scientists use it; lawyers use it; hospital workers use it; Federal, state and local government workers use it; and programmers use it too.

An individual who needs to store data in electronic format and then process it may not have SAS, or C++, or Java, or C#, or Python, or PHP, or R, or MATLAB, or ColdFusion, or FOCUS, or FORTRAN, or Groovy, or JavaScript, or MOBY, or MUMPS, or NATURAL, or Perl, or PL/SQL, or PowerShell, or S-PLUS, or Visual Basic installed on his or her PC. But, that person will undoubtedly have Microsoft Office and thus have Excel. That is why it always has to be spreadsheets.

But, processing data stored in spreadsheets is not really a problem for intrepid SAS programmers. When I go on a data exploration expedition where there is a good chance of encountering spreadsheets, I pack the usual: my brown leather jacket, fedora, and bullwhip. But, most importantly, I put SAS/Access Interface to PC Files into my backpack.

SAS/Access Interface to PC Files is a SAS for Windows product that allows you to read, write, and update data in Excel and Access. As such, it is a must-have for your Windows SAS installation.

Here is an example of a program that I use to map out the contents of an unexplored spreadsheet:

ods rtf file="G:\BigProject\Worksheets in NewDataSpreadsheet.rtf";
libname xlslib "G:\NewProject\DeliveryDirectory\NewDataSpreadsheet.xlsx" access=readonly;
proc sql;
create table WorkSheets as
select distinct(compress(MEMNAME,"',$")) as WorkSheet_Name,
name as ColumnName
from dictionary.columns
where libname = 'XLSLIB';
quit;
proc print noobs data=WorkSheets;
var WorkSheet_Name ColumnName;
title1 "Worksheets in NewDataSpreadsheet.xlsx";
run;
ods rtf close;

The ODS statement specifies that my report will be created as an RTF document. Because I have SAS/Access Interface to PC Files, the LIBNAME statement allocates the NewDataSpreadsheet.xlsx spreadsheet much the same way as it would a SAS data set. (Notice that I specified access=readonly so that I do not accidentally update the spreadsheet.) Since I have "LIBNAME-d" the spreadsheet, information about its worksheets and column names is now available in the SAS Dictionary Tables.

I use PROC SQL to extract the name of each worksheet (variable WorkSheet_Name) in the Excel file and the names of the columns (variable ColumnName) within each worksheet, and then plop them into a SAS data set for further exploration. The code snippet compress(MEMNAME,"',$") gets rid of the annoying quotes and dollar signs that are found in spreadsheet MEMNAMEs. Then, I use the PRINT procedure to create a report. A simple, neat, quick, and easily macro-tized piece of code.

There are several good references that you can use to find out more about processing spreadsheets with SAS.

Armed with those resources, some pluck, a sense of adventure, and with your own trusty copy of SAS/Access Interface to PC Files, you too can be a raider of the lost spreadsheet!

Best of luck in all your SAS endeavors!

3 bestselling books at ENAR 2015 Spring Meeting

We had a lot of books at the ENAR 2015 Spring Meeting in Miami last week, but these were the top three bestsellers.

  1. Analysis of Observational Healthcare Data using SAS by Douglas E. Faries, Robert L. Obenchain, Josep Maria Haro, and Andrew C. Leon
  2. Survival Analysis Using SAS®: A Practical Guide, Second Edition by Paul D. Allison
  3. Bayesian Analysis of Item Response Theory Models Using SAS by Clement A. Stone and Xiaowen Zhu

I also met a young girl who’s ready to become our next author. Or maybe she just likes our buttons. Either way, you’re never too young to think about becoming a SAS Press author. If you have any publishing ideas, visit SAS Books to learn more.


If you were at the conference and picked up the card with the ENAR discount code, don’t forget to use it before April 1st.

I Know What You Did Last Summer!

I know what you did last summer.

If it was unintentional, then you probably don't know what I am talking about.  If it was intentional, then you probably thought that I would never find out.  Either way, the damage is done.  The actions that you took on that warm summer evening are as clear to me now as they would have been if I had been watching over your shoulder while you did them.  I know what you did last summer: You updated one of my SAS data sets.

We work on the same project  and both have read, write, update, and delete rights to the project's directories.  The production SAS data set that I created for the spring data delivery was inexplicably updated in the summer.  And, you were the one who did it.  Because we have been teammates for a while, I am giving you the benefit of the doubt.  I bet that you made a copy of the production SAS program for a different use, updated it, but forgot to change the LIBREF to point to your test SAS data library.  So when you ran it, you accidentally deleted 400 observations and updated 273 observations in the production data set.

Oh, you want to know how I determined it was you and how I know exactly what changed.

Well, because that production data set is very important, I used PROC DATASETS to create a SAS audit trail file for it.  SAS audit trails record changes to SAS data sets.  They can record the before and after image of observations that were changed, the type of change, the date/time of the change, and the userid of the person who changed the SAS data set.  So, SAS audit trails can be very useful in a shared directory environment where many staff members have access to important SAS data sets.

Here is the code I used to create the audit trail for the production SAS data set:

proc datasets library=prodlib nolist;
        audit SpringDeliveryData;
        initiate;
        log admin_image=yes;
run;
quit;
When I executed that DATASETS procedure code, SAS created a file named SpringDeliveryData.sas7baud in the same directory as the SAS data set.  When an observation is updated, added, or deleted from SpringDeliveryData, SAS writes an observation to the audit trail data set containing the variables in the original SAS data set and six specific audit trail variables.  Of note are _ATDATETIME_ which specifies the date/time of the change; _ATOPCODE_ which specifies the type of change that took place--e.g. add, delete, modify; and _ATUSERID_ which specifies the userid of the person whose SAS program made the change.

When I noticed that SpringDeliveryData had been modified, I used a PROC PRINT to dump the audit trail file.  That is how I know that the data set was updated at 5:27 PM on August 5th by a program submitted under your userid.
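That dump can be as simple as pointing PROC PRINT at the audited data set with the TYPE=AUDITTRAIL data set option. A sketch, using the library and member names from the example above:

```sas
/* A sketch: read the audit trail by naming the audited data set */
/* with the TYPE=AUDITTRAIL data set option.                     */
proc print data=prodlib.SpringDeliveryData(type=audittrail);
   var _ATDATETIME_ _ATOPCODE_ _ATUSERID_;
   format _ATDATETIME_ datetime20.;
   title1 "Audit Trail for SpringDeliveryData";
run;
```

Each row of the listing shows one change, when it happened, what kind of change it was, and who made it.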

Are you interested in using SAS audit trails for your own production SAS data sets?  Great!  You can find a comprehensive write-up in the documentation on support.sas.com.

Don't fret about the updates to the SpringDeliveryData SAS data set.  I am going to request that our systems administrator restore the data set to the day before the summer update.  That way, we will have the original data set available in case our client has questions about it.

Good to know that I was right that you accidentally updated the production data set last summer.  Oh, don't go.  Unfortunately we have one more thing to talk about:

I know what you did last fall...

Best of luck in all your SAS endeavors!

SAS Press is heading to ENAR 2015 Spring Meeting

Are you heading to the ENAR 2015 Spring Meeting in Miami this week? SAS author and Program Chair Mithat Gönen, of Memorial Sloan-Kettering Cancer Center, and Associate Chair Brisa Sánchez, of the University of Michigan School of Public Health, have created an outstanding scientific program this year. The sessions cover a wide range of topics such as data science (big data), genomics, clinical trials, neuroimaging, biomarkers, health policy, electronic health records, ecology, and epidemiology.

After whetting your appetite at some of these great sessions, come browse the SAS Press booth and find informative, up-to-date titles to further your knowledge, such as the SAS classics Analyzing Receiver Operating Characteristic Curves with SAS by Gönen and Analysis of Clinical Trials Using SAS by Dmitrienko et al., and preview new titles: Bayesian Analysis of Item Response Theory Models Using SAS by Clement Stone and Xiaowen Zhu and Time Series Modeling Using the SAS VARMAX Procedure by Anders Milhoj.

While we do have some great titles, I know we haven’t covered everything. Please stop by and have a quick chat with me. While I am happy to discuss what we do have available - perhaps there is a topic you would like to see covered but we don’t have? Perhaps you have a topic you would love to write about?

Mobile devices in schools: Beyond “cool”

The ability to do things on the go – email, tweet, listen to a podcast, find a restaurant or ATM – is something we all rely on all the time, as adults. This instant access to the Internet and apps allows us to be more productive with our time and make better decisions, just using the little computer in our hand.

Schools are increasingly embracing mobile technology in the form of tablets and notebook computers, and this is changing the way we think about education and offering more opportunities for learning with mobile devices. And the benefits are real.

Giving kids instant access to the web and apps can make the world their classroom, teach them real world skills alongside the curriculum and enable a more collaborative environment for their work. And, access to tablets is a much more affordable and durable option than laptops, the previous model of 1:1 access. Also, personalized education is a big buzzword in the industry, and mobile devices offer an unprecedented level of individualization for learning tools. And perhaps the most important benefit for the students: tablets are inherently cool. The engagement factor is a huge perk, and any way to draw in students who aren't naturally engaged is a good thing.

As more schools adopt these devices, we are learning more and more about how they affect learning and what possibilities they open up. Teachers are also learning as they go, and our new book, Mobile Learning: A Handbook for Developers, Educators and Learners, aims to provide a road map for effective conceptualization, implementation and use of these devices in education. It also shows developers how to create quality educational content and navigate the unique marketplace for educational apps.

Teachers and developers working in collaboration is a key element of a strong mobile learning marketplace, which is no surprise, as we are part of just such an interdisciplinary team at Curriculum Pathways.

In our book, we advocate for the thoughtful incorporation of mobile devices, not simply retrofitting old lessons and adding an iPad, say, to replace a worksheet. Mobile devices can add a level of engagement and learning.

For instance, in a non-technical classroom, students might sit in class and learn about plant life cycles; the teacher would have handouts and maybe a presentation with pictures. However, using mobile devices, this lesson could involve students going outdoors, taking pictures of different plants and identifying the plant and its stage in the life cycle, and maybe even creating and sharing online an ebook to explain. Adding mobile devices, in this example, adds a level of creativity, physical activity, collaboration and engagement that isn't attainable through a classroom lecture and worksheets.

We see examples like this everywhere: “there’s just something about these devices” is one of the most common sentiments we heard from the teachers we interviewed in writing this book. We aimed to break down what makes them special to kids and promising to educators, using educational theory and pedagogy.

There are certainly controversies and drawbacks, all of which are addressed in this book as well – data privacy, screen time limits and digital citizenship, to name a few. However, we see the incorporation of mobile devices into education as an exciting and promising development, and when done responsibly and intelligently, a way to teach all students in a way that more closely mirrors “real life.”  Our book presents a practical look at the latest in learning technologies. From theory to practice, Mobile Learning is a great start for any school or development team looking to get into the mobile learning realm.


To learn more, check out Mobile Learning: A Handbook for Developers, Educators, and Learners by Scott McQuiggan, Lucy Kosturko, Jamie McQuiggan and Jennifer Sabourin.

Is your personal language sabotaging you?

As an executive coach, I've worked with thousands of managers and business leaders whose personal language sabotaged their effectiveness at driving change, not to mention their day-to-day team management.

For your Inner Leader to shine through, you need to master your personal language--your way of communicating your company’s goals and how your team needs to work to meet them.

Meet “Howard,” a vice president in a prestigious global professional services firm.  Howard came to me because he was increasingly frustrated and stressed by what he thought was his team’s inability to follow his instructions. After several conversations, however, Howard realized that his own communication issues were decreasing his effectiveness as a leader. To change, Howard needed to master his personal language by learning to ask open-ended questions and truly listen to the response.

The simple rule for business professionals is to ask open-ended questions that start with the word what. What is factual, and why is emotional. By asking questions that begin with what, your listeners will think and begin to draw their own conclusions. And if you ask these kinds of questions, you will be much more successful in relating your requests to your team.

Howard agreed to complete the following exercise: for every meeting, phone call, or interaction, he prepared six to ten what questions. He asked his questions and really listened to the answers, rather than reverting to a “tell” posture or simply firing off more questions.

Here are some examples of “what questions” that you could use:

  • What does success of this (project, meeting, presentation) look like?
  • What are the top-three steps to achieve success?
  • What about your idea/direction will contribute to the success of this (project, meeting, presentation)?
  • What is the ideal outcome?
  • What would that ideal outcome look like?
  • What is the permanent solution here?
  • What do you want to happen?
  • What is the truth here?

Howard diligently followed this concept of asking what questions, and in six months his frustration and stress level had decreased significantly. His personal language became razor precise, and he is delighted at how his staff and colleagues have responded. Howard’s Inner Leader is smiling.

By recognizing that his personal communication style was sabotaging his effectiveness and incorporating this one change to the way that he communicated with his team, Howard became the kind of leader a competitive, success-driven organization needs.

Excuse me; but, is that a 32-bit or a 64-bit SAS catalog?

I don’t know about you, but I get pretty determined to prove them wrong when people tell me that I cannot do something. I am not talking about fantastical things such as flying through the heart of the sun and out the other side without getting burned. Nor am I talking about social things like becoming president of the United States or an author on the New York Times Bestseller list. And, I am not talking about physical things such as swimming 2.4 miles, biking 112 miles, and running 26.2 miles back-to-back on the same day. No, I am talking about being told that I cannot do something with SAS.

For example, I was once told:

  • that you could not summarize impossibly large SAS data sets to load a data warehouse. So, I figured out a way to do it.
  • that you could not measure the performance of SAS/IntrNet application programs. So, I figured out a way to do it.
  • that you could not determine which SAS products individual staff members were using on shared servers. So, I figured out a way to do it.
  • that you could not create a chargeback system for UNIX and Linux systems without purchasing an accounting package. So, I figured out a way to do it.

Consequently, when I was told that there was no SAS facility for programmatically determining whether a Windows SAS catalog was a 32-bit catalog or a 64-bit catalog, I resolved to figure out a way to do it.

The background is that my organization plans to migrate from 32-bit SAS to 64-bit SAS as part of a SAS 9.3 to SAS 9.4 upgrade. SAS data sets are compatible between the two bitages, but SAS catalogs are not. Stating the problem: you cannot open a 64-bit SAS catalog with 32-bit SAS. So, it is advantageous to have a tool for determining which SAS catalog is which bitage as you move forward into a mixed-bit programming environment during the transition.

I did my due diligence and researched every place that I thought I might be able to find a way to differentiate the bitage. An indicator in PROC CATALOG if I ran it with the STAT option enabled? Nope. Something in the directory portion of a PROC CONTENTS listing with the DETAILS option specified? Nope. A lesser-known option of PROC DATASETS? Nope. How about a flag in the Dictionary Tables CATALOGS table or in the SASHELP Views VCATALG view? Nope. A Usage Note on support.sas.com? Nope. A SAS technical paper published at either SAS Global Forum or a Regional SAS Users Group? Nope, not that either.

I figured that if you could not tell the difference within SAS itself, how about looking at the catalogs simply as files? So, I got a 32-bit SAS catalog and a 64-bit SAS catalog and opened them with WordPad to take a look inside. Bingo! There was enough information in the very first record of both catalog files to determine the difference. So, I wrote a program that tested for the string of characters that told the tale.

Here is the SAS program that I wrote:

/* Macro to determine bitage of a SAS catalog */
%MACRO Test_Cat_Bitage(SASCatalog);
filename sascat "&SASCatalog";
data decompcat(keep=CAT_BITS SAS_Catalog);
length CAT_BITS $8
       SAS_Catalog $50;
infile sascat obs=1 truncover;
input bigline $charzb32767. ;
if index(bigline, "W32_7PRO") > 0 then CAT_BITS = "W32_7PRO";
else if index(bigline, "X64_7PRO") > 0 then CAT_BITS = "X64_7PRO";
else CAT_BITS = "Unknown ";
SAS_Catalog = strip("&SASCatalog");
label CAT_BITS = "Bitage of SAS Catalog"
      SAS_Catalog = "Full Path SAS Catalog Name";
run;
proc append base=AllCatalogs data=decompcat;
run;
%MEND Test_Cat_Bitage;
/* Example of executing the macro to read a catalog file (hypothetical path) */
%Test_Cat_Bitage(C:\MyCatalogs\formats.sas7bcat)

As you can see, the program determines the bitage of a SAS catalog by treating the catalog as a file, not as a catalog. It opens the catalog file and inspects the first line for a specific character string: W32_7PRO for 32-bit catalogs; X64_7PRO for 64-bit catalogs. Once it determines the bitage, the program writes an observation to data set AllCatalogs in the WORK library. Each observation in AllCatalogs has two variables: CAT_BITS, which specifies whether the catalog is 32 or 64 bits, and SAS_Catalog, which is the full path name of the SAS catalog file.

The object of this particular setup is to run the macro against several, a score, dozens, hundreds, or thousands of SAS catalogs and build a SAS data set which identifies their bitage. After that, one may choose to copy AllCatalogs to a permanent SAS data set, or create a report from it. Or both.
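Those follow-up steps might be sketched along these lines (the PERMLIB libref is a hypothetical name of mine):

```sas
/* A sketch: save the accumulated results permanently, then report them. */
/* The PERMLIB libref is hypothetical.                                   */
data permlib.AllCatalogs;
   set AllCatalogs;
run;

proc print noobs data=AllCatalogs;
   var CAT_BITS SAS_Catalog;
   title1 "Bitage of SAS Catalogs";
run;
```

Any catalog flagged as "Unknown" in the report is worth a second look before the migration.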

Being a talented SAS programmer yourself, I would bet that you also do not like it when people tell you that you cannot do something with SAS. Right? Yea, it goes with the territory. How about posting a comment telling us about a particularly difficult SAS problem you encountered and the clever way that you resolved it? Bet you can’t do that.

Best of luck in all your SAS endeavors!
