SAS author’s tip: Bayesian analysis of item response theory models

This SAS tip comes from Clement A. Stone and Xiaowen Zhu, authors of Bayesian Analysis of Item Response Theory Models using SAS.

Item response theory (IRT) models are the models of choice for analyzing item responses from assessments in the educational, psychological, health, social, and behavioral sciences. SAS PROC MCMC can be used in all types of assessment applications to investigate how particular characteristics of items and of persons affect item performance. Use of the SAS system for Bayesian analysis of IRT models has several significant advantages over other available programs: (1) It is commonly used by researchers across disciplines; (2) it provides a robust programming language that extends the capability of the program—in particular, the capability for model checking; and (3) it shows increased performance and efficiency through the use of parallel processing.

Our book Bayesian Analysis of Item Response Theory Models using SAS provides step-by-step instructions for using SAS PROC MCMC to analyze various IRT models. Working through the examples in the book or with some prior knowledge of IRT models and Bayesian methods, you can…

Estimate simple as well as complex IRT models using PROC MCMC. It is a straightforward task in PROC MCMC to implement Bayesian estimation of a variety of simple and more complex IRT models. All you need to do is express the response probability function or likelihood for your particular model, declare the model parameters, and specify prior probability distributions for these parameters. PROC MCMC may be particularly useful for applications investigating multidimensionality or heterogeneity in item responses due to, for example, differential item functioning, content-related processes (shared context or word orientation), or response-related processes (solution strategies, response styles, response sets).
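As a hedged sketch of those three steps (the data set name RESP, the variable names, and the choice of a two-parameter logistic model are illustrative assumptions, not the book's exact code), a Bayesian 2PL model might look like:

```sas
/* Sketch only: Bayesian 2PL IRT model in PROC MCMC.
   Assumes a long-format data set RESP with one row per person-item
   response: PERSON (id), ITEM (1-10), and Y (0/1 item score). */
proc mcmc data=resp nmc=20000 nbi=5000 seed=802 outpost=post;
   array a[10] a1-a10;                    /* item discriminations */
   array b[10] b1-b10;                    /* item difficulties    */
   parms a1-a10 1;
   parms b1-b10 0;
   prior a1-a10 ~ lognormal(0, var=1);    /* keeps a[j] positive  */
   prior b1-b10 ~ normal(0, var=4);
   random theta ~ normal(0, var=1) subject=person;  /* person ability */
   p = logistic(a[item]*(theta - b[item]));   /* response probability */
   model y ~ binary(p);                       /* the likelihood       */
run;
```

The three required pieces map directly onto the statements: PARMS declares the item parameters, PRIOR (and the RANDOM statement for abilities) specifies their distributions, and the P= assignment plus the MODEL statement express the response probability and likelihood.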

Evaluate the estimation of the model. Because the Markov Chain Monte Carlo (MCMC) method is a simulation-based approach, you should determine whether the simulated draws have converged to the target posterior distributions for model parameters. PROC MCMC includes a number of tools and statistics for evaluating the convergence of the sampling process to the posterior distributions for model parameters. These include history and autocorrelation plots as well as various diagnostic tests and statistics: Gelman-Rubin, Geweke, Heidelberger-Welch (stationarity and half-width tests), Raftery-Lewis, and effective sample size.

Compare competing models and evaluate model fit. In many applications, different models may be estimated that reflect competing theoretical perspectives or competing formulations of the item and person characteristics that are modeled. PROC MCMC and the SAS system provide the tools for choosing among competing models. The Posterior Predictive Model Checking (PPMC) method is a commonly used Bayesian model checking tool and has proved useful for evaluating the fit of models. PPMC can be implemented using the robust programming language in the SAS system, and a variety of different plots can also be obtained to display results.

In conclusion, PROC MCMC makes estimating and model checking of IRT models in a Bayesian paradigm more accessible to researchers, scale developers, and measurement practitioners.

We hope you find this blog informative and invite you to read a free chapter from the book here.


Raiders of the lost spreadsheet

Have you ever peered intently into an unfamiliar data delivery directory, realized what was in it, rolled over onto your side, stared blankly into the distance, and dejectedly uttered something akin to:

"Spreadsheets! Why did it have to be spreadsheets?"

If so, then we are definitely on the same page. Why does it always have to be spreadsheets?

The answer to that question is actually pretty obvious when you think about it. The popularity of Microsoft Office has made Excel one of the most popular mediums for storing data. It is used extensively in grade schools, middle schools, high schools, and colleges. People with home businesses use it; office administrators use it; clerical staff use it; scientists use it; lawyers use it; hospital workers use it; Federal, state and local government workers use it; and programmers use it too.

An individual who needs to store data in electronic format and then process it may not have SAS, or C++, or Java, or C#, or Python, or PHP, or R, or MATLAB, or ColdFusion, or FOCUS, or FORTRAN, or Groovy, or JavaScript, or MOBY, or MUMPS, or NATURAL, or Perl, or PL/SQL, or PowerShell, or S-PLUS, or Visual Basic installed on his or her PC. But, that person will undoubtedly have Microsoft Office and thus have Excel. That is why it always has to be spreadsheets.

But, processing data stored in spreadsheets is not really a problem for intrepid SAS programmers. When I go on a data exploration expedition where there is a good chance of encountering spreadsheets, I pack the usual: my brown leather jacket, fedora, and bullwhip. But, most importantly, I put SAS/ACCESS Interface to PC Files into my backpack.

SAS/ACCESS Interface to PC Files is a SAS for Windows product that allows you to read, write, and update data in Excel and Access. As such, it is a must-have for your Windows SAS installation.

Here is an example of a program that I use to map out the contents of an unexplored spreadsheet:

ods rtf file="G:\BigProject\Worksheets in NewDataSpreadsheet.rtf";
 
libname xlslib "G:\NewProject\DeliveryDirectory\NewDataSpreadsheet.xlsx" access=readonly;
 
proc sql;
   create table WorkSheets as
   select distinct(compress(MEMNAME,"',$")) as WorkSheet_Name,
          name as ColumnName
   from dictionary.columns
   where libname = 'XLSLIB';
quit;
 
proc print noobs data=WorkSheets;
   var WorkSheet_Name ColumnName;
   title1 "Worksheets in NewDataSpreadsheet.xlsx";
run;
 
ods rtf close;

The ODS statement specifies that my report will be created as an RTF document. Because I have SAS/ACCESS Interface to PC Files, the LIBNAME statement allocates the NewDataSpreadsheet.xlsx spreadsheet much the same way as it would for a SAS data set. (Notice that I specified access=readonly so that I do not accidentally update the spreadsheet). Since I have "LIBNAME-d" the spreadsheet, information about its worksheets and column names is now available in the SAS Dictionary Tables.

I use PROC SQL to extract the name of each worksheet (variable WorkSheet_Name) in the Excel file and the names of the columns (variable ColumnName) within each worksheet, and then plop them into a SAS data set for further exploration. The code snippet compress(MEMNAME,"',$") gets rid of the annoying quotes and dollar signs that are found in spreadsheet MEMNAMEs. Then, I use the PRINT procedure to create a report. A simple, neat, quick, and easily macro-tized piece of code.
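Since the program is "easily macro-tized," here is one hedged sketch of what that wrapper might look like (the macro name and parameters are hypothetical, not from the original program):

```sas
/* Hypothetical wrapper: map any spreadsheet's worksheets and columns.
   XLSXFILE = full path to the .xlsx file; RPTFILE = full path for the RTF report. */
%macro map_spreadsheet(xlsxfile, rptfile);
   ods rtf file="&rptfile";
   libname xlslib "&xlsxfile" access=readonly;

   proc sql;
      create table WorkSheets as
      select distinct(compress(MEMNAME,"',$")) as WorkSheet_Name,
             name as ColumnName
      from dictionary.columns
      where libname = 'XLSLIB';
   quit;

   proc print noobs data=WorkSheets;
      var WorkSheet_Name ColumnName;
      title1 "Worksheets in &xlsxfile";
   run;

   libname xlslib clear;   /* release the lock on the spreadsheet */
   ods rtf close;
%mend map_spreadsheet;

/* Example call */
%map_spreadsheet(G:\NewProject\DeliveryDirectory\NewDataSpreadsheet.xlsx,
                 G:\BigProject\Worksheets in NewDataSpreadsheet.rtf);
```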

Here are several good references that you can use to find out more about processing spreadsheets with SAS:

Armed with those resources, some pluck, a sense of adventure, and with your own trusty copy of SAS/ACCESS Interface to PC Files, you too can be a raider of the lost spreadsheet!

Best of luck in all your SAS endeavors!


3 bestselling books at ENAR 2015 Spring Meeting

We had a lot of books at the ENAR 2015 Spring Meeting in Miami last week, but these were the top three bestsellers.

  1. Analysis of Observational Healthcare Data using SAS by Douglas E. Faries, Robert L. Obenchain, Josep Maria Haro, and Andrew C. Leon
  2. Survival Analysis Using SAS®: A Practical Guide, Second Edition by Paul D. Allison
  3. Bayesian Analysis of Item Response Theory Models Using SAS by Clement A. Stone and Xiaowen Zhu

I also met a young girl who’s ready to become our next author. Or maybe she just likes our buttons. Either way, you’re never too young to think about becoming a SAS Press author. If you have any publishing ideas, visit SAS Books to learn more.


If you were at the conference and picked up the card with the ENAR discount code, don’t forget to use it before April 1st.


I Know What You Did Last Summer!

I know what you did last summer.

If it was unintentional, then you probably don't know what I am talking about.  If it was intentional, then you probably thought that I would never find out.  Either way, the damage is done.  The actions that you took on that warm summer evening are as clear to me now as they would have been if I had been watching over your shoulder while you did them.  I know what you did last summer: You updated one of my SAS data sets.

We work on the same project  and both have read, write, update, and delete rights to the project's directories.  The production SAS data set that I created for the spring data delivery was inexplicably updated in the summer.  And, you were the one who did it.  Because we have been teammates for a while, I am giving you the benefit of the doubt.  I bet that you made a copy of the production SAS program for a different use, updated it, but forgot to change the LIBREF to point to your test SAS data library.  So when you ran it, you accidentally deleted 400 observations and updated 273 observations in the production data set.

Oh, you want to know how I determined it was you and how I know exactly what changed.

Well, because that production data set is very important, I used PROC DATASETS to create a SAS audit trail file for it.  SAS audit trails record changes to SAS data sets.  They can record the before and after image of observations that were changed, the type of change, the date/time of the change, and the userid of the person who changed the SAS data set.  So, SAS audit trails can be very useful in a shared directory environment where many staff members have access to important SAS data sets.

Here is the code I used to create the audit trail for the production SAS data set:

proc datasets library=prodlib nolist;
        audit SpringDeliveryData;
        initiate;
        log admin_image=yes
              before_image=yes
              data_image=no
              error_image=yes;
        run;
quit;

When I executed that DATASETS procedure code, SAS created a file named SpringDeliveryData.sas7baud in the same directory as the SAS data set.  When an observation is updated, added, or deleted from SpringDeliveryData, SAS writes an observation to the audit trail data set containing the variables in the original SAS data set and six specific audit trail variables.  Of note are _ATDATETIME_ which specifies the date/time of the change; _ATOPCODE_ which specifies the type of change that took place--e.g. add, delete, modify; and _ATUSERID_ which specifies the userid of the person whose SAS program made the change.

When I noticed that SpringDeliveryData had been modified, I used a PROC PRINT to dump the audit trail file.  That is how I know that the data set was updated at 5:27 PM on August 5th by a program submitted under your userid.
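For reference, dumping the audit trail can be as simple as reading the data set with the TYPE=AUDIT data set option; a minimal sketch, using the library and data set names from this example:

```sas
/* Read the audit trail that SAS maintains alongside the data set */
proc print data=prodlib.SpringDeliveryData(type=audit);
   var _ATDATETIME_ _ATOPCODE_ _ATUSERID_;
run;
```

Listing just the three audit variables shows when each change happened, what kind of change it was, and who made it.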

You are interested in using SAS audit trails for your own production SAS data sets?  Great!  You can find a comprehensive write-up in the documentation on support.sas.com at this link.

Don't fret about the updates to the SpringDeliveryData SAS data set.  I am going to request that our systems administrator restore the data set to the day before the summer update.  That way, we will have the original data set available in case our client has questions about it.

Good to know that I was right that you accidentally updated the production data set last summer.  Oh, don't go.  Unfortunately we have one more thing to talk about:

I know what you did last fall...

Best of luck in all your SAS endeavors!


SAS Press is heading to ENAR 2015 Spring Meeting

Are you heading to the ENAR 2015 Spring Meeting in Miami this week? SAS author and Program Chair Mithat Gönen, of Memorial Sloan-Kettering Cancer Center, and Associate Chair Brisa Sánchez, of the University of Michigan School of Public Health, have created an outstanding scientific program this year. The sessions cover a wide range of topics, such as data science (big data), genomics, clinical trials, neuroimaging, biomarkers, health policy, electronic health records, ecology, and epidemiology.

After whetting your appetite at some of these great sessions, come browse the SAS Press booth and find informative, up-to-date titles to further your knowledge, such as the SAS classics Analyzing Receiver Operating Characteristic Curves with SAS by Gönen and Analysis of Clinical Trials Using SAS by Dmitrienko et al., and preview new titles: Bayesian Analysis of Item Response Theory Models Using SAS by Clement Stone & Xiaowen Zhu and Time Series Modeling Using the SAS VARMAX Procedure by Anders Milhoj.

While we do have some great titles, I know we haven’t covered everything. Please stop by and have a quick chat with me. I am happy to discuss what we do have available, but perhaps there is a topic you would like to see covered that we don’t yet offer? Perhaps you have a topic you would love to write about?


Mobile devices in schools: Beyond “cool”

Doing things on the go – emailing, tweeting, listening to a podcast, finding a restaurant or ATM – is something we all do all the time, as adults. This instant access to the Internet and apps allows us to be more productive with our time and make better decisions, just using the little computer in our hand.

Schools are increasingly embracing mobile technology in the form of tablets and notebook computers, and this is changing the way we think about education and offering more opportunities for learning with mobile devices. And the benefits are real.

Giving kids instant access to the web and apps can make the world their classroom, teach them real world skills alongside the curriculum and enable a more collaborative environment for their work. And, access to tablets is a much more affordable and durable option than laptops, the previous model of 1:1 access. Also, personalized education is a big buzzword in the industry, and mobile devices offer an unprecedented level of individualization for learning tools. And perhaps the most important benefit for the students: tablets are inherently cool. The engagement factor is a huge perk, and any way to draw in students who aren't naturally engaged is a good thing.

As more schools adopt these devices, we are learning more and more about how they affect learning and what they make possible. Teachers are also learning as they go, and our new book, Mobile Learning: A Handbook for Developers, Educators and Learners, aims to provide a road map for effective conceptualization, implementation and use of these devices in education. It also provides details for developers on how to create quality educational content and navigate the unique marketplace for educational apps.

Teachers and developers working in collaboration is a key element of a strong mobile learning marketplace, which is no surprise, as we are part of just such an interdisciplinary team at Curriculum Pathways.

In our book, we advocate for the thoughtful incorporation of mobile devices, not simply retrofitting old lessons and adding an iPad, say, to replace a worksheet. Mobile devices can add a level of engagement and learning.

For instance, in a non-technical classroom, students might sit in class and learn about plant life cycles; the teacher would have handouts and maybe a presentation with pictures. However, using mobile devices, this lesson could involve students going outdoors, taking pictures of different plants and identifying the plant and its stage in the life cycle, and maybe even creating and sharing online an ebook to explain. Adding mobile devices, in this example, adds a level of creativity, physical activity, collaboration and engagement that isn't attainable through a classroom lecture and worksheets.

We see examples like this everywhere: “there’s just something about these devices” is one of the most common sentiments we heard from the teachers we interviewed in writing this book. We aimed to break down what makes them special to kids and promising to educators, using educational theory and pedagogy.

There are certainly controversies and drawbacks, all of which are addressed in this book as well – data privacy, screen time limits and digital citizenship, to name a few. However, we see the incorporation of mobile devices into education as an exciting and promising development, and when done responsibly and intelligently, a way to teach all students in a way that more closely mirrors “real life.”  Our book presents a practical look at the latest in learning technologies. From theory to practice, Mobile Learning is a great start for any school or development team looking to get in the mobile learning realm.

 

To learn more, check out Mobile Learning: A Handbook for Developers, Educators, and Learners by Scott McQuiggan, Lucy Kosturko, Jamie McQuiggan and Jennifer Sabourin.


Is your personal language sabotaging you?

As an executive coach, I've worked with thousands of managers and business leaders whose personal language sabotaged their effectiveness at driving change, not to mention their day-to-day team management.

For your Inner Leader to shine through, you need to master your personal language--your way of communicating your company’s goals and how your team needs to work to meet them.

Meet “Howard,” a vice president in a prestigious global professional services firm.  Howard came to me because he was increasingly frustrated and stressed by what he thought was his team’s inability to follow his instructions. After several conversations, however, Howard realized that his own communication issues were decreasing his effectiveness as a leader. To change, Howard needed to master his personal language by learning to ask open-ended questions and truly listen to the response.

The simple rule for business professionals is to ask open-ended questions that start with the word what. What is factual, and why is emotional. By asking questions that begin with what, your listeners will think and begin to draw their own conclusions. And if you ask these kinds of questions, you will be much more successful in relating your requests to your team.

Howard agreed to complete the following exercise: For every meeting/phone call/interaction, he prepared six to ten what questions. He asked his questions and really listened to the answers, rather than reverting to a “tell” posture and continuing to ask more questions.

Here are some examples of “what questions” that you could use:

  • What does success of this (project, meeting, presentation) look like?
  • What are the top-three steps to achieve success?
  • What about your idea/direction will contribute to the success of this (project, meeting, presentation)?
  • What is the ideal outcome?
  • What would that ideal outcome look like?
  • What is the permanent solution here?
  • What do you want to happen?
  • What is the truth here?

Howard diligently followed this concept of asking what questions, and in six months his frustration and stress level had decreased significantly. His personal language became razor precise, and he is delighted at how his staff and colleagues have responded. Howard’s Inner Leader is smiling.

By recognizing that his personal communication style was sabotaging his effectiveness and incorporating this one change to the way that he communicated with his team, Howard became the kind of leader a competitive, success-driven organization needs.


Excuse me; but, is that a 32-bit or a 64-bit SAS catalog?

I don’t know about you, but I get pretty determined to prove them wrong when people tell me that I cannot do something. I am not talking about fantastical things such as flying through the heart of the sun and out the other side without getting burned. Nor, am I talking about social things like becoming president of the United States or an author on the New York Times Bestseller list. And, I am not talking about physical things such as swimming 2.4 miles, biking 112 miles, and running 26.2 miles back-to-back on the same day. No, I am talking about being told that I cannot do something with SAS.

For example, I was once told:

  • that you could not summarize impossibly large SAS data sets to load a data warehouse. So, I figured out a way to do it.
  • that you could not measure the performance of SAS/IntrNet application programs. So, I figured out a way to do it.
  • that you could not determine which SAS products individual staff members were using on shared servers. So, I figured out a way to do it.
  • that you could not create a chargeback system for UNIX and Linux systems without purchasing an accounting package. So, I figured out a way to do it.

Consequently, when I was told that there was no SAS facility for programmatically determining whether a Windows SAS catalog was a 32-bit catalog or a 64-bit catalog, I resolved to figure out a way to do it.

The background is that my organization plans to migrate from 32-bit SAS to 64-bit SAS as part of a SAS 9.3 to SAS 9.4 upgrade. SAS data sets are compatible between the two bitages, but SAS catalogs are not. Stating the problem: you cannot open a 64-bit SAS catalog with 32-bit SAS. So, it is advantageous to have a tool for determining which SAS catalog is which bitage as you move forward into a mixed-bit programming environment during the transition.

I did my due diligence and researched every place that I thought I might be able to find a way to differentiate the bitage. An indicator in PROC CATALOG if I ran it with the STAT option enabled? Nope. Something in the directory portion of a PROC CONTENTS listing with the DETAILS option specified? Nope. A lesser-known option of PROC DATASETS? Nope. How about a flag in the Dictionary Tables CATALOGS table or in the SASHELP Views VCATALG view? Nope. A Usage Note on support.sas.com? Nope. A SAS technical paper published at either SAS Global Forum or a Regional SAS Users Group? Nope, not that either.

I figured that if you could not tell the difference within SAS, itself, how about if you looked at the catalogs as simply files. So, I got a 32-bit SAS catalog and a 64-bit SAS catalog and opened them with WordPad to take a look inside. Bingo! There was enough information in the very first record of both catalog files to determine the difference. So, I wrote a program that tested for the string of characters that told the tale.

Here is the SAS program that I wrote:

/* Macro to determine bitage of a SAS catalog */
%MACRO Test_Cat_Bitage(SASCatalog);
   filename sascat "&SASCatalog";
   data decompcat(keep=CAT_BITS SAS_Catalog);
      length CAT_BITS $8
             SAS_Catalog $50;
      infile sascat obs=1 truncover;
      input bigline $charzb32767.;
      if index(bigline, "W32_7PRO") > 0 then CAT_BITS = "W32_7PRO";
      else if index(bigline, "X64_7PRO") > 0 then CAT_BITS = "X64_7PRO";
      else CAT_BITS = "Unknown ";
      SAS_Catalog = strip("&SASCatalog");
      label CAT_BITS = "Bitage of SAS Catalog"
            SAS_Catalog = "Full Path SAS Catalog Name";
   run;
   proc append base=AllCatalogs
               data=decompcat;
   run;
%MEND Test_Cat_Bitage;

/* Example of executing the macro to read a catalog file */
%Test_Cat_Bitage(c:\temp\gender.sas7bcat);

As you can see, the program determines the bitage of a SAS catalog by treating the catalog as a file, not as a catalog. It opens the catalog file and inspects the first line for a specific character string: W32_7PRO for 32-bit catalogs; X64_7PRO for 64-bit catalogs. Once it determines the bitage, the program writes an observation to data set AllCatalogs in the WORK library. Each observation in AllCatalogs has two variables: CAT_BITS, which specifies whether the catalog is 32 or 64 bits, and SAS_Catalog, which is the full path name of the SAS catalog file.

The object of this particular setup is to run the macro against several, a score, dozens, hundreds, or thousands of SAS catalogs and build a SAS data set which identifies their bitage. After that, one may choose to copy AllCatalogs to a permanent SAS data set, or create a report from it. Or both.
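One hedged way to feed the macro a whole directory of catalogs is to pipe the Windows dir command into a DATA step (the directory path below is an assumption for illustration):

```sas
/* Hypothetical driver: call %Test_Cat_Bitage for every catalog in a directory */
filename catlist pipe 'dir /b "G:\SASLibraries\*.sas7bcat"';

data _null_;
   infile catlist truncover;
   input catname $256.;
   /* %nrstr delays macro execution until the generated code runs */
   call execute('%nrstr(%Test_Cat_Bitage)(G:\SASLibraries\' ||
                strip(catname) || ')');
run;
```

Each generated call appends one observation to AllCatalogs, so when the step finishes you have the bitage inventory for the whole directory.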

Being a talented SAS programmer yourself, I would bet that you also do not like it when people tell you that you cannot do something with SAS. Right? Yeah, it goes with the territory. How about posting a comment telling us about a particularly difficult SAS problem you encountered and the clever way that you resolved it? Bet you can’t do that.

Best of luck in all your SAS endeavors!


Flip your intro stats course using "Practical Data Analysis with JMP"

Helping students to reason statistically is challenging enough without also having to provide in-class software instruction. “Practical Data Analysis with JMP, Second Edition” walks students through the process of analysis with JMP at their own speed at home, allowing faculty to devote class time to crucial or subtle statistical concepts and to addressing student-generated questions.

Consider, for example, the topic of descriptive bivariate co-variation. As a professor, what does Robert Carver want his students to understand (as opposed to memorize)? He focuses on the following “big ideas”:

  • Many statistical investigations ask if variables X and Y have a relationship
  • Two univariate graphs are not informative about how X and Y vary together
  • The proper investigative technique depends on the data types involved
  • With two variables, there are just three choices: 2 categorical, 2 numerical, or 1 of each
  • Start with graphs. What kinds of patterns are typical when X & Y are/are not associated?
  • We can distill some graphical patterns numerically.

Chapter 4 of Carver’s book is devoted to descriptive analysis of two variables at a time. Using two different data tables and posing realistic problems, the chapter leads students through the foundational concepts of bivariate description. It then provides step-by-step software instruction. By working through the steps, students find solutions to the motivating problems.

Instructors new to JMP 11 may want to lean heavily on Graph Builder to review the concepts during class, and then solidify students’ new-found skills by having them examine additional pairs of variables from the sample data tables. At the end of the chapter, you’ll find eight more scenarios suitable for homework exercises.

The final topic in the chapter, “More Informative Scatter Plots,” introduces JMP’s Bubble Plot and revisits an earlier chapter example, this time adding dimensions to the bivariate plot (see pages 78-79).

For more information on practical data analysis with JMP, grab Robert Carver’s latest book as an instrument to build your JMP knowledge and empower yourself to reason statistically. You can also read an excerpt from Carver’s book.


SAS author’s tip: A “most common” macro error

Robert Virgile

This week’s author tip is from Robert Virgile and his book SAS Macro Language Magic: Discovering Advanced Techniques.  Virgile chose this tip because even good programmers make errors.

We hope you find this tip useful. You can also read an excerpt from Virgile’s book.

Even good programmers make errors.  In fact, when it comes to macro language, good programmers often make the same error:  failing to define macro variables as %local.

Here is an all-too-common example:

%macro doses;
   %do i=1 %to 3;
      dose&i date&i
   %end;
%mend doses;

This straightforward macro generates a list of variable names.  For example, PROC PRINT could call it:

proc print data=patients;
   var patient %doses;
run;

The intention is to complete the VAR statement:

proc print data=patients;
   var patient dose1 date1 dose2 date2 dose3 date3;
run;

The macro works, and the result is correct.  So where is the error?  Why should this macro add the following statement just before the %DO loop?

%local i;

Danger abounds!  But to understand why, you need to understand when and where the software creates &i.  To begin, consider this test:

%let i=5;
%macro test;
   %local i;
   %let i=2;
%mend test;
%test
%put i is now &i..;

When %LET appears outside of a macro definition, it creates &i in the global symbol table.  When %TEST executes, the %LOCAL statement creates &i in the local symbol table for %TEST.  So there are two macro variables, both named &i.

When the interior %LET statement executes, which &i receives a value of 2?  The interior %LET statement triggers a search for a macro variable named &i, and uses the first one it finds.  The search process begins in the local symbol table, which does in fact contain &i.  So the local &i receives a value of 2.

When the macro finishes executing, its local symbol table vanishes.  The %PUT statement writes:

i is now 5.

The only remaining macro variable named &i is the one in the global symbol table, which still has a value of 5.

Next, consider a similar example that removes the %LOCAL statement:

%let i=5;
%macro test;
   %let i=2;
%mend test;
%test
%put i is now &i..;

The process doesn’t change, but the result does.  The %PUT statement writes:

i is now 2.

The key step that reverses the outcome is the interior %LET statement.  As before, it triggers a search for an existing macro variable named &i.  It searches unsuccessfully in the local symbol table, so it continues the search in the global symbol table.  Finding &i there, that &i receives a value of 2.

While the scenarios can become more complex (when %A calls %B, and %B calls %C, and %C calls %D), the search process remains the same.  When a program references a macro variable, the software searches for an existing variable with that name.  The search proceeds from the most interior symbol table outward, ending with the global symbol table if necessary.  The software uses the first macro variable found that has the proper name.  If none of the symbol tables contains the macro variable, the software creates it.  Where?  That is another story for another day.  The search process tells us all we need to know to understand why the %LOCAL statement is important.

The missing %LOCAL statement causes trouble when you write a macro that calls %DOSES.  Here is a simple example:

%macro your_macro;
   %local i;
   %do i=1 %to 3;
      proc print data=patient_pool&i;
         var patient %doses;
      run;
   %end;
%mend your_macro;

The intent is to print three data sets.  The result is quite different, even though you were quite careful to add your own %LOCAL statement.

When %YOUR_MACRO executes, the %LOCAL statement creates &i in its local symbol table.  When %DOSES looks for an existing macro variable named &i, it follows the standard procedure:

  • Search the local %DOSES symbol table (unsuccessfully, because %DOSES is missing the %LOCAL statement).
  • Search the local %YOUR_MACRO symbol table (successfully).
  • Use &i in the local %YOUR_MACRO symbol table.

Since %DOSES increments &i in %YOUR_MACRO’s symbol table, %YOUR_MACRO prints only one data set.  After that, &i exceeds the upper %DO loop limit of 3, and %YOUR_MACRO finishes.  No error messages or warning messages appear … but only one data set prints.

With a slight change, the problem gets worse:

%macro your_macro;
   %local i;
   %do i=1 %to 8;
      proc print data=patient_pool&i;
         var patient %doses;
      run;
   %end;
%mend your_macro;

The same search process applies, with %DOSES using &i from the %YOUR_MACRO symbol table.  This means that the program is an infinite loop!  Each time %DOSES executes, it resets &i so that it never reaches a value of 8.

The bottom line:  define your local variables using a %LOCAL statement.  If you don’t, you can suffer wrong results, infinite loops, and a reputation for writing macros that are dangerous.
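Applied to the example that opened this tip, the fix is a single %LOCAL statement inside %DOSES:

```sas
%macro doses;
   %local i;            /* &i now lives in %DOSES's own symbol table */
   %do i=1 %to 3;
      dose&i date&i
   %end;
%mend doses;
```

With &i made local, %DOSES can no longer clobber a same-named macro variable in any calling macro's symbol table or in the global table.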

For more information about the macro language and the magic you can create with it, check out Robert Virgile’s book SAS Macro Language Magic: Discovering Advanced Techniques.
