The DO Loop
Statistical programming in SAS with an emphasis on SAS/IML programs
A recent article describes the main features of simulating data by using the Synthetic Minority Over-sampling Technique (SMOTE). SMOTE was created to oversample from a set of rare events prior to running a machine learning classification algorithm. However, at its heart, the SMOTE algorithm (Chawla et al., 2002) provides a way

The Synthetic Minority Over-sampling Technique (SMOTE) was created to address class-imbalance problems in machine learning algorithms. The idea is to oversample from the rare events prior to running a machine learning classification algorithm. However, at its heart, the SMOTE algorithm (Chawla et al., 2002) is essentially a way to simulate
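The core SMOTE step described above is simple to sketch: pick a minority-class point, pick one of its k nearest minority-class neighbors, and generate a synthetic point by random linear interpolation between the two. The following is a minimal Python sketch of that idea (the blog's own examples are in SAS/IML); the function name `smote_sample` and its parameters are illustrative, not an established API.

```python
import numpy as np

def smote_sample(X, n_new, k=5, rng=None):
    """Minimal SMOTE sketch (Chawla et al., 2002): create n_new synthetic
    points by interpolating between random minority-class points and one
    of their k nearest minority-class neighbors."""
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    n = len(X)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)            # a point is not its own neighbor
    nbrs = np.argsort(d, axis=1)[:, :k]    # k nearest neighbors of each point
    base = rng.integers(0, n, size=n_new)                  # random base points
    pick = nbrs[base, rng.integers(0, k, size=n_new)]      # a random neighbor of each
    t = rng.random((n_new, 1))             # interpolation fraction in [0, 1]
    return X[base] + t * (X[pick] - X[base])

# Example: oversample a tiny 2-D minority class of four points
minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synth = smote_sample(minority, n_new=10, k=3, rng=0)
print(synth.shape)  # (10, 2)
```

Because each synthetic point lies on a segment between two existing minority points, the new data stay inside the convex hull of the minority class, which is why SMOTE can be viewed as a simulation technique.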

SAS programmers love to brag that SAS will still run a program they wrote twenty or forty years ago. This is both a blessing and a curse. It's a blessing because it frees the statistical programmer from needing to revisit and rewrite code that was written long ago. It's a

Isaac Newton had many amazing scientific and mathematical accomplishments. His law of universal gravitation and his creation of calculus are at the top of the list! But in the field of numerical analysis, "Newton's Method" was a groundbreaking advancement for finding a root of a smooth nonlinear function. The
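For readers who want the classic iteration in front of them: Newton's method repeatedly replaces the function by its tangent line and solves that, giving the update x ← x − f(x)/f′(x). Here is a minimal Python sketch (the blog's own examples are in SAS/IML); the function name `newton` and its parameters are illustrative.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method for a root: iterate x <- x - f(x)/f'(x)
    until |f(x)| < tol or the iteration budget runs out."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / fprime(x)          # solve the tangent-line approximation
    raise RuntimeError("Newton's method did not converge")

# Example: sqrt(2) is the positive root of f(x) = x^2 - 2
root = newton(lambda x: x*x - 2, lambda x: 2*x, x0=1.5)
print(round(root, 6))  # 1.414214
```

Near a simple root, the iteration converges quadratically: the number of correct digits roughly doubles at each step.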

Newton's method was in the news this week. Not the well-known linear method for finding roots, but a more complicated method for finding minima, sometimes called the method of successive parabolic approximations. Newton's parabolic method was recently improved by modern researchers who extended the method to use higher-degree polynomial approximations. The
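One way to read "successive parabolic approximations": at each iterate, replace f by its second-order Taylor expansion (a parabola) and jump to that parabola's vertex, giving the update x ← x − f′(x)/f″(x). The sketch below is a minimal Python illustration under that reading, not necessarily the article's implementation; the name `newton_minimize` and the test function are illustrative.

```python
def newton_minimize(fprime, fsecond, x0, tol=1e-10, max_iter=50):
    """Newton's method for a minimum: minimize the parabola that matches
    f, f', f'' at the current point, i.e. x <- x - f'(x)/f''(x)."""
    x = x0
    for _ in range(max_iter):
        step = fprime(x) / fsecond(x)   # vertex of the local parabola
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")

# Example: a local minimum of f(x) = x^4 - 3x^2 + x near x = -1.3
f1 = lambda x: 4*x**3 - 6*x + 1   # f'(x)
f2 = lambda x: 12*x**2 - 6        # f''(x)
xmin = newton_minimize(f1, f2, x0=-1.5)
print(round(xmin, 4))
```

The caveat that motivates the recent research: this update only distinguishes minima from maxima through f″, so it needs a starting point where the local parabola is convex (f″ > 0) to head toward a minimum.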

Nearly every statistician has heard the aphorism, "All models are wrong, but some are useful." The quote is attributed to George Box, an early and influential thinker about statistics. Did George Box actually say this? Yes, he did. The first part of the quote ("All models are wrong") appeared