Tag: JMP 10

Omne trium perfectum: JMP add-in for WinBUGS

There's a bit of Latin that states "omne trium perfectum" or "everything that comes in threes is perfect." I had not set out to write three posts in a row on Markov Chain Monte Carlo (MCMC), but sometimes the stars align in such a way that the story continues to

The well-appointed analytic workbench

What do I mean by “analytic workbench?” Basically, it is the computing environment in which data analysis takes place. How would you describe some of the analytic workbenches in your organization? Not everyone is a power analyst, so not everyone requires power tools. But all of us deal with data at some

Celebrating George Box and Box-Behnken designs

As part of the International Year of Statistics, the JMP Blog is honoring influential statisticians each month. Professor George E.P. Box is the honoree for May. Last week, I wrote about the first part of his two-part paper with J. Stuart Hunter on the family of regular two-level fractional factorial

Predictive modelling returns to the UK

We have had such a favourable response to our seminar on Building Better Models that we held our third one on 10 April, with nearly 100 people attending. It's become a global success, having been delivered 20 times throughout the world. The seminar is based on George Box's concept that

Using JMP to evaluate MCMC diagnostics

It’s no secret that JMP excels in the visual exploration of data. There’s a healthy dose of statistics, too. But when asked about Bayesian methods, JMP is probably not the first software package that comes to mind. JMP 10 does contain Bayesian D-optimal and I-optimal designs in our design of experiments (DOE) features,

New shapes in JMP: Microtitre plates

The last couple of versions of JMP have extended data visualization in so many ways and made it easier to create these graphics, too. One recent addition to Graph Builder is shapes. JMP is installed with a set of shape files for the geographic boundaries of world countries, states and

Get discount on JMP training

JMP customers can now receive a 25% discount when pre-purchasing training in bulk. A minimum purchase of $6,000 will buy 8,000 SAS Training Points. When you are ready to purchase training, just remember that 1 point = $1 worth of JMP training (some exclusions apply). This is a great opportunity

Partitioning a quadratic in JMP

At a recent Building Better Models seminar, someone asked me, “If you have a factor that has a curved relationship with a response, can a decision tree model be used to model that relationship?” To show that this is indeed possible, I created a simple simulated data set with 200
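The idea behind that post can be sketched outside JMP as well. Below is a minimal Python illustration (an assumption on my part, not the Partition platform itself): it simulates 200 points with a quadratic signal, then finds the single cut point that most reduces the sum of squared errors, which is exactly the step a decision tree repeats recursively to approximate a curved relationship.

```python
import numpy as np

# Simulated data, loosely mirroring the post: 200 points with a
# quadratic signal plus noise. All names here are illustrative.
rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 200)
y = x**2 + rng.normal(0, 0.5, 200)

def best_split(x, y):
    """Return the cut point minimizing total SSE of the two group means."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_cut, best_sse = None, np.inf
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if sse < best_sse:
            best_cut, best_sse = (xs[i - 1] + xs[i]) / 2, sse
    return best_cut, best_sse

cut, sse = best_split(x, y)
total_sse = ((y - y.mean())**2).sum()
print(f"first split at x = {cut:.2f}; SSE drops from {total_sse:.1f} to {sse:.1f}")
```

Applying the same search again within each side of the cut yields the piecewise-constant steps a tree uses to trace the parabola.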

2 great minds in 1 quality webcast

They say great minds think alike. We’ve also heard that two heads are better than one. Well, in the case of husband and wife Brenda and José Ramírez, two great minds have combined to develop and teach highly successful, modern quality practices. Deploying traditional techniques such as those developed by

PCA and illustrative variables add-in for JMP

Principal Component Analysis (PCA) is a traditional method in data analysis and, more specifically, in multivariate analysis. PCA was developed by Karl Pearson in 1901. The goal of PCA is to reduce the dimensionality in a set of correlated variables into a smaller set of uncorrelated variables that explain the majority
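For readers who want to see the mechanics behind that dimensionality reduction, here is a small self-contained sketch in Python (my own illustration, not the add-in's code): PCA computed by eigendecomposition of the correlation matrix, where a few correlated variables collapse into one dominant uncorrelated component.

```python
import numpy as np

# Four variables built from one shared signal plus noise, so they are
# strongly correlated. Data and names are made up for illustration.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
X = np.hstack([base + rng.normal(0, 0.3, (100, 1)) for _ in range(4)])

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable
corr = np.corrcoef(Z, rowvar=False)        # correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)    # eigh returns ascending order
order = np.argsort(eigvals)[::-1]          # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Z @ eigvecs                       # uncorrelated component scores
explained = eigvals / eigvals.sum()        # proportion of variance explained
print("variance explained by each component:", np.round(explained, 3))
```

Because the four inputs share one underlying signal, the first principal component captures most of the variance, which is the "smaller set of uncorrelated variables" the post describes.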
