Welcome to the continuation of my series Getting Started with Python Integration to SAS Viya. Given the exciting developments around SAS and Snowflake, I'm eager to demonstrate how to connect Snowflake to the massively parallel processing CAS server in SAS Viya with the Python SWAT package.
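A minimal sketch of what that connection looks like from Python. The host, port, credentials, and Snowflake account details below are placeholders, and the data-source option names are illustrative; check your SAS/ACCESS to Snowflake configuration for the exact options your deployment expects.

```python
# Connect to the CAS server with the SWAT package (placeholder host/credentials):
# import swat
# conn = swat.CAS('cas-server.example.com', 5570, 'user', 'password')

# Illustrative parameters for a Snowflake data-source caslib:
snowflake_source = {
    'srctype': 'snowflake',                          # data connector type
    'server': 'myaccount.snowflakecomputing.com',    # placeholder account URL
    'database': 'SALES_DB',                          # placeholder database
    'schema': 'PUBLIC',
    'username': 'sf_user',
    'password': 'sf_password',
}

# Register the caslib and load a table into CAS (hypothetical table name):
# conn.addCaslib(name='sf', dataSource=snowflake_source)
# conn.loadTable(caslib='sf', path='ORDERS')
```

Once the caslib is registered, CAS reads the Snowflake tables in place, so the data is available to CAS actions without an intermediate copy.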
Comparing Logistic Regression and Decision Tree - Which of our models is better at predicting our outcome? Learn how to compare models using misclassification, area under the curve (ROC) charts, and lift charts with validation data. In parts 6 and 7 of this series we fit a logistic regression and a decision tree.
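As a quick illustration of two of those comparison metrics, here is a pure-Python sketch of misclassification rate and area under the ROC curve on a tiny, made-up validation set. In the series these numbers come from CAS assessment actions; the toy data below is only for showing what the metrics measure.

```python
def misclassification(actuals, predicted_labels):
    """Fraction of validation rows the model labels incorrectly."""
    wrong = sum(a != p for a, p in zip(actuals, predicted_labels))
    return wrong / len(actuals)

def auc(actuals, scores):
    """AUC via the rank (Mann-Whitney) formulation: the probability that a
    randomly chosen event outscores a randomly chosen non-event."""
    pos = [s for a, s in zip(actuals, scores) if a == 1]
    neg = [s for a, s in zip(actuals, scores) if a == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

actuals = [1, 0, 1, 0, 1, 0]                  # true outcomes (toy data)
scores  = [0.9, 0.2, 0.7, 0.4, 0.3, 0.6]      # model's predicted probabilities
labels  = [1 if s >= 0.5 else 0 for s in scores]

print(misclassification(actuals, labels))     # 2 of 6 rows wrong
print(auc(actuals, scores))
```

The model with the lower misclassification rate and the higher AUC on the validation data is the better predictor, which is exactly the comparison the post walks through.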
SAS SVP Kimberly May provides an update about the new Support Operating Model.
We now have an option for SAS to operate directly on SQL tables in SingleStore (no need to move data to a SAS dataset). We refer to this as our SAS with SingleStore solution under SAS Viya.
SAS Enterprise Guide 8.4 is released, and it supports connections to SAS Viya to run SAS programs, access data, and more!
Learn how to fit a decision tree and use your decision tree model to score new data. In Part 6 of this series we took our Home Equity data saved in Part 4 and fit a logistic regression to it. In this post we will use the same data to fit a decision tree.
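Conceptually, scoring new data with a decision tree is just a walk down the fitted splits to a leaf. The sketch below hand-codes a tiny tree to show the idea: DEBTINC and DELINQ are columns in the Home Equity data, but the split values and leaf probabilities are invented for illustration, not the actual fitted model.

```python
def score_tree(row):
    """Return a predicted probability of default for one applicant
    by walking a hand-written (illustrative) decision tree."""
    if row['DEBTINC'] > 45:                     # high debt-to-income branch
        return 0.65 if row['DELINQ'] >= 1 else 0.35
    else:                                       # low debt-to-income branch
        return 0.30 if row['DELINQ'] >= 2 else 0.08

# Score two hypothetical new applicants:
new_applicants = [
    {'DEBTINC': 50, 'DELINQ': 2},
    {'DEBTINC': 30, 'DELINQ': 0},
]
for row in new_applicants:
    print(score_tree(row))
```

In the series the tree is fit and scored with CAS actions rather than written by hand, but the scoring logic the server applies to each new row is the same kind of if/else walk.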
2023 was a momentous year for Technical Support! I wanted to share some of our achievements with you that demonstrate our commitment to providing you with excellent customer support. Customer Portal By far our biggest accomplishment was the launch of the new customer portal. To achieve this, Technical Support worked
Transitioning from SAS9 to SAS Viya can be uncertain for SAS programmers. Whether your organization is already making the move and you're curious about your current SAS analytical workflows, or you're contemplating a move to SAS Viya and concerned about its impact on your SAS expertise and programs, the hesitation is understandable.
Cast your vote by Friday, Feb. 2 to determine winners of the 2024 SAS Customer Recognition Awards.
Learn how to fit a logistic regression and use your model to score new data. In part 4 of this series, we created and saved our modeling data set with all our updates from imputing missing values and assigning rows to training and validation data. Now we will use this data to fit a logistic regression.
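To make "score new data" concrete, here is a minimal sketch of what a fitted logistic regression does to a new row: apply the linear predictor, then the inverse logit. The intercept and coefficients below are invented for illustration (DEBTINC and DELINQ are Home Equity columns, but these are not the fitted values).

```python
import math

# Illustrative coefficients -- not the values fitted to the Home Equity data:
coefs = {'intercept': -2.0, 'DEBTINC': 0.05, 'DELINQ': 0.8}

def score_logistic(row):
    """Predicted probability of the event for one new observation."""
    eta = coefs['intercept'] + sum(coefs[k] * v for k, v in row.items())
    return 1.0 / (1.0 + math.exp(-eta))   # inverse logit

# A new applicant whose linear predictor happens to be 0 scores 0.5:
print(score_logistic({'DEBTINC': 40, 'DELINQ': 0}))
```

In the series the fitting and scoring run as CAS actions on the server; the sketch only shows the arithmetic a scored row goes through.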