Applied AI & Machine Learning | Comprehensive



This course provides a methodical, holistic and strategic approach to predictive analytics. Most organizations jump directly into data and tools; their efforts tend to produce good models, and yet their projects fail. They fail because no one has anticipated how to integrate the project into the organization.

Those who make the investment to fully assess their environment, situation, resources and objectives across all team members will produce project results that are measurable, accountable, actionable and impactful. Unlike any other course on the market, the Comprehensive course steps through the full build of a predictive modeling operation within the realistic environment of a large organization.

Leaders who take this course will interact more effectively with their teams at the tactical level, while analytic practitioners will complement their existing algorithmic background with a more strategic goal-driven focus. In the end, the organization will be greatly strengthened with team members who operate within a common platform that makes AI, predictive analytics and machine learning purposeful and impactful.

This course is intended for those willing to invest in developing skills for superior project design and incremental decision modeling platform development. Those who complete this course will be capable of guiding their organization to stand up a thriving internal AI-driven decisioning practice with measurable and residual gains.


Who Should Attend

  • IT executives and big data directors: CIOs, CAOs, CTOs, Stakeholders, Functional Officers, Technical Directors and Project Managers who desire to transform their deluge of inert data to actionable assets
  • Line-of-business executives and functional managers: Risk Managers, Customer Relationship Managers, Business Forecasters, Inventory Flow Analysts, Financial Forecasters, Direct Marketing Analysts, Medical Diagnostic Analysts, eCommerce Company Executives
  • Data scientists: Who recognize the importance of complementing their tactical proficiency with a strategic planning and design approach to advanced analytics
  • Technology Planners: Who survey emerging technologies in order to prioritize corporate investment
  • Consultants: Whose competitive environment is intensifying and whose success requires competency with data science, AI, machine learning and related emerging information technologies

Prerequisites

Registrants will be required to view a three-hour online Core Concepts video orientation prior to attending this event. Access details to the Core Concepts video modules will be shared with participants prior to the start of the course. Modules are easily consumable in 10 to 15-minute blocks of instruction.

Prior education or experience in data analytics or statistics is helpful, but not required. Instructions on how to download exercise data and any analytic tools will be provided in the preparatory email. The instructor can assist participants with any preparation during breaks, and before or after class.

Learning Objectives

  • Plan and manage your AI projects effectively from the start
  • Identify, qualify and prioritize viable and actionable opportunities
  • Shift from a limited technology mindset to one of organizational transformation
  • Avoid approaches that waste time and expense on doomed AI projects
  • Construct a valid data set and transform data for superior machine learning model performance
  • Select appropriate methods for each of the four core analytic project types
  • Assess the degree to which a decision model meets a predefined performance objective
  • Take a low-risk / high-impact approach to model development with vendor-neutral tool exposure
  • Apply a formal roadmap for data preparation, model development and validation of results
  • Build an analytics sandbox for rapid model development and reduced IT dependency
  • Develop the rare leadership skills to identify, assess, design and oversee viable projects
  • Leave with the resources, contacts and plans to reduce preparation time, costs and risks

Assess Phase

  • Assemble Team
  • Leadership, Analysts, Subject Experts, Data Support, Stakeholders, etc.
  • Determine Whether External Talent is Needed
    • Examine Culture & Mindset
    • List Candidate Projects
  • Place Projects on a Benefits / Challenges Quadrant Plot
  • Guided Discussion Breakout Session
    • Define Performance Benchmarks
    • Identify Data Sources
    • Itemize Existing Analytic Resources
    • Describe Operational Environments
    • Initial Report of Overall Practice Readiness
  • What Should an Assess Phase Report Contain?
  • Exercise Breakout Session
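
The benefits / challenges quadrant placement above can be sketched in a few lines of Python. A minimal illustration, in which the project names, the 1-10 scores, and the quadrant labels are all invented for the example:

```python
# Illustrative sketch: bucket candidate projects on a benefits / challenges
# quadrant using hypothetical 1-10 team scores (higher = more of each).
def quadrant(benefit, challenge, midpoint=5):
    """Return the quadrant label for one candidate project."""
    high_benefit = benefit > midpoint
    high_challenge = challenge > midpoint
    if high_benefit and not high_challenge:
        return "quick win"          # pursue first
    if high_benefit and high_challenge:
        return "strategic bet"      # plan carefully
    if not high_benefit and not high_challenge:
        return "low priority"
    return "avoid"                  # high cost, low payoff

# Hypothetical candidate projects scored by the assembled team
projects = {
    "churn model": (8, 3),
    "demand forecast": (7, 8),
    "report automation": (3, 2),
    "full personalization": (4, 9),
}
for name, (b, c) in projects.items():
    print(f"{name}: {quadrant(b, c)}")
```

In a workshop, the same scores would be plotted on a two-axis chart; the bucketing rule above is just the plot expressed as a decision.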

Plan Phase

  • Pull & Recon Data
  • Explore Data & Verify Quality
  • Do We Have Enough Data?
  • Which Data are Relevant?
  • Make a First Look at Data Quality
  • Exercise Breakout Session
    • Design Analytic Sandbox
    • Qualify Team
    • Qualify Tools
    • Define Operational Environment(s)
    • Establish Performance Benchmarks & Targets
  • What are the current metrics (KPIs)?
  • What is the Role of Technical Metrics vs. KPIs?
  • Benchmark Demonstration
    • Consider Deployment Options
    • Prioritize Viable Projects
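
As one illustration of establishing performance benchmarks and targets: the benchmark is simply the KPI the current, non-model process already achieves, and the target is what the model must beat. A minimal sketch with invented campaign outcomes:

```python
# Illustrative sketch: derive a performance benchmark from the current
# process. Hypothetical campaign outcomes: 1 = responded, 0 = did not.
historical_outcomes = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]  # invented data

baseline_rate = sum(historical_outcomes) / len(historical_outcomes)
target_rate = baseline_rate * 2   # e.g. a "double the response rate" target

print(f"benchmark (current KPI): {baseline_rate:.0%}")
print(f"model target:            {target_rate:.0%}")
```

Note the distinction the outline draws: this is a business KPI, not a technical metric like model accuracy; the two must be linked explicitly during planning.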

Prepare Phase

  • Initiate Culture & Mindset Shift
  • Refine Team Roles & Responsibilities
  • Build Analytic Sandbox
  • The Importance of the �Data Recon�
  • Effective Collaboration Between Analysts and IT
  • Exercise Breakout Session
    • Define Performance Benchmarks
    • Explore Final Data
  • Comparing Data Requirements to Actual Data
  • Looking for Potential Problems
  • Data Exploration Demonstration
    • Prepare Data
  • Data Integration
  • Data Cleaning
  • Data Construction
  • Exercise Breakout Session
    • Select Candidate Modeling Techniques
    • Develop Roll-out Plan for Go-Live
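
The three Prepare Data tasks above (integration, cleaning, construction) can be illustrated on a toy example. The records, the mean-imputation choice, and the derived feature below are all hypothetical:

```python
# Illustrative sketch of the three Prepare Data tasks on invented records.
customers = {"C1": {"age": 34}, "C2": {"age": None}}   # CRM extract
orders = {"C1": [120.0, 80.0], "C2": [40.0]}           # order-system extract

# Integration: join per-customer attributes with order history
dataset = {cid: {**attrs, "orders": orders.get(cid, [])}
           for cid, attrs in customers.items()}

# Cleaning: impute a missing age with the mean of the known ages
known = [r["age"] for r in dataset.values() if r["age"] is not None]
mean_age = sum(known) / len(known)
for r in dataset.values():
    if r["age"] is None:
        r["age"] = mean_age

# Construction: derive total spend, a feature no single source held
for r in dataset.values():
    r["total_spend"] = sum(r["orders"])

print(dataset["C2"])
```

In practice each step is larger (keys rarely match cleanly, imputation is a modeling decision), but the sequence is the same.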

Model Phase

  • Current Trends in Data Science, AI and Machine Learning
    • Algorithms in the News: Deep Learning
    • The Modeling Software Landscape
    • The Rise of R and Python: The Impact on Modeling and Deployment
    • Do I Need to Know About Statistics to Build Predictive Models?
  • Strategic and Tactical Considerations in Choosing a Modeling Algorithm
    • What is an Algorithm?
    • Is a “Black Box” Algorithm an Option for Me?
  • The Tasks of the Model Phase
    • Generate Test Design
      • Train-Test Validation
      • Accept or Reject Modeling Parameters
      • Train / Test / Validate
    • Optimizing Data for Different Algorithms
    • Build Models
      • Classification
        • Issues Unique to Classification Problems
        • Why Classification Projects are So Common
        • An Overview of Classification Algorithms
          • Logistic Regression
          • Neural Networks
          • Naïve Bayes Classification
          • Support Vector Machines
          • Decision Trees
          • Ensemble Methods
      • Value Estimation and Regression
      • Clustering
      • Association Rules
      • Other Modeling Techniques
        • Time Series
        • Text Mining
        • Factor Analysis
    • Model Assessment
      • Evaluate Model Results
    • Check Plausibility
    • Check Reliability
      • Model Accuracy and Stability
      • Lift and Gains Charts
    • Modeling Demonstration
      • Assess Model Viability
      • Select Final Models
    • Why Accuracy and Stability are Not Enough
    • What to Look for in Model Performance
    • Exercise Breakout Session
      • Create & Document Modeling Plan
      • Determine Readiness for Deployment
    • What are Potential Deployment Challenges for Each Candidate Model?
    • Exercise Breakout Session and Guided Project Discussion
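
As one concrete illustration of the Generate Test Design task: train/test validation holds out records the model never sees during training and measures performance only on those. A minimal sketch on synthetic data, using a deliberately simple one-threshold classifier rather than any specific algorithm from the course:

```python
import random

# Illustrative train/test validation; the "model" is a one-feature
# threshold rule fit on training data only. All data is synthetic.
random.seed(0)
data = [(random.gauss(0, 1), 0) for _ in range(100)] + \
       [(random.gauss(2, 1), 1) for _ in range(100)]
random.shuffle(data)

split = int(0.7 * len(data))              # 70% train, 30% held-out test
train, test = data[:split], data[split:]

# "Train": place the threshold midway between the class means in training
mean0 = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
mean1 = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)
threshold = (mean0 + mean1) / 2

# "Test": accuracy measured only on records the model never saw
correct = sum((x > threshold) == (y == 1) for x, y in test)
accuracy = correct / len(test)
print(f"holdout accuracy on {len(test)} records: {accuracy:.2f}")
```

Measuring on the held-out set is what makes the estimate honest; the same stump scored on its own training data would look optimistically good.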

Validate Phase

  • Select the Most Strategic Model Option(s)
  • Validate Finalist Models
  • Prepare Data for Test Deployment
  • Data Preparation Steps for Production
  • Data Preparation Demonstration
    • Measure Lift / ROI / Impact
  • The Potential Challenges of Estimating ROI
  • Designing an Effective “Dress Rehearsal”
  • The Basics of A/B Testing
  • Exercise Breakout Session
    • Test Deployment
    • Document Validation Process
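
The basics of A/B testing during a test deployment can be sketched with a standard two-proportion z-test. The conversion counts below are invented, and the 1.96 cutoff is the conventional 5% significance level:

```python
import math

# Illustrative "dress rehearsal" A/B test: group A uses the current
# process, group B the model's decisions. Counts are hypothetical.
conv_a, n_a = 50, 1000    # control: 5.0% conversion
conv_b, n_b = 75, 1000    # model-driven: 7.5% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se      # standard two-proportion z-statistic

lift = (p_b - p_a) / p_a  # relative improvement over the current process
print(f"lift: {lift:.0%}, z-statistic: {z:.2f}")  # |z| > 1.96 ~ significant at 5%
```

The lift figure feeds the ROI estimate; the z-statistic tells you whether the observed difference is likely real or just noise from a small rehearsal.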

Deploy Phase

  • Change Management for New Decision Process
  • Streamline Data Preparation for Deployment
  • Revisiting Data Prep with an Eye toward Deployment
  • Considering Deployment Options
  • Data Preparation Demonstration
    • Review All Project Functions
    • Go Live
    • Prepare Final Report
    • Conduct Knowledge Transfer

Monitor Phase

  • Create Maintenance Schedule
  • Assign Monitoring Responsibilities
  • Build Performance Dashboard
  • Who Will be in Charge of Monitoring?
  • How Will the Monitoring Information be Updated?
  • Exercise Breakout Session
    • Define Criteria for Model Refresh or Replace
    • Develop Monitoring & Maintenance Plan
  • Putting a Proper Plan and Schedule into Place
  • Monitoring Demonstration
    • Identify New Data Sources
    • Record Changes to Environment and Organization
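
A monitoring plan needs a concrete drift measure; one common choice (our example, the outline does not name a specific metric) is the Population Stability Index, which compares the model's score distribution at build time to the current one:

```python
import math

# Illustrative monitoring check: Population Stability Index (PSI) between
# the score distribution at model build time and the current month.
def psi(expected, actual):
    """Compare two binned distributions; higher PSI = more drift."""
    return sum((a - e) * math.log(a / e) for e, a in zip(expected, actual))

# Hypothetical score distributions across 4 bins (fractions sum to 1)
at_build_time = [0.25, 0.25, 0.25, 0.25]
this_month    = [0.30, 0.27, 0.23, 0.20]

drift = psi(at_build_time, this_month)
# Conventional rule of thumb: < 0.1 stable, 0.1-0.2 watch, > 0.2 refresh
status = "refresh model" if drift > 0.2 else "stable" if drift < 0.1 else "watch"
print(f"PSI = {drift:.3f} -> {status}")
```

Wiring a check like this into the performance dashboard gives the refresh-or-replace criteria an objective trigger rather than a judgment call.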

Wrap-up and Next Steps

  • Supplementary Materials and Resources
  • Conferences and Communities
  • Get Started on a Project!
  • Options for Strategic Oversight and Collaborative Implementation