AWS Certification Prep Toolkit: Big Data Specialty

COURSE OUTLINE:

Description

Preparing for the AWS Certified Big Data - Specialty exam builds on an Associate-level certification. In addition to hands-on experience and supporting resources, you'll need authorized Big Data on AWS training, an exam readiness workshop, and an exam voucher to be fully prepared. Exclusive to Global Knowledge, the AWS Certification Prep Toolkit includes the three-day Big Data on AWS training, the Big Data - Specialty Exam Readiness Workshop, and an Associate-level Exam Voucher, all at a value rate.

What do you get with your AWS Certification Prep Toolkit?

Choose to take the hands-on training course in a classroom, virtually, or in an onsite private group training session, followed by the exam readiness workshop delivered virtually. Use your voucher when you're ready to book your AWS exam at a PSI testing center.

Audience

  • Data Architects
  • Developers
  • Solutions Architects
  • IT professionals looking to achieve their AWS Big Data - Specialty certification

Prerequisites

Big Data on AWS

  • Familiarity with big data technologies, including Apache Hadoop and HDFS
  • Knowledge of big data technologies such as Pig, Hive, and MapReduce is helpful but not required
  • Working knowledge of core AWS services and public cloud implementation
  • Students should complete the AWS Essentials course or have equivalent experience
  • Basic understanding of data warehousing, relational database systems, and database design
Exam Readiness Workshop

  • The AWS Certified Cloud Practitioner or an Associate-level AWS Certification
  • Two or more years of hands-on experience performing complex big data analyses on AWS
  • Big Data on AWS training course

Learning Objectives

In the Big Data on AWS course, you'll learn:

  • Apache Hadoop in the context of Amazon EMR
  • The architecture of an Amazon EMR cluster
  • How to launch an Amazon EMR cluster using an appropriate Amazon Machine Image and Amazon EC2 instance types
  • Appropriate AWS data storage options for use with Amazon EMR
  • How to ingest, transfer, and compress data for use with Amazon EMR
  • How to use common programming frameworks available for Amazon EMR, including Hive, Pig, and Streaming
  • How to work with Amazon Redshift to implement a big data solution
  • How to leverage big data visualization software
  • Appropriate security options for Amazon EMR and your data
  • How to perform in-memory data analysis with Spark and Shark on Amazon EMR
  • Options for managing your Amazon EMR environment cost-effectively
  • The benefits of using Amazon Kinesis for big data

During the AWS Certification Exam Readiness Workshop: AWS Big Data - Specialty, you'll learn:

  • How to navigate the AWS Certification process
  • The content domains tested in the AWS Certified Big Data - Specialty exam
  • How to implement core AWS Big Data services according to architectural best practices
  • How to leverage tools to automate data analysis on AWS

Big Data on AWS

  • Overview of Big Data
  • Data Ingestion, Transfer, and Compression
  • AWS Data Storage Options
  • Using DynamoDB with Amazon EMR
  • Using Kinesis for Near Real-Time Big Data Processing
  • Introduction to Apache Hadoop and Amazon EMR
  • Using Amazon Elastic MapReduce
  • The Hadoop Ecosystem
  • Using Hive for Advertising Analytics
  • Using Streaming for Life Sciences Analytics
  • Using Hue with Amazon EMR
  • Running Pig Scripts with Hue on Amazon EMR
  • Spark on Amazon EMR
  • Running Spark and Spark SQL Interactively on Amazon EMR
  • Using Spark and Spark SQL for In-Memory Analytics
  • Managing Amazon EMR Costs
  • Securing your Amazon EMR Deployments
  • Data Warehouses and Columnar Datastores
  • Introduction to Amazon Redshift
  • Optimizing Your Amazon Redshift Environment
  • The Big Data Ecosystem on AWS
  • Visualizing and Orchestrating Big Data
  • Using Tibco Spotfire to Visualize Big Data

This is an emerging technology course. Course outline is subject to change as needed.