Overview/Description
A well-designed, high-performance database depends on its storage engine and index structure. In this course, you'll take an in-depth look at storage engines and how they affect the performance of your MySQL databases. The course also covers index structures and types, indexing for performance, and index maintenance.
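The reason indexing for performance matters can be sketched outside MySQL itself. In this plain-Python analogy (an assumption for illustration, not the course's own material), a dict stands in for a B-tree index: a full table scan examines every row, while an indexed lookup jumps straight to the match.

```python
# A plain-Python analogy for why an index speeds up lookups.
# The dict below stands in for a B-tree index on the "email" column.

rows = [{"id": i, "email": f"user{i}@example.com"} for i in range(100_000)]

def full_scan(rows, email):
    # Without an index: examine every row (O(n) per query).
    return [r for r in rows if r["email"] == email]

# Building the "index" takes one pass up front...
email_index = {r["email"]: r for r in rows}

def indexed_lookup(index, email):
    # ...then each lookup is a single probe (O(1) here; O(log n) for a B-tree).
    return index.get(email)

assert full_scan(rows, "user42@example.com")[0]["id"] == 42
assert indexed_lookup(email_index, "user42@example.com")["id"] == 42
```

The trade-off the course explores is the same one visible here: the index costs extra space and write-time maintenance in exchange for much faster reads.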
Target Audience
Personnel at all levels of an enterprise seeking to attain competency in MySQL.
Prerequisites
None
Expected Duration (hours)
2.5
Lesson Objectives
MySQL: Storage Engines, Advanced Indexing, and Maintenance
start the...
Overview/Description
As an organization grows, so will its database requirements. In this course, you'll explore the core administrative tasks of managing and maintaining your MySQL databases as they grow and as performance demands increase. The course also covers transactions in MySQL, storage engines and server optimization, and scaling and high availability. You'll also learn about partitioning, replication, import and export, and backup and recovery.
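The transactional behavior this course covers for MySQL can be illustrated with Python's built-in sqlite3 module (used here only because it needs no MySQL server; the commit/rollback semantics are the same idea). Commit makes a unit of work durable; rollback undoes an incomplete one.

```python
# Illustrating transaction atomicity with the standard-library sqlite3 module
# (a stand-in for a MySQL/InnoDB connection, which would behave the same way).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    # Transfer 30 from alice to bob as one atomic unit of work.
    conn.execute("UPDATE accounts SET balance = balance - 30 WHERE name = 'alice'")
    raise RuntimeError("simulated failure before the matching credit")
    conn.execute("UPDATE accounts SET balance = balance + 30 WHERE name = 'bob'")
    conn.commit()
except RuntimeError:
    conn.rollback()  # atomicity: the partial debit is undone

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
assert balances == {"alice": 100, "bob": 50}  # nothing was half-applied
```

Without the rollback, alice's account would have lost 30 with no matching credit; the transaction boundary is what prevents that inconsistent state.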
Target Audience
Personnel at all levels of an enterprise seeking to attain competency in MySQL
Prerequisites
None
Expected...
Overview/Description
The performance, integrity, and security of your MySQL servers are key to maintaining stable, consistent, and secure databases. In this course, you'll learn about monitoring and maintaining your servers while planning for growth, and about security measures to keep your data safe. The course also covers monitoring systems, measuring performance, securing databases, reviewing database metadata, and ensuring data integrity.
Target Audience
Personnel at all levels of an enterprise seeking to attain competency in MySQL
Prerequisites
None
Expected Duration (hours)...
Overview/Description
A number of tools are available for working with Big Data. Many of the tools are open source and Linux distribution based. This course covers the fundamentals of Big Data, including positioning it in a historical IT context, the tools available for working with Big Data, the Big Data stack, and finally, an in-depth look at Apache Hadoop.
Target Audience
IT engineers, programmers, and DBAs working with or interested in Big Data, as well as business decision makers interested in implementing or managing Big Data systems
Prerequisites
None
Expected Duration (...
Overview/Description
Big Data requires a holistic approach and a change to regular working practices. This course covers the way teams work in Big Data organizations, some projects and use cases for Big Data, and challenges and opportunities that Big Data presents.
Target Audience
IT engineers, programmers, and DBAs working with or interested in Big Data, as well as business decision makers interested in implementing or managing Big Data systems
Prerequisites
None
Expected Duration (hours)
1.5
Lesson Objectives
Big Data Opportunities and Challenges
start the course...
Overview/Description
Apache Spark is a cluster computing framework for fast processing of Hadoop data. Spark applications can be written in Scala, Java, or Python. In this course, you will learn how to develop Spark applications using Scala, Java, or Python. You will also learn how to test and deploy applications to a cluster, monitor clusters and applications, and schedule resources for clusters and individual applications.
Target Audience
Developers familiar with Scala, Python, or Java who want to learn how to program and deploy Spark applications
Prerequisites
None
Expected...
Overview/Description
MapReduce programming is a framework for processing parallelizable problems across huge datasets. This course will define MapReduce programming and explain the basics of programming in MapReduce and Hive.
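The framework described above splits a computation into a map phase (emit key/value pairs per input record), a shuffle (group values by key), and a reduce phase (combine each key's values). Word count, the canonical example, can be sketched in plain Python (a single-process illustration of the model, not Hadoop's actual Java API):

```python
# The three phases of a MapReduce job, illustrated with word count.
from collections import defaultdict

def map_phase(record):
    # Mapper: emit (word, 1) for every word in one input record.
    for word in record.split():
        yield (word, 1)

def shuffle(pairs):
    # Shuffle: the framework groups all emitted values by key.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    # Reducer: collapse one key's values to a final result.
    return (key, sum(values))

records = ["big data big ideas", "data beats opinions"]
pairs = [kv for r in records for kv in map_phase(r)]
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
assert result == {"big": 2, "data": 2, "ideas": 1, "beats": 1, "opinions": 1}
```

The problem is "parallelizable" precisely because mappers run independently per record and reducers run independently per key, which is what lets Hadoop scale the same logic across a cluster.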
Target Audience
This path is designed for developers, managers, database developers, and anyone with a basic knowledge of Java interested in learning the basics of programming in MapReduce.
Prerequisites
None
Expected Duration (hours)
2.0
Lesson Objectives
MapReduce Essentials
start the course
describe the job components and the steps of...
Overview/Description
Apache Hadoop is a framework for distributed storage and distributed processing of very large data sets on computer clusters built from commodity hardware. This course will introduce the basic concepts of Apache Hadoop, cloud computing, Big Data, and the development tools used with them.
Target Audience
This path is designed for developers, managers, database developers, and anyone interested in learning the basics of Hadoop, or cloud computing in general.
Prerequisites
None
Expected Duration (hours)
2.0
Lesson Objectives
Apache...
Overview/Description
Hadoop's HDFS is a highly fault-tolerant distributed file system and, like Hadoop in general, is designed to be deployed on low-cost hardware. It provides high-throughput access to application data and is suitable for applications that have large data sets. This course examines the Hadoop ecosystem by demonstrating all of the commonly used open source software components. You'll explore a Big Data model to understand how these tools combine to create a supercomputing platform. You'll also learn how the principles of supercomputing apply to Hadoop and how this yields an...
Overview/Description
Apache Hadoop is an open source framework for distributed storage and processing of large sets of data on commodity hardware. Hadoop enables businesses to quickly gain insight from massive amounts of structured and unstructured data. In this course, you'll follow step-by-step instructions for installing Hadoop in pseudo-distributed mode and troubleshooting installation errors. You'll learn where the log files are located and more about the architecture.
Target Audience
Technical personnel with a background in Linux, SQL, and programming who intend to join a Hadoop Engineering team in...