Posts

Internal Family Systems Therapy

  Internal Family Systems Therapy is an evidence-based approach that assumes each individual possesses a variety of sub-personalities, or “parts,” and attempts to get to know each of these parts better to achieve healing. The IFS model emphasizes the network of relationships between parts, as parts may not be able to change in isolation. This therapy was developed in the early 1990s by Richard Schwartz.

Learn All the DevOps Skills, Technologies, and Tools That Will Land You a Job

  DevOps (development and operations) is a collection of tools and technologies combined to carry out various business processes. It aims to bridge the gap between two of the most significant departments in any IT organization: the development department and the operations department. This blog will give you an overview of the numerous concepts that play a significant role in defining DevOps. History of DevOps: Before DevOps came into the limelight, traditional IT had two separate teams in an organization, the development team and the operations team. The development team worked on the software, developing it and making sure the code worked perfectly. After hours of hard work and a lot of trial and error, the team would release code that then had to be executed by the operations team, which was responsible for the release and operation of that code. The operations team would check the application and its performance and report back any bugs, if present. As simp...

Apache Beam | Hands-on Course for Big Data Pipelines | Python. Complete hands-on Apache Beam | Batch & streaming pipelines | Beam SQL & Google Cloud Dataflow

  Apache Beam is the future of big data technology and is used to build big data pipelines. This course is designed for beginners who want to learn how to use Apache Beam with the Python language. It also covers Google Cloud Dataflow, which is currently the hottest way to build big data pipelines on Google Cloud. This course consists of various hands-on exercises to get you comfortable with the topics in Apache Beam. This course will introduce: Architecture, Transformations, Side Inputs/Outputs, Streaming with Google Pub/Sub, Windows in Streaming, Handling Late Elements, Using Triggers, Google Cloud Dataflow, and Beam SQL / Beam SQL on GCP. By the end of this course, you will find yourself ready to start using Apache Beam in a real work environment. What makes this course unique is that it is concise (you can complete it in only 3 hours), it covers all relevant topics, and the slides and presentations are engaging and easy to understand. Why Apache beam is fu...
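To give a feel for the "Windows in Streaming" topic mentioned above, here is a minimal pure-Python sketch of how fixed, non-overlapping windows bucket timestamped events. This is only an illustration of the idea behind Beam's FixedWindows transform; the function name and event format are hypothetical, not Beam's actual API.

```python
from collections import defaultdict

def assign_fixed_windows(events, window_size):
    """Group (timestamp, value) events into fixed, non-overlapping
    windows of `window_size` seconds -- the bucketing idea that a
    fixed-window transform applies to a streaming collection."""
    windows = defaultdict(list)
    for ts, value in events:
        window_start = ts - (ts % window_size)  # align to window boundary
        windows[(window_start, window_start + window_size)].append(value)
    return dict(windows)

events = [(2, "a"), (7, "b"), (12, "c"), (14, "d")]
print(assign_fixed_windows(events, 10))
# events at t=2 and t=7 share window (0, 10); t=12 and t=14 share (10, 20)
```

In Beam itself, windowing also interacts with triggers and late-element handling, which decide *when* each window's bucket is actually emitted.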

Apache Cassandra in 2 Hours: A complete guide to Cassandra architecture, query language, cluster management, and Java & Spark integration

  This Apache Cassandra training course teaches you to work with Cassandra and is intended for complete beginners. It is the most concise, efficient, and best-selling course on Apache Cassandra. In this course, we will cover: what Cassandra is; how to install Cassandra; the Cassandra data model, with hands-on exercises that teach you how to create a keyspace, create a table, and insert and read data; and the different data types in Cassandra, with exercises. After this you will learn about the partition key and clustering key and understand how data is distributed across the nodes in a cluster. I will cover the Cassandra architecture in detail, including replication, consistency, the gossip protocol, the write path, the read path, Cassandra storage, and compaction. You will also learn about anti-patterns and data-modeling goals, understand Cassandra configuration files, work with nodetool to manage a cluster, and integrate the Cassandra Java driver to write and run Cassandra f...
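The idea of data being distributed across nodes by partition key, mentioned above, can be sketched in a few lines of Python. This is a deliberately simplified stand-in: real Cassandra hashes the partition key with Murmur3 onto a token ring with virtual nodes and replication, whereas this illustration uses MD5 and a plain modulo just to show that the partition key alone decides row placement.

```python
import hashlib

def node_for_partition_key(partition_key, nodes):
    """Pick the owning node for a row by hashing its partition key --
    a simplified stand-in for Cassandra's Murmur3Partitioner
    (MD5 is used here only to get a stable, reproducible token)."""
    token = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    return nodes[token % len(nodes)]

nodes = ["node1", "node2", "node3"]
for key in ["user:42", "user:43", "user:44"]:
    print(key, "->", node_for_partition_key(key, nodes))
```

Because placement depends only on the hash of the partition key, every replica can locate a row without consulting a central index, which is why choosing a good partition key is the heart of Cassandra data modeling.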

Master Big Data - Apache Spark/Hadoop/Sqoop/Hive/Flume

  In this course, you will start by learning what the Hadoop Distributed File System is and the most common Hadoop commands required to work with the Hadoop file system. Then you will be introduced to Sqoop import: understand the lifecycle of a sqoop command; use the sqoop import command to migrate data from MySQL to HDFS; use the sqoop import command to migrate data from MySQL to Hive; use various file formats, compressions, file delimiters, where clauses, and queries while importing the data; understand split-by and boundary queries; and use incremental mode to migrate data from MySQL to HDFS. Further, you will learn Sqoop export to migrate data: what sqoop export is; using sqoop export, migrate data from HDFS to MySQL; using sqoop export, migrate data from Hive to MySQL. Further, you will learn about Apache Flume: understand the Flume architecture; using Flume, ingest data from Twitter and save it to HDFS; using Flume, ingest data from netcat and save it to HDFS; using Flume, ingest data fr...
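The incremental mode mentioned above is worth unpacking: `sqoop import --incremental append` tracks a check column and a last imported value, and fetches only rows beyond it. Here is a minimal pure-Python sketch of that check-column logic, using the standard-library sqlite3 as a stand-in for MySQL; the table and function names are illustrative, not Sqoop's own.

```python
import sqlite3

# A toy source table; sqlite3 stands in for MySQL in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "pen"), (2, "book"), (3, "lamp")])

def incremental_import(conn, last_value):
    """Fetch only rows past the last imported id -- the check-column /
    last-value idea behind `sqoop import --incremental append`."""
    cur = conn.execute("SELECT id, item FROM orders WHERE id > ?", (last_value,))
    return cur.fetchall()

print(incremental_import(conn, 1))  # only rows with id > 1
```

After each run, Sqoop records the new maximum of the check column so the next run starts where this one left off, which is what makes repeated imports cheap on a growing table.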

Learn Advanced Spark: Beginner to Expert (Scala)

  This course is designed to cover basic to advanced concepts of Apache Spark 3.x in the most efficient and concise manner. It will be beneficial for beginners as well as for those who already know Apache Spark. It covers in-depth details about Spark internals, datasets, execution plans, the IntelliJ IDE, and EMR clusters, with lots of hands-on exercises. This course is designed for data engineers and architects who want to design and develop big data engineering projects using Apache Spark. It does not require any prior knowledge of Apache Spark or Hadoop. Spark architecture and fundamental concepts are explained in detail to help you grasp the content of this course. This course uses the Scala programming language, which is the best language for working with Apache Spark. This course covers: an intro to the big data ecosystem; Spark internals in detail; understanding Spark drivers and executors; understanding the execution plan in detail; setting up an environmen...
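A key Spark-internals idea touched on above is that transformations are lazy: the execution plan is only run when an action demands results. The Python generator sketch below mimics that deferred execution; it is an analogy for the concept, not Spark code (the course itself uses Scala and real RDD/Dataset APIs).

```python
def spark_style_pipeline(data):
    """A chain of lazy transformations: nothing executes until an
    action pulls results -- the same deferred-execution idea behind
    Spark building a plan of transformations and running it on demand."""
    mapped = (x * 2 for x in data)           # like rdd.map(_ * 2)
    filtered = (x for x in mapped if x > 4)  # like .filter(_ > 4)
    return filtered                          # still lazy: no work done yet

lazy = spark_style_pipeline([1, 2, 3, 4])
result = list(lazy)  # the "action" that actually triggers execution
print(result)        # [6, 8]
```

Laziness is what lets Spark's optimizer see the whole chain before running anything, fusing steps and pruning work, which is why reading execution plans is such a useful skill.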

CCA 175 Exam preparation using Scala & Practice Tests [2021]

  In this course, we will do the following: Intro & Setup: CCA175 introduction, free cluster setup on Google Cloud, revising Hadoop commands. Apache Spark Revision: Spark intro, actions & transformations (optional). Spark DataFrames (transform, stage & store): working with various file formats (JSON, ORC, XML, CSV, Avro, Parquet, etc.); working with various compressions (Gzip, Bzip2, Lz4, Snappy, Deflate, etc.); working with strings; working with dates; working with columns in a DataFrame; DataFrame APIs. Spark SQL (data analysis): working with Spark SQL; working with Hive; manipulating strings in SparkSQL/Hive; manipulating dates in SparkSQL/Hive; mathematical functions; aggregating & analyzing data using SparkSQL/Hive; joining datasets; ranking & windowing in SparkSQL/Hive. Real exam-like questions: 8-10 real exam-like solutions. Practice exams with solutions: Practice Exam 1 (8 questions with a timer), Practice Exam 2 (8 questions with a timer). Who this course ...
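The "ranking & windowing" topic above corresponds to SparkSQL window functions such as ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...). Here is a plain-Python sketch of that per-group numbering, using hypothetical field names (`dept`, `amount`) purely for illustration; the real exam work would be done in Spark, not like this.

```python
from collections import defaultdict

def rank_within_group(rows, group_key, order_key):
    """Number rows within each group, highest value first -- a
    plain-Python sketch of SparkSQL's
    ROW_NUMBER() OVER (PARTITION BY group_key ORDER BY order_key DESC)."""
    groups = defaultdict(list)
    for row in rows:                       # PARTITION BY group_key
        groups[row[group_key]].append(row)
    ranked = []
    for members in groups.values():
        members.sort(key=lambda r: r[order_key], reverse=True)  # ORDER BY ... DESC
        for rank, row in enumerate(members, start=1):
            ranked.append({**row, "rank": rank})
    return ranked

sales = [
    {"dept": "A", "rep": "x", "amount": 50},
    {"dept": "A", "rep": "y", "amount": 80},
    {"dept": "B", "rep": "z", "amount": 30},
]
for row in rank_within_group(sales, "dept", "amount"):
    print(row)
```

Typical exam questions of this shape ask for the top-N rows per group, which is exactly a window rank followed by a filter on the rank column.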