Kalyan Big Data Project 1

Project Name: How To Stream CSV Data Into Phoenix Using Apache Kafka

Project Code: https://github.com/kalyanhadooptraining/kalyan-bigdata-realtime-projects/tree/master/kafka/project1-phoenix-csv

Learnings of this Project:
➢ We will learn Kafka configurations and commands.
➢ Kafka information:
1. Kalyan Util (CSV data generator)
2. Kafka Producer (publishes the generated CSV data)
3. Kafka Consumer (receives the data from the Kafka Producer)
4. Phoenix Consumer (writes the data into a Phoenix table)
➢ Major real-time project: `Product Log Analysis`
1. We extract the data from server logs.
2. This data is useful for analyzing product views.
3. CSV is the output format.
➢ We can use Phoenix to analyze this data.
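The CSV leg of this pipeline can be sketched as follows. This is a minimal illustration, not the project's actual code: the field names, the `PRODUCT_VIEWS` table name, and the sample values are all assumptions, and the Kafka plumbing itself is omitted.

```python
import csv
import io

# Hypothetical product-view schema; the real project's fields may differ.
FIELDS = ["user_id", "product_id", "action", "timestamp"]

def to_csv_line(record):
    """Format one product-view record as a CSV line (the producer side)."""
    buf = io.StringIO()
    csv.writer(buf).writerow([record[f] for f in FIELDS])
    return buf.getvalue().strip()

def from_csv_line(line):
    """Parse a CSV line back into a record (the consumer side)."""
    values = next(csv.reader([line]))
    return dict(zip(FIELDS, values))

def phoenix_upsert(record, table="PRODUCT_VIEWS"):
    """Build the UPSERT statement a Phoenix consumer could run over JDBC."""
    cols = ", ".join(FIELDS)
    placeholders = ", ".join("?" for _ in FIELDS)
    return f"UPSERT INTO {table} ({cols}) VALUES ({placeholders})"

line = to_csv_line({"user_id": "u1", "product_id": "p42",
                    "action": "view", "timestamp": "2016-01-01T10:00:00"})
print(line)  # u1,p42,view,2016-01-01T10:00:00
print(phoenix_upsert(from_csv_line(line)))
```

In the real project the producer would publish `line` to a Kafka topic and the Phoenix consumer would bind the record's values to the UPSERT's placeholders (Phoenix uses `UPSERT` rather than `INSERT`).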
Kalyan Big Data Project 2

Project Name: How To Stream JSON Data Into Phoenix Using Apache Kafka

Project Code: https://github.com/kalyanhadooptraining/kalyan-bigdata-realtime-projects/tree/master/kafka/project2-phoenix-json

Learnings of this Project:
➢ We will learn Kafka configurations and commands.
➢ Kafka information:
1. Kalyan Util (JSON data generator)
2. Kafka Producer (publishes the generated JSON data)
3. Kafka Consumer (receives the data from the Kafka Producer)
4. Phoenix Consumer (writes the data into a Phoenix table)
➢ Major real-time project: `Product Log Analysis`
1. We extract the data from server logs.
2. This data is useful for analyzing product views.
3. JSON is the output format.
➢ We can use Phoenix to analyze this data.
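The only difference from the CSV variant is the wire format: each record travels through Kafka as one JSON line. A minimal sketch (the record schema is an assumption, and the Kafka calls are omitted):

```python
import json

def to_json_line(rec):
    """Serialize one record to a single JSON line (the producer side)."""
    return json.dumps(rec, sort_keys=True)

def from_json_line(line):
    """Parse a JSON line back into a record (the consumer side)."""
    return json.loads(line)

# Hypothetical product-view record.
record = {"user_id": "u1", "product_id": "p42",
          "action": "view", "timestamp": "2016-01-01T10:00:00"}

line = to_json_line(record)
assert from_json_line(line) == record  # lossless round trip
print(line)
```

Unlike CSV, JSON keeps field names with the data, so the consumer does not depend on a fixed column order before writing into the Phoenix table.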
Kalyan Big Data Project 3

Project Name: How To Stream REGEX Data Into Phoenix Using Apache Kafka

Project Code: https://github.com/kalyanhadooptraining/kalyan-bigdata-realtime-projects/tree/master/kafka/project3-phoenix-regex

Learnings of this Project:
➢ We will learn Kafka configurations and commands.
➢ Kafka information:
1. Kalyan Util (complex data generator)
2. Kafka Producer (publishes the generated complex data)
3. Kafka Consumer (receives the data from the Kafka Producer)
4. Phoenix Consumer (writes the data into a Phoenix table)
➢ Major real-time project: `Product Log Analysis`
1. We extract the data from server logs.
2. This data is useful for analyzing product views.
3. The output is complex, unstructured data, so REGEX extraction is the best solution.
➢ We can use Phoenix to analyze this data.
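When the log lines are not a clean delimited format, the consumer extracts fields with a regular expression before writing to Phoenix. A sketch using an Apache-common-log-style line (the actual "complex data" format in the project may differ, so both the sample line and the pattern here are assumptions):

```python
import re

# A hypothetical server log line in Apache common log format.
LINE = '127.0.0.1 - - [10/Oct/2016:13:55:36 +0530] "GET /products/p42 HTTP/1.1" 200 2326'

# One named group per field we want to load into Phoenix.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d+) (?P<size>\d+)'
)

def parse_line(line):
    """Extract the fields of one log line, or return None if it does not match."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else None

rec = parse_line(LINE)
print(rec["path"])    # /products/p42
print(rec["status"])  # 200
```

Returning `None` for non-matching lines lets the consumer skip or dead-letter malformed records instead of failing the stream.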
Kalyan Big Data Project 4

Project Name: How To Stream CSV Data Into Hadoop Using Apache Flume - Kafka Source

Project Code: https://github.com/kalyanhadooptraining/kalyan-bigdata-realtime-projects/tree/master/kafka/project4-flume-kafka-source

Learnings of this Project:
➢ We will learn Flume configurations and commands.
➢ Flume Agent:
1. Source (Kafka Source)
2. Channel (Memory Channel)
3. Sink (HDFS Sink)
➢ We will learn Kafka configurations and commands.
➢ Kafka information:
1. Kalyan Util (CSV data generator)
2. Kafka Producer (publishes the generated CSV data)
3. Kafka Consumer (receives the data from the Kafka Producer)
4. Flume Kafka Source (sends the Kafka Producer data to the Flume Channel)
➢ Major real-time project: `Product Log Analysis`
1. We extract the data from server logs.
2. This data is useful for analyzing product views.
3. CSV is the output format.
➢ We can use Hive / Pig / MapReduce to analyze this data:
1. Explore Hive queries for analysis.
2. Explore Pig scripts for analysis.
3. Explore MapReduce for analysis.
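The Flume agent above (Kafka source → memory channel → HDFS sink) can be sketched as a Flume properties file. This is an illustrative configuration, not the project's: the topic name, broker address, and HDFS path are assumptions, and the Kafka source property names follow the newer Flume 1.x style.

```properties
# Agent a1: Kafka source -> memory channel -> HDFS sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.r1.kafka.bootstrap.servers = localhost:9092
a1.sources.r1.kafka.topics = product-logs
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /user/flume/product-logs
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.channel = c1
```

An agent like this is typically started with `flume-ng agent --name a1 --conf-file agent.conf`.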
Kalyan Big Data Project 5

Project Name: How To Stream CSV Data Into Hadoop Using Apache Flume - Kafka Sink

Project Code: https://github.com/kalyanhadooptraining/kalyan-bigdata-realtime-projects/tree/master/kafka/project5-flume-kafka-sink

Learnings of this Project:
➢ We will learn Flume configurations and commands.
➢ Flume Agent:
1. Source (Exec Source)
2. Channel (Memory Channel)
3. Sink (Kafka Sink)
➢ We will learn Kafka configurations and commands.
➢ Kafka information:
1. Kalyan Util (CSV data generator)
2. Kafka Producer (publishes the generated CSV data)
3. Kafka Consumer (receives the data from the Kafka Producer)
4. Flume Kafka Sink (receives the data from the Flume Channel and publishes it to Kafka)
➢ Major real-time project: `Product Log Analysis`
1. We extract the data from server logs.
2. This data is useful for analyzing product views.
3. CSV is the output format.
➢ We can use Hive / Pig / MapReduce to analyze this data:
1. Explore Hive queries for analysis.
2. Explore Pig scripts for analysis.
3. Explore MapReduce for analysis.
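Here the flow is reversed: Flume tails the log file and publishes into Kafka. A sketch of the agent (exec source → memory channel → Kafka sink); the log path, topic name, and broker address are assumptions:

```properties
# Agent a1: exec source -> memory channel -> Kafka sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app/product.log
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = localhost:9092
a1.sinks.k1.kafka.topic = product-logs
a1.sinks.k1.channel = c1
```

Any Kafka consumer subscribed to the topic can then pick up the log events downstream.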
Kalyan Big Data Project 6

Project Name: How To Stream CSV Data Into Hadoop Using Apache Flume - Kafka Channel

Project Code: https://github.com/kalyanhadooptraining/kalyan-bigdata-realtime-projects/tree/master/kafka/project6-flume-kafka-channel

Learnings of this Project:
➢ We will learn Flume configurations and commands.
➢ Flume Agent:
1. Source (Exec Source)
2. Channel (Kafka Channel)
3. Sink (HDFS Sink)
➢ We will learn Kafka configurations and commands.
➢ Kafka information:
1. Kalyan Util (CSV data generator)
2. Kafka Producer (publishes the generated CSV data)
3. Kafka Consumer (receives the data from the Kafka Producer)
4. Flume Kafka Channel (receives the data from the Flume Source and buffers it in Kafka)
➢ Major real-time project: `Product Log Analysis`
1. We extract the data from server logs.
2. This data is useful for analyzing product views.
3. CSV is the output format.
➢ We can use Hive / Pig / MapReduce to analyze this data:
1. Explore Hive queries for analysis.
2. Explore Pig scripts for analysis.
3. Explore MapReduce for analysis.
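With a Kafka channel, Kafka itself replaces the in-memory buffer between source and sink, which makes the channel durable across agent restarts. A sketch of the agent (exec source → Kafka channel → HDFS sink); the paths, topic, and broker address are assumptions, and property names may vary across Flume versions:

```properties
# Agent a1: exec source -> Kafka channel -> HDFS sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app/product.log
a1.sources.r1.channels = c1

a1.channels.c1.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.c1.kafka.bootstrap.servers = localhost:9092
a1.channels.c1.kafka.topic = flume-channel

a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /user/flume/product-logs
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.channel = c1
```

Because the channel's backing topic is an ordinary Kafka topic, other consumers can also read the buffered events directly.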