- Open Eclipse
- Create a new Java project
- Add the code from GitHub ;) according to your problem statement
- Add the required external JARs (right-click the project → Build Path → Add External Archives):
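The repository does not pin down which problem statement you will be given; assuming the classic WordCount example, the core mapper/reducer logic of such a job can be sketched in plain Java (no Hadoop classes, just the map and reduce steps the job performs):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class WordCountSketch {
    // Mapper logic: emit a (word, 1) pair for every token in a line
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.add(Map.entry(token, 1));
            }
        }
        return pairs;
    }

    // Reducer logic: sum the counts emitted for each distinct word
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : new String[] {"hello hadoop", "hello world"}) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(pairs)); // {hello=2, hadoop=1, world=1}
    }
}
```

In the real job these two methods become the `map()` of a `Mapper` subclass and the `reduce()` of a `Reducer` subclass, with Hadoop handling the shuffle that groups pairs by key between them.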
  - hadoop-common.jar from /usr/lib/hadoop
  - hadoop-common-2.6.0-cdh5.4.2.jar from /usr/lib/hadoop
  - hadoop-core-2.6.0-mr1-cdh5.4.2.jar from /usr/lib/hadoop-0.20-mapreduce
  - commons-cli-1.2.jar from /usr/lib/hadoop/lib
- Create a new launch configuration
- To do this, click Run As and then Run Configurations
- Click Java Application and create a new configuration
- Select the main class we created (well, copy-pasted :D)
- After creating the launch configuration, right-click the project and click Export
- Choose Runnable JAR file
- Give the path where you want to export it and click Finish
- Open a terminal and go to the path where you exported the JAR file
- Run the command to put the input data into HDFS:

      hadoop fs -put <file name> <path>

- Run the job:

      hadoop jar <jar file name> <class name> <input file path> <output file path>

- Check the output at the output file path in HDFS:

      hadoop fs -cat <output file path>
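Putting the last three steps together, a session might look like the following. The file, class, and path names here are placeholders chosen for illustration, not taken from the repository:

```shell
# Assumed names for illustration only: wordcount.jar, WordCount, input.txt
# Copy the local input file into HDFS
hadoop fs -put input.txt /user/cloudera/input.txt

# Run the job from the exported runnable JAR
hadoop jar wordcount.jar WordCount /user/cloudera/input.txt /user/cloudera/output

# Inspect the result (Hadoop writes part files under the output directory)
hadoop fs -cat /user/cloudera/output/part-r-00000
```

Note that the output path must not already exist in HDFS, or the job will fail at startup.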
atharvaagrawal/SPPU-DSBDA-Practical