"Attack on Big Data" series of HDFS common operation commands

Table of Contents

1. HDFS common operation commands

2. HDFS file recovery mechanism

3. Installing Hadoop locally on Windows

4. MapReduce unit testing, integration testing, and verification


1. HDFS common operation commands

View files in the root directory

hadoop fs -ls hdfs://master:9999/  or  hadoop fs -ls /

hadoop fs -ls -h hdfs://master:9999/  or  hadoop fs -ls -h /   The -h flag makes file sizes human-readable; -d lists directories themselves rather than their contents; -R lists directories and files recursively.

Create a directory 

hadoop fs -mkdir hdfs://master:9999/user  or  hadoop fs -mkdir /user

 Create a multi-level directory

hadoop fs -mkdir -p hdfs://master:9999/user/hadoop-twq/cmd  or  hadoop fs -mkdir -p /user/hadoop-twq/cmd

Upload local files to HDFS

hadoop fs -copyFromLocal -f  word.txt  /user/hadoop-twq/cmd

hadoop fs -put word.txt /user/hadoop-twq/cmd   This reports an error if the file already exists; add the -f flag to overwrite.

hadoop fs -put word.txt word2.txt /user/hadoop-twq/cmd   Uploads multiple files at once; it likewise reports an error if a file already exists, so add -f to overwrite.

 View file content

hadoop fs -cat /user/hadoop-twq/cmd/word.txt

Write a file to HDFS from standard input

hadoop fs -put  -  /user/hadoop-twq/cmd/put.txt

this is

count

jjjs

Press Ctrl+D to end the input.
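The interactive form above can also be scripted by piping another command's output into hadoop fs -put -. A minimal sketch, assuming a running HDFS reachable via the default filesystem configured in core-site.xml:

```shell
# Pipe generated text straight into an HDFS file, no local temp file needed.
# -f overwrites /user/hadoop-twq/cmd/put.txt if it already exists.
printf 'this is\ncount\njjjs\n' | hadoop fs -put -f - /user/hadoop-twq/cmd/put.txt

# Verify what was written.
hadoop fs -cat /user/hadoop-twq/cmd/put.txt
```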

Download a file to the local filesystem

hadoop fs -get /user/hadoop-twq/cmd/put.txt

Create an empty file

hadoop fs -touchz /user/hadoop-twq/cmd/flag.txt

Modify file permissions

hadoop fs -chmod  744  /user/hadoop-twq/cmd/put.txt

hadoop fs -chmod -R 777 /user/hadoop-twq/cmd

Each of the hadoop fs commands above has a one-to-one equivalent that starts with hdfs dfs instead.
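For example, any of the earlier commands can be written with either prefix and behaves identically (hadoop fs is the generic filesystem form; hdfs dfs is specific to HDFS):

```shell
# These pairs are equivalent:
hadoop fs -ls /user/hadoop-twq/cmd
hdfs dfs -ls /user/hadoop-twq/cmd

hadoop fs -cat /user/hadoop-twq/cmd/word.txt
hdfs dfs -cat /user/hadoop-twq/cmd/word.txt
```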

Other commands:

hadoop fs -rm   /user/hadoop-twq/cmd/put.txt

2. HDFS file recovery mechanism

 

fs.trash.interval = 3 (minutes) enables the trash-based file recovery mechanism. The value is in minutes; the default of 0 disables trash entirely.
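This property is set in core-site.xml. A minimal sketch of the configuration, using the 3-minute interval from the example above:

```xml
<property>
  <name>fs.trash.interval</name>
  <value>3</value>
</property>
```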

For example: When we delete files in the /user/hadoop-twq/cmd directory

The deleted files are moved to the /user/hadoop-twq/.Trash/Current/user/hadoop-twq/cmd directory. After 3 minutes, the files there are deleted as well, i.e. permanently removed and unrecoverable.

To restore the file, just copy it back to the original directory.

hadoop fs -cp /user/hadoop-twq/.Trash/Current/user/hadoop-twq/cmd /user/hadoop-twq/cmd

To delete a file or directory immediately while the trash recovery mechanism is enabled, add the -skipTrash flag.

hadoop fs -rm -r -skipTrash /user/hadoop-twq/cmd

3. Installing Hadoop locally on Windows

4. MapReduce unit testing, integration testing, and verification

Data preparation

(1) Meteorological data

isd-history data (a single sample record)


Origin: blog.csdn.net/qq_31905135/article/details/111579900