Interacting with Hadoop/HDFS from Python: reading, downloading, uploading, and deleting files

2022-12-30 08:28:46

1. Running commands from Python

# import the Python subprocess module
import subprocess

def run_cmd(args_list):
    """
    Run a Linux command and return its exit code, stdout, and stderr.
    """
    print('Running system command: {0}'.format(' '.join(args_list)))
    proc = subprocess.Popen(args_list, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    # communicate() returns stdout and stderr as bytes in Python 3
    s_output, s_err = proc.communicate()
    s_return = proc.returncode
    return s_return, s_output, s_err
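
For reference, a minimal usage sketch (the path /user is just a placeholder; because communicate() returns bytes, the output must be decoded before being used as text):

# List an HDFS directory and print the decoded listing (placeholder path).
ret, out, err = run_cmd(['hdfs', 'dfs', '-ls', '/user'])
if ret == 0:
    print(out.decode('utf-8'))
else:
    print('command failed with code {0}: {1}'.format(ret, err.decode('utf-8')))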

2. Common HDFS commands

ls: return information about a file or directory

For a file, the information is returned in this format:
filename <number of replicas> file size modification date modification time permissions user ID group ID
For a directory, a list of its direct children is returned, as in Unix. The listing has this format:
directory name <dir> modification date modification time permissions user ID group ID
Example:
hadoop fs -ls /user/hadoop/file1 /user/hadoop/file2 hdfs://host:port/user/hadoop/dir1 /nonexistentfile
Return value:
returns 0 on success and -1 on failure.

# Run Hadoop ls command in Python
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-ls', 'hdfs_file_path'])
lines = out.decode('utf-8').split('\n')  # decode bytes before splitting
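
If you need the individual paths, here is a small parsing sketch; it assumes the default listing format, where the path is the last whitespace-separated field of each entry line (paths containing spaces would need more careful handling):

# Extract the path column from the ls output, skipping the 'Found N items' header.
paths = [line.rsplit(None, 1)[-1]
         for line in lines
         if line and not line.startswith('Found')]
print(paths)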

get: download a file from HDFS to the local file system

# Run Hadoop get command in Python
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-get', 'hdfs_file_path', 'local_path'])
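
A slightly more defensive sketch: assuming (as with cp) that the local destination directory must already exist, create it before downloading (both paths are placeholders):

import os

# Create the local target directory if needed, then download into it.
local_dir = '/tmp/downloads'  # hypothetical local directory
os.makedirs(local_dir, exist_ok=True)
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-get', 'hdfs_file_path', local_dir])
print(ret, err.decode('utf-8'))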

put: copy one or more source paths from the local file system to the destination file system. It also supports reading from standard input and writing to the destination file system (see the stdin sketch after the example below).

# Run Hadoop put command in Python
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-put', 'local_file', 'hdfs_file_path'])
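
To illustrate the stdin variant, a minimal sketch: passing - as the source makes put read from standard input, which is fed through the pipe here (the target path is hypothetical):

import subprocess

# Pipe in-memory bytes into 'hdfs dfs -put -', which reads from stdin.
proc = subprocess.Popen(['hdfs', 'dfs', '-put', '-', '/tmp/from_stdin.txt'],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
out, err = proc.communicate(input=b'hello hdfs\n')
print(proc.returncode, err.decode('utf-8'))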

copyFromLocal: like put, except that the source is restricted to a local file reference

# Run Hadoop copyFromLocal command in Python
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-copyFromLocal', 'local_file', 'hdfs_file_path'])

copyToLocal: like get, except that the destination is restricted to a local file reference

# Run Hadoop copyToLocal command in Python
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-copyToLocal', 'hdfs_file_path', 'local_file'])

rm: delete files on HDFS. Without -r this removes files only; it will not delete a directory (see rm -r below).

# The plain shell command:
# hdfs dfs -rm -skipTrash /path/to/file/you/want/to/remove/permanently

# Run Hadoop remove file command in Python
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-rm', 'hdfs_file_path'])
# -skipTrash deletes the file immediately instead of moving it to the trash
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-rm', '-skipTrash', 'hdfs_file_path'])

rm -r: recursively delete a path on HDFS; this removes directories together with all of their contents (a cautious sketch follows the examples below)

# The plain shell command:
# rm -r
# HDFS command to remove an entire directory and all of its content from HDFS.
# Usage: hdfs dfs -rm -r <path>

(ret, out, err) = run_cmd(['hdfs', 'dfs', '-rm', '-r', 'hdfs_file_path'])
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-rm', '-r', '-skipTrash', 'hdfs_file_path'])
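
Because rm -r is destructive, here is a cautious sketch that checks the path exists before deleting, using the test subcommand described in the next section (the path is hypothetical; -skipTrash is omitted so the HDFS trash still applies):

# Delete a directory tree only if it actually exists on HDFS.
hdfs_dir = '/tmp/some_dir'  # hypothetical path
ret, out, err = run_cmd(['hdfs', 'dfs', '-test', '-e', hdfs_dir])
if ret == 0:
    ret, out, err = run_cmd(['hdfs', 'dfs', '-rm', '-r', hdfs_dir])
    print('delete returned', ret, err.decode('utf-8'))
else:
    print('nothing to delete:', hdfs_dir)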

test + options: check whether a path exists

Check whether a file exists in HDFS.
Usage: hadoop fs -test -[defsz] URI

Options:

-d: if the path is a directory, return 0.
-e: if the path exists, return 0.
-f: if the path is a file, return 0.
-s: if the path is not empty, return 0.
-z: if the file is zero length, return 0.

Example:

hadoop fs -test -e filename

# Check whether a path exists; 'test -e' returns 0 if it does.
hdfs_file_path = '/tmpo'  # an example path that may not exist on your cluster
cmd = ['hdfs', 'dfs', '-test', '-e', hdfs_file_path]
ret, out, err = run_cmd(cmd)
print(ret, out, err)
if ret:  # a non-zero return code means the path does not exist
    print('file does not exist')
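
Putting the pieces together, a sketch that uploads a local file only when the HDFS target does not already exist (both paths are hypothetical):

# Upload only if the destination is absent, to avoid 'File exists' errors from put.
local_file = '/tmp/data.csv'    # hypothetical local path
hdfs_target = '/user/data.csv'  # hypothetical HDFS path
ret, out, err = run_cmd(['hdfs', 'dfs', '-test', '-e', hdfs_target])
if ret:  # non-zero: target missing, safe to upload
    ret, out, err = run_cmd(['hdfs', 'dfs', '-put', local_file, hdfs_target])
    print('put returned', ret, err.decode('utf-8'))
else:
    print('target already exists:', hdfs_target)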

This article is translated from Interacting-with-Hadoop-HDFS-using-Python-codes.
Other references:
the Hadoop shell guide
the HDFS client library snakebite

  • Author: 小饼干超人
  • Original link: https://blog.csdn.net/m0_37586991/article/details/120764909