
Spark iterate HDFS directory

Published on 2014-11-19 18:01:18

I have a directory of directories on HDFS, and I want to iterate over the directories. Is there any easy way to do this with Spark using the SparkContext object?

Questioner: Jon

You can use org.apache.hadoop.fs.FileSystem. Specifically, FileSystem.listFiles(path, true) lists every file under path recursively, returning a RemoteIterator[LocatedFileStatus].

And with Spark...

FileSystem.get(sc.hadoopConfiguration).listFiles(..., true)
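
Since listFiles returns a Hadoop RemoteIterator rather than a Scala collection, you drain it with hasNext/next. A minimal sketch, assuming an existing SparkContext named sc and a placeholder root of /data:

    import org.apache.hadoop.fs.{FileSystem, Path}

    // Assumes an existing SparkContext `sc`; "/data" is a placeholder root directory.
    val fs = FileSystem.get(sc.hadoopConfiguration)
    val files = fs.listFiles(new Path("/data"), true) // true = recurse into subdirectories

    while (files.hasNext) {
      println(files.next().getPath) // each element is a LocatedFileStatus
    }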

Edit

It's worth noting that good practice is to get the FileSystem associated with the Path's scheme rather than the default one, so the same code works whether the path lives on HDFS, S3, or the local filesystem:

path.getFileSystem(sc.hadoopConfiguration).listFiles(path, true)
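
For example, to collect every file path under a root into a buffer using the scheme-aware FileSystem (again a sketch; sc is an existing SparkContext and the hdfs://namenode:8020/data URI is a placeholder):

    import scala.collection.mutable.ArrayBuffer
    import org.apache.hadoop.fs.Path

    // Placeholder URI; the scheme (hdfs://) determines which FileSystem is returned.
    val root = new Path("hdfs://namenode:8020/data")
    val it = root.getFileSystem(sc.hadoopConfiguration).listFiles(root, true)

    val paths = ArrayBuffer.empty[Path]
    while (it.hasNext) paths += it.next().getPath
    paths.foreach(println)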