Fast recursive delete folder in Hadoop

I am trying to recursively delete a folder in HDFS. Something like: fs.delete(path, true)

However, the folder I'm trying to delete contains a very large number of files. Is there a way to delete the folder quickly?

My guess was that a recursive delete would not iterate over every file and would instead remove the folder in bulk; however, that does not seem to be the case, as I can see the files being deleted one by one.

Please let me know your suggestions. I am using Scala on EMR Spark and trying to delete files in S3.
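One point worth noting for the S3 case: S3 has no real directories, only object keys, so a "recursive folder delete" necessarily removes objects individually (the S3A connector batches them into bulk delete requests, but it is still proportional to the number of objects). A minimal sketch of the delete call, resolving the FileSystem from the path's own URI so the right implementation (S3 on EMR, HDFS or the local filesystem elsewhere) is picked up; the bucket and path here are hypothetical:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Delete a directory tree. The FileSystem is resolved from the path's
// URI scheme, so the same code works for s3://, hdfs:// or file:// paths.
def deleteRecursively(uri: String): Boolean = {
  val path = new Path(uri)
  val fs = FileSystem.get(path.toUri, new Configuration())
  fs.delete(path, true) // true = recursive; returns false if path is absent
}

// Hypothetical S3 path; on EMR the S3 credentials come from the cluster role.
// deleteRecursively("s3://my-bucket/path/to/folder")
```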

+3

1 answer


Use the Hadoop Java API; Scala is fully interoperable with it.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val fs = FileSystem.get(new Configuration())
fs.delete(new Path("path/to/file"), true) // delete file, true for recursive



Copied from here.

+3
