I've noticed that some extra folders like "spark-uuid" are created in
System.getProperty("java.io.tmpdir")
but some of them are never deleted. The problem is somewhere here: https://github.com/holdenk/spark-testing-base/blob/79eef40cdab48ee7aca8902754e3c456f569eea6/core/src/main/1.3/scala/com/holdenkarau/spark/testing/Utils.scala#L109 — it returns true, yet the file is not deleted. I read a bit about java.io.File#delete: it seems a file can fail to be deleted for two general reasons, insufficient permissions or the file being held open by some process. Since some of the folders are deleted, permissions don't look like the cause. It's really annoying to accumulate megabytes in the temp directory.
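One way to get more information than java.io.File#delete's bare boolean is java.nio.file.Files.delete, which throws an exception naming the reason the deletion failed (this is a diagnostic sketch, not code from the issue; the directory names are made up for illustration):

```scala
import java.nio.file.{DirectoryNotEmptyException, Files}

object DeleteDiagnosis {
  def main(args: Array[String]): Unit = {
    // Simulate a leftover temp directory that still contains a file.
    val dir = Files.createTempDirectory("spark-test-")
    Files.createFile(dir.resolve("leftover"))

    try {
      // Unlike File#delete, Files.delete throws a specific exception
      // (DirectoryNotEmptyException, AccessDeniedException, ...) on failure.
      Files.delete(dir)
    } catch {
      case e: DirectoryNotEmptyException =>
        println(s"could not delete, directory not empty: ${e.getFile}")
    }
  }
}
```

Running something like this inside the cleanup path would at least reveal whether the failure is a permissions problem or a file still being held (or still present) when the delete runs.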
Spark version 2.2.1, Scala version 2.11.
Any thoughts here?