Are failed tasks resubmitted in Apache Spark?
Are failed tasks in Apache Spark automatically resubmitted to the same or another executor?
I believe that failed tasks are resubmitted, because I have seen the same failed task resubmitted multiple times on the web UI. However, if the same task fails several times, the whole job fails:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 120 in stage 91.0 failed 4 times, most recent failure: Lost task 120.3 in stage 91.0
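For context, the number of times a single task may fail before the job is aborted is controlled by the `spark.task.maxFailures` configuration property, whose default of 4 matches the "failed 4 times" in the message above. A minimal sketch of raising that limit when submitting a job; the jar path and class name are placeholders, not from the original post:

```shell
# Allow each task up to 8 failures before the job is aborted
# (default is 4; per-task retries happen automatically before that).
# my-job.jar and com.example.MyJob are placeholder names.
spark-submit \
  --conf spark.task.maxFailures=8 \
  --class com.example.MyJob \
  my-job.jar
```

The same property can also be set programmatically on a `SparkConf` before the context is created.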