Are failed tasks resubmitted in Apache Spark?


Are failed tasks automatically resubmitted in Apache Spark, either to the same executor or to a different one?

I believe that failed tasks are resubmitted, because I have seen the same failed task appear multiple times on the web UI. However, if the same task fails several times, the whole job is aborted:

  org.apache.spark.SparkException: Job aborted due to stage failure: Task 120 in stage 91.0 failed 4 times, most recent failure: Lost task 120.3 in stage 91.0
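
The "failed 4 times" in the message matches Spark's per-task retry limit, which is controlled by the `spark.task.maxFailures` configuration (default 4). A minimal sketch of setting it when building the context; the app name, master, and value chosen here are just for illustration:

    import org.apache.spark.{SparkConf, SparkContext}

    // Raise the number of attempts allowed per task before the job is aborted.
    // The default of 4 corresponds to the "failed 4 times" message above.
    val conf = new SparkConf()
      .setAppName("retry-example")      // illustrative name
      .setMaster("local[*]")            // illustrative master
      .set("spark.task.maxFailures", "8")

    val sc = new SparkContext(conf)

Raising this value does not change whether tasks are resubmitted; it only changes how many failed attempts of the same task Spark tolerates before giving up on the stage.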
