Are failed tasks resubmitted in Apache Spark?


Are failed tasks in Apache Spark automatically resubmitted, either to the same executor or to a different one?

I believe that failed tasks are resubmitted, because I have often seen individual task failures in the web UI without the job failing. However, if the same task fails several times in a row, the whole job is aborted:

  org.apache.spark.SparkException: Job aborted due to stage failure: Task 120 in stage 91.0 failed 4 times, most recent failure: Lost task 120.3 in stage 91.0    
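The "failed 4 times" in that message matches Spark's default task-retry limit: the scheduler resubmits a failed task (possibly on a different executor) until it has failed `spark.task.maxFailures` times (4 by default), after which the stage, and with it the job, is aborted. As a sketch, the limit can be raised when submitting the job (the application JAR name here is just a placeholder):

```shell
# Allow each task up to 8 attempts before the job is aborted
spark-submit \
  --conf spark.task.maxFailures=8 \
  my-app.jar
```

Note that this counts failures per task, so it helps with transient errors (e.g. a flaky node); a task that fails deterministically will still exhaust its attempts and kill the job.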
