java - Kill MapReduce job if driver program crashes -


I have a driver program that launches a MapReduce job via org.apache.hadoop.mapreduce.Job.waitForCompletion(boolean) on Hadoop 2.4.0. The problem is that if the driver program crashes in the middle of the job, the job keeps running. Is there a way to kill the launched MapReduce job if the driver program crashes? Whether or not the driver program crashes is not under my control. I'm guessing this would require the client and the job to periodically poll each other. Is there a setting or a method in the API for this?
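For context, a minimal sketch of such a driver, assuming the standard Hadoop 2.x MapReduce API; the class name, job name, and the use of command-line arguments for the paths are placeholders, not taken from the question:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "my-job");
        job.setJarByClass(MyDriver.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // waitForCompletion(true) blocks in this JVM and prints progress,
        // but the job itself runs on the cluster: if this driver process dies,
        // the job is NOT killed automatically.
        boolean ok = job.waitForCompletion(true);
        System.exit(ok ? 0 : 1);
    }
}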

You can find the application ID (job ID) listed in the YARN web UI, or you can run yarn application -list on the YARN ResourceManager node of the cluster. You can then kill that application with the kill command: yarn application -kill <applicationId>. I guess that solves the problem.
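If you also want the driver to attempt cleanup on its own, one option (a sketch under assumptions, not something the answer above states) is to submit the job non-blockingly and register a JVM shutdown hook that calls Job.killJob(). This only helps when the driver JVM exits in a way that still runs shutdown hooks (uncaught exception, normal SIGTERM); it will not fire on kill -9 or a machine failure, which is why the yarn application -kill route remains the reliable fallback. The class name, job name, and polling interval below are placeholders:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class KillOnExitDriver {
    public static void main(String[] args) throws Exception {
        final Job job = Job.getInstance(new Configuration(), "kill-on-exit-job");
        job.setJarByClass(KillOnExitDriver.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Submit without blocking so we keep a live Job handle in this JVM.
        job.submit();

        // Best-effort cleanup: if this JVM shuts down before the job finishes,
        // ask the cluster to kill it. Only runs when shutdown hooks execute.
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                try {
                    if (!job.isComplete()) {
                        job.killJob();
                    }
                } catch (Exception e) {
                    // Nothing more we can do during shutdown.
                }
            }
        });

        // Poll until the job finishes, similar to what waitForCompletion does.
        while (!job.isComplete()) {
            Thread.sleep(5000);
        }
        System.exit(job.isSuccessful() ? 0 : 1);
    }
}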

