java - Kill MapReduce job if driver program crashes


I have a driver program that launches a MapReduce job via `org.apache.hadoop.mapreduce.Job.waitForCompletion(boolean)` on Hadoop 2.4.0. The problem is that if the driver program crashes in the middle of the job, the job keeps running. Is there a way to kill the launched MapReduce job if the driver program crashes? Whether or not the driver program crashes is not under my control. I'm guessing this would require the client and the job to periodically poll each other. Is there a setting or method in the API for this?
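One partial approach from the driver side is a JVM shutdown hook that kills the job on exit. The sketch below is hypothetical (the class name `Driver` and job setup are placeholders, and it assumes the Hadoop client libraries are on the classpath); `Job.submit()`, `Job.isComplete()`, and `Job.killJob()` are part of the `org.apache.hadoop.mapreduce` API. Note the limitation: shutdown hooks run on normal exit and on SIGTERM/Ctrl+C, but not on `kill -9` or a hard machine crash, so this does not cover every crash scenario.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class Driver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "example-job");
        // ... set mapper, reducer, input and output paths here ...

        job.submit(); // submit without blocking so the hook can reference the job

        // Best-effort cleanup: kill the YARN application if the driver JVM
        // exits before the job completes. Does NOT run on kill -9 or a
        // hard crash of the driver machine.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            try {
                if (!job.isComplete()) {
                    job.killJob();
                }
            } catch (Exception e) {
                // the cluster may be unreachable during shutdown; nothing to do
            }
        }));

        job.waitForCompletion(false); // block until the job finishes
    }
}
```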

You can find the application ID (job ID) listed in the YARN web UI, or you can run `yarn application -list` on the YARN ResourceManager node of the cluster. You can then kill that application ID with the kill command: `yarn application -kill <applicationId>`. I guess that will solve your problem.
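For reference, the two commands look like this (the application ID below is illustrative; use the one reported by `-list` or shown in the web UI):

```shell
# List running YARN applications and note the Application-Id column
yarn application -list

# Kill the application by ID (IDs have the form application_<clusterTimestamp>_<sequence>)
yarn application -kill application_1234567890123_0001
```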

