ssh - How to view the logs of a Spark job after it has completed and the context is closed?
I am running PySpark, Spark 1.3, standalone mode, client mode.
I am trying to investigate Spark jobs by looking at past jobs and comparing them. I want to view their logs, the configuration settings the jobs were submitted with, and so on. However, I am running into trouble viewing the logs of jobs after their context has been closed.
When I submit a job, of course, I open a Spark context. While the job is running, I am able to open the Spark web UI using SSH tunneling, and I can access it through the forwarded port at localhost:<port no>. There I can view the jobs that are running as well as the ones that have completed. Then, if I wish to see the logs of a particular job, I can use SSH tunnel port forwarding to reach the logs on the particular port of the particular machine for that job.
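For reference, the tunnels look roughly like this (the user name and host names are placeholders for my setup; 8080 and 8081 are the standalone master and worker web UI defaults):

# forward the master web UI to the local machine, then browse to http://localhost:8080
ssh -N -L 8080:localhost:8080 user@spark-master-host

# forward a worker web UI to read the stdout/stderr of its executors
ssh -N -L 8081:localhost:8081 user@spark-worker-host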
Then, if the job fails, the context is still open. When that happens, I am still able to see the logs with the above method.
But since I don't want to have all of these contexts open at once, I close the context when a job fails. When I close the context, the job appears under "Completed Applications" in the web UI. Now, when I try to view the logs using SSH tunnel port forwarding as before (localhost:<port no>), it gives me a "page not found" error.
How can I view the logs of a job after the context has been closed? And does this imply a relationship between the Spark context and where the logs are kept? Thank you.
Again, I am running PySpark, Spark 1.3, standalone mode, client mode.
This is exactly the use case for the Spark event log and the history server.
Enable the event log

If conf/spark-defaults.conf does not exist:
cp conf/spark-defaults.conf.template conf/spark-defaults.conf
Add the following configuration to conf/spark-defaults.conf.
# enable the event log
spark.eventLog.enabled true
# where to store the event log
spark.eventLog.dir file:///users/rockieyang/git/spark/spark-events
# tell the history server where the event log lives
spark.history.fs.logDirectory file:///users/rockieyang/git/spark/spark-events
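Note that the file:// directory has to exist before an application starts writing to it, otherwise the context will fail to come up. The path above is from my machine; substitute your own. Roughly:

mkdir -p /users/rockieyang/git/spark/spark-events

If you prefer not to touch spark-defaults.conf, the same settings can also be passed per job on the command line; a sketch, with your_script.py as a placeholder:

bin/spark-submit --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=file:///users/rockieyang/git/spark/spark-events your_script.py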
History server

Start the history server:
sbin/start-history-server.sh
Check the history in the browser; the default port is 18080.
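Since you are already tunneling to reach the live web UI, the same approach works for the history server; a sketch, assuming it runs on the master host with the default port:

ssh -N -L 18080:localhost:18080 user@spark-master-host
# then open http://localhost:18080 locally to browse the completed applications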