Dashboard

This page shows how to use the Data Mechanics dashboard to monitor your Spark applications. It assumes that you know how to run a Spark application on Data Mechanics.

The dashboard is available in your cluster at http(s)://<your-cluster-url>/dashboard/. You will be prompted for a username and password when you connect.
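
Since the dashboard sits behind this login prompt, you can also check its availability from a script. Below is a minimal sketch, assuming the username/password prompt corresponds to HTTP basic auth; the credentials are illustrative placeholders, and your cluster may use a different authentication scheme.

```python
# Minimal reachability check for the dashboard, assuming HTTP basic auth
# (an assumption: the login prompt may be backed by another auth scheme).
import requests

CLUSTER_URL = "https://<your-cluster-url>"  # placeholder, replace with your cluster URL

response = requests.get(
    f"{CLUSTER_URL}/dashboard/",
    auth=("my-username", "my-password"),  # hypothetical credentials
    timeout=10,
)
response.raise_for_status()
print("Dashboard is up:", response.status_code)
```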

The dashboard currently contains two pages:

  • the applications list page
  • the application page

Applications list page

This page contains a table with a row for every Spark application recently run on the cluster. The fields are:

  • the application name
  • the job name
  • the start time of the Spark application
  • the end time of the Spark application if the application is finished
  • the duration of the Spark application
  • the number of executors (total if the app is finished, or live if the app is running)
  • the status: running, completed or failed
  • action buttons to kill a running Spark app, or delete a Spark app from history

Clicking a row takes you to that application's page.

Application page

This page contains more information about a specific Spark application.

The top-left card shows the same information as the applications list page for this Spark application. While the application is running, it also provides a link to the live Spark UI.

The bottom-left card contains the configuration of the Spark application. It is exactly the same object as the payload returned by the Data Mechanics API when you submit an application with POST https://<your-cluster-url>/api/apps (see how to run a Spark application).
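
To see the correspondence for yourself, you can submit an application through the API and compare the returned object with what this card displays. The sketch below assumes Python with the requests library; the endpoint comes from this page, but the payload fields and credentials are hypothetical placeholders rather than the documented schema, so refer to the page on running Spark applications for the real payload format.

```python
# Sketch: submit an application via the API and print the returned
# configuration object, which is what the bottom-left card displays.
# The payload fields below are hypothetical, not the documented schema.
import requests

CLUSTER_URL = "https://<your-cluster-url>"  # placeholder

payload = {
    "jobName": "daily-ingest",            # hypothetical field
    "configOverrides": {                  # hypothetical field
        "mainClass": "com.example.Ingest",
    },
}

response = requests.post(
    f"{CLUSTER_URL}/api/apps",
    json=payload,
    auth=("my-username", "my-password"),  # hypothetical credentials
    timeout=30,
)
response.raise_for_status()

# The returned object matches the configuration shown in the dashboard.
print(response.json())
```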

The right card shows a live stream of the Spark driver log.