Dashboard

This page shows how to use the Data Mechanics dashboard to monitor your Spark applications. It assumes you already know how to run a Spark application on Data Mechanics.

The dashboard is available in your cluster at https://<your-cluster-url>/dashboard/. You will be asked for a username and password when you connect.
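If you want to check programmatically that the dashboard is reachable, here is a minimal sketch in Python. It assumes the dashboard is protected with HTTP Basic authentication; the exact authentication scheme may differ on your cluster.

    import requests

    # Replace the placeholders with your actual cluster URL and credentials.
    DASHBOARD_URL = "https://<your-cluster-url>/dashboard/"

    # Assumption: the dashboard uses HTTP Basic authentication.
    response = requests.get(DASHBOARD_URL, auth=("<username>", "<password>"))
    response.raise_for_status()
    print("Dashboard is reachable")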

The dashboard currently contains two pages:

  • the applications list page
  • the application page

Applications list page

This page contains a table with a row for each Spark application recently run on the cluster. The columns are:

  • the application name
  • the job name
  • the start time of the Spark application
  • the end time of the Spark application if the application is finished
  • the duration of the Spark application
  • the status: running, completed or failed
  • the current number of executors for a live Spark application
  • action buttons to kill a running Spark app, or delete a Spark app from history

Clicking a row takes you to the application page for that application.
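If you prefer to read this table programmatically, here is a minimal sketch. It assumes the API also exposes GET https://<your-cluster-url>/api/apps to list applications; only the POST endpoint is documented on this page, so treat both the endpoint and the response fields as assumptions.

    import requests

    API_URL = "https://<your-cluster-url>/api/apps"

    # Assumption: GET /api/apps returns the list of recent applications
    # with fields mirroring the dashboard table (name, job name, status, ...).
    response = requests.get(API_URL, auth=("<username>", "<password>"))
    response.raise_for_status()

    for app in response.json():
        # Field names here are illustrative, not a documented schema.
        print(app.get("appName"), app.get("status"))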

Application page

This page contains more information about a specific Spark application.

The first panel shows the same information and action buttons as the applications list page for this application. While the application is running, it also provides a link to the live Spark UI.

The next panel contains the configuration of the Spark application. It is exactly the same object as the payload returned by the Data Mechanics API when you submit an application with POST https://<your-cluster-url>/api/apps (see how to run a Spark application here).
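For reference, here is a minimal submission sketch in Python. The POST endpoint comes from this page, but the configuration fields in the payload (appName, jobName, mainApplicationFile) are illustrative assumptions, not a documented schema.

    import requests

    API_URL = "https://<your-cluster-url>/api/apps"

    # Illustrative configuration; the real field names and values depend on
    # your cluster and on the Data Mechanics API documentation.
    config = {
        "appName": "daily-aggregation",  # hypothetical application name
        "jobName": "daily-aggregation",  # hypothetical job name
        "mainApplicationFile": "s3://my-bucket/jobs/aggregate.py",  # hypothetical
    }

    response = requests.post(API_URL, json=config, auth=("<username>", "<password>"))
    response.raise_for_status()

    # The response payload is the configuration object shown in this panel.
    print(response.json())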

The last panel shows a live stream of the Spark driver log.
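The same stream can in principle be consumed outside the browser. The sketch below assumes a hypothetical streaming endpoint at https://<your-cluster-url>/api/apps/<app-name>/log; no such endpoint is documented here, so this is purely illustrative.

    import requests

    # Hypothetical endpoint: only POST /api/apps is documented on this page,
    # so this URL is an assumption made for illustration purposes.
    LOG_URL = "https://<your-cluster-url>/api/apps/<app-name>/log"

    with requests.get(LOG_URL, auth=("<username>", "<password>"), stream=True) as response:
        response.raise_for_status()
        # Print log lines as they arrive, mimicking the dashboard's live stream.
        for line in response.iter_lines(decode_unicode=True):
            if line:
                print(line)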