Dashboard

The dashboard is available in your cluster at http(s)://<your-cluster-url>/dashboard/.

If you use Data Mechanics authentication, you will be asked for a username and password when you connect. Otherwise, we advise you to set up your own authentication mechanism, especially if your cluster is open to the internet. See more about this topic here.

The dashboard currently contains two pages:

  • the applications list page
  • the application page

Applications list page

This page contains a table with a row for every Spark application recently run on the cluster. The fields are:

  • the application name
  • the job name
  • the start time of the Spark application
  • the end time of the Spark application if the application is finished
  • the duration of the Spark application
  • the number of executors (total if the app is finished, or live if the app is running)
  • the status: running, completed or failed

Clicking a row takes you to the application page for that application.
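
If you prefer to read this list programmatically, the minimal sketch below fetches it over HTTP. It assumes your cluster exposes a GET endpoint at /api/apps alongside the POST endpoint used for submission, and that each record carries fields similar to the columns above; both the endpoint and the field names are assumptions, so check your cluster's API before relying on them.

    # Minimal sketch: list recent Spark applications over the cluster API.
    # Assumptions: a GET /api/apps endpoint exists and returns a JSON list
    # whose records roughly match the dashboard columns. The field names
    # below are illustrative, not a documented schema.
    import requests

    BASE_URL = "https://<your-cluster-url>"  # replace with your cluster URL

    response = requests.get(f"{BASE_URL}/api/apps", timeout=30)
    response.raise_for_status()

    for app in response.json():
        print(app.get("appName"), app.get("status"), app.get("startTime"))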

Application page

This page contains more information about a specific Spark application.

The top-left card contains the same information about this Spark application as the applications list page.

The bottom-left card contains the configuration of the Spark application. This is exactly the same object as the payload returned by the Data Mechanics API when submitting an application with POST http(s)://<your-cluster-url>/api/apps (see how to run a Spark application here).
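
For illustration, here is a minimal sketch of submitting an application against that endpoint and printing the configuration object the API returns. The payload fields shown are hypothetical placeholders rather than a documented schema; the point is that the JSON object returned by the POST call is the same object this card displays.

    # Minimal sketch: submit a Spark application and print the configuration
    # the API returns -- the same object shown in the bottom-left card.
    # The payload fields are hypothetical placeholders, not a documented schema.
    import requests

    BASE_URL = "https://<your-cluster-url>"  # replace with your cluster URL

    payload = {
        "jobName": "daily-report",  # hypothetical example values
        "configOverrides": {
            "mainApplicationFile": "local:///opt/app/main.py",
        },
    }

    response = requests.post(f"{BASE_URL}/api/apps", json=payload, timeout=30)
    response.raise_for_status()
    print(response.json())  # the full application configuration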

The right card shows a live stream of the Spark driver log.
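
If you also want the driver log outside the dashboard and have kubectl access to the underlying Kubernetes cluster, you can follow it directly. The sketch below assumes you have looked up the driver pod name first (for example with kubectl get pods), since the pod naming is not specified here.

    # Minimal sketch: follow the Spark driver log from a terminal, assuming
    # kubectl access to the cluster. "<driver-pod>" is a placeholder -- find
    # the real pod name with `kubectl get pods` in your Spark namespace.
    import subprocess

    subprocess.run(["kubectl", "logs", "-f", "<driver-pod>"], check=True)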