CDAP / CDAP-12704

Athena - Experiments details page

    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 5.0.0
    • Component/s: UI
    • Labels:
    • Sprint:
      5.0 02/27
    • Release Notes:
      Added support for an automated, UI-driven flow for machine learning. This feature is available under the Analytics section and allows users to:
      1. Create experiments and organize different models under each experiment
      2. Explore predictions of individual models
      3. Operationalize a model by adding it to a pipeline

      Description

      As a CDAP user, I want to be able to view and manage all the models related to one experiment.
      User Flow:
      The user lands on this page after either creating an experiment (and a model) or clicking on an experiment from the Experiment Overview page.
      In the Experiment details page, the user has an overview of the experiment and all the models belonging to that experiment.

      The top area contains information related to the experiment:

      • Name of the experiment
      • Description (editable)
      • The username of the person who created the experiment ("Created by")
      • Test Data: displays the name of the root folder, the file's parent directory, and the file name; any intermediate directories are collapsed into an ellipsis (e.g. file_system/…/Titanic_data/Titanic.csv). When the user hovers over the Test Data path, a tooltip displays the entire path.
      • Outcome: an experiment can have only one outcome, and it cannot be edited. To change the outcome, the user has to create a new experiment.
      • Last Trained: displays the date and time the most recent model was trained.
      • # of Models: the number of models in this experiment.
      • # Deployed: the number of models in this experiment that have been deployed.
      • Visualization of experiment metrics:
        • Algorithm Type: pie chart showing the distribution of algorithms used by the models in the experiment (Screen Shot 2017-10-17 at 2.52.17 PM.png)
        • Status: pie chart showing the distribution of model statuses (Screen Shot 2017-10-17 at 2.51.59 PM.png)
        • F1: bar chart showing the distribution of models aggregated by F1 value (Screen Shot 2017-10-17 at 4.00.32 PM.png)
        • Accuracy: bar chart showing the distribution of models aggregated by Accuracy value (Screen Shot 2017-10-17 at 4.05.43 PM.png)
        • RMSE: bar chart showing the distribution of models aggregated by RMSE value (Screen Shot 2017-10-17 at 3.53.50 PM.png)
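      The F1/Accuracy/RMSE bar charts above aggregate models into value buckets. A minimal sketch of that binning logic (the bin width of 0.1 and the function name are assumptions, not from this ticket):

      ```typescript
      // Hypothetical sketch: bucket model metric values (e.g. F1, Accuracy)
      // into fixed-width bins for a distribution bar chart.
      // The 0.1 bin width is an assumption, not specified in the ticket.
      function binMetric(values: number[], binWidth = 0.1): Map<string, number> {
        const bins = new Map<string, number>();
        for (const v of values) {
          const lo = Math.floor(v / binWidth) * binWidth;
          const label = `${lo.toFixed(1)}-${(lo + binWidth).toFixed(1)}`;
          bins.set(label, (bins.get(label) ?? 0) + 1);
        }
        return bins;
      }
      ```

      For example, three models with F1 values 0.91, 0.95, and 0.72 would produce a bar of height 2 for the "0.9-1.0" bucket and a bar of height 1 for "0.7-0.8".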
      Acceptance Criteria

      • A user can edit the experiment description
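      The Test Data path rule described above (keep the root folder and the file's parent directory, collapse everything in between) can be sketched as follows; the function name is hypothetical:

      ```typescript
      // Hypothetical sketch of the Test Data display rule: keep the root
      // folder, the file's parent directory, and the file name; collapse
      // any intermediate directories into an ellipsis.
      function truncatePath(fullPath: string): string {
        const parts = fullPath.split("/");
        if (parts.length <= 3) return fullPath; // nothing to collapse
        const root = parts[0];
        const parent = parts[parts.length - 2];
        const file = parts[parts.length - 1];
        return `${root}/…/${parent}/${file}`;
      }
      ```

      A path like `file_system/users/lea/Titanic_data/Titanic.csv` would render as `file_system/…/Titanic_data/Titanic.csv`, matching the example in the spec; the tooltip on hover would still show the untruncated path.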

      Delete Experiment
      An experiment can be deleted

      Models table
      Metadata is surfaced for each model and organized into the following columns:

      • Model name: truncated with an ellipsis if the name is longer than the allocated space; the full name is displayed in a tooltip on hover
      • Status:
        Splitting Data (icon: spinner)
        Training (icon: spinner)
        Failed Training (icon: circle with red outline)
        Trained (icon: circle with green outline)
        Deployed (icon: green circle)
      • Algorithm (Bhooshan needs to fill in the algorithms list): truncated with an ellipsis if the name is longer than the allocated space; the full name is displayed in a tooltip on hover
      • Hyperparameters: when the user selects the gear icon, a tooltip appears displaying the hyperparameters. (Ask Bhooshan whether the user needs to copy the hyperparameters.) The icon is rendered in grey (hex #dbdbdb); when the user hovers over the row, or when the model is open, the icon changes to blue. Screen Shot 2017-10-18 at 9.36.06 AM.png
      • Precision (this metric is displayed only for classification models)
      • Recall (this metric is displayed only for classification models)
      • F1 (this metric is displayed only for classification models)
      • RMSE (this metric is displayed only for regression models)
      • Delete icon: a user can delete a model by selecting this icon. Models that are being trained cannot be deleted, and the icon should be disabled. The icon is rendered in grey (hex #dbdbdb); when the user hovers over the row, or when the model is open, it changes to blue.
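      The column rules above can be summed up in a small sketch (type and function names are assumptions, not the CDAP data model): classification models surface Precision, Recall, and F1; regression models surface RMSE; and the delete icon is disabled while a model is in progress.

      ```typescript
      // Hypothetical sketch of the models-table column rules.
      type ModelKind = "classification" | "regression";
      type ModelStatus =
        | "Splitting Data"
        | "Training"
        | "Failed Training"
        | "Trained"
        | "Deployed";

      // Precision/Recall/F1 are shown only for classification models,
      // RMSE only for regression models.
      function visibleMetricColumns(kind: ModelKind): string[] {
        return kind === "classification" ? ["Precision", "Recall", "F1"] : ["RMSE"];
      }

      // Assumption: "Splitting Data" also counts as in-progress training,
      // so the delete icon stays disabled for it as well.
      function deleteIconEnabled(status: ModelStatus): boolean {
        return status !== "Training" && status !== "Splitting Data";
      }
      ```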

      Model Details
      Model rows: on hover, the row background changes to grey (hex #f5f5f5) and the icons change from grey to blue.
      The user can view details about each of the models by clicking on the row.

      • Model Description
      • # Directives: displays the number of directives applied to the model. The user can view a list of all directives by clicking the "View" link, which opens a modal.
      • Features (#): displays a list of the features used in the model. If the list overflows the allocated space, an ellipsis is displayed.
      • Tags (#): the user can tag a model (the functionality is similar to Tags in Application Details). The tags can expand across all three columns.
      • Deployed On: displays the date the model was deployed. (If the model has not been deployed, the date is displayed as "--".)
      • Created by: <username> of the person who created the model, and the date of creation
      • Model ID: the unique identifier assigned to each model. The user can view it and copy it to the clipboard by selecting the "Copy to clipboard" link. Screen Shot 2017-10-18 at 12.58.01 PM.png
      • Score Using: These are snippets of code the user can use to score the model in different languages:
        • Spark
        • PySpark
        • Python
        • R
        Each language has a "View Code" link which, when selected, opens a tooltip displaying the code and a "Copy to clipboard" button. Screen Shot 2017-10-18 at 1.22.01 PM.png
        After the button is selected, it changes to a green "Copied to clipboard". Screen Shot 2017-10-18 at 1.22.16 PM.png
      • Model Logs: a new tab opens to display the logs

      Edit Button functionality

      • A user can clone a model and change specific parameters by selecting the [Edit] CTA. A tooltip is displayed showing the following options:
        Screen Shot 2017-10-18 at 1.52.03 PM.png
        • Directives - selecting this option takes the user to the Data Prep view of the file, with the directives related to the model applied. The experiment is already selected, and the model name is highlighted as a copy of the original name plus "_copy". All other model configurations (experiment, directives, splitting method, ML algorithm) are cloned. Screen Shot 2017-10-18 at 1.57.24 PM.png
        • Data Split - the user is taken to the Data Split view. The model is cloned, the name is automatically assigned (<original name>_copy), and the directives from the original model are applied.
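      The clone rule above (everything carried over, only the name gains a "_copy" suffix) can be sketched as follows; the `ModelConfig` shape is an assumption for illustration, not the actual CDAP data model:

      ```typescript
      // Hypothetical shape of a model's configuration; field names are
      // assumptions based on the spec, not a real CDAP API.
      interface ModelConfig {
        experiment: string;
        directives: string[];
        splitMethod: string;
        algorithm: string;
        name: string;
      }

      // The clone keeps the experiment, directives, split method, and
      // algorithm; only the name changes, gaining a "_copy" suffix.
      function cloneModel(original: ModelConfig): ModelConfig {
        return {
          ...original,
          directives: [...original.directives], // copied so edits don't mutate the original
          name: `${original.name}_copy`,
        };
      }
      ```

      So cloning a model named `titanic_survival` yields `titanic_survival_copy` with an identical experiment, directive list, split method, and algorithm.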

      Acceptance Criteria

      • The user can edit the Model Description
      • Training: while a model is training, the model details display three CTAs: [Deploy] [Edit] [Terminate Training]. The first two are disabled. [Terminate Training] opens a confirmation modal:
        "Are you sure you want to terminate training <model name>?
        Terminating training is final and cannot be undone. Once you terminate training of a model, you can no longer use that model and have to create a new one."
        [Yes, terminate training] [No, keep training]
      • Failed Training: When a model has failed training the model details display the following CTA [Edit]
      • Trained: When a model has been trained the following CTA is displayed: [Deploy] [Edit]
      • Deployed: only the CTA [Edit] is displayed.
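      The CTA rules in the acceptance criteria above can be expressed as a small lookup; the type and function names are assumptions for illustration:

      ```typescript
      // Hypothetical sketch of which CTAs each model status shows, and
      // which of those are disabled, per the acceptance criteria.
      type ModelStatus = "Training" | "Failed Training" | "Trained" | "Deployed";
      interface CTA { label: string; enabled: boolean }

      function ctasFor(status: ModelStatus): CTA[] {
        switch (status) {
          case "Training":
            // Deploy and Edit are shown but disabled while training.
            return [
              { label: "Deploy", enabled: false },
              { label: "Edit", enabled: false },
              { label: "Terminate Training", enabled: true },
            ];
          case "Failed Training":
            return [{ label: "Edit", enabled: true }];
          case "Trained":
            return [
              { label: "Deploy", enabled: true },
              { label: "Edit", enabled: true },
            ];
          case "Deployed":
            return [{ label: "Edit", enabled: true }];
          default:
            throw new Error(`unknown status: ${status}`);
        }
      }
      ```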

      InVision prototype: https://invis.io/6ADXK589F#/257711545_Experiment_View-Flow_1

        People

        • Assignee: Ajai Narayan (ajai)
        • Reporter: Lea (lea)
        • Votes: 0
        • Watchers: 3
