Harmanpreet Singh

Reputation: 1

How to access evaluation metrics in new SageMaker Studio UI after doing model.register?

I am building MLOps pipelines for a machine learning model. How do I access the evaluation metrics of my model in the SageMaker Studio UI after registering the model?

Here's my sample evaluation.json that I am saving in S3:

{
    "metric_groups": [
        {
            "name": "regression_metrics",
            "metric_data": [
                {
                    "name": "mse",
                    "value": 6107087691.96
                },
                {
                    "name": "mae",
                    "value": 46717.104
                },
                {
                    "name": "rmse",
                    "value": 78147.85
                },
                {
                    "name": "r2",
                    "value": 0.90
                }
            ]
        }
    ]
}

And here's my register step:

import logging
from sagemaker.workflow.functions import Join
from sagemaker.model_metrics import MetricsSource, ModelMetrics
from sagemaker.workflow.step_collections import RegisterModel


def create_register_step(
        role,
        sagemaker_session,
        model_package_group_name,
        model_approval_status,
        training_step,
        evaluation_step
):
    
    logging.basicConfig(level=logging.INFO)
    logging.info('Creating the register step')

    # log evaluation_report
    logging.info(f'Evaluation Report: {evaluation_step}')

    evaluation_s3_uri = evaluation_step.properties.ProcessingOutputConfig.Outputs['evaluation'].S3Output.S3Uri

    
    model_metrics = ModelMetrics(
        model_statistics=MetricsSource(
            s3_uri=Join(
                on="/",
                values=[
                    evaluation_s3_uri,
                    "evaluation.json"
                ]
            ),
            content_type="application/json"
        )
    )


    # Create the RegisterModel step
    register_step = RegisterModel(
        name='ModelRegisterStep',
        estimator=training_step.estimator,
        model_data=training_step.properties.ModelArtifacts.S3ModelArtifacts,
        content_types=["text/csv"],
        response_types=["text/csv"],
        inference_instances=["ml.m5.large", "ml.m5.xlarge"],
        transform_instances=["ml.m5.large"],
        model_package_group_name=model_package_group_name,
        approval_status=model_approval_status,
        model_metrics=model_metrics
    )

    return register_step

My pipeline executes successfully, but I cannot see the evaluation metrics in the Studio UI.

I have also tried manually adding the evaluation report from S3 to the model version, but it doesn't work.

Upvotes: 0

Views: 83

Answers (1)

Harmanpreet Singh

Reputation: 1

I figured out the issue: the evaluation JSON format was incorrect. We simply need to use:

{
    "metrics": {
        "mse": {
            "value": 6107087691.964753
        },
        "mae": {
            "value": 46717.104932016475
        },
        "rmse": {
            "value": 78147.85788468391
        },
        "r2": {
            "value": 0.9062238811893062
        }
    }
}

Earlier I got confused because, when I tried to add an evaluation report manually to the model registry, I was getting an error saying the JSON requires metric_groups and metric_data.
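For completeness, here is a minimal sketch of how the evaluation script could build and serialize the report in this flat "metrics" layout. The function name and the hard-coded metric values are just for illustration; in the real pipeline the values come from your evaluation step, and the JSON is written to the processing output directory (e.g. /opt/ml/processing/evaluation/evaluation.json) so SageMaker uploads it to the S3 URI the register step points at.

```python
import json


def build_evaluation_report(mse, mae, rmse, r2):
    """Build the evaluation report in the flat "metrics" layout
    that the SageMaker Studio model registry UI renders."""
    return {
        "metrics": {
            "mse": {"value": mse},
            "mae": {"value": mae},
            "rmse": {"value": rmse},
            "r2": {"value": r2},
        }
    }


# Serialize the report; in the evaluation script this string would be
# written to the processing job's output directory instead of kept in memory.
report_json = json.dumps(
    build_evaluation_report(
        6107087691.964753, 46717.104932016475, 78147.85788468391, 0.9062238811893062
    )
)
```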

Upvotes: 0
