The Amazon SageMaker Python SDK is an open-source library for training and deploying machine learning (ML) models on Amazon SageMaker. Enterprise customers in tightly regulated industries such as healthcare and finance set up security guardrails to ensure their data is encrypted and traffic doesn't traverse the internet. To ensure that SageMaker training and deployment of ML models follow these guardrails, it's a common practice to set restrictions at the account or AWS Organizations level through service control policies and AWS Identity and Access Management (IAM) policies to enforce the usage of specific IAM roles, Amazon Virtual Private Cloud (Amazon VPC) configurations, and AWS Key Management Service (AWS KMS) keys. In such cases, data scientists have to provide these parameters to their ML model training and deployment code manually, by noting down subnets, security groups, and KMS keys. This puts the onus on data scientists to remember to specify these configurations in order to run their jobs successfully and avoid Access Denied errors.
Starting with SageMaker Python SDK version 2.148.0, you can now configure default values for parameters such as IAM roles, VPCs, and KMS keys. Administrators and end users can initialize AWS infrastructure primitives with defaults specified in a configuration file in YAML format. Once configured, the Python SDK automatically inherits these values and propagates them to the underlying SageMaker API calls such as CreateEndpointConfig(), with no additional actions needed. The SDK also supports multiple configuration files, allowing admins to set a configuration file for all users, which users can override via a user-level configuration that can be stored in Amazon Simple Storage Service (Amazon S3), Amazon Elastic File System (Amazon EFS) for Amazon SageMaker Studio, or the user's local file system.
In this post, we show you how to create and store the default configuration file in Studio and use the SDK defaults feature to create your SageMaker resources.
We demonstrate this new feature with an end-to-end AWS CloudFormation template that creates the required infrastructure and a Studio domain in the deployed VPC. In addition, we create KMS keys for encrypting the volumes used in training and processing jobs. The steps are as follows:
- Launch the CloudFormation stack in your account. Alternatively, if you want to explore this feature on an existing SageMaker domain or notebook, skip this step.
- Populate the config.yaml file and save it in the default location.
- Run a sample notebook with an end-to-end ML use case, including data processing, model training, and inference.
- Override the default configuration values.
Before you get started, make sure you have an AWS account and an IAM user or role with administrator privileges. If you're a data scientist currently passing infrastructure parameters to resources in your notebook, you can skip the next step of setting up your environment and start creating the configuration file.
To use this feature, make sure to upgrade your SageMaker SDK version by running pip install --upgrade sagemaker.
Set up the environment
To deploy a complete infrastructure including networking and a Studio domain, complete the following steps:
- Clone the GitHub repository.
- Log in to your AWS account and open the AWS CloudFormation console.
- To deploy the networking resources, choose Create stack.
- Upload the template under
- Provide a name for the stack (for example, networking-stack), and complete the remaining steps to create the stack.
- To deploy the Studio domain, choose Create stack again.
- Upload the template under
- Provide a name for the stack (for example, sagemaker-stack), and provide the name of the networking stack when prompted for the
- Continue with the remaining steps, select the acknowledgements for IAM resources, and create the stack.
When the status of both stacks updates to CREATE_COMPLETE, proceed to the next step.
Create the configuration file
To use the default configuration for the SageMaker Python SDK, you create a config.yaml file in the format that the SDK expects. For the format of the config.yaml file, refer to Configuration file structure. Depending on your work environment, such as Studio notebooks, SageMaker notebook instances, or your local IDE, you can either save the configuration file in the default location or override the defaults by passing a config file location. For the default locations for other environments, refer to Configuration file locations. The following steps showcase the setup for a Studio notebook environment.
To easily create the config.yaml file, run the following cells in your Studio system terminal, replacing the placeholders with the CloudFormation stack names from the previous step:
This script automatically populates the YAML file, replacing the placeholders with the infrastructure defaults, and saves the file in the home folder. Then it copies the file into the default location for Studio notebooks. The resulting config file should look similar to the following format:
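For illustration, a config.yaml following the SDK's documented schema might look similar to the following. Every ARN, subnet, security group, and bucket name below is a placeholder, and the exact sections you need depend on which APIs you use:

```yaml
SchemaVersion: '1.0'
SageMaker:
  PythonSDK:
    Modules:
      Session:
        DefaultS3Bucket: 'sagemaker-us-east-1-111122223333'
  TrainingJob:
    RoleArn: 'arn:aws:iam::111122223333:role/SageMakerExecutionRole'
    VpcConfig:
      SecurityGroupIds:
        - 'sg-0123456789abcdef0'
      Subnets:
        - 'subnet-0123456789abcdef0'
    ResourceConfig:
      VolumeKmsKeyId: 'arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab'
    OutputDataConfig:
      KmsKeyId: 'arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab'
  ProcessingJob:
    RoleArn: 'arn:aws:iam::111122223333:role/SageMakerExecutionRole'
    NetworkConfig:
      EnableNetworkIsolation: false
      VpcConfig:
        SecurityGroupIds:
          - 'sg-0123456789abcdef0'
        Subnets:
          - 'subnet-0123456789abcdef0'
```

Refer to Configuration file structure for the authoritative list of supported keys for each API.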
If you have an existing domain and networking configuration set up, create the config.yaml file in the required format and save it in the default location for Studio notebooks.
Note that these defaults merely auto-populate the configuration values for the appropriate SageMaker SDK calls, and don't restrict the user to any specific VPC, subnet, or role. As an administrator, if you want your users to use a specific configuration or role, use IAM condition keys to enforce the default values.
Additionally, each API call can have its own configurations. For example, in the preceding config file sample, you can specify subnet-a for training jobs, and specify subnet-d for processing jobs.
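As a sketch of what such a per-API split could look like in the config file (the subnet IDs are placeholders matching the example above):

```yaml
SchemaVersion: '1.0'
SageMaker:
  TrainingJob:
    VpcConfig:
      Subnets:
        - 'subnet-a'
  ProcessingJob:
    NetworkConfig:
      VpcConfig:
        Subnets:
          - 'subnet-d'
```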
Run a sample notebook
Now that you’ve got set the configuration file, you can begin working your mannequin constructing and coaching notebooks as ordinary, with out the necessity to explicitly set networking and encryption parameters, for many SDK capabilities. See Supported APIs and parameters for a whole record of supported API calls and parameters.
In Studio, choose the File Explorer icon in the navigation pane and open 03_feature_engineering/03_feature_engineering.ipynb, as shown in the following screenshot.
Run the notebook cells one by one, and notice that you are not specifying any additional configuration. When you create the processor object, you will see cell outputs like the following example.
As you can see in the output, the default configuration is automatically applied to the processing job, without needing any additional input from the user.
When you run the next cell to start the processor, you can also verify the defaults are set by viewing the job on the SageMaker console. Choose Processing jobs under Processing in the navigation pane, as shown in the following screenshot.
Choose the processing job with the prefix end-to-end-ml-sm-proc, and you should be able to see the networking and encryption already configured.
You can continue running the remaining notebooks to train and deploy the model, and you will notice that the infrastructure defaults are automatically applied for both training jobs and models.
Override the default configuration file
There could be cases where a user needs to override the default configuration, for example, to experiment with public internet access, or to update the networking configuration if the subnet runs out of IP addresses. In such cases, the Python SDK also allows you to provide a custom location for the configuration file, either on local storage, or by pointing to a location in Amazon S3. In this section, we explore an example.
Create a user-configs.yaml file in your home directory and update the EnableNetworkIsolation value to True, under the
Now, open the same notebook, and add the following cell at the beginning of the notebook:
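The cell referenced above sets the SDK's user-config override environment variable; a minimal sketch, assuming the file was saved to the home directory as described:

```python
import os

# Point the SageMaker Python SDK at a custom user-level config file instead of
# the default location. The SDK reads SAGEMAKER_USER_CONFIG_OVERRIDE when a
# session is created, so set it before constructing sessions or estimators.
os.environ["SAGEMAKER_USER_CONFIG_OVERRIDE"] = os.path.expanduser("~/user-configs.yaml")
```

Because the variable is read at session-creation time, restart the kernel or set it before any other sagemaker imports take effect.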
With this cell, you point the SDK to the location of the config file. Now, when you create the processor object, you'll notice that the default config has been overridden to enable network isolation, and the processing job will fail in network isolation mode.
You can use the same override environment variable to set the location of the configuration file if you're using a local environment such as VS Code.
Debug and retrieve defaults
For quick troubleshooting, if you run into any errors when running API calls from your notebook, the cell output displays the applied default configurations, as shown in the earlier section. To view the exact Boto3 call that was created and the attribute values passed from the default config file, you can debug by turning on Boto3 logging. To turn on logging, run the following cell at the top of the notebook:
Any subsequent Boto3 calls will be logged with the complete request, visible under the body section in the log.
You can also view the collection of default configurations using the session.sagemaker_config value, as shown in the following example.
Finally, if you're using Boto3 to create your SageMaker resources, you can retrieve the default configuration values using the sagemaker_config variable. For example, to run the processing job in 03_feature_engineering.ipynb using Boto3, you can enter the contents of the following cell in the same notebook and run the cell:
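To illustrate the idea, suppose the merged defaults returned by session.sagemaker_config look like the dict below (all ARNs and IDs are placeholders). You can read values out of it and pass them to the corresponding Boto3 request:

```python
# Hypothetical contents of session.sagemaker_config; in a real notebook you
# would obtain this from sagemaker.Session().sagemaker_config instead.
sagemaker_config = {
    "SchemaVersion": "1.0",
    "SageMaker": {
        "ProcessingJob": {
            "RoleArn": "arn:aws:iam::111122223333:role/SageMakerExecutionRole",
            "NetworkConfig": {
                "EnableNetworkIsolation": False,
                "VpcConfig": {
                    "SecurityGroupIds": ["sg-0123456789abcdef0"],
                    "Subnets": ["subnet-0123456789abcdef0"],
                },
            },
        }
    },
}

# Pull out the processing-job defaults to reuse in a Boto3 create_processing_job call.
processing_defaults = sagemaker_config["SageMaker"]["ProcessingJob"]
role_arn = processing_defaults["RoleArn"]
network_config = processing_defaults["NetworkConfig"]

print(role_arn)
print(network_config["VpcConfig"]["Subnets"])
```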
Automate config file creation
For administrators, having to create the config file and save it to each SageMaker notebook instance or Studio user profile can be a daunting task. Although you can recommend that users use a common file stored in a default S3 location, it puts the additional overhead of specifying the override on the data scientists.
To automate this, administrators can use SageMaker lifecycle configurations (LCC). For Studio user profiles or notebook instances, you can attach the following sample LCC script as a default LCC for the user's default Jupyter Server app:
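A minimal sketch of what such a script could do, assuming the admin maintains the shared config at an S3 location of their choosing (the bucket name and the user-level default path below are assumptions):

```shell
#!/bin/bash
set -eux

# Hypothetical S3 location of the admin-maintained shared config file.
CONFIG_S3_URI="s3://my-admin-bucket/sagemaker/config.yaml"

# Copy the shared config into the SDK's default user config location so the
# SDK picks it up without any action from the data scientist.
mkdir -p ~/.config/sagemaker
aws s3 cp "$CONFIG_S3_URI" ~/.config/sagemaker/config.yaml
```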
See Use Lifecycle Configurations for Amazon SageMaker Studio or Customize a Notebook Instance for instructions on creating and setting a default lifecycle script.
When you're done experimenting with this feature, clean up your resources to avoid paying additional costs. If you have provisioned new resources as specified in this post, complete the following steps to clean them up:
- Shut down your Studio apps for the user profile. See Shut Down and Update SageMaker Studio and Studio Apps for instructions. Make sure all apps are deleted before deleting the stack.
- Delete the EFS volume created for the Studio domain. You can view the EFS volume attached to the domain by using a DescribeDomain API call.
- Delete the Studio domain stack.
- Delete the security groups created for the Studio domain. You can find them on the Amazon Elastic Compute Cloud (Amazon EC2) console, with the names security-group-for-inbound-nfs-d-xxx and security-group-for-outbound-nfs-d-xxx.
- Delete the networking stack.
In this post, we discussed configuring and using default values for key infrastructure parameters using the SageMaker Python SDK. This allows administrators to set default configurations for data scientists, thereby saving time for users and admins, eliminating the burden of repetitively specifying parameters, and resulting in leaner and more manageable code. For the full list of supported parameters and APIs, see Configuring and using defaults with the SageMaker Python SDK. For any questions and discussions, join the Machine Learning & AI community.
About the Authors
Giuseppe Angelo Porcelli is a Principal Machine Learning Specialist Solutions Architect for Amazon Web Services. With several years of software engineering and ML background, he works with customers of any size to deeply understand their business and technical needs and design AI and machine learning solutions that make the best use of the AWS Cloud and the Amazon Machine Learning stack. He has worked on projects in different domains, including MLOps, computer vision, and NLP, involving a broad set of AWS services. In his free time, Giuseppe enjoys playing football.
Bruno Pistone is an AI/ML Specialist Solutions Architect for AWS based in Milan. He works with customers of any size, helping them to deeply understand their technical needs and design AI and machine learning solutions that make the best use of the AWS Cloud and the Amazon Machine Learning stack. His fields of expertise are machine learning end to end, machine learning industrialization, and MLOps. He enjoys spending time with his friends and exploring new places, as well as traveling to new destinations.
Durga Sury is an ML Solutions Architect on the Amazon SageMaker Service SA team. She is passionate about making machine learning accessible to everyone. In her 4 years at AWS, she has helped set up AI/ML platforms for enterprise customers. When she isn't working, she loves bike rides, mystery novels, and long walks with her 5-year-old husky.