R Serving with Plumber
Dockerfile
The Dockerfile defines the environment in which our server will be executed.
Below, you can see that the entrypoint for our container will be deploy.R
[ ]:
%pycat Dockerfile
Code: deploy.R
The deploy.R script handles the following steps:

* Loads the R libraries used by the server.
* Loads a pretrained xgboost model that has been trained on the classic Iris dataset.
  * Dua, D. and Graff, C. (2019). UCI Machine Learning Repository [http://archive.ics.uci.edu/ml]. Irvine, CA: University of California, School of Information and Computer Science.
* Defines an inference function that takes a matrix of iris features and returns predictions for those iris examples.
* Finally, imports the endpoints.R script and launches the Plumber server app using those endpoint definitions.
[ ]:
%pycat deploy.R
Code: endpoints.R
endpoints.R defines two routes:

* /ping returns the string 'Alive' to indicate that the application is healthy.
* /invocations applies the previously defined inference function to the input features from the request body.
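As a rough sketch of the request/response contract between the client and these routes (the "features" and "output" field names come from the Python client code later in this notebook; the numeric values and predicted labels here are made up for illustration):

```python
import json

# Hypothetical request body for the /invocations route: a JSON object
# whose "features" key holds one list of feature values per iris example.
payload = {"features": [[5.1, 3.5, 1.4, 0.2], [6.3, 2.9, 5.6, 1.8]]}
body = json.dumps(payload)

# The server's JSON reply is assumed to carry the predictions under an
# "output" key, matching how the client code below unpacks it.
reply = json.loads('{"output": ["setosa", "virginica"]}')
predictions = reply["output"]
```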
For more information about the requirements for building your own inference container, see: Use Your Own Inference Code with Hosting Services
[ ]:
%pycat endpoints.R
Build the Serving Image
[ ]:
!docker build -t r-plumber .
Launch the Serving Container
[ ]:
!echo "Launching Plumber"
!docker run -d --rm -p 5000:8080 r-plumber
!echo "Waiting for the server to start..." && sleep 10
[ ]:
!docker container list
Define Simple Python Client
[ ]:
import requests
from tqdm import tqdm
import pandas as pd
pd.set_option("display.max_rows", 500)
[ ]:
def get_predictions(examples, instance=requests, port=5000):
    payload = {"features": examples}
    return instance.post(f"http://127.0.0.1:{port}/invocations", json=payload)
[ ]:
def get_health(instance=requests, port=5000):
    return instance.get(f"http://127.0.0.1:{port}/ping")
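A fixed `sleep 10` leaves a race between container startup and the first request. As a sketch of a more robust alternative (the helper name and signature are my own, not part of this notebook), a small polling loop can wait on the /ping route instead:

```python
import time

def wait_for_server(ping, retries=10, delay=1.0):
    """Poll a health-check callable until it reports HTTP 200.

    `ping` is any zero-argument callable returning an object with a
    `status_code` attribute, e.g. a wrapper around the health check
    against the /ping route.
    """
    for _ in range(retries):
        try:
            if ping().status_code == 200:
                return True
        except Exception:
            # Server not accepting connections yet; retry after a pause.
            pass
        time.sleep(delay)
    return False
```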
Define Example Inputs
Let’s define example inputs from the Iris dataset.
[ ]:
column_names = ["Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width", "Label"]
iris = pd.read_csv(
    "s3://sagemaker-sample-files/datasets/tabular/iris/iris.data", names=column_names
)
[ ]:
iris_features = iris[["Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width"]]
[ ]:
example_inputs = iris_features.values.tolist()
Get Predictions from the Plumber Server
[ ]:
predicted = get_predictions(example_inputs).json()["output"]
[ ]:
iris["predicted"] = predicted
[ ]:
iris
Stop All Serving Containers
Finally, we will shut down the serving container we launched for the test.
[ ]:
!docker kill $(docker ps -q)