R Serving with RestRserve


  • The Dockerfile defines the environment in which our server will be executed.

  • Below, you can see that the entry point for our container will be restrserve.R.
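For readers viewing this outside the repository, here is a hedged sketch of what such a Dockerfile might contain. The actual file shown by `%pycat` below may differ; the base image, package list, and model file name (`xgb.model`) are assumptions. Only the exposed port 8080 and the `restrserve.R` entry point are taken from this notebook.

```dockerfile
# Hypothetical sketch -- see the %pycat output below for the real file.
FROM r-base:4.2.0

# Install the R packages the server needs (assumed set).
RUN Rscript -e 'install.packages(c("RestRserve", "xgboost"))'

# Copy the serving script and the pretrained model (file name assumed).
COPY restrserve.R /app/restrserve.R
COPY xgb.model /app/xgb.model

# The docker run command later in this notebook maps host port 5000 to 8080.
EXPOSE 8080
ENTRYPOINT ["Rscript", "/app/restrserve.R"]
```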

[ ]:
%pycat Dockerfile

Code: restrserve.R

Script restrserve.R handles the following steps:

  • Loads the R libraries used by the server.

  • Loads a pretrained xgboost model that has been trained on the classical Iris dataset.

    • Dua, D. and Graff, C. (2019). UCI Machine Learning Repository [http://archive.ics.uci.edu/ml]. Irvine, CA: University of California, School of Information and Computer Science.

  • Defines an inference function that takes a matrix of iris features and returns predictions for those iris examples.

  • Defines two routes:

    • /ping returns the string ‘Alive’ to indicate that the application is healthy.

    • /invocations applies the previously defined inference function to the input features from the request body.

  • Launches the RestRserve serving application.
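The two routes imply a simple JSON contract between client and server. A minimal sketch of the request and response bodies, using the `"features"` and `"output"` key names that the Python client later in this notebook assumes (the example prediction values are hypothetical):

```python
import json

# The /invocations route receives a JSON body with a "features" key holding a
# list of feature rows (one row per iris example, four measurements each).
examples = [
    [5.1, 3.5, 1.4, 0.2],
    [6.7, 3.0, 5.2, 2.3],
]
request_body = json.dumps({"features": examples})

# The response carries an "output" key with one prediction per input row.
# (Values here are hypothetical -- the real ones come from the xgboost model.)
response_body = json.dumps({"output": [0, 2]})
```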

[ ]:
%pycat restrserve.R

Build the Serving Image

[ ]:
!docker build -t r-restrserve .

Launch the Serving Container

[ ]:
!echo "Launching RestRserve"
!docker run -d --rm -p 5000:8080 r-restrserve
!echo "Waiting for the server to start..." && sleep 10
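A fixed `sleep 10` can be flaky on slow machines. An alternative (not part of the original notebook) is to poll the /ping route until it answers; the `http://localhost` base URL and the requests-like `instance` argument mirror the dependency-injection style of the client functions defined below:

```python
import time


def wait_for_server(instance, port=5000, timeout=30):
    """Poll the /ping route until the server reports healthy or we time out.

    `instance` is anything with a requests-like .get() method.
    Returns True once a 200 response arrives, False on timeout.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            response = instance.get(f"http://localhost:{port}/ping")
            if response.status_code == 200:
                return True
        except Exception:
            pass  # server not up yet; keep polling
        time.sleep(1)
    return False
```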
[ ]:
!docker container list

Define Simple Python Client

[ ]:
import requests
from tqdm import tqdm
import pandas as pd

pd.set_option("display.max_rows", 500)
[ ]:
def get_predictions(examples, instance=requests, port=5000):
    payload = {"features": examples}
    return instance.post(f"http://localhost:{port}/invocations", json=payload)
[ ]:
def get_health(instance=requests, port=5000):
    return instance.get(f"http://localhost:{port}/ping")

Define Example Inputs

Let’s define example inputs from the Iris dataset.

[ ]:
column_names = ["Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width", "Label"]
iris = pd.read_csv(
    "s3://sagemaker-sample-files/datasets/tabular/iris/iris.data", names=column_names
)
[ ]:
iris_features = iris[["Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width"]]
[ ]:
example_inputs = iris_features.values.tolist()
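The `.values.tolist()` call matters here: it converts the DataFrame into plain nested Python lists, which `json` can serialize into the request payload (a DataFrame or NumPy array cannot be passed to `json.dumps` directly). A self-contained illustration with a tiny stand-in for `iris_features` (values are hypothetical, shape matches):

```python
import json

import pandas as pd

# Two hypothetical iris examples, same columns as iris_features.
features = pd.DataFrame(
    [[5.1, 3.5, 1.4, 0.2], [6.7, 3.0, 5.2, 2.3]],
    columns=["Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width"],
)

# Plain nested lists, ready for the {"features": ...} request body.
rows = features.values.tolist()
payload = json.dumps({"features": rows})
```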


[ ]:
predicted = get_predictions(example_inputs).json()["output"]
[ ]:
iris["predicted"] = predicted
[ ]:
iris
Stop All Serving Containers

Finally, we will shut down the serving container we launched for the test.

[ ]:
!docker kill $(docker ps -q)