Deploying a Machine Learning Model with Flask, Docker, and Azure App Service

This article shows how to deploy a machine learning model as a web API using:

  • Flask (to serve the model)
  • Docker (to containerize the application)
  • Azure App Service (to host it on the cloud)

You will go from training a model to consuming it live via a curl/Postman request – all in under an hour.

Prerequisites:

  • Knowledge of Python – writing a simple program and running it from the command prompt or bash.
  • Docker – familiarity with basic Docker commands, with Docker installed locally.
  • Azure – a free or paid subscription and basic knowledge of resource groups, resources, Azure Container Registry, and App Service.
  • Azure CLI installed.
  • A bash prompt (the Docker and Azure steps use bash commands).

Overall Project Structure:

breast_cancer_model_deploy/
├── Dockerfile
├── requirements.txt
├── train_model.py
├── serve_model.py
├── breast_cancer_model.pkl  ← (generated after running train_model.py)
└── README.md                 ← (optional, document instructions)

Step 1: Train and Save the Model (Locally)

In this article, we use scikit-learn’s Breast Cancer dataset to train a logistic regression classifier.

# train_model.py

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
import pickle

# Load dataset
data = load_breast_cancer() 
X_train, X_test, y_train, y_test = train_test_split(data.data, data.target, random_state=42)

# Train model
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Save model to a .pkl file
with open('breast_cancer_model.pkl', 'wb') as f:
    pickle.dump(model, f)

print("Model saved to breast_cancer_model.pkl")

# To run this script, use the command:
# python train_model.py

Once you run train_model.py, you should see the breast_cancer_model.pkl file created in the project root folder.

breast_cancer_model.pkl – this is the trained model that the API code will load.
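
If you want a quick sanity check that the saved model loads and predicts correctly before wiring it into the API, a short script along these lines works (quick_check.py is a hypothetical helper, not part of the project structure above):

# quick_check.py (hypothetical) – verify the pickled model loads and predicts

from sklearn.datasets import load_breast_cancer
import pickle

# Load the trained model back from disk
with open('breast_cancer_model.pkl', 'rb') as f:
    model = pickle.load(f)

# Predict on the first record of the dataset (shape: 1 x 30 features)
sample = load_breast_cancer().data[:1]
print(model.predict(sample))  # prints [0] (malignant) or [1] (benign)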

Step 2: Serve the Model Using Flask

We will write a lightweight Flask API that exposes the trained model breast_cancer_model.pkl.

# serve_model.py

from flask import Flask, request, jsonify
import pickle
import numpy as np

app = Flask(__name__)

# Load model when app starts
with open('breast_cancer_model.pkl', 'rb') as f:
    model = pickle.load(f)

# Define the prediction endpoint
@app.route('/predict', methods=['POST'])
def predict():
    data = request.get_json()
    features = np.array(data['input']).reshape(1, -1)
    prediction = model.predict(features)
    return jsonify({'prediction': int(prediction[0])})  # 0 = malignant, 1 = benign

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=80)

# To run this script, use the command:
# python serve_model.py

Once serve_model.py is running, the API listens on port 80. The POST /predict endpoint accepts a JSON payload and returns the predicted class.
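
If you prefer testing from Python instead of curl or Postman, a minimal sketch like the following sends one prediction request to the running server (test_request.py is hypothetical and assumes the requests package is installed):

# test_request.py (hypothetical) – call the local /predict endpoint from Python
# Assumes the `requests` package is installed and serve_model.py is running on port 80.

import requests
from sklearn.datasets import load_breast_cancer

sample = load_breast_cancer().data[0].tolist()  # 30 feature values as a plain list
response = requests.post('http://localhost/predict', json={'input': sample})
print(response.json())  # e.g. {'prediction': 0}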

Step 3: Define the dependencies

Create a requirements.txt:

flask==2.2.5
werkzeug==2.2.3
numpy==1.21.2
scikit-learn==0.24.2

This ensures exact versions are installed in Docker for compatibility.
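
If you want to confirm these pinned versions resolve correctly before building the image, one option is to install them into a local virtual environment first (an optional step, not required for the Docker build):

# Optional: verify the pinned dependencies install cleanly in a local virtual environment
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt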

Step 4: Dockerize the Application

Create a Dockerfile:

# Dockerfile

# Base Python image
FROM python:3.8-slim

# Set working directory inside the container
WORKDIR /app

# Install dependencies
COPY requirements.txt requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Copy source code
COPY . /app

# Expose port used by Flask
EXPOSE 80

# Start the Flask app
CMD ["python", "serve_model.py"]

This produces a clean, minimal Docker image containing your API.

Step 5: Test Locally with Docker

docker build -t breast_cancer_model_image .
docker run -d -p 80:80 breast_cancer_model_image
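
If the container does not respond, you can confirm it is actually running and inspect its startup logs with standard Docker commands (the container ID placeholder below is whatever docker ps reports):

docker ps                     # confirm the container is running and port 80 is mapped
docker logs <container-id>    # inspect the Flask startup output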

Send a test prediction:

curl -X POST http://localhost/predict \
  -H "Content-Type: application/json" \
  -d '{"input": [13.54,14.36,87.46,566.3,0.09779,0.08129,0.06664,0.04781,0.1885,0.05766,0.2699,0.7886,2.058,23.56,0.008462,0.0146,0.02387,0.01315,0.0198,0.0023,14.91,19.31,96.53,678.1,0.1312,0.2776,0.3514,0.152,0.2842,0.07569]}'

Expected output:

{"prediction": 1}

Prediction 1 means benign.

curl -X POST http://localhost/predict \
  -H "Content-Type: application/json" \
  -d '{"input": [17.99,10.38,122.8,1001.0,0.1184,0.2776,0.3001,0.1471,0.2419,0.07871,1.095,0.9053,8.589,153.4,0.006399,0.04904,0.05373,0.01587,0.03003,0.006193,25.38,17.33,184.6,2019.0,0.1622,0.6656,0.7119,0.2654,0.4601,0.1189]}'

Expected output:

{"prediction": 0}

Prediction 0 means malignant.

Step 6: Deploy to Azure App Service

Create Resources

az login
az group create --name bc-rg --location eastus
az acr create --resource-group bc-rg --name bcacr12345 --sku Basic --admin-enabled true

Push Image to ACR

az acr login --name bcacr12345
docker tag breast_cancer_model_image bcacr12345.azurecr.io/breast_cancer_model_image:v1
docker push bcacr12345.azurecr.io/breast_cancer_model_image:v1
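
As an optional check, you can confirm the image and tag landed in the registry with standard Azure CLI commands (using the same registry name as above):

az acr repository list --name bcacr12345 --output table
az acr repository show-tags --name bcacr12345 --repository breast_cancer_model_image --output table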


Deploy with Azure App Service

az appservice plan create --name bc-plan --resource-group bc-rg --is-linux --sku B1

az webapp create --resource-group bc-rg \
  --plan bc-plan --name bc-api-app \
  --deployment-container-image-name bcacr12345.azurecr.io/breast_cancer_model_image:v1

Configure Registry Credentials (optional if admin enabled)

az webapp config container set \
  --name bc-api-app \
  --resource-group bc-rg \
  --docker-custom-image-name bcacr12345.azurecr.io/breast_cancer_model_image:v1 \
  --docker-registry-server-url https://bcacr12345.azurecr.io
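
If the App Service cannot pull the image with admin credentials alone, you can pass the registry username and password explicitly. The sketch below fetches them with az acr credential show; adjust names to match your registry:

# Optional: supply ACR credentials explicitly to the web app
ACR_USER=$(az acr credential show --name bcacr12345 --query username -o tsv)
ACR_PASS=$(az acr credential show --name bcacr12345 --query "passwords[0].value" -o tsv)

az webapp config container set \
  --name bc-api-app \
  --resource-group bc-rg \
  --docker-custom-image-name bcacr12345.azurecr.io/breast_cancer_model_image:v1 \
  --docker-registry-server-url https://bcacr12345.azurecr.io \
  --docker-registry-server-user $ACR_USER \
  --docker-registry-server-password $ACR_PASS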

Your ML API is Live!

curl -X POST https://<your-app>.azurewebsites.net/predict \
  -H "Content-Type: application/json" \
  -d '{"input": [...]}'

Final Thoughts

This article walks through a complete ML model deployment pipeline:

  • From training to live API
  • Containerized and cloud-hosted
  • Easily replicable for other models and use cases.

What more can you do?

You can use any dataset instead of the one used in this article – here’s how (a minimal training sketch follows this list):

  • Choose a dataset (e.g. from sklearn, CSV or a database).
  • Train any classifier or regressor.
  • Save the model with pickle or joblib.
  • Update serve_model.py with correct input shape and logic.
  • Rebuild the Docker image.
  • Expose a /predict endpoint for the new use case.
  • Deploy to Azure App Service.
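
For example, here is a minimal sketch (a hypothetical train_wine_model.py) that adapts the training script to the wine dataset and saves the model with joblib; serve_model.py would then load wine_model.pkl and expect 13 input features instead of 30:

# train_wine_model.py (hypothetical) – adapt the pipeline to another sklearn dataset

from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
import joblib

# Load a different dataset (13 features, 3 classes)
data = load_wine()
X_train, X_test, y_train, y_test = train_test_split(data.data, data.target, random_state=42)

# Scale features, then fit a logistic regression classifier
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Save with joblib instead of pickle
joblib.dump(model, 'wine_model.pkl')
print("Model saved to wine_model.pkl")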

Example use cases you could try:

Use Case                    | Dataset Example (in sklearn or public)              | Target
Predict diabetes            | sklearn.datasets.load_diabetes()                    | Continuous value (regression)
Predict housing price       | sklearn.datasets.fetch_california_housing()         | Median house price
Predict wine quality/class  | sklearn.datasets.load_wine()                        | Wine class (0, 1, 2)
Predict customer churn      | Custom CSV with features like tenure, charges, etc. | Churn (yes/no)
Predict handwritten digits  | sklearn.datasets.load_digits()                      | Digit class (0–9)
