

Image by Author | ChatGPT
Machine learning has powerful applications across many domains, but effectively deploying machine learning models in real-world scenarios usually requires a web framework.
Django, a high-level web framework for Python, is particularly popular for building scalable and secure web applications. Paired with libraries like scikit-learn, Django lets developers serve machine learning model inference through APIs and also build intuitive web interfaces for users to interact with those models.
In this tutorial, you will learn how to build a simple Django application that serves predictions from a machine learning model. This step-by-step guide walks you through the entire process, from initial model training to inference and testing the API.
# 1. Project Setup
We will start by creating the base project structure and installing the required dependencies.
Create a new project directory and move into it:
mkdir django-ml-app && cd django-ml-app
Install the required Python packages:
pip install Django scikit-learn joblib
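Optionally, since the project layout shown below includes a requirements.txt, you can record the same dependencies there so the environment is easy to recreate (versions are left unpinned here):
# requirements.txt
Django
scikit-learn
joblib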
Initialize a new Django project called mlapp and create a new app named predictor:
django-admin startproject mlapp .
python manage.py startapp predictor
Set up the template directory for our app's HTML files:
mkdir -p templates/predictor
After running the above commands, your project folder should look like this:
django-ml-app/
├─ .venv/
├─ mlapp/
│  ├─ __init__.py
│  ├─ asgi.py
│  ├─ settings.py
│  ├─ urls.py
│  └─ wsgi.py
├─ predictor/
│  ├─ migrations/
│  ├─ __init__.py
│  ├─ apps.py
│  ├─ forms.py       <-- we will add this later
│  ├─ services.py    <-- we will add this later (model load/predict)
│  ├─ views.py       <-- we will update
│  ├─ urls.py        <-- we will add this later
│  └─ tests.py       <-- we will add this later
├─ templates/
│  └─ predictor/
│     └─ predict_form.html
├─ manage.py
├─ requirements.txt
└─ train.py          <-- machine learning training script
# 2. Train the Machine Learning Model
Next, we will create the model that our Django app will use for predictions. For this tutorial, we will work with the classic Iris dataset, which is included in scikit-learn.
In the root directory of the project, create a script named train.py. This script loads the Iris dataset and splits it into training and testing sets. It then trains a Random Forest classifier on the training data. After training is complete, it saves the trained model along with its metadata (feature names and target labels) into the predictor/model/ directory using joblib.
# train.py
from pathlib import Path

import joblib
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

MODEL_DIR = Path("predictor") / "model"
MODEL_DIR.mkdir(parents=True, exist_ok=True)
MODEL_PATH = MODEL_DIR / "iris_rf.joblib"


def main():
    data = load_iris()
    X, y = data.data, data.target
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42, stratify=y
    )

    clf = RandomForestClassifier(n_estimators=200, random_state=42)
    clf.fit(X_train, y_train)

    joblib.dump(
        {
            "estimator": clf,
            "target_names": data.target_names,
            "feature_names": data.feature_names,
        },
        MODEL_PATH,
    )
    print(f"Saved model to {MODEL_PATH.resolve()}")


if __name__ == "__main__":
    main()
Run the training script:
python train.py
If everything runs successfully, you should see a message confirming that the model has been saved.
# 3. Configure Django Settings
Now that we have our app and training script ready, we need to configure Django so it knows about our new application and where to find templates.
Open mlapp/settings.py and make the following updates:
- Register the predictor app in INSTALLED_APPS. This tells Django to include our custom app in the project lifecycle (models, views, forms, etc.).
- Add the templates/ directory to the TEMPLATES configuration. This ensures Django can load HTML templates that are not tied directly to a specific app, like the form page we will build later.
- Set ALLOWED_HOSTS to accept all hosts during development. This makes it easier to run the project locally without host-related errors.
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "predictor",  # <-- add
]

TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        "DIRS": [BASE_DIR / "templates"],  # <-- add
        "APP_DIRS": True,
        "OPTIONS": {
            "context_processors": [
                "django.template.context_processors.debug",
                "django.template.context_processors.request",
                "django.contrib.auth.context_processors.auth",
                "django.contrib.messages.context_processors.messages",
            ],
        },
    },
]

# For dev
ALLOWED_HOSTS = ["*"]
# 4. Add URLs
With our app registered, the next step is to wire up the URL routing so users can reach our pages and API endpoints. Django routes incoming HTTP requests through urls.py files.
We will configure two sets of routes:
- Project-level URLs (mlapp/urls.py) – include global routes like the admin panel and pull in the routes from the predictor app.
- App-level URLs (predictor/urls.py) – define the specific routes for our web form and API.
Open mlapp/urls.py and update it as follows:
# mlapp/urls.py
from django.contrib import admin
from django.urls import path, include

urlpatterns = [
    path("admin/", admin.site.urls),
    path("", include("predictor.urls")),  # web & API routes
]
Now create a new file predictor/urls.py and define the app-specific routes:
# predictor/urls.py
from django.urls import path

from .views import home, predict_view, predict_api

urlpatterns = [
    path("", home, name="home"),
    path("predict/", predict_view, name="predict"),
    path("api/predict/", predict_api, name="predict_api"),
]
# 5. Build the Form
To let users interact with our model through a web interface, we need an input form where they can enter flower measurements (sepal and petal dimensions). Django makes this easy with its built-in forms module.
We will create a simple form class to capture the four numeric inputs required by the Iris classifier.
In your predictor/ app, create a new file called forms.py and add the following code:
# predictor/forms.py
from django import forms


class IrisForm(forms.Form):
    sepal_length = forms.FloatField(min_value=0, label="Sepal length (cm)")
    sepal_width = forms.FloatField(min_value=0, label="Sepal width (cm)")
    petal_length = forms.FloatField(min_value=0, label="Petal length (cm)")
    petal_width = forms.FloatField(min_value=0, label="Petal width (cm)")
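Before using the form in a view, you can confirm how it validates and coerces input from the Django shell (python manage.py shell). This quick check is only illustrative and uses arbitrary sample values:
from predictor.forms import IrisForm

# Form data arrives as strings (as it would from an HTML POST); FloatField coerces them to floats.
form = IrisForm({
    "sepal_length": "5.1",
    "sepal_width": "3.5",
    "petal_length": "1.4",
    "petal_width": "0.2",
})
print(form.is_valid())    # True
print(form.cleaned_data)  # {'sepal_length': 5.1, 'sepal_width': 3.5, ...}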
# 6. Load the Model and Predict
Now that we have trained and saved our Iris classifier, we need a way for the Django app to load the model and use it for predictions. To keep things organized, we will place all prediction-related logic inside a dedicated services.py file in the predictor app.
This keeps the views clean and focused on request/response handling, while the prediction logic lives in a reusable service module.
In predictor/services.py, add the following code:
# predictor/services.py
from __future__ import annotations

from pathlib import Path
from typing import Dict, Any

import joblib
import numpy as np

_MODEL_CACHE: Dict[str, Any] = {}


def get_model_bundle():
    """
    Loads and caches the trained model bundle:
    {
        "estimator": RandomForestClassifier,
        "target_names": ndarray[str],
        "feature_names": list[str],
    }
    """
    global _MODEL_CACHE
    if "bundle" not in _MODEL_CACHE:
        model_path = Path(__file__).resolve().parent / "model" / "iris_rf.joblib"
        _MODEL_CACHE["bundle"] = joblib.load(model_path)
    return _MODEL_CACHE["bundle"]


def predict_iris(features):
    """
    features: list[float] of length 4 (sepal_length, sepal_width, petal_length, petal_width)
    Returns a dict with the class name and probabilities.
    """
    bundle = get_model_bundle()
    clf = bundle["estimator"]
    target_names = bundle["target_names"]

    X = np.array([features], dtype=float)
    proba = clf.predict_proba(X)[0]
    idx = int(np.argmax(proba))

    return {
        "class_index": idx,
        "class_name": str(target_names[idx]),
        "probabilities": {str(name): float(p) for name, p in zip(target_names, proba)},
    }
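Before wiring this into the views, you can sanity-check the service from the Django shell (python manage.py shell), assuming train.py has already produced predictor/model/iris_rf.joblib. The measurements below are arbitrary sample values:
from predictor.services import predict_iris

# Feature order: sepal_length, sepal_width, petal_length, petal_width (in cm)
result = predict_iris([5.1, 3.5, 1.4, 0.2])
print(result["class_name"])      # expected: "setosa" for these measurements
print(result["probabilities"])   # per-class probabilities summing to 1.0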
# 7. Views
The views act as the glue between user input, the model, and the final response (HTML or JSON). In this step, we will build three views:
- home – Renders the prediction form.
- predict_view – Handles form submissions from the web interface.
- predict_api – Provides a JSON API endpoint for programmatic predictions.
In predictor/views.py, add the following code:
# predictor/views.py
from django.http import JsonResponse
from django.shortcuts import render
from django.views.decorators.http import require_http_methods
from django.views.decorators.csrf import csrf_exempt  # <-- add

from .forms import IrisForm
from .services import predict_iris
import json


def home(request):
    return render(request, "predictor/predict_form.html", {"form": IrisForm()})


@require_http_methods(["POST"])
def predict_view(request):
    form = IrisForm(request.POST)
    if not form.is_valid():
        return render(request, "predictor/predict_form.html", {"form": form})

    data = form.cleaned_data
    features = [
        data["sepal_length"],
        data["sepal_width"],
        data["petal_length"],
        data["petal_width"],
    ]
    result = predict_iris(features)
    return render(
        request,
        "predictor/predict_form.html",
        {"form": IrisForm(), "result": result, "submitted": True},
    )


@csrf_exempt  # <-- add this line
@require_http_methods(["POST"])
def predict_api(request):
    # Accept JSON only (optional but recommended)
    if request.META.get("CONTENT_TYPE", "").startswith("application/json"):
        try:
            payload = json.loads(request.body or "{}")
        except json.JSONDecodeError:
            return JsonResponse({"error": "Invalid JSON."}, status=400)
    else:
        # Fall back to form-encoded data if you want to keep supporting it:
        payload = request.POST.dict()

    required = ["sepal_length", "sepal_width", "petal_length", "petal_width"]
    missing = [k for k in required if k not in payload]
    if missing:
        return JsonResponse({"error": f"Missing: {', '.join(missing)}"}, status=400)

    try:
        features = [float(payload[k]) for k in required]
    except ValueError:
        return JsonResponse({"error": "All features must be numeric."}, status=400)

    return JsonResponse(predict_iris(features))
# 8. Template
Finally, we will create the HTML template that serves as the user interface for our Iris predictor.
This template will:
- Render the Django form fields we defined earlier.
- Provide a simple layout for the form inputs.
- Display prediction results when available.
- Point out the API endpoint for developers who prefer programmatic access.
In templates/predictor/predict_form.html, add the following markup:
<!-- templates/predictor/predict_form.html -->
<h1>Iris Predictor</h1>
<p>Enter Iris flower measurements to get a prediction.</p>
<form method="post" action="{% url 'predict' %}">
  {% csrf_token %}
  {{ form.as_p }}
  <button type="submit">Predict</button>
</form>
{% if submitted and result %}
  <p>Predicted class: <strong>{{ result.class_name }}</strong></p>
  <p>Probabilities:</p>
  <ul>
    {% for name, p in result.probabilities.items %}
      <li>{{ name }}: {{ p|floatformat:3 }}</li>
    {% endfor %}
  </ul>
{% endif %}
<p>API available at <code>POST /api/predict/</code></p>
# 9. Run the Application
With everything in place, it is time to run the Django project and test both the web form and the API endpoint.
Run the following command to set up the default Django database (for the admin, sessions, etc.):
python manage.py migrate
Launch the Django development server:
python manage.py runserver
If everything is set up correctly, you will see output similar to this:
Watching for file changes with StatReloader
Performing system checks...

System check identified no issues (0 silenced).
September 09, 2025 - 02:01:27
Django version 5.2.6, using settings 'mlapp.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CTRL-BREAK.
Open your browser and go to http://127.0.0.1:8000/ to use the web form interface.




You can also send a POST request to the API using curl:
curl -X POST http://127.0.0.1:8000/api/predict/ \
  -H "Content-Type: application/json" \
  -d '{"sepal_length":5.1,"sepal_width":3.5,"petal_length":1.4,"petal_width":0.2}'
Expected response:
{
  "class_index": 0,
  "class_name": "setosa",
  "probabilities": {
    "setosa": 1.0,
    "versicolor": 0.0,
    "virginica": 0.0
  }
}
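If you prefer calling the API from Python rather than curl, a small script using the requests library works as well. Note that requests is not one of the packages installed earlier, so you would need to pip install requests first:
import requests

payload = {
    "sepal_length": 5.1,
    "sepal_width": 3.5,
    "petal_length": 1.4,
    "petal_width": 0.2,
}
# Send the measurements as JSON to the local development server
resp = requests.post("http://127.0.0.1:8000/api/predict/", json=payload, timeout=10)
print(resp.status_code)  # 200
print(resp.json())       # same JSON structure as the curl response above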
# 10. Testing
Before wrapping up, it is good practice to verify that the application works as expected. Django provides a built-in testing framework that integrates with Python's unittest module.
We will create a couple of simple tests to make sure that:
- The homepage renders correctly and includes the title.
- The API endpoint returns a valid prediction response.
In predictor/tests.py, add the following code:
# predictor/tests.py
from django.test import TestCase
from django.urls import reverse


class PredictorTests(TestCase):
    def test_home_renders(self):
        resp = self.client.get(reverse("home"))
        self.assertEqual(resp.status_code, 200)
        self.assertContains(resp, "Iris Predictor")

    def test_api_predict(self):
        url = reverse("predict_api")
        payload = {
            "sepal_length": 5.0,
            "sepal_width": 3.6,
            "petal_length": 1.4,
            "petal_width": 0.2,
        }
        resp = self.client.post(url, payload)
        self.assertEqual(resp.status_code, 200)
        data = resp.json()
        self.assertIn("class_name", data)
        self.assertIn("probabilities", data)
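As an optional extra (not part of the original test suite), you could also cover the API's validation branch, which returns HTTP 400 when required fields are missing. A method like this could be added inside PredictorTests:
    def test_api_predict_missing_fields(self):
        url = reverse("predict_api")
        # Only one of the four required measurements is provided
        resp = self.client.post(url, {"sepal_length": 5.0})
        self.assertEqual(resp.status_code, 400)
        self.assertIn("error", resp.json())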
Run the tests from your terminal:
python manage.py test
You should see output similar to this:
Found 2 test(s).
Creating test database for alias 'default'...
System check identified no issues (0 silenced).
..
----------------------------------------------------------------------
Ran 2 tests in 0.758s

OK
Destroying test database for alias 'default'...
With these tests passing, you can be confident your Django + machine learning app is functioning correctly end-to-end.
# Summary
You have successfully created a complete machine learning application using the Django framework, bringing all the pieces together into a functional system.
Starting with training and saving a model, you integrated it into a Django service for making predictions. You also built a clean web form for user input and exposed a JSON API for programmatic access. Additionally, you implemented automated tests to ensure the application runs reliably.
While this project focused on the Iris dataset, the same structure can be extended to more complex models, larger datasets, and even production-ready APIs, making it a solid foundation for real-world machine learning applications.
Abid Ali Awan (@1abidaliawan) is a certified data scientist professional who loves building machine learning models. Currently, he is focusing on content creation and writing technical blogs on machine learning and data science technologies. Abid holds a Master's degree in technology management and a bachelor's degree in telecommunication engineering. His vision is to build an AI product using a graph neural network for students struggling with mental illness.