Step-by-Step Tutorial: TensorFlow.js and Node.js Integration

Machine learning has come a long way, and today, JavaScript developers can leverage TensorFlow.js to build powerful AI-driven applications using Node.js. Whether you’re looking to classify images, perform sentiment analysis, or even build real-time AI-powered chatbots, TensorFlow.js in a Node.js environment offers an efficient and scalable way to run machine learning models.
Why Use TensorFlow.js with Node.js?
Before diving into the implementation, let’s explore why you should consider TensorFlow.js in a Node.js environment:
→ Run AI Models Server-Side — Unlike the browser build of TensorFlow.js, the Node.js bindings run on native TensorFlow, so the server can handle intensive computations efficiently.
→ Leverage GPU Acceleration — If your server has an NVIDIA GPU with CUDA support, TensorFlow.js can use it to speed up training and inference.
→ Seamless API Integration — You can integrate AI models into existing REST APIs, WebSockets, or microservices.
→ Pre-trained Models — You don’t have to train models from scratch; TensorFlow.js supports pre-trained models like MobileNet, BERT, and more.
Now, let’s get our hands dirty with some coding!
1. Setting Up the Node.js Environment
First, ensure that Node.js is installed on your machine. If not, download and install it from the official Node.js website.
To verify the installation, run:
node -v
npm -v
This should return the installed versions of Node.js and npm.
Initialize a New Node.js Project
Create a new project directory and set up a basic package.json file:
mkdir tensorflow-nodejs
cd tensorflow-nodejs
npm init -y
This will create a package.json file for managing dependencies.
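Optionally, add a start script to the generated package.json so the app can later be launched with npm start (a small optional addition, not required for the rest of the tutorial):
"scripts": {
  "start": "node index.js"
}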
2. Installing TensorFlow.js for Node.js
Next, install TensorFlow.js and required dependencies:
npm install @tensorflow/tfjs-node
If your server has a CUDA-capable NVIDIA GPU, you can install @tensorflow/tfjs-node-gpu instead for better performance:
npm install @tensorflow/tfjs-node-gpu
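To confirm that the package and its native binding installed correctly, you can run a quick smoke test (a minimal sketch; check.js is just a suggested file name):
// check.js: verify that @tensorflow/tfjs-node loads and can run a simple op
const tf = require("@tensorflow/tfjs-node");

const a = tf.tensor2d([[1, 2], [3, 4]]);
const b = tf.tensor2d([[5, 6], [7, 8]]);

// Multiply two small matrices and print the result
a.matMul(b).print();

// Should report "tensorflow" when the native backend is active
console.log("Backend:", tf.getBackend());
Run it with node check.js; you should see a 2x2 matrix and the backend name.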
3. Loading a Pre-trained Model
TensorFlow.js provides pre-trained models like MobileNet for image classification. Let’s load MobileNet and test it with an image.
3.1 Install Required Libraries
We’ll need the @tensorflow-models/mobilenet package for the pre-trained model and Jimp for image processing (node-fetch is only needed if you also want to download images from a URL, which this example doesn’t do):
npm install @tensorflow-models/mobilenet jimp
3.2 Write the Code to Load MobileNet
Create a new file index.js and write the following:
const tf = require("@tensorflow/tfjs-node");
const mobilenet = require("@tensorflow-models/mobilenet");
const Jimp = require("jimp");

async function classifyImage(imagePath) {
  // Load the MobileNet model
  const model = await mobilenet.load();

  // Read the image and re-encode it as a JPEG buffer
  const image = await Jimp.read(imagePath);
  const buffer = await image.getBufferAsync(Jimp.MIME_JPEG);

  // Decode the buffer into a 3-channel image tensor;
  // MobileNet handles resizing and normalization internally
  const tensor = tf.node.decodeImage(buffer, 3);

  // Make a prediction and free the tensor afterwards
  const predictions = await model.classify(tensor);
  tensor.dispose();

  console.log("Predictions:", predictions);
  return predictions;
}

// Run classification
classifyImage("example.jpg");
3.3 Explanation of the Code
→ We load the MobileNet model using TensorFlow.js.
→ The image is read and converted into a tensor (which TensorFlow can understand); a Jimp-free alternative is sketched below.
→ The model classifies the image and outputs predictions.
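Side note (an alternative, not part of the original approach): for local files, tf.node.decodeImage can read the image bytes straight from disk, so Jimp isn’t strictly required. A minimal Jimp-free sketch:
const tf = require("@tensorflow/tfjs-node");
const mobilenet = require("@tensorflow-models/mobilenet");
const fs = require("fs");

async function classifyImageSimple(imagePath) {
  const model = await mobilenet.load();

  // decodeImage accepts raw JPEG, PNG, GIF or BMP bytes
  const tensor = tf.node.decodeImage(fs.readFileSync(imagePath), 3);

  const predictions = await model.classify(tensor);
  tensor.dispose();
  return predictions;
}
Jimp is still handy if you need to resize, crop, or otherwise preprocess images before classification.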
4. Building an Express API for AI Inference
Now, let’s turn this into a simple AI-powered REST API using Express.js.
4.1 Install Express.js
npm install express multer
- Express.js is used to create the REST API.
- Multer handles file uploads.
4.2 Create the Express Server
Modify index.js to include an Express server (you can remove the standalone classifyImage("example.jpg") call from Step 3 so it doesn’t run on startup):
const express = require("express");
const multer = require("multer");
const fs = require("fs");
const path = require("path");

const app = express();
const upload = multer({ dest: "uploads/" });

app.post("/classify", upload.single("image"), async (req, res) => {
  if (!req.file) {
    return res.status(400).send("No image uploaded");
  }

  const imagePath = path.join(__dirname, req.file.path);

  try {
    const predictions = await classifyImage(imagePath);
    res.json({ predictions });
  } catch (err) {
    res.status(500).json({ error: "Classification failed" });
  } finally {
    // Delete the uploaded file after processing
    fs.unlinkSync(imagePath);
  }
});

app.listen(3000, () => console.log("Server running on port 3000"));
4.3 How It Works
→ The user uploads an image via a POST request to /classify.
→ The image is processed and classified using MobileNet (see the note on model loading below).
→ The API returns the predictions as a JSON response.
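A note on performance: as written, classifyImage() calls mobilenet.load() on every request, which adds noticeable latency each time. A common pattern (my suggestion, not part of the original code) is to load the model once and reuse it across requests; a minimal sketch:
// Cache the MobileNet model so it is only loaded once per process
let modelPromise = null;

function getModel() {
  if (!modelPromise) {
    modelPromise = mobilenet.load();
  }
  return modelPromise;
}

// Inside classifyImage(), replace:
//   const model = await mobilenet.load();
// with:
//   const model = await getModel();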
5. Testing the API
Now, let’s test the API using cURL or Postman.
Using cURL
curl -X POST -F "image=@example.jpg" http://localhost:3000/classify
Expected Response
{
  "predictions": [
    { "className": "golden retriever", "probability": 0.98 },
    { "className": "Labrador retriever", "probability": 0.01 }
  ]
}
(The exact classes and probabilities will depend on the image you send.)
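If you’d rather test from Node.js instead of cURL, something like the following should work on Node 18+ (a sketch using the built-in fetch, FormData, and Blob globals; test-client.js is just a suggested file name):
// test-client.js: send an image to the /classify endpoint
const fs = require("fs");

async function testClassify() {
  const imageBuffer = fs.readFileSync("example.jpg");

  const form = new FormData();
  form.append("image", new Blob([imageBuffer], { type: "image/jpeg" }), "example.jpg");

  const response = await fetch("http://localhost:3000/classify", {
    method: "POST",
    body: form,
  });

  console.log(await response.json());
}

testClassify();
Run it with node test-client.js while the server is running.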
6. Deploying the AI API
Now that we have a working AI-powered API, let’s discuss deployment strategies.
6.1 Using PM2 for Process Management
To keep the API running even after closing the terminal:
npm install -g pm2
pm2 start index.js
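If you prefer configuration over command-line flags, PM2 can also start the app from an ecosystem file (a minimal sketch; the file and app names are just examples):
// ecosystem.config.js: basic PM2 configuration
module.exports = {
  apps: [
    {
      name: "tensorflow-api",
      script: "index.js",
      instances: 1,        // each process loads its own copy of the model
      autorestart: true,
      env: {
        NODE_ENV: "production",
      },
    },
  ],
};
Start it with pm2 start ecosystem.config.js.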
6.2 Deploying on a Cloud Server
For a production-ready API, you can deploy it on:
→ A VPS (e.g., DigitalOcean, AWS EC2, Linode)
→ A serverless platform (e.g., AWS Lambda with Express)
→ Docker for containerized deployment
Example Dockerfile:
FROM node:18
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["node", "index.js"]
Build and run:
docker build -t tensorflow-api .
docker run -p 3000:3000 tensorflow-api
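One extra detail (my suggestion, not part of the original setup): add a .dockerignore file so that your local node_modules and uploaded files aren’t copied into the image; the @tensorflow/tfjs-node native binary should be installed inside the container by npm install, not copied from the host:
node_modules
uploads
npm-debug.log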
Conclusion
You have successfully built an AI-powered image classification API using TensorFlow.js and Node.js, covering model integration, API creation, and deployment. Next, explore other models, train custom machine learning models, or extend this API for real-time AI applications.
You may also like:
- 10 Common Mistakes with Synchronous Code in Node.js
- Why 85% of Developers Use Express.js Wrongly
- Implementing Zero-Downtime Deployments in Node.js
- 10 Common Memory Management Mistakes in Node.js
- 5 Key Differences Between ^ and ~ in package.json
- Scaling Node.js for Robust Multi-Tenant Architectures
- 6 Common Mistakes in Domain-Driven Design (DDD) with Express.js
- 10 Performance Enhancements in Node.js Using V8
- Can Node.js Handle Millions of Users?
- Express.js Secrets That Senior Developers Don’t Share
Share your experiences in the comments, and let’s discuss!
Follow me on LinkedIn