The Crop Recommendation System is an intelligent web application designed to help farmers and agricultural experts determine the most suitable crops based on environmental and soil conditions. This system leverages machine learning algorithms to analyze various parameters and provide personalized crop recommendations.
Developed as part of the Bachelor of Computer Technology program at Meru University of Science and Technology, this project demonstrates the practical application of modern web technologies and machine learning in solving real-world agricultural challenges.
The system takes seven key environmental parameters as input and returns a list of recommended crops based on machine learning predictions. The model was trained on a comprehensive dataset containing various crop-environment relationships.
| Parameter | Description | Range | Optimal Range |
|---|---|---|---|
| Nitrogen (N) | Nitrogen content in soil | 0-250 ppm | 20-150 ppm |
| Phosphorus (P) | Phosphorus content in soil | 0-200 ppm | 20-100 ppm |
| Potassium (K) | Potassium content in soil | 0-400 ppm | 20-250 ppm |
| Temperature | Ambient temperature | 5-44°C | 15-35°C |
| Humidity | Relative humidity | 0-100% | 40-80% |
| pH Value | Soil acidity/alkalinity | 3.4-9.0 | 6.0-7.5 |
| Rainfall | Annual precipitation | 150-2500 mm | 150-1500 mm |
The Crop Recommendation System follows a client-server architecture with two main components: a static frontend (vanilla JavaScript, HTML5, and CSS3 with Bootstrap 5) that collects the seven environmental parameters, and a FastAPI backend that loads the trained PyTorch model (`crop.pth`) and label binarizer (`mlb.pkl`) and serves predictions over a REST endpoint.
The system uses a Multi-Layer Perceptron (MLP) neural network: the seven input features feed two fully connected hidden layers of 128 and 256 units with ReLU activations, followed by an output layer with one sigmoid unit per crop class (see the `CropMLP` definition in the backend code below).
The Crop Recommendation System provides a RESTful API for crop prediction. The API accepts POST requests with environmental parameters and returns recommended crops.
`POST /predict`: Predict suitable crops based on environmental conditions.
{
"N": 60.0, // Nitrogen content in ppm (0-250)
"P": 40.0, // Phosphorus content in ppm (0-200)
"K": 70.0, // Potassium content in ppm (0-400)
"temperature": 26.0, // Temperature in Celsius (5-44)
"humidity": 75.0, // Relative humidity percentage (0-100)
"ph": 6.3, // Soil pH value (3.4-9.0)
"rainfall": 180.0, // Annual rainfall in mm (150-2500)
"top_n": 5 // Number of top crops to return (optional, default: 5)
}
| Parameter | Type | Required | Description |
|---|---|---|---|
| N | float | Yes | Nitrogen content (0-250 ppm) |
| P | float | Yes | Phosphorus content (0-200 ppm) |
| K | float | Yes | Potassium content (0-400 ppm) |
| temperature | float | Yes | Temperature in Celsius (5-44°C) |
| humidity | float | Yes | Relative humidity (0-100%) |
| ph | float | Yes | Soil pH value (3.4-9.0) |
| rainfall | float | Yes | Annual rainfall in mm (150-2500) |
| top_n | int | No | Number of crops to return (default: 5) |
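The documented ranges above are informational. As a hedged sketch (the deployed service may not do this), the same bounds could be enforced at the API layer with Pydantic `Field` constraints, so out-of-range values are rejected with a 422 validation error:

# Hypothetical variant of the Environment model with range enforcement --
# an assumption for illustration, not the deployed schema
from pydantic import BaseModel, Field

class Environment(BaseModel):
    N: float = Field(..., ge=0, le=250)            # Nitrogen, ppm
    P: float = Field(..., ge=0, le=200)            # Phosphorus, ppm
    K: float = Field(..., ge=0, le=400)            # Potassium, ppm
    temperature: float = Field(..., ge=5, le=44)   # Celsius
    humidity: float = Field(..., ge=0, le=100)     # Percent
    ph: float = Field(..., ge=3.4, le=9.0)         # Soil pH
    rainfall: float = Field(..., ge=150, le=2500)  # mm per year
    top_n: int = Field(5, ge=1)                    # Optional, defaults to 5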
{
"environment": [60.0, 40.0, 70.0, 26.0, 75.0, 6.3, 180.0], // Echo of input parameters
"predicted_crops": [ // Array of recommended crop names
"ocotillo_fouquieria",
"pepper_bell",
"okra_clemson",
"cilantro_santo",
"tomato_cherokee"
]
}
// Make API request to get crop recommendations
const response = await fetch("https://cropie-sys.onrender.com/predict", {
method: "POST", // HTTP method
headers: { "Content-Type": "application/json" }, // Request headers
body: JSON.stringify({ // Request body with environmental data
N: 60,
P: 40,
K: 70,
temperature: 26,
humidity: 75,
ph: 6.3,
rainfall: 180,
top_n: 5
})
});
// Parse the JSON response
const result = await response.json();
// Access the predicted crops array
console.log(result.predicted_crops);
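The same request can also be scripted server-side or in a notebook; a minimal Python sketch using the `requests` library (assumed installed separately):

# Python equivalent of the fetch() call above
import requests

payload = {
    "N": 60, "P": 40, "K": 70,
    "temperature": 26, "humidity": 75,
    "ph": 6.3, "rainfall": 180,
    "top_n": 5,
}
response = requests.post("https://cropie-sys.onrender.com/predict", json=payload, timeout=30)
response.raise_for_status()                 # Raise on HTTP errors
print(response.json()["predicted_crops"])   # List of recommended crop names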
The frontend is built with vanilla JavaScript, HTML5, and CSS3 with Bootstrap 5 for responsive design.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>🌱 Crop Recommendation System</title>
<!-- Bootstrap CSS for responsive design -->
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet" />
<!-- Font Awesome for icons -->
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.0.0/css/all.min.css">
<!-- Custom CSS for styling -->
<style>
/* Custom styles for the application */
</style>
</head>
<body>
<!-- Navigation Bar -->
<nav class="navbar navbar-expand-lg navbar-light">
<!-- Brand and navigation links -->
</nav>
<!-- Main Content Container -->
<div class="container">
<div class="card">
<div class="card-header">
<h2>🌱 Crop Recommendation System</h2>
</div>
<div class="card-body">
<!-- Prediction Form -->
<form id="predictionForm" novalidate>
<!-- Input fields for environmental parameters -->
<div class="row">
<div class="col-md-6">
<label class="form-label">Nitrogen (N)</label>
<input type="number" class="form-control capsule-input" name="n" required>
</div>
<!-- More input fields... -->
</div>
<button type="submit" class="btn btn-primary">🌾 Get Recommendations</button>
</form>
<!-- Loading Indicator -->
<div id="loading" class="loading-dots text-center" style="display:none;">
Analyzing your soil and climate data<span class="dot">.</span><span class="dot">.</span><span class="dot">.</span>
</div>
<!-- Results Section -->
<div id="result-section" class="mt-4" style="display:none;">
<h4>🌾 Recommended Crops</h4>
<div id="result-box"></div>
</div>
</div>
</div>
</div>
<!-- Footer -->
<footer class="footer">
<!-- Footer content -->
</footer>
<!-- Bootstrap JavaScript -->
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/js/bootstrap.bundle.min.js"></script>
<script>
// JavaScript code for form handling and API calls
</script>
</body>
</html>
// Add event listener for form submission
document.getElementById("predictionForm").addEventListener("submit", async (e) => {
e.preventDefault(); // Prevent default form submission
// Validate required fields before sending (see validateForm below)
if (!validateForm()) return;
// Collect form data into the named fields the API schema expects
const formData = new FormData(e.target);
const payload = {
N: parseFloat(formData.get("n")), // Nitrogen
P: parseFloat(formData.get("p")), // Phosphorus
K: parseFloat(formData.get("k")), // Potassium
temperature: parseFloat(formData.get("temperature")), // Temperature
humidity: parseFloat(formData.get("humidity")), // Humidity
ph: parseFloat(formData.get("ph")), // pH value
rainfall: parseFloat(formData.get("rainfall")), // Rainfall
top_n: parseInt(formData.get("top_n"), 10) || 5 // Crops to return (default 5)
};
try {
// Show loading indicator
document.getElementById("loading").style.display = "block";
document.getElementById("result-section").style.display = "none";
// Make API request to backend
const response = await fetch("https://cropie-sys.onrender.com/predict", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload) // Named fields matching the API's Environment schema
});
// Parse response
const result = await response.json();
// Handle successful response
if (response.ok) {
displayResults(result.predicted_crops);
} else {
showError(result.error || "Unknown error");
}
} catch (err) {
// Handle network errors
showError("Network error: " + err.message);
} finally {
// Hide loading indicator
document.getElementById("loading").style.display = "none";
}
});
// Function to display results in the UI
function displayResults(crops) {
const resultBox = document.getElementById("result-box");
const resultSection = document.getElementById("result-section");
// Format crop names (convert snake_case to Proper Case)
const formattedCrops = crops.map(crop =>
crop.split('_').map(word =>
word.charAt(0).toUpperCase() + word.slice(1)
).join(' ')
);
// Display results with typing animation
resultBox.innerHTML = "";
resultSection.style.display = "block";
typeText(formattedCrops.join(", "), resultBox);
}
// Function to show error messages
function showError(message) {
alert("Error: " + message);
}
// Function for typing animation effect
function typeText(text, element) {
let i = 0;
const interval = setInterval(() => {
if (i < text.length) {
element.innerHTML += text.charAt(i);
i++;
} else {
clearInterval(interval);
}
}, 40);
}
// Add input event listeners to all number inputs
document.querySelectorAll('input[type="number"]').forEach(input => {
input.addEventListener('input', function() {
validateInput(this); // Validate on every input change
});
});
// Function to validate individual input fields
function validateInput(input) {
const value = parseFloat(input.value); // Get numeric value
const min = parseFloat(input.min); // Minimum allowed value
const max = parseFloat(input.max); // Maximum allowed value
const validationIcon = input.parentNode.querySelector('.validation-icon');
// Check if value is within valid range
if (isNaN(value) || value < min || value > max) {
// Invalid input styling
input.style.borderColor = '#f44336';
input.style.background = '#ffebee';
if (validationIcon) {
validationIcon.textContent = '✗'; // Show X mark
validationIcon.className = 'validation-icon invalid-icon';
}
} else {
// Valid input styling
input.style.borderColor = '#4CAF50';
input.style.background = '#f1f8e9';
if (validationIcon) {
validationIcon.textContent = '✓'; // Show checkmark
validationIcon.className = 'validation-icon valid-icon';
}
}
}
// Function to validate entire form before submission
function validateForm() {
let isValid = true;
const inputs = document.querySelectorAll('input[required]');
inputs.forEach(input => {
if (!input.value) {
isValid = false;
// Add shake animation to empty required fields
input.classList.add("shake-animation");
setTimeout(() => {
input.classList.remove("shake-animation");
}, 1500);
}
});
return isValid;
}
The backend is built with FastAPI and PyTorch for machine learning inference.
# Import required libraries
from fastapi import FastAPI # FastAPI framework for building APIs
from pydantic import BaseModel # Data validation using Python type annotations
import torch # PyTorch for deep learning
import torch.nn as nn # Neural network modules
import joblib # For saving and loading scikit-learn models
import numpy as np # Numerical computing
import asyncio # For asynchronous programming
# Define MLP model architecture
class CropMLP(nn.Module):
def __init__(self, input_dim=7, output_dim=2698):
super(CropMLP, self).__init__() # Initialize parent class
# Define sequential neural network layers
self.model = nn.Sequential(
nn.Linear(input_dim, 128), # Input to hidden layer 1
nn.ReLU(), # Activation function
nn.Linear(128, 256), # Hidden layer 1 to hidden layer 2
nn.ReLU(), # Activation function
nn.Linear(256, output_dim), # Hidden layer 2 to output
nn.Sigmoid() # Sigmoid activation for multi-label classification
)
def forward(self, x):
return self.model(x) # Forward pass through the network
# Load pre-trained model and MultiLabelBinarizer
mlb = joblib.load("mlb.pkl") # Load label binarizer
model = CropMLP(input_dim=7, output_dim=len(mlb.classes_)) # Initialize model
model.load_state_dict(torch.load("crop.pth", map_location=torch.device("cpu"))) # Load weights
model.eval() # Set model to evaluation mode
# Initialize FastAPI application
app = FastAPI(title="Async Crop Prediction API") # Create FastAPI instance with title
# Define request body schema using Pydantic
class Environment(BaseModel):
N: float # Nitrogen content
P: float # Phosphorus content
K: float # Potassium content
temperature: float # Temperature in Celsius
humidity: float # Relative humidity
ph: float # Soil pH value
rainfall: float # Annual rainfall
top_n: int = 5 # Optional parameter with default value
# Asynchronous prediction function
async def async_predict_crops(env_features, top_n=5):
await asyncio.sleep(0) # Yield to the event loop once; note the PyTorch forward pass below still runs synchronously
# Convert input to tensor and add batch dimension
env_tensor = torch.tensor(env_features, dtype=torch.float32).unsqueeze(0)
with torch.no_grad(): # Disable gradient calculation for inference
probs = model(env_tensor).numpy().flatten() # Get probabilities
# Get indices of top N predictions in descending order
top_indices = probs.argsort()[-top_n:][::-1]
return [mlb.classes_[i] for i in top_indices] # Return crop names
# Define prediction endpoint
@app.post("/predict")
async def predict(env: Environment):
# Extract features from request
features = [
env.N, env.P, env.K,
env.temperature, env.humidity,
env.ph, env.rainfall
]
# Get crop predictions
crops = await async_predict_crops(features, top_n=env.top_n)
return {"environment": features, "predicted_crops": crops} # Return response
# Install required Python packages
pip install fastapi uvicorn torch scikit-learn joblib
# Run the FastAPI server with auto-reload for development
uvicorn app:app --host 0.0.0.0 --port 8000 --reload
# For production deployment (without auto-reload)
uvicorn app:app --host 0.0.0.0 --port 8000 --workers 4
The backend keeps a clean separation of concerns: the Pydantic `Environment` model validates the request schema, the `CropMLP` class encapsulates the network, the model artifacts are loaded once at startup, and the `/predict` endpoint only orchestrates feature extraction and inference.
The ML model was trained on a comprehensive dataset using PyTorch. Below is the training code and process.
# Import required libraries
import json # For reading JSON data
import numpy as np # Numerical computations
import pandas as pd # Data manipulation
import torch # Deep learning framework
import torch.nn as nn # Neural network modules
import torch.optim as optim # Optimization algorithms
from sklearn.model_selection import train_test_split # Data splitting
from sklearn.preprocessing import MultiLabelBinarizer # Multi-label encoding
from torch.utils.data import DataLoader, TensorDataset # PyTorch data handling
import joblib # For saving the fitted label binarizer (used at the end of training)
# Load and preprocess dataset
data = [] # Initialize empty list for data
with open("crops.jsonl", "r") as f: # Open dataset file
for line in f: # Read each line
data.append(json.loads(line.strip())) # Parse JSON and add to list
df = pd.DataFrame(data) # Convert to pandas DataFrame
feature_cols = ["N", "P", "K", "temperature", "humidity", "ph", "rainfall"] # Feature columns
X = df[feature_cols].values.astype(np.float32) # Convert features to float32
# Preprocess labels (multi-label encoding)
y_raw = df["label"].apply(lambda x: x.split(",")) # Split comma-separated labels
mlb = MultiLabelBinarizer() # Initialize multi-label binarizer
Y = mlb.fit_transform(y_raw).astype(np.float32) # Transform labels to binary matrix
# Split data into training and validation sets (80-20 split)
X_train, X_val, Y_train, Y_val = train_test_split(X, Y, test_size=0.2, random_state=42)
# Convert numpy arrays to PyTorch tensors
X_train_tensor = torch.from_numpy(X_train) # Training features
Y_train_tensor = torch.from_numpy(Y_train) # Training labels
X_val_tensor = torch.from_numpy(X_val) # Validation features
Y_val_tensor = torch.from_numpy(Y_val) # Validation labels
# Create PyTorch datasets and data loaders
train_dataset = TensorDataset(X_train_tensor, Y_train_tensor) # Training dataset
val_dataset = TensorDataset(X_val_tensor, Y_val_tensor) # Validation dataset
train_loader = DataLoader(train_dataset, batch_size=32, shuffle=True) # Training data loader
val_loader = DataLoader(val_dataset, batch_size=32) # Validation data loader
# Define the neural network architecture
class CropMLP(nn.Module):
def __init__(self, input_dim, output_dim):
super(CropMLP, self).__init__() # Initialize parent class
self.model = nn.Sequential( # Sequential container for layers
nn.Linear(input_dim, 128), # Input to first hidden layer
nn.ReLU(), # ReLU activation for non-linearity
nn.Linear(128, 256), # First to second hidden layer
nn.ReLU(), # ReLU activation
nn.Linear(256, output_dim), # Output layer
nn.Sigmoid() # Sigmoid for multi-label probability output
)
def forward(self, x):
return self.model(x) # Forward pass
# Initialize model with correct dimensions
input_dim = X_train.shape[1] # Number of input features (7)
output_dim = Y_train.shape[1] # Number of output classes (2698 crops)
model = CropMLP(input_dim, output_dim) # Create model instance
# Define loss function and optimizer
criterion = nn.BCELoss() # Binary Cross-Entropy loss for multi-label classification
optimizer = optim.Adam(model.parameters(), lr=0.001) # Adam optimizer with learning rate
# Training loop
num_epochs = 50 # Number of training epochs
for epoch in range(num_epochs):
model.train() # Set model to training mode
train_loss = 0 # Initialize training loss
# Batch training
for xb, yb in train_loader: # Iterate through training batches
optimizer.zero_grad() # Clear previous gradients
outputs = model(xb) # Forward pass
loss = criterion(outputs, yb) # Calculate loss
loss.backward() # Backward pass (compute gradients)
optimizer.step() # Update weights
train_loss += loss.item() * xb.size(0) # Accumulate loss
train_loss /= len(train_loader.dataset) # Average training loss
# Validation phase
model.eval() # Set model to evaluation mode
val_loss = 0 # Initialize validation loss
with torch.no_grad(): # Disable gradient computation
for xb, yb in val_loader: # Iterate through validation batches
outputs = model(xb) # Forward pass
loss = criterion(outputs, yb) # Calculate loss
val_loss += loss.item() * xb.size(0) # Accumulate loss
val_loss /= len(val_loader.dataset) # Average validation loss
# Print training progress
print(f"Epoch {epoch+1}/{num_epochs} - Train Loss: {train_loss:.4f} - Val Loss: {val_loss:.4f}")
# Save trained model and label binarizer
torch.save(model.state_dict(), "crop.pth") # Save model weights
joblib.dump(mlb, "mlb.pkl") # Save label binarizer
The model was trained for 50 epochs, with training and validation loss printed after each epoch to monitor convergence and watch for overfitting.
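Exact loss values depend on the dataset snapshot and are not reproduced here. As a hedged sketch, top-k precision on the validation split is one way to quantify recommendation quality; it reuses `model`, `X_val_tensor`, and `Y_val_tensor` from the training script above:

# Fraction of top-k recommended crops that are true labels, averaged over
# the validation set (illustrative metric, not from the original script)
def precision_at_k(model, X_val, Y_val, k=5):
    model.eval()
    with torch.no_grad():
        probs = model(X_val)                 # Shape: (n_samples, n_classes)
    topk = probs.topk(k, dim=1).indices      # Indices of the k highest scores
    hits = Y_val.gather(1, topk)             # 1.0 where a top-k crop is a true label
    return hits.sum().item() / (k * X_val.size(0))

print(f"Precision@5: {precision_at_k(model, X_val_tensor, Y_val_tensor):.3f}")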
# Function to make predictions with the trained model
def predict_crops_nn(model, mlb, env_features, top_n=10):
model.eval() # Set model to evaluation mode
# Convert input to tensor and add batch dimension
env_features = torch.tensor(env_features, dtype=torch.float32).unsqueeze(0)
with torch.no_grad(): # Disable gradient computation for inference
probs = model(env_features).numpy().flatten() # Get probability scores
# Get indices of top N predictions in descending order
top_indices = probs.argsort()[-top_n:][::-1]
return [mlb.classes_[i] for i in top_indices] # Return crop names
# Test the model with sample environmental conditions
test_envs = [
[60, 40, 70, 26, 75, 6.3, 180], # Environment 1: Moderate conditions
[10, 5, 5, 20, 60, 5.5, 100], # Environment 2: Low nutrient, tropical
[80, 60, 90, 30, 80, 6.8, 200] # Environment 3: High nutrient, warm
]
# Make predictions for each test environment
for i, env in enumerate(test_envs):
top_crops = predict_crops_nn(model, mlb, env, top_n=10) # Get top 10 predictions
print(f"🌱 Environment {i+1}: {env}") # Print environment
print("Top predicted crops:", top_crops) # Print predictions
Adam (Adaptive Moment Estimation) was chosen as the optimization algorithm for training our neural network.
# Adam optimizer configuration in our training code
optimizer = optim.Adam(model.parameters(), lr=0.001)
# Parameters:
# - model.parameters(): All trainable parameters of the neural network
# - lr=0.001: Learning rate (step size for parameter updates)
# - betas: (0.9, 0.999) - default exponential decay rates for the first (momentum) and second (squared-gradient) moment estimates
# - eps: 1e-8 - default value for numerical stability
# - weight_decay: 0 - no L2 regularization by default
Adam combines ideas from RMSProp and Momentum: it maintains an exponentially decaying average of past gradients (the first moment, as in Momentum) and of past squared gradients (the second moment, as in RMSProp), applies bias correction to both, and scales each parameter's update individually. The standard update rules are shown below.
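For a parameter vector $\theta$ with gradient $g_t$ at step $t$, learning rate $\alpha = 0.001$, and the defaults $\beta_1 = 0.9$, $\beta_2 = 0.999$, $\epsilon = 10^{-8}$ listed above, the Adam update is:

$$
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t &= \frac{m_t}{1 - \beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t} \\
\theta_t &= \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
\end{aligned}
$$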
Why we didn't use SGD: plain SGD applies a single global learning rate to every parameter, so it typically needs more careful tuning and more epochs to converge than Adam's adaptive per-parameter updates.
Why we didn't use RMSProp: RMSProp adapts per-parameter step sizes from a decaying average of squared gradients, but it lacks Adam's first-moment (momentum) term and bias correction.
Why we didn't use Adagrad: Adagrad accumulates squared gradients without decay, so its effective learning rate shrinks monotonically and can stall learning over long training runs.
Hyperparameters such as the learning rate, batch size, hidden-layer widths, and number of epochs were tuned against validation loss; an illustrative sketch follows.
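As one concrete illustration (an assumption for this document, not the team's recorded procedure), a small learning-rate grid search can reuse the objects defined in the training script:

# Illustrative learning-rate sweep with a short budget per candidate
best_lr, best_val = None, float("inf")
for lr in [1e-2, 1e-3, 1e-4]:
    candidate = CropMLP(input_dim, output_dim)
    opt = optim.Adam(candidate.parameters(), lr=lr)
    for _ in range(5):                        # 5 epochs per candidate
        candidate.train()
        for xb, yb in train_loader:
            opt.zero_grad()
            loss = criterion(candidate(xb), yb)
            loss.backward()
            opt.step()
    # Score the candidate on validation loss
    candidate.eval()
    val = 0.0
    with torch.no_grad():
        for xb, yb in val_loader:
            val += criterion(candidate(xb), yb).item() * xb.size(0)
    val /= len(val_loader.dataset)
    if val < best_val:
        best_val, best_lr = val, lr
print(f"Best learning rate: {best_lr} (val loss {best_val:.4f})")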
We used BCELoss (Binary Cross-Entropy Loss) because this is a multi-label problem: one environment can suit several crops at once, so each sigmoid output is scored as an independent yes/no decision rather than competing in a softmax over classes. (A common, more numerically stable alternative is to drop the final Sigmoid and use nn.BCEWithLogitsLoss.)
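A standalone example of the multi-label fit: one sample can carry several positive labels, and BCE scores each output independently (the values here are made up for illustration):

# Two of three crops suit this field at once -- softmax could not express this
import torch
import torch.nn as nn

criterion = nn.BCELoss()
probs = torch.tensor([[0.9, 0.1, 0.8]])   # Sigmoid outputs for 3 crops
target = torch.tensor([[1.0, 0.0, 1.0]])  # Crops 1 and 3 are both suitable
print(criterion(probs, target).item())    # Mean BCE over the 3 labels, about 0.145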
Additional optimizations implemented in our training: shuffled mini-batches of 32 via `DataLoader`, gradients disabled with `torch.no_grad()` during validation, and losses averaged over the full dataset each epoch so training and validation numbers are directly comparable.
This project was developed by GROUP B from the Bachelor of Computer Technology program at Meru University of Science and Technology.
- **Team Lead & Developer:** Coordinated project development and contributed to both frontend and backend
- **Backend Developer:** Implemented the FastAPI server and machine learning integration
- **Frontend Developer:** Designed and implemented the user interface and JavaScript functionality
- **UI/UX Designer:** Created the visual design and user experience flow
- **Data Analyst:** Preprocessed datasets and analyzed model performance

Additional features are planned for future versions of the system.