# Wildlife Monitoring Dashboard — Nationaal Park De Hoge Veluwe
A real-time wildlife camera-trap monitoring dashboard powered by an EfficientNet V2-S classifier and multiple Explainable AI (XAI) methods. Built for the 0HM340 Human-AI Interaction assignment.
The dashboard simulates a live feed of camera-trap detections across five locations in Nationaal Park De Hoge Veluwe, classifying images into seven species (bear, deer, fox, hare, moose, person, wolf) and providing interactive explanations of each prediction.
## Prerequisites
- Python 3.13+
- uv package manager
- CUDA GPU (optional, falls back to CPU)
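The optional-GPU prerequisite implies the usual PyTorch device check; a minimal sketch (the variable name `device` is illustrative, not taken from `dashboard.py`):

```python
import torch

# Use the GPU when CUDA is available, otherwise fall back to the CPU,
# matching the optional-GPU prerequisite above.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device)
```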
## Quick Start
### 1. Clone the repository

```bash
git clone <repo-url>
cd assignment-HAI
```
### 2. Obtain the model weights

The trained model weights are not included in the repository due to their size. Place the file in the project root:

| File | Size | Description |
|---|---|---|
| `efficientnet_v2_wild_forest_animals.pt` | ~78 MB | Fine-tuned EfficientNet V2-S weights |

The model was fine-tuned in `final.ipynb`.
The dataset (`wild-forest-animals-and-person-1/`) is downloaded automatically from Roboflow on first launch if not already present on disk.
### 3. Install dependencies

```bash
uv sync
```
### 4. Run the dashboard

```bash
uv run python dashboard.py
```

The server starts at http://localhost:5000.
## Project Structure
```
assignment-HAI/
├── dashboard.py                             # Flask application (main entry point)
├── final.ipynb                              # Training, evaluation, and XAI notebook
├── map.webp                                 # Park map background image
├── pyproject.toml                           # Project metadata and dependencies
├── uv.lock                                  # Locked dependency versions
├── .python-version                          # Python version pin (3.13)
├── .gitignore
├── README.md                                # This file
├── USER_GUIDE.md                            # End-user documentation
├── DEVELOPER_GUIDE.md                       # Developer documentation
├── efficientnet_v2_wild_forest_animals.pt   # Model weights (not in git)
└── wild-forest-animals-and-person-1/        # Dataset (not in git)
    ├── train/
    ├── test/
    └── valid/
```
## XAI Methods
| Method | What it shows |
|---|---|
| ScoreCAM | Gradient-free saliency heatmap highlighting regions the model attends to |
| LIME | Superpixel-based explanations for the top-2 predicted classes |
| Contrastive LIME | Highlights the regions that distinguish the top-1 prediction from the runner-up (top-2) class |
| Nearest Neighbours | Three training images most similar in feature space (cosine similarity) |
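The nearest-neighbours view ranks training images by cosine similarity in the model's feature space. The actual retrieval code lives in `dashboard.py`; the following is a minimal sketch with hypothetical names (`top_k_neighbours`, `bank`), where each row of `bank` is a training image's feature vector:

```python
import numpy as np

def top_k_neighbours(query: np.ndarray, bank: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k bank vectors most similar to query (cosine)."""
    q = query / np.linalg.norm(query)
    b = bank / np.linalg.norm(bank, axis=1, keepdims=True)
    sims = b @ q                      # cosine similarity per bank row
    return np.argsort(-sims)[:k]      # indices, most similar first
```

For the dashboard's "three most similar training images", `query` would be the detection's feature vector and `k=3`.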
## License
Academic project — 0HM340 Human-AI Interaction, TU/e.