# Wildlife Monitoring Dashboard — Yellowstone National Park

A real-time wildlife camera-trap monitoring dashboard powered by an EfficientNet V2-S classifier and multiple Explainable AI (XAI) methods. Built for the 0HM340 Human-AI Interaction assignment.

The dashboard simulates a live feed of camera-trap detections across five locations in [Yellowstone National Park](https://www.nps.gov/yell/), classifying images into seven species (bear, deer, fox, hare, moose, person, wolf) and providing interactive explanations of each prediction.

## Prerequisites

- **Python 3.13+**
- **[uv](https://docs.astral.sh/uv/)** package manager
- **CUDA GPU** (optional; falls back to CPU)

## Quick Start

### 1. Clone the repository

```bash
git clone https://git.barrys.cloud/barry/Wildlife-Detection.git
cd Wildlife-Detection
```

### 2. Obtain the model weights

The trained model weights are not included in the repository due to their size. You can regenerate them by running the training script:

```bash
uv run python train.py
```

This downloads the dataset from Roboflow (if not present), fine-tunes EfficientNet V2-S for 3 epochs, and saves `efficientnet_v2_wild_forest_animals.pt`. Optional flags:

```bash
uv run python train.py --epochs 5 --lr 0.0005 --batch-size 16
```

The **dataset** (`wild-forest-animals-and-person-1/`) is downloaded automatically from [Roboflow](https://roboflow.com/) on first run if it is not already present on disk.

### 3. Install dependencies

```bash
uv sync
```

### 4. Run the dashboard

```bash
uv run python dashboard.py
```

The server starts at **http://localhost:5000**.
## Project Structure

```
assignment-HAI/
├── dashboard.py                             # Flask application (main entry point)
├── train.py                                 # Standalone training script
├── yellowstone-camping-map.jpg              # Park map background image
├── pyproject.toml                           # Project metadata and dependencies
├── uv.lock                                  # Locked dependency versions
├── .python-version                          # Python version pin (3.13)
├── .gitignore
├── README.md                                # This file
├── USER_GUIDE.md                            # End-user documentation
├── DEVELOPER_GUIDE.md                       # Developer documentation
├── efficientnet_v2_wild_forest_animals.pt   # Model weights (not in git)
└── wild-forest-animals-and-person-1/        # Dataset (not in git)
    ├── train/
    ├── test/
    └── valid/
```

## XAI Methods

| Method | What it shows |
|---|---|
| **ScoreCAM** | Gradient-free saliency heatmap highlighting regions the model attends to |
| **LIME** | Superpixel-based explanations for the top-2 predicted classes |
| **Contrastive LIME** | Highlights which regions distinguish the top-1 prediction from the top-2 |
| **Nearest Neighbours** | Three training images most similar in feature space (cosine similarity) |

## License

Academic project — 0HM340 Human-AI Interaction, TU/e.
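For reference, the Nearest Neighbours method in the table above amounts to ranking training-set feature vectors by cosine similarity to the query image's features. The sketch below is a hypothetical, self-contained illustration of that retrieval step (the function name and the 1280-dimensional features are assumptions based on EfficientNet V2-S's pooled embedding size, not the dashboard's actual code):

```python
import numpy as np

def nearest_neighbours(query_feat, train_feats, k=3):
    """Return indices of the k training features most cosine-similar to the query."""
    q = query_feat / np.linalg.norm(query_feat)
    t = train_feats / np.linalg.norm(train_feats, axis=1, keepdims=True)
    sims = t @ q                      # cosine similarity against every training image
    return np.argsort(-sims)[:k]      # top-k, most similar first

# Demo with random stand-in features (100 "training images", 1280-d each).
rng = np.random.default_rng(0)
feats = rng.standard_normal((100, 1280))
idx = nearest_neighbours(feats[42], feats, k=3)
print(idx[0])  # 42 — a query is always its own nearest neighbour
```

In the dashboard, the three returned indices would map back to training images shown alongside the prediction.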