# Verdant demo voiceover

Voice: `en-US-GuyNeural`

Approximate narration duration: 197.0s (target ~180s)

---

Welcome to Verdant, an AI-powered crop disease intelligence platform built for smallholder farmers.
In the next three minutes, you will see how a single photograph becomes a full diagnostic story with weather, soil, market context, and a three-tier treatment plan.

We start on the landing page. Verdant is designed for low bandwidth and mobile-first workflows: capture a photo, stream live agent reasoning, and receive structured guidance—not a black box.

The diagnose flow is the heart of the product. Farmers upload up to five images. Behind the scenes, a ReAct-style agent coordinates vision analysis, agronomic knowledge retrieval, Open-Meteo weather, SoilGrids soil context, and regional market signals. Each tool call is streamed to the UI so extension workers and NGOs can audit the reasoning trail.
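The loop described above can be sketched in a few lines. Everything here is illustrative: the names `ToolEvent`, `run_diagnosis`, and the stub tools are stand-ins for Verdant's actual vision, weather, soil, and market integrations, not its real API.

```python
from dataclasses import dataclass
from typing import Callable, Iterator

@dataclass
class ToolEvent:
    tool: str        # which tool the agent invoked
    result: str      # the tool's (stubbed) observation

def run_diagnosis(image_note: str,
                  tools: dict[str, Callable[[str], str]]) -> Iterator[ToolEvent]:
    """Invoke each registered tool and yield one event per call,
    mirroring how each step could be streamed to the UI as it happens."""
    for name, tool in tools.items():
        yield ToolEvent(tool=name, result=tool(image_note))

# Stub tools standing in for vision, Open-Meteo, SoilGrids, and market calls.
stub_tools = {
    "vision": lambda note: f"lesions detected in {note}",
    "weather": lambda note: "humid, 28C",
    "soil": lambda note: "loam, pH 6.1",
    "market": lambda note: "fungicide in stock nearby",
}

for event in run_diagnosis("leaf_photo_1", stub_tools):
    print(f"{event.tool}: {event.result}")
```

Because the loop yields events one at a time rather than returning a finished report, a frontend can render each tool call the moment it completes, which is what makes the reasoning trail auditable.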

Demo mode lets judges and trainers explore five realistic scenarios—tomato late blight in India, maize leaf blight in Nigeria, rice blast in Vietnam, cassava brown streak in Kenya, and wheat rust in Argentina—without external API keys. That means reliable demos on stage, in classrooms, and in the field.

History keeps session-linked cases for follow-up. NGOs can revisit prior diagnoses, compare trends, and export evidence for programs and grants.

The dashboard surfaces aggregate signals for partners who coordinate interventions across regions. It is a lightweight surface, introduced in phase three, that grows with your deployment.

Security and reliability are first-class: HTTPS in production, EXIF stripping on uploads, Redis-backed rate limits, request correlation IDs, and structured error codes so tool failures degrade gracefully.

Verdant connects to Z dot A I’s G L M model family through an OpenAI-compatible client, so teams can swap models without rewriting the orchestration layer. PostgreSQL stores cases, ChromaDB backs the disease knowledge base, and containers make local development reproducible.
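"OpenAI-compatible" means the provider accepts the standard chat-completions request shape, so swapping models is a matter of changing a base URL and a model string. A stdlib-only sketch of building such a request follows; the base URL, key, and model name are placeholders, and the provider's documentation is the authority on real values.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str,
                       prompt: str) -> urllib.request.Request:
    """Assemble a POST to an OpenAI-compatible /chat/completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder values; only the request shape matters here.
req = build_chat_request("https://example.invalid/v1", "sk-demo",
                         "glm-4.6", "Diagnose this tomato leaf")
print(req.full_url)
```

Because only `base_url` and `model` vary between providers, the orchestration layer above this call never needs to change.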

Continuous integration uploads test artifacts, runs security audits, and gates coverage. That discipline keeps hackathon velocity from turning into production debt.
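A CI job with those three properties might look like the following GitHub Actions fragment. This is a sketch under assumptions: the job name, coverage threshold, and file paths are illustrative, not the project's actual workflow.

```yaml
jobs:
  backend-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install -r requirements.txt
      # Gate coverage: fail the job if it drops below the threshold.
      - run: pytest --junitxml=report.xml --cov --cov-fail-under=80
      # Upload the test report even when the run fails, so reviewers can inspect it.
      - uses: actions/upload-artifact@v4
        if: always()
        with:
          name: test-report
          path: report.xml
```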

The API surface is intentionally small and documented: streaming diagnosis on POST diagnose, case history on GET cases, treatment retrieval, weather and market probes, PDF export, and organization registration for higher rate limits.

Frontend artifacts include Lighthouse reports on main, Vitest JUnit output, and coverage summaries you can attach to submissions.
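If the streaming diagnosis endpoint uses server-sent-events framing (an assumption; the script only says results are streamed), a client needs just a small parser. The event names and payloads below are made up for illustration.

```python
def parse_sse(raw: str) -> list[dict[str, str]]:
    """Split a server-sent-events stream into events,
    keeping `event:` and `data:` fields."""
    events = []
    for block in raw.strip().split("\n\n"):   # blank line terminates an event
        event: dict[str, str] = {}
        for line in block.splitlines():
            if ":" in line:
                field, _, value = line.partition(":")
                event[field.strip()] = value.strip()
        if event:
            events.append(event)
    return events

# Hypothetical sample of what POST diagnose could stream.
sample = (
    "event: tool_call\ndata: weather lookup started\n\n"
    "event: tool_result\ndata: humidity 86 percent\n\n"
)
for evt in parse_sse(sample):
    print(evt["event"], "->", evt["data"])
```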

Docker Compose brings up Postgres, Redis, ChromaDB, the FastAPI service, and Next.js with hot reload so reviewers can reproduce your environment in minutes. Environment variables separate secrets from defaults, and the README points contributors to Z dot A I keys for GLM four dot six when you want live model calls.
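A Compose file for that five-service stack could be shaped like the sketch below. Service names, images, ports, and build paths are assumptions for illustration, not the project's actual file.

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}   # secret comes from the environment
  cache:
    image: redis:7
  vectors:
    image: chromadb/chroma
  api:
    build: ./backend          # FastAPI service
    depends_on: [db, cache, vectors]
    ports: ["8000:8000"]
  web:
    build: ./frontend         # Next.js with hot reload
    ports: ["3000:3000"]
```

Keeping secrets in environment variables rather than the file itself is what lets the same Compose definition run on a reviewer's laptop and in a shared demo environment.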

This walkthrough used real screenshots from a running stack: landing, diagnose, history, and dashboard. Download the MP3 voice track if you want to re-edit the video, or share the Markdown script for localization into additional farmer-facing languages.

Thank you for watching. Open the Artifacts page in the app to browse screenshots, download this video, and share the voiceover script with your team. Together, we can move faster from diagnosis to resilient harvests.
