An agentic conversational data analysis platform powered by LangGraph. Analyze CSV data through natural language conversations with real-time preview updates.
A full-stack application that combines an intelligent agent architecture with a modern web interface. The system uses LangGraph to orchestrate a multi-step workflow that understands user intent, processes data, and generates insights, all while streaming updates to the frontend in real time.
The core of the system is a LangGraph state machine that processes user queries through three sequential nodes:
- Understand Intent - Uses the configured OpenAI chat model to analyze user messages and extract analysis requirements
- Process Data - Executes data operations (loading, filtering, summarizing) using Pandas tools
- Generate Response - Synthesizes results into natural language responses
The agent maintains conversation context and streams status updates via Server-Sent Events (SSE) for real-time UI feedback.
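A minimal sketch of how such a three-node graph can be wired up with LangGraph is shown below; the state fields and node bodies are illustrative placeholders, not the actual implementation in `backend/agent/`.

```python
# Illustrative sketch only; the real node logic lives in backend/agent/nodes.py.
from typing import TypedDict

from langgraph.graph import StateGraph, END


class AgentState(TypedDict):
    # Hypothetical state fields; the actual schema may differ.
    messages: list
    intent: dict
    result: dict
    response: str


def understand_intent(state: AgentState) -> dict:
    # Would call the OpenAI model to extract analysis requirements.
    return {"intent": {"operation": "summarize", "file": "example.csv"}}


def process_data(state: AgentState) -> dict:
    # Would run Pandas operations (load, filter, summarize) based on the intent.
    return {"result": {"rows": 0}}


def generate_response(state: AgentState) -> dict:
    # Would turn the raw result into a natural language answer.
    return {"response": "Here is what I found..."}


graph = StateGraph(AgentState)
graph.add_node("understand_intent", understand_intent)
graph.add_node("process_data", process_data)
graph.add_node("generate_response", generate_response)

graph.set_entry_point("understand_intent")
graph.add_edge("understand_intent", "process_data")
graph.add_edge("process_data", "generate_response")
graph.add_edge("generate_response", END)

app = graph.compile()
```

Compiling the graph yields a runnable app whose `invoke`/`astream` calls drive the three nodes in order.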
Frontend:

- React 18 - UI framework
- TypeScript - Type safety
- Vite - Build tool and dev server
- Tailwind CSS - Styling
- SSE (Server-Sent Events) - Real-time streaming updates
Backend:

- FastAPI - REST API framework
- LangGraph - Agent orchestration and state management
- LangChain - LLM integration framework
- OpenAI GPT-3.5-turbo - Natural language understanding and generation
- Pandas - Data analysis and manipulation
- SSE-Starlette - Server-Sent Events streaming
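As a rough illustration of how FastAPI and SSE-Starlette fit together for the streaming updates (the event payloads below are assumptions, not the actual `main.py`):

```python
# Hypothetical sketch of the SSE endpoint; the real one lives in backend/main.py.
import asyncio
import json

from fastapi import FastAPI
from sse_starlette.sse import EventSourceResponse

app = FastAPI()


@app.get("/api/stream/{session_id}")
async def stream(session_id: str):
    async def event_generator():
        # In the real app, events would come from the agent's status updates.
        for status in ("understanding_intent", "processing_data", "generating_response"):
            yield {
                "event": "status",
                "data": json.dumps({"session_id": session_id, "status": status}),
            }
            await asyncio.sleep(0.1)

    return EventSourceResponse(event_generator())
```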
```
ai-datalab/
├── backend/
│   ├── agent/
│   │   ├── graph.py              # LangGraph state machine definition
│   │   └── nodes.py              # Agent nodes (understand, process, respond)
│   ├── tools/
│   │   └── data_analysis.py      # Pandas-based data operations
│   ├── data/                     # CSV files for analysis
│   └── main.py                   # FastAPI application
├── frontend/
│   ├── src/
│   │   ├── components/           # React components
│   │   ├── hooks/                # Custom hooks (useSSE)
│   │   └── App.tsx               # Main application
│   ├── package.json              # Frontend dependencies
│   └── vite.config.ts            # Vite configuration
└── ...
```
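The Pandas-based operations in `tools/data_analysis.py` are what the Process Data node calls into. A minimal sketch of what such tools could look like (the function names and signatures here are hypothetical):

```python
# Illustrative sketch; the real operations live in backend/tools/data_analysis.py.
from pathlib import Path

import pandas as pd

DATA_DIR = Path("backend/data")  # CSV files placed here are available to the agent


def load_csv(filename: str) -> pd.DataFrame:
    """Load a CSV file from the data directory."""
    return pd.read_csv(DATA_DIR / filename)


def filter_rows(df: pd.DataFrame, column: str, value) -> pd.DataFrame:
    """Keep only rows where `column` equals `value`."""
    return df[df[column] == value]


def summarize(df: pd.DataFrame) -> dict:
    """Return basic summary statistics the agent can turn into a response."""
    return {
        "rows": len(df),
        "columns": list(df.columns),
        "describe": df.describe(include="all").to_dict(),
    }
```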
- Node.js 18+
- Python 3.8+
- OpenAI API key
```bash
cd frontend
npm install
npm run dev
```

Frontend runs on http://localhost:5175
```bash
cd backend
pip install -r requirements.txt
cp .env.example .env
# Add your OPENAI_API_KEY to .env
python main.py
```

Backend runs on http://localhost:8000
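The backend reads the key from the environment; a common pattern (assumed here, not confirmed from `main.py`) is to load `.env` with python-dotenv:

```python
# Hypothetical sketch; assumes python-dotenv is available to the backend.
import os

from dotenv import load_dotenv

load_dotenv()  # reads OPENAI_API_KEY (and anything else) from .env
api_key = os.environ["OPENAI_API_KEY"]
```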
- Start the backend server
- Start the frontend dev server
- Place CSV files in `backend/data/`
- Chat with the agent to analyze your data
- View real-time analysis results in the preview panel
- `POST /api/chat` - Process chat messages through the agent
- `GET /api/stream/{session_id}` - SSE stream for real-time updates
- `GET /api/health` - Health check
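For example, the endpoints could be exercised from Python like this; the request payload fields (`session_id`, `message`) are assumptions, so check `main.py` for the exact schema:

```python
# Hypothetical client example; the payload field names are assumptions.
import json

import requests

BASE_URL = "http://localhost:8000"

# Send a chat message through the agent.
resp = requests.post(
    f"{BASE_URL}/api/chat",
    json={"session_id": "demo", "message": "Summarize the sales data by region"},
)
print(resp.json())

# Listen for real-time status updates over SSE.
with requests.get(f"{BASE_URL}/api/stream/demo", stream=True) as stream:
    for line in stream.iter_lines():
        if line.startswith(b"data:"):
            print(json.loads(line[len(b"data:"):]))
```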
```bash
cd frontend
npm run build
```