Lightweight VS Code extension + chatbot built for Lauzhack 2025/2026 (LLM X Law Challenge).
It helps developers avoid unintentionally violating security policies by checking code lines against policies uploaded to a retrieval (RAG) backend. It also includes a general-purpose chatbot for questions and guidance.
- Biel Altimira
- Laia Mogas
- Pol Resina
- Adrià Vico
- Developers often accidentally introduce security policy violations (secrets, insecure config, disallowed APIs, data exfiltration patterns).
- This tool flags suspicious lines and explains why they may violate your organisation's security policy, using policies you upload into our retrieval system (RAG).
- Includes a conversational assistant for general questions and guidance.
- VS Code extension that scans open files and highlights potential security-policy violations.
- Policy-driven detection: upload your security policies to the RAG backend; scans & explanations are grounded in those documents.
- Line-level explanations with links to the supporting policy snippet and confidence score.
- General chatbot integrated with the same RAG knowledge base for policy-aware Q&A.
- Lightweight local dev mode for testing with a synthetic policy corpus.
- You upload security policies (PDF/TXT/DOCX) to our backend RAG index.
- The extension sends code context (line(s), file path, minimal metadata) to the backend.
- Backend performs retrieval over uploaded policies, composes an LLM prompt, and returns:
- whether the line likely violates a rule,
- the supporting policy excerpt(s),
- a short explanation and remediation suggestion.
- The extension shows inline diagnostics and optional hover details or a side panel with the chatbot.
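The scan round-trip above can be sketched as a small Python example. The shapes and the `mock_scan` helper below are illustrative assumptions, not the demo backend's actual schema; they just show the kind of request the extension sends and the kind of diagnostic the backend returns.

```python
from dataclasses import dataclass, asdict, field

# Hypothetical request/response shapes for the extension -> backend scan call.
# Field names are illustrative; the real schema lives in the demo backend.

@dataclass
class ScanRequest:
    file_path: str
    line_number: int
    line_text: str

@dataclass
class ScanResult:
    violation_likely: bool
    policy_excerpts: list = field(default_factory=list)
    explanation: str = ""
    remediation: str = ""
    confidence: float = 0.0

def mock_scan(req: ScanRequest) -> ScanResult:
    """Stand-in for the RAG-backed scan endpoint: flags a hard-coded key."""
    if "API_KEY=" in req.line_text:
        return ScanResult(
            violation_likely=True,
            policy_excerpts=["Secrets must not be committed to source code."],
            explanation="Line appears to embed a credential.",
            remediation="Load the key from an environment variable or a vault.",
            confidence=0.9,
        )
    return ScanResult(violation_likely=False, explanation="No matching policy rule.")

result = mock_scan(ScanRequest("app.py", 12, 'API_KEY="sk-123"'))
print(asdict(result)["violation_likely"])
```

In the real extension, the `ScanResult` fields map onto the inline diagnostic (the flag and confidence), the hover details (explanation and remediation), and the linked policy excerpt.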
How to run?
- Install dependencies and build the extension from the `extension/` directory:

  ```
  cd extension
  npm install
  npm run build
  ```

- Open `extension/` in VS Code.
- Open `src/extension.ts` and press F5 to launch the Extension Development Host. A new VS Code window will open with the compiled extension loaded.
- Press Ctrl+Shift+P and run the extension command defined in `package.json` (see `contributes.commands`).
Notes
- The extension calls your backend RAG endpoint; set the endpoint/key in the extension settings or via environment variables in dev mode.
- Add extension logic inside `extension/src/`. Keep code small and testable.
- Update `package.json` if you add commands, settings, or activation events.
- Add or update unit tests for new behavior.
- Open a PR with a clear description and testing steps.
- The repo includes a demo backend (FastAPI) that:
- ingests policy documents,
- builds/queries a vector index (FAISS or memory),
- runs an LLM prompt that combines retrieved policy snippets with code context to produce diagnostics and explanations.
- Configure your LLM provider and vector store via environment variables (e.g., OPENAI_API_KEY, VECTOR_STORE).
- Point the extension to the backend URL in settings.
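To make the retrieval step concrete, here is a toy in-memory retriever standing in for the backend's vector index (FAISS or memory). A real deployment would use embeddings from the configured LLM provider; the bag-of-words cosine similarity below is an assumption that keeps the sketch self-contained and dependency-free.

```python
import math
from collections import Counter

def _vec(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts (stand-in for real embeddings).
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, policy_chunks: list, k: int = 2) -> list:
    """Return the k policy chunks most similar to the code/query text."""
    qv = _vec(query)
    ranked = sorted(policy_chunks, key=lambda c: _cosine(qv, _vec(c)), reverse=True)
    return ranked[:k]

policies = [
    "Secrets and API keys must never be stored in source files.",
    "All network traffic must use TLS 1.2 or higher.",
    "Personal data may not be logged in plaintext.",
]
top = retrieve("hard-coded api key found in source file", policies, k=1)
print(top[0])
```

The backend would then splice the top-ranked chunks into the LLM prompt alongside the code context, so the model's verdict and explanation are grounded in the uploaded policy text rather than general knowledge.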
- Backend URL / API key — required for extension → backend requests.
- LLM_PROVIDER, MODEL, VECTOR_STORE — backend config.
- Policy upload: send your policy files to the backend ingestion endpoint to populate the RAG index.
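Ingestion typically splits each uploaded policy into overlapping chunks before indexing, so retrieval can match a rule without pulling in the whole document. The helper below is a hypothetical sketch of that step; the chunk size and overlap values are illustrative, not the backend's actual defaults.

```python
def chunk_policy(text: str, size: int = 40, overlap: int = 10) -> list:
    """Split text into word-level chunks of `size` words, with `overlap`
    words shared between consecutive chunks so rules spanning a boundary
    are not lost. Values here are illustrative defaults."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

# A synthetic 100-word "policy document" for demonstration.
doc = " ".join(f"rule{i}" for i in range(100))
chunks = chunk_policy(doc)
print(len(chunks))  # 3 overlapping chunks covering all 100 words
```

Each chunk is then embedded and stored in the vector index, and the chunk text (plus its source document) is what the extension surfaces as the supporting policy excerpt.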
- This tool helps surface possible violations but is NOT authoritative. Always follow your organisation's review and approval processes.
- LLMs can make mistakes or misinterpret policies — require human review of flagged issues.
- Do not send secrets or full source repositories to third-party LLM providers without appropriate agreements (DPA, data residency).
- This is a proof-of-concept/hackathon project — production hardening, security reviews and legal checks are required before real-world use.
- Keep policy documents concise and well-structured for better retrieval results.
- For high-sensitivity environments, run the RAG/LLM stack on-prem or with an approved provider.
- Use the extension in “suggest-only” mode until confidence and policy coverage are validated.
- Open a file and edit a line; diagnostics appear as inline warnings if a potential violation is detected.
- Hover the diagnostic marker to see the explanation and the policy snippet reference.
- Use the chatbot panel to ask policy-aware questions like: "Is storing API keys in this file allowed?" — the reply will cite matching policy lines.
- extension/ — VS Code extension source (TypeScript)
- demo/ — example backend (API, ingestion, RAG helpers)
- docs/ — short guides, sample policies
- tests/ — unit/integration tests
- MIT License
- Repo: https://github.com/laiamp/Legal-Compliance-Assistant
- Issues / feature requests: use GitHub Issues and tag the team (Pol, Biel, Laia, Adrià).
Thanks for checking this out — built for Lauzhack 2025/2026 (LLM X Law Challenge). Feel free to open issues or PRs! 🙌