Show HN: Real-time AI Voice Chat at ~500ms Latency

Published on: 2025-07-25 16:17:32

Real-Time AI Voice Chat 🎤💬🧠🔊

Have a natural, spoken conversation with an AI! This project lets you chat with a Large Language Model (LLM) using just your voice, receiving spoken responses in near real-time. Think of it as your own digital conversation partner.

FastVoiceTalk_compressed_step3_h264.mp4 (early preview - first reasonably stable version)

What's Under the Hood?

A sophisticated client-server system built for low-latency interaction:

1. 🎙️ Capture: Your voice is captured by your browser.
2. ➡️ Stream: Audio chunks are whisked away via WebSockets to a Python backend.
3. ✍️ Transcribe: RealtimeSTT rapidly converts your speech to text.
4. 🤔 Think: The text is sent to an LLM (like Ollama or OpenAI) for processing.
5. 🗣️ Synthesize: The AI's text response is turned back into speech using RealtimeTTS.
6. ⬅️ Return: The generated audio is streamed back to your browser for playback.
7. 🔄 Interrupt: Jump in anytime! The system handles interruptions gracefully.

Key Features

✨ Fluid Conversation: Sp ...
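
The pipeline described under "What's Under the Hood?" maps fairly directly onto a single WebSocket handler. Below is a minimal sketch of that loop, assuming a FastAPI backend; transcribe_chunk, query_llm, and synthesize are hypothetical placeholders standing in for RealtimeSTT, the LLM client (Ollama/OpenAI), and RealtimeTTS, not the project's actual code.

```python
# Minimal sketch of the capture -> transcribe -> think -> synthesize loop.
# Assumption: a FastAPI WebSocket backend; the helper functions below are
# hypothetical stand-ins, not the project's real API.
from typing import AsyncIterator, Optional

from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()


async def transcribe_chunk(chunk: bytes) -> Optional[str]:
    """Hypothetical stand-in for RealtimeSTT: accumulate audio and return the
    transcript once the end of an utterance is detected, else None."""
    return None  # placeholder


async def query_llm(prompt: str) -> str:
    """Hypothetical stand-in for the LLM call (e.g. Ollama or OpenAI)."""
    return f"You said: {prompt}"  # placeholder


async def synthesize(text: str) -> AsyncIterator[bytes]:
    """Hypothetical stand-in for RealtimeTTS: yield audio chunks as they are
    synthesized so playback can start before the full reply is rendered."""
    yield text.encode()  # placeholder; real code would yield PCM/Opus frames


@app.websocket("/ws")
async def voice_chat(ws: WebSocket) -> None:
    await ws.accept()
    try:
        while True:
            chunk = await ws.receive_bytes()            # capture + stream from browser
            transcript = await transcribe_chunk(chunk)  # transcribe
            if transcript is None:                      # utterance still in progress
                continue
            reply = await query_llm(transcript)         # think
            async for audio in synthesize(reply):       # synthesize
                await ws.send_bytes(audio)              # return audio for playback
    except WebSocketDisconnect:
        pass
```

Streaming synthesized audio back chunk by chunk, rather than waiting for the full reply, is what keeps perceived latency low. The project's interruption handling (cancelling in-flight synthesis when new speech arrives) is omitted from this sketch.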