
JP Morgan’s AI adoption hit 50% of employees. The secret? A connectivity-first architecture


When Derek Waldron and his technical team at JPMorgan Chase first launched an LLM suite with personal assistants two and a half years ago, they weren’t sure what to expect. It wasn’t long after the game-changing emergence of ChatGPT, and skepticism in the enterprise was still high. Surprisingly, employees opted into the internal platform organically, and quickly. Within months, usage jumped from zero to 250,000 employees. Now, more than 60% of employees across sales, finance, technology, operations, and other departments use the continually evolving, continually connected suite.

“We were surprised by just how viral it was,” Waldron, JPMorgan’s chief analytics officer, explains in a new VB Beyond the Pilot podcast. Employees weren’t just designing prompts; they were building and customizing assistants with specific personas, instructions, and roles, and sharing their learnings on internal platforms.

The financial giant has pulled off what most enterprises still struggle to achieve: large-scale, voluntary employee adoption of AI. It wasn’t the result of mandates; rather, early adopters shared tangible use cases, and workers began feeding off each other’s enthusiasm. This bottom-up usage has ultimately produced an innovation flywheel. “It’s this deep-rooted, innovative population,” Waldron says. “If we can continue to equip them with really easy-to-use, powerful capabilities, they can turbocharge the next evolution of this journey.”

Ubiquitous connectivity plugged into highly sophisticated systems of record

JPMorgan has taken a rare, forward-looking approach to its technical architecture. The company treats AI as core infrastructure rather than a novelty, operating from the early, contrarian stance that the models themselves would become a commodity.
Instead, they identified the connectivity around the system as the real challenge and the defensible moat.

The financial giant invested early in multimodal retrieval-augmented generation (RAG), a pipeline now in its fourth generation. Its AI suite sits at the center of an enterprise-wide platform equipped with connectors and tools that support analysis and preparation. Employees can plug into an expanding ecosystem of critical business data and interact with “very sophisticated” document, knowledge, and structured data stores, as well as CRM, HR, trading, finance, and risk systems. Waldron says his team continues to add more connections every month. “We built the platform around this type of ubiquitous connectivity,” he explains.

Ultimately, AI is a great general-purpose technology that will only grow more powerful, but if people don’t have meaningful access and critical use cases, “you're squandering the opportunity.” As Waldron puts it, AI’s capabilities continue to grow impressively, but they remain shiny objects for show if they can’t prove real-world use. “Even if superintelligence were to show up tomorrow, there's no value that can be optimally extracted if that superintelligence can't connect into the systems, the data, the tools, the knowledge, the processes that exist within the enterprise,” he contends.

Listen to the full episode to hear about:

Waldron’s personal strategy of pausing before asking a human colleague and first assessing how his AI assistant could answer the question and solve the problem.

A “one platform, many jobs” approach: No two roles are the same, so strategy should center on reusable building blocks (RAG, document intelligence, structured data querying) that employees can assemble into role-specific tools.

Why RAG maturity matters: JPMorgan evolved through multiple generations of retrieval, from basic vector search to hierarchical, authoritative, multimodal knowledge pipelines.
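To make the “basic vector search” starting point of that evolution concrete, here is a minimal, illustrative sketch of first-generation RAG retrieval: embed the query and each document, rank by similarity, and feed the top matches to the model as context. This is not JPMorgan’s implementation; the toy bag-of-words embedding, the function names, and the sample documents are all assumptions for illustration (a production system would use a dense embedding model and a real vector store).

```python
# Toy sketch of first-generation RAG retrieval: rank documents by
# similarity to a query, then hand the top hits to an LLM as context.
# All names and data here are hypothetical, for illustration only.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for a real embedding model: a term-frequency bag of words.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # "Basic vector search": score every document against the query
    # and return the k most similar ones.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

# Hypothetical systems-of-record snippets the platform's connectors
# might surface (documents, CRM notes, HR policies).
docs = [
    "Q3 risk report: credit exposure rose in the trading book",
    "HR policy: remote work guidelines for operations staff",
    "CRM note: client asked about treasury services pricing",
]
context = retrieve("credit risk exposure", docs, k=1)
print(context[0])
```

The retrieved snippets would then be prepended to the LLM prompt. The later generations the article describes layer hierarchy, authority ranking, and multimodality on top of this same retrieve-then-generate loop.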