AI at Work: How JPMorgan Cracked the Code for Mass Employee Adoption
When it comes to enterprise AI adoption, most companies talk about transformation; few actually achieve it. But at JPMorgan Chase, generative AI isn’t just a tool for data scientists or technologists: it’s being used by half the workforce in daily work, and that share keeps climbing. ([Venturebeat][1])
Here’s the story behind that shift — and why a connectivity-first mindset may be the secret sauce.
From Skepticism to Viral Adoption
Two and a half years ago, JPMorgan launched an internal AI platform built around large language models (LLMs) and personal AI assistants. At the time, enterprise interest in AI was tentative and skepticism was common. But the bank’s experience defied expectations: adoption took off organically. Within months, usage exploded from zero to around 250,000 employees, spanning sales, operations, finance, and tech teams. Today, more than 60% of the global workforce interacts with the AI suite regularly. ([Venturebeat][1])
What’s remarkable isn’t just the scale but how employees embraced the tools. People didn’t stop at firing off one-off prompts; they built custom AI assistants tailored to their roles, shared practical use cases internally, and spread enthusiasm from team to team. That bottom-up adoption created a powerful feedback loop that accelerated usage and innovation across the company. ([Venturebeat][1])
Connectivity Over Models: The Real Competitive Edge
Most enterprises treat AI like a flashy add-on. JPMorgan’s technical leadership saw it differently. Instead of focusing only on the models themselves — which they believed would soon be commoditized — they invested early in integrating AI deeply into the company’s systems and data ecosystem. ([Venturebeat][1])
At the heart of this strategy is a connectivity-first architecture:
- The AI suite is embedded within the bank’s core systems of record, linking seamlessly to knowledge bases, CRM, HR, trading, finance, risk, and other business data. ([Venturebeat][1])
- Advanced multimodal Retrieval-Augmented Generation (RAG) workflows enrich AI outputs with structured data and diverse document types. ([Venturebeat][1])
- Continuous expansion of connectors means employees gain increasing access to the information they need — directly through the AI. ([Venturebeat][1])
This infrastructure-first approach means AI isn’t isolated — it can act on real data, not just chat about it.
As JPMorgan’s Chief Analytics Officer Derek Waldron puts it: even the most powerful AI is just a “shiny object” unless it can plug into systems, data, tools, and processes that matter in day-to-day work. ([Venturebeat][1])
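To make the connectivity-first idea concrete, here is a minimal sketch of the pattern the bullets above describe: retrieve snippets from business systems of record, then ground the model’s answer in them. Everything in it is a hypothetical illustration — the connector names, sample records, and the call_llm stub are assumptions for the example, not JPMorgan’s actual systems, data, or code.

```python
# Minimal sketch of a connectivity-first RAG query (illustrative only).
# Connector names, sample data, and call_llm() are hypothetical stand-ins,
# not JPMorgan's actual systems or APIs.
from dataclasses import dataclass

@dataclass
class Document:
    source: str  # which system of record the snippet came from
    text: str

# Hypothetical connectors into systems of record (CRM, knowledge base, ...).
CONNECTORS = {
    "crm": [Document("crm", "Acme Corp: renewal meeting scheduled for 2024-07-01.")],
    "kb":  [Document("kb", "Policy 12.3: client outreach requires compliance sign-off.")],
}

def retrieve(query: str, sources: list[str], top_k: int = 3) -> list[Document]:
    """Naive keyword matching standing in for a real multimodal RAG pipeline."""
    hits = []
    for source in sources:
        for doc in CONNECTORS.get(source, []):
            if any(term.lower() in doc.text.lower() for term in query.split()):
                hits.append(doc)
    return hits[:top_k]

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call; echoes the prompt so the flow is visible."""
    return f"[model answer grounded in]\n{prompt}"

def answer(query: str, sources: list[str]) -> str:
    # Ground the model in retrieved business data rather than letting it answer unaided.
    context = "\n".join(f"({d.source}) {d.text}" for d in retrieve(query, sources))
    prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context."
    return call_llm(prompt)

print(answer("What is scheduled for the Acme renewal?", ["crm", "kb"]))
```

The point of the sketch is the ordering: the connectors are queried first, and the model only ever sees business data it was handed, which is what lets it act on real information instead of just chatting about it.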
Building for Many Roles, Not One Tool
Another design principle underpinning success is flexibility. JPMorgan didn’t build a single “AI app” for everyone. Instead, it created a platform with reusable building blocks — like RAG, structured querying, and document intelligence — that employees combine to build role-specific assistants that actually help them do their jobs. ([Venturebeat][1])
This means a risk analyst’s AI assistant looks very different from a salesperson’s — but both get value. That approach has helped the bank avoid one-size-fits-none solutions and drive meaningful productivity gains.
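One way to picture that flexibility is as a thin composition layer: the same reusable blocks, wired together differently per role. The sketch below is a hypothetical illustration of that idea; the block names, assistant configurations, and outputs are invented for the example and are not the bank’s actual platform.

```python
# Illustrative sketch: composing reusable building blocks into role-specific
# assistants. Block names and assistant configs are hypothetical.
from typing import Callable

# Reusable platform building blocks (stand-ins for RAG, structured querying,
# and document intelligence).
def rag_search(query: str) -> str:
    return f"[knowledge-base passages relevant to: {query}]"

def structured_query(query: str) -> str:
    return f"[rows pulled from risk/finance tables for: {query}]"

def document_intelligence(query: str) -> str:
    return f"[key fields extracted from client documents for: {query}]"

def build_assistant(name: str, blocks: list[Callable[[str], str]]) -> Callable[[str], str]:
    """Compose a role-specific assistant from a chosen subset of blocks."""
    def assistant(query: str) -> str:
        evidence = "\n".join(block(query) for block in blocks)
        return f"{name} assistant answer, grounded in:\n{evidence}"
    return assistant

# A risk analyst leans on structured data plus documents; a salesperson on
# knowledge-base retrieval plus documents. Same blocks, different wiring.
risk_assistant = build_assistant("Risk", [structured_query, document_intelligence])
sales_assistant = build_assistant("Sales", [rag_search, document_intelligence])

print(risk_assistant("counterparty exposure for Acme Corp"))
print(sales_assistant("talking points for the Acme renewal"))
```

The design choice this mirrors is that the platform team maintains the blocks once, while each team decides which blocks its assistant needs, rather than waiting for a single central "AI app" to cover every workflow.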
What It All Means
JPMorgan’s experience shows that scaling AI in large organizations isn’t just a technical challenge — it’s a platform and culture challenge:
- Human behavior matters: voluntary, practical usage beats top-down mandates.
- Data access beats model hype: connectivity into real business systems unlocks real value.
- Flexible platforms beat fixed tools: empowering employees to tailor AI to their workflows drives adoption.
In other words, the future of enterprise AI isn’t about building bigger models — it’s about connecting them well.
Glossary
- Large Language Model (LLM) – A type of AI trained on massive text datasets that can generate and understand human-like language.
- Retrieval-Augmented Generation (RAG) – An AI technique that combines generative models with external data sources to produce more accurate, context-rich outputs.
- Systems of Record – Core enterprise databases or applications (e.g., CRM, finance, HR) that are authoritative sources of business information.
[1]: https://venturebeat.com/orchestration/jp-morgans-ai-adoption-hit-50-of-employees-the-secret-a-connectivity-first "JP Morgan’s AI adoption hit 50% of employees. The secret? A connectivity-first architecture | VentureBeat"