
OpenAI has expanded its data-residency options for enterprise customers, specifically users of ChatGPT Enterprise, ChatGPT Edu, and the API. According to analysts, the move could clear one of the biggest hurdles holding enterprises back from adopting the company's LLM stack at scale.
“Enterprises can move from small pilots to full deployments without violating their jurisdiction’s rules on where data should live. The reality is that, earlier, most security and compliance teams weren’t rejecting GenAI because of model design; they were rejecting it because storing data in the US or EU pushed them into conflict with GDPR, India’s incoming DPDPA norms, UAE’s federal rules, or sector-specific mandates like PCI-DSS,” said Akshat Tyagi, associate practice leader at HFS Research.
According to Tyagi, the data-residency expansion changes that: enterprises can now run workflows involving regulated or sensitive information because the data can be stored in a specified region under dedicated policies. This directly benefits heavily regulated organizations such as banks, insurance companies, hospitals, and public-sector bodies.