The Gap: Our Memories Are Currently Confined to Specific AI Platforms
Today, our AI memories live in silos (e.g. ChatGPT might remember your preferences - but only inside ChatGPT). Our identities, preferences, and past interactions are confined to specific AI platforms, and our ability to control or carry them elsewhere is limited.
This creates inherent friction between users and the products they engage with. Without portable memory:
- Our product experiences lack context and personalization, leading to lower conversion and NPS
- Time-to-value for new apps increases
- Users waste energy constantly reintroducing themselves to every AI they meet
The result is a disjointed experience that holds back both user satisfaction and product performance.
The Opportunity: Portable Memory
We need new tools that manage user memory across different AI applications (e.g. agents, chat interfaces, etc) - letting users control, manage, and carry their AI memories with them across platforms. This means giving users the ability to:
- Port their interaction history and preferences between different AI apps (and potentially non-AI apps as well)
- Add, edit, or delete their memory data, just like updating a profile or settings page
- Permission access to certain parts of their memory while keeping others private
With portable memory, users don’t have to start from scratch with every new AI. And developers can build more personalized experiences without relying on cold starts or guesswork.
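To make the capabilities above concrete, here is a minimal Python sketch of what a user-owned memory store could look like. Every name and the JSON export format are hypothetical, not an existing product's API; the point is that the three operations described - adding/editing/deleting entries, and exporting only the categories the user has chosen to share - fit in very little code:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class MemoryEntry:
    """One user-owned fact, tagged with a category so it can be shared selectively."""
    id: str
    category: str  # e.g. "shopping", "writing-style", "health"
    content: str

@dataclass
class PortableMemory:
    """A user-controlled store: add, edit, delete, and permissioned export."""
    entries: dict = field(default_factory=dict)

    def add(self, entry: MemoryEntry) -> None:
        self.entries[entry.id] = entry

    def edit(self, entry_id: str, content: str) -> None:
        self.entries[entry_id].content = content

    def delete(self, entry_id: str) -> None:
        del self.entries[entry_id]

    def export(self, permitted_categories: set) -> str:
        """Serialize only the categories the user has chosen to share."""
        shared = [asdict(e) for e in self.entries.values()
                  if e.category in permitted_categories]
        return json.dumps(shared, indent=2)

# Usage: share shopping preferences with a retailer, keep health data private.
memory = PortableMemory()
memory.add(MemoryEntry("1", "shopping", "Prefers size M, minimalist styles"))
memory.add(MemoryEntry("2", "health", "Discussed anxiety in past sessions"))
payload = memory.export(permitted_categories={"shopping"})
```

The export step is where the permissioning lives: the receiving app only ever sees the categories the user opted to share.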
Past Attempts at Portable Memory Have Faced Challenges
We’ve seen many attempts at user-controlled, portable data before, but most have fallen short. For example:
- Consumer healthcare data portability has been historically challenging. While consumers care more about their health than ever before - and increasingly want to take it into their own hands - they still face barriers: personal health data remains fragmented across providers, trapped in outdated EHR systems, and difficult to standardize.
- User data monetization via micro-payments has also been difficult to scale. With third-party cookies being phased out, several platforms have tried to help advertisers directly incentivize users with micro-payments for their data. Adoption has been minimal, constrained by dollar values that are too small to motivate most users and by the need for massive network effects.
Further, while Google and the social platforms have started to offer some visibility into the data they collect on users (e.g. Facebook’s “why am I seeing this ad?” tool) in response to regulatory pressure, they stop well short of full transparency - and provide no meaningful way for users to take that knowledge elsewhere. Thus, data portability has remained challenging until now.
Why AI Could Be the Catalyst That Portable Memory Needs
The tailwinds behind user data portability are fundamentally different today:
- LLMs have a lot of context on us - and they let us access it: Models contain rich data about users (how we speak, what we care about, how we interact, etc), and unlike the social platforms, they’re actually willing to share that data with us. For example, ChatGPT’s memory feature shows users what ChatGPT remembers about them and allows users to add or delete specific pieces of their personal data (e.g. the style in which you like to write, your favorite type of restaurant, how many kids you have etc). That’s a fundamentally different foundation for data portability compared to the black boxes of Google and the social platforms. Some users are already manually extracting their personal profiles (or simply asking AI “what do you know about me”) and porting that data into other systems - abstracting that process is the obvious next step.
- The rails for portable memory are being built: Standards like the Model Context Protocol (MCP) connect LLMs to external tools and datasets - and the same infrastructure could be used to enable portable AI memory. Whether MCP becomes the default standard or not, the groundwork for model-to-model memory portability is already being laid. Additionally, companies like mem0, Heurist, Memoripy, Basic, WorkshopLabs, Letta, and Sentience are building infrastructure that allows AI to remember user history across different AI platforms.
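As a rough illustration of how MCP-style tool calling could expose memory to any compliant client, the sketch below defines a single tool schema and a dispatcher. The tool name, in-memory store, and handler are hypothetical (this is not the actual MCP SDK), though the `inputSchema` shape follows MCP's tool-definition convention:

```python
import json

# Hypothetical in-memory user store; a real service would persist this
# and enforce the user's permission settings before answering.
USER_MEMORY = {
    "writing_style": "concise, informal",
    "favorite_cuisine": "Thai",
}

# An MCP-style tool definition: a schema the server advertises so a
# connected model knows it can look up (permitted) user memory.
GET_MEMORY_TOOL = {
    "name": "get_user_memory",
    "description": "Fetch one of the user's shared memory entries by key.",
    "inputSchema": {
        "type": "object",
        "properties": {"key": {"type": "string"}},
        "required": ["key"],
    },
}

def handle_tool_call(name: str, arguments: dict) -> str:
    """Dispatch a model's tool call to the memory store."""
    if name == "get_user_memory":
        value = USER_MEMORY.get(arguments["key"])
        return json.dumps({"key": arguments["key"], "value": value})
    raise ValueError(f"unknown tool: {name}")

# A model that has discovered GET_MEMORY_TOOL could then issue:
result = handle_tool_call("get_user_memory", {"key": "writing_style"})
```

The key idea is that any AI app speaking the same protocol could call the same tool - which is exactly what makes the memory portable rather than platform-bound.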
- Regulatory pressure is building: Privacy regulations continue to move in favor of the consumer. Walled gardens are becoming harder to justify, and data portability is increasingly framed as a user right. Frameworks like GDPR and CPRA are laying the groundwork for more open, user-controlled ecosystems - and future policies may support cross-platform portability by design.
High-Impact Use Cases for Portable Memory
We think portable memory can unlock superior product experiences across several categories. These are a few where we see opportunity:
- AI Therapists: The average person needs to go to three therapists before they find the right person for them. Yet no one wants to re-explain their life story every time they try a new therapist or mental health app. With portable memory, users could bring emotional context, past sessions, and communication preferences with them - making new tools feel less emotionally draining and more effective from day one.
- AI Customer Service: Customer support is one of the most obvious places where memory can make a difference. Today, most customer support agents start from scratch - asking the same questions, missing context about the user, and offering generic responses. With portable memory, these agents could instantly understand a user’s past issues, preferences, and purchase history. For users, this means faster resolution and less frustration. For businesses, it drives higher conversion and stronger retention.
- AI Workflow Tools: AI-powered productivity tools (e.g. Notion AI, Grammarly, or an AI personal assistant) get dramatically better when they have memory. With portable memory, these tools would immediately understand how you like to write, organize your work, format, etc. Portable memory would enable these tools to produce work that sounds more like you and requires fewer corrections.
- E-commerce: Every shopping experience gets better with persistent memory. Preferences, sizes, styles, and return history - carried from AI assistants to e-commerce sites - mean faster discovery and more relevant recommendations.
- Personal Identity & Privacy: Portable memory introduces a new layer of user data flow - which means new potential privacy threats. If users are going to port sensitive data between systems, we need tools that ensure that data is protected, permissioned, and not misused. Think of this as memory + identity + privacy management. Just like password managers or identity protection services help users control access to their data today, future systems will need to help users manage what’s known about them, who can access their data, and how it’s used.
- Advertising: Most ads today still rely on outdated signals - cookies, inferred demographics, past behavior from a single platform, etc. With portable memory, ads would become far more contextual, reflecting a user’s real preferences and intent. This doesn’t have to mean less privacy, and it could actually mean more user control. If memory becomes a user-owned asset, people could permission which parts of their context are shareable for personalization and which are off-limits. This would mean fewer irrelevant ads, better ROI for brands, and a more aligned user experience.
- Collective Memory: Portable memory also opens the door to better collective experiences via shared intelligence. With permissioned data sharing, users could allow parts of their AI memory to be accessed by others: teammates, family members, etc. At work, this could look like more effective onboarding of a new hire who has access to the past AI interactions of a manager (e.g. their writing styles, insights surfaced by AI, etc). Within families, it could mean shared assistants that understand everyone’s preferences. Over time, this kind of permissioned memory-sharing could reshape how we learn and collaborate.
Open Questions
Portable memory feels inevitable, but there is still a lot we don’t know about how it takes shape and who captures value. These are a few open questions we’re actively exploring:
- What’s the right business model? B2C has always been challenging, as most consumers haven’t historically wanted to pay a subscription for data portability. B2B feels more viable, as businesses will pay for anything that drives measurable conversion and revenue. If portable memory can make their product experiences more personalized, that’s a direct ROI story.
- What will the product experience look like? Is it a data vault where users deposit and extract their memories? Is it a network layer stitching together the connective tissue of your memory from every platform you’ve interacted with? How will permissioning work and will users be able to tag and segment the type of data they’re willing to share (e.g. sharing retail purchase history but keeping therapy sessions private)? To get a product off the ground in this space, it needs to be incredibly simple to use for both companies and end-users - requiring exceptional UX and a killer initial use case.
- What role will model routing play? Portable memory could enable AI to route to the right model for the right context. For example, an AI therapist could know you prefer a more light-hearted style and pull from Grok, or a more evidence-based approach and pull from OpenEvidence. GPT-5’s in-app model routing is an early glimpse of this - what does it look like at scale?