Memory & personalization might be the real moat for AI we've been looking for. But where that moat forms is still up for grabs:

•App level
•Model level
•OS level
•Enterprise level

Each has very different dynamics. 🧵

⸻

1. App-level personalization

Apps build their own memory & context for users.

Examples:
•Harvey remembering firm-specific legal knowledge for law firms
•Abridge capturing patient conversations & generating notes for doctors
•Perplexity building long-term search profiles for individual users

➡️ Most likely in vertical applications with focused use cases and domain-specific data. This is where we at Eniac Ventures are currently doing most of our investing.

⸻

2. Model-level personalization

The model itself becomes personalized and portable across apps.

Examples:
•ChatGPT memory & custom instructions
•Meta's Llama fine-tuned on personal embeddings

➡️ Most likely in general-purpose assistants and broad horizontal use cases where user context needs to travel across apps.

⸻

3. OS-level personalization

Personalization happens at the OS level, shared across apps & devices.

Examples:
•Google Gemini native to Android
•Apple (maybe) embedding Claude via Anthropic

➡️ Most likely in consumer devices and mobile ecosystems where platforms control distribution.

⸻

4. Enterprise-level personalization

Each enterprise owns and controls its own personalization layer for employees & customers.

Examples:
•Microsoft Copilot trained on company data
•OSS models (Llama, Mistral) deployed on private infra with platforms like TrueFoundry
•OpenAI GPTs fine-tuned & hosted in secure enterprise environments

➡️ Most likely in highly regulated industries (healthcare, financial services) where data privacy and compliance are critical.

⸻

Why it matters: Different layers may win in different sectors, but where memory & personalization land may define who captures the next decade of AI value.
AI-Enhanced Personalization In Digital Products
Summary
AI-enhanced personalization in digital products refers to using artificial intelligence to create more tailored and relevant user experiences based on individual preferences, behaviors, and needs. By leveraging AI, companies can offer solutions like predictive recommendations, customized content, and user-specific functionality while balancing privacy concerns.
- Focus on user context: Design products that learn from user behavior, preferences, and patterns to deliver tailored experiences that align with individual needs across apps and devices.
- Adopt privacy-first approaches: Implement techniques like on-device processing, federated learning, and user-controlled data sharing to maintain privacy without compromising personalization.
- Utilize predictive analytics: Leverage machine learning to anticipate user needs and deliver relevant suggestions or solutions in real-time for a more engaging customer experience.
-
If you think gen AI in ads is just about creative automation, think again.

One area I've been really bullish on at Disruptive Digital is using AI to create personalized creative at scale. Imagine being able to use AI to generate the right ad for the right user at the right time...

Meta's new retail-specific AI tools should help reach that vision of enhancing both user experience and ad effectiveness, including:

→ Virtual try-ons using AI models to reduce friction
→ Background generation for Catalog ads
→ AI-powered product copy that actually converts

I'm particularly excited about virtual try-on being able to showcase AI models of different ages, genders, and body sizes wearing your products. Why? When people see themselves represented in ads, they are more likely to buy. In fact, Meta is already seeing this play out: by combining these tools with dynamic product sets, some brands saw up to a 25% drop in cost per purchase and a 23% lift in ROAS!

The takeaway? AI isn't just an efficiency lever; it's becoming a core creative partner.
-
How do we balance AI personalization with the privacy fundamental of data minimization?

Data minimization is a hallmark of privacy: we should collect only what is absolutely necessary and discard it as soon as possible. However, the goal of creating the most powerful, personalized AI experience seems fundamentally at odds with this principle.

Why? Because personalization thrives on data. The more an AI knows about your preferences, habits, and even your unique writing style, the more it can tailor its responses and solutions to your specific needs. Imagine an AI assistant that knows not just what tasks you do at work, but how you like your coffee, what music you listen to on the commute, and what content you consume to stay informed. That level of personalization would delight users. But achieving it means AI systems would need to collect and analyze vast amounts of personal data, potentially compromising user privacy and contradicting the principle of data minimization.

I have to admit, even as a privacy evangelist, I like personalization. I love that my car tries to guess where I am going when I click on navigation, and its three choices are usually right. For those playing at home, I live a boring life; its three choices are usually my son's school, our church, or the soccer field where my son plays.

So how do we solve this conflict? AI personalization isn't going anywhere, so how do we maintain privacy? Here are some thoughts:

1) Federated Learning: Instead of storing data on centralized servers, federated learning trains AI models locally on your device. This approach allows AI to learn from user data without the data ever leaving your device, aligning more closely with data minimization principles.

2) Differential Privacy: By adding statistical noise to user data, differential privacy ensures that individual data points cannot be identified, even while they still contribute to the accuracy of AI models. While this might limit some level of personalization, it offers a compromise that enhances user trust.

3) On-Device Processing: AI could be built to process and store personalized data directly on user devices rather than cloud servers. This ensures that data is retained by the user and not a third party.

4) User-Controlled Data Sharing: Systems that give users granular control over what data they share, and when, can give people a stronger sense of security without diluting the AI's effectiveness. Imagine toggling data preferences as easily as you would app permissions.

But most importantly, don't forget about transparency! Clearly communicate with your users and obtain consent when needed.

So how do y'all think we can strike the proper balance?
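The federated learning idea in point 1 can be sketched in a few lines: each device trains on its own data, and only model parameters (here, just a per-device average) ever reach the server, which aggregates them. This is a minimal, hypothetical illustration of the FedAvg-style pattern, not any specific framework's API; real systems add secure aggregation, client sampling, and many training rounds.

```python
# Minimal federated-averaging sketch: each device computes a local model
# update on its own data; the server only ever sees model parameters,
# never the raw user data.

def local_update(user_data):
    """'Train' locally: here, just a mean preference score per device."""
    return sum(user_data) / len(user_data)

def federated_average(local_params, weights):
    """Server aggregates parameters, weighted by each device's data size."""
    total = sum(weights)
    return sum(p * w for p, w in zip(local_params, weights)) / total

# Raw data stays on each device:
device_data = [
    [4.0, 5.0, 3.0],        # device A
    [2.0, 2.0],             # device B
    [5.0, 4.0, 4.0, 5.0],   # device C
]

params = [local_update(d) for d in device_data]   # only these leave the device
weights = [len(d) for d in device_data]
global_param = federated_average(params, weights)
print(round(global_param, 3))  # 3.778 - matches the mean over all raw data
```

Note that the weighted average of the local parameters equals the mean over the pooled raw data, even though no raw data was ever centralized; that equivalence is what makes the pattern attractive for data minimization.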
-
For years, companies have been leveraging artificial intelligence (AI) and machine learning to provide personalized customer experiences. One widespread use case is showing product recommendations based on previous data. But there's so much more potential in AI that we're just scratching the surface.

One of the most important things for any company is anticipating each customer's needs and delivering predictive personalization. Understanding customer intent is critical to shaping predictive personalization strategies. This involves interpreting signals from customers' current and past behaviors to infer what they are likely to need or do next, and then dynamically surfacing that through a platform of their choice.

Here's how:

1. Customer Journey Mapping: Understanding the various stages a customer goes through, from awareness to purchase and beyond. This helps in identifying key moments where personalization can have the most impact. This doesn't have to be an exercise on a whiteboard; in fact, I would counsel against that. Journey analytics software can get you there quickly and keep journeys "alive" in real time, changing dynamically as customer needs evolve.

2. Behavioral Analysis: Examining how customers interact with your brand, including what they click on, how long they spend on certain pages, and what they search for. You will need analytical resources here, and hopefully you have them on your team. If not, find them in your organization; my experience has been that they find this type of exercise interesting and will want to help.

3. Sentiment Analysis: Using natural language processing to understand customer sentiment expressed in feedback, reviews, social media, or even case notes. This provides insights into how customers feel about your brand or products. As in journey analytics, technology and analytical resources will be important here.

4. Predictive Analytics: Employing advanced analytics to forecast future customer behavior based on current data. This can involve machine learning models that evolve and improve over time.

5. Feedback Loops: Continuously incorporating customer signals (not just survey feedback) to refine and enhance personalization strategies. Set these up through your analytics team.

Predictive personalization is not just about selling more; it's about enhancing the customer experience by making interactions more relevant, timely, and personalized. This customer-led approach leads to increased revenue and reduced cost-to-serve.

How is your organization thinking about personalization in 2024? DM me if you want to talk it through.

#customerexperience #artificialintelligence #ai #personalization #technology #ceo
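As a toy illustration of the predictive-analytics step, anticipating a customer's next action can start as simply as counting behavioral transitions in historical journeys: given what a customer just did, which action most often followed for others? The event names below are invented for illustration; production systems would use richer features and proper ML models rather than a raw transition table.

```python
from collections import Counter, defaultdict

# Toy predictive model: a first-order transition table over behavior events.
# Given a customer's last action, predict the most frequent next action
# observed in historical journeys.

def fit_transitions(journeys):
    """Count how often each action follows each other action."""
    counts = defaultdict(Counter)
    for journey in journeys:
        for current, nxt in zip(journey, journey[1:]):
            counts[current][nxt] += 1
    return counts

def predict_next(counts, last_action):
    """Return the most common follow-up action, or None if unseen."""
    if last_action not in counts:
        return None
    return counts[last_action].most_common(1)[0][0]

# Hypothetical historical journeys (event names invented):
journeys = [
    ["view_product", "add_to_cart", "checkout"],
    ["view_product", "add_to_cart", "abandon"],
    ["view_product", "read_reviews", "add_to_cart", "checkout"],
]

model = fit_transitions(journeys)
print(predict_next(model, "add_to_cart"))  # checkout (2 of 3 transitions)
```

The same counting structure doubles as a feedback loop (point 5): appending each new journey and re-fitting keeps the predictions current as customer behavior shifts.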