Have you ever wondered what the future of entrepreneurship looks like in a world where artificial intelligence can take on the roles traditionally reserved for human employees? In this episode of The Angel Next Door Podcast, host Marcia Dawood sits down with AI expert Sharon Zhang to explore the transformative impact of AI on business and society. Sharon talks about the evolving landscape of AI and its potential to not only automate mundane tasks but also to foster new business models and opportunities for creative entrepreneurship.
Sharon Zhang, who boasts over 16 years of experience in artificial intelligence, emerges as a compelling guest. She began her journey at the MIT CCL lab and has since ventured through various roles, from clinical decision-making at Nuance Communications to algorithm development for hedge funds. In 2020, she co-founded Personal AI, a platform that builds digital twins to augment individuals' lives. Sharon's extensive background provides a rich foundation for discussing AI's role in modern entrepreneurship.
This episode is a must-listen as Sharon provides a comprehensive view of the AI ecosystem, breaking it down into essential components like AI chips, infrastructure, foundation models, and applications. She shines a light on data privacy and the significance of user sovereignty over personal data. Furthermore, Sharon shares insights on the financial challenges faced by AI startups, the strategic moves by industry giants like OpenAI and Microsoft, and the burgeoning field of AI agents capable of performing complex tasks. Whether you’re an entrepreneur, an investor, or simply fascinated by AI, this episode offers a treasure trove of knowledge and foresight into the future of artificial intelligence and its profound implications.
To get the latest from Sharon Zhang, you can follow her below!
LinkedIn - https://www.linkedin.com/in/xiaoranz1986/
Use the code PODCAST50 for 50% off any personal plan for 30 days!
Sign up for Marcia's newsletter to receive tips and the latest on Angel Investing!
Website: www.marciadawood.com
Learn more about the documentary Show Her the Money: www.showherthemoneymovie.com
And don't forget to follow us wherever you are!
Apple Podcasts: https://pod.link/1586445642.apple
Spotify: https://pod.link/1586445642.spotify
LinkedIn: https://www.linkedin.com/company/angel-next-door-podcast/
Instagram: https://www.instagram.com/theangelnextdoorpodcast/
TikTok: https://www.tiktok.com/@marciadawood
Marcia Dawood
Well, hi, Sharon. Welcome to the show.
Sharon Zhang
Hi, Marcia. Thanks for inviting me.
Marcia Dawood
Yeah, well, I am fascinated to have you on the podcast and have you tell us a little bit about all of the things AI that you talked about at our Angel Capital Association Women Investor Forum earlier this month, and you just did such an amazing job explaining it. So many people came up to me afterward and said, oh, my gosh, this is like, I've never really understood all of the different facets of AI. And Sharon explained it so beautifully. So maybe let's just start out, though, with a little bit about your background and how you got into all of this.
Sharon Zhang
Yeah, definitely. So I've been in AI for about 16 years now. I started at the MIT CCL lab. My undergrad thesis was actually on clinical decision making, so I think I stumbled on it back then and really just continued the journey. I worked at Nuance Communications for about five years on different clinical applications, mostly focused on speech recognition, NLP, and question answering back then. And then, as I call it, I went over to the dark side for a bit, working for a hedge fund for two years, really writing algorithms for automated trading, and then came back to learn more about the provider problem, the payer problem, and also the patient problem, because I think my heart was always in the healthcare space.
Sharon Zhang
So I worked very closely with different cancer centers, like City of Hope, and also Kaiser Permanente, on different clinical and operational applications of AI in healthcare. Fast forward: my last company was Glint, where we did employee engagement and analytics, really using NLP, and that was sold to LinkedIn a few years after. And now, since 2020, I've been building Personal AI. I'm the technical co-founder. So my co-founders and I have really built up something that's kind of magical, even for us, I think, to this day. We're trying to build these digital twins of people, employees, individuals, to really augment them in their lives.
Marcia Dawood
Yeah, I know. I can't wait till we get to the part where you can actually have employees that are AI people. Oh, so interesting. So I know that I learned a ton when you broke down the building blocks of the AI stack. Can you walk us through the hardware, infrastructure, foundation models, and applications?
Sharon Zhang
Yeah, definitely. I think there are many pieces in the AI stack, like you said. Foundationally, the first thing that we think about is chips, AI chips. So that encompasses everything from Nvidia's GPUs and AMD's GPUs to some of the newcomers, such as Groq, who have accelerated quite quickly on AI-specific compute, I would say. And then the second layer, built on top of the chips, is what I think of as the infrastructure layer. So that's what we all know as the public clouds: Google Cloud and Azure, as well as AWS, of course. And then there are also some newcomers in that area who are more focused on accelerating the AI stack. So Together AI is a really good example.
Sharon Zhang
MosaicML is a really good example, and both of them specialize in how to train faster, abstracting that part away from the developers who are building on the infrastructure. So they're also very focused on that specific AI niche. And then the layer on top of that one is what we now know as foundation models. So that's the Anthropics of the world, the OpenAIs of the world. And on the open source side, of course, you have Llama from Meta, and then you also have things like Mistral, which a lot of you have probably also heard about. So those guys are building models on top of these chipsets and also on top of this infrastructure, and they build these foundation models that are essentially a generalized layer anyone can use to build their own applications, which I think of as an app store moment. So if you think back on when Apple first released the App Store, there weren't that many apps, maybe a few hundred, and now there are hundreds of thousands of apps; it enabled people to build. And I think OpenAI is very similar to that.
Sharon Zhang
I think when it first launched its developer ecosystem, there were around 300 apps, and now there are around 2 million developers building on it. And for open source, there's even more, right? And then we come to the application layer, which I think is where most of us touch and feel things on a daily basis. So if you have used things like ChatGPT, if you have used things like Stability AI or Midjourney, all of those are application layer, right? They are really the user experience layer of the AI stack. And I think there's an explosion of them right now, in both the enterprise setting and the consumer setting, across different types of verticals. So it's really interesting to see how that field is evolving.
Marcia Dawood
Yeah. So, okay, back to the first layer, the chips. How did Nvidia get such a huge market share? Because aren't they like 70% to 85% of the market share?
Sharon Zhang
I think they're around like 90% of market share.
Marcia Dawood
Oh, my gosh. Wow.
Sharon Zhang
Yes. It's a good question, I think. I don't know the internal reasons, of course, but Nvidia was very focused on GPU computing, and it just happens that GPU compute didn't come about because of AI, but AI can utilize GPUs much more efficiently than it can use CPUs. I think generally the adoption happened a little bit organically because of that: GPUs are just very good at running AI compute, therefore the AI market adopted them. The other thing I think of as a main driver is that everyone was building on Nvidia in terms of software. What do I do today if I want to set up anything? I pull the Nvidia Docker image from Docker Hub, and then all of the different software stacks are built on top of that. I think in a way they won because the different developer ecosystems, very early on, adopted the Nvidia SDKs and Nvidia stack to build on.
Sharon Zhang
Right. So that helped them penetrate the market.
Marcia Dawood
Interesting. So then, just today I saw an article that OpenAI is raising money again, and they're oversubscribed in a millisecond by billions of dollars. Right. And they've already raised several billions of dollars. So earlier you equated OpenAI to the Apple App Store when it first started. We know OpenAI and ChatGPT go together, but what are the other types of things that are being created off of OpenAI?
Sharon Zhang
Yeah, a few really good early examples. Perplexity, for example: I don't know if you have used them before, but right now I believe they have around 50 million monthly active users. They're also one of the most successful consumer AI companies at the moment. When they first started, they built off of GPT-4. It allowed them to push the product to market very quickly, right? Very meaningfully ahead of everyone else who was trying to build their own models. Right now, I believe they have fine-tuned their own models on top of the open source Llama stack, but that's after they had done their proof of concept and had commercial success.
Marcia Dawood
Interesting. So then what about ChatGPT? And you used an example in Boston about Inflection.
Sharon Zhang
Yeah, I think I was talking about the rise and fall of different unicorns. Really a comparison of two that, on the surface, look very similar in terms of amount raised or in terms of monthly active users, but there's something fundamental underneath that made one of them successful and the other one not. Right. So ChatGPT and Inflection AI were one such pair. For me, the one way AI is very different from a traditional consumer company, any kind of SaaS company, is that AI is expensive, right? So on average, if you buy an H100 today, I think it runs somewhere from $20K to $30K. Right. That's quite expensive for a chip, right?
Marcia Dawood
Yeah.
Sharon Zhang
And you need many thousands, if not hundreds of thousands, of them. Right. To be able to do something meaningful like a foundation model. But look at Inflection's and OpenAI's approaches to this. Right. OpenAI raised, I think, about one-point-something billion back in 2019. Inflection raised, like, 1.3 billion, I think in 2022. So they seem to have raised about the same amount of money.
Sharon Zhang
The difference is OpenAI was backed by Microsoft Azure. They had a very early partnership with Microsoft, which I can equate to almost infinite compute. So they never had to worry about their cloud bill, if you will, whereas Inflection, I believe, purchased a 22,000-H100 cluster, which spent almost half of their raise on just that. So that capex was very high at that point. So Inflection, a year later: they're still around, but the major team has moved on to become the Microsoft AI consumer group. So I think that is one of the essential differences. It's actually funny, because it feels like we went backwards in a way. In the cloud days, you were able to spin anything up on AWS, and you could be a startup, you could be a one-person shop, and be able to spin up tons of services without incurring a lot of costs, because you weren't buying the physical machines. And now it feels like there's a race for Nvidia machines, so everyone's buying the physical machines.
Sharon Zhang
Right. And then you need to spend a lot of money upfront to do that, which means you have a very high raise even before you make any revenue.
Marcia Dawood
Wow. You gave another example about, I think it was, Midjourney and Stability AI.
Sharon Zhang
Yep.
Marcia Dawood
One raised a lot of money and the other raised none.
Sharon Zhang
Yeah. So that's also an interesting one. Midjourney is one of my favorite image generation tools. So is Stability, actually, because they put out some amazing open source image generation tools. Stability raised about, I think, $100 million, at over a billion-dollar valuation, which made them a unicorn at the time. But quickly they were running into a deficit, because image generation is an expensive task, right? So if you're supplying that, open source, to everyone in the world, it becomes quite expensive very quickly.
Sharon Zhang
So because they were running, I think, around a $10 million deficit every month, they quickly burned through that money. Right. Whereas Midjourney actually decided not to build any app besides the model layer at first. I still remember the first time I joined their Discord server. I believe that Discord server is now 20 million people, but when I first joined, there were like 60 people. Right? Oh, my gosh. They didn't have to build an application. All they did was say, hey, in Discord, send a message.
Sharon Zhang
Right? And they used the Discord bot to push back the images that are generated. So in a way, they built up a community in Discord without extra expense on building their own application or their own mobile app. And they were quickly monetizing based on the image generation. So per unit of image generation, they were already making a lot of revenue. Another lesson learned: now, I believe, they're making around $200 million in ARR with about 40 people, which makes them one of the most successful companies. So I think the lesson here is really to think about how you can monetize early, because it is quite a costly operation to run a company like this.
Marcia Dawood
And so you said the image generation is very costly. Is that because of all the computing power that goes behind it and you have to pay for that computing power to get it to work?
Sharon Zhang
Yep, exactly. Yeah. Everything goes back to that equation.
Marcia Dawood
So, like, with ChatGPT, you can use it as a user for free, and then even the premium is only $20 a month, which, I mean, still, $20 a month is not super cheap, but it's not super expensive. So is ChatGPT making money? Like, how does that all work?
Sharon Zhang
Yeah, so I believe the last revenue number I heard is around $3 billion ARR for ChatGPT, or OpenAI really, in general. But they are still losing around $5 to $6 billion a year. So they're definitely not profitable. And five to six, of course, is people's estimation online. Since they're not a public company, there's no way to really know, I suppose. But they are still, yes, losing quite a bit of money there. But I think their model is also slightly different, because for the free plan, if you're on that, or I believe even the paid $20 plan, they are using the data to train the models.
Sharon Zhang
Right. So, in a way, they're paying for the data by having the users use the compute.
Marcia Dawood
Interesting. So then talk a little bit about the evolution of the AI technology. You went from, like, 2018 to where we are today and where we're going.
Sharon Zhang
Yeah, that was fun for me to talk about, because I lived through all of it. I guess the first paper that came out that really changed everything is called "Attention Is All You Need." In that paper, they introduced a new architecture for how you do compute efficiently. Backtrack a few decades: even in the 1960s and seventies, we already had neural networks. It's just that we never had enough compute, and it was never fast enough for us to see any reasonable results. Right? So the transformer changed all of that. It created a parallel computing scheme where you can compute efficiently and reach a very good model within a meaningful amount of time, right? So you don't have to wait thousands of years for something to happen anymore. That changed everything. And everything we're talking about today in terms of GenAI is mostly built on top of that foundation, which is the transformer.
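The heart of the "Attention Is All You Need" architecture Sharon mentions is scaled dot-product attention. A toy single-head version (not the full transformer, and the random inputs are purely illustrative) can be sketched in a few lines of NumPy:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy single-head attention: each output row is a weighted
    average of the value rows, weighted by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of every query to every key
    # softmax over keys (subtract the row max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# 3 tokens, 4-dimensional embeddings, random just to exercise the math
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

The key property behind the "parallel computing scheme" she describes is that every token's output is computed from all the others in one batch of matrix multiplications, with no sequential recurrence.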
Sharon Zhang
And then in 2019, you started to see the first generation of foundation models. It came first with GPT-2 and was quickly followed by GPT-3. Both of these were obviously released by OpenAI. At the time, GPT-2 was actually an open source model. I believe the stat is that the biggest at the time, GPT-2 XL, was 1.2 billion parameters, which is still smaller than what we know today as the small language models released by Microsoft and Google. Then GPT-3 created this interesting dynamic, which is what we were talking about before as the App Store moment. GPT-3 not only came with the models, it came with a set of APIs for anyone to be able to develop on top of.
Sharon Zhang
That changed a lot of things. Then you saw things like Jarvis, or Jasper, come out, which help writers write better or help bloggers write faster, et cetera. That created a whole new job description called prompt engineering. It depended on people who can prompt the AI, or talk to the AI, the best. That's really how I think about it. Then what came after is this era of fine-tuning, which generally means, hey, not only can you call these APIs on top of the models that were already trained, you can also tune them on your own data. So that gave a lot of power to people who have very specific data or domain-specific data. At that time, around 2022, people were creating these legal-specific models or medical-specific models to help with their specific domains.
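The "build on the API" pattern Sharon describes, which made prompt engineering a job, is mostly assembling instructions and context into a prompt and sending it to a hosted model. A minimal sketch, with the model call stubbed out so it runs offline (the stub and the function names are illustrative, not any provider's real SDK):

```python
def fake_completion(prompt: str) -> str:
    """Stand-in for a real foundation-model API call, so this runs offline."""
    return f"[model answer to: {prompt}]"

def draft_blog_post(topic: str, tone: str = "friendly") -> str:
    # "Prompt engineering" is largely assembling instructions + context like this,
    # then handing the string to the model API.
    prompt = (
        "You are a helpful writing assistant.\n"
        f"Write a short blog post about {topic} in a {tone} tone."
    )
    return fake_completion(prompt)

post = draft_blog_post("angel investing")
```

Tools like the early Jasper were, at their core, carefully engineered prompt templates like this wrapped in a product, with the real API call where `fake_completion` sits.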
Sharon Zhang
So I think that was really the first time enterprises could use something to do meaningful things for their businesses. And after that, of course, came the first open source model. So 2023 is when Llama got released. Llama is, of course, made by Meta, and in terms of open source models, it was a big step forward, because now anyone can go and download these models and do anything they want with them. So the foundation model is no longer exclusive to OpenAI, no longer exclusive to Anthropic, right? The general audience can now create their own things built on top of Llama. And the last time I checked Mistral.
Sharon Zhang
Mistral is an open source foundation model as well, and the company behind it, and they had 2 million downloads every month. That's an incredible number of downloads, compared to the 2 million developers that OpenAI has right now. And then from there is what we know as RAG. I call 2024 really the year of RAG. This is the first time that everyone heard about retrieval-augmented generation, right? What that means is you can actually push some amount of your data into the foundation model without having to retrain it, and get meaningful results, right? A lot of you have probably experienced it: whether you use Claude or ChatGPT, you can simply upload a file and ask about the file instantly. Before, what would happen is, if you asked, hey, what was the Super Bowl score for 2024, ChatGPT would simply say, oh, I don't know about that, because I'm only trained up to November 2021. So that solves a lot of the recency problems.
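The RAG pipeline Sharon describes can be sketched end to end with a toy keyword retriever standing in for a real embedding model and vector store, and with the final "generate" step stubbed (in practice you would pass the assembled prompt to a foundation model):

```python
def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Toy retriever: rank documents by word overlap with the query.
    Real systems use embeddings + a vector index instead."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def rag_prompt(query: str, documents: list[str]) -> str:
    # Push fresh data into the model's context instead of retraining it.
    context = "\n".join(retrieve(query, documents))
    return (f"Context:\n{context}\n\n"
            f"Question: {query}\n"
            f"Answer using only the context above.")

docs = [
    "The 2024 Super Bowl final score was Chiefs 25, 49ers 22.",
    "Transformers were introduced in 2017.",
]
prompt = rag_prompt("What was the 2024 Super Bowl score?", docs)
```

The model never needs to have seen the 2024 game in training: the relevant document is retrieved and placed in the prompt, which is exactly why uploading a file and asking about it works.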
Sharon Zhang
And I think from here we see a glimmer of something called AI agents, which I'm super excited about. An AI agent really addresses: how does the AI take actions outside of your conversation, outside of your chats? Right? So if I ask the AI, hey, you know what, this sounds good, can you go book a restaurant for me for my New York trip next week? Can the AI actually take the action, as if it were a human assistant, and go negotiate with external systems? I think that is going to be what's happening next year. So next year we'll see a lot of different types of agent companies, which is really interesting.
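Under the hood, an agent is essentially a loop: the model chooses a tool, the tool's result is fed back in, and this repeats until the model answers. A minimal sketch with a scripted "model" and one toy tool (in a real agent, `scripted_model` is an LLM call and the tool hits an actual booking API; all names here are illustrative):

```python
def book_restaurant(city: str, date: str) -> str:
    """Toy tool: a real agent would call an external booking system here."""
    return f"Booked a table in {city} on {date}."

TOOLS = {"book_restaurant": book_restaurant}

def scripted_model(history: list[str]) -> dict:
    """Stand-in for the LLM that decides the next action from the history."""
    if not any("Booked" in h for h in history):
        return {"action": "book_restaurant",
                "args": {"city": "New York", "date": "next week"}}
    return {"action": "final_answer", "args": {"text": history[-1]}}

def run_agent(goal: str, max_steps: int = 5) -> str:
    history = [goal]
    for _ in range(max_steps):
        step = scripted_model(history)
        if step["action"] == "final_answer":
            return step["args"]["text"]
        result = TOOLS[step["action"]](**step["args"])  # take the action
        history.append(result)                          # feed the observation back
    return "gave up"

answer = run_agent("Book a restaurant for my New York trip next week")
```

The loop structure, not the model itself, is what lets the AI "negotiate with external systems" rather than just reply in the chat.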
Marcia Dawood
Wow, that is so interesting. So you also said that humans learn by playing against each other, and AI learns by playing against itself, right? Explain that a little bit.
Sharon Zhang
Yeah, so I think last week I was on a podcast, and someone asked me what the moment was where I felt magic in terms of the GenAI revolution. And I told them it was the AlphaGo moment. AlphaGo was something that Google DeepMind came up with to play the game Go. For a very long time, it was, I guess, hypothesized that AI could never play Go, because there are just too many permutations, too many combinations, and it's not possible for the compute to happen within that time frame. And AlphaGo, what it achieved is not only the fact that it beat the world champion of Go, like, five times consecutively; it also achieved this by solely learning from playing itself. So there's something called AlphaGo Zero, which means it started with no prior knowledge of any of the games that ever happened before. Right? It played millions of games against itself to learn the best strategy for playing Go. All it had were the basic rules that you need to follow, the rules of Go. And it was able to really evolve its own strategy, evolve its own playing principles, to the extent that it was able to beat human champions.
Sharon Zhang
Right. I thought that was fascinating, because essentially we shrank thousands of years of Go's evolution into a few months by having machines play against themselves.
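AlphaGo Zero's self-play involves deep networks and tree search, far beyond a few lines, but the core idea of an agent evaluating positions by playing both sides with the same policy can be shown on a toy game: one-pile Nim, where players alternately take 1 to 3 stones and whoever takes the last stone wins. Here the "self-play" is exhaustive: the same evaluation function plays mover and opponent.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def mover_wins(stones: int) -> bool:
    """True if the player to move can force a win.
    The agent 'plays itself': each reply is evaluated with the same function,
    just from the other side's perspective."""
    if stones == 0:
        return False  # the previous player took the last stone and already won
    # Win if some move leaves the opponent in a losing position.
    return any(not mover_wins(stones - take)
               for take in (1, 2, 3) if take <= stones)

# Game theory says positions that are multiples of 4 are lost for the mover;
# the self-play evaluation rediscovers that from the rules alone.
losing = [n for n in range(1, 13) if not mover_wins(n)]
```

Like AlphaGo Zero, nothing here encodes strategy, only the rules; the winning principle ("leave a multiple of 4") emerges from the program playing against itself, though the real system learns a policy from sampled games rather than exhaustively solving them.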
Marcia Dawood
Wow, creepy and fascinating all at the same time, right?
Sharon Zhang
Yeah.
Marcia Dawood
So I know you talked a little bit when we were in Boston about having AI employees, and we were all just fascinated. Talk a little bit about that.
Sharon Zhang
Yeah, actually, we've been seeing some of that recently. So there are companies building AI SDRs, who will book meetings for you, do research for you. And I think the generalized version of that is really thinking about how every employee can also be an AI employee. So really, a company can have a mix of both. For example, in our company, we have around ten full-timers and then five AI employees. The AI employees are trained on their specific data, and they do their own specific tasks.
Sharon Zhang
For example, we have one that knows everything about every customer call ever. So whenever we need to know something about a customer, we simply go to Andy, and we're like, hey, Andy, can you tell me about customer so-and-so, right? Andy gives us this very extensive outline: okay, on this day, you guys talked about this; on this day, we have these action items for them. So Andy would know all of that, right? Andy will also send out emails, after doing research on the company, when we have inbounds, to tell them, hey, here's why you should purchase Personal AI. Right, so that's really interesting for us. And another AI that we use quite often inside our team is Gates. Gates is our fundraising AI. Gates knows everything about company metrics, business, market, ICPs. So this is actually what we give to our investors or potential investors to do diligence, because diligence used to take so much time for us.
Sharon Zhang
Like, hey, go talk to Gates AI. Gates should be able to answer most of your questions. So I believe the future of work is really a combination of the two kinds, humans and AIs working together. And the cool thing about AI employees is also that you can keep the data as fresh as possible and always up to date. Right. And that helps automate a lot of the manual tasks that you don't want to do, or just things that need to happen while you're asleep.
Marcia Dawood
Right, right. And what's the feedback been like with, let's say, a potential investor? When you say, okay, I'm not going to do the diligence with you personally, you're going to go talk to this AI?
Sharon Zhang
Yeah, actually, feedback's been really good, because all of them are like, oh, wow, you're really using your own product to do all the investment things. Right. And also, they don't need to wait for our answer. That's the really cool thing, because they can talk to Gates 24/7. It doesn't matter when they're doing diligence. It's no longer, hey, I sent you ten questions, can you please draft the answers back to us, and then there's this three-, four-day cycle. Now they can just go and talk to Gates about anything, anytime.
Sharon Zhang
And the other thing is, even though they're talking to Gates, we can also see what's happening. So if at any time Gates doesn't know an answer, we can always jump in as humans. We can say, hey, this is the updated number for our P&L for this month versus last month. And then we can also add that to Gates for him to learn for the next question.
Marcia Dawood
Amazing. And I know there was a question in Boston about, well, should we, as angel investors, be looking at companies and saying, hey, it's a good thing that they have these AI employees? How do we even evaluate something like that?
Sharon Zhang
Yeah, that's actually one of my favorite questions from the session. As a startup founder, I think we're always lacking resources, whether that's time, people, or money, right. And I think for any company that can be creative about how to scale its efficiency without spending extra capital, kudos to them. Right. And that's really what I believe in. And internally, we have seen a huge boost in efficiency for ourselves. So I do believe that is the future.
Marcia Dawood
Yeah, that's amazing. All right, so tell everyone about Personal AI: how they can sign up, what they can do with it. Tell us all the things.
Sharon Zhang
Yeah. So Personal AI, like we've been talking about, is really a platform for you to create your own AI. We don't dictate what you can create. Right. So Marcia can create a Marcia AI, or a Marcia author AI. It's really for you to imagine what you can create, right? And then we have both the enterprise plan, which is more geared towards organizations, as well as the individual plan, which we call the personal plan. So anyone can go and sign up for the personal plan. Right now it's $40 a month. It comes with both a training platform as well as a messaging platform.
Sharon Zhang
So imagine what a Slack workspace looks like: it's like that, with AIs, plus a training studio. We actually just released our training studio this week, so it's a really cool time to go and train on any data you have, any email you have, anything in your Google Drive that you want your AI to understand and know; you can go ahead and do that. And this is a complete no-code platform, so anyone can really go and get their hands dirty. And I love seeing what our customers create. So please do tell us in the community channels when you create these things. We also have a social aspect to this; it's less of a social network, but really think about it like Slack channels, so you can share a channel with your AI in it with other people, which is a really great use case as well.
Marcia Dawood
Wow. So if I wanted to create an AI for everything that has to do with my book, angel investing, all the things I know, like all the podcasts and all that, I can upload all that information, and then somebody would be able to.
Sharon Zhang
Yeah, and you can even embed it on your website or podcast website and be like, hey, this Marcia podcast AI has everything about my past podcasts, so ask anything about any episode. Same thing with the author AI: if you want to expose it to the public, you can also share it along with your book. Right? So if someone wants to pull specific quotes from the book that they really loved, they can do that with the AI too.
Marcia Dawood
That is amazing. So many possibilities. Oh my God. And does it matter? The format, like you can upload it in a PDF, you could give it a URL, any of those things.
Sharon Zhang
Yeah. So you can do PDF, PowerPoint, URL, different text formats. You can also take notes on our platform. You can also ingest any audio, so if you have MP3s or MP4s, you can upload those directly, and it will transcribe everything for you. You can also connect directly, like I said, to your Google Drive or OneDrive, whichever one you use. But rest assured, we're not taking everything from your drive.
Sharon Zhang
It's only the specific things that you're uploading. There is a sync operation, so if you do have a Personal AI Google Drive folder, anything that you put in there will be automatically synced over time.
Marcia Dawood
And then what should people be thinking about related to data privacy?
Sharon Zhang
Yeah, that's a really good question. I think that's something every person and every company should be really mindful about. The one thing that we started the company on, really, is the foundation of: how do we keep data private for everyone, and not use, say, Marcia's data to train Sharon's model, or vice versa. Right. And I get heartburn from this, for sure, because I've been seeing things like: Slack is using your data to create their own AI; LinkedIn has an opt-out system, so they are also, by default, using your data to create their AI. So that leaves a question of, like, where is the boundary of this, right?
Sharon Zhang
So I think when you're making your choice, really be mindful: is my data being used by other people? Right. Is the data being used to train the company's AI? Do I still get to own my own data? What if I need to leave the platform? Right. Because with models, it's not that easy. It's not: I'm leaving the platform, so under GDPR you have to remove my data. It's not very simple to remove your data once it's trained into a model.
Sharon Zhang
Right. So I think, like, all of those things are things that consumers should educate themselves on. Like, where is the data really going? That is definitely a really important part.
Marcia Dawood
Yeah, that's a really good way to think about it. Yeah. Because if it's being used to train other companies' AIs or whatever, how do you extract it after it's already out there in the ether? It's so vast, you couldn't just be like, okay, now I want this piece back, and that piece back, and that piece back. Right?
Sharon Zhang
Yeah. And definitely one thing we always want to see, for, like, me and my co-founder, is almost like a decentralized system for the data, right? If you really think about it now, Facebook owns everyone's data, Google owns everyone's data, right? But what if the equation was reversed? What if Marcia owned her data, and I own my data, and whenever a third-party platform needs to use it, they need to negotiate with your personal AI? So maybe Amazon comes to your AI and says, hey, to provide you with better recommendations, we would like access to your Amazon history. But that's purely Marcia's choice, or maybe Marcia's AI's choice, to give or not give that data. As the owner of the data, you want that sovereignty and the ability to choose.
Sharon Zhang
So that's really the future that we want to see, really, like each person be able to control and have complete sovereignty over their data.
Marcia Dawood
Yeah, totally. Well, Sharon, thank you so much for coming on the show today. We'll, of course, put in the show notes where people can find out more about Personal AI, so they can build their own AI. And you are going to give us a coupon code, so there will be a coupon code in the show notes as well, which is fantastic. So thanks so much for coming on the show today.
Sharon Zhang
Yeah, of course. Thanks, Marcia, for having me.