Bethany Corbin
As a healthcare innovation and femtech attorney, Bethany Corbin is on a mission to help thought-leading companies – both national and international – revolutionize the global women’s health sector. Bethany is a strong advocate for women’s health and wellness and is passionate about transforming the way society thinks about human health. With almost a decade’s worth of hands-on legal experience, she empowers pioneering femtech companies to achieve their goals with well-crafted legal counsel and strategic guidance.
Women are 50% of the world’s population and account for 80% of consumer purchasing decisions in the healthcare industry. So why has women’s health been considered a niche market? This subset of healthcare, known as Femtech, is a business opportunity estimated at over $1 billion for a range of tech-enabled, consumer-centric products such as at-home diagnostics, trackers, and wearables. But Femtech also brings with it various risks if you are not savvy in navigating the emerging space. Femtech lawyer and expert Bethany Corbin joins this episode to discuss the opportunities and risks associated with Femtech and how her own work with startups and venture capital companies is bringing new innovations to the marketplace.
[00:00:00] KRISTINA PODNAR, host: Recently, the FTC filed a lawsuit against data broker Kochava Incorporated for selling data that could be used to trace the movement of individuals to and from sensitive locations. What does this lawsuit mean for data tracking and selling?
[00:00:13] INTRO: Welcome to The Power of Digital Policy, a show that helps digital marketers, online communications directors, and others throughout the organization balance out risks and opportunities created by using digital channels. Here's your host, Kristina Podnar.
[00:00:30] KRISTINA: Hello to all of you listening today, and welcome back to the Power of Digital Policy. I'm happy that you found some time to listen today as I speak with healthcare innovation and Femtech attorney Bethany Corbin, who's on a mission to help thought-leading companies, both national and international, revolutionize the global women's health sector.
Bethany, I think I mentioned this to you as we hopped on the call this morning, but I'm super excited to speak with you today, and I have a slew of questions. But before we dive in, I want to take a step back and ask you to help us understand fundamentally what is Femtech and what does it mean to be a Femtech attorney?
[00:01:06] BETHANY CORBIN, guest: Well, thank you so much for having me on the show today. I'm really excited to be here and to talk about one of my favorite topics, which is, of course, data privacy, data selling, and what that means for companies. So I think we're going to have a great discussion. Femtech is actually short for female health technology, and so it's really the intersection of women's health and digital health combining to form solutions that are going to help revolutionize women's healthcare going forward. As we all know, women have really been excluded from modern medicine and modern science for much of history; they weren't even allowed to participate in clinical trials until 1993. Femtech arose from the fact that women have historically been excluded from modern medicine and from the fact that they are not the same as men. There's been a lot of application of modern medicine to women on kind of a one-size-fits-all basis, where we take male physiology and apply the same symptoms, diseases, and ailments to women without taking into account how their physiology may make those symptoms and diseases different. So that's really where Femtech arose: it's designed to use these digital health solutions to help propel women's healthcare forward and make sure that we're taking into account differences in gender and sex in healthcare going forward. I am a Femtech and healthcare innovation attorney, and what that really means is that I am at the forefront of these cutting-edge digital health solutions that are impacting individuals globally. My interest is in being a women's health advocate, and for that reason, I focus a large portion of my practice on helping women's healthcare companies start up and enter this space, then walking them through their first fundraise, their subsequent fundraises, and how they can operate compliantly.
That means helping them navigate today's healthcare market, which, as many people know, is governed by a ton of regulations and is very difficult to navigate, and really helping them through to the exit. My personal passion is just watching these companies grow and seeing the new solutions that they're bringing to life and how this could transform healthcare nationwide.
[00:03:19] KRISTINA: So, you're making my heart go pitter-patter, Bethany, because one of the things we like to talk about around here is the balance between risk and opportunity, right? Anytime we have any kind of innovation, we have this risk aspect, but we also have our opportunity aspect, and it's critical for most organizations to understand how they want to balance that out. And this is such an interesting and evolving area; as you said, it's something that's fairly new. I mentioned a moment ago the FTC lawsuit against Kochava; how did that lawsuit come about, and why should consumers and businesses care about what's going on?
[00:03:52] BETHANY: Yeah, so the first thing to note about this lawsuit is that it isn't happening in a vacuum. You know, it's not like one day the FTC woke up and said, Oh, the way that healthcare data is being used by a lot of companies is improper, and we need to kind of double down on that. It's been several years in the making. And I think taking a step back to really understand the role of the FTC in monitoring the uses of healthcare data may be beneficial. The FTC is really charged here with making sure that there are no unfair or deceptive acts or practices against consumers. And as we've entered this digital world, especially a digital healthcare world that has blossomed and grown since the Covid-19 pandemic, there have been a lot of practices that are harmful to consumers occurring in these data exchange and data collection processes with companies. As that area has grown, the FTC has been monitoring it. They've been looking out for these types of unfair and deceptive practices that might be harmful to consumers. And so that's the background from which this Kochava lawsuit is stemming. We've seen the FTC go after Femtech and health tech companies before in terms of their data practices; most prevalently, it's been in the context of how they're using data in comparison to their privacy policies. That's what we saw with the FTC's Flo lawsuit in 2021, which was an action against the Femtech company Flo because it was selling or exchanging or disclosing data downstream in a way that its privacy policy, which is that externally facing policy for consumers, said it was not going to do. And from that basis, we've seen the FTC's interest in these health tech companies and healthcare data continue to increase.
We've seen the concurrence in the Flo case really say we need to be using this Health Breach Notification Rule that's been on the books forever but has never been used. So we've really seen the FTC start to pick up interest in this over the past couple of years. And now we've got this FTC lawsuit against Kochava, and that one is a little bit different because it's looking at how Kochava, as a data broker, is selling geolocation data from hundreds of millions of mobile devices and how that data can actually be used to trace where individuals have gone, and particularly what sensitive locations they might have visited, like a reproductive health clinic or an abortion clinic. That has become even more important in today's world, given that we're in a post-Dobbs world where we no longer have the Roe versus Wade protections. And I think that environment has also fed into this emphasis and analysis now on the ways in which consumer data is being used and collected and exchanged by companies nationwide.
[00:06:57] KRISTINA: Are you seeing this lawsuit almost as a signal to other companies? Do you think that the FTC is going to come out with maybe clearer regulations and guidance around what organizations should be thinking about when they're looking at collecting sensitive data or potentially sensitive data? It could be me standing on a corner in New York City, where it's harmless, or it could be me at a health clinic, which I might not want floating out there. So is this a phase, you think, where we're going to get more clarity on sensitive information that should or shouldn't be collected and disclosed? Where do you think we're going? Just look at your crystal ball for us, if you will.
[00:07:34] BETHANY: Yes, I can tell you the writing on the wall that I have seen is that we are going to see increased regulation of consumer data, especially consumer healthcare data, going forward. We've seen this lawsuit against Kochava right now. We've also seen the FTC starting to seek input and looking at an advance notice of proposed rulemaking with respect to commercial surveillance and data security. What that means is the FTC recognizes that the uses and disclosures of data may not be in consumers' best interest. They may not be protective of consumers. We may not be giving consumers a meaningful choice in how their data is used or how they choose to disclose their data. And the protections around that data may not be strong enough. So that's what the FTC is really diving into right now: looking at what additional protections we might need for consumers and their data going forward. Now is the time in which any interested individuals can participate in that advance notice of proposed rulemaking and add comments, trying to influence the way that the FTC is going to consider this going forward. They're in those very early stages of their rulemaking process, but it's something that the FTC is highly focused on right now, and it takes a long time to get that clarity. We're at the beginning stages right now, but over the next year or two, I think we're going to start to see some changing regulations and changing requirements. We're also going to see these types of investigations and complaints by federal, and potentially state, agencies looking into deceptive practices related to consumer data, and I think that's being used, as you mentioned, as a signal for where the agency wants to go in the near future with these regulations. It's also a signal to other companies out there that they're taking this seriously, right?
If you're a data broker that does something similar to what Kochava is doing, this is something you wanna be watching because how the FTC rules on this, right, or the settlement that they enter into could be reflective of how they might treat your organization going forward. It helps to set those standards that may not yet be memorialized in a rule-making process, but really gives insight into the agency's thinking and kind of where they're hoping to take this in the future.
[00:09:50] KRISTINA: Bethany, you mentioned your work with startups and early-stage funding of companies. One of the things that I frequently get asked is why any company that's in startup mode should put together policies. I think the fear is always that it's going to slow things down, that it's going to limit creativity or innovation. But the opposite side of that is possibly additional venture funding. What do you see in the startup space? Is this an opportunity for organizations in terms of how they position their products and services?
[00:10:22] BETHANY: Absolutely. And I will tell you, as a privacy and security attorney who very much wants to make sure that consumer privacy is safeguarded, I get a lot of pushback from startup companies. You've got a limited amount of funding, and you've gotta get your product designed, you've gotta get marketing, you've gotta make it so that you can take that product and commercialize it, and to the extent you need regulatory approvals to do that, obtain them. You've gotta hire your tech team. All of these competing interests for the money that you've got as a startup company make it so a lot of startup companies don't wanna invest in privacy and security at the outset. A lot of times, they also push back because they think, well, I'm a startup company, so you know, if the government's going to come after someone or if a cybercriminal's going to come after someone, it's not going to be me, right? I'm a pretty small piece of fruit on this huge platter, where there are other opportunities for the government and for cybercriminals to go after companies that may have a lot more data or be using that data improperly. The thing that I always tell startup companies is we've gotta shift from this risk mitigation mindset into a proactive mindset of how privacy and security, and how you use data within those protections, is actually going to benefit you as a company. Too often, we get in this mindset where we're being reactive. We think those policies and the data security infrastructure that we build are just to protect consumer data from cyber hacks, right? To protect us as the company from regulatory investigations. But we don't think about how that might actually help us close deals faster. And that's really what I like to stress to startup companies.
If you build these privacy and security and data protection practices and policies into your company at the outset, especially if you incorporate aspects like privacy and security by design, what that means is you're actually building a really solid foundation that's going to allow you to obtain funding and close deals faster. And the reason is this: if you're a healthcare company, for instance, this is really important. In the wake of the dismantling of Roe versus Wade, a lot of investors have really started to emphasize privacy. An investor put it to me this way once: privacy was something we looked at before, but it was a lowercase p. Now we're looking at privacy in all capital letters. So, as you're preparing for that fundraise, one of the things that you're most likely going to get asked about by your investors is: what are you doing with this data? Are you selling it? How are you protecting it? How are you making sure that you're not disclosing sensitive data downstream to third parties who don't have a legitimate use or need for it? Having those policies and procedures in place before you go and try to raise that round is going to help you satisfy your investors, right? Show them that you know what you're doing in this market. Show them that you're also taking seriously your users' and your consumers' privacy, security, and data protection. It's also going to help you as you start to make those industry partnerships; it'll help you make them faster. Because oftentimes, even if you're not required to comply with certain privacy and security data practices on your own, you're going to be seeking partnerships with larger companies that have very robust privacy, security, and data protection practices and requirements. And here's the thing, right?
They want a very secure and high-standard environment for data and privacy. So they're not going to contract or otherwise engage with companies that aren't going to meet the privacy, security, and data protection standards that they've got in place. If you've got that infrastructure built already, then that's not going to be a problem for you; you'll be able to say, Yes, I meet X, Y, Z requirements. I'm good to go. If you haven't built that infrastructure yet, what's going to happen is you're going to have to pause. You're going to have to take six to nine months to build that infrastructure, and then you'll have to go back and continue negotiating that deal, and it can really delay your ability to enter into that partnership and start gaining more publicity, more funding, all of that faster.
[00:14:35] KRISTINA: Well said, Bethany. One of the things that I think oftentimes even I forget about is how fast you have to slam on the brakes when you're non-compliant and you want to enter into a deal with that larger company or somebody who's upstream or downstream of you. So, it's not just about you, it's about the ecosystem that you're operating in. Excellent point. I'm wondering, as you're talking about that and thinking about the broader ecosystem, what are the key data privacy concerns for Femtech, especially when we start expanding beyond the United States? There's the EU, and we have a lot of emerging practices in Asia. What are you seeing globally, and is it any different than what you're seeing in the US?
[00:15:15] BETHANY: Yeah, it's a great question. So we do help companies globally, both building within the US and scaling outside the US, and building outside the US and scaling to come into the US, and there are some unique challenges that each of those sides face. I would say the very common challenges that I see are unfortunately also basic challenges. First off, privacy policies are not loved by a lot of companies, especially health tech and Femtech companies, because, as you mentioned before, they're seen as an inhibitor to innovation. They're seen as putting limits on what the company can or can't do with their data, and I oftentimes find that companies, especially in the startup stages, will like to copy and paste privacy policies from their competitors. One of the things that I always have to tell them is you can get in a lot of trouble with the regulatory agencies for doing that, especially because the way that you use and collect and disclose data is always going to be different from how your competitors are doing it, no matter how similar your products are. So you've gotta make sure that you're viewing these privacy policies and procedures as living, breathing documents and that you're always updating them. That's the other challenge I see from a lot of companies: they want to draft this policy and put it on the shelf, right? They don't want to think about it again. And what you've gotta recognize is that as your product evolves, that necessarily means your privacy policy has gotta evolve, because how you're handling that data is going to continuously be changing. So I always recommend that companies at least look at and revise their privacy policies on an annual basis, if not more frequently, if regulations continue to change. So that's one overarching concern that I do see a lot.
In terms of other compliance challenges, if you've got a company that's either coming into the US or growing or scaling and expanding into, let's say, the European or the Asian markets, knowing what regulations to comply with and understanding the regulatory framework of where you're going is absolutely crucial. A lot of times, I see companies trying to build infrastructure and different types of products and company designs in new locations using their home location as kind of that base, right? The problem with that is that you've got completely different structures depending on what location you're in. So I always recommend getting some regionally specific legal advice for wherever you're hoping to build to make sure that your model is going to be acceptable there. And if not, understand how you need to change it, right? If you're going, for instance, from the US to the EU, you've got things like GDPR, whereas in the US, we don't have that overarching national privacy framework; we've got more piecemeal legislation that's very sector-specific. So how you handle data is necessarily going to be different once you go into the EU and you're dealing with GDPR. That's going to be a framework that you want to have implemented before you even start to open operations over there to make sure you're fully compliant, because otherwise, the risks that you face over there from non-compliance and from regulatory agencies are going to be pretty significant. I always like to tell clients to take a step back and think about what their six-to-nine-month plan is, right? Where do you want to be six to nine months from now? If that includes being overseas, in Europe or Asia, then we need to start thinking now about what those regulations are and how your company or your product needs to change to satisfy those regulations. Then we can do the legwork upfront so that in that six-to-nine-month period, you're ready to enter that market.
And so, for many clients, it's also about counseling them to think ahead about what that six-to-nine-month goal is and starting to build the infrastructure now.
[00:19:12] KRISTINA: With that in mind, Bethany, do you think that consumers are aware at all of what it means to have their data collected? My perspective is yes, everybody should be paying attention to privacy and security, but that's also from a very digital policy Sherpa perspective. But do you think there's actually been an uptick in consumers caring more? What are you seeing out there in the marketplace?
[00:19:34] BETHANY: Yes, there has definitely been an uptick. I will say it's very interesting because the ways in which consumer data is being used haven't really changed; companies are still collecting more data than they oftentimes need and selling that data downstream to companies like Facebook or Google. So there's always been this, to put it nicely, exploitation of consumer data to a certain extent, and consumers have always felt like they don't have a choice. They typically want to use an app, and they sign up for that app, for instance. They have to click on that Yes, I accept the privacy policy. They don't even read it, and they've relinquished their data, and they've gotten used to that model because that's how it's been for so long. They haven't had, especially when we're thinking about the US, a real meaningful choice in saying: you know what? I want to use your app, but I don't want you to disclose my data downstream to Facebook, so I'm going to click no for that disclosure. But yeah, you can disclose it downstream to a healthcare research organization. That one's fine. Thinking about the majority of companies out there, we haven't really had that type of consumer consent before. Since the fall of Roe versus Wade in the US, I have seen a significant uptick in consumers wanting to know about their privacy and their data rights, especially in the healthcare context, for the obvious reason that reproductive health data could now potentially get women in trouble or their providers in trouble. Women and other consumers have been very interested in knowing their rights now, knowing how their data is being collected and used downstream, how it's being sold, and what they can or can't do about that. And I do expect that to start to bleed over into other areas as consumers continue to get savvier about their rights, their obligations, and how companies are using that data.
Because of that, I've started to see really high-value privacy, security, and data protection practices becoming a market advantage and distinguisher for companies that do have those policies and procedures in place to protect consumer data. Before, it wasn't necessarily as much of a distinguisher, because consumers didn't really have a way in which they could compare the different apps or products out there on the market. There wasn't a study that had been done that showed you the best health apps for X, Y, and Z. Now there are studies being done that say, okay, here are the top 20 apps that protect your privacy in this sector, and here are the apps that are the most data-hungry in this sector. So maybe you should instead use the apps that aren't as data-hungry, because they're going to collect less data, right? They're going to be less inclined to sell that data downstream. I've really started to see these types of studies and publications coming out and giving consumers more of the tools they need to make the informed choices that before, I think, they just assumed they didn't have the power to make. And I do anticipate that that's going to flow down to other sectors. Especially when we think about healthcare, right now we're definitely seeing it in the Femtech sector for the obvious reason that we no longer have Roe versus Wade. I'm seeing it start to bleed now into things like mental health apps. That's been another area that's gotten a lot of publicity lately, with some studies done on how that data's being used and disclosed downstream, and I think it's going to continue to flow from there.
[00:22:54] KRISTINA: You mentioned that the FTC is paying attention to what privacy policies state, how organizations are using data, and whether they are being transparent, trustworthy, et cetera. I'm wondering, who's paying attention to data privacy and security in Femtech? Is there actually anybody from a regulatory perspective that's looking at it through this Femtech lens? And is there anybody, whether it's in the US or globally, that's really focused on educating consumers? Because it seems like there's a little bit of a void between regulators and businesses. What comes between those two?
[00:23:33] BETHANY: Yeah, it's a wonderful question. Femtech really falls under the healthcare umbrella. There are a couple of actors that focus on Femtech, but I wouldn't say they focus exclusively on Femtech; they're looking at Femtech from this broader lens of healthcare data privacy and security. One thing to note is that there's also, a lot of times, this misconception among consumers that the Health Insurance Portability and Accountability Act, known as HIPAA, applies to a lot of the Femtech data that's out there. That's actually a misconception and not necessarily true because of the way in which we have a piecemeal federal privacy landscape. HIPAA applies to healthcare data, right, but it doesn't actually apply to all healthcare data; it only applies to certain healthcare data known as protected health information that's being held by covered entities or their business associates. And oftentimes, a Femtech company is not going to be considered a covered entity. So that means that, from the federal level, what's really applying to these companies is the FTC's prohibition against unfair and deceptive acts and practices and that requirement to have a privacy policy so consumers understand how their data's being used and collected. Then, on the state level, there are some states, I think it's about five right now, that have passed privacy legislation. To the extent that that legislation is broad enough to encompass health data, it might apply to Femtech apps. But we have a huge void in Femtech, and it's not just Femtech, it's also other types of health tech: there's this regulatory void for how data is supposed to be protected, because we're falling outside the bounds of HIPAA most often, and we're kind of being left to be regulated by the FTC and any state legislation that's out there. So that's an unfortunate scenario that we've got right now.
I know that there has been some talk and some proposals for a new HIPAA privacy rule that would potentially expand to cover other types of health data and actors that are using this health data. But right now, there's not as much regulation for Femtech as you would think. On the consumer education side, it's very interesting to me that there's not really an agency out there with the mission of educating consumers and making sure that they understand their privacy rights, and that those privacy rights are something they not only understand but know how to enforce so they can have a meaningful choice. Really, we've seen the FTC kind of play that role again in monitoring those unfair and deceptive acts or practices. And there are a bunch of lawyers out there in the privacy and security fields that try to bring that information to consumers. I think there are some TikTok accounts and some Instagram and Twitter accounts out there that walk consumers through what the different portions of privacy policies mean: what does it mean to opt in or opt out? Where can they find those options in the settings for each of the apps? But really, there's nothing, I always think of it as there's nothing like a school, right, that teaches consumers how to do this. It would be interesting to me, for instance, if this were something that could be taught at a high school or a college level to allow consumers to really understand their own data privacy rights and practices. Because once your data's out there, it's out there; you can't really put that cat back in the bag, which is very unfortunate, because if you put data out there that, down the line, gets hacked or gets disclosed, you don't have any control over that. And so I think education of consumers at earlier stages is going to be extremely necessary and beneficial.
Especially in this global digital world today, teenagers are putting stuff out there on social media channels, et cetera, that as adults, once they understand better how this data could get out and how it could be used by third parties in a negative manner, they may not have wanted to put out there. So I really do think that educating consumers on those rights and practices at much earlier stages is something that we need. I don't think we have anything that's really filling that gap right now.
[00:27:45] KRISTINA: It sounds like in addition to teaching keyboarding, which we still do in junior high, and I can't believe that we do that, and personal finance, we definitely need a course on data privacy and security. So maybe if somebody's listening who's in education, they can take up the mantle and help us get it done.
[00:28:00] BETHANY: Yes, I think that would be incredible. And it doesn't even have to be a full course of study; adding it to some other type of life skills course that we're teaching in high schools or colleges could be extremely beneficial, because having to wade through the legalese in a privacy policy when you don't know what it's saying or where to look can be a real challenge. Even as a lawyer, I don't like to read those for the apps that I use. So I can only imagine if you're a high schooler or in your early twenties, that's not something you're going to usually take a lot of time to read. And so I think the more guidance we have there for consumers, the more informed consumers are going to be. And down the road, we may even get a regulatory structure that is protective of consumers, because consumers actually care and are informed about how their privacy and data rights apply and what they mean to them.
[00:28:48] KRISTINA: Oh, I hope so. I'm going to invite you out to coffee in four years and ask you if anything's changed. I hope so.
[00:28:54] BETHANY: I really hope so too.
[00:28:56] KRISTINA: I was reading a McKinsey report, and they said that, depending on the scope, they estimate that Femtech's current market size can range anywhere from $500 million to $1 billion, which is huge. I'm curious; we've been talking so much about risks and privacy and security by design, but what excites you the most about the future of Femtech?
[00:29:17] BETHANY: I think there are a couple of things. There's definitely a lot of excitement. There's also some trepidation and some risk. On the negative side, first, I think we've got to build Femtech companies that have privacy and security at the forefront, because we've got the Dobbs decision now, and women are losing a lot of their confidence in Femtech because we don't have strong data privacy and data security practices right now. And so I think, on the Femtech side, a lot of companies, especially startups, are scrambling a bit to go back and revise their privacy policies and procedures and try to reassure consumers about how they're using their data. But I've started to see consumers abandon the market because of that, because we don't have that strong privacy and security infrastructure right now. And so that's something that worries me about the future for Femtech. I don't think we're going to see a total abandonment of the Femtech market, but if we don't shore up those privacy and security practices in a way that makes consumers comfortable sharing their data, we're not going to be promoting the access to care that we need. We're not going to be promoting women's healthcare. We're also going to see some disparities, I think, in the diversity of data that we attract into these apps going forward. From that perspective, what we need to focus on is getting privacy and security absolutely right in Femtech so that we don't lose the significant gains we've made in this market to date. Because Femtech is relatively new. The term was coined in 2016, and the market has seen substantial growth, as you mentioned, up to a billion dollars so far. And it's projected to reach as much as $50 billion by 2027. So the possibility is there; we've got a lot of work to do right now to ensure that we can realize that potential.
And once we set that aside, I think what excites me the most about Femtech is its possibilities in terms of how many diseases we can tackle and how much we can revolutionize women's healthcare. Right now, I will say that the majority of the Femtech market is focused on things like reproductive health and maternal healthcare, those types of things that are really tied to the reproductive system, which has led to a bit of conflation between Femtech and reproductive health. But as we all know, women's health is so much more than that, and we're starting to see other companies come in and target things like chronic women's health conditions. There's a really cool startup company right now called AOA, and it's actually focused on creating a blood test for ovarian cancer. So those types of companies that we're now starting to see enter this market and tackle long-term or chronic women's health conditions, those excite me the most, because I think that's really going to revolutionize how women's healthcare expands and potentially even change the course of millions of women's lives if we're able to focus on these longer-term problems that women experience. And the passion is there. The startups and companies coming into the field, the founders, they're there, and I think we've just got to keep pushing. It's been great to see the gains that we've made in Femtech over the past several years, and really, for me, what's exciting is that future. Once we shore up the privacy and the security and redeem consumer trust, I don't think there are going to be a lot of limits to what we can do. I think we're going to be a very successful industry, and I think that we're going to impact millions of women's lives.
[00:32:48] KRISTINA: That's such an upbeat message and a good way to conclude today's conversation. This has been an amazing conversation, Bethany. Thank you so much for popping by today and sharing your insights. More than anything, like I said before we started recording, I appreciate you being able to speak in plain language about business, digital technologies, emerging technologies, Femtech, and also the law; you're a rare breed. And we certainly appreciate you joining us today.
[00:33:16] OUTRO: Thank you for joining the Power of Digital Policy; to sign up for our newsletter, get access to policy checklists, detailed information on policies, and other helpful resources, head over to the powerofdigitalpolicy.com. If you get a moment, please leave a review on iTunes to help your digital colleagues find out about the podcast.