
HIPAA-compliant marketing is possible, and it is faster when you design for it from the start. This episode turns legal and technical constraints into a plan. First, know where marketing intersects with HIPAA and state laws. Define what counts as PHI in your programs and where risk can emerge. Choose vendors that are ready for healthcare and sign BAAs where required. Document what data flows where and how long you keep it. Build consent and preferences in plain language and collect only what you need.
On the technical side, send server-side events with filtered fields and avoid platform auto-tracking that can leak sensitive data. Keep payloads minimal but useful. On the creative side, write claims you can prove, include needed disclosures, and stay away from implied diagnoses or guaranteed outcomes. Keep a clean audit trail. Map data, authenticate domains, and run regular tag scans and event log reviews. Work with legal on a clear incident plan so you can respond quickly if something breaks. Done well, privacy protects people and unlocks growth. Brands that can prove they handle data with care will earn trust and get approvals faster.
Navigate the complexities of HIPAA, PHI, HHS, and other regulations to market effectively while staying compliant.
This panel brings legal, technical, and platform views on how to market in healthcare without crossing lines.
Things are never black and white in the healthcare space. They’re gray, it’s a spectrum.
Adam Putterman
Co-Founder of Ours Privacy, which is a customer data platform (CDP) that helps healthcare organizations run compliant, privacy-first marketing campaigns. With deep expertise in HIPAA and state regulations, CDPs, and healthcare tech, Adam brings sharp insight into how brands can navigate digital marketing while protecting patient data and building trust.
A lot of the health clients are moving into using CDPs, which are Customer Data Platforms, more because in there you have more control over what you actually send through to obey the legal team's restrictions.
Marina Alves
Technical Lead at Matchnode, where she helps digital health companies design compliant, high-performing acquisition pipelines. Her work touches everything from platform tracking to HIPAA-compliant data flows, making her the go-to expert on the technical foundation behind effective health marketing.
We were at a point in our history (during COVID) where we all realized that the old way of delivering care may not be the best way of delivering care.
Rebecca Gwilt
A healthcare regulatory attorney and co-founder of Elevare Law, where she helps digital health companies design legally sound, market-ready business strategies. With deep expertise in HIPAA, data governance, and the fine line between innovation and compliance, Rebecca is a trusted advisor to founders, marketers, and legal teams navigating an increasingly high-risk environment.
Chris Madden:
Patient privacy isn’t just important. It’s the foundation of trust, but here’s the challenge: how do you navigate HIPAA, PHI, HHS, and the alphabet soup of regulations and still market effectively? This is Marketing Digital Health, and I’m your host, Chris Madden. We’ve got some great voices back with us, Adam Putterman and Marina Alves, plus a new one.
We’re introducing Rebecca Gwilt. Rebecca Gwilt is a healthcare regulatory attorney and the co-founder of Elevare Law, where she helps digital health companies design legally sound, market-ready business strategies. With deep expertise in HIPAA, data governance, and the fine line between innovation and compliance, Rebecca is a trusted advisor to founders, marketers, and legal teams navigating an increasingly high-risk environment.
Together, they’ll help us unpack how to stay compliant, avoid legal landmines, and still connect with patients in meaningful ways. Rebecca starts with how the legal landscape around digital marketing has changed in the last few years, listing some key changes, what worked, and some unknowns that threw companies for a loop.
Rebecca Gwilt:
So the last five years, I would say, have been a really interesting time in the digital health space. We had a wave of people sort of come out of the woodwork during COVID, who all of a sudden started paying attention to what worked and what didn’t in the healthcare sector, and many of them wanted to fix it.
And we were at a point in our history where we all realized that the old way of delivering care may not be the best way of delivering care. And so lots of people jumped in and many of those people did not come from the healthcare sector. And the reality is that entrepreneurs that cross between industries often are the most disruptive and the most clever.
Mark Cuban is a great example. He all of a sudden had somebody that told him about PBMs and how prescription pricing worked, and he said, well, that doesn’t make any sense. Let’s do something different. And it really took somebody from outside the industry to come in and create a big disruption.
That said, during that time a lot of things were tested and tried, not only from an operational perspective and a technological perspective, but from a legal perspective. Models of companies and models of reimbursement that didn’t exist before 2019 became huge. What comes to mind is the remote monitoring sector. It became a multi-billion-dollar industry overnight, given these changes in CMS laws, which in many ways were ushered forth by the pandemic.
So now that we’re mostly back to business as usual, what I’m seeing in the last couple of years is a lot of differentiation out of necessity and consolidation. In 2021, there were a bajillion telehealth companies that were doing basic urgent care, internal medicine for the industry. And today what I see is clients coming to me saying, I’m really passionate about pediatric occupational therapy for kids with multiple chronic conditions. I want to build a company around this.
And those companies are being built slower and more deliberately and more creatively. And so five years ago, there was a cookie cutter roadmap for the few of us out there, shout out to my telehealth colleagues that built these kinds of businesses, and it became in a way commoditized. I don’t want to name names, but there’s companies out there that are building the infrastructure for telehealth in a way that didn’t exist before.
Now what we’re seeing is founders that are saying, yeah, yeah, yeah, we get it, we know how you sort of do that, but we’re thinking about a more creative business angle, or more creative financial angle, or more creative clinical angle. And for that reason, we’ve got to do things a little bit differently. So frankly, it’s made my job a lot more interesting.
And I also think in terms of why I get up and do this every day, it’s more fulfilling. I know exactly the kind of people that my clients are setting out to help. I know exactly how it’s going to help, and that helps me as their lawyer sort of align to that and be creative in the way that I can be.
At the HIPAA level, you are looking at a subsection of healthcare data that the federal government is able to govern, and that is protected health information, PHI, that is used and disclosed by covered entities and business associates who are processing standard transactions. Important to note a couple of things.
Most people believe that if your doctor’s office has your healthcare information, you are protected by HIPAA. Not the case. A lot of concierge practices that don’t process insurance, and cash-pay practices, are not subject to HIPAA, but they are subject to the health breach notification rule.
For digital health companies that are cash-pay companies, that do telemedicine visits and collect lots of healthcare information about you, information you’re using your credit card to pay for, your information is not protected by HIPAA at those companies unless they have payer contracts, unless they’re processing what we call standard transactions.
So there’s a whole swath of healthcare information out there that’s being collected, used, disclosed, and sold by healthcare companies that HIPAA doesn’t cover. And so OCR, which is the federal agency under HHS that enforces HIPAA, and the Federal Trade Commission, which enforces Section 5 and the health breach notification rule, got together and said, we’re nervous about the public’s information being bought and sold by companies, because I don’t think the public is aware that it’s not being protected.
Now, I know this is the case because some of the companies in this space are not aware that this is the case. So the FTC stepped in, in 2023, to police health data misuse where HIPAA doesn’t apply, and GoodRx, which is a prescription discount and telehealth app, was not a covered entity. On their website they said, we’re HIPAA compliant, in public facing documents said, don’t worry, we respect your privacy and we will never share your health information with others. And in their privacy policy that was published on their website, they said they protected the privacy information, didn’t share it with others for any marketing purposes, et cetera.
Turns out all that was baloney.
Chris Madden:
Both GoodRx and BetterHelp ended up facing fines. GoodRx was fined 1.5 million dollars for sharing prescription drug discount and telehealth user data with ad platforms like Google and Facebook, despite promises of privacy. BetterHelp agreed to pay 7.8 million dollars under an FTC order, which also bans it from sharing sensitive mental health intake data, like email addresses and therapy history, with Facebook, Snapchat, et cetera, after promising privacy.
For BetterHelp, the FTC made it a point to say there was an untrained, inexperienced, young marketing person in charge of Facebook advertising and data uploads. BetterHelp promised not to share personal health data.
Rebecca Gwilt:
but secretly did share it, or inadvertently, or what have you. They sent it to Facebook, they sent it to Google, various sort of advertising and marketing analytics platforms got real information about real patients. And the FTC said that was a deceptive trade practice. You told people that you protected their data in accordance with industry standards. You told people you were HIPAA compliant. You told people in your privacy policy that this wasn’t going to happen, and it did.
And so they got hit with the FTC’s first ever health breach notification rule enforcement action. They had to pay in the order of millions, and they had an order that’s going to last for the next 20 years that makes them subject to scrutiny by the federal government for their practices. That was a warning sign to the industry that this rule that hadn’t really been enforced ever was going to start being enforced.
The FTC and OCR together said, we’re getting serious about the privacy and data protection issues related to consumers’ healthcare information, and we’re now going to focus on it. In 2022, HHS published a bulletin, and the gist of the bulletin was even if a person is simply browsing around on your covered entity website, on your hospital website, you’re collecting their IP address, they’re on your website.
We believe it’s reasonable to conclude that because you have an IP address, which is arguably identifying, and because they’re on a healthcare website, which is indicative that they’re interested at least in healthcare, that if you are sharing that information, if you’re sharing the IP address of those folks using pixel tracking technologies and then uploading that information to a third party advertising company, that this is a HIPAA breach.
And the hospital association, the American Hospital Association, pushed back enormously. And HHS ended up revising that bulletin, but they still said, listen, just because they’re on the website, maybe that doesn’t mean that it’s PHI, but if they’re on the website looking for healthcare, then yes, it’s PHI.
And again, the American Hospital Association said that is a gross overreach. How are we supposed to know the intentions of people who are surfing our website looking up diabetes information, and the American Hospital Association filed suit against HHS. They pushed back pretty hard. HHS ended up losing that case, and in the summer of 2024, the court out of Texas basically ruled that this narrow piece of this guidance, that said just the fact that someone is browsing around on your covered entity website and may be interested in healthcare, that doesn’t mean it’s PHI. That’s an overreach that expects that the hospital knows the intent in the mind of the person as they’re browsing, which obviously they could never know, and because they are not clairvoyant, that information, that IP mixed with just being on a healthcare website, isn’t enough to call that information PHI.
What I saw is a lot of my clients and others in the industry saying, whew, that guidance is null and void. We don’t have to worry about pixel tracking technologies anymore. The American Hospital Association won. Let’s keep doing what we’ve been doing. And the reality is that narrow portion of that guidance has been rescinded, and you’ll see it on the HHS website or on the OCR website, but the majority of that guidance stands, and this is where things get tricky.
The court said, regulations say in order for something to be PHI, it has to be individually identifiable health information, and that means that it has to have two key features. One, the information by itself has to be identifiable, and many courts have ruled that an IP address by itself with nothing more is not identifiable, but an IP address mixed with someone’s name is identifiable.
So it has to be individually identifiable, and it has to have to do with the past, current, or future health or healthcare of that person or payment for healthcare related to that person. And so companies, when they’re deciding what kind of information that they can pass back to third party platforms, need to make sure that information is either not having to do with healthcare or not identifiable.
If it is both and they are a covered entity, they have violated HIPAA. If it is both and HIPAA does not apply to them, there’s a lot more flexibility. We can go through some best practices, but you’ve got to have some best practices together to make sure that you are not running afoul of Section 5, the law against unfair and deceptive trade practices.
That means your privacy policy should be very clear about what you do and do not do with that information. Your website should be disclosing and not hiding how you’re using patient information. And depending on the state you’re in, there might be opt out or even opt in requirements for certain uses of information and types of information.
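Rebecca’s two-prong test can be made concrete in code. The sketch below is illustrative only: the field names are hypothetical, and whether data is actually PHI is a legal determination, not a dictionary lookup.

```python
# Illustrative sketch of the two-prong PHI test described above:
# information is PHI only if it is BOTH individually identifiable
# AND related to health, healthcare, or payment for healthcare.
# Field names here are hypothetical examples, not a legal standard.

IDENTIFYING_FIELDS = {"name", "email", "phone", "address"}   # an IP alone may not qualify
HEALTH_FIELDS = {"diagnosis", "prescription", "appointment_type", "therapy_type"}

def is_likely_phi(record: dict) -> bool:
    """Flag a record that is both identifiable and health-related."""
    identifiable = any(field in record for field in IDENTIFYING_FIELDS)
    health_related = any(field in record for field in HEALTH_FIELDS)
    return identifiable and health_related

def safe_to_share(record: dict) -> bool:
    """A record failing either prong is not PHI under this sketch."""
    return not is_likely_phi(record)
```

Under this rule, an email address paired with a therapy type would be flagged, while either field on its own would not.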
Chris Madden:
Marina Alves is our technical lead at Matchnode. We introduced Marina in episode seven during the overview of the paid acquisition funnel in digital health. After Rebecca has set the stage, we turn back to Marina Alves. Marina shares how marketers are responding to those privacy challenges in real time, specifically how data gets managed, what platforms will or won’t allow, and why more teams are turning to customer data platforms or CDPs to keep control of what’s shared.
Marina Alves:
Technically, we would have the freedom to send through all the data that we can and we have access to send. Most of the time, we are not allowed to for legal reasons, either that be a restriction from an ad platform or from the client’s own legal team, who doesn’t want us to share something with a pixel or have that kind of information on the website.
So the trend, I’d say, that I’m seeing is a lot of the health clients are maybe stepping away a little more from Google Tag Manager and moving into using CDPs, which are Customer Data Platforms, more because in there you have more control over what you actually send through to obey the legal team’s restrictions that they’re putting on there.
Chris Madden:
Adam Putterman is the co-founder of Ours Privacy. We first introduced Adam in episode ten, when we discussed technical setup around events, signal resilience, and attribution. Adam dives deeper into CDPs. He explains why they’ve become essential for healthcare marketers who want to run performance campaigns without exposing sensitive data to platforms like Meta and Google.
Adam Putterman:
What happened a few years ago that really brought the CDP conversation to the forefront of many healthcare marketers’ minds is that there was a large increase in FTC action, OCR action, and then civil class action lawsuits and media outreach or publication highlighting healthcare companies, particularly health systems and large mental health platforms, that were not only pixeling their marketing site and sending all of that data back to a Meta or Google, but were also in some cases putting a pixel on their backend platforms and sharing actual patient experience or user journey data.
So there’s a ton that’s wrong with that. And I think where the regulations and the guidance and really consumer sentiment and the market has landed is that if you sell a sensitive product or experience, a healthcare product in particular, and you’re sharing back with a Meta or Google purchase data and identifying data of who made that purchase, or Adam Putterman went to this website and did these things and then bought a therapy session, and his email is this, and his IP address is this, and the type of therapy session he bought is this, that’s health information.
What a lot of healthcare companies needed to do very quickly was find a way to run performance advertising and effective analytics without sharing that data with platforms that wouldn’t sign a BAA, or that were non-compliant. That’s where CDPs came in, because instead of having to reinvent the wheel, build all of these integrations natively, and essentially become an advertising company more than a healthcare company, what a CDP enabled these companies to do is modify or remove user properties before they went back to platforms like Meta and Google.
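The property-stripping Adam describes can be sketched roughly like this. The event shape, field names, and event naming convention are all hypothetical, not any vendor’s actual API.

```python
# Hypothetical sketch of a CDP-style transform: remove sensitive user
# properties and generalize the event name before an event is forwarded
# to an ad platform. Field names are illustrative only.

SENSITIVE_KEYS = {"email", "ip_address", "therapy_type", "diagnosis"}

def redact_event(event: dict) -> dict:
    """Return a copy of the event with sensitive properties removed
    and the specific purchase replaced by a generic conversion name."""
    cleaned = {k: v for k, v in event.items() if k not in SENSITIVE_KEYS}
    if cleaned.get("event_name", "").startswith("purchase_"):
        cleaned["event_name"] = "conversion"  # hide what was bought
    return cleaned

raw = {
    "event_name": "purchase_therapy_session",
    "email": "adam@example.com",
    "ip_address": "203.0.113.7",
    "value": 120,
}
print(redact_event(raw))  # {'event_name': 'conversion', 'value': 120}
```

The platform still receives a conversion signal it can optimize on, but not who converted or what kind of session they bought.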
Chris Madden:
Large hospital systems and some mental healthcare platforms used to encounter data privacy breaches, sharing data that clearly should not have been shared. CDPs like Ours Privacy eliminate that problem from the moment patients are onboarded. The most advanced and current approach to data management and signal resilience, Ours Privacy is HIPAA compliant and purpose-built for digital health.
And to Adam, things are never black and white in the healthcare space. They’re gray, it’s a spectrum, it’s dependent on the type of services you’re providing and many other factors.
Adam Putterman:
In general, the consensus that we’ve seen develop, and you can see this by reading the HHS guidance, you can see this by reading some of the class action cases that have been made public in the last year, is that you need to be careful sharing the combination of identifying user data and a purchase or high intent or inferable action.
The metaphor we always like to use here is if you were to go to a store or a library and check out a book on mental health, and then as you walk out, the library called up their contacts at Meta and said, hey, Adam just checked out this book. By the way, he lives at this address, his email is this, his IP address, can you find me more people that want to check out these types of books?
That feels obviously wrong. That was kind of the state of things for a long time. And so we’ve seen a correction to that. And instead, what the library might do is say, hey, someone that you sent me did something I want more of. They might have checked out a book on mental health, but they might have just walked around the library for more than 30 seconds. And so what I’m going to tell you is just that of the hundred people that you sent to my library, 50 of them did something I want more of.
And that seems to be a much more balanced take on how to run effective yet privacy protecting ads and analytics.
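Adam’s library metaphor maps to aggregate conversion reporting: share counts of qualifying visitors, never per-person detail. Here is a minimal sketch, with illustrative field names and an assumed 30-second dwell threshold:

```python
# Sketch of the "library" approach above: instead of reporting who did
# what, report only an aggregate count of visitors who performed some
# desired action. Field names and thresholds are illustrative.

def aggregate_report(visits: list[dict], min_dwell_seconds: int = 30) -> dict:
    """Count visitors who converted or dwelled past a threshold,
    returning totals with no per-visitor detail."""
    qualified = sum(
        1 for v in visits
        if v.get("converted") or v.get("dwell_seconds", 0) > min_dwell_seconds
    )
    return {"visitors": len(visits), "qualified": qualified}

visits = [
    {"converted": True, "dwell_seconds": 12},   # checked out a book
    {"converted": False, "dwell_seconds": 45},  # browsed past the threshold
    {"converted": False, "dwell_seconds": 5},   # left quickly
]
print(aggregate_report(visits))  # {'visitors': 3, 'qualified': 2}
```

The ad platform learns "50 of the 100 people you sent did something I want more of," without ever learning which 50, or what they did.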
Chris Madden:
Adam weighs in on the benefits of companies using a CDP versus alternative approaches.
Adam Putterman:
The compliance risk there is probably smaller than people expect. The real risk there is on the performance side and the resourcing side. So again, taking a step back on when a CDP makes sense or why you go with the CDP versus doing something internally, taking some other approach.
Ideally, what a CDP does is enables you to be compliant, of course. It also, of course, increases your performance. It reduces external media or legal risk because you’re sending a signal to the market that you’re taking extra privacy practices or investing in privacy practices, and then fourth, it decreases resourcing needs. You’re not having to maintain it.
So enabling compliance is something I think companies can do internally, for sure. Increasing performance starts to drive a lot of the resourcing needs, because all of these APIs, like Meta’s Conversion API and Google’s and Reddit’s, start to add up. They’re constantly changing, they require maintenance and upkeep, and the best practices change.
And then you start to get into, well, how are you doing identity stitching across all these different touch points? Eventually, in some cases, if you’re building something internally, you become a media company, not a healthcare company, and that’s really what you’re trying not to do.
So yeah, I think there are cases where it makes sense to do this internally. The main reason not to would be that it quickly becomes, or consumes, not just your marketing team but your engineering team, and that means they’re not spending time on product, and your marketing team isn’t spending time on strategy and creative and optimizations.
Chris Madden:
We broaden the lens again to include artificial intelligence. Rebecca covers the compliance risks that come with AI and why hospitals and health systems are suddenly adding strict new rules to their contracts.
Rebecca Gwilt:
So now they have entire summits all over the world, three to five days long, on healthcare AI. Top of mind for the moment in terms of what my digital health clients are asking me, a lot of it is in the context of what they need to do to get through procurement with very large healthcare buyers.
So as hospitals, health systems, employers, other very large healthcare entities, pharma, as they become more educated about the potential risks and the nature of the AI that could be used within their organization, their legal departments are beefing up their compliance measures, their contracts, their governance around this. AI has the ability to very quickly get very bad. It is one of these things where it goes bad and the scale of the bad could be enormous.
And these are entities that are in charge of very sensitive data about millions and millions of people, and they are going to err on the side of very stringent measures once they get their head around it. I would say a year ago, two out of five hospitals didn’t even ask about it. This year, probably three to four out of five hospitals ask about it, and one out of five has really mature language in their contracts that either says you can’t use it at all, or forces the conversation about what it is, what it’s doing, what information it’s collecting, how it’s training its algorithm, and whether it’s using de-identified or identified data.
What are the outputs, who owns the outputs, who owns the analyses, what are the IP implications. All those things are bubbling up in the procurement process now in the digital health space. And what I tell my clients about is they better have answers to these questions, or what was already a hard process, a six to twelve month procurement process, is either going to be scrapped because you can’t get past their AI governance board, or it’s going to be extremely long and that’s going to put your company in jeopardy, especially for early stage companies.
Chris Madden:
I was curious about how working in digital health and dealing with privacy concerns has affected Adam and his own health journey. Even as a patient, it’s easy to sign forms without realizing what you’re agreeing to. His experience shows just how much fine print can slip past us.
Adam Putterman:
Yeah. I need glasses, and the other day I went to get my eye exam so that I could get glasses. I was checking in, and after I signed in, they’re like, okay, now we need you to sign a few forms. And then they just turned the signature block around and asked me to sign.
I asked, what am I signing, and they’re like, oh, it’s a HIPAA authorization. Then of course my ears perked up, because I’m like, oh, I wonder what it is. But they wouldn’t show me what it was, they were just asking me to sign it, which, now I’m probably the worst customer or patient from that perspective. I want to know exactly what I’m authorizing and what I’m giving consent to.
So I asked them to show me, and they couldn’t show me on the screen, so they had to print it out. And then I reviewed it before signing, and it ended up being interesting, because among many things I was potentially unknowingly consenting to, one was publishing the scans of my eye with, I think, identifying information in any journal, which felt like something I definitely didn’t want to do.
I still signed it because I needed the eye exam and I didn’t want to go somewhere else. But I think the number one way that it’s affected my health is that I’ve become a pretty terrible patient from that perspective, which probably doesn’t help me get great healthcare. I just want to know. It’s not that I have any need or desire to use my eye scan in publications; I would just prefer to be asked about it.
Chris Madden:
Here’s what I hope you take away. Protect patient trust like it’s your most valuable asset, because it is. Stay curious, stay compliant, and don’t treat privacy as a barrier. Done right, it’s actually a competitive advantage.
Compliance gets even more complicated when every state in the US can have its own rules. Episode 16, our next episode, examines how to navigate state level fragmentation without losing momentum.