You have been at ChirpGram for three years now. It is not Google or Meta, but it is growing fast and the work is actually interesting. 47 million users, a decent salary, and you just got promoted to senior engineer last month. Things are good. It is a Tuesday night and you are the last one in the office finishing up a deployment. Your phone is dead, the office is quiet, and you are just trying to get this done so you can go home. Then a Slack notification pops up on your work laptop. It was clearly meant for someone else. You almost close it without reading it. You don't. [[Read the message->The Message]]

It is from your VP of Engineering, Daniel Marsh, to the Head of Legal. "Quarterly review flagged three items. The HealthSync pipeline, the Bloom report, and the Aria complaints. None of these leave this thread. Handle accordingly." You sit back in your chair. You know what all three of those are. You have touched the code on two of them. HealthSync is the health tracking feature. Bloom was an internal research project from last year that got quietly shelved. Aria is the name of ChirpGram's AI customer service assistant that launched four months ago. Something is wrong with all three of them. And leadership knows. Which one do you look into first? [[Look into HealthSync->Route 1 Discovery]] [[Look into the Bloom report->Route 2 Discovery]] [[Look into the Aria complaints->Route 3 Discovery]]

You pull up the HealthSync repository. It is the feature that lets users log medications, mood, and health habits inside the app. ChirpGram's privacy policy says all of it is kept private and never shared with third parties. It takes you about twenty minutes to find the pipeline that proves that is a lie. The data is being packaged and sent to three companies. You look them up. Two are advertising firms, but the third, Halcyon Analytics, is a health insurance underwriter. Medication logs. Therapy check-ins. Mood tracking data. All of it going straight to an insurance company.
47 million users think this data is private. [[Keep digging->Route 1 Evidence]] [[This is too big. Close the laptop and go home.->Sleep On It]] [[Text your coworker Jamie->Text Jamie]]

You search the internal drive for anything labeled Bloom. There is a folder buried three levels deep in an archived research directory. It is marked confidential but your credentials still open it. Inside is a 47 page research report commissioned by ChirpGram and completed by an outside firm called Meridian Research Group. The title is "Longitudinal Effects of Algorithmic Feed Exposure on Adolescent Mental Health Outcomes." You start reading. By page twelve you feel sick. The study tracked 8,000 users aged 13 to 17 over 14 months. The conclusion is clear. ChirpGram's recommendation algorithm measurably worsens depression symptoms in teenage users the more time they spend on the app. The researchers recommended immediate changes to the algorithm and a pause on teen user growth. The report was delivered to ChirpGram leadership 11 months ago. Nothing changed. [[Keep reading the report->Route 2 Evidence]] [[Stop reading. You do not want to know more.->Sleep On It]] [[Text your coworker Jamie->Text Jamie]]

You search the internal bug tracker for Aria complaints. There are 34 filed tickets. 31 of them are marked resolved. You open a few. The first one is from a user who told Aria they were feeling anxious about a new medication. Aria told them the dose was probably too high and suggested they cut it in half. The ticket was closed with a note that said "expected behavior, low priority." You open another one. A user told Aria they had been feeling hopeless for weeks. Aria responded by saying "it sounds like you might need a break from social media" and suggested they log off for a few days. You sit there for a second. That is not okay. [[Keep reading the tickets->Route 3 Evidence]] [[Close it. 
This is not your problem.->Sleep On It]] [[Text your coworker Jamie->Text Jamie]]

You spend another hour going through the code. The data is not even properly anonymized. There is a user ID attached to every record that maps directly back to real accounts. You screenshot everything and email it to your personal account. It is 1am. You have evidence. Now what do you do with it? [[Report it through the internal ethics hotline->Internal Report]] [[Find a journalist->Anonymous Leak]] [[Talk to a lawyer first->Consult a Lawyer]] [[Wait and think about it->Do Nothing]]

You close the laptop and drive home. The next morning you come in and the Slack thread from Daniel Marsh is gone. The Bloom report folder requires elevated permissions you no longer have. The Aria ticket board has been reorganized and several tickets are now marked private. Someone cleaned up overnight. The window is closing. [[Try to report what you remember without evidence->Report Without Evidence]] [[Text your coworker Jamie->Text Jamie]] [[Let it go->Do Nothing]]

You text Jamie, another senior engineer who has been at ChirpGram longer than you. "Hey are you awake. I found something on the internal drive tonight. Can we talk tomorrow not at the office." They respond ten minutes later. "Yeah. Coffee at Groundwork at 8. Don't say anything else here." The next morning Jamie listens to everything you found and does not say much for a while. "I have had a feeling about HealthSync for a while now. I did not look because I did not want to know. But now you know, and you told me, so now I know too." They look at you. "What do you want to do?" [[Report it together through the ethics hotline->Internal Report]] [[Jamie knows a reporter. She can make an intro.->Anonymous Leak]] [[One of you reports officially. The other stays quiet as backup.->Split Strategy]] [[Walk away. Pretend this coffee did not happen.->Do Nothing]]

You find the anonymous ethics hotline on the company intranet. Federal law requires public companies to maintain one.
You file a detailed report laying out everything you found. The confirmation email says your report has been received and to expect a response in 10 to 14 business days. On day nine your manager asks you to grab lunch. He seems normal. Talks about the next sprint, asks about your weekend. Then right before you get up he says "hey, HR mentioned someone filed an ethics complaint touching on a few internal projects. Totally routine, just wanted you to know we take that stuff seriously." He is looking directly at you the entire time he says it. The hotline was not as anonymous as you thought. [[Ask HR directly what the outcome of the report was->Push HR]] [[Go outside the company now. You tried the right way.->Anonymous Leak]] [[Back off. At least you tried.->Do Nothing]]

You make a ProtonMail account at the public library on a computer that is not yours. You find a tech reporter at the Washington Post who has covered data privacy before and send her a message explaining what you found. She responds within a few hours. "I am interested. Can you share any documentation?" This is the moment. Documents make the story real but documents could also identify you. [[Send everything. The story is more important.->Leak With Evidence]] [[Describe it in detail but do not send anything.->Leak Without Evidence]] [[Get nervous and do not respond.->Do Nothing]]

Before doing anything else you find an employment attorney who handles whistleblower cases. You pay out of pocket for an hour of her time. She listens to everything and then says something you were not expecting. "You have real protection here but only if you follow the right sequence. If you go to the press first you may lose certain legal protections. If you file with a regulator like the FTC first you get a documented legal record, and under some statutes that can mean full legal cover." "The question is what you actually want out of this." [[I want it to stop. File with a regulator.->Regulatory Report]] [[I want it public. 
Go to press with legal protection first.->Protected Leak]]

Weeks go by. You get a new project. The work is actually pretty interesting and you start to convince yourself you were probably overreacting. Four months later a Wired article drops. It is about data brokers in the health and wellness space. ChirpGram is mentioned in paragraph eleven. Most people do not read that far. Seven months after that a class action lawsuit is filed. In the discovery documents your name comes up. You were logged into the relevant codebase at 1am on the night in question. A company lawyer calls you. "Do not speak to anyone about this. We will handle it." They do handle it. ChirpGram pays a settlement that is sealed under NDA. They admit no wrongdoing. You keep your job. You think about it sometimes. [[See your ending->The Bystander]]

You read the whole thing. On the last page there is a list of ChirpGram executives who received the final report. Six names. Daniel Marsh is on the list. You also notice a footnote on page 31 referencing a separate internal data audit called the HealthSync review. You recognize that name. [[Look up the HealthSync reference->Route 1 Discovery]] [[You have seen enough. Decide what to do with the Bloom report.->Route 2 Decision]]

You have a 47 page study that proves ChirpGram knew its app was hurting teenagers and did nothing. The tricky part is that this is not clearly illegal. It is just wrong. What do you do? [[Report it through the internal ethics hotline->Internal Report]] [[Find a journalist->Anonymous Leak]] [[Try to change the algorithm from the inside->The Insider]] [[Contact the researchers who wrote the report->Contact Meridian]] [[Wait and think about it->Do Nothing]]

You decide not to go outside the company. Not yet. You spend the next three weeks carefully documenting everything while continuing to do your normal job. You also start quietly making changes. Small ones at first, things that could be written off as routine maintenance.
You tighten the anonymization on the HealthSync pipeline. You add safety flags to Aria's response system for certain keywords. You write a detailed internal memo about the Bloom report findings and send it to four directors you trust, framing it as a proactive risk analysis rather than an accusation. Two of them respond. One of them escalates it. A month later there is a company-wide meeting about a new data privacy initiative. Daniel Marsh presents it like it was his idea. The HealthSync pipeline is restructured. Aria gets a full safety overhaul. A new teen user experience team is formed. Nobody mentions the Bloom report by name but the changes it recommended are all in the plan. You never get credit. You are not sure you want it. [[See your ending->The Insider Ending]]

You track down the lead researcher from the Bloom report, a woman named Dr. Sarah Okonkwo, through her university profile. You email her from a personal account explaining who you are and what you found. She calls you the next day. "We submitted that report and never heard back. We assumed they implemented the recommendations quietly. Are you telling me nothing changed?" You tell her nothing changed. She is quiet for a moment. "I still have all the raw data. And I know three other researchers who would be willing to go on record." Between your internal access and her external research there is now a complete picture from both sides. [[Go to a journalist together->Leak With Evidence]] [[File a joint complaint with the FTC->Regulatory Report]]

You read all 34 tickets. The pattern is consistent. Aria is responding to mental health disclosures with generic advice that ranges from unhelpful to genuinely dangerous. And every single ticket has been closed without a real fix. You also notice that several tickets mention Aria asking users follow-up questions about their health habits and medications. You wonder where those responses are going.
[[Look into where Aria stores user health responses->Route 1 Discovery]] [[You have seen enough. Decide what to do.->Route 3 Decision]]

People are getting bad medical advice right now, today. This feels more urgent than the other things you have seen tonight. What do you do? [[Try to take Aria offline yourself->Take Aria Offline]] [[Report it through the internal ethics hotline->Internal Report]] [[Find a journalist->Anonymous Leak]] [[Document everything first then decide->Consult a Lawyer]] [[Wait and think about it->Do Nothing]]

You have the access to do it. A few commands and Aria goes into maintenance mode. You have done it before during scheduled outages. You do it. Aria goes offline at 1:17am. By 8am there is a company-wide Slack asking who took Aria offline outside of a scheduled window. By 9am your manager is calling you. You explain what you found. Your manager goes quiet for a long moment and then says he needs to loop in legal before you talk further. Two hours later HR calls you into a meeting. [[See what happens->Internal Report Outcome]]

HR sits you down and tells you the issues you raised are being reviewed. Two weeks later you get a written response saying the review found no violations of company policy. The Aria bot gets a minor update. The HealthSync pipeline keeps running. The Bloom report stays buried. You are not fired. You are not thanked either. [[Accept it and move on->Do Nothing]] [[Go to a journalist->Anonymous Leak]] [[File with a regulator->Regulatory Report]]

You file a report through the ethics hotline based on what you remember. No screenshots, no documents. HR investigates. Three weeks later you get a response. "We were unable to substantiate the claims made in this report." Without evidence there is nothing to substantiate. You are not punished. You are also not believed. [[See your ending->The Unheard]]

You and Jamie agree that one of you will file the ethics report officially while the other stays quiet and keeps working normally.
That way if things go badly at least one of you still has access and a job. Jamie volunteers to be the one to report since they have been here longer and have more documentation of their own concerns. You go back to work and try to act normal. It is harder than you expected. Two weeks later Jamie is moved to a different team. No explanation given. Their access to the relevant codebases is quietly revoked. [[You are on your own now. What do you do?->Route 1 Evidence]]

You email HR and ask for a specific update on the outcome of your report. Three days later you get a response. "After a thorough internal review the matters raised were found to be compliant with company policy. This matter is now closed." You know that is not true. You still have the screenshots. [[Take it to a journalist->Anonymous Leak]] [[File a complaint with the FTC->Regulatory Report]] [[Accept the answer and move on->Do Nothing]]

Your attorney files a formal complaint with the FTC on your behalf. Your identity is protected under federal whistleblower statutes. The process is incredibly slow. Eight months pass. Then a year. You get a new job. You stop thinking about it as much. Fourteen months after the complaint you get a call from your attorney. The FTC has opened a formal investigation into ChirpGram's data practices. Eight months after that ChirpGram is fined and required to shut down the HealthSync pipeline and overhaul the Aria system. Nobody ever knows it was you. You read the press release on a Wednesday afternoon between meetings. [[See your ending->The Bureaucrat]]

The story runs five weeks later. It is detailed and damning and includes screenshots of the code and internal documents. You realize too late that one of the screenshots has a file path that includes a directory only three engineers had access to. ChirpGram's security team figures out it was you within 48 hours. You are put on leave. Two weeks later you are terminated for violating your confidentiality agreement.
Your lawyer tells you federal and state whistleblower retaliation laws likely cover you but the case will take 18 months minimum. You win. You get back pay and legal fees covered. The FTC opens an investigation. Congress holds a hearing. ChirpGram is fined and forced to restructure its data practices. Your name is everywhere. Some people call you a hero. The whole thing is exhausting. [[See your ending->The Whistleblower]]

The reporter writes back. "Without documentation my editors will not let me run this. I believe you but I need something concrete." The story dies. You go back to work. Six months later a class action lawsuit against a data broker exposes the same pipeline through a totally different source. ChirpGram is named in the complaint. The Aria bot quietly gets taken down. The Bloom report leaks two years later from a different employee. You never know if any of what you did mattered. [[See your ending->The Ghost]]

You lost your job and spent the better part of two years in a legal fight you did not ask for. You won, technically. Back pay, legal fees, a settlement you cannot talk about publicly. ChirpGram faced real consequences. The practices you exposed were stopped. The people who made the decisions mostly kept their jobs but one senior VP resigned quietly six months after the story ran. People you have never met email you sometimes to say thank you. That part is strange. You are not sure you would do it again the same way. You might do it smarter. But you would still do it. "The Whistleblower" [[Start over->Start]]

You tried. The system made it impossible without documentation and by the time you had the chance to get it the window had closed. The situation eventually came out through someone else, through a different path, two years later. Maybe your tip planted a seed. Maybe it did not. You will never know. Not every attempt at doing the right thing ends in a clear result. That is frustrating but it is also just true.
"The Ghost" [[Start over->Start]]Following your attorney's advice you file a protective disclosure with the FTC first to establish a legal record. Then you contact the journalist. The story runs. ChirpGram comes after you hard. Your attorney is ready. The case takes just under two years. You win. Because you filed with the FTC first your whistleblower protections are airtight. ChirpGram cannot touch you. The fine is significant. The Aria bot is taken down. The HealthSync pipeline is audited and shut down. The Bloom report becomes part of a congressional hearing on social media and teen mental health. Your name is public but so is everything you found. [[See your ending->The Reformer]]The system worked. It took almost two years, it was never dramatic, and most people will never know your name. But the pipeline got shut down. Aria got overhauled. A regulatory precedent was set that will make it harder for the next company to do the same thing. You went back to work at a different company before it was even over. You got a dog. Life kept moving. Sometimes the right move is just filing the paperwork and being patient. "The Bureaucrat" [[Start over->Start]]You did it right. It was still a two year process that cost you a lot of stress and some sleep. But you followed the sequence, protected yourself legally, and the story got out with your name attached in a way you could live with. ChirpGram faced consequences. The practices stopped. You kept your credibility and your legal standing. It was not clean or easy but it was about as good as this kind of thing can go. "The Reformer" [[Start over->Start]]You reported it. Nobody could prove it. The ethics hotline closed the case and you were left standing there knowing what you knew with nothing to show for it. The Cubby vs CompuServe case from 1995 is actually kind of relevant here. The court found that without knowledge you cannot have accountability. The same logic applies to you in reverse. 
Without documentation you cannot have credibility. The situation eventually came out through other means. When it did, the internal ethics report you filed showed up in discovery. It confirmed that ChirpGram knew, or should have known, earlier than they admitted. Your report mattered. It just did not feel like it at the time. "The Unheard" [[Start over->Start]]

Nobody knows it was you. The changes happened, the harm was reduced, and ChirpGram actually improved its practices in a lasting way because you worked within the system instead of burning it down. The tradeoff is that Daniel Marsh got to present your work as a company initiative and nothing about the original misconduct was ever made public. The people responsible faced no real consequences. Whether that is a good ending depends on what you think the goal was. If it was stopping the harm then yes, it worked. If it was accountability then no, not really. You think about that sometimes. "The Insider" [[Start over->Start]]

You stayed quiet. You were safe. The harm continued and when it finally came out your silence became a small footnote in the story. The HealthSync data kept going to Halcyon Analytics for another two years. Aria gave dangerous advice to an unknown number of users before being quietly taken down. The Bloom report eventually leaked through a different employee and became part of a Senate hearing you watched on YouTube during your lunch break. Nobody blamed you specifically. Nobody had any reason to. You had not done anything. That is sort of the point. "The Bystander" [[Start over->Start]]