Flock Safety, Dunwoody, and New Jersey: The Hidden Surveillance Infrastructure No One Voted For
Across New Jersey, small black boxes are appearing on poles at neighborhood entrances, intersections, and commercial corridors. They’re marketed as Flock Safety cameras – a “smart” tool to deter crime, recover stolen cars, and help police respond faster.
Local officials repeat the vendor’s talking points: automatic license plate readers, privacy by design, 30‑day data retention, “we own the data, not Flock.” Residents are told not to worry.
But when you step away from the marketing and look at internal logs from real deployments – especially the verified Flock event logs from Dunwoody, Georgia – a very different picture emerges:
“License plate readers” quietly upgraded to full live‑view cameras.
Data shared with over 1,200 external agencies, contrary to public assurances.
Private camera networks labeled “Do Not Share” shared anyway.
Flock employees in other states logging in to view cameras aimed at pools, gyms, preschool hallways, and gymnastics rooms.
Phantom accounts and system users performing privileged actions with incomplete audit trails.
For New Jersey residents, lawyers, journalists, and policymakers, this is not an abstract “other state’s problem.” Flock is actively selling and deploying the same architecture here, under the same narratives.
Flock Safety is best understood as a data platform, not just a hardware vendor. The cameras are the sensors; the real power lives in Flock’s cloud software, FlockOS.
Flock’s core devices fall into two broad categories:
Automatic License Plate Readers (ALPRs)
Capture high‑resolution images of passing vehicles.
Extract plate number, date, time, and GPS coordinates.
Tag “vehicle fingerprint” attributes: make, model, color, body style, visible damage, roof racks, bumper stickers.
Enable searches like “blue Honda sedan with front‑left damage and a roof rack” without knowing the plate.
Live‑view video cameras (e.g., Condor)
Provide continuous or on‑demand video streams.
Deployed at parks, dog parks, trails, intersections, city facilities, HOAs, schools, religious campuses, and private entities.
Often support pan‑tilt‑zoom and low‑light capabilities.
In practice, many deployments that began as “LPR only” have been quietly upgraded to live‑view video without a fresh public debate or contract rewrite. Residents who think they approved a plate scanner are now living under a city‑wide video grid.
In FlockOS, authorized users can:
Run exact or partial plate searches across all cameras they can access.
Search by vehicle fingerprint: color, make, model, body style, roof racks, dents, decals.
Use association / convoy analysis to find vehicles that frequently appear together, effectively mapping travel companions and potential “associates.”
View live or recorded video from any shared live‑view camera (parks, schools, campuses, HOAs, businesses).
Critically, Flock encourages agencies to share their networks with each other. A small town’s camera grid can quickly become part of a regional or national search space, depending on configuration and vendor‑enabled features.
This is a qualitatively different system from a single, stand‑alone camera.
Produced by LegalPodcasting.com
in association with the NichePodcastPodcast.com
You’re driving. It’s late. Maybe you’re on Route 70 cutting across South Jersey, or rolling up or down the Turnpike to your daily exit. Headlights in front of you, headlights behind you. Same commute you’ve done a thousand times. You don’t see it. You don’t hear it. But a little black box on a pole the size of a shoebox just watched you go by and kept a record: your license plate, the exact time, your lane, your speed, what kind of car you drive, even the dents, the roof rack, the bumper sticker you stuck on your back window. By the time you get home, a private company you’ve never heard of has quietly added another line to the permanent story of where you’ve been and when.
That company is called Flock Safety. And tonight on the New Jersey Criminal Podcast, we’re not talking about one bad cop, one crazy stalker, a serial killer, or a rogue detective. We’re talking about an entire surveillance platform that’s wiring up New Jersey’s roads, HOAs, schools, parks, and shopping centers while most of us are not paying any attention whatsoever. In this first segment, we’re going to stay out of the courtroom and inside the machine. What are these cameras? What exactly do they collect? Where does the data go? Who can search it? And why does a town in Georgia, almost 800 miles away, matter for people driving through Cherry Hill and Newark and Toms River?
For now, just picture that black box on the pole as you drive, because by the time we’re done with this, you’re going to see it very differently. You might even start to actually notice it.
So, the sales pitch versus the reality is where we’re going to start. If you go to a council meeting in a small New Jersey town, the pitch for Flock sounds pretty simple: automatic license plate readers help us find stolen cars, help us find missing kids faster, deter crime. The mayor in Monroe Township, for example, put out a column calling Flock license plate readers a “valuable tool in deterring crime,” and talking about how these cameras will help keep neighborhoods safe. Well, that sounds reasonable. Who’s against catching car thieves and hit-and-run drivers?
Flock leans hard into that story. Their marketing is full of phrases like “solve and deter crime,” “AI-powered precision policing,” and “helping neighborhoods work together.” They talk about “privacy by design,” data retention limits, and they say that local departments, not Flock, own the data. You hear that and you think, okay, it’s like a Ring doorbell for the town. Some extra eyes on the road. No big deal. That’s the sales pitch.
The reality is a lot bigger and a lot stranger, because Flock is not just a camera. It’s a network. And once you understand the network, you start to see why cities around the country—Mountain View, California, Hillsborough County down in Florida, and others—have already hit pause or canceled contracts entirely over what this system actually does.
Let’s start with the basics: the hardware. The devices themselves are not just cameras. Flock sells different pieces of hardware, but they all plug into the same nervous system.
The first piece is the license plate machines: the automatic license plate readers, branded with names like Falcon, that most people picture. You’ve seen them without realizing it: a small dark box strapped to a metal pole near the entrance of your development, a strange-looking unit mounted on a traffic light over a country road. They’re not pointed at your face; they’re aimed low enough to catch every plate that crosses their field of view. Every time a car passes, they capture a high-res image of the vehicle, read the plate, log the date, the time, and the GPS coordinates, and often tag additional metadata like the make, model, color, and body style of your automobile.
Flock has something it markets as a “vehicle fingerprint.” That’s their term. The idea is that your car has a unique signature, just like your fingerprint. So instead of “blue Honda Civic,” the system learns to recognize “Honda Civic ‘22–24, has a roof rack, a little bit of front-end damage on the left, with that sticker you put in your back window.” So now you don’t have to know the license plate to find the car. Now you can search for—or someone can search for—your automobile with descriptors like “blue sedan, Honda, roof rack, left front-end damage,” and the system will come back with hits that match.
That’s not equivalent in any way to a cop eyeballing a few dozen still images. That’s software slicing through millions of scans from dozens of towns in seconds.
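To make that concrete: the core of an attribute search is just filtering scan records on metadata. Here is a minimal sketch in Python; the field names and records are invented for illustration, since Flock's actual schema is not public.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Scan:
    """One ALPR read. Fields are illustrative, not Flock's real schema."""
    plate: str
    make: str
    color: str
    body: str
    features: frozenset  # e.g. {"roof_rack", "front_left_damage"}

def fingerprint_search(scans, make=None, color=None, body=None, features=()):
    """Return every scan matching all supplied attributes; no plate required."""
    wanted = frozenset(features)
    return [
        s for s in scans
        if (make is None or s.make == make)
        and (color is None or s.color == color)
        and (body is None or s.body == body)
        and wanted <= s.features
    ]

# Three invented reads standing in for millions of real ones.
scans = [
    Scan("ABC123", "Honda", "blue", "sedan",
         frozenset({"roof_rack", "front_left_damage"})),
    Scan("XYZ789", "Honda", "blue", "sedan", frozenset()),
    Scan("JKL456", "Ford", "white", "pickup", frozenset({"ladder_rack"})),
]

# "Blue Honda sedan with a roof rack and front-left damage" -- no plate given.
hits = fingerprint_search(scans, make="Honda", color="blue", body="sedan",
                          features={"roof_rack", "front_left_damage"})
```

Run against an indexed database instead of a Python list, the same filter covers millions of scans from dozens of towns in seconds, which is exactly why it outclasses a detective eyeballing stills.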
The live view cameras are the other side of the hardware line—live video. Flock brands these under names like Condor and sells them as an answer to trespassing and unauthorized access for businesses, homeowners associations, parks, schools, and city facilities. These too are not just license plate readers. These are full video cameras watching a park playground, a dog park, a library entrance, a church parking lot, or a preschool hallway. They can be fixed or pan–tilt–zoom. They work in low light. They can be mounted on trailers and rolled around town for special events or “crime surges.”
And just like the plate readers, they’re not sitting on a DVR in the back office of the police station. They are streaming into Flock’s cloud. Those live view cameras are going to matter a lot later when we get into how they’re used and who gets to look through them, because that’s where this story stops being abstract and starts feeling a little bit personal—and frankly, super messed up.
For now, think of it this way: on one side, you have machines that log where your car goes. On the other side, you have machines that can watch where you are. Both feed into the same software brain.
So, let’s go inside the operating system. Everything you just heard about, the plate snaps, the vehicle fingerprint, the live view video, all flows into a central platform Flock calls FlockOS. This platform, the brain, is where the real power is: the search engine, the pattern analysis, the AI summaries of you and your life.
Imagine you’re a detective. You sit down at a terminal and open FlockOS. The interface looks like a clean, modern search dashboard. On the left are filters like plate number, state, time, date range, make, model, color, unique identifiers like decals, a roof rack, aftermarket wheels. On the right, a map with hits popping up like pins.
You can type a full plate and get every sighting of that vehicle for the last 30 days, 60 days, or however long your agency keeps data. You can type a partial plate and let Flock fill in the gaps. Skip plates altogether if you want and run it by description: “white pickup truck, ladder rack, black rims, no tailgate, between these dates and in this radius.” That’s one vehicle.
Now zoom out. The system can look at patterns. Which cars always show up near each other? Who is traveling in convoy with whom? Flock actually markets a feature for this. It looks for vehicles that repeatedly appear together in space and time and represents them as “associates.” It doesn’t care whether it’s a drug crew or two neighbors who commute together and grab coffee every Tuesday. From a policing perspective, that’s pretty powerful. From a civil liberties perspective, it’s preposterous and terrifying.
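The logic behind an “associates” feature like that doesn’t require anything exotic. Here is a minimal sketch, assuming the analysis simply counts how often two plates pass the same camera within a short window; Flock’s actual algorithm is not public, so every name and threshold below is invented.

```python
from collections import Counter
from itertools import combinations

def convoy_pairs(sightings, window=60, min_hits=3):
    """Flag plate pairs seen at the same camera within `window` seconds
    at least `min_hits` times. sightings: (timestamp_sec, camera_id, plate)."""
    by_camera = {}
    for ts, cam, plate in sightings:
        by_camera.setdefault(cam, []).append((ts, plate))
    counts = Counter()
    for reads in by_camera.values():
        for (t1, p1), (t2, p2) in combinations(reads, 2):
            if p1 != p2 and abs(t1 - t2) <= window:
                counts[tuple(sorted((p1, p2)))] += 1
    return {pair for pair, n in counts.items() if n >= min_hits}

# Two invented commuters who pass the same camera 20 seconds apart,
# three mornings in a row.
sightings = []
for day in range(3):
    base = day * 86400
    sightings += [(base, "cam_main_st", "AAA111"),
                  (base + 20, "cam_main_st", "BBB222")]

pairs = convoy_pairs(sightings)  # flags the carpool as "associates"
```

Note what the sketch makes plain: three shared commutes are enough to pair two plates, and the code has no concept of innocence.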
Now add live video to that. If your agency has live view cameras, the same platform can show you which cameras are online—a grid of thumbnails: park, school, hotel, homeowners association, city hall. With one click, you’re watching the feed. And if you’re thinking that sounds like a private Ring network merged with a police intel platform, you’re not far off.
Here’s the twist that really matters for New Jersey: your town is not an island, and neither is its little local Flock project. Flock’s entire business model is built around connecting agencies to each other. When a police department or sheriff’s office signs up, Flock nudges them to share their camera network with neighboring jurisdictions, task forces, regional partners, and so on.
Some of that makes sense on paper. If a stolen car blows through the border between Town A and Town B, both agencies want the plate hit. But in practice, it means your license plate scans taken in your quiet town can end up searchable by state police, neighboring counties, out-of-state agencies in regional task forces, and federal agencies depending on agreements. And we’ve already seen examples of where this flies out of control.
In California, the city of Mountain View discovered that a Flock “National Search” feature had been turned on—which meant agencies outside of Mountain View could search scans inside Mountain View, potentially in violation of California’s strict ALPR laws. The city responded by pulling the plug on all 30 of its Flock cameras while it figured out how far that sharing went, who’d been looking, and what the circumstances were.
In Florida, Hillsborough County ended up in the news over concerns about how Flock was handling data and who had access to it. Surprise, surprise: we keep finding bad actors doing things they shouldn’t be doing with this information. And it gets so much worse, we’ll get there, this is a long episode, and those are just the places that caught it. What about smaller towns that don’t have an internal privacy office or a tech team combing through Flock settings? Looking at you, Monroe, New Jersey. What about your town?
So, let’s talk about what Flock claims about privacy. At this point, it’s worth taking a breath and talking about what they say in their own defense. If you read Flock’s website or FAQ, you’ll see a few core promises repeated over and over:
– Local agencies own the data; Flock is just a vendor, a neutral host.
– Data is retained for a short period, often marketed as 30 days, unless the local agency changes that.
– The system does not use facial recognition.
– There are access controls, audit logs, and they say their employees can’t just sit around watching your footage for fun.
All those promises are meant to reassure you. It’s meant to sound like: we built powerful tools, but your police department is in the driver’s seat; we can’t see anything without their permission; you’re safe.
There are three problems with that story, minimum.
Problem one: architecture beats policy. When you build a system that can track the movements of every car that passes a camera, store that data in a centralized cloud, correlate it with make, model, color, and unique features, and map associations over time—then overlay that with live video feeds from schools, parks, and neighborhoods—you have built a surveillance machine. Once it exists, you are relying on policy, mostly quiet policy that no one reads, to prevent abuse.
When “abuse” is on the table, we mean real abuse, and we’ll get to that later. You’re relying on a policy that says “delete after 30 days rather than 90 or 180.” A policy that says “no access without a case number.” A policy that says “no sharing with federal agencies unless there’s a warrant.” And you’re trusting hundreds of small towns under pressure to “do something about crime” to configure that perfectly and never drift. If you pay attention to how local government usually works, you know how unrealistic that is.
Problem two: documented misuse. This isn’t hypothetical. Investigations by civil liberties groups have already turned up examples of plate reader systems, including Flock, being used to track protesters and activists, monitor marginalized communities, investigate visits to reproductive health clinics. In one Kansas case, a police chief used Flock cameras over 200 times to stalk his ex-girlfriend and her new partner—not solve crimes, not find stolen cars, not find a missing child faster, just watch someone’s movements out of obsession and spite.
So Flock can point to its policies all day. When a system is this powerful, you only need one bad actor with a login and a grudge.
Problem three: leaks and outsiders, what happens when that data spills outside the circle it was allegedly meant for. We’ve already seen a breach and misconfiguration that exposed live police camera feeds, live video, to outsiders who could not only view footage but delete it. We’re talking about ordinary citizens, you and me, logging into a police Flock system, watching any camera they want, and deleting any footage they want. We’ve also seen reports of federal agencies querying Flock data in ways that went beyond what local contracts even allowed, prompting investigations into unauthorized access.
The more eyes you invite in—local police, county, state, federal, neighboring towns, task forces, private homeowners associations—the more potential there is for that data to be misused, copied, leaked. That’s the system in place. And remember, the pitch to local government is: “we make it easy to share.”
All of that is the meta level. Now we come back to the ground in New Jersey, because it’s one thing to talk about Mountain View and Kansas. It’s another to ask what this looks like on the roads you actually drive.
Let’s talk about living inside the web. New Jersey is built for this kind of surveillance: a dense road network, toll roads, bridges and tunnels, bedroom communities feeding into Philadelphia and New York, malls, warehouses, massive shopping centers, gated developments. You already know you’re on camera at the mall and the gas station and at E-ZPass. What Flock does is connect those moments into a continuous narrative.
Picture a normal New Jersey commuter. Camera at the entrance to their development. Camera along the county road near the high school. Camera at the on-ramp to 295. Cameras sprinkled along 295. Cameras at the off-ramp near their job. Cameras at the grocery store lot on their way home. Cameras at their kids’ soccer field. Cameras outside their synagogue or church.
If even a fraction of these are hooked into Flock, you’ve created a detailed, timestamped map of that person’s life: when they leave work, when they come home, where they shop, where they worship, where they routinely drive, who they routinely drive alongside, whether they ever visit a clinic they’d never talk about, whether they attended a protest. That map is searchable. It can be exported. It can be shared.
And we’ve already told a story about regular folks like you and me logging into these systems and looking at whatever they want. In town after town, the only public conversation has been a brief presentation about helping to solve crime and a quick vote to approve the contract. In some places like Monroe, New Jersey, the official messaging is basically: “We’re installing Flock surveillance. This will help us deter crime. Don’t worry, it’s for safety.”
What you almost never get is: a detailed discussion of how long the data is kept; an explicit list of every outside agency that can query your town’s data; a commitment not to share with other agencies outside of scope, or not to use data to track reproductive health visits or protests; a public, independent audit of who’s actually using the system and why.
Can you picture your little town in New Jersey—Millville, Westfield, Hoboken—keeping up with oversight of that? That’s why they’re not telling you. That’s why they don’t want a debate. Because technology doesn’t care whether you had a robust town-hall argument before you installed it. It just keeps scanning plates and logging entries.
You can feel the shape of where this is leading. We’ve talked about what Flock is, how it works, and why it’s fundamentally different from “just some cameras.” From here, the story moves from architecture to evidence: specific logs, specific sessions, specific people clicking through live cameras and plate histories in ways that raise very real questions about power, consent, and what it means to be watched every time you step out the door.
Now we’ve looked at the skeleton of the system. We’ve got cameras on poles, vehicle fingerprints, live view video, a cloud platform that can stitch your movements—i.e., your life—together across towns and states and space and time. That’s the architecture.
Now we’re going to talk about what happens when that architecture collides with reality: a real city, real oversight of a real small city, and a resident stubborn enough to get the receipts. The city is Dunwoody, Georgia. The company is still Flock Safety. And what you’re about to hear does not come from a press release or a think tank report or some anonymous tip. It comes from internal logs exported from the Flock system, produced by the city under public records law. This is verified Dunwoody, Georgia data, straight out of Flock’s own platform.
Let’s go inside. Dunwoody is a suburban city just outside of Atlanta—tree-lined streets, parks, community centers, hubs for families with pools, gyms, preschool, summer camp, all of it. It’s also a city, like a lot of towns in New Jersey, that bought into the Flock vision: license plate readers to fight crime, live view cameras to protect public spaces, sharing with neighboring agencies to, you know, “keep everybody safer.”
If you show up to city council meetings, you hear the same language you hear in Monroe or Evesham or half a dozen other New Jersey towns that have rolled out these cameras. The difference is that in Dunwoody, somebody started pulling the logs, not the marketing slides. And this is going to start happening in New Jersey, and it’s going to be fun and vindicating for those of us who’ve been screaming about this for several years.
The event logs are timestamped entries of who logged into the system, which camera they clicked, which permissions they changed, which agencies were granted access, which networks were shared. It’s the kind of data that, if you’re building a system like this, you only keep if you’re serious about accountability—or if you don’t expect anyone to look.
Part one of the Dunwoody saga: “Only two agencies”…and 1,271 others.
We start with a sentence from a council meeting in March. A Dunwoody police lieutenant, Lieutenant Feck, stands up in front of the city council and tries to calm a nervous public. Residents are concerned about these live view cameras, who can see them, and whether they’re being watched by strangers. The lieutenant tells them this: only two outside agencies, Brookhaven and Chamblee, can view Dunwoody’s live view cameras. Anything live view, he says, is “definitely strictly reviewed and on a case-by-case basis.” Two agencies, strictly reviewed, case-by-case. That’s the official line. That’s the standard Flock story.
Then you look at real life. In Dunwoody’s event logs, from the start of 2025, you see 1,271 external agencies have been granted permission to view live streams. Three hundred fifty-eight external agencies have been granted the ability to record those streams. This is not Brookhaven and Chamblee. This is a small city’s camera network being quietly wired into over a thousand other entities.
It gets worse. In those same logs, you see exactly zero entries showing those external agencies initiating live view sessions. Either those agencies never used the access they were granted, or the audit logs simply don’t capture that usage. If it’s the first option, then why did Dunwoody grant live view access to more than a thousand outside agencies “just in case”? And if it’s the second option, then the basic promise that the system is auditable—that someone can go back and see who watched what—is broken.
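In audit terms, the contradiction is a one-line set difference between the grant log and the session log. A minimal sketch with invented placeholder names standing in for the real export:

```python
def audit_gap(granted, sessions):
    """Agencies holding a live-view grant that never appear in session logs."""
    return set(granted) - set(sessions)

# Placeholder agency names; the real export lists 1,271 external grantees.
granted = {f"External Agency {i:04d}" for i in range(1271)}
sessions = set()  # the Dunwoody logs contain zero external live-view sessions
unused = audit_gap(granted, sessions)  # every single grant is unaccounted for
```

A healthy audit system makes that difference small and explainable. Here it is the entire grant list, which is exactly the either/or above: unused access handed out wholesale, or a session log that is blind.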
From an accountability standpoint, that’s a nightmare. From a civil liberties standpoint, it’s worse, because you can’t even tell the public, “No, nobody’s watching your kids at the pool,” if the logs don’t reliably show who is or isn’t watching.
Part two: when plate readers become full-blown spy cameras.
One subtle thing that comes out of the Dunwoody documents is how the language we’re using doesn’t match what’s installed. If you listen to officials talk, they still call them “LPRs” or “license plate readers.” And so you know the mental image: a little box by the road reading plates like a toll camera. That is not what Dunwoody has, and that’s not what’s going in New Jersey.
According to Flock’s own product announcements, their so-called ALPRs in places like Dunwoody have been upgraded to live view cameras. These aren’t just snapping plates; now they’re streaming full video that can be pulled up on demand as well. In Dunwoody, those live view cameras are pointed at parks, trails, city hall, traffic intersections, and private campuses.
There was no clean public moment where the city came back to residents and said, “Hey, remember that vote for some plate readers? We’ve quietly turned them into always-on video cameras. They can be viewed live, they can be recorded, and they’re being shared with thousands of outsiders.” Instead, there was just a quiet software update, a quick product upgrade, a blog post from Flock touting new features. On the ground in Dunwoody, it means what the public thought they were approving is not in any way, shape, or form what’s actually running.
If that’s the pattern there, what’s going on in Monroe, New Jersey?
Part three: “Do not share” doesn’t mean what it says.
Let’s get into the private cameras. The Marcus Jewish Community Center is a private campus. It’s got pools, gyms, a preschool, camp, religious and cultural programming. Parents drop their kids off there. Families spend hours at the pools and fitness studios.
Inside Flock, MJCC shows up as its own camera network with a clear name: “Dunwoody, Georgia PD – Atlanta JCC – [something like] ‘Do Not Share.’” That’s how it’s labeled. Now, if you’re a normal person, you’d assume that meant “do not share.” This network is restricted. Don’t send it to outside agencies.
But the event logs say otherwise. On June 4th, 2025, Dunwoody PD shares access to this MJCC network—this “do not share” network—with multiple external agencies. “Can view live stream.” “Can view recorded live stream.” “Can download recorded live stream.” That means those outside agencies could pull up live video from JCC’s private cameras, watch recorded video, and download clips and walk away with them.
This isn’t a one-off typo. It shows up in multiple exports: the event logs and the shared network logs. A month later, that sharing is still in place, even after it’s flagged to the chief in January. Later logs still show Dunwoody’s Flock network sharing data with other entities inside Flock’s ecosystem that are literally labeled “Do Not Use” and “Delete.”
Let that sink in. You have entities in the Flock network that are named “Do Not Use” and “Delete,” and instead of being removed, they’re in the system—and Dunwoody is sharing its citizens’ data with them. If that’s what’s visible in one city’s logs, imagine the level of hygiene across this whole network with thousands of agencies, private homeowners associations, business improvement districts. That’s a recipe for disaster. That’s what these municipalities are opting into with a quick “hey, let’s do safety” sales pitch.
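This is the kind of hygiene check the logs make possible, and that nobody at the city apparently ran: scan the shared-network export for networks whose own names forbid sharing. A minimal sketch; the share records below are invented, shaped like the entries described above.

```python
RED_FLAGS = ("do not share", "do not use", "delete")

def flag_shared_networks(shares):
    """Return shares whose network name itself says it should not be shared.
    `shares` is a list of (network_name, recipient_agency) pairs."""
    return [
        (name, recipient) for name, recipient in shares
        if any(flag in name.lower() for flag in RED_FLAGS)
    ]

# Invented records shaped like shared-network log entries.
shares = [
    ("Dunwoody GA PD - Atlanta JCC - Do Not Share", "External Agency A"),
    ("Dunwoody GA PD - Traffic Main St", "Brookhaven PD"),
]
flagged = flag_shared_networks(shares)  # catches the JCC share immediately
```

Ten lines of code would have caught the MJCC sharing the day it happened. The logs existed; nobody was reading them.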
Part four: phantom users and owner-level access.
So far, we’ve been talking about which agencies get which permissions. Now we zoom in further: the people. In any system like this, you expect local officers with user accounts, admins in the department with higher privileges, and a limited number of vendor staff with support access. Dunwoody’s records show something more tangled.
In the user exports and event logs, you see Flock employees with owner-level access to Dunwoody’s network. This isn’t support. This is: create and delete users, change roles and permissions, create API connections to third-party companies that Dunwoody does not have contracts with, turn off multi-factor authentication requirements for local users, disable whether a user’s searches show up in audits.
You also see at least one account in that system with a name that should never exist in a production law-enforcement environment: “Invalid User UUID.” Not Officer Smith, not Support, not System. Just a cryptic label indicating a user ID that’s been invalidated somewhere upstream—and yet this invalidated user ID appears in the logs performing 20-plus tasks, updating roles, touching permissions as recently as March 9th, 2026.
If you’re a defense attorney, an auditor, or just a citizen who gives the bare minimum of a damn, what do you do with that? How do you cross-examine “Invalid User UUID”? How do you subpoena them? How do you even know who that was?
Then there are ghost accounts. At one point, the resident who pulled these logs discovered that a user named “John Watson” had been involved in removing, changing, or sharing permissions on the MJCC network—remember, that’s the community center with the little kids in the pool. But when they pulled the master user export, the file titled something like “users_arch_26_2020.csv,” which is supposed to include every active and inactive historical user, John Watson is nowhere to be found. He exists in the event logs, but he doesn’t exist in the user export.
The same thing happens with certain Flock employees who appear in some parts of the data but not others. It’s like looking at footprints in the mud without any record of who owns the shoes. For a system that’s sold as “auditable,” that is a disgrace and a fundamental problem.
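Checking for ghost accounts is, again, mechanical: take every actor named in the event logs and subtract everyone in the master user export. A sketch in Python; the export contents are invented, and only the two actor names quoted above come from the logs themselves.

```python
import csv
import io

def ghost_actors(event_log_actors, user_export_csv):
    """Return event-log actors missing from the master user export."""
    known = {row["name"] for row in csv.DictReader(io.StringIO(user_export_csv))}
    return sorted(set(event_log_actors) - known)

# Hypothetical export; the real file is the full historical users CSV.
user_export = "name,role\nOfficer Smith,admin\nLt. Jones,user\n"
actors = ["Officer Smith", "John Watson", "Invalid User UUID"]
ghosts = ghost_actors(actors, user_export)
```

Anything this check returns is, by definition, footprints without shoes: an actor the system can no longer name. In an auditable system the result is empty.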
Part five in the Dunwoody–Flock “safety” saga: named Flock employees and what they watched.
This is the part that’s super creepy. Up until now, we’ve talked about permissions and sharing in the abstract. The Dunwoody logs let you see individual user sessions: timestamped, camera by camera, who clicked what in what order, how long they stayed, when they moved on. Over and over, you see Flock employees—not Dunwoody officers—corporate staff using the system to access live view cameras in Dunwoody, Georgia, to watch little kids play in a pool. Not just traffic cams. Private spaces. Children’s spaces.
Let’s talk about these folks. To be clear, what’s being described here is what the logs show these accounts doing. The logs don’t give us motive; they don’t tell us what was in these folks’ heads. But the patterns are deeply concerning.
First, Randy Gluck, Business Development – 911 Emergency Department for Flock Safety. Randy is a Flock business development manager based in Raleigh, North Carolina. The logs show him with live view access to Dunwoody’s cameras. Why does a sales manager in another state need live view access to Georgia’s suburban cameras?
On July 21st, 2025, Randy Gluck logs in. He clicks through a traffic camera at 608 Ashford Dunwoody Road. He clicks another at Ravinia Parkway and Ashford Dunwoody. A few more roadway cameras. At 1:41 p.m., Randy clicks “Dunwoody Library – Main 03.” Nothing else for two hours. No case number, no annotation. Just traffic, traffic, traffic…and then the library. What are you looking at in the library, Randy?
Two days later, July 23rd, 2025, Randy is watching the baseball field at the community center in Dunwoody, then the clock tower, then the gym inside the community center, then he moves to the pool. After that click, no further activity for 210 minutes.
In plain English: a sales manager in North Carolina spent part of his Wednesday clicking into a baseball field at a Jewish community center, an indoor gym, and then turned on the camera in the pool and left that on for what appears to be hours. The logs don’t tell us why or what Randy’s thinking or what Randy likes. They only tell us what happened on the screen. But if you’re a parent at that pool and you hear that a sales manager in another state watched your kids’ pool camera for an unknown length of time with no case documented, that lands differently.
Now, Bob Carter, VP Strategic Relations and Business Development at Flock Safety. Bob is not an investigator. He’s not a sworn officer. He’s a senior sales and partnerships guy. From the Dunwoody data, from the start of the prior year, Bob Carter accessed recorded footage in Dunwoody at least 185 times.
One entry jumps out. September 30th, 2025. Bob accesses one camera labeled “Gymnastics.” This is inside the community center gymnastics room. He doesn’t touch any other cameras that day. Bob Carter just sits there watching—presumably—children’s gymnastics in Georgia from his office in North Carolina, where he does entirely unrelated work.
Again, we don’t have motive. We don’t know what Bob’s into. He works for a company that has “Safety” in the name, so presumably Bob would never have ill intent. We just have patterns: senior business executives logging in, viewing pools and kids’ gymnastics rooms.
If you’re on the city council in Dunwoody and you see that, you have to ask not only: What possible legitimate business purpose did that serve? You also have to ask: Are my constituents going to let me get from here to the car when they find out? If you’re in New Jersey, you have to ask: Who at Flock has that kind of access to your town’s cameras? How would you even find out?
Tom the producer
4.9.2026







