All through this whole ghost fleet saga I've wondered how a large ship at sea can possibly keep its movements secret. Large media organisations seem unable to say where large tankers have been once they turn their transponders off.
Don't we have constellations of satellites constantly imaging the entire earth, both with visual and synthetic aperture radar, with many offering their data freely to the public? Wouldn't a large ship on the ocean stick out somewhat? And yet journalists seem lost without vesselfinder. Is this harder than I'm imagining, or are they just not paying the right orgs for the info?
I've read reports about Havana Syndrome before and remain thoroughly unconvinced. The locations vary wildly, the symptoms vary wildly and can be explained by normal medical phenomena in a way Occam would find more agreeable.
Look at their 'smoking gun' evidence here:
>He tracked down an email, what he considers a receipt, for services provided to the Russian government by a member of Unit 29155 for "potential capabilities of non-lethal acoustic weapons."
Acoustic crowd control weapons are not mysterious; there are people on YouTube building and testing them! American companies will sell them to any oppressive government around the world (I believe the ones used against Serbian protestors were American). Yet this description contradicts the speculation about microwaves just a bit further down in the article.
Yup. There's no hard evidence and so it still comes off as mass psychosis / psychosomatic / placebo effect with wildly-varying "symptoms". Surely, there would be some sizable "weapon" consuming massive amounts of energy nearby that would be visible and captured on video if it were true.
I have been a skeptic of this until now. The explanation given by the researcher interviewed seems more than plausible to me.
It’s not the typical misunderstanding of non-ionizing radiation. The variable symptoms make a lot of sense, given that the weapon is basically just causing random electrical “failures” in the body. This was not a precision op. They saturated a location with this engineered interference signal, with the goal of maiming the target. No regard to whether their families and children would be collateral damage. It’s a war crime on multiple levels, on our soil. Then we presumably went and did the same thing during the Maduro raid at scale.
Just what we needed in 2026, more man-made horrors beyond our comprehension.
according to James C. Lin: "A high-power microwave pulse-generated acoustic pressure wave initiated in the brain and reverberating inside the head could bolster the initial pressure, causing injury of brain matter. Thus, it is conceivable that the microwave auditory effect or the microwave pulse-induced pressure shock wave inside the head could become a potentially lethal or nonlethal weapon against animals and humans." https://ieeexplore.ieee.org/document/9366412
My entire dev workflow was Windows-centric. Visual Studio was a core tool, C# was the only platform I was deeply experienced with, and Xcode was/is alien technology to me.
Claude Code erases all of those constraints and the M4/5 chips are blazing fast.
I had the same issue when I first put up my gitea instance. The bots found the domain through the certificate transparency logs within minutes, before there were any backlinks: GPTBot, ClaudeBot, PerplexityBot, and others.
I added a robots.txt with explicit UAs for known scrapers (they seem to ignore wildcards), and after a few days the traffic died down completely and I've had no problem since.
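For anyone wanting to do the same, a minimal robots.txt along these lines might look like the sketch below. GPTBot, ClaudeBot, and PerplexityBot are the user-agent tokens those crawlers publish; adjust the list to whatever shows up in your access logs.

```
# Block known AI scrapers by explicit user-agent token,
# since some reportedly ignore the * wildcard.
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

# Wildcard fallback for everything else you don't want crawled
User-agent: *
Disallow: /
```

The `Disallow: /` under each token blocks that crawler from the whole site; drop the wildcard stanza if you still want search engines to index your public repos.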
Git frontends expose effectively infinite URLs (every commit, diff, and blame view), so they're basically a crawler tarpit and uniquely vulnerable to this. But I wonder if these folks actually tried a good robots.txt? I know it's wrong that the bots ignore wildcards, but explicit entries do seem to solve the issue.
I will second a good robots.txt. Just checked my metrics and < 100 requests total to my git instance in the last 48 hours. Completely public, most repos are behind a login but there are a couple that are public and linked.
Cloudflare actually offers this as a free-tier feature. Even if you don't want to use it for your site, you can set up a throwaway domain on Cloudflare and periodically copy the robots.txt it generates from your scraper allow/block preferences, since they keep it up to date with the latest crawlers.
> I wonder if these folks actually tried a good robots.txt?
I suspect that some of these folks are not interested in a proper solution. Being able to vaguely claim that the AI boogeyman is oppressing us has turned into quite the pastime.