That can't be true, right? I mean, Google broke ad blockers in Chrome to prevent this very issue. And it had absolutely nothing to do with Google's ad business.
So it's completely impossible that such malicious extensions still exist.
I used to create a number of simple web pages in XHTML back in the day when we believed XHTML was the future. Recently, while going through and restructuring some of my old "online stuff", I learned that XHTML really isn't in a state where I'd want to use it any more:
* XHTML 1.0 and 1.1 are officially deprecated by the W3C.
* XHTML5 exists as a variant of HTML5. However, it's very clear that it's absolutely not a priority for the HTML5 working groups, and there's a statement that future features will not necessarily be supported by the XHTML5 variant.
* XHTML5 does not have a DTD, so one of the main advantages of XHTML - that you can validate its correctness with pure XML functionality - isn't there.
* If you do a 'view source' in Firefox on a completely valid XHTML 1.0/1.1 page, it'll redline the XML declaration like it's something wrong. Not sure if this is intended or possibly even a bug, but it certainly gives me a 'browser tells me this is not supposed to be there' feeling.
It pretty much seems to me XHTML has been abandoned by the web community. My personal conclusion has been that whenever I touch any of my old online things still written in XHTML, I'll convert them to HTML5.
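On the "validate with pure XML functionality" point above: even without a DTD, any plain XML parser can still check well-formedness (closed tags, proper nesting); what's lost is schema-level validation. A minimal sketch in Python, with made-up sample markup:

```python
import xml.etree.ElementTree as ET

xhtml = """<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Test</title></head>
  <body><p>Hello<br/></p></body>
</html>"""

broken = "<html><body><p>unclosed</body></html>"

def well_formed(doc: str) -> bool:
    """Return True if doc parses as well-formed XML."""
    try:
        ET.fromstring(doc)
        return True
    except ET.ParseError:
        return False

print(well_formed(xhtml))   # True: valid XHTML is well-formed XML
print(well_formed(broken))  # False: </body> closes while <p> is still open
```

This catches markup errors that tag-soup HTML parsers silently recover from, but it won't flag a misspelled element name the way a DTD or schema would.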
> If you do a 'view source' in Firefox on a completely valid XHTML 1.0/1.1 page, it'll redline the XML declaration like it's something wrong
Is the page actually being served as "application/xhtml+xml"? Most XHTML sites aren't, in which case the browser is indeed interpreting those as invalid declarations in a regular old HTML document.
Those red squiggles on view-source: pages in Gecko all have title text with diagnostics. The message (errProcessingInstruction) in recent-ish releases is given as:
> Saw “<?”. Probable cause: Attempt to use an XML processing instruction in HTML. (XML processing instructions are not supported in HTML.)
Okay, story time: back in 2018, the German government's foreign ministry was hacked.
At the time, a colleague of mine (we were both working for the German IT news magazine Golem) found a web page of a government-associated university that was offline, with a message saying it had been taken down due to a security issue.
Putting a few hints together, we figured out that Ilias was hosted there, and that this was how the attack on the government initially started.
We weren't able to figure out which vulnerability was used, but we had some ideas about what it might've been. (Older versions had a default password for the admin account.)
One wonders: here's open-source software that's widely used by universities, even government-associated ones, and it's been the cause of a high-profile attack on a government before. Why doesn't that trigger sufficient funding for regular, high-quality security audits of that software?
I'm certainly a lay person here, so take this with a grain of salt. But my understanding is that this is part of the problem, or more the issue that people criticize.
I think it's largely uncontroversial that the math in string theory could be useful in other areas. But if that's your argument for the legitimacy of string theory, then the question arises of what string theory actually is, and whether it is still part of physics. Because physics has, of course, the goal of describing the real world, and, as I understand it, string theory has so far failed to do that, despite what many people hoped.
If string theory is "just a way of developing math that can be useful in totally unrelated areas", it's, well, part of mathematics. But I don't think that's how the field sees itself.
And why would that be a reason to attack people who don't care at all about the physics, but acknowledge that the mathematical ideas they use originated in string theory? Should they omit that just because the physics side of string theory has been more or less fruitless?
They don't, and they can't cheat physical realities either.
Plants only filter out very small amounts of CO2 from the air over relatively long timeframes. That's why crop-based biofuels require such enormous amounts of space.
"The amount of CO2 removed from the atmosphere via photosynthesis from land plants is known as Terrestrial Gross Primary Production, or GPP. It represents the largest carbon exchange between land and atmosphere on the planet. GPP is typically cited in petagrams of carbon per year. One petagram equals 1 billion metric tons, which is roughly the amount of CO2 emitted each year from 238 million gas-powered passenger vehicles."
Man-made carbon emissions amount to over 40 billion metric tons annually, according to a quick Google search. If plants took in only 1 billion tons per year, worldwide terrestrial plant carbon exchange would amount to less than 2.5% of the CO2 humans release.
From the perspective of averting climate change it is indeed very small.
A team of scientists led by Cornell University, with support from the Department of Energy’s Oak Ridge National Laboratory, used new models and measurements to assess GPP from the land at 157 petagrams of carbon per year, up from an estimate of 120 petagrams established 40 years ago and currently used in most estimates of Earth’s carbon cycle.
Whether 157 billion tons or 120 billion tons, these numbers are large compared to anthropogenic releases. Of course most of this carbon is quickly cycled back out from land plants due to animals/bacteria/fungi consuming the biomass produced by land plants.
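As a rough sanity check on these numbers (assuming the "over 40 billion tons" figure is tons of CO2, while GPP is quoted in tons of carbon, so a unit conversion applies):

```python
# Back-of-the-envelope comparison of the figures quoted above.
# Caveat: GPP is given in tons of *carbon*, while emissions figures are
# usually tons of *CO2*; converting CO2 to carbon divides by 44/12 (~3.67).

gpp_new = 157.0        # Pg C/year, newer Cornell/ORNL estimate
gpp_old = 120.0        # Pg C/year, 40-year-old estimate
emissions_co2 = 40.0   # Gt CO2/year ("over 40 billion metric tons")

emissions_c = emissions_co2 / (44 / 12)   # roughly 10.9 Gt C/year

print(f"Emissions as share of GPP (new estimate): {emissions_c / gpp_new:.1%}")
print(f"Emissions as share of GPP (old estimate): {emissions_c / gpp_old:.1%}")
```

Either way, human emissions come out to well under a tenth of annual GPP, which is consistent with the point that gross plant uptake is large, even though most of it cycles straight back into the atmosphere.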
You still need to turn incredible amounts of biomass into charcoal or other stable forms of carbon to make a dent in atmospheric CO2. It would take decades of hard work on gigantic scales to unburn and bury the fossil fuels we used.
That's the pay-off of our 150-year rush to monetize as much of the Earth's natural resources as possible -- while making stringent efforts to keep quiet, or outright suppress, knowledge of how to utilize the benefits of free solar energy.
Having polluted and despoiled much of the biosphere, of course we'll be donating our supposed wisdom and that hard work to the future generations that will enjoy the fruits of our labors and entreasurement.
They're pretty amazing for the amount of capital cost. $50 in seed and an acre of land can sequester several to over a dozen tons of carbon per year. It might not be space efficient but it requires basically zero infrastructure.
Which is something that some 'environmentalists' don't get when I try to explain it to them.
The other benefits of a biodiverse green belt are great, but if tomorrow I had a concrete system that captures CO2 at 10x the level of trees over its lifetime at a similar density, guess what I would like my futuristic city to look like.
It looks like the shingles vaccine has a protective effect against dementia. (Well, that's in the title.)
This study was possible due to a "natural experiment": one country used a very specific birth-date cutoff for vaccine eligibility, so people born right before and right after that date were very similar, except for the vaccine.
It's not clear why this is the case. It might be that the virus the vaccine suppresses plays a role in dementia development, or it might be that the vaccine causes an immune response that has other, indirect positive effects.