
This is actually not a bad idea. Why should the browser contain a specific template engine, like XSLT, and not, say, Jinja? It could also be reimplemented using JS or WASM.

The browsers today are too bloated and it is difficult to create a new browser engine. I wish there were simpler standards for "minimal browser", for example, supporting only basic HTML tags, basic layout rules, WASM and Java bytecode.

Many things, like WebAudio or Canvas, could be implemented using WASM modules, which, as a side effect, would prevent their use for fingerprinting.



> This is actually not a bad idea. Why should the browser contain a specific template engine, like XSLT

XSLT is a specification for a "template engine" and not a specific engine. There are dozens of XSLT implementations.

Mozilla notably doesn't use libxslt but transformiix: https://web.mit.edu/ghudson/dev/nokrb/third/firefox/extensio...

> and not Jinja for example?

Jinja operates on text, so it's basically document.write(). XSLT works on the nodes themselves. That's better.

> Also it can be reimplemented using JS or WASM.

Sort of. JS is much slower than the native XSLT transform, and the XSLT result is cacheable. That's huge.

I think if you view XSLT as nothing more than ancient technology that nobody uses, then I can see how you could think this is ok, but I've been looking at it as a secret weapon: I've been using it for the last twenty years because it's faster than everything else.

I bet Google will try and solve this problem they're creating by pushing AMP again...

> The browsers today are too bloated

No, Google's browser today is too bloated: that's nobody's fault but Google's.

> and it is difficult to create a new browser engine

I don't recommend confusing "difficult to create" with "difficult to sell" unless you're looking for a reason not to do something: there's usually very little overlap between the two.


I'm asking this genuinely, not as a leading question or a gotcha trap: why use this client side, instead of running it on the server and sending the rendered output?


For one, in many cases the XML + XSLT is more compact than the rendered output, so there are hosting and bandwidth benefits, especially if you're transforming a lot of XML files with the same XSLT.


That's fascinating, because I wouldn't have expected it. What's an example of when the rendered output would be bigger?


Imagine 1000 numbers in XML and an XSLT with xsl:for-each which renders a div with a label, a textbox with the number, and maybe a button. That's a simple example. The output would be a lot longer than the XML + XSLT.
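Roughly like this (a sketch with made-up element and class names, not from any real site):

```xml
<!-- data.xml: ~1000 short records -->
<numbers>
  <n>17</n>
  <n>42</n>
  <!-- ... -->
</numbers>

<!-- view.xsl: fetched once, cacheable, reused for every such page -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/numbers">
    <html><body>
      <xsl:for-each select="n">
        <div class="row">
          <label>Value</label>
          <input type="text" value="{.}"/>
          <button>Save</button>
        </div>
      </xsl:for-each>
    </body></html>
  </xsl:template>
</xsl:stylesheet>
```

Each record costs about ten bytes on the wire, while the expanded div/label/input/button markup costs closer to a hundred per record.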


Ah, gotcha. Thanks for that. OK, I can see why that'd be smaller, although I wonder how much compression could equalize it.


I think the obvious answer is that client-side mapping would let the browser give different views of the data to the client. The obvious problem is that downloading all the data and then transforming it is inherently inefficient (and sure, despite this, download-then-process is a common solution to many problems, but it's problematic to specify the worst solution before you know the problem).

Perhaps there's an alternative universe where javascript lost and an elegant, declarative XSLT could declaratively present data and incrementally download only what's needed, allowing compact and elegant websites.

But in our universe today, this mapping language wound up as a half-thought-out idea that just kicked around in the specs for a long time without ever making sense.


My gut instinct is to agree with every bit of that. I admit that I might be missing something, but I've never wanted to send the data once and then have the client view it in multiple transformed ways (minus simple presentation stuff like sorting a table by column and things like that).

And using it to generate RSS as mentioned elsewhere in the comments? That makes perfect sense to me on the server. I don't know that I've ever even seen client-side generated RSS.

But again, this may all be my own lack of imagination.


> I've been looking at it as a secret weapon: I've been using it for the last twenty years because it's faster than everything else.

Serving a server-generated HTML page could be even faster.


Maybe, but the PR author, who also created the issue there, gave an example: 'JSON+React'. React is one of the slowest frameworks out there. Performance is rarely considered in contemporary front-end development.


> Serving a server-generated HTML page could be even faster.

Except it isn't.

Lots of things could be faster than they are.


Loading one page is probably faster than loading a template and only then loading the data with a second request, given that network latency can be pretty high. That's why Google serves (served?) its main page as a single file and not as multiple HTML/CSS/JS files.


> Loading one page is probably faster than loading a template and only then loading the data with a second request, given that network latency can be pretty high

XSLT is XML: It can be served with the XML as a single request.
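XSLT 1.0 even defines embedded stylesheets: the stylesheet can ride inside the same document, referenced by fragment id. A sketch (names invented; browser support for this has historically been uneven):

```xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="#view"?>
<doc>
  <xsl:stylesheet id="view" version="1.0"
      xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <!-- don't render the stylesheet itself -->
    <xsl:template match="xsl:stylesheet"/>
    <xsl:template match="/doc">
      <html><body><xsl:apply-templates select="item"/></body></html>
    </xsl:template>
    <xsl:template match="item">
      <p><xsl:value-of select="."/></p>
    </xsl:template>
  </xsl:stylesheet>
  <item>hello</item>
</doc>
```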

You don't have any idea what you're talking about.

> That's why Google serves (served?) its main page as a single file and not as multiple HTML/CSS/JS files.

Google.com used to be about a kilobyte. Now it's 100kb. I think it's absolutely clear Google either doesn't have the first idea how to make things fast, or doesn't care.


That assumes the server has a lot of additional CPU power to serve the content as HTML (and thus do the templating server side), whereas with XSLT I can serve XML and the XSLT and the client side can render the page according to the XSLT.

The XSLT can also be served once, and then cached for a very long time period, and the XML can be very small.


With server-side rendering you control the amount of compute you are providing, with client-side rendering you cannot control anything and if the app would be dog slow on some devices you can't do anything.


> Sort of. JS is much slower than the native XSLT transform, and the XSLT result is cacheable. That's huge.

Nobody is going to process millions of DOM nodes with XSLT because the browser won't be able to display them anyway. And one could write a WASM implementation.


I think you're confusing throughput with latency.

You're right nobody processes a million DOM nodes with XSLT in a browser, but you're wrong about everything else: WASM has a huge startup cost.

Consider applying stylesheet properties: XSLT knows exactly how to lay things out so it can put all of the stylesheet properties directly on the element. Pre-rendered HTML would be huge. CSS is slow. XSLT gets you direct-attach, small-payload, and low-latency display.
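A sketch of that "direct-attach" idea (attribute names invented; whether inlining actually beats a stylesheet depends on the page): the transform computes the styling once and emits it inline, so no selector matching happens at render time.

```xml
<xsl:template match="item">
  <!-- style is resolved at transform time and attached inline -->
  <div style="color:{@color};width:{@width}px">
    <xsl:value-of select="."/>
  </div>
</xsl:template>
```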


That's an even rarer case, embedding CSS rules into an XSLT template (if I understood you correctly); I've never heard of it. I know that CSS is sometimes embedded into HTML, though.


> Why should the browser contain a specific template engine, like XSLT,

XSLT is a templating language (in the way HTML is a content language), not a template engine (in the way Blink or WebKit are browser engines).

> Also it can be reimplemented using JS or WASM.

Changing the implementation wouldn't involve taking the language out of the web platform. There wouldn't need to be any standardization talk about changing the implementation used in one or more browsers.


The old, bug-ridden native XSLT code could also be shipped as WASM along with the browser rather than being deprecated. The sandbox would nullify the exploits, and avoid breaking old sites.

They actually thought about it, and decided not to do it :-/


> Many things, like WebAudio or Canvas, could be implemented using WASM modules, which, as a side effect, would prevent their use for fingerprinting.

Audio and canvas are fundamental I/O things. You can’t shift them to WASM.

You could theoretically shift a fair bit of Audio into a WASM blob, just expose something more like Mozilla’s original Audio Data API which the Web Audio API defeated for some reason, and implement the rest atop that single primitive.

2D canvas context includes some rendering stuff that needs to match DOM rendering. So you can’t even just expose pixel data and implement the rest of the 2D context in a WASM blob atop that.

And shifting as much of 2D context to WASM as you could would destroy its performance. As for WebGL and WebGPU contexts, their whole thing is GPU integration, you can’t do that via WASM.

So overall, the things you're saying could be done in WASM are themselves the primitives, so they definitely can't be.


Why should the browser contain a specific scripting language, like JavaScript, and not ActiveScript for example?


I suspect you might know this, but Internet Explorer 3 supported JavaScript (JScript) and VBScript in 1996.


The browser could use Java or .NET bytecode interpreter - in this case it doesn't need to have a compiler and you can use any language - but in this case you won't be able to see a script's source code.


You already effectively can't see a script's source code, because we compile, minify, and obfuscate JS these days, since the performance characteristics are otherwise so poor.

Actually, most of the time C# decompiles more cleanly from CLR bytecode than esoterically built JS does.


It's a consequence of javascript being "good enough." Originally, the goal was for the web to support multiple languages (I think one prototype of the <script> tag had a "type=text/tcl") and IE supported VBScript for a while.

But at the end of the day, you only really need one, and the type attribute was phased out of the script tag entirely, and Javascript won.



Fair enough. Its use to denote other scripting languages was phased out.


You can still use it that way; you'd just need a browser extension or a JavaScript file to read the contents and act on them. Here is a 2017 Stack Overflow thread, for example: https://stackoverflow.com/questions/14015899/embed-typescrip...


BTW, over a third of court case management software in the US is run on VBScript hosted in IE7 compatibility mode.


> Why should the browser contain a specific template engine, like XSLT, and not Jinja for example?

Historic reasons, and it sounds like they want it to contain zero template engines. You could transpile a subset of Jinja or Mustache to XSLT, but no one seems to do it or care.


> and it sounds like they want it to contain zero template engines.

The funny thing? No, they want to create a new one: https://github.com/WICG/webcomponents/issues/1069


Adding XSLT support is as absurd as adding React to the browser (especially given that its change detection is inefficient and requires a lot of computation). Instead, browsers should provide better change-tracking methods for JS objects.


Knockout.js may be off the radar these days, but has robust handling for this.

Still the best framework I've ever worked with.


The downside of Knockout was that it used proxies for change tracking, and you had to create those proxies manually: you couldn't have an object with a plain Number property; you had to have an object with a proxy function as the property.
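The pattern looked roughly like this (a minimal sketch of the observable-as-function idea, not Knockout's actual implementation):

```javascript
// Minimal observable-as-function sketch: a read is a call with no
// arguments, a write is a call with one, and writes notify subscribers.
function observable(initial) {
  let value = initial;
  const subscribers = [];
  function accessor(next) {
    if (arguments.length === 0) return value; // read:  vm.num()
    value = next;                             // write: vm.num(5)
    subscribers.forEach(fn => fn(value));
  }
  accessor.subscribe = fn => subscribers.push(fn);
  return accessor;
}

// Instead of a plain `vm.num = 1`, the property itself is a function:
const vm = { num: observable(1) };
const seen = [];
vm.num.subscribe(v => seen.push(v));
vm.num(2);
vm.num(3);
// vm.num() is now 3; seen is [2, 3]
```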


So instead of a complete browser engine we get a basic engine, and we need to write the complete one on top of it?


Sounds like Wayland


>Why should the browser contain a specific template engine, like XSLT

Because XSLT is part of the web standards.


I kind of agree that little-used,[0] non-web-like features are fair to consider for removal. However, I wish they didn't hide behind security vulnerabilities as the reason, as that clearly wasn't it. The author didn't even bother to look into whether a memory-safe package existed. "We're removing this for your own good" is the worst way to go about it, but he still doubles down on this idea later in the thread.

[0] ~0.001% usage according to one post there


> [0] ~0.001% usage according to one post there

This is still a massive number of people who are going to be affected by this.

https://news.ycombinator.com/item?id=44938747


I get what you're saying, but following this line of reasoning would mean that successful, wide-spread specifications, standards, and technologies must never drop any features. They would only ever accumulate new features, bloating to the point of uselessness, and die under the weight of their own success.


> must never drop any features

On the web? That's about right. See Google's own document on this: https://docs.google.com/document/d/1RC-pBBvsazYfCNNUSkPqAVpS...


Nonsense. What follows from this line of reasoning is that putting percentages on billions is intellectually dishonest: you don't have to go any further than that. It is perhaps out of ignorance (now you know), but if you try to make it about anything else, that's just arguing in bad faith.

Of course you can drop features, but if you work at Google I think you can pick something else, and you'll have a hard time convincing anyone that XSLT which was in Chrome back when it was fast, is why Chrome isn't fast anymore. And if you don't work at Google, why do you care? You've learned something new today. Enjoy.


It's not being dishonest. Software needs to be maintained. And Google isn't the only web browser, nor should it be. It makes sense to re-evaluate which features make sense for the web. Flash and Java applets were both removed from web browsers and broke sites for millions of users, probably far more than removing XSLT would. But it was still the right call. This case is a bit more nuanced than those, but I still think it's at least fair to discuss removing it.


> You've learned something new today. Enjoy.

Indeed: I learned that you're a condescending ass who doesn't engage with the actual argument I brought up.


It’s classic Google behaviour: “oh not used by a billion people? Didn’t get popular enough, axe it”.

They arguably became a victim of their own scale.


Compare WebKit to the UDK (the Unreal Development Kit for game dev) to consider why there is so much bloat in the browser. People have wanted to render more and more advanced things, and the WebKit engine should cater to all of them as best it can.

For better or worse, http is no longer just for serving textual documents.


Maya is the go to example of bloat for me for many of the same reasons.


While this sounds crazy at first, I could warm to several incremental layers of features, where browsers could choose to implement support for only a subset of layers. The lowest layer would be something like HTTP with plain text, the next HTML, then CSS with basic selectors, then CSS with the full selector set, then ECMAScript and WASM, then device APIs, and so forth.

Would make it possible to create spec-compliant browsers with a subset of the web platform, fulfilling different use cases without ripping out essentials or hacking them in.


There is no point in several layers because to maximize compatibility developers would need to target the simplest layer. And if they don't, simple browsers won't be able to compete with full-fledged ones.


You can set the doctype in the document to the spec you want to use, which is basically what you're asking for. Try setting <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">


> Why should the browser contain a specific template engine, like XSLT, and not Jinja for example? Also it can be reimplemented using JS or WASM.

I think a dedicated unsupported media type -> supported media type WASM transformation interface would be good. You could use it for new image formats and the like as well. There are things like JXL.js that do this:

https://github.com/niutech/jxl.js


I get the point of a minimal browser and WASM, but Java bytecode?! Why not Python bytecode? It seems unreasonable to me to add support for any specific bytecode. By layout rules, do you mean getting rid of CSS? That also sounds unreasonable, IMHO.

And no, WebAudio and Canvas couldn't be implemented in client-side WASM without big security implications. If by module you mean inside the browser, then what is the point of WASM here?


All WebAudio needs to provide is a means to get or push buffers from/to audio devices and to run code in a high-priority thread. There is no need for the browser to provide implementations of low-pass filters, audio processing graphs, and similar primitives.


Honestly, even WASM makes it not very minimal in my book. A minimal browser should be HTML and perhaps a subset of CSS, that's it.


Wasm is ANYTHING but basic.

Fuck javascript, fuck wasm, fuck html, fuck css.

Rebase it all on XML/XPath/XQuery that way you only need ONE parser, one simple engine.

This whole kitchen sink/full blown OS nonsense needs to end.

Edit: You’re clearly a wasm shill, wasm is an abomination that needs to die.



