Naturally. And the editors, typesetters, designers, proofreaders, etc. should get paid for their work.
It is just a question of ownership and power imbalance.
The problem with the current mainstream system (both in books and music) is that the publisher, not the author, often effectively owns the work.
Are they though?
* Rust was designed before even vscode was released
* Zig, Odin, and JAI are all designed by people who don’t use LSPs and don’t like them
* Don’t know about Carbon, but it currently doesn’t look like it’s gonna kill anything
It seems to me that rust analyzer is something of an exception
Zig has on the roadmap the creation of a compiler REPL to power all kinds of code intelligence tools, including LSPs. This item is blocked only on finishing incremental compilation.
Sure and the incremental compiler work is genuinely amazing. I was objecting purely to the notion that this new crop of systems languages is “designed from day one” around LSP, which doesn’t seem to be quite true
ngl, a lot of the time an in-memory “database” that gets backed up to a file is perfectly reasonable. Even consumer devices have dozens of gigabytes of RAM. What percentile of applications needs more?
Just because a technology works well for a few cases doesn’t mean it should be the default. What the 80% solution looks like is much more interesting IMO.
> an in-memory “database” that gets backed up to a file is perfectly reasonable.
We have org-mode, application configs, and music playlists as three widely used examples of this.
You switch to a database when you need to query and update specific subsets of the data, and there's the whole concurrency question when multiple applications are involved.
There's absolutely no problem with this. But it probably shouldn't be a best practice or the default for the industry? That's what OP was saying. I'd argue you're still better off using SQLite than doing it manually, but to each their own.
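For what it's worth, SQLite covers the "in-memory database backed up to a file" pattern directly, so you don't have to hand-roll a serialization format. A minimal Python sketch (the table and file names are made up for illustration):

```python
import sqlite3

# Keep the working set in RAM, as the in-memory-database argument suggests,
# but retain SQL queries and transactions instead of a custom file format.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE playlist (pos INTEGER, track TEXT)")
mem.executemany("INSERT INTO playlist VALUES (?, ?)",
                [(1, "intro.flac"), (2, "main.flac")])
mem.commit()

# Snapshot the entire in-memory database to a file (Python 3.7+).
disk = sqlite3.connect("playlist.db")
with disk:
    mem.backup(disk)
disk.close()

# Later (or from another process), reopen the file and query a subset.
disk = sqlite3.connect("playlist.db")
rows = disk.execute("SELECT track FROM playlist ORDER BY pos").fetchall()
print(rows)  # [('intro.flac',), ('main.flac',)]
```

The `backup()` call is the whole "write it out to a file" step, and the on-disk copy is readable by any SQLite client.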
Yes, the 80% claim comes from having both the conversion tool and humans convert 9 tests and comparing the quality: “80% of the content within these files was accurately converted, while the remaining 20% required manual intervention.” Not sure what to make of it, since they claim only 16% of files get fully converted.
The conversion is between two testing libraries for React. Not to be too cynical (this sort of work seems to me like a pretty good niche for llms), but I don’t think I’d be that far off of 80% with just vim macros…
I think you're significantly underestimating the complexity of automatic transforms. It's not like they didn't try writing codemods first, and vim macros aren't more powerful than codemods.
You really think you could achieve an 80% success rate with just syntactic transformations, while the article says they only reached a 45% success rate with fine-grained AST transformations?
I am no vim hater, but allow me to cast a large, fat doubt on your comment!
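The gap between text-level and AST-level transformations is easy to demonstrate. A toy sketch using Python's `ast` module as a stand-in for a codemod (the article's actual tooling targets JavaScript; `shallow`/`render` are hypothetical names echoing the flavor of the migration, not the real tool):

```python
import ast

# Rename calls to `shallow(...)` into `render(...)` at the AST level.
class RenameCall(ast.NodeTransformer):
    def visit_Call(self, node):
        self.generic_visit(node)
        if isinstance(node.func, ast.Name) and node.func.id == "shallow":
            node.func = ast.copy_location(
                ast.Name(id="render", ctx=ast.Load()), node.func)
        return node

src = 'wrapper = shallow(component)\nnote = "call shallow() manually"\n'
tree = RenameCall().visit(ast.parse(src))
out = ast.unparse(tree)  # Python 3.9+
print(out)
```

The call gets renamed, but the word "shallow" inside the string literal is untouched; a naive text substitution (or an inattentive macro) would have mangled both.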
Fair enough :) It was very much an exaggeration. But I do wonder how far “dumb” text editing would go in this scenario. And, more importantly, whether it wouldn’t be faster overall than writing a tool that still requires humans to go through its output and clean/fix it up.
Key point is that vim macros are interactive. You don’t just write a script that runs autonomously, you say “ok, for the next transformation do this macro. Oh wait, except for that, in the next 500 lines do this other thing.” You write the macro, then the next macro, adjust on the fly.
> Our initiative began with a monumental task of converting more than 15,000 Enzyme test cases, which translated to more than 10,000 potential engineering hours
Out of curiosity, can you drop into an edit session during the macro? It has been some time since I last used vim, so I don’t recall, but in emacs you can record a macro along the lines of "do A, do B, drop to an edit session letting the user do whatever, do C, do D". Is that possible with vim macros?
This Vimcast (http://vimcasts.org/episodes/converting-markdown-to-structur...) recording is an example of a quite complex macro for converting (a specific file's) markdown to HTML. At the beginning of the video you see that they save the macro to the "a" register. You can record macros of similar complexity into each of the other letters of the alphabet, to get some idea of the maximum complexity (though I tend to stick to about 3 or fewer in a single session).
Not to mention the possible savings if you just don't switch to whatever latest testing framework your resume-driven developers want. 100% time savings!
Gee, if "many teams" want to spend their time migrating their unit-test framework and unit tests because their frontend framework hit version 18 I suppose that's their prerogative.
I'm not one to applaud Teams, but it seems Slack's lunch is being eaten by people who are busy building things on the corpse of Skype, not by those churning through churn incarnate.
You can file a suit against anybody for anything. If it’s obviously without merit, it’s likely to get dismissed pretty quickly. But there’s nothing stopping you from filing it.
I don’t think the specifics of the placement matter all that much. I’m a happy nvim user with a dvorak layout. Only ‘h’ is on the home row, but it’s not a problem. The issue with normal arrow keys is that you have to lift your whole hand to reach them, which is way less precise and much slower than just moving a finger.