twoquestions's comments | Hacker News

Glad to see Anthropic continuing to invest in the longevity and quality of their open-source dependencies!

If you missed it, they bought Bun a while back, which is what Claude Code is built on: https://bun.sh/blog/bun-joins-anthropic


Wow, I just learned about this from your comment. Not sure if it was covered here on HN; I totally missed it.

It happened quite a while back; most of us saw which direction this was going (Claude Code uses Bun, OpenCode uses Bun, and they need Bun to work as well as possible for Claude Code).

Same, it feels like Bootstrap but much, much more customizable. When I'm not using ShadCN that's what I reach for by default, it's been much nicer to work with, personally.

I've been having a very good time with Zed. Great vim motion support, and fast to the point where using VSCode feels like driving a semi truck by comparison.

https://zed.dev


Yet the UI is terrible. I trashed it because of that; I really wanted to give it a chance.


Before I clicked the article, I said to myself "The victim's gotta be Black", and lo and behold. AI has inherited police's (shitty, racist, and dangerous) idea that any Black person is a dangerous monster for whom anything is a weapon.


It's extremely refreshing to see the editor's memory and processor usage be smaller than the webapp tab I'm working on.

I'm really liking it thus far!


Its binary is half a gig in size, so, just like a browser, there's nothing fresh about that.


It has a huge number of treesitter modules, etc., statically compiled into the executable. They're not all loaded the instant you fire it up.


Size on disk is about 64x less relevant than size in RAM for me. To give Zed some credit in this area, it's statically linked and the Linux binary is half the size of the Windows one.


What could they be statically linking to have a 400MB executable?


A tonne of treesitter grammars


They wrote their own graphics rendering library for starters; that's bundled into the editor when compiled.

https://www.gpui.rs/


Every Unity game ships with three UI frameworks (IMGUI, UGUI, UI Elements) built-in, in addition to everything else the game engine supports, and the engine is only about 50 MB.


Is that really necessary for an IDE? It seems like a ton of added complexity for little to no upside, or even a downside, considering...


Yes. Zed is snappy in a way that makes my copy of Sublime Text 3 feel slow. Compared to VSC it feels like going from a video game at 40 FPS with 2-3 frames of input lag to 120 FPS.


You can read about it here:

https://zed.dev/blog/videogame


A renderer/rendering library for something as simple as a text editor is not (or is not supposed to be) a lot of code and should not take up a large amount of space in a binary. Odds are good it's the embedded Treesitter grammars and other things that Zed uses that take up the space.

It is Rust software, so there is probably a good 50-90% of waste when it comes to needed vs. actual lines of code in there, but there is no way anyone makes a renderer so overcomplicated that it takes up a meaningful part of the 500 MB or so Zed takes up on Windows.


The OS loads the entire binary into RAM.


It loads on demand, and pages you don't use don't need to be in RAM.


Considering how cheap storage is nowadays, nitpicking about binary size is a very weird take. Are you editing code on an ESP32?


Why don't you actually do some considering, given how mistaken this superficial dismissal is: storage is not cheap where it matters - for example, your laptop's main drive. It doesn't help you that an external hard drive can be purchased cheaply, since you won't be able to conveniently upgrade your main mobile system. Also, the size of various content (games/photos/videos) has increased a lot, leaving the constraints on storage in place despite the price drop.


I just refuse to use any software that balloons its file size. Not because I can't afford storage, but because there are always alternatives that have similar features packed into a fraction (usually less than 1%) of the file size. If one of them can do it and the other can't, it's a bad product that I have no intention of supporting.

We should strive to write better software that is faster, smaller and more resilient.

"Storage is cheap" is a bad mentality. This way of thinking is why software only gets worse with time: let's have a 400mb binary, let's use javascript for everything, who needs optimization - just buy top of the shelf super computer. And it's why terabytes of storage might not be enough soon.


I can empathize with how lazy some developers have gotten with program sizes. I stopped playing CoD because I refused to download their crap 150+ GB games with less content than a lot of other titles that are much smaller.

That said, storage is cheap; it's not a mentality but a simple statement of fact. You think Zed balloons its file size because the developers are lazy. It's not true; it's because the users have become lazy. No one wants to spend time downloading the correct libraries to use software anymore. We've seen a rise in binary sizes in most software because of a rise in static linking, which does increase binary size but makes using and testing the actual software much less of a pain. Not to mention the benefits in reduced memory overhead.

VSCode and other editors aren't smaller because the developers are somehow better or more clever. They're using dynamic linking to call into libraries on the OS. This linking itself is a small overhead, but overhead nonetheless, and all so they can use Electron + JavaScript, the real culprits that made people switch to Neovim + Zed in the first place. 400 MB is such a cheap price to pay for a piece of software I use on a daily basis.

I'm not here to convince you to use Zed or any editor for that matter. Use what you want. But you're not going to somehow change this trend by dying on this hill, because unless you're working with actual hardware constraints, dynamic linking makes no sense nowadays. There's no such thing as a silver bullet in software. Everything is a tradeoff, and the resounding answer has been that people are more than happy to trade disk space for lower memory and CPU usage.
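
If you want to see the static/dynamic difference concretely, here's a rough Linux-only Rust sketch (nothing Zed-specific; the ".so" match is just a crude heuristic) that lists the shared objects a process has mapped. Built normally it lists libc and friends; built fully statically (e.g. against the musl target) the list comes back empty, because all of that code now lives inside the binary instead.

    // Rough illustration: list the shared objects mapped into this process
    // by parsing /proc/self/maps (Linux only). A dynamically linked build
    // pulls in libc, libgcc_s, etc.; a fully static build maps none.
    use std::collections::BTreeSet;
    use std::fs;

    fn main() {
        let maps = fs::read_to_string("/proc/self/maps").expect("read /proc/self/maps");
        let libs: BTreeSet<&str> = maps
            .lines()
            .filter_map(|line| line.split_whitespace().nth(5)) // pathname column, if present
            .filter(|path| path.contains(".so"))               // crude shared-object filter
            .collect();
        println!("{} shared objects mapped", libs.len());
        for lib in &libs {
            println!("  {lib}");
        }
    }

(On most distros, `cargo build --target x86_64-unknown-linux-musl` gives you the static case to compare against.)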


Does static linking really reduce memory and cpu usage significantly compared to dynamic linking?


I've been hearing this claim that storage is cheap since I got my first Pentium 1 PC.


RAM is not cheap. Executables live in RAM when running.


Executables live in pages, and not all pages are in physical memory at once.


No, they're mmapped to RAM. Only the pages that get used are loaded to RAM.
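
If anyone wants to check for themselves, here's a rough Linux-only Rust sketch (just /proc parsing, nothing Zed-specific) that compares a process's executable size on disk with its resident set. Note that VmRSS also counts heap and other anonymous memory, so it's an upper bound on how much of the binary is actually resident.

    // Rough illustration: compare a process's executable size on disk with
    // its resident set size (Linux only). Usage: pass a PID as the first
    // argument, or omit it to inspect this process itself.
    use std::{env, fs};

    fn main() {
        let pid = env::args().nth(1).unwrap_or_else(|| "self".to_string());

        // /proc/<pid>/exe is a symlink to the mapped executable; metadata()
        // follows it and reports the on-disk file size.
        let exe_kib = fs::metadata(format!("/proc/{pid}/exe"))
            .map(|m| m.len() / 1024)
            .unwrap_or(0);

        // The VmRSS line in /proc/<pid>/status looks like "VmRSS:  123456 kB".
        let status = fs::read_to_string(format!("/proc/{pid}/status")).unwrap_or_default();
        let rss_kib: u64 = status
            .lines()
            .find(|l| l.starts_with("VmRSS:"))
            .and_then(|l| l.split_whitespace().nth(1))
            .and_then(|v| v.parse().ok())
            .unwrap_or(0);

        println!("executable on disk: {exe_kib} KiB");
        println!("resident (VmRSS):   {rss_kib} KiB");
    }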


Absent evidence to the contrary, the reasonable assumption here is that all of those pages are being used. The Rust compiler is not in the habit of generating volumes of code that aren’t being called.


That would be asking to prove a negative, which isn't reasonable.

Also, the question isn't whether the code is ever used; it's that it isn't all used simultaneously. Almost all software has many features which aren't used at the same time. For example, it is unlikely you have all of the different language parsers loaded at once, because most projects use only a few.


Only the parts that are being used (the working set).


Don't be too sure about that; Zed fires up Node as well, for LSP servers.


People keep comparing LLMs to automated looms, but I find them more comparable to cruise control than autopilot.

I've been working on a character sheet application for a while, and decided to vibe-code it with Spec-kit to help me write up a specification; for things I know, it's been great. I tried using Claude to make it into a PWA (something I don't know very well) as an experiment, and I've found that the nanosecond the model strays outside my experience and knowledge, everything goes straight to Hell. It wraps my codebase around a tree as if I'm not paying attention while driving.

It's a tool you'll have to learn to use, but I can say with absolute confidence it's no replacement for actual skills, if anything it highlights the gulf between people who know what they're doing and people who don't, for better and worse. It sacrifices some of the 'code under your fingers' feeling for management tasks, which I personally really like, as I've always wanted to document/test/code review/spec things out better, and I now understand the pain of people who'd rather not do that sort of thing.

https://github.com/github/spec-kit


The difference is that you can trust cruise control to do whatever limited job it knows how to do; you can't trust an LLM to do anything. That makes it, I think, hard to compare to anything we're used to (happily) working with.


Cruise control is a useful technology that, once you learn to use it, becomes automatic (somethingsomething pun something). LLMs, on the other hand - well, yeah - if you like playing chess with pieces and a board made out of smoke (to paraphrase Jerry Seinfeld), sure, you'll probably figure it out... some day...


I don't know... I keep seeing people everywhere promising that agent-based tools can solve all these problems and handle full, project-level tasks.


Those same people have large equity stakes, or are in the surrounding network of companies dependent on AI being successful.


I 90% agree with you, though Apple did stand up to the FBI some years ago. The US gov't at least is much more restricted on what data it can collect and act on due to the 4th Amendment among other laws, and as another commenter said Apple can't blackbag me to El Salvador.

Apple/FBI story in question: https://apnews.com/general-news-c8469b05ac1b4092b7690d36f340...


Apple is an exception, and even that is debatable because of the unencrypted backups they store.

On the other hand, what Apple did was a tangible action with a tangible result.

This gives them better optics for now, but there is no law that says they can't change.

Their business model is being an "accessible luxury brand with the privacy guarantees of Switzerland, as far as the laws allow". So, as another argument, they have to do this.


I've fallen in love with using Vimium when browsing, and real native elements are much easier to use than JS substitutes.

https://chromewebstore.google.com/detail/vimium/dbepggeogbai...


Probably the same thing I'm trying to do as a side gig now, building software to help solo/small company artisans keep track of customers/payments/taxes.

Essentially, make tools for others in my position that are going to try selling pottery or soap until they can hopefully turn it into a full time thing.


Hardest possible concur with everything OP said.

Knitting and other fiber arts are the grandmother of computer programming, and I'd go so far as to say your CS education is incomplete without at least passing knowledge of fabric weaving and especially weaving machine history.

Ignorance is not your fault, unfortunately they can't teach you everything in college, and people tend to downplay the importance and history of "women's work", much to all our detriment.

https://www.scienceandindustrymuseum.org.uk/objects-and-stor...


>I'd go so far as to say your CS education is incomplete without at least passing knowledge of fabric weaving

Why?


The first programmable machine, controlled by punch cards, was a loom: https://en.wikipedia.org/wiki/Jacquard_machine


I'm going to go ahead and say that you can have a complete CS education without studying fabric looms.


You won't have a full and comprehensive education as an author without spending a few months copying texts by hand in a scriptorium like monks did back in the 500s.


Hard agree.

I'm not even that much of a fiber artist - I can crochet, and I can weave shepherd's slings out of plant fiber/paracord/other strings. But I believe the thinking patterns help me, especially in large-but-not-complex systems thinking.

