Hacker News | gardnr's comments

Looks like it may not have a package manager like apt or dnf:

> Can you please add wget, nano, $my_fav_app_omg_i_love_it to the root filesystem?

> No, not likely.

I am guessing the way to use software not already in the image is to use `docker run`.
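A minimal sketch of that idea, assuming Docker is available on the host (the image and tool names here are illustrative, not anything the distro documents): instead of installing a tool into the immutable root filesystem, wrap it in a throwaway container.

```shell
# Sketch: run a tool from a disposable container instead of installing it.
# alpine:3 and wget are arbitrary examples; drop the `echo` to actually run it.
run_tool() {
  echo docker run --rm alpine:3 "$@"
}

run_tool wget --version
```

With the `echo` removed, `--rm` discards the container afterwards, so the host image stays untouched.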


It can be damn near impossible to break them out of some loops once they've committed. Gotta trim the context back to before the behaviour started.

865 GB: I am going to need a bigger GPU.

Or several bigger GPUs! :)

If you buy a new MacBook Pro be prepared to ditch all of your duckhead adapters.

If you enjoy using a "Power Adapter Extension Cable" to extend your distance from the wall, be aware that you can't buy one that fits the new prongs, as Apple doesn't sell them.

I just got off the phone with their sales support and they recommended that I go to a hardware store and buy a regular extension cable.


I wouldn't label that arrogance. In my experience outside the USA, my GP has been unaware of new research / advice / guidelines published by the MoH. They generally respond to new info from reputable sources when you print it out and bring it to them.

I would expect specialists to be subscribed to journals and reading the latest articles in their field. When I saw a specialist at UCSF this was definitely the case; my GP, by contrast, still has gaps where their knowledge of a specific subject dates from their time at med school.

An analogy would be a front-end engineer being oblivious to the happenings on the Linux kernel mailing list. They could likely understand what's going on if they took the time to read it, but that is not their focus.


The fact that there are advertisements for cancer medications makes me think a hell of a lot of specialists don't keep up. It's one thing to advertise to consumers about a new medication for a chronic condition when they might not have seen their doctor in 3 years. It's another entirely for cancer patients to need to ask about the new hotness.

The GLM-4.7 model isn't that great. I was on their $200/month plan for a while, and it was really hard to keep up with how fast it works; going back to Claude, everything seems to take forever. GLM got much better in 5.1, but Cerebras still doesn't offer that yet (it's a bit heavier). I have a year of Z.ai that I got as a bargain, and I use GLM-5.1 for some open source stuff, but I am a bit nervous about sending data into their API.

The new one is quite a bit heavier!

GLM 4.7 is 358B parameters: https://huggingface.co/zai-org/GLM-4.7

GLM 5.1 is 754B parameters: https://huggingface.co/zai-org/GLM-5.1

That said, 5.1 is indeed a bunch better and I could definitely see myself using it for some tasks! Sadly all of the stuff I can actually run locally is still trash (I appreciate the effort behind Qwen 3.6, Gemma 4 and Mistral Small 4 though, alongside others).


It's like the author handed the copy to the editor who then added a new broken sentence after each original sentence that somehow jams "agents" in there.

The Llama weights were leaked. It open-sourced itself.

You are right though. Meta could have been in lockstep, releasing ChatGPT features into some chat bot on Facebook.com, but instead it seemed like their FAIR arm was hell bent on commoditising this stuff by publishing their research models before the Chinese companies took the lead there.

It's hard for me to be mad at FAIR even though I generally disagree with the outcomes that Meta produces for its users.


It’s personal…


I was really excited until I realised that "personal" meant "owned by Meta".

I'm trying to decide if I find the doublespeak a bit offensive or not.

