A few distros already do that. Off the top of my head, both NixOS and Arch enable the QR code kernel panic screen, which is written in Rust. Granted, those are rather bleeding edge, but I know a few more traditional distros have that enabled (I _think_ Fedora has it? But I'm not sure).
For now there aren't many popular drivers that use Rust, but there are currently three in-development GPU drivers that use it, and I suspect that when those get merged, that'll be the real point of no return.
I suspect the first one of those to be actually used in production will be the Tyr driver, especially since Google's part of it and they'll probably want to deploy it on Android, but for the desktop (and server!) Linux use-case, the Nova driver is likely to be the major one.
I believe Google has developed a Rust implementation of the binder driver which was recently merged, and they are apparently planning to remove the original C implementation. Considering binder is essential for Android that would be a major step too.
This is so cool! I love things like that; it feels like a breath of fresh air after years and years of the anachronistic retro vibes that seem to be part of C programming culture.
Arch has never failed me. The only time I remember it panicking was when I naively stopped an upgrade in the middle, which left it without a generated initramfs, but I quickly fixed it by chroot'ing in and running mkinitcpio. Back up in no time.
I have an X1 extreme. I’ve never gotten it to last over 2h on Windows. On Linux it can last an hour or so more if I turn off the NVIDIA GPU, but otherwise it’s still abysmal.
Then there’s the stupid BIOS warning that requires you to press ESC for the computer to boot if it isn’t plugged into the official charger, which means that if it ever reboots at night it’ll just keep you awake (the power management hasn’t been initialized yet, so it’s stuck at 100% CPU) until you go press ESC.
Oh and it thermal throttles all the time so the CPU performance is good for a few minutes and then it’s just awful.
Unfortunately the TPM story for the Raspberry Pi… isn’t, really. It doesn’t come with one, and while it does support secure boot, it’s incredibly limited and more akin to what you’d find in a microcontroller (you can burn vendor keys to EEPROM). So all that to say, it would be kind of pointless, unfortunately.
If you’re interested in this, I know systemd has been working pretty hard on getting TPM-provisioned credentials usable on Linux though!
I've encountered systems that only have bash at /bin/bash, and others that only have it at /usr/bin/bash, and it's a hell of a pain to have to fix every script when using different distros (I think it must've been an old Fedora and Ubuntu?).
Nowadays, most distros are moving towards having /bin be a symlink to /usr/bin, so it matters less and less, but I see no reason not to just use /usr/bin/env, which is supposed to be in the same place on every distro.
Namely the fact that emergency calls can be routed through other networks that aren’t your own (in fact, you can place an emergency call without a SIM).
Seconding the other user: I also use AirPods Pro (2nd gen) in Teams at least twice a day, on macOS, iOS, and Windows (and had used them on Android before, as well). Absolutely no issues whatsoever, and everyone mentions I usually have by far the best audio of anyone in the call.
What would Apple even gain out of this? They don’t have a competitor to MS Teams; FaceTime is hardly targeting the same segment.
I do want to point out, as someone who uses the WhatsApp app (to me, it’s slightly more convenient than the web version), that the old native Windows app was /awful/. It looked native enough, but it just didn’t work. For as long as I can remember it would randomly stop accepting input into the text field and I’d have to restart the app, and this was insanely frequent. Typing dead keys was also randomly broken, with accents not coming through, which is really annoying if you’re trying to sound professional in a language that requires them.
The new electron app does take more resources, but at the very least it works.
I see the same bugs. It looks like after ICQ, writing a chat app became an impossible computer science problem (Skype, Teams, WhatsApp, …). How did that ancient civilisation from the 90s manage to build a functional chat app? The know-how is lost to time.
> How did that ancient civilisation from the 90s manage to build a functional chat app?
By only accepting ANSI input, not encrypting any messages, and not bothering to protect users from remote attacks.
Facebook's GUI stack for WhatsApp may be rather buggy, but on a technical level there's a lot more going on than back in the days of plaintext protocols over unencrypted TCP connections.
Meanwhile, Telegram has an excellent desktop app (despite their terrible protocol), so it's not like the knowledge was lost either.
Secure, end-to-end, multi-device encryption isn't easy. Plenty of people try and fail to build secure messengers on top of PGP and Signal's protocol.
I don't use the Telegram web app, but their native apps work excellently. The insertion of ads has been a major disappointment but the chat UX itself is still great, even on native Linux.
Indeed. I'm not actively using Telegram, but I tried the desktop application (made with Qt, if I remember correctly), and it's way ahead of what WhatsApp offers. Not to mention it's fast and relatively lightweight.
Facebook could just take the app, change the colours to green, and replace the messaging protocol with their WhatsApp library, and they'd get an actually usable chat client practically for free.
The more time passes, the more impressed I am with mIRC. It was an incredibly full-featured chat client, with hundreds of features and its own scripting language for more advanced use. All that in a 4 MB download. It probably still works great to this day.
As a teenager, I thought we'd get better at making software over time. Not worse.
Modern chat apps work better and have more functionality. The know-how that was lost is specifically for Windows chat apps. And the reason it was lost is that Microsoft sucked at platform design, so people stopped learning their platforms, and the people who still have the knowledge don't want to go down the career dead end of writing apps for them.
This is partly because MS became insanely complacent. The Windows team is very junior. Just ask anyone who has worked with them. They don't have the skills or resourcing that they did in the 90s.
If it's anything like WhatsApp Web, "works" might be an optimistic choice of word.
When I switched from Windows, the thing I missed on Linux was the native WhatsApp app. Now they've killed it there too, so I'm feeling better about my switch now!
The WhatsApp web app is not perfect (no software is) but I’ve never had any major issues with it. It’s snappy and very rarely does it glitch out. I find your comment surprising.
It's a RAM hog if you keep the tab open for a while with a ton of messages, so from time to time I need to close the tab and open it again.
Huge perf issues because of this.
Also had some serious bugs for a few weeks. I had to let WhatsApp Web sit and sync completely for ~15 minutes, or else it would just stop responding and crash everything.
Sounds like the all powerful mighty Meta didn't have capable engineering working on that app. No idea how that could happen, given that they only hire top talent ...
While I don’t have as many problems as you seem to, I did notice that it was much slower (like a minute or two) to “connect” after opening the window compared to before.
Have you tried telegram desktop to compare? I don't use whatsapp, but I have never had a performance problem with telegram desktop and I've been using it for years on underpowered machines. I usually run some variant of Debian on them.
Yeah, I'm sure that a lot of people notice the same thing and then uninstall it. It's one of the solutions to the Fermi paradox: it's rare for both people to have Telegram installed at the same time.
Location access is often needed for the app to be able to scan for WiFi networks in the neighborhood. Because the list of nearby networks can be used to triangulate your location, the OS bundles WiFi scanning in with the same permission (unfortunately, there isn't a very good way to separate the two, since scanning genuinely can be used to locate you, and therefore the user has to be told about it).
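To make that a bit more concrete, here's a rough sketch of what reading nearby networks looks like on Android with the standard WifiManager API (simplified; I've left out the actual permission request flow, and the helper function name is mine):

    import android.Manifest
    import android.content.Context
    import android.content.pm.PackageManager
    import android.net.wifi.WifiManager
    import androidx.core.content.ContextCompat

    // Listing nearby networks is treated as a location-sensitive operation:
    // without ACCESS_FINE_LOCATION the scan results effectively come back empty.
    fun nearbyNetworkNames(context: Context): List<String> {
        val granted = ContextCompat.checkSelfPermission(
            context, Manifest.permission.ACCESS_FINE_LOCATION
        ) == PackageManager.PERMISSION_GRANTED
        if (!granted) return emptyList() // the app would have to request the permission first

        val wifi = context.applicationContext.getSystemService(Context.WIFI_SERVICE) as WifiManager
        return wifi.scanResults.map { it.SSID }
    }

So from the OS's point of view, "this app can see which networks are around you" and "this app can roughly locate you" are the same capability, which is why the permission prompt looks scarier than the feature itself.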
Why does a fan app need to find the WiFi networks? The OS does this, and then serves an internet connection to the app. It doesn't need to know what the available networks are.
I found the whole site a very interesting (and fairly quick) read. I don't really have anything else to add, but I'm glad the owner manages to be honest and take good lessons from the whole thing.
It's interesting to me how, from his account, everyone is fairly sympathetic to him regarding his charges (he mentions his employer showing up to his interview in a sports jersey in reference to his charges!), and how he mentions he knows several actual sports players used his site. It really goes to show the state of modern streaming.
While you could do that, the hub needed to implement the logic to actually translate between the different "APIs" that the products spoke. E.g. imagine an IKEA remote sends "button_on" to turn on the light, but a Philips remote sends "light_on" or something. Philips lights will work with Philips remotes but not with IKEA remotes, since they wouldn't know what to do with "button_on".

Zigbee2mqtt and ZHA are great projects that implement a compatibility layer for all of this, but they do have to explicitly support every device (and they support basically _every_ device there is, thanks to a ton of community work; they're genuinely great projects and something that wouldn't really be possible without open source).

You mention that you can pair between different vendors' products, but that's not quite the case - you can pair different vendors' products to the hub, and the hub can translate between them. But while you can pair an IKEA remote to an IKEA bulb without a hub, you can't really do that between different brands.
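To make the translation problem concrete, here's a tiny, purely illustrative sketch of the kind of per-device mapping a compatibility layer ends up maintaining (the command strings and device names are made up for this example; they aren't real Zigbee payloads or actual zigbee2mqtt code):

    // A made-up common model that the hub translates everything into.
    enum class LightAction { ON, OFF }

    // Per-vendor "dialects": each remote speaks its own command vocabulary.
    // Projects like zigbee2mqtt maintain a converter like this for every supported device.
    val vendorDialects: Map<String, Map<String, LightAction>> = mapOf(
        "ikea-remote" to mapOf(
            "button_on" to LightAction.ON,
            "button_off" to LightAction.OFF,
        ),
        "philips-dimmer" to mapOf(
            "light_on" to LightAction.ON,
            "light_off" to LightAction.OFF,
        ),
    )

    // The hub's job: turn a vendor-specific command into the common action,
    // which it can then re-emit in whatever dialect the target bulb understands.
    fun translate(device: String, rawCommand: String): LightAction? =
        vendorDialects[device]?.get(rawCommand)

    fun main() {
        println(translate("ikea-remote", "button_on"))    // ON
        println(translate("philips-dimmer", "light_on"))  // ON
        println(translate("philips-dimmer", "button_on")) // null - dialect mismatch
    }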
Matter simplifies this. It defines the API layer. You can use Thread without Matter, at which point you basically have Zigbee + IPv6, but the power comes with Matter since now every device is speaking the same language and can actually understand each other.
> Matter simplifies this. It defines the API layer.
Technically Zigbee _also_ defines an API layer -- the Zigbee Cluster Library, or ZCL -- but that's more like an opt-in standard you _could_ implement, rather than any hard requirement. And no surprise, the Matter Cluster Library Specification, being authored by the same CSA that made ZCL, is eerily similar to ZCL...
But as I understand it, you're right that Matter is essentially "hey everyone, let's _actually_ standardize around a common application layer". It isn't technologically revolutionary (the building blocks have been around for more than a decade), but it's a better packaging of it all.
Source: My employer has been involved with Zigbee and other low-power network technologies for a long time.
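If it helps, here's a hedged sketch of what a "cluster" roughly amounts to, modelled after the classic On/Off cluster (the Kotlin types are purely illustrative; the real ZCL/Matter definitions also carry cluster IDs, attribute metadata, optional features, and so on):

    // Illustrative only: a cluster bundles one device capability as a set of
    // attributes (state you can read or subscribe to) and commands (actions).
    // Both ZCL and Matter define an On/Off cluster along these lines.
    interface OnOffCluster {
        // Attribute: current on/off state of the light, outlet, etc.
        val onOff: Boolean

        // Commands defined by the cluster.
        fun on()
        fun off()
        fun toggle()
    }

    // A trivial in-memory implementation, just to show the shape.
    class FakeLight : OnOffCluster {
        private var state = false
        override val onOff: Boolean get() = state

        override fun on() { state = true }
        override fun off() { state = false }
        override fun toggle() { state = !state }
    }

Because a light, an outlet, and a switch can all expose the same On/Off cluster, a controller only needs to understand the cluster once rather than each product individually. That's essentially the "common application layer" idea.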
> you can pair different vendors' products to the hub, and the hub can translate between them. But while you can pair an IKEA remote to an IKEA bulb without a hub, you can't really do that between different brands.
Yes you can, I did that with IKEA, Philips, and Innr devices. No hub, not even Z2M involved. Yes, as you say, they do need to agree on a "protocol", and AFAIK they are all following Philips' lead on that, but they can totally work in a P2P fashion without any hub. They negotiate their own key; you just need to pair them at a very close distance (less than approx. 5 cm).
Thanks, I didn't know that this worked! I guess this is more of an informal standard that they ended up following Philips' lead on, but still, better to have this kind of thing be officially defined.