Tab complete is still useful and code review/suggesting changes can be better in a GUI than in a terminal. I think there is still a radically better code review experience that is yet to be found, and it's more likely to come from a new player like Cursor/Graphite than one of the giants.
Also Cursor's dataset of actual user actions in coding and review is pure gold.
God, Cursor's tab complete is woeful in basically all of my usage at work. It's so actively wrong that I turned it off. Its agent flows are far, far more useful to me.
Valve's hardware products will be successful but remain niche. And that's ok. They are unwilling to pursue business models that require locking down hardware in order to subsidize it with software purchases, and I love that about them. As a result their hardware will always be more expensive. They will not outcompete Meta in VR or Sony/MS/Nintendo in consoles because price is king for the mass market.
Valve's hardware products, aside from being awesome and setting a standard that others have to match, are really an insurance policy. They ensure Valve cannot be locked out of their own market by platform owners like Microsoft or Meta using their leverage to either take a cut of their revenue or outright ban Steam in favor of their own stores (as it looked like MS might try to do in the Win8 days). By owning a platform of their own Valve always has a fallback option.
In 2025, none of Sony, Microsoft, or Nintendo sell their consoles at a loss. They're sold at very slim margins, which is what I assume Valve will do with the Steam Machine. I expect the Steam Machine to be price competitive.
ish. Sony/Microsoft/Nintendo likely have contracts locking down RAM prices whereas Valve will have to negotiate theirs based on current prices.
If Valve sold the Steam Machine for any significant cost less than the Dell / HP / Lenovo equivalent, HNers would snap them up by the truckload to repurpose as home or work machines with guaranteed Linux compatibility.
I'm sure Valve would be ecstatic if they were snapped up by the truckload for home use, because home use and gaming use overlap significantly even if not perfectly.
For business use, the Beelink equivalent is about $350, because the GPU in the Steam Machine is useless for business or AI use. The Steam Machine is going to be more than $350.
The comment you're replying to is in a discussion about the possibility that the Steam Machine will be sold at a loss, as they would be able to recoup the funds through sales on Steam.
In this example, no, Valve would not be ecstatic if they were snapped up for things other than Steam use. Sony tried this with the PlayStation, and the military bought them up as cheap Linux compute, costing Sony thousands.
This is why Sony killed the PS2 Linux effort, and why PS3 Linux no longer offered graphics acceleration.
They had hoped for a second wave of Yaroze-like indie developers; instead, the large majority repurposed their PS2s as MAME-style emulation boxes or Linux computers.
> They will not outcompete Sony/MS/Nintendo in consoles because price is king for the mass market
I don't follow the console market at all, but don't its players subsidize their hardware by keeping software (game) costs high? I didn't think they had anything like Steam's level of regular discounted sales. "Price is king" can cut both ways.
"Steam" doesn't decide to have discounted sales -- games are heavily discounted because developers compete against one another for attention. Nintendo and Sony generally have less need to do this.
> Nintendo and Sony generally have less need to do this.
The prime Nintendo games (i.e. Animal Crossing, Pokemon and anything Mario related) are rarely discounted, yes - and Nintendo can do this because these games have borderline drooling fanbases and the games aren't available anywhere else.
But everything else? There's constantly something on sale on the Switch store.
Likewise, the PS5 has absolutely dominated the Xbox's current generation in terms of sales in large part due to exclusives. Xbox Series S is far cheaper than a PS5, mind you.
Steam doesn't need to lock down the Steam Machine to subsidize it with store purchases. The casual user could theoretically install another OS, but that doesn't matter because they won't (because they're casual users), and the dedicated user buys most of their games on Steam anyways because it's the dominant distribution platform.
The risk isn't that casual users all spontaneously decide to stop using Steam on their own. The risk is that businesses exploit the subsidy in various ways. For example, businesses could buy the Steam Machine in bulk as a workstation. Or a store competitor could produce an OS of their own that replaces SteamOS and promote it to users.
>and the dedicated user buys most of their games on Steam anyways because it's the dominant distribution platform.
It's also the most convenient by far, and the new Steam Family stuff lets you share all of your games with all of your siblings without any need for password sharing like you'd have to on e.g. GOG or Epic. I have 4 siblings and most of us are married; our combined Steam library is well over 1000 games.
They do need to lock it down if they want to subsidize it with store purchases, otherwise it's too tempting for non-gaming uses where they don't get any money after the initial sale.
The Steam you download from steampowered.com can be an open platform at the same time that the Steam that comes preinstalled on the Steam Machine is a closed platform.
Seems unlikely because we believe Valve has integrity. But it's possible they have less integrity than we think, and they pursue this strategy to make some of those games with kernel-level anti-cheat available on the Steam Machine.
Kernel-level anti-cheat doesn't necessarily need to be on a fully closed platform; it could be implemented like SafetyNet on the Pixel series, checking for system integrity while still allowing bootloader unlocking and arbitrary user software.
Pixels and SafetyNet are different than a console appliance (e.g. Xbox, Playstation) in that Google allows both unlocking and relocking the bootloader, without affecting the integrity of a Pixel's onboard cryptographic hardware and secure enclave. This means you can, for example:
1. Unlock the bootloader and install an alternative OS (e.g. Graphene).
2. Relock the bootloader and still benefit from the Pixel's hardware security.
The above is not possible on modern video game consoles, or other phones, for the most part. Hardware cryptography has historically been used to lock customers out of their own machines for the purposes of profit, but that doesn't mean it has to be.
In the threat environment as it exists today --- a world in which almost everyone has an always on, always networked computer which must continually reveal its location in order to interface with the global network --- something like the Pixel's design ought to be the minimum standard for a computer in your pocket. Sadly, the only other device on the market with similar hardware security features is the iPhone, and it's as locked down as a games console. Samsung's Knox is another secure hardware platform/architecture, but they burn out a fuse on their phones to disable it when you unlock the bootloader.
> Steam you download from steampowered.com can be an open platform at the same time that the Steam that comes preinstalled on the Steam Machine is a closed platform.
I don't think that's possible unless Steam chooses to go the route Apple takes with iOS and macOS - both essentially being "different" OSes.
But if that's the case, then games would have to be written "twice" (or have engine support directly from engine vendors). I highly doubt this can or will occur, as game developers are short on time as is.
Not my space, but I think this would be a cryptography kind of thing. Burn a key into read-only hardware, lock the bootloader, require the kernel and drivers to be signed with a key the burnt-in key can validate. Potentially extend it to all executables on the device.
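For flavor, here's a minimal Python sketch of that verify-before-boot idea using Ed25519 signatures (via the cryptography library). It's illustrative only - real secure boot uses a hardware root of trust and fuse-burned keys, and every name here is made up:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    # Vendor side: sign the kernel image (stand-in bytes) at build time.
    vendor_key = Ed25519PrivateKey.generate()
    kernel_image = b"kernel+drivers blob"
    signature = vendor_key.sign(kernel_image)

    # Device side: this public key stands in for one burnt into ROM.
    burned_in_key = vendor_key.public_key()

    def boot(image, sig):
        try:
            burned_in_key.verify(sig, image)  # raises if image was tampered with
        except InvalidSignature:
            raise SystemExit("refusing to boot: bad signature")
        print("signature ok, booting")

    boot(kernel_image, signature)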
It’s closed in the sense that you can’t install whatever you want, not in the sense that Valve is going to make their own framework devs have to use.
Valve's products are 100% designed to punch a hole through Windows Store monopolization by encouraging developers to write for Linux.
Microsoft has been trying to corner Valve. Valve is finding clever ways out by getting developers to finally make their games Linux compatible.
If Valve's consoles become broadly successful, that's an added bonus. The real win is to outflank Microsoft.
One of Microsoft's biggest mistakes was to give up on Windows Phone. One of Meta's biggest mistakes was to give up on their phone (they gave in early due to technical choices, not just lack of user demand).
Owning a "pane of glass" lets you tax and control everything. Apple and Google have unprecedented leverage in two of the biggest markets in the world. Microsoft wants that for gaming, and since most gaming is on Windows, they have a shot at it.
Valve is doing everything they can to make sure developers start targeting other platforms so PC games remain multi-platform. It's healthy for the entire ecosystem.
If we had strong antitrust enforcement (which we haven't had in over 25 years), Apple and Google wouldn't have a stranglehold on mobile, and Microsoft would get real scrutiny for all of their stunts they've pulled with gaming, studio acquisitions, etc.
Antitrust enforcement is good for capitalism. It ensures that stupid at-scale hacks don't let the largest players become gluttons and take over the entire ecosystem. It keeps capitalism fiercely competitive and makes all players nimble.
The government's antitrust actions against Microsoft in the 1990s-2000s were what paved the way for Apple to become what it is today. If we had more of it, one wonders what other magnificent companies and products we might have.
Valve actually encourages devs to only provide Windows builds compatible with Proton, or at least it used to, to the disappointment of some professional porters. Mainly because several devs kept leaving their Linux builds abandoned while still maintaining their Windows ones.
Hopefully developers are being encouraged to target Proton, as it's the subset. Presumably anything that works on Proton will also work on Windows, so it makes sense to target Proton.
If the Windows build already performs better under Proton than on native Windows, why faff around with another build target and all its associated complexities (testing, etc.)?
Targeting Linux means probably targeting all distros, and that's asking for trouble I reckon.
> Targeting Linux means probably targeting all distros
Valve actually distributes a runtime (or at least used to), that's based on Ubuntu, and provides a stable target for developers who want to release a Linux port.
But I agree in general with your point; if the Windows build already performs great on Linux through Proton, why go through the effort to release a native Linux build?
They still distribute runtime(s). Proton runs inside one of those runtimes. You're talking about the 1.0 version; 2.0 was based on Debian 10, and 3.0 is based on Debian 11.
It still makes some assumptions about the host system, but that's a problem for those who package Steam. For example, my non-FHS NixOS provides everything required, and it works out of the box.
Was antitrust enforcement necessary in this case, if Valve can break the "monopoly" with a superior value proposition for customers? Perhaps Valve would not feel the need to enter such a capital-intensive industry if it weren't for pressure from the behemoths. I happen to like that antitrust doctrine in the US is focused on good for consumers instead of some abstract ideal of a healthy market.
Microsoft and Nvidia (amongst many others) are happy to leave their gaming customers hanging for years in order to inflate the AI bubble further. They don’t care about gaming in any significant capacity. Valve is still a great gaming focused company and they will be successful.
Great point. Exactly. They have undermined the marketplace by hoovering up and consolidating studios, scuttling a lot of IP that gamers care about, and squeezing money out of properties for quarterly gains. They use their market power to push worse games with abusive dark patterns. Microsoft has really become even more of a heel than they were in the 2000s.
> One of Microsoft's biggest mistakes was to give up on Windows Phone.
They had no other choice.
The technical foundation of the prior WP versions (aka Windows CE) was just too dated: they didn't have a Windows kernel/userland capable of running performantly on ARM, x86 performance was and still is utter dogshit on battery-powered devices, they didn't have a Windows userland actually usable on anything touch-based, and most importantly they did not have developer tooling even close to usable.
At the same time, Apple had a stranglehold over the upper price class devices and Android ate up the low and mid range - and unlike the old Ballmer "DEVELOPERS DEVELOPERS DEVELOPERS" days, Microsoft didn't have tooling that enticed developers, while Apple had Xcode with simulators that people had been used to for years, and Android had a fully functioning Eclipse-based toolchain.
As I recall, that is not correct. There was a gargantuan internal effort to refactor Windows 10 to run on everything from mobile devices to servers. Windows Phone 10 was running Windows 10. And the tile UI was well received by those who had WP devices.
As others have said, lack of critical apps and shenanigans from Google is what killed sales which led to the death of Windows Phone.
Apple darn well knew what people wanted - even the first iPhone, the one that didn't even have an App Store (which was invented as a concept by jailbreakers proving it was possible!), came with YouTube and Maps from the start.
What I don't know, however, is why Microsoft insisted on the ability to not show ads and to download videos when copying that concept. They had to know that they were directly cutting into Google's bottom line.
> What I don't know, however, is why Microsoft insisted on the ability to not show ads and to download videos when copying that concept. They had to know that they were directly cutting into Google's bottom line.
There's a long backstory here.
Microsoft tried everything to get YouTube on Windows Phone. At one point, they negotiated with Google and Google said they were going to work on an app. That didn't happen.
Microsoft tried to use the proper APIs, but Google kept shutting them off.
"Downloading" the videos was Microsoft trying to work around those API limitations and shut-offs.
Imagine Microsoft's customers getting angrier and angrier that YouTube kept breaking. For years. This was a deal breaker for lots of people, especially young early adopters.
Microsoft tried really hard here.
What Google did was abuse their market position to cripple Windows Phone. Customers abandoned Windows Phone because it didn't have YouTube.
Google had to play nice with Apple in the early days because Apple had all the patents Google needed to continue with Android. It wasn't until they purchased Motorola that they had a MAD (mutually assured destruction) patent strategy.
> At one point, they negotiated with Google and Google said they were going to work on an app.
MS made that offer to probably every developer in the top 100 of the iOS/Android stores. That usually meant some small shop in Eastern Europe would be contracted.
Geoff Hinton's 2012 Coursera course "Neural Networks for Machine Learning" was incredible. Anyone who took that course got in on the ground floor of deep learning just when it was about to take off. It certainly changed the course of my career.
To give an idea of how cutting-edge it was at the time, the well-known RMSProp optimizer was unpublished work that Hinton presented in the course, and people had to cite the presentation slides when they used it in papers published later.
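For anyone curious, the update itself is tiny. Here's a minimal NumPy sketch of RMSProp as it's usually cited from those slides (Tieleman & Hinton, Lecture 6.5); the hyperparameters below are common defaults, not values from the course:

    import numpy as np

    def rmsprop_step(w, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
        # Keep an exponentially decaying average of squared gradients...
        cache = decay * cache + (1 - decay) * grad**2
        # ...and divide the gradient by its root ("divide the gradient by
        # a running average of its recent magnitude", per the slides).
        w = w - lr * grad / (np.sqrt(cache) + eps)
        return w, cache

    # Toy usage: minimize f(w) = w^2 starting from w = 5.
    w, cache = np.float64(5.0), np.float64(0.0)
    for _ in range(5000):
        w, cache = rmsprop_step(w, 2 * w, cache)
    print(w)  # ends up near 0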
> Meshlet has no clear 1:1 lane to vertex mapping, there’s no straightforward way to run a partial mesh shader wave for selected triangles. This is the main reason mobile GPU vendors haven’t been keen to adapt the desktop centric mesh shader API designed by Nvidia and AMD. Vertex shaders are still important for mobile.
I get that there's no mapping from vertex/triangle to tile until after the mesh shader runs. But even with vertex shaders there's also no mapping from vertex/triangle to tile until after the vertex shader runs. The binning of triangles to tiles has to happen after the vertex/mesh shader stage. So I don't understand why mesh shaders would be worse for mobile TBDR.
I guess this is suggesting that TBDR implementations split the vertex shader into two parts, one that runs before binning and only calculates positions, and one that runs after and computes everything else. I guess this could be done but it sounds crazy to me, probably duplicating most of the work. And if that's the case why isn't there an extension allowing applications to explicitly separate position and attribute calculations for better efficiency? (Maybe there is?)
Yes, you have to execute the vertex shader twice, which is extra work. But if your main constraint is memory bandwidth, not FLOPS, then I guess it can be better to throw away the entire output of the vertex shader except the position, rather than save all the output in memory and read it back later during rasterization. At rasterization time when the vertex shader is executed again, you only shade the triangles that actually went into your tile, and the vertex shader outputs stay in local cache and never hit main memory. And this doesn't work with mesh shaders because you can't pick a subset of the mesh's triangles to shade.
It does seem like there ought to be an extension to add separate position-only and attribute-only vertex shaders. But it wouldn't help the mesh shader situation.
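To make the two-pass idea concrete, here's a toy Python model of position-only binning followed by per-tile re-shading. Everything in it (the shader stand-ins, bounding-box binning) is a deliberately crude approximation of what real tiler hardware does:

    TILE = 16  # tile size in pixels

    def position_only(v):
        # Binning pass: compute just the screen-space position.
        return (v["x"], v["y"])

    def full_shade(v):
        # Raster pass: recompute position plus every other attribute.
        return {"pos": (v["x"], v["y"]), "uv": v["uv"]}

    def tiles_overlapped(pts):
        # Crude bounding-box binning; real hardware tests true coverage.
        xs = [p[0] // TILE for p in pts]
        ys = [p[1] // TILE for p in pts]
        return {(tx, ty)
                for tx in range(min(xs), max(xs) + 1)
                for ty in range(min(ys), max(ys) + 1)}

    triangles = [[{"x": 1, "y": 2, "uv": (0, 0)},
                  {"x": 30, "y": 4, "uv": (1, 0)},
                  {"x": 5, "y": 28, "uv": (0, 1)}]]

    # Pass 1: bin triangles by position only; attribute work is skipped.
    bins = {}
    for tri in triangles:
        for tile in tiles_overlapped([position_only(v) for v in tri]):
            bins.setdefault(tile, []).append(tri)

    # Pass 2: fully shade only each tile's own triangles; outputs stay
    # "on chip" (a local list here) instead of round-tripping DRAM.
    for tile, tris in sorted(bins.items()):
        shaded = [[full_shade(v) for v in tri] for tri in tris]
        print(tile, "->", len(shaded), "triangle(s) shaded in-tile")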
I thought that the implication was that the shader compiler produces a second shader from the same source that went through a dead code elimination pass which maintains only the code necessary to calculate the position, ignoring other attributes.
Sure, but that only goes so far, especially when users aren't writing their shaders with the knowledge that this transform will be applied, or with any tools to verify that it can actually eliminate anything.
Well, it is what is done on several tiler architectures, and it generally works just fine. Normally your computations of the position aren't really intertwined with the computation of the other outputs, so dead code elimination does a good job.
The title of this story should be "Announcing the Beta release of ty". A lot of people have been waiting for the beta specifically.
I've been using Pyrefly and loving it compared to Pyright, but they recently shipped some updates with crash bugs that forced me to pin to a previous version, which is annoying. Unfortunately my first impression of ty isn't great either. Trying to install the ty extension on the current version of Cursor says "Can't install 'astral-sh.ty' extension because it is not compatible with the current version of Cursor (version 2.2.20, VSCode version 1.105.1)."
(pyrefly maintainer here) If you haven't already, please file an issue for that crash on the [Pyrefly repo](https://github.com/facebook/pyrefly) as well :)
If there's anything else accompanying the error, do you mind filing an issue? I've been using the ty extension with Cursor for weeks and am having trouble reproducing right now.
That's the full error. It shows up in a dialog box when I press the install button. I'm on macOS, connected with the Anysphere Remote SSH extension to a Linux machine.
If I choose "install previous version" I am able to install the pre-release version from 12 hours ago without issue. Then on the extension page I get a button labeled "Switch to Release Version" and when I press it I get an error that says "Can't install release version of 'ty' extension because it has no release version." Filed a GitHub issue with these details.
In the meantime, the previous version appears to be working well! I like that it worked without any configuration. The Pyrefly extension needed a config tweak to work.
https://forum.cursor.com/t/newly-published-extensions-appear... suggests that there's some kind of delayed daily update for new VSCode extension versions to become available to Cursor? It seems likely that's what is happening here, since ty-vscode 0.0.2 was only published an hour or two ago.
Apart from installation problems/crash issues, do you have some feedback about type checking with ty vs. pyrefly? Which is stricter, soundness issues, etc?
Both are rust/open-source/new/fast so it's difficult to understand why I should choose one over the other.
It would likely reduce or eliminate the "compiling shaders" step many games now have on first run after an update, and the stutters many games have as new objects or effects come on screen for the first time.
The temperature that you raise to the fourth power is not Celsius, it's Kelvin. Otherwise things at -200 C would radiate more heat than things at 100 C. Also the temperature of space is ~3 K (cosmic microwave background), not 10 C.
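A quick sanity check of that point, using the Stefan-Boltzmann law P = εσAT^4 with emissivity and area taken as 1:

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

    for t_c in (-200, 100):
        t_k = t_c + 273.15  # convert to kelvin before raising to the 4th power
        print(f"{t_c} C -> {SIGMA * t_k**4:.1f} W/m^2")

    # -200 C (73 K) radiates ~1.6 W/m^2; 100 C (373 K) ~1100 W/m^2.
    # Plugging Celsius straight into T^4 would give (-200)^4 = 1.6e9,
    # absurdly 16x the 100^4 = 1e8 of the hotter body.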
There is a large region of the upper atmosphere called the thermosphere where there is still a little bit of air. The pressure is extremely low but the few molecules that are there are bombarded by intense radiation and thus reach pretty high temperatures, even 2000 C!
But since there are so few such molecules in any cubic meter, there isn't much energy in them. So if you put an object in such a rarefied atmosphere, it wouldn't get heated up by it, despite the gas formally having such a temperature.
The gas would be cooled down upon contact with the body, and the body would be heated up by a negligible amount.
These satellites will certainly be above the thermosphere. The temperature of the sparse molecules in space is not relevant for cooling because there are too few of them. We're talking about radiative cooling here.
The Sun is also not 10 C. Luckily you have solar arrays which shade your radiators from it, so you can ignore the direct light from it when calculating radiator efficiency. The actual concern in LEO is radiation from the Earth itself.
I love the sliders, but note that the numbers on this site literally came from ChatGPT, so there is plenty of room for disagreement.
Seems like according to this analysis it all hinges on launch cost and satellite cost. This site's default for Starship launch cost is $500/kg, but SpaceX is targeting much lower than that, more like $100/kg and eventually optimistically $10/kg (the slider doesn't even go that low). At $100/kg (and assuming all the other assumptions made on the site hold) then you break even on cost vs. terrestrial if you can make the satellites for $7/watt (excluding GPUs, as the whole analysis does).
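As a rough sketch of that tradeoff: the launch contribution to on-orbit $/W is just $/kg divided by the satellite's specific power. The 50 W/kg figure below is my own assumption for illustration, not a number from the site:

    # Illustrative only: 50 W/kg is an assumed specific power, not a
    # figure from the linked site or from SpaceX.
    SPECIFIC_POWER = 50.0  # watts of usable power per kg launched

    def total_cost_per_watt(launch_usd_per_kg, hardware_usd_per_watt):
        # On-orbit $/W = hardware $/W + launch $/kg amortized over W/kg.
        return hardware_usd_per_watt + launch_usd_per_kg / SPECIFIC_POWER

    for launch in (500, 100, 10):
        print(f"${launch}/kg -> ${total_cost_per_watt(launch, 7.0):.2f}/W")

    # $500/kg -> $17.00/W, $100/kg -> $9.00/W, $10/kg -> $7.20/W: at high
    # launch prices the launch term dominates; as it falls, satellite
    # hardware cost becomes the binding constraint.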
OTOH SpaceX has a pretty good history of undercutting the industry on cost. If Starship full reusability works I would be very surprised if it only lowered launch costs by a factor of three. Of course it's not guaranteed to work, but clearly SpaceX's orbital datacenter plans are predicated on Starship working.
SpaceX created reusable rockets that can fly back to the launch platforms and land gracefully. Hard to blame people for becoming fans. Before them, stuff like this existed only in Kerbal and sci-fi.
Accepting everything they then do, forever, even when it's obviously nonsense, is what gets you called a "huge batshit crazy fanbase of boot lickers".
This "idea" is great party conversation. It's probably doing a great job of shoving around the Overton window, too (perhaps the real goal here?). It's, uh, not realistic, and anyone who is seriously "all in" on it (you're allowed to consider it and to dream, that's not the same as being all in) is not worth taking seriously no matter how much of the oxygen in the room they're using up.
If you're in the US you can get a real cell phone number with VoIP and SMS that works without a phone for $20/mo with Google Fi. You'd need a phone to set it up but after that you could just turn it off and still use VoIP and SMS from any web browser.
There are BYOD prepaid providers that are even cheaper than that. The lowest you can get is Ultra Mobile's $3.50/month plan, but you need to jump through some hoops to get it working, like getting a physical SIM in person. Tello is $5/month and you can activate online.
I like this metric for service security: which service is the most expensive to buy verification on? So far the best one I've found is Telegram at 166 verifications per $100, and the worst is Discord at 5,044 per $100.
Adding on to this one since it was the only link to the map data. There's some other supplemental data available. The supplemental PDF [1] has a bunch of the vendor names and there's a Google Docs sheet that has the list of vendors and availability per area. [2]