> You might fear that people will start arriving at 1:07pm, but I have seen the opposite. They respect the new time. They arrive by 1:05pm, ready to work.
We do the :05 thing and this is exactly what happens every meeting: all of them end up starting between :07 and :10 since people leave their desks to find the room at :05.
Maybe this is a lever that they now have to finally break free of their total dependence on Google. Get someone like Meta to pay them to be the default AI model / interface.
Wow, to be honest I hadn't thought of it that way, but that might be exactly what's on the horizon. Hard to survive on a search licensing deal in a world where LLMs threaten to eclipse search, but it may mean LLM vendors will want to compete to be in the browser space.
Serious question: does this thing actually make games run really great? Or is it so optimized for AI/ML workloads that it either doesn't work or runs normal video games poorly?
Also:
> I arrived at a farmhouse in a small forest…
Were you not worried you were going to get murdered?
It was fun when the seller told me to come and look in the back of his dirty white van, because "the servers are in here". This was before I had seen the workshop etc.
I believe these GPUs don't have direct HDMI/DisplayPort outputs, so at the very least it's tricky to even run a game on them. I guess you need to run the game in a VM or something?
Copying between GPUs is a thing; that's how integrated/discrete GPU switching works. So if the drivers provide full Vulkan support, then rendering on the NVIDIA GPU and copying the frames to another GPU with display outputs could work.
And it's an ARM CPU, so to run most games you need emulation (Wine + FEX), but Valve has been polishing that for their Steam Frame... so maybe?
People have gotten games to run on a DGX Spark, which is somewhat similar (GB10 instead of GH200).
I did a test by just spamming date in a terminal and capturing high-FPS video with my phone; the lag was usually under a frame (granted, at 60 fps, so 1/60 of a second).
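If anyone wants to repeat that test, here's a minimal Python sketch of the idea (the script name, output format, and measurement details are my own illustration, not from the test above): print a wall-clock timestamp as fast as the terminal will take it, film the display with a high-FPS phone camera, and read the lag off the video frames.

```python
# timestamp_spam.py -- sketch of the "spam date in a terminal" latency test.
# (Illustrative only; the original test just ran `date` in a loop.)
# Prints wall-clock timestamps with millisecond resolution as fast as the
# terminal will accept them. Film the screen with a high-FPS phone camera and
# compare the timestamp visible in each video frame against a reference clock
# to estimate end-to-end display latency.
import datetime
import sys

def main() -> None:
    try:
        while True:
            now = datetime.datetime.now()
            # Millisecond precision is enough to resolve sub-frame lag at 60 Hz.
            sys.stdout.write(now.strftime("%H:%M:%S.%f")[:-3] + "\n")
            sys.stdout.flush()
    except KeyboardInterrupt:
        pass

if __name__ == "__main__":
    main()
```

Run it on the machine whose output you're measuring, point the phone at both that display and a clock you trust, and the difference visible in any single video frame is roughly the latency.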
Ah, no, that's not what I mean. It's the input devices. Mainly the mouse pointer.
I now remember there was a way to work around it (a bit cumbersome and ugly), which was to render the mouse pointer only locally. That means no mouse cursor changes for tooltips/resizing/different pointers in games, etc. But at least it gets rid of the lag.
I think the point of negative returns for gaming is going above an RTX PRO 6000 Blackwell + AMD 9800X3D CPU + latency-optimized RAM + any decent NVMe drive. The RTX PRO 6000 seems to net only ~1.1x the performance of a normal 5090 in the same setup (and both can be overclocked about equally). Aside from what the GPU is optimized for, the ARM-based CPUs in these servers add more overhead for games (and break DRM), since games still assume x86 on Windows/Linux.
> does this thing actually make games run really great
It's an interesting question, and since OP indicates he previously had a 4090, he's qualified to reply and hopefully will. However, I suspect the GH200 won't turn out to run games much faster than a 5090 because A) Games aren't designed to exploit the increased capabilities of this hardware, and B) The GH200 drivers wouldn't be tuned for game performance. One of the biggest differences of datacenter AI GPUs is the sheer memory size, and there's little reason for a game to assume there's more than 16GB of video memory available.
More broadly, this is a question that, for the past couple decades, I'd have been very interested in. For a lot of years, looking at today's most esoteric, expensive state-of-the-art was the best way to predict what tomorrow's consumer desktop might be capable of. However, these days I'm surprised to find myself no longer fascinated by this. Having been riveted by the constant march of real-time computer graphics from the 90s to 2020 (including attending many Siggraph conferences in the 90s and 00s), I think we're now nearing the end of truly significant progress in consumer gaming graphics.
I do realize that's a controversial statement, and sure, there will always be a way to throw more polys, bigger textures and heavier algorithms at any game, but... each increasing increment just doesn't matter as much as it once did. For typical desktop and couch consumer gaming, the upgrade from 20fps to 60fps was a lot more meaningful to most people than 120fps to 360fps. With synthetic frame and pixel generation, increasing resolution beyond native 4K matters less. (Note: head-mounted AR/VR might be one of the few places 'moar pixels' really matters in the future.) Sure, it can look a bit sharper, a bit more varied, and the shadows can have more perfect ray-traced fall-off, but at this point piling on even more of those technically impressive feats of CGI doesn't make the game more fun to play, whether on a 75-inch TV at eight feet or a 34-inch monitor at two feet. As an old-school computer graphics guy, it's incredible to see real-time path tracing adding subtle colors to shadows from light reflections bouncing off colored walls. It's living in the sci-fi future we dreamed of at Siggraph '92. But as a gamer looking for some fun tonight, honestly... the improved visuals don't contribute much to the overall gameplay between a 3070, 4070 and 5070.
They do still have texture units, since sampling 2D and 3D grids is a useful primitive for all sorts of compute, but some other stuff is stripped back. They don't have ray-tracing or video-encoding units, for example.
Valve is the only company I'd let inject anti-cheat software directly into my veins if it meant I could play CS and be sure others were not cheating haha.
Are there four-year bootcamps? Bootcamps also have much less stringent entry requirements. A student who is capable of meritoriously enrolling in a decent school (e.g., Duke) is, I think, capable of becoming an above-average industry programmer after four years of instruction. Something is wrong with the instruction.
McKinsey and other consulting companies aren't really paid to consult so much as they are paid scapegoats. Management just needs someone to blame if something goes wrong. LLMs won't really ever replace them.
Not just to blame. They also sell credibility to a lot of managers and bosses.
I've experienced it often enough: upper management doesn't listen to its own employees.
Ultimately, a consultant comes in, talks to the employees, suggests the exact same thing to those same managers, and they love it.
Having that branding on the ppt slides sells ideas. If you're a project manager or department lead who needs to push through an idea but your boss won't go for it, try hiring a consultant who will sell it to your boss.
Serious questions: won't banks and ratings agencies simply treat this as Meta's debt, since it is effectively Meta's debt? What would change if it were on their "official balance sheet"? How does playing with the wording actually help Meta overall?
The crux of this article is that they won't treat Meta's debt as debt, because Meta intentionally structured it to circumvent the agencies' definition of "debt." Should they change their definition? Maybe, but what incentive do they have to do that, and is any formal definition bulletproof against circumvention?
What's very interesting to me is what happens when Meta doesn't exercise those lease options. If there isn't some kind of penalty for declining the option, well...
As in the 2008 crash, the ratings agencies are disincentivized from rating these vehicles accurately, both because the vehicles are superficially masked and because the agencies are paid by the very companies asking for the ratings.