"Designing AI for Disruptive Science" is a bit market-ey, but "AI Risks 'Hypernormal' Science" is just a trimmed section heading "Current AI Training Risks Hypernormal Science".
Ooh, that's a worthy challenge. Of course, I can imagine getting enough data on all of those cities and deciding to launch everywhere else but not Boston "because your roads are garbage and you all drive like you're impaired 24/7" :-)
That's not how you should measure "worth". In that world, you'd have a P/E ratio of 1. Comparing to a bond, it would be like expecting to get paid the face amount in a single year. Many people are quite happy with 5-10% interest as a risky benchmark, so 10-20 P/E isn't wild. That puts the market cap for tech itself at 10-20T as a reasonable baseline.
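The bond comparison is just the inverse relationship between P/E and earnings yield; a throwaway calculation (hypothetical figures):

```python
# earnings yield is the inverse of the P/E ratio
for pe in (1, 10, 20):
    print(f"P/E {pe} -> {1 / pe:.0%} earnings yield")
# P/E 1  -> 100% (paid back in a year)
# P/E 10 -> 10%
# P/E 20 -> 5%
```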
I feel vindicated :). We put in a lot of effort with great customers to get nested virtualization running well on GCE years ago, and I'm glad to hear AWS is coming around.
You can tell people to just do something else, there's probably a separate natural solution, etc., but sometimes you're willing to sacrifice some peak performance just to have that uniformity of operations and control.
This isn't strictly correct: you probably mean wrt compressed size. Compression is a tradeoff between size reduction and compression/decompression speed. So while things like Bellard's ts_zip (https://bellard.org/ts_zip/) or nncp compress really well, they are extremely slow compared to, say, zstd or the much faster compression scheme in the article. It's a totally different class of codec.
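A quick illustration of that size/speed tradeoff using Python's zlib compression levels (timings will vary by machine; this is a toy benchmark, not a claim about ts_zip or zstd specifically):

```python
import time
import zlib

# ~100 KB of repetitive text as toy input
data = b"the quick brown fox jumps over the lazy dog " * 2500

for level in (1, 6, 9):
    t0 = time.perf_counter()
    out = zlib.compress(data, level)
    dt = time.perf_counter() - t0
    print(f"level {level}: {len(out)} bytes in {dt * 1e3:.2f} ms")
```

Higher levels spend more CPU time searching for matches to shave off bytes; neural codecs push that tradeoff to an extreme.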
An LLM can be used to losslessly compress a string to a size roughly equal to the number of bits of entropy of its next-token prediction loss over the string, by encoding those bits with arithmetic coding. It's SOTA compression for the distribution of strings found on the internet.
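A toy sketch of that size bound: the ideal arithmetic-coded length is the sum of -log2 p(token) under the model. Here a unigram frequency table stands in for the LLM (a hypothetical stand-in, just to show the accounting):

```python
import math
from collections import Counter

def ideal_compressed_bits(tokens, predict):
    """Sum of -log2 p(next token): the arithmetic-coding size bound."""
    bits = 0.0
    for i, tok in enumerate(tokens):
        p = predict(tokens[:i])[tok]
        bits += -math.log2(p)
    return bits

text = "abababab"
counts = Counter(text)
total = sum(counts.values())

def unigram(_context):
    # context-free stand-in for a real next-token model
    return {c: counts[c] / total for c in counts}

print(ideal_compressed_bits(text, unigram))  # 8 tokens at 1 bit each -> 8.0
```

A better model assigns higher probability to each next token, so the sum shrinks; with a strong LLM the bound gets very tight for internet-like text.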
When you say "fixed this", which "this" do you think they fixed? Are you imagining this is a hash table? It's not.
It's an adaptor which uses two other containers (typically std::vector) to manage the sorted keys and their associated values. The keys are kept sorted, and each value is stored at the corresponding position in its own separate std::vector. If you already have sorted data, or close enough, this type can be created almost for free yet has similar affordances to std::map - if you don't, it's likely you will find the performance unacceptable.
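A minimal Python sketch of that two-parallel-array layout (the class name and methods here are hypothetical illustrations, not the C++ std::flat_map interface):

```python
import bisect

class FlatMap:
    """Two parallel sorted arrays, like the flat_map adaptor described above."""

    def __init__(self, items=()):
        pairs = sorted(items)  # nearly free if input is already sorted
        self.keys = [k for k, _ in pairs]
        self.values = [v for _, v in pairs]

    def __getitem__(self, key):
        # O(log n) binary search over the contiguous key array
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            return self.values[i]
        raise KeyError(key)

    def insert(self, key, value):
        # O(n) element shift: this is why unsorted bulk inserts perform poorly
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            self.values[i] = value
        else:
            self.keys.insert(i, key)
            self.values.insert(i, value)

m = FlatMap([("b", 2), ("a", 1)])
m.insert("c", 3)
print(m["c"])  # 3
```

The contiguous storage is the whole point: lookups are cache-friendly, but every out-of-order insert pays for shifting both arrays.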
Every lift at Snowbird has the map printed on the bar. So you can plan your route on the way up. I agree that when you get lost, that map won't save you, but I think an offline PDF is also fine.
Absolutely not true, lol. I don't think any of their lifts have maps on them right now. The maps also aren't super helpful at Snowbird because the cliffs often come out of nowhere.
Wait, really? I haven't been up this season, but it's always been there! I understand removing the printed ones when the bars have them (and the big boards at the top). Is it all just ads now?
None of the bars have anything printed on them now, if I remember correctly. I have a pass and have been around 10 times this season. At least they all have footrests, unlike Alta, where they love foot pain.
"Designing AI for Disruptive Science" is a bit market-ey, but "AI Risks 'Hypernormal' Science" is just a trimmed section heading "Current AI Training Risks Hypernormal Science".
reply