aruametello's comments

(VR enthusiast here, mostly under windows)

Intel support has been mild to non-existent in the VR space, unfortunately. Given the very finicky latency and engine support requirements, I wouldn't bet on a great experience, but I hope for the best; more competition in this market is welcome. (Even AMD has a lot of caveats compared to Nvidia.)

Footnotes:

* Critical "as low as it can be" latency support on Intel Xe is still not as mature as Nvidia's; AMD was lagging behind until recently.

* Not sure about "multiprojection" rendering support on Intel; lacking it can kill VR performance or make games incompatible, since optimized VR titles often rely on it. (See the sketch below.)
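
Quick illustration of why multiprojection (a.k.a. multiview / single-pass stereo) matters: it lets the engine cull and submit the scene once per frame instead of once per eye. A minimal toy sketch, not tied to any real engine or graphics API:

    # Toy model: without multiview, the CPU re-runs the whole render loop
    # once per eye, every frame; with it, draws are submitted once and the
    # GPU broadcasts them to both eye views.
    def stereo_two_pass(objects, eyes):
        submissions = 0
        for _eye in eyes:            # full loop repeated per eye
            for _obj in objects:
                submissions += 1     # one draw call per object per eye
        return submissions

    def stereo_multiview(objects, eyes):
        submissions = 0
        for _obj in objects:         # single pass over the scene
            submissions += 1
        return submissions

    scene = list(range(5000))
    print(stereo_two_pass(scene, ("left", "right")))   # 10000
    print(stereo_multiview(scene, ("left", "right")))  # 5000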


It looked like when Intel jumped into this space, they tried to do everything at once. It didn't work well; they were playing catch-up to some very mature systems. They are now being much more selective and restrained. The downside is that things like VR support get put on the back burner for years.

Good for most people, but if you need that functionality and they don't have it, go somewhere else.


Post-traumatic "Nvidia TurboCache" disorder triggered.

https://en.wikipedia.org/wiki/TurboCache

(Not the same thing 1:1, but worth the joke anyway)


(not a Teardown dev)

I had brainstormed a similar problem a bit (non-world-aligned voxel "dynamic debris" in a destructible environment). One of the ideas that came up was to use a physics solver like the Nvidia FleX SDK.

https://developer.nvidia.com/flex

* 12 years old, but it still runs on modern GPUs and is quite interesting in itself as a demo.

* If you run it, consider turning on the "debug view"; it will show the collision primitives instead of the shapes.

General-purpose physics engine solvers aren't that GPU-friendly, but if the only primitive shape being simulated is the sphere (cubes are made of a few small spheres; everything is a bunch of spheres), the efficiency of the simulation improves quite a bit: there's no need for conditional treatment of collision pairs like sphere+cube, cube+cylinder, cylinder+sphere and so on.

I wondered if it could be solved by having a single sphere per voxel, considering only the voxels at the surface of the physically simulated object.
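
Roughly what makes the sphere-only case so GPU-friendly, as a minimal sketch (plain Python rather than the FleX API; the positions and radii are made-up test data):

    import math

    # Sphere vs. sphere is the same few operations for every pair, so the
    # solver never has to branch on shape type.
    def spheres_overlap(p1, r1, p2, r2):
        dx, dy, dz = p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2]
        return math.sqrt(dx * dx + dy * dy + dz * dz) < r1 + r2

    # One sphere per surface voxel of two "debris" chunks (illustrative data).
    voxel_radius = 0.5
    chunk_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
    chunk_b = [(1.4, 0.3, 0.0)]

    hits = [(a, b) for a in chunk_a for b in chunk_b
            if spheres_overlap(a, voxel_radius, b, voxel_radius)]
    print(hits)  # [((1.0, 0.0, 0.0), (1.4, 0.3, 0.0))]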


From what I've seen with "low-end" SSDs (like the 120 GB SATA SanDisk ones) under Windows with heavy, near-constant paging loads, they exceed their manufacturer lifetime TBW by quite a lot before actually starting to produce filesystem errors.

I can see how this could be a weaker spot in the durability of this device, but it could certainly still take a few years of abuse before anything breaks.
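
As a back-of-the-envelope feel for the numbers (the TBW rating and daily write volume below are made-up round figures, not the specs of any particular drive):

    # Illustrative endurance estimate: rated TBW divided by daily writes.
    rated_tbw_gb = 80 * 1000    # e.g. a drive rated for 80 TBW
    daily_writes_gb = 40        # heavy paging workload, GB written per day

    days = rated_tbw_gb / daily_writes_gb
    print(f"{days:.0f} days ~= {days / 365:.1f} years to reach rated TBW")
    # 2000 days ~= 5.5 years, and drives in endurance tests often keep
    # going well past their rating.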

An outdated study (2015), but in line with the "low-end SSDs" I mentioned:

https://techreport.com/review/the-ssd-endurance-experiment-t...


it seems to be bad at spatial and some temporal tasks given it currently f*** s**'s at pokemon.

source: https://www.twitch.tv/claudeplayspokemon


You're allowed to say "fucking sucks" on Hacker News. It's not against the rules, and there's no "algorithm" that will penalize you.


Glad to know. I am rather new here and somewhat used to the "don't do the usual forbidden stuff" rules.


"fuck sex's"?


that's silly. obviously there's a missing apostrophe:

"it's currently Flan Sam's at pokemon"


You may have just cast a curse on our future motherboards, damn you.


... did they mine their own minerals?

This could go into Sagan's "If you wish to make an apple pie from scratch, you must first invent the universe."


Like any self-respecting baker, I have a cabinet full of universes which produce various pies.


And ideally use your own philosophy, concepts and language when engineering everything. No English language, Latin alphabet or Arabic numerals!


> ... gaping demon butthole

For someone bad at naming things, that gives me an idea! A piece of software named gdb?


boooo! :)


To dial up the weirdness: solar flare activity sometimes spikes (https://www.spaceweatherlive.com/en/solar-activity/solar-fla...), and these spikes have a mild relationship with the odds of "bit flips" in that timeframe.

We had some "historically bad solar weather" a bunch of years ago, and I told a cyber cafe operator that "you could have more computers bluescreen this week than usual".

It got really weird for me when he later said he really did, but honestly it's 50/50 that it could have just been coincidental.

On another note, there are some "rather intense" discussions when someone speedrunning a game gets an "unreproducible glitch" in their favor: some claim it's a flaw from ageing DRAM hardware, but some always point out that it could be a cosmic ray flipping just the right bit. (https://tildes.net/~games/1eqq/the_biggest_myth_in_speedrunn...)


cutting edge perhaps?


"Bleeding edge" is an established English idiom, especially in technology: https://www.merriam-webster.com/dictionary/bleeding%20edge

