warvstar's comments | Hacker News

Thanks for the detailed report! We have recently received similar reports on M1 and M2 chips and we are going to investigate further.

Also, you're right about the loading screen for unsupported browsers; that's something we will add ASAP. Thanks again!


Any chance of multi agents and/or a VS Code extension (for diffing / applying changes)? Nvm on the VS Code extension, I just found your other comment.


Could you explain what you mean by "multi agents"?


We have the ability to stream in content from our CDN, so even devices with little to no storage capacity can run games of near-infinite size. The main issue in today's browsers is the 4 GB RAM limit, although many games can run within that limit. We have MEMORY64 support as well, which removes that limit, but it will probably be a few months before some browsers enable MEMORY64 without a flag.
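
If anyone wants to probe for it, here's a rough sketch of Memory64 feature detection: ask WebAssembly.validate to accept a tiny module that declares a 64-bit memory. This is the generic feature-detection trick, not our actual loader code:

    // Probe module: declares one 64-bit-indexed memory. Browsers without
    // Memory64 reject the 0x04 limits flag and validate() returns false.
    const MEM64_PROBE = new Uint8Array([
      0x00, 0x61, 0x73, 0x6d, // "\0asm" magic
      0x01, 0x00, 0x00, 0x00, // module version 1
      0x05, 0x03, 0x01,       // memory section, 3 bytes, 1 entry
      0x04, 0x00,             // limits: flag 0x04 = 64-bit index, min 0 pages
    ]);

    function supportsMemory64(): boolean {
      return WebAssembly.validate(MEM64_PROBE);
    }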


Streaming solves some of the problems, but the browser purging caches behind your back means you'd presumably have to serve the same data to the same user many, many times over the course of a playthrough. Even if you get a good deal on bandwidth, is that economical? And how fast a connection does the user need to keep up with streaming high-quality assets?

Games are already pushing 100 GB when you download them up front; with redundant streaming it's not hard to imagine that piling up to over a TB of bandwidth for one user.


It can be, we have our data cached as close to the user as possible, in over 300 locations. It does add some latency, because instead of a 3ms latency to fetch and decompress assets, it now might be 17ms. However, because we also have a memory cache, this can reduce that latency significantly, and we use that for as many small and recently accessed assets as possible. Our virtual fs is multithreaded and works in tandem with Unreals async loading threads, so we are able to fetch multiple assets at a time, reduce wait tune. We also have the ability to know what assets are commonly fetched in a certain period and to fetch those ahead of time.
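
To make the layering concrete, here's a rough sketch of the lookup order I'm describing (illustrative names, not our actual code): RAM first, then the browser's Cache API, then the CDN:

    // Hot in-memory cache in front of the persistent Cache API,
    // with the CDN edge as the final fallback.
    const memoryCache = new Map<string, ArrayBuffer>();

    async function fetchAsset(url: string): Promise<ArrayBuffer> {
      const hot = memoryCache.get(url);
      if (hot) return hot; // hot path: small, recently used assets in RAM

      const cache = await caches.open("game-assets-v1");
      let response = await cache.match(url);
      if (!response) {
        // Cold path: pull from the CDN and persist a copy locally.
        response = await fetch(url);
        await cache.put(url, response.clone());
      }

      const data = await response.arrayBuffer();
      memoryCache.set(url, data); // real code would bound this (e.g. LRU)
      return data;
    }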

Of course, if you combine bad networks, little storage capacity, and large projects, you can be sitting around a while, or may not have the best experience. Keep in mind, though, that browsers don't usually evict data from the cache unless you've used up the storage quota, the system is under storage pressure, or the origin hasn't been accessed in a while, according to the browser vendors.
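
The standard Storage API also lets a page check its quota and ask not to be evicted; a rough sketch (this is the general browser mechanism, not anything specific to our platform):

    // Check quota usage and request persistence via the Storage API.
    async function reportStorageHealth(): Promise<void> {
      const { usage = 0, quota = 0 } = await navigator.storage.estimate();
      console.log(`using ${Math.round(usage / 2 ** 20)} MiB of ~${Math.round(quota / 2 ** 20)} MiB quota`);

      // If granted, this origin's data is exempt from automatic eviction
      // under storage pressure (browsers may gate this on engagement).
      const persisted = await navigator.storage.persist();
      console.log(persisted ? "storage is persistent" : "best-effort storage only");
    }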


With severe streaming pop-in and LOD issues, though. UE5 games are already being designed around the assumption of having an SSD to pull assets from, and they have major visible issues when using a hard drive instead, which is still a good order of magnitude faster than most internet connections.


That's actually the main thing we've worked on; WebGPU support is small potatoes compared to it, but I'm happy it's getting some attention here. We have an asset streaming system that lets you stream in large or near-infinite worlds at runtime; of course, you still have to be smart about memory usage. We also have a server-side streaming solution that can be used as a fallback for constrained devices.


Not from the official source, but we (Wonder Interactive Inc.) have a custom version of the engine that compiles to wasm. We are looking to upstream to Epic, or at least have a plugin that doesn't require custom source.


I love that you folks are trying to get this upstreamed; it would benefit everybody.

IIUC, the reason HTML5 support was dropped in the first place was the big rendering rework that went on during 4.27 and 5.0. So now seems like a good time to add modern WebGPU support back.

I do have to ask: isn't your business model for "theimmersiveweb" built around this WebGPU capability? Providing that tech to others, or making your site a marketplace for web games?


> We are looking to upstream to Epic

What's the benefit to your company in this relationship?


Same as with all open source: other people can help maintain it, build/extend the functionality, ensure compatibility with other parts of the engine, etc.


Unreal isn't open source.


Source-available, though. The point is the same.


No downloads and the ability to easily and quickly jump into the action, especially useful for multiplayer or social games.


You're maybe not saving the game to disk, but you still have to download it, at least the assets you're going to interact with. And if you want to play again, you keep the game/assets in a browser cache somewhere, so there's not much difference from downloading and installing the game in terms of bandwidth/disk space.


Of course, but from the user's perspective this is all transparent. The parent comment is talking from the UX perspective.


100% agree, and when WebGPU comes out, that's going to reduce the performance difference by a ton.
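
For reference, once browsers ship it, getting a WebGPU device is only a couple of calls (standard API sketch; the canvas setup here is just illustrative):

    // Minimal WebGPU setup; feature-detect first, since at the time of
    // this thread browsers still kept it behind flags.
    async function initWebGPU(canvas: HTMLCanvasElement): Promise<GPUDevice | null> {
      if (!navigator.gpu) return null; // WebGPU not exposed in this browser

      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) return null;
      const device = await adapter.requestDevice();

      const context = canvas.getContext("webgpu") as GPUCanvasContext;
      context.configure({ device, format: navigator.gpu.getPreferredCanvasFormat() });
      return device;
    }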


We have a gaming platform where users can download and play full games in the browser. We don't use IndexedDB, though; we use the Cache API.
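
A rough sketch of what "downloading" a game into the Cache API can look like (the manifest URL and cache name are illustrative, not our actual setup):

    // Fetch an asset manifest, then pull every listed asset into the
    // persistent Cache API so the game can be replayed without re-downloading.
    async function downloadGame(manifestUrl: string): Promise<void> {
      const assetUrls: string[] = await (await fetch(manifestUrl)).json();
      const cache = await caches.open("downloaded-games-v1");
      await cache.addAll(assetUrls); // fetches and stores each asset
    }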


I assume that also includes the Cache API's Storage?


This makes me feel old, thanks for that.


Haha, sorry friend, but still, memories of cranking up that 56k and logging on to Active Worlds night after night. Well, everything is illuminated I guess.

