
Rich, coming from a site that loads 4 trackers.

... and is built with Next.js, including no fewer than 12 enormous x-font-woff2 chunks of data at the top of the source code and another big __NEXT_DATA__ JSON chunk at the bottom. Hardly lean, vanilla HTML and CSS.

> Congratulations! You're now using Hinge to distribute unassuming abstract expressionist pixel art.

should've spun up a few more AI agents


is that eagle head following the cursor? that's hilarious!


Ummm, First Amendment? It's not the first time misinformation has been broadcast on air, so why does the FCC need to get involved in this one? Would they have gotten involved if the implication was that he was a liberal?


They asked what was controversial about what he said, not whether the FCC's actions were constitutional.


>They asked what was controversial about what he said, not whether the FCC's actions were constitutional.

The former is (for some at least) interesting. The latter is actually consequential. I'm concerned about the latter.

The former, whether I agree or not, is about legal, protected political speech.


I don't see the FCC cancelling news shows on which Trump lies. Double standards driven by politics are exactly why government orgs need career staff and not political players. Rule of law, anyone?


But even buildings need to be maintained.


Automation costs a lot. The projects I work on are almost always in the millions of dollars, and they're far from being considered "big" projects. The hardware manufacturers will sell you equipment that runs for thirty years. Companies are reluctant to replace working systems.

I replaced a PLC a couple years ago. The software to program it wouldn't run on my laptop because it used the Win16 API. It used LL-984 ladder logic, and most people who were experts in that have retired. It's got shiny new IEC-compliant code now, and next they're looking at replacing the Windows 2000 machines they control it with. Once that's done, it'll run with little to no change until probably 2050.


In a lot of "technical" situations, people tend to opt for the well-established English counterparts for nouns or concepts. E.g., even a native Hindi speaker will use कंप्यूटर / computer over संगणक / Sanganak.


I never really liked the syntax of fetch: the need to await response.json(), plus implementing additional error handling -

  import axios from 'axios'; // axios needs an import; fetch is built in
  async function fetchDataWithAxios() {
    try {
      const response = await axios.get('https://jsonplaceholder.typicode.com/posts/1');
      console.log('Axios Data:', response.data);
    } catch (error) {
      console.error('Axios Error:', error);
    }
  }



  async function fetchDataWithFetch() {
    try {
      const response = await fetch('https://jsonplaceholder.typicode.com/posts/1');

      if (!response.ok) { // Check if the HTTP status is in the 200-299 range
        throw new Error(`HTTP error! status: ${response.status}`);
      }

      const data = await response.json(); // Parse the JSON response
      console.log('Fetch Data:', data);
    } catch (error) {
      console.error('Fetch Error:', error);
    }
  }


While true, in practice you'd only write this code once as a utility function; compare two extra bits of code in your own utility function vs loading 36 kB worth of JS.
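A minimal sketch of such a utility (fetchJson is just an illustrative name):

  async function fetchJson(url, init) {
    const response = await fetch(url, init);
    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    return response.json();
  }

Every call site then collapses to `const data = await fetchJson(url)`.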


Yeah, that's the classic bundle size vs DX trade-off. Fetch definitely requires more boilerplate; the manual response.ok check and double await are annoying. For Lambda, where I'm optimizing for cold starts, I'll deal with it, but for regular app dev where bundle size matters less, axios's cleaner API probably wins for me.


Agreed, but I think that in every project I've done I've put at least a minimal wrapper function around axios or fetch - so adding a teeny bit more to make fetch nicer feels like tomayto-tomahto to me.


You’re shooting yourself in the foot if you put naked fetch calls all over the place in your own client SDK, though. Or at least going to extra trouble for no benefit.


Why don't they just set those as options?

{ throwNotOk, parseJson }

They know that's 99% of fetch calls; I don't see why it can't be baked in.
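A userland sketch of the idea; the option names come from the comment above and are hypothetical, not part of the real fetch API:

  // hypothetical options, not part of the standard fetch API
  async function fetchWithOpts(url, { throwNotOk = true, parseJson = true, ...init } = {}) {
    const response = await fetch(url, init);
    if (throwNotOk && !response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    return parseJson ? response.json() : response;
  }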


I somehow don't get your point.

The following seems cleaner than either of your examples. But I'm sure I've missed the point.

  fetch(url).then(r=>r.ok ? r.json() : Promise.reject(r.status))
  .then(
    j=>console.log('Fetch Data:', j),
    e=>console.log('Fetch Error:', e)
  );

I share this at the risk of embarrassing myself in the hope of being educated.


Depends on your definition of clean; I consider this "clever" code, which is harder to read at a glance.

You'd probably put the code that runs the request in a utility function, so the call site would be `await myFetchFunction(params)`, as simple as it gets. Since it's hidden, there's no need for the implementation of myFetchFunction to be super clever or compact; prefer readability and don't be afraid of code length.


Except you might want different error handling for different error codes. For example, our validation errors return a JSON object as well, but with a 422 status.

So treating "get a response" and "get data from a response" separately works out well for us.
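For illustration, a rough sketch of that separation (the names and the 422 handling are assumptions about our setup, not a general rule):

  class ValidationError extends Error {
    constructor(details) {
      super('validation failed');
      this.details = details; // parsed JSON body from the 422 response
    }
  }
  async function fetchJsonChecked(url, init) {
    const response = await fetch(url, init);
    if (response.status === 422) {
      // validation errors carry a JSON body too, so parse it before throwing
      throw new ValidationError(await response.json());
    }
    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    return response.json();
  }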


I usually write it like:

    const data = await (fetch(url).then(r => r.json()))

But it's very easy obviously to wrap the syntax into whatever ergonomics you like.


You don't need all those parens:

  await fetch(url).then(r => r.json())


why not?

    const data = await (await fetch(url)).json()


That's very concise. Still, the double await remains weird. Why is that necessary?


The first `await` is waiting for the response headers to arrive, so you know the status code and can decide what to do next. The second `await` is waiting for the full body to arrive (and get parsed as JSON).

It's designed that way to support doing things other than buffering the whole body; you might choose to stream it, close the connection early etc. But it comes at the cost of awkward double-awaiting for the common case (always load the whole body and then decide what happens next).
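For example, a rough sketch of the streaming case (the size cap and names are made up):

  async function readCapped(url, cap = 1_000_000) {
    const response = await fetch(url); // first await: headers only
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    // consume the body incrementally instead of awaiting response.json()
    const reader = response.body.getReader();
    const chunks = [];
    let received = 0;
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      received += value.length;
      if (received > cap) {
        await reader.cancel(); // close the connection early
        throw new Error('response body too large');
      }
      chunks.push(value);
    }
    return chunks; // Uint8Array chunks; decode/parse as needed
  }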


So you can say:

    let r = await fetch(...);
    if(!r.ok) ...
    let len = r.headers.get("Content-Length");
    if(!len || Number(len) > 1000 * 1000)
        throw new Error("Eek!");


It isn't; the following works fine...

    var data = await fetch(url).then(r => r.json());

Understanding Promises/A+ (thenables) and async/await can sometimes be difficult or confusing, especially when mixing the two like above.


Same thing. Maybe this doesn't make the double promise quite as visible, but it's still a double promise. You could probably replace the other await with a .then() too.


In my understanding, it's because you don't necessarily want the response body. The first promise resolves after the headers are received; the .json() promise resolves only after the full body is received (and JSON.parse'd, but that's sync anyway).


Honestly it feels like yak shaving at this point; few people would write low-level code like this very often. If you connect to one API, chances are all responses are JSON, so you'd have a utility function for all requests to that API.

Code doesn't need to be concise, it needs to be clear. Especially back-end code where code size isn't as important as on the web. It's still somewhat important if you run things on a serverless platform, but it's more important then to manage your dependencies than your own LOC count.


Perhaps we could create a sort of bracket that scales based on income?


TBH, even in the SF Bay Area, "tech capital of the world," you'll find areas with spotty reception.

https://www.reddit.com/r/bayarea/comments/1cqhr4i/what_is_up...


Crazy hills will do in wireless anywhere. In a rural area, I am maybe a mile from the tower as the neutrino flies, but a photon can’t make it. I have a bar of 5G in the pasture in front of the house, but in the house it promises a bar of LTE, and whether I can get out a text (w/o WiFi) depends on atmospheric conditions.

Out in Cali they have ring of fire conditions.

Too much crowding can do it too. A decade ago I was going to Cali a lot and thought it was funny to see so many ads for geospatial apps on TV that always showed a map of San Francisco when for me SF was where GIS went to die. Taking my Garmin handheld to the roof of the Moscone center because it was the only place it could get a clear view of the sky to sync up with GPS, so many twisty little streets that routing algorithms would struggle…. Being guided around to closed restaurant after closed restaurant and walking past 10+ open ones because a co-worker was using Yelp, etc.


Back around 2001 I visited South Carolina, and it was like being transported to the future of mobile internet. They had some kind of high-bandwidth cellular setup in the area that was far ahead of the rest of the country at the time; I think I recall it being around 20 Mbit wireless. I was told the area was a testing ground for new tech. I was kind of shocked that somewhere that seemed so stuck in the past had such cutting-edge tech deployed. I thought: why is this not in SF?

