Hacker News | mjpuser's comments

I dunno, I thought it was well done. This is an entertaining trend on social media, and they applied it to their product well.


I think programmers are quick to jump to concurrency when they want to make something performant, as opposed to using other structures, like a cache, or researching other possible solutions, like the HN-favorite bloom filter or some other algorithm.
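For anyone who hasn't run into one: a bloom filter is a tiny bit-set structure that answers "have I definitely never seen this?" without storing the items. This is just a sketch (not from the thread) of a minimal version in Go; the sizes and key names are arbitrary examples.

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// bloom is a minimal Bloom filter: k hash probes into a fixed bit set.
// False positives are possible; false negatives are not.
type bloom struct {
	bits []uint64
	m    uint32 // number of bits
	k    uint32 // number of probes per key
}

func newBloom(m, k uint32) *bloom {
	return &bloom{bits: make([]uint64, (m+63)/64), m: m, k: k}
}

// probes derives k bit indices from one FNV-64 hash via double hashing.
func (b *bloom) probes(s string) []uint32 {
	h := fnv.New64a()
	h.Write([]byte(s))
	sum := h.Sum64()
	h1, h2 := uint32(sum), uint32(sum>>32)
	idx := make([]uint32, b.k)
	for i := uint32(0); i < b.k; i++ {
		idx[i] = (h1 + i*h2) % b.m
	}
	return idx
}

func (b *bloom) Add(s string) {
	for _, i := range b.probes(s) {
		b.bits[i/64] |= 1 << (i % 64)
	}
}

func (b *bloom) MaybeContains(s string) bool {
	for _, i := range b.probes(s) {
		if b.bits[i/64]&(1<<(i%64)) == 0 {
			return false // definitely never added
		}
	}
	return true // probably added (small false-positive chance)
}

func main() {
	seen := newBloom(1<<16, 4) // 64K bits, 4 probes: arbitrary example sizes
	seen.Add("user:42")
	fmt.Println(seen.MaybeContains("user:42")) // true
}
```

The point of reaching for this before goroutines: one cheap membership check in front of an expensive lookup often buys more than parallelizing the expensive lookup.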


Nice intro to timeouts and context. Next step would be dealing with state changes that happen in a cancelled request.


We also have a mantra against optimizing until you know you need to. It seems too cost- and time-prohibitive to put these things on the programmer to maintain; we need to develop tools to help optimize our code. Maybe the next generation of optimization techniques will happen at runtime instead of compile time. We already have databases with query optimizers, so maybe there will be programming languages with optimizers?


I would say reddit is a forum, and you can have good discourse there, but it can also be straight-up internet trash. Basically I think it's more nuanced than just having a forum...


Reddit is probably the closest thing to a forum. I guess having a separate domain on a site owned by individuals, rather than one run by a big company, allows for a more intimate environment.


I don't really find reddit comparable to forums, mainly because reddit is also built around recent comments. It has very rudimentary search functionality, and the way submissions and comments are scored discourages long conversations. Unless you're on a really small subreddit, stuff falls off the first page quickly, and your comment will be far less visible if posted half a day later. This is completely different from forums, which optimize for conversations that last several days to a couple of weeks, sometimes much longer.


Exactly. Forums optimize for quality content, and that makes them hard to monetize, because once you've found the good stuff you're not going to click on ads.

Also, mobile, which is not ideal for long conversations, helped their demise.

Still, I hope they come back big time. I miss them.


The forums I hung out on simply sorted by the last comment, or by votes. They certainly didn't "optimize for quality content" in any particular way, other than banning trolls (sometimes.)

Also, plenty of good forums had ads. I don't think refusing to click on ads because the content is high quality is a common impulse; I would guess the opposite tends to be true.


I probably have too romantic a memory of them :)

Anyway, it was easy to come back to a specific topic, so it was easier to find good ones. You could bookmark them, see the number of views/replies or even find them in search engines.

Good content surfaced and survived better, so optimization is probably the wrong word.


* College will become less common, and focus on vocational jobs will increase.

* Autonomous cars will be available by the end of the decade

* First person will land on Mars

* 20-40% of wild animal populations will be gone

* Lab-grown meats will be introduced and become the norm as meat prices skyrocket

* Taxes in the US will increase to 40% to "target climate change" even though those funds will be poorly handled

* Starlink becomes operational, disrupting mobile carriers

* A new phone-number system will be created as all calls go through the internet

* Precision medicine for cancer care will improve but remain too expensive


This is a compelling article, but when you try to see what the product is, you have to submit your email and request a demo. In my experience, hiding your product behind a demo is a bad sign that the product has flaws. It also puts you in contact with a salesperson or account manager, which typically leads to a high-pressure sales pitch. I'd be interested to hear other people's experiences with this model, though.


Also, for certain products it's almost impossible to sell at scale without an enterprise sales team. Developers won't be "willing" to pay for, or admit the need for, what's basically a glorified web scraper, a concept that has existed for as long as the web itself. For this sort of product to be successful you need enterprise sales, tons of funding, and marketing, among other things. Even if your actual product is terrible, you will still capture a larger segment of the market than a competitor whose product may be more advanced functionality-wise. This view isn't popular on HN because of the persistent myth that if you build a well-engineered product, customers will magically show up.


Sometimes with products like this one, the value provided to the customer varies significantly depending on the use case, and it's hard to publish pricing without either cutting out the long tail of smaller users or losing money on the big deals.

I have an API that market-research and medical customers are willing to pay 100x more for than consumer social companies will; at the same time, the consumer companies have 100x the volume.


Find a way to segment that into two markets based on features or volume. Don't try to do it by making your customers take a phone call before telling them anything about the product. That's a move that even Oracle sales would think twice about...


I think Diffbot looks complex enough that you might buy a license and 100 hours of consulting time for your first MVP.

For such products, having a meeting where the product and its benefits can be shown, followed by a business-case planning session, is a reasonable way of selling the service.


Here's some more on Diffbot: they've built basically a knowledge-graph search engine. I'm guessing PoCs are the way to sell this. https://www.forbes.com/sites/jilliandonfro/2019/05/13/the-we...


I think there can very much be two segments. Both a high-volume self-service segment (I know this is what I prefer when evaluating developer tools) as well as a high-touch enterprise segment for training and implementation (think Bloomberg Terminal model). Diffbot has a free 14 day self-service evaluation for its individual extraction API, but it is not quite implemented yet for the Knowledge Graph.


DiffBot - sounds like it alerts me when a page changes.


Not sure why HN is saying it's obvious that assembly is relevant because of compilers... The article's intent is about a programmer writing assembly. I'm sure there are niches, but I can see web developers getting through their professional careers without ever writing assembly.


Ok, we've put programming in the title above to make this distinction clearer. The author is not writing about computer-generated assembly language, such as compiler back ends.


Might see that flip in the coming years, if you consider web assembly (wasm) to be assembly. I do.


WebAssembly is not assembly in the ways that the article talks about. Like, writing it directly doesn’t give you any special control or guarantees over timing.


It's assembly against a virtual machine, not a physical one. You're right that it's not appropriate for an embedded system or some other real-time system, but assembly doesn't stop being assembly when you target a virtual machine.


It kinda does in this case. Don’t kid yourself. In real assembly, the really interesting part is how to use a finite register file. WebAssembly has an infinite slab of variables available, in the sense that you get to say how big it is. That fundamentally changes the game.


There are real CPUs that are just like that.

In fact, most mainframes have always made use of microcoded CPUs, with assembly being referred to as bytecode in the programming manuals.

You just need to dive into IBM and Xerox PARC manuals, for starters.


Yep. I feel like most of HN's readership's asm education begins and ends with their 6502 class at uni.


Sounds like you’re saying those machines executed bytecode.

Otherwise there isn’t a great limiting principle to your logic. Just because someone once built hardware that executes such a high level assembly that the manual referred to it as bytecode doesn’t mean that all bytecode formats are assembly.


Indeed I am, the interpreter is the microcoded CPU.

Even modern 80x86 assembly is a low-level form of bytecode, given that the micro-ops processed by the microcoded CPU are completely unrelated to 80x86 assembly opcodes.


Wasm is different than assembly, so I don't think so.


It targets a virtual machine, not a physical one, but other than that it's "assembly-like" enough that learning some core ASM coding practices will help you.


Not sure why you are sticking to your guns here. First, it's highly unlikely anyone would ever write wasm by hand. Second, the article rambles a bit but the most compelling argument for assembly is writing fast code for constrained hardware. At a high level, that's not what wasm is solving.


WebAssembly is a bytecode format.


Bytecode is just the instructions, which is actually a level lower than asm, but for a virtual machine instead of a physical one. It's a direct enough abstraction that you can predict the bytecode you'd build from the asm you write.


I think one contributing factor slowing down progress is the cost of electricity in Italy: it's €0.24/kWh vs. $0.06/kWh in the US (at least in PA). You don't see any electric cars in Italy, and they seem super expensive to people there. I'm also not sure whether there is a tax on top of that €0.24, from what I remember my cousin telling me; I'm hitting a language barrier confirming this.


Isn’t gas also more expensive? Most of Europe is like the equivalent of $6-8 a gallon IIRC.


Much closer to $0.25 in San Francisco. From my bill last month:

Generation charges: $22.33

Peak usage: 62 kWh @ $0.29672/kWh

Off-peak usage: 258 kWh @ $0.28243/kWh

Baseline credit: 246 kWh @ -$0.0832/kWh

Total usage: 320 kWh @ $0.2160/kWh


My dad pays $0.08 / kWh for electricity in Kentucky. San Francisco has very expensive electricity compared to much of the country. The average US electric rate is $0.12 / kWh for reference.


Kentucky is 75% coal powered, mostly using old existing coal plants.

As those plants age out of their usable lifetime, Kentucky utilities will have to choose between some combination of renewables and storage or natural gas or nuclear, all of which have higher costs than operating the defunct coal plants, but lower capital costs than building new coal plants. This will result in a rise in their prices over time.

Of course, comparing the future costs of new non-coal plants doesn't account for the huge existing public health and environmental costs of burning coal for electricity, which would be avoided by any of the other technologies.


Kentucky has been transitioning to natgas for some time now. The TVA shut down three big coal plants and another big utility also shut down a big coal plant last year. They’re not building new coal plants but natgas plants. That’s not going to change the economics much.

I’m hoping Kentucky starts to do more solar as it is pretty decent for sun, but the local politics likely think solar + stationary battery storage is some lib-ruhl conspiracy.


I think the real problem is that they didn't properly maintain their code. Rewriting it in Go won't prevent them from dealing with this again in a few years when that Go version reaches end of life. I would have liked to see an article on the "introducing process" side of programming.


Thanks for the comment! A couple of things about this...

1. The Go team is working very hard to ensure that there are no such compatibility issues. Code written for Go 1.0 should still compile with the Go 1.14 beta today.

2. It's possible there's more we could have done along the way, and I tend to think that statically typed languages make it easier to safely refactor more ruthlessly. But I do think we've actually made quite a bit of change incrementally along the way. Our move to React on the frontend and GraphQL on the backend has been a good example of that. Plus, we did a huge refactoring a couple of years ago to draw better boundaries in our monolith, and that has made a move to services possible.

