fortytwo79's comments

And could you imagine the bokeh on a portrait?!


Isn't this already the case with marijuana? It's legal in one state, but if you travel with it and make a stop in a state where it's illegal, you can go to jail?

I don't know much about laws on that topic, but it seems to be a similar case to me.


I think it's better stated as:

* If you buy something legal in one state

* and you then travel to another state (where it's illegal) with that item

* then you have broken the law and can go to jail.

It's not doing something in state A and then traveling to state B. It's traveling to state B with something that's illegal in state B.

Does that make more sense?


Crossing state lines with marijuana elevates your offense to the Federal level, regardless of its legality in the source and destination states.

Federally, marijuana transport across state lines is still trafficking in illegal narcotics. That the States don't help enforce it doesn't change a thing.

Not a lawyer, just read books, mind.


Okay, now someone needs to do the same study with innocuous end game scenarios. (I know this study links to one other paper that makes this case, but I can't access it, and the abstract has a biased tone)

If you're going to explore the worst case so you can think through preparedness, then you should also consider the trivial case, to make sure we aren't being over-responsive either.


> As noted by the Intergovernmental Panel on Climate Change (IPCC), there have been few quantitative estimates of global aggregate impacts from warming of 3°C or above (1). Text mining of IPCC reports similarly found that coverage of temperature rises of 3°C or higher is underrepresented relative to their likelihood (2). Text-mining analysis also suggests that over time the coverage of IPCC reports has shifted towards temperature rise of 2°C and below https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/202.... Research has focused on the impacts of 1.5°C and 2°C, and studies of how climate impacts could cascade or trigger larger crises are sparse.


The thing to study besides worst-case scenarios isn't "best case scenarios" but average case scenarios. And average case scenarios look so bad that it's obvious we are massively under-responsive to this. The "best" people have done in thirty years is reduce the increase in the rate at which CO2 is being released.


Realistically, it would take a supervolcano or a major asteroid impact to achieve such low-end radiative forcing scenarios, by injecting large volumes of material into the stratosphere and reflecting incoming sunlight. But any such event would be immediately catastrophic to human civilization (for example, some major Yellowstone eruptions blanketed something like a quarter of continental North America with a meter of ash, IIRC) and would cause major crop failures and global starvation on an unimaginable scale. That would be worse for human civilization than any projected warming over the next century. Also, once the dust eventually cleared, atmospheric CO2 levels wouldn't have changed much, so warming would continue (for about 100 years until temperatures equilibrated, assuming human civilization was basically destroyed and no more fossil emissions were taking place).

Case example: the 1991 Pinatubo eruption resulted in a bit less than a decade of steady temperatures, as predicted by climate models at the time:

https://pubmed.ncbi.nlm.nih.gov/11976452/

I suppose if we had a Pinatubo every ~5 years for the next 100 years it would result in the lowest plausible warming scenario, without destroying global agriculture.
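Out of curiosity, here's a quick back-of-envelope sketch in Python of how much forcing that schedule might offset. The peak forcing, aerosol decay time, and baseline anthropogenic forcing are all rough assumptions on my part, not figures from the linked paper:

    import math

    # All numbers are rough assumptions for illustration only.
    PEAK_FORCING = -3.0   # W/m^2, assumed Pinatubo-scale peak forcing
    DECAY_TIME = 1.0      # years, assumed aerosol e-folding time
    INTERVAL = 5.0        # years between eruptions

    # Time-average of PEAK_FORCING * exp(-t / DECAY_TIME) over one interval:
    # the integral is PEAK * DECAY * (1 - exp(-INTERVAL / DECAY)); divide by INTERVAL.
    avg_volcanic = (PEAK_FORCING * DECAY_TIME
                    * (1 - math.exp(-INTERVAL / DECAY_TIME)) / INTERVAL)

    ANTHROPOGENIC = 2.5   # W/m^2, assumed steady anthropogenic forcing

    print(f"average volcanic forcing: {avg_volcanic:+.2f} W/m^2")  # about -0.60
    print(f"net forcing: {ANTHROPOGENIC + avg_volcanic:+.2f} W/m^2")  # about +1.90

On those assumed numbers the time-averaged offset is only around -0.6 W/m^2, so the eruption size and spacing matter a lot.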


I’m in. How can we make this happen?


Read Termination Shock by Neal Stephenson


We have a ton of evidence that the catastrophic outcomes are more likely than the innocuous outcomes, so “equally weighting” their likelihood and commissioning research on that basis seems misguided.


It will be tough to get buy-in on that proposition, because many people view this as a Pascal's-wager scenario, with belief in catastrophe being the rational wager.
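To make the wager logic concrete, here's a minimal expected-value sketch in Python; the probability and cost figures are purely illustrative assumptions, not estimates from any source:

    # Toy Pascal's-wager arithmetic; all numbers are illustrative assumptions.
    p_catastrophe = 0.05        # assumed probability of catastrophe
    cost_catastrophe = 1000.0   # assumed cost if it happens and we did nothing
    cost_preparedness = 10.0    # assumed up-front cost of acting now

    expected_cost_inaction = p_catastrophe * cost_catastrophe  # 50.0
    expected_cost_action = cost_preparedness                   # 10.0

    # Acting "wins" whenever cost_preparedness < p_catastrophe * cost_catastrophe,
    # which can hold even for tiny probabilities when the downside is huge.
    print(expected_cost_action < expected_cost_inaction)  # True

That asymmetry is why, for many people, belief in catastrophe looks like the rational wager even at low probabilities.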


And then there are people who don’t think the mass extinction and mass immigration that have already happened are a catastrophe.


So what you're saying is: if we're all discussing the potential for catastrophe and it's already happened, why publish this paper?

I think it's still worth discussing the likelihoods and severities of future catastrophes.


The paper defines what it considers a catastrophe quite well; you should read it! As a climatologist, I always appreciate more investigation into the possibilities of future climate, even those I consider unlikely.

What I did say is that a lot of bad effects are already happening. Talking about climate science used to be like being the mythical Cassandra. But now it's worse: the mass deaths and migration have already begun, and people still deny the evidence all around them that the predictions have already come true.

If there was an industrial accident where a chemical got released and a million people around the factory all died, that would be a catastrophe, right? Okay, a million people scattered around the world die from air pollution every year. Does the point source make it somehow a bigger tragedy in the first example? Yes, I know that without energy even more people would die, but a million deaths that we have all the technology to avoid is still a catastrophe. Regional famines leading to civil war and migration have already happened. Heat deaths in the thousands in Europe this year. Maybe I’m weird that my threshold for catastrophe is so low.


The article also states this:

"With just over 19 weeks into the year, this averages out to about 10 such attacks a week," when referring to the Buffalo shooting.

Saying "10 such attacks" implies the attacks are all similar in characteristics beyond just the number of people involved. The article talks about mental health, it talks about a pre-planned desire to kill. But it doesn't talk about gang violence. These are very different sources of intent. If you want to use the gun violence archive data to make a point, it should be about gun violence in totality - not cherry picking their numbers to make the case that the US is full of hate-filled crazies who are randomly shooting places up 10x per week.


>not cherry picking their numbers to make the case that the US is full of hate-filled crazies who are randomly shooting places up 10x per week.

This isn't an argument anyone, much less the article, is actually making; it's a strawman.

Plus, mental health and a pre-planned desire to kill obviously correlate with gang violence. Bloods killing Crips and white supremacists shooting up restaurants are not as separate in their intent or root cause as you seem to believe.


But Buffett's point is that each unit of investment he pursues has the potential to produce incrementally more value. You're saying the aggregate technology underlying bitcoin has value - which it does. But you don't get twice the tech value by doubling your amount of coin owned. Conversely, if you double your farm holdings, you have double the ability to produce food.

Also, Buffett's quote here isn't some significantly new perspective. Hasn't this been the primary argument against bitcoin since the beginning?


I get it, it's not an investment that Buffett typically makes. And he really shouldn't, as it goes against his wise principle of not investing in things he doesn't fully understand. If you look at his historical investments, they were generally boring, high-cashflow industries.

> But you don't get twice the tech value by doubling your amount of coin owned.

I think holding bitcoin is more of an investment in the underlying technology. If it is useful, presumably demand for it will rise. It's like owning a stock that has most of its value in its intellectual property. For instance, Disney has a P/E (price-to-earnings) ratio of 38, significantly higher than the typical firm's. This is due to Disney's intellectual property: Disney will be able to produce and profit from Marvel movies for years, along with its extensive catalog. When an investor buys Disney stock, a big chunk of what they're paying for is the technology (in this case, intellectual property).
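As a refresher on what that ratio measures, here's a minimal sketch in Python; the share price and earnings figures are hypothetical, chosen only to land at the P/E of 38 cited above:

    # P/E = market price per share / earnings per share.
    # Both numbers are hypothetical, picked to reproduce the P/E of 38 above.
    price_per_share = 106.40     # hypothetical share price, USD
    earnings_per_share = 2.80    # hypothetical trailing-twelve-month EPS, USD

    pe_ratio = price_per_share / earnings_per_share
    print(round(pe_ratio))       # -> 38

A high P/E like that means investors are paying a large multiple of current earnings, betting that assets like the IP catalog will keep generating profits for years.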


This kind of condescension towards religion is so close-minded, and it's incredibly tiring. I think you'd be surprised just how much the faithful appreciate and embrace science, and how much philosophical thinking about reality and existence we actually do. Adult religiousness is not the same as the cartoon version that children are introduced to, which is the version so many atheists attack. The lazy, unintellectual thing to do is to base all your understanding of reality only on what science can describe, and to never explore beyond what can be proven as a "fact." Your statement about explaining away facts paints the faithful as simple-minded rubes, which is a cheap shot that betrays no understanding of how the religious actually view the world.


> This kind of condescension towards religion is so close-minded, and it's incredibly tiring.

I live in the US, God Bless America, where religion is shoved in your face over and over again from childhood to adulthood (it’s incredibly tiring).

I’m intimately aware of the philosophies and mental capabilities of “the faithful”. Religion has earned every ounce of condescension that it receives.

> The lazy, unintellectual thing to do is to base all your understanding of reality only on what science can describe, and to never explore beyond what can be proven as a "fact."

Science is exploration. Religion is not.


And yet, you shut down the possibility of religion without any degree of exploration.

I don't need you to be religious. I understand annoyance with the fact that the culture you're in is pushing something you don't agree with (I experience that as well on other topics). What I take issue with is the attitude that religion deserves condescension and vitriol, simply because you don't subscribe to it, or because you find it annoying. I see no recognition that faith and logic aren't mutually exclusive.

Don't have religion - that's fine. But don't pretend to understand (and then trivialize) any religious tenet, if you're unwilling to fairly explore it.

> Science is exploration. Religion is not.

You're conflating religiousness with dogma. Dogma, by definition, is belief without exploration. Religion, however, invites endless exploration.


> And yet, you shut down the possibility of religion without any degree of exploration.

You are both a reader and a writer of fiction!


I appreciated the "retro encabulator"-esque reference in the industrial sample.


Agreed. Speaking as someone who has hired hundreds of software engineers, I can tell you that having a degree at all tells me a lot about your work ethic. Sure, I'd prefer a comp sci background, but your degree is nearly as good. It's an indicator of how you think.

You know what other degree I see a LOT of in SWE? Music. I've hired countless music majors.


I have the opposite experience, almost. I've hired and trained lots of people with CS degrees who think they know everything and barely know the basics.

I massively prefer people who flunked university and just got a job writing code because they could. University no longer trains people to think; instead, it's about jumping through hoops and dealing with pointless administrative bullshit [0]. Source: I have a Master's degree.

And yes, even better are the people who wrote code as a hobby, and then decided to switch career to become professional developers. Some of the best coders I know are in that category (though the best coder I ever hired taught herself to code while getting off heroin in a Glasgow slum).

[0] Ok, so yes if you're hiring for a large corporate I can see how this would appeal.


Sure, universities train people how to think. But that training happens in the liberal arts, which is why OP finds so many SWEs with music majors. STEM is very transactional. Liberal Arts is more about teaching you how to think and communicate.

The problem is too many tech bros have been pissing on liberal arts for 30 years. And now people are looking around and saying 'we have too many tech bros'.

My college years were spent reading various viewpoints, bringing them together, and synthesizing them into a coherent narrative. This turns out to be very helpful when it comes to solving problems and analyzing security incidents. I've found it more useful than that semester of C+ the new hires took for their CompSci degree.


> STEM is very transactional. Liberal Arts is more about teaching you how to think and communicate.

Are you able to elaborate on what you mean by these statements? What does being able to think mean and how would one be able to demonstrate that?

Do you think the only way to learn this is in the liberal arts?

> too many tech bros have been pissing on liberal arts for 30 years.

Your comment seems to be pissing on STEM. Am I completely misreading what you're saying? STEM and the vocational or mechanical arts have been pissed on by the liberal arts for a lot longer than the last 30 years.


> the best coder I ever hired taught herself to code while getting off heroin in a Glasgow slum

Those words bust the biggest bias I realized I had/have - against addicts. Thank you very much!


I said she was a great coder. I didn't say she was trustworthy. She eventually left the industry to go be pretty for gamblers in a casino, where she could earn a lot more money for a lot less effort. My co-manager's blatant sexism also contributed to this. I often wonder whether she got back into coding, but we lost touch years ago (this was pre-LinkedIn, obviously).

"Never trust a junkie" is still very good advice.

Talent at coding doesn't automatically come with the personality or the temperament to be a good career software dev.


To provide a third vote: My midsized company barely considers people without degrees, even if they have extensive technical work experience. Example: I had to battle hard to get somebody considered for an entry-level non-technical role, and they ended up being rapidly promoted within engineering once they got past the recruiting screens.


This is my current difficulty - I work as a neteng for a large MSP in my area.

I quit university after my 2nd-semester co-op during undergrad (to work at my current firm). I have since been promoted four times and now work as a senior engineer (though I cannot technically use this title). Now that I'm trying to move firms after working for 5 years, it is incredibly difficult to get other firms to accept that I've been working in the field successfully, given that I do not have a completed degree. On one hand, I have skills now that I could never have earned in school; on the other hand, I may have gimped my ability to move deep into the upper half of the first six figures in salary.


Hmmm, have you had a lot of coworkers move through over the last five years? Following one of them to a new place is probably the easiest way to change jobs and build your resume.


I feel your pain. I've fought against this before.

I think it's because of the salary level - like HR/management cannot get their heads around paying someone who doesn't have a degree the kind of salary that developers get these days.


Wow. Pretty strong opinions here.

Completing a degree means you can commit and deliver, IMO.

Instead, leaving to write code just because you can tells me that you took the short path, and that the basics you're missing (the real ones, which university teaches you) will come knocking at your door some day.

My 2 cents.


> Completing a degree means you can commit and deliver, IMO.

Not completing a degree doesn't mean you can't, so this statement doesn't tell us all that much.


Not sure. As I said in the other reply, if you don't finish it, provided that you don't face major practical or financial issues, to me it is a red flag about your ability to commit and bring something to a conclusion.


As it stands, it's a theory of yours. Now you need some actual data to back it up. It's a bit dangerous to reach conclusions based on a hunch; you might be biased, after all.


I am definitely biased by my values, and I value finishing something once you started. Always ;)

Then yes, I might of course be proven wrong; I just gave my 2 cents on what not finishing a degree tells me, including my bias.

Then you could be a false negative, in the sense that you might not have finished your degree but still be able to commit 100% to other things. However, as we all know, spotting false negatives is very hard, and those who hire don't like these bets except in very particular circumstances, as it is going to be their fault if the person turns out to be not a false negative but a true one. With or without backup data.

I think it is no coincidence that, as far as I can see, the first and foremost comment the OP got was basically "finish your degree".

Don't get me wrong: I have been a false negative and an outlier my entire life, and now I love it, but in the early days I really wished someone had told me what I am saying now.


> Then you could be a false negative, in the sense that you might not have finished your degree but still be able to commit 100% to other things.

I'm not projecting my own experiences onto others; I just take issue with your claim. Studying and working are very different beasts, and I think it's very much possible for people to not finish degrees yet be really good employees who finish their tasks as agreed upon. The incentive structures are so different that the two can hardly be compared. Not finishing your degree, ignoring personal/financial issues, tells us nothing about someone's work performance.

> However, as we all know, spotting false negatives is very hard, and those who hire don't like these bets except in very particular circumstances, as it is going to be their fault if the person turns out to be not a false negative but a true one.

Yes, that's hard, but that's a very different issue from your original claim. It's a good reason for a recruiter to go for the person with the degree instead of the person without one, but that still doesn't mean that the person without the degree can't commit and finish their work.

> I think it is no coincidence that, as far as I can see, the first and foremost comment the OP got was basically "finish your degree".

He has 3-4 months left of a 5-year education; of course people will recommend that he finish his degree. There is very little opportunity-cost downside.


Sorry, the "you" was not referred to yourself by any means, it was a generic "you".

My original claim was not an actual claim; it was just what not finishing something one starts tells me. A sensation. And by "not finishing" I do not mean trying for a bit and then quitting; that's of course OK.

What I meant is not finishing, after investing vigorous effort, because one gets "enchanted" with something else (e.g. "oh look I can code in javascript and build a website, who cares about finishing studying the basics, I don't need them anymore!").

By the term "finishing" I implicitly assumed being near the finish line. If quitting at that point is not a symptom of a poor capacity to commit, I don't know what else it could be.

On another angle, the comment to which I responded originally basically said "University does not teach you anything valuable at all, just quit and go write code," and here I strongly disagree and always will.


>> I am definitely biased by my values, and I value finishing something once you started. Always ;)

What do you do when it becomes obvious that the thing you started is now irrelevant, or was poorly-conceived, even mistaken, when you started?

Genuinely curious. I've walked away from a lot of projects because I learned while doing them.


I don't want developers who continue doing something even when it's blatantly the wrong thing to be doing.

Flunking out of university is not the "short path". It's the hard path. Getting a degree is not difficult, or even hard work. It's the easy choice - the thing everyone expects you to do.

I don't want people who follow orders and do what they're told. I want people who are prepared to make hard decisions, take the less-travelled path, do what they feel is right and not just what everyone else tells them is right.

I want someone who can actually code, and not someone who has a certificate that says they can code.

My 2 cents.


I think we are just talking about two different things, and that you have in mind "coders" and "developers", not software engineers or data scientists. But the OP was talking about the latter, not the former (where I can, to some extent, agree with you).

I saw it many times: JavaScript kids or self-taught backend Python "engineers" falling extremely short when it came to scalability, algorithmic complexity, or just abstracting concepts.

Similarly, I saw several self-proclaimed "data scientists" who did not even know what a non-Gaussian error distribution means.

Not knowing this stuff jeopardises your work and the projects you are working on.

Then, if you are telling me that your interview process can spot these shortcomings as well as a university degree can, that sounds a bit unrealistic to me.

However, if you need someone who can write code to pass the unit tests that someone else wrote, without taking any architectural decisions, then OK, I can agree with you, but it does not sound like much of an appealing career path IMO.


> Completing a degree means you can commit and deliver, IMO.

So do a lot of other valuable and still-relevant lived experiences, and they come without the price tag.

But we all already know that.


Sure, if you include the price in the equation, then that makes things different; you are right. I was biased by the nearly free university education we get in Europe.

However, if you start something, I think that it is a good attitude to finish it, provided that you don't face major practical/financial issues along the way.


> However, if you start something, I think that it is a good attitude to finish it

I 100% agree with this. I've just got a bit (maybe a lot) of chagrin towards the sometimes strident mentality that college is the only meaningful way an individual can prove they possess such a characteristic, or that completing college is an indicator that a person will retain that characteristic throughout their career/life.

It's useful as a snapshot of an individual's educational accomplishments, for sure, and more power to them for that accomplishment, but I will always have a bit of hesitation about applying "has degree" beyond that.


People have always been that way. Before Rogan showed up, before the internet was even invented, there were "nutters" who held deeply misinformed perspectives which they followed, sometimes, to death. I'm sure someone will argue scale is the differentiator, but I'd argue that the proportion of people who ignore facts versus those who consider them is probably unchanged. (Although admittedly, I have no statistics.)

Also I think a big part of the problem is the aggressive labeling of content as "misinformation" or "fake news." To me, misinformation implies propaganda issued and promoted by an enemy entity. But today, it is a term that is used to mean anything that has a fact (whether verifiably correct or incorrect) that implies a conclusion that is generally unacceptable.

For example, if the generally acceptable premise is: "everyone who is able should get a vaccination," then publicly talking to someone harmed by a vaccination (even if it's true) would be considered misinformation, because it potentially concludes something opposing the acceptable premise.

If we can't openly share ideas, good, bad, informed, misinformed, then the 99% (fake number) of us who aren't "nutters" that follow bad advice to extreme conclusions, will be denied the volume of data, perspectives, and opinions we need to make a truly informed decision.


I don't think that comment was implying those measures are to blame for rising deaths. I think it was to argue that they haven't had the impact they were expected to have.

