Hacker News

Nothing wrong with the idea per se. But, as usual, no explanation for how the incentives and cultural changes needed to induce people would be implemented. Sorry, but no one is going to reward a grad student, career-wise, for rewriting other scientists' papers and catching bugs. Heck, the only reason scientists proofread their stuff at all is to avoid embarrassment. If there were an army of students eager to fix your mistakes, scientists would be even less careful.


I think the subtext here is that academic culture should become more like OSS culture, where your reputation is built on the content of your commits rather than just having your name on a project somewhere.


I fail to see how it changes things in a fundamental way. Having your name on a paper is supposed to mean that you contributed to the results of that paper. This is how you build up your reputation: you publish lots of relevant results.

This is only possible if you continue to work and publish, which in turn makes you a more sought after partner for collaboration. In the same vein, if your contributions are consistently minimal, then you will be less sought after which will lower your frequency of publication. In essence, the current system already correlates rewards with the quality of one's contributions.

Thus, the proposed changes do not affect how one garners reputation. Instead, I think the subtext is that the current body of research papers could be significantly improved, but isn't because there is no system in place to revise them in a distributed manner.


To elaborate on what nocipher said: The problem is not that people are getting their names on projects to which they didn't contribute much[1]. The problem is that hiring committees only care about new scientific works you are producing and not about other ways to contribute to science such as fixing typos, pedagogy, etc. If you want to incentivize such work, you either need to change the criteria upon which universities hire (unlikely) or give supplementary rewards such as cash prizes (possible).

[1] Yes, there are times when it is hard to tell who contributed what to a scientific work. And sometimes people get too much/little credit. But this isn't a massive issue, nor is it something that can be fixed by adopting an OSS culture. Software is often much more modular than science is, so it is often possible to put individual names on commits. But at those times when software is more like science--such as when several smart people are sitting around designing over-arching ideas/structure/strategy for a large project--then software faces a similar difficulty in teasing out individual contribution.


I agree. In my mind, one of the cultural changes that we need is for universities to abandon the tenure system. I think if that happened, there'd be more hiring of new professors, so there would be more of an incentive for academics to maintain a strong reputation in their field.

Academic versions of github might arise, as ways for researchers to show their prospective employers what they can do.


Tenure is the one big incentive academia has going for it. Un-tenured academics work like dogs for peanuts (given their skill sets) to get tenure. You take that away, and it becomes "work like a dog for peanuts your entire career". Who's going to take that deal without the carrot of "do what interests you without fearing for your job security" at the end of the tunnel?

Streams of irrelevant publications are only going to get worse when "publish or perish" extends to your entire career.

(Not to mention that tenure is a good idea for, say, climate change researchers, or people doing potentially provocative demographic research, or stem-cell researchers, or anyone else someone influential somewhere might want to see silenced)


"Tenure is the one big incentive academia has going for it."

I'd have to disagree. I think people pursue careers in academia for a lot of reasons besides job security. As you point out, there's such an overwhelming surplus of workers in academia that they end up working for very low pay. It's unlikely that removing tenure would deter so many of these people that we'd actually have a scarcity of workers in academia.

And I don't really understand why tenure would be so important for people doing controversial research. People do controversial research in all kinds of settings in which there's no tenure.


I doubt that removing tenure would result in the hiring of new professors. One recent trend has been to hire adjuncts as much as possible and to only slowly hire tenured faculty. Suppose you have a professor (say a theorist) who works on a few ideas, but deep ones: she can take her time and focus on them instead of worrying about losing her job because she hasn't produced recently (funding is another story). Similarly, if she does something controversial, her job is still safe.

The incentive for most academics (say science/engineering) to maintain a strong reputation is personal. I do research because: 1) I want to know the answer to a question 2) I'm selfish and I want to know the answer first ;>

This leads me to putting in 100 hr weeks at times (sometimes you have to grind it out; other times, you have to step back and work less to be more creative) and being rather productive. Do you really think that fear of termination would result in the same output? Do you think that the average institution could pay researchers to put in the hours that they do--weekends, holidays, evenings, etc.?



