I think there's the same dichotomy in tech journalism as in science journalism (perhaps in other journalism as well, but I'm not qualified to comment on other topics): the people who actually understand these subjects, and are qualified to talk about them, far more often work in the field rather than merely report on it.
Because of that, I'm inclined to rely on journalists for facts and overviews alone, and to rely on actual professionals in the field, rather than journalists, for opinions.
The other element is that a large portion (and probably the vast majority) of technology and business "journalism" is very thinly veiled boosterism and promotion.
And this isn't a particularly recent phenomenon, it's largely always been thus, at least for values of "always" dating to the late 1970s / early 1980s.
Idea (wo)men need to find money (wo)men, so recruit a PR agent to spread good buzz to churn up investment. There are the occasional hits, but a hell of a lot of misses. And if you're sitting on the churning dust cloud that is the expanding edge of the boom / development, there's a hell of a lot of noise and false leads.
Truly enduring changes tend to be based in deep, deep technology, most are a minimum of a decade old, if not several, and the truly good concepts are decades to centuries old (there's a reason they keep emerging: they serve a real need). Fads are often fanciful, contrived, or serve a narrow set of interests.
Unfortunately it gets really hard to detect the signal amidst all the sound and fury. Age and a solid grounding in history help more than the young might think.
There is a middle ground for people like James Gleick and Neal Stephenson who are scholars (in the sense that they study a field intelligently and more deeply than an average layman) but do not push the field forward.
Great students can become great teachers without being great doers.
They don't, not statistically anyway. Some people still listen to them because of confirmation bias, failure to understand statistics, and because there generally isn't a strong incentive for anyone to fact check such predictions after the fact.
Given that in this case, out of a list of 10, there were 3 devastatingly bad predictions, I think that's pretty damning. More so when you consider that it's rare for a company to matter, so picking 10 big CEOs that don't matter should be highly biased toward correct predictions.
> More so when you consider that it's rare for a company to matter, so picking 10 big CEOs that don't matter should be highly biased toward correct predictions.
You've got it backwards. His pool for picks isn't the set of all CEOs, it's the set of all CEOs who are well known, have mattered recently, and are still working. That's a very different pool, with a very different bias.
Fair enough. But without a good comparison against random picks from the same pool, it's impossible to judge the quality of these picks. Given that there were a few gimmes in the list (Slashdot and Vonage, for example), and given the notable absence of plenty of high-profile CEOs from 2006 who foundered shortly after, I'm not at all impressed with this list.
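For what it's worth, that "compare against random picks" baseline is easy to sketch with a quick Monte Carlo. Everything here is invented for illustration: a hypothetical pool of 100 well-known CEOs, an assumed 70% base rate of "stopped mattering", and the 7-right/3-wrong score from the list in question:

```python
import random

random.seed(42)

# Hypothetical pool: 100 well-known CEOs, of whom (say) 70 genuinely
# stopped mattering within a few years. The 70% base rate is an assumption.
pool = [True] * 70 + [False] * 30  # True = a "doesn't matter" call would be right

def score(sample):
    """Count the correct 'doesn't matter' calls among 10 picks."""
    return sum(sample)

pundit_score = 7  # the list in question: 7 right, 3 badly wrong

trials = 100_000
at_least_as_good = sum(
    score(random.sample(pool, 10)) >= pundit_score for _ in range(trials)
)
frac = at_least_as_good / trials
print(f"Random picks match or beat the pundit in {frac:.0%} of trials")
```

Under those made-up numbers, a dart-throwing chimp matches or beats the list most of the time, which is the grandparent's point: without knowing the base rate of the pool, 7/10 tells you almost nothing.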
In fairness, hindsight is 20/20, and it's very easy to look aghast -- or in smug condescension -- at the failed predictions of yesteryear. But how many of us would have called all of the shots on that list correctly back in 2006?
Furthermore, it is human nature to suffer from an anchoring bias when looking at a list like this one. We immediately assign more weight to the 2 or 3 "disastrously wrong" predictions in the list of 10 than we do to the other 7. Hell, we probably assign 99% of the weight on that list to the Facebook prediction. It's the most visibly and obviously wrong. So it anchors our perception of the entire list.
I doubt there's anyone out there who gets his predictions right 100% of the time. And I bet there are plenty of highly successful prognosticators with at least one or two hilariously bad calls on their track records.
I'm not defending this list or its author, per se, but just pointing out that everyone gets it wrong sooner or later. Throw enough shots at the hoop, and you're going to miss a few of them. Granted, you could certainly argue that tech journalists should shy away from the predictions game and focus solely on reporting. But where's the fun in that?
Perhaps, but then, the expression "caveat emptor" comes to mind. As the readers of such predictions, we should be taking these things with a healthy grain of salt.
That's the unspoken compact between the prognosticator and his audience. It's his job to make as good a guess as he can; it's our job to keep in mind that he's guessing.
Actually, I'd be happier if I knew of prognosticators who did put their money where their mouth is. Predictions accompanied by bid/ask spreads would be vastly more useful than predictions intended for immediate entertainment (ie, all the ones the pundit is unwilling to bet on).
As it is, though, there are just a few mildly popular websites that enable that.
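One standard way to make "predictions you're willing to bet on" concrete is to require a probability with each call and score it afterward, e.g. with a Brier score. A minimal sketch; the two pundits and their forecasts below are invented for illustration:

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes.
    0.0 is perfect; always saying 50% earns exactly 0.25."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Pairs of (probability the pundit assigned, what happened: 1 = yes, 0 = no).
confident_pundit = [(0.9, 1), (0.9, 0), (0.9, 1), (0.9, 0)]  # bold but 50/50 right
hedger = [(0.5, 1), (0.5, 0), (0.5, 1), (0.5, 0)]            # never commits

print(brier_score(confident_pundit))  # overconfidence gets penalized
print(brier_score(hedger))
```

The nice property is the one the parent wants: a pundit minimizes his expected score only by stating what he actually believes, which is exactly the discipline a bid/ask spread would impose.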
Well, the counterargument to that is conflict of interest. Do I really trust the predictions of someone with financial interests in the outcome he's predicting?
I think there's a place for both types of reporting in our media. You've got Seeking Alpha for financial predictions by interested parties (who will at least disclose their interests, which is nice). Then you've got the mainstream media for general, ostensibly unbiased reporting and predictions.
(Experience has taught me that getting into a debate about the presence or absence of bias in the mainstream media is a classic blunder akin to starting a land war in Asia, or going in against a Sicilian when death is on the line. So I'll refrain from getting that started.)
The thing about trying to move the market by betting is that if someone calls your BS, you're ruined, and all it takes is one knowledgeable person with moderate resources (since you have to put out big odds and lots of money if you really want to influence things). Prediction markets have historically been extremely resilient against manipulation attempts.
Experts and pundits actually make worse predictions than others. In fact, ironically, there is an inverse relationship between how popular an expert is and their prediction accuracy. This article explains why: http://www.thedailybeast.com/newsweek/2009/02/13/why-pundits...
No, really, what difference does it make whom they judge relevant or not? Just see it the same as commentary from sports commentators who obviously don't play. It's fun and caters to people's need to talk about it, but it has little informative value. It's not an advisory service. If your business depends on the information you get through these channels, be very afraid, or start building some expertise yourself.
This this this.
Part of my job is to do industry analysis to help my partner write blog posts to build our brand.
After a few very painful months spent debugging crap data and models, I realized that the consumers have zero interest in accuracy or correct meaning, so we shouldn't have any either. It's all good-fun gossip.
Like most futurism, there's a strong incentive in this kind of journalism to publish long-shot predictions - Dow 5000, that sort of thing. If you're right, you're a genius, and if you're wrong, you're just one more wrong prediction in a very large pile.
Speaking of tracking pundits' predictions, whatever happened to Maciej Cegłowski's website, Wrong Tomorrow? www.wrongtomorrow.com seems to no longer be up and running.
Netflix's CEO, and Linus: both are as relevant as ever, or more so. Netflix is becoming as big a force, in revenues and viewership, as any big cable company. Linux is very much still relevant in the server world and in the mobile and tablet computing world (Android). And Linus's Git is growing in popularity (GitHub?) and influence.
It's questionable whether the Vodafone CEO is "relevant", but Vodafone is definitely still a major competitor; they make more money than Google and Amazon combined.
Yes, plus Linus is relevant and important for both Linux and git.
Git has been hugely, positively disruptive in a way no DVCS was before it.
And anyone (CNN, evidently) who thinks Linux developers are fungible misses the point of having a fierce, opinionated, blunt semi-dictator running the show. Great things can happen. He's the anti-Steve Jobs.
What gives you the idea he is missing aesthetic taste? You do realize that Linus (or any of the kernel devs) are not in any way responsible for the decisions groups like Gnome or KDE may make, right?
It sounds as if you didn't read the article. It doesn't say Linux isn't still relevant (quite the opposite, in fact), just that Linus is less so:
"Although he can claim credit for popularizing one of the most powerful ideas ever to sweep through the software industry, Torvalds's project has matured to such an extent that it's largely outgrown its illustrious creator."
I think it's fairly reasonable to say that Linux would continue on even if Linus disappeared. Same with git, for that matter.
The article was wrong re: Netflix, but the point they made - that Netflix didn't have a capable video over IP product at the time - was reasonable.
That does not make him irrelevant while he is still here. For example, the direction he "recently" gave to the ARM branch (basically ordering everybody to clean up their shit immediately) probably means Linux will be more successful on portable devices than it would have been otherwise. Would the same thing have happened without him? Maybe, or maybe not.
Assuming a cleanup is significant, yes, I'm fairly sure Google or another high-profile Linux contributor would have done the same. It's just maintenance work - necessary, but not industry changing like the initial release of GNU+Linux (as a complete OS) was.
"Revolutions" in the industry are often not the result of a big bang, but more frequently of mere accumulation of little steps. GNU+Linux has not changed anything in the industry when initially released. It was years (decades?) behind current techs at this time (but has fast taken leadership in many areas), and at first only used by hobbyist, then only used for applications with simple needs on simple hardware as a cost reduction measure, then used in more areas and so over.
And I'm far from sure that decisions good for the long-term development of Linux would be made by bigcorps if Linus went missing anyway (especially not by Google). Bigcorps have bad habits that mesh poorly with how the Linux project works, like development behind closed doors, and choices that are not just time-to-market centered but, for some of them, 95% driven by time to market. That's why bigcorps caused the ARM branch mess in the first place. Bigcorps often lead big projects that fail big. And Linux is not led by bigcorps anyway; it's led by people.
Also, the same argument could be applied to a lot of people in leadership positions. Maybe Apple would not be Apple without Steve Jobs, but in many big companies the CEO doesn't matter more than Linus matters to Linux. Maybe things would turn out much the same with somebody else, or maybe not, but in the meantime they are in charge, and when they do a good job, they are the ones who matter.
I'll argue that the true innovation of Linux was in the licensing, with both the development and adoption model this implied. Early Unix was distributed without a license (for numerous reasons). It rapidly spread throughout research and academic circles, but was constrained once software licensing emerged in the 1980s, and finally killed in the BSD wars of the 1990s.
It's not that Linus stumbled on some secret of technology or was smarter than anyone else doing 'Nix development. It's that he got obstructions out of the way that kept those who could contribute meaningfully from being able to do so and allowed others to utilize the results.
The end result is the innovation and technical superiority (by and large) of Linux over alternatives.
On how Unix initially emerged: there was no copyright of software (this changed in 1976), and AT&T was prohibited under a 1956 anti-trust consent decree from entering the computer equipment and software business. So when Ritchie and Thompson created Unix, the company could do little but use it internally, and had no reason to prevent its more widespread distribution. Unix exists specifically because AT&T was prevented from productizing it.
I'm not sure Torvalds should be there, either. Sure, Linux long ago outgrew the need for him as chief evangelizer, but when Linus speaks, the software world's ears perk up. He's still incredibly influential, if no longer the Atlas of the FOSS movement (with apologies to Dr. Stallman).
To the average reader of CNN, git is a piece of software they've never heard of, much less understand why it's better. Linus never stopped being important to the software community, but git doesn't make him more important to the non-software community.
GitHub is more important than git, so I guess that leaves figuring out how important Linus or git was to the origin of GitHub. Is there something about git specifically that made it more suitable for GitHub than, say, Mercurial or Bazaar? Or was it timing or happenstance?
It's debatable whether GitHub is more important than git. A reader of this site could get the impression that hosted services (complete with user interfaces copy-pasted from whatever simple-UI fashion is the hype of the moment, Big Buttons (tm) included) are the best thing since sliced bread and will replace everything, but interestingly, the world doesn't revolve around them. And it's rather ironic to rate a centralized service as more important than the decentralized tool it embraces and extends, given that many of the advances in source-code management and development workflow come directly from those decentralized aspects.
That Hg also exists doesn't make git less important than GitHub, any more than the fact that SourceForge exists makes GitHub less important than git.
While I happen to think the GitHub UI is quite nice, it's a minor aspect of GitHub's importance. GitHub basically invented "social coding", which I would argue has been a genuine phenomenon unmatched by Bitbucket, Launchpad, et al. (or SourceForge). Git (and Mercurial) didn't offer substantial advancements over BitKeeper (save licensing), and git's popularity was solidified when it was chosen for Linux. While reasonable people may disagree on the merits of git, Hg, and Bazaar, it's difficult to put Bitbucket and Launchpad in GitHub's league.
While it may be ironic to value centralization on this topic, it's unwise not to value it.