
I'm legitimately worried about the long term future of my hobby (PC gaming) at this point.

Prices like this have a cascade effect because when fewer people can afford to build a PC and/or upgrade, there's less incentive for developers and publishers to spend time on the platform. Consoles are shielded by the ability of Sony and Microsoft to take losses, which will be absorbed by online fees and higher game prices, but no such cushion exists for PCs (this independence is a strength to be sure, but a double-edged sword in this respect).

PC gamers have spent years disproving the image of PCs as prohibitively expensive and now it's coming crashing down due to people in a faux gold rush wasting hardware and energy. I wish it would stop, but I don't know how much of the damage to the hobby can be repaired.

Of course there are other factors. Machine learning is eating up cards, smartphones and tablets are eating RAM, and I don't doubt there's price fixing. But ML is at least useful to humanity (sometimes), smartphones and tablets are tangible products, and price fixing can be investigated and punished.

I'm relatively shielded from this as I just built a high end machine (i7-7700, 1080ti, 32GB DDR4) that will last a while but the next time I want to build a machine might be painful. While I would probably pay the price, others will not, and the hobby will suffer.



Most PC gamers aren't using high-end machines and don't upgrade very often. The Steam Hardware Survey shows that the majority (74%) of PC gamers have a 1080p monitor and a plurality (37%) have a video card with 2GB of VRAM and 8GB of system RAM (47%). The GTX 1060 only recently overtook the GTX 750 Ti as the single most popular video card.

Pricing of high-end hardware is largely irrelevant to the success of PC gaming as a platform. The majority of users have always opted for value-oriented hardware.

http://store.steampowered.com/hwsurvey/


I can't get a mid level graphics card either. These cards in the article are $200 and $250 cards from AMD. They are going for over $400.

I am the person that buys $200 graphics cards; high end is $500+, like a 1080ti or some sort of Titan. I always thought that $200 was the sweet spot with the best price-to-performance ratio. I can't buy a $200 card now.

I was lucky and got a new AMD 480 last year for $220. It sadly has an issue and needs to go in for warranty (3 years). It can only handle the simplest of graphics or it hangs and freezes the computer. No big deal, I wanted to make my kids a Minecraft and coding computer and I can't buy a decent $150 card either.

RAM is also sky high, and I do video editing from time to time and want at least 16 GB in my rig. That would have cost $100 18 months ago and now it's $225+, also due to the miners.


When the crash comes or when someone comes up with an ASIC for Ethereum, all these cards will flood the secondary market and the spare capacity slack will cause a price crash. You'll be able to get the highest end card in the world for half the current price.


Unless there is no crash and/or Ethereum is replaced by another cryptocurrency that's profitable to mine with GPUs.


> RAM is also sky high and I do video editing from time to time and want at least 16 GB in my rig. That would cost $100 18 months ago and now that's $225+ also due to the miners.

That is crazily out of whack. Back in 2011 I bought 8GB (2x4GB, 1333MHz) for my 2009 15" MBP and it cost me €25 secondhand. Worked like a champ too; I even transferred it to my little brother's computer when I upgraded to a new MacBook.


It's not that long ago I used to think of RAM as _almost_ disposable - you could max out your capacity pretty cheaply, and it wasn't really worth reselling a lot of the time. Over the past few years it seems to have just been getting increasingly expensive though.


The effects on prices are quite bad, but the pendulum always comes back in the other direction too. Every GPU shortage leads to the second hand market getting flooded a few years later with used mining GPUs.

So in effect we get short term problems, but if you time your purchases correctly you can get deals too.


The problem is that a few years later those GPUs aren't worth much, and with more and more coins coming out, miners always have something to fall back on for their older hardware.

It doesn't look like the GPU flood of 2015 will happen again.


Would you want to buy a piece of hardware that had been run at 100% of its capacity for years, without any warranty?


Well, not everyone is aware of this, and 2nd hand mining card price points on various market platforms probably will push down the prices of everything else to varying extents. (both new, as well as 2nd hand gamer cards).


I might. If it still works after mining for years then that tells me it'll last for a while.


This impacts everything from mid-range upwards. A friend of mine, not so well versed in gaming hardware, recently asked me for advice on buying a new GPU, to replace some 3 generation old card he's using.

I recommended him something like a 1060, assuming they'd be reasonably affordable by now, just to discover that they've actually increased in price, especially considering he's on a tight budget.

In 20 years of PC gaming I can't remember a situation comparable to what's going on right now and frankly, it's quite depressing.

With this going on, neither Nvidia nor AMD has much reason at all to innovate and keep increasing performance, as most of their performant cards sell like crazy anyway, which is a horrible situation for consumers.


Keep in mind that the hardware survey may be skewed by Steam being used on machines that aren't primarily for gaming, so I'd be somewhat cautious about drawing firm conclusions (e.g. I have Steam on an old Surface Pro 3 that is basically just for chat/non-real time games).

I'd also wonder if the small percentage of people on top-end gear drive the market forward (e.g. purchasing significantly more games even if those users are fewer in number).


keep in mind that the numbers have been skewed in the last few months because of an influx of pubg players from china, who probably don't have as much money to spend as western gamers.

see: http://store.steampowered.com/hwsurvey/directx/

windows 7 having a huge uptick from aug-oct, and high end graphics cards' market shares dropping in the same time.


Simplified Chinese having 63.90% share by language suggests that as well.


Are those stats normalized for game purchases?


Gamers are complaining about why nVidia can't just ramp up their production.

Having worked in a fab and an assembly factory, I know there is a tremendous amount of coordination that needs to happen. Let's forget the business aspects for a moment: in order to ramp up production, it is not just a matter of building more lines. Silicon industry supply chains are mind-bogglingly complex, from oil-free-air supply hardware, etching chemical suppliers, and automation equipment all the way to the copper mine. All of these interdependent supply chains have to scale in perfect coordination, without exception, to be able to scale production.

Then there is the business side - building more lines could be foolish investment if the LRP (long range projections) is weak.

nVidia fabs their chips at TSMC. Say, TSMC has wafers ready but the assembly houses are struggling to keep up - the entire supply chain is broken and demand cannot be met.


> Gamers are complaining about why can't nVidia just ramp up their production.

I don't see too much of that in gaming circles. It's understood that it would be risky as crypto will crash sooner than later. The anger in the PC community is almost exclusively directed at miners. Threads about it pop up on the daily, but what can we do? We could lobby NVIDIA/AMD to somehow implement restrictions on what the Gaming SKUs can be used for as they do with the Quadro/FirePro workstation cards, but it's unclear if that would work let alone if it's technically/legally possible.

It's a bad time to be a PC gamer, and just months ago it was a great time to be a PC gamer.


> We could lobby NVIDIA/AMD to somehow implement restrictions on what the Gaming SKUs can be used for

In fact, the opposite has happened. NVidia has restricted consumer cards from datacenters, with an exception for blockchains! They are very much making hay while the sun shines.


I think something interesting will happen with TPUs.

nvidia's titan v has a 110 TFLOPS TPU (and only 12 TF GPU iirc). It's $3000, but that seems not to reflect costs so much as avoiding cannibalizing their even higher-end scientific computing offerings... so the price could come down (to forestall competition).

Can TPUs give better ROI than GPUs? Conversely, will games end up exploiting TPUs?


Google’s TPU is a GPU without the silicon burden of needing to render graphics so no.


I referred to nvidia's titan v, which is independent of google's tpu. https://wikipedia.org/wiki/Tensor_processing_unit https://www.nvidia.com/en-us/titan/titan-v/

I also didn't say it would be used for rendering in games, but that it could be used by games. Games can use GPUs for compute: physics, simulation and even AI. Some of this is already happening. TPUs would be even better, so I find that interesting.

The point of my comment was that TPUs providing some differentiation relevant to mining vs gaming would affect prices.


Why exactly would the TPU be better? It's not a general computation processor; it only does MMA.


I don't know, but at 110 TF for $3000, it would be worthwhile figuring out some way to use it.


For gaming? The 110 TF of the Titan V is also very, very conditional: it only applies when you are doing mixed-precision MMA. For any other operations you aren't getting 110 TF...
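For context, that headline figure is roughly reproducible from the card's publicly listed specs. A back-of-the-envelope sketch (assuming 640 tensor cores, a ~1455 MHz boost clock, and each core doing one 4x4x4 mixed-precision matrix FMA per clock; treat these as assumptions, not authoritative specs):

```python
# Rough arithmetic behind the ~110 TFLOPS tensor figure for the Titan V.
# All inputs are assumed spec values, not measured numbers.
tensor_cores = 640
flops_per_core_per_clock = 4 * 4 * 4 * 2  # 64 FMAs per clock, 2 ops per FMA
boost_clock_hz = 1.455e9

tensor_tflops = tensor_cores * flops_per_core_per_clock * boost_clock_hz / 1e12
print(round(tensor_tflops, 1))  # ~119 at boost clocks; marketed as ~110
```

The key point stands either way: that throughput exists only for dense mixed-precision matrix multiply-accumulate, not for general shader or integer work.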


What about for mining?


If the mining was to crash hard, it'd be a fantastic time to be a gamer with 2nd market GPU offerings. ;)

Also, I bought my own 1080TI last May and have had it mining when the PC is not in use. It's already earned itself back, potentially speaking (I haven't actually sold anything yet for real dollars). A single card isn't enough to bring in anything solid, but it's at least a foot-in-the-door kind of thing, especially for the fun factor.
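The "earned itself back" math for a single card is simple to sketch. Every number below is a hypothetical placeholder (coin revenue in particular swings wildly), not a quote of real 2017/2018 rates:

```python
# Hypothetical payback estimate for a single mining card.
card_cost = 700.0           # USD paid for the card (assumed)
revenue_per_day = 3.50      # USD of coin mined per day (assumed, varies wildly)
power_draw_kw = 0.20        # card power draw while mining (assumed)
electricity_per_kwh = 0.12  # USD per kWh (assumed)

power_cost_per_day = power_draw_kw * 24 * electricity_per_kwh
profit_per_day = revenue_per_day - power_cost_per_day
payback_days = card_cost / profit_per_day
print(f"net/day: ${profit_per_day:.2f}, payback: {payback_days:.0f} days")
```

Under these made-up inputs the card pays for itself in well under a year, which is why part-time mining on a gaming card seemed attractive at the time.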


A card used for mining will have worn out fans and potentially damage from heat on it. You'll have to hope for really cheap used cards to make up for that.


I'm running my GPU in the mid 50s Celsius, with plenty of case cooling. (Lowered the power target etc.)


I think goda90 was referring to the second hand market. You may have taken good care of yours, but I suspect that miners won't have been as careful, so buying these could be a bit of a crapshoot.


When it comes to GPU mining (especially ETH) it is very common to undervolt / power-limit a GPU, as it reduces power usage greatly, so the temperatures are lower than usual. Power consumption, together with the price of a kWh, is an important factor that affects mining profitability.
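The reason this works: hashrate typically falls much less than power draw when you lower the power target, so net profit can actually go up. A sketch with purely illustrative numbers (hashrates, revenue per MH/s and kWh price are all assumptions):

```python
# Why power-limiting a mining GPU can pay: hashrate drops less than power.
# All figures are illustrative assumptions, not measured values.
def profit_per_day(hashrate_mhs, usd_per_mhs_day, power_kw, usd_per_kwh):
    revenue = hashrate_mhs * usd_per_mhs_day
    power_cost = power_kw * 24 * usd_per_kwh
    return revenue - power_cost

usd_per_mhs_day = 0.10  # assumed coin revenue per MH/s per day
usd_per_kwh = 0.12      # assumed electricity price

stock = profit_per_day(32.0, usd_per_mhs_day, 0.25, usd_per_kwh)    # full power target
limited = profit_per_day(30.0, usd_per_mhs_day, 0.15, usd_per_kwh)  # ~60% power target
print(f"stock: ${stock:.2f}/day, power-limited: ${limited:.2f}/day")
```

In this toy example the power-limited card earns slightly more per day while running far cooler, which is consistent with the parent's point about second-hand mining cards not necessarily being heat-abused.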


Yes but they are still stuck in open air cases in not exactly pristine environments.

Getting a GPU that has been running 24/7 for 2-3 years will be the mother of all lotteries.


I wonder if Nvidia et al could short Ethereum as a hedge while expanding production.


The markets are not liquid enough for a hedge of that scale (eg to cover a factory investment of 8, 9 or 10 figures USD).


This is why futures exist. They would sell the futures to hedge their crypto exposure.
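Mechanically, sizing such a hedge is just dividing the exposure by the notional value of one contract. The contract size, price, and exposure below are hypothetical placeholders, not real CME/CBOE contract terms:

```python
# Naive hedge-sizing sketch: how many futures contracts to short to
# offset a given crypto-price exposure. All inputs are hypothetical.
import math

exposure_usd = 50_000_000  # revenue judged to depend on the coin's price (assumed)
coin_price = 1_000.0       # current spot price in USD (assumed)
contract_size_coins = 5    # coins per futures contract (assumed)

contract_notional = contract_size_coins * coin_price
contracts_to_short = math.ceil(exposure_usd / contract_notional)
print(contracts_to_short)  # 10000 contracts
```

Whether the market could absorb a short of that size without moving against the hedger is exactly the liquidity objection raised above.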


I don't think that taking out naked short positions on the cryptocurrency market, or a coin that's had 15,000% run-up over the last year, is anything that a publicly traded company, that isn't in finance, would do.


You'd trust your business to a bitcoin exchange?


Futures are sold on CME/CBOE, not on bitcoin exchanges.


Doesn't matter since the price can be easily manipulated. It pays to drain millions into inflating the price or keeping it afloat, if you could fleece the counterparty for billions...


I wonder if they could cripple integer performance; this would prevent mining while probably not hurting gaming performance.


That may not impact mining speeds that much, as most integer operations would then be done using floats.

Also, mining speeds are way more bottlenecked by memory speed than GPU compute these days.


I don't understand why companies like Nvidia don't adapt to the market and offer cards that are specifically designed for miners. Wouldn't that be possible and make them potentially more useful than mere graphics cards?


Because when the crypto bubble pops NVIDIA will be holding the bag on a new line of unwanted products.


There is at least one [0], but I'm frankly not sure where you can buy it.

[0] https://www.asus.com/Graphics-Cards/MINING-P106-6G/


As much as I think it's bad for gaming (right now at least), I think hardware manufacturers artificially limiting what it can be used for is a bad route to go down.


It is worth noting that nVidia contributed as much revenue to TSMC as Bitmain did last quarter. A key reason nVidia can't just ramp up is that other customers might be able to pay more than them for priority.

https://seekingalpha.com/news/3323673-tsmc-expects-iphone-sh...


I remember when the PS3 and Xbox360 had just come out, and everyone was screaming from the rooftops about how PC gaming will just crash and burn. Not for a week or two, but for years after those boxes had been released.

So you're saying PC gaming survived that sustained direct threat only to be wiped out by cryptocurrency mining that isn't even concerned with gaming?

This is a silly exaggeration. In the long term, if the crypto market collapses the prices will come back to normal.

If it thrives instead, the GPU manufacturers will probably scale up their production lines to meet demand.

Or maybe a miracle will happen and we'll see some really good GPU manufacturers pop up who'll make the market more competitive.


> I remember when the PS3 and Xbox360 had just come out, and everyone was screaming from the rooftops about how PC gaming will just crash and burn. Not for a week or two, but for years after those boxes had been released.

This still led to a situation where, for years, AAA publishers wouldn't touch the PC gaming market with a ten-foot pole and focused pretty much all of their efforts on the 7th console generation; this was additionally compounded by the "piracy factor".

During that time PC gaming pretty much only had MMORPGs (which made the big bucks due to subscription-based payment), some RTS and a couple of rare exceptions in the FPS genre going for it, while Steam was still busy establishing itself and the indie scene was nowhere to be found yet.

It was a rather boring time to be a PC gamer as you'd miss out on many of the interesting and unique releases due to them being only released on consoles.


PC as in Windows, or Linux/Mac? Big titles like Quake and Unreal were available on PC (meaning x86-32 plus Windows or x86-64 plus Windows in my post though things like Wine have been amazing for me).

There's always been an ample number of games available on PC. Whether they were boring or not is a matter of opinion (I wouldn't say missing out on, say, Resident Evil is any issue). I am entirely positive all these games had recent equivalents in their genres. I said recent because otherwise the comparison is too broad. Even Rez is just a newer version of Space Harrier, or compare the first Doom or Quake vs the last.

Also, some game vendors sign exclusive deals with a console vendor where they initially only release the game for that specific console. Then you have to buy the console, or wait.

So yeah curious what you felt you were missing out on.

Personally, I don't like consoles because I want nothing to do with this whole Trusted Computing thing. Instead I got a Steam Link for 5 EUR during Black Friday.


> PC as in Windows, or Linux/Mac? Big titles like Quake and Unreal were available on PC (meaning x86-32 plus Windows or x86-64 plus Windows in my post though things like Wine have been amazing for me).

Counter-Strike was way more relevant at that point, but that doesn't change the basic fact that pretty much all the major publishers gave little to no attention to the PC sector during that time.

Case in point: while Halo 1 and 2 still had Windows ports, the following Halo games didn't get them. Same situation with Gears of War: the first game got a Windows port, then none after that until recently.

> Whether they were boring or not is a matter of opinion (I wouldn't say missing out on say Resident Evil is any issue).

That's a matter of personal taste and preferences. Personally, I really enjoyed RE1+2 on my Playstation back then (Just like Metal Gear Solid) which was also my last console before gaming exclusively on PC, skipping the whole sixth console generation, which already led to a bit of a backlog due to many interesting GameCube/PS2 titles.

Wasn't until late in the seventh generation (Xbox360/PS3) that I got back into console gaming, mainly due to the lack of certain genres on PC (third-person games/backlog of exclusives) and the consoles/games having gotten more affordable at that point.

If it wasn't for my interest in MMORPG/RTS/western RPG games I'm not sure I could have held out that long back then, as I also enjoy the occasional JRPG/Arcade Racing Game/Third Person Shooter, which had been rare to non-existent on PC back then.

> Personally, I don't like consoles because I got nothing with this whole Trusted Computing thing. Instead I got a Steam Link for 5 EUR during Black Friday.

There's something to be said for getting tons of games very cheaply from a still functioning second-hand market. Getting a cheap used PS2+tons of games or a Wii+tons of GameCube games offered a lot of game for very little money. Steam had a phase where this worked somewhat well too, but it feels like over these past few years good deals have mostly been replaced by tons of shovelware 99 cent games for card collections or having to wait for Summer/Winter sale events.


> So you're saying PC gaming survived that sustained direct threat only to be wiped out by cryptocurrency mining that isn't even concerned with gaming?

Once bitten, twice shy. I remember the dark time you're referring to. I don't think they've actively contributed to it but I can't help but think that MS and Sony must be seeing this as an opportunity to "twist the knife" some, bring back the good old days of the 7th generation.

I admit that in reality my fears will probably never come to fruition but that doesn't change the way I feel about the current situation.


Are there any non-VR games that even push the 1080ti?

It seems like you'd need some giant 4k monitor to make it sweat.

But also Steam doesn't seem to be releasing a ton of graphics-intensive games. It seems much more cost effective to get away with a lower price and less intensive art requirements and focus on gameplay. And less graphics puts you in a safer place as far as ports to consoles and tablets/mobile (e.g. Stardew Valley).

The Vive HD upgrade will push the envelope further, but I don't think VR has had great success. (Elite Dangerous in particular gets boring because it's rather easy to make a ton of credits and afford everything.)


A 1080TI can take heavy hits even at 2560x1080: Battlefield 1 at max settings can see 90 FPS at times. At 2560x1080 @ 200hz it can't even come close to a consistent 200 FPS with high-end games and settings. PUBG? Frames are meh. Even with an overclocked card it can be rough sometimes, in my experience.

Personally I'll be upgrading to the next card that trumps the 1080 ti, moving away from 2560x1080 200hz (for all but CS; CS is 1280x960 @ 144 on another monitor) and going back to 1080p @ 250hz.

ultrawide is god-tier for movies and productivity work, though.


I just upgraded to 2560x1080 and was wondering if 3440x1440 is worth it. But just the idea of having to push that many pixels made me decide that 1080p is enough. I also have a 200Hz monitor, but I do not mind if I do not get 200 FPS.


90 vs 200 FPS is meaningless on its own.

If it feels off, something else on your rig is messed up. Check latency on your keyboard / monitor etc.


And here I sit in front of 1680x1050 with an ATI Radeon HD 5770, playing Rocket League on Minecraft-like graphics so that the splitscreen with a friend does not stutter noticeably. :D


I have a 4k monitor with two GTX 1080s in SLI and I usually don't get 60fps if I play on max settings. Annoyed by this, I got another monitor at 2560x1440 @ 144hz and I can't max that out either. The games were stuff like Rise of the Tomb Raider or Witcher 3.

I think the 1080ti situation is similar.


I have a gsync 1440p 144hz, running on a 980ti. IMHO, if you can get at least 90 FPS it's a great experience. I don't really see the need for faster frames.

Granted, I believe gsync does a lot to make it appear smoother, so if you don't have that it may be more important to max the frame rate. I do see it when I drop to 60 fps though; it feels choppy.


It's still a silly situation to be in: you spend all that money on a fancy monitor and GPU, yet you can't even max out visual fidelity to make full use of them.

About GSYNC: afaik isn't its purpose to eliminate screen tearing without the use of vsync, thus not getting stuck with the awkward halving of fps if the GPU can't keep up?


Why do people still think you need a high-end rig updated every year or two just to be a PC gamer? Almost every game I want to play runs satisfactorily on a five-year-old machine.

This is bad for the tech-obsessed top 10% of PC gamers. For the rest of us it's meaningless, and it's not going to last in any case.


It's kind of funny when 99% of games are optimised for the PS4 or maybe the Xbox, and both run a Jaguar CPU with a ~1.8 TFLOPS GPU (Xbox slightly lower).

My i7 4770k with a GTX 670 is still running all the games I throw at it, even if I might have to go down from ultra-high effects in some.

However, I have to admit I play mostly on PS4/Switch nowadays.


A lot of those people are buying cards/systems for the next 5 years, replacing their 5-year-old setups.


Depends on the quality of your five-year-old machine. Graphics cards matter more than CPUs these days.

e.g. I found I could play high-end games (Deus Ex: MD for instance) fine on my rig with a Core i7-3770 (almost 6 years old!). However, it lagged on my original GeForce GTX 650 (~4.5 years old at that time), prompting an upgrade to a GeForce 1060 (an upper-mid-range card that is now super expensive).


You should be glad those people exist, because they subsidize the R&D for people who buy older/cheaper cards.


Enh, honestly, if graphics somehow stopped advancing forever right now, it wouldn't bother me that much as a gamer. The era where good game concepts just couldn't be made because the hardware couldn't handle it is long over. We're just adding successive layers of polish now. That's nice, but it doesn't add much to gameplay or fun.


You could find somebody who said or thought this very thing each year for the last 20 years.


Who do you think paid for the R&D on the 5 year old cards you're playing on today?


The games I sink the most time into aren't even that intensive: Klei and Zachtronics games.


We've had the most fun recently with one of those "Pandora's Box" pirate 1980s arcade simulators. I believe internally it runs on a 32 bit ARM chip.


If those games are your thing, you'd probably love Factorio.


>I'm legitimately worried about the long term future of my hobby (PC gaming)

The equipment required for gaming usually loses value very quickly, yet old 7970s released in 2012 command amazing resale prices in 2018. Everything else in a rig from back then is now worthless apart from scrap metal.

GPUs are one of the biggest expenses for gaming; as a hobby it got far cheaper. The fear is unfounded and this is typical alarm-journalism.

Now if only CPUs held their value 6 years on.


My i7 2600k from 2011 is still doing great now, even not overclocked. Maybe not much resale, but I would guess it would play almost all games today fine (not much of a gamer myself). Actually, the upgrade from DDR3 to DDR4 is probably as much of a factor.


As an avid gamer, with an i7 2600k@5GHZ I can confirm that it's still up to the task with anything I've thrown at it so far, the bottleneck is pretty much always the GPU, and it's not looking like that will be changing anytime soon.

And I'm still sticking with DDR3, which runs with way sharper timings compared to DDR4.


> Old 7970's released in 2012 command amazing resale prices in 2018. Everything else in a rig back then is now worthless apart from scrap metal.

RAM is much more expensive and keeps resale value well. I could sell my 16GB from 2012 now almost for twice the price (!!), which is also keeping me from upgrading to 32GB.


Monetary value, perhaps.

Gaming value? Actually the reverse is true - everything in my i5-3570K's rig is from 2012, except for the GTX 1080 and it ran everything 2017 @1440p very well.

Admittedly no BF1 or PUBG as I grew tired of the FPS genre a long time ago.


Don’t worry. This kind of supply constraint is an intrinsically short-term thing.

Either...

* crypto mining will crash (in which case you’ll be able to get powerful cards for cheap since the secondhand market will be flooded)

* or crypto mining will move on from the stop-gap measure of using gaming cards (more sophisticated setups will be more efficient and everyone is competing)

* or — in the highly unlikely event the current situation proves to be stable — card manufacturers will ramp up production


I agree with all of these. I think the biggest worry is that graphics card manufacturers will be, justifiably, very reluctant to ramp up production due to how volatile the cryptocurrency ecosystem is. If they first ramped up production to meet increased demand, your scenario 1 (crypto market crashes) would hit them doubly hard, since not only would they not be able to sell lots of cards to miners, but the miners would be selling their used cards at cut-rate prices.


The 1050 and the 1060 3GB GPUs are more than good enough to run any tier one game at nearly the highest graphics settings on a 24" monitor. Everything above that is just ridiculousness for the sake of being absurd in the gaming industry, and for benchmarks that don't reflect real usage - unlike GPU mining, where the cards are actually pushed to their maximums.


I have the 1060 6GB. It can max out any game on my 1080p 60hz 24" monitor.

However, that's table stakes these days. 1440p monitors are gaining ground as people realize the DPI difference between their PC monitor and their smartphone. Moreover, most people can also notice the difference between 60hz and 100hz monitors.
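That DPI gap is easy to quantify: pixel density is just the diagonal pixel count divided by the diagonal size in inches. The monitor and phone figures below are typical examples I've picked for illustration, not survey data:

```python
# Pixel density (PPI) from resolution and diagonal size.
# Example devices below are illustrative assumptions, not survey data.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

monitor_24in_1080p = ppi(1920, 1080, 24)   # ~92 PPI
monitor_27in_1440p = ppi(2560, 1440, 27)   # ~109 PPI
phone_5_5in_1080p = ppi(1920, 1080, 5.5)   # ~401 PPI
print(round(monitor_24in_1080p), round(monitor_27in_1440p), round(phone_5_5in_1080p))
```

Even a 1440p desktop monitor sits at a small fraction of a phone's density, which is why people notice the difference once they start comparing screens side by side.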


So now we're talking about bleeding-edge technology in the 1440p / 100hz realm. I don't feel really bad that gamers can't readily get access to the technologies that drive the 0.0001% of gamers to these settings, no more than I feel bad for cryptocurrency miners who spend $1200 on a 1080ti and configure it like an idiot and get half the earning potential they should.


1440p is hardly bleeding edge in the productivity end.

If you already have that kind of monitor, why not reuse it for gaming?


~4% of people are on 1440p or higher monitors according to the Steam hardware survey - http://store.steampowered.com/hwsurvey/

It's high-end, but not exactly what I'd call bleeding-edge / 0.0001% of gamers.


Why is a noticeable difference considered meaningful? It won't make the gameplay any better.

(Asking as a Luddite who doesn't see the point of Retina displays.)


Gameplay is only part of what a game offers. Increased realism or visual acuity from higher resolutions, larger textures, higher framerates, better anti-aliasing, better shadow rendering, and so on can contribute to feeling more immersed in the world of the game. Many gamers, myself included, use gaming as an escape from the world - it is helpful that the "world" I escape to look and feel real within the context of the fictional world.


I don't have a strong argument for DPI in gaming, I find DPI to be more noticeable when watching movies/reading text.

Going from 60 => 100 fps on the other hand makes fast-paced games feel much more fluid and responsive.


"It looks nice(r)" is basically the end of that argument. Some people will care (I find easier to be immersed in a game when technical limitations aren't as apparent), others won't.


1060 3GB cards are also going through price shocks. I bought a 6GB GeForce 1060 for $250 in Oct 2016. Even the 3GB version is going for $450+ now.


> Consoles are shielded by the ability of Sony and Microsoft to take losses which will be absorbed by online fees and higher game prices

I imagine they are also shielded to some extent by being AMD APU-based, so the market for discrete GPUs has less of an effect.


AMD APUs (and Intel+AMD Graphics APUs) might be what keep PC gaming alive for the time being. If AMD can pack a chip that can match an RX 460 onto a single die for something like $200, it'd be the go-to chip for an entry-level system.


Let's add some perspective. The average price of a PC in 1995, right in the middle of PC gaming's glory years, was $1500, which is $2500 in today's currency. That's an average PC, not top of the line, which went for a multiple of that. Unless average graphics cards start costing $1000, we're nowhere near 90's prices.

PC gaming will do fine. Don’t worry. Maybe game makers will be a little slower to move to higher end specs, and that’s about all it will amount to.


Long term it is actually not a problem: with high demand, more factories will be built, but that takes time.

Short-term, we're stuck with high prices.


I had no issues at all getting a 1080 when those were new. I don’t think the next generation will really be that bad either. Even with what happened with the Vega release, it was easier to find a Vega GPU than it was to find a Nintendo Switch when those were just coming out.


There is a side effect in that the margins of prebuilt PCs should rise.


"... ML is at least useful to humanity (sometimes), smartphones and tablets are tangible products, and price fixing can be investigated and punished."

I'm amused that you're qualifying the worthiness of a project by its usefulness to humanity. In comparison to PC gaming, to me, revolutionizing the financial world seems a lot more "useful to humanity".


> revolutionizing the financial world seems a lot more "useful to humanity".

1. Citation needed.

2. Revolutions aren't always positive changes.



