I have had this exact conversation at various past employers, usually when they started talking about "big data" and Hadoop and friends:
"How much data do you expect to have?"
"We don't know, but we want it to scale up to be able to cover the whole market."
"Okay, so let's make some massive overestimates about the size of the market and scope of the problem... and that works out to about 100Mb/sec. That's about the speed at which you can write data to two hard drives. This is small data even in the most absurdly extreme scaling that I can think of. Use postgres."
Even experienced people do not have meaningful intuitions about what things are big or small on modern hardware. Always work out the actual numbers. If you don't know what they are then work out an upper bound. Write all these numbers down and compare them to your measured growth rates. Make your plans based on data. Anything that you've read about in the news is rare or it wouldn't be news, so it is unlikely to be relevant to your problem space.
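To make that concrete, here is the kind of back-of-envelope arithmetic I mean. Every input below is a made-up, deliberately generous overestimate rather than a real figure from that conversation; substitute your own numbers:

    # Back-of-envelope upper bound: deliberately overestimate every input.
    # All inputs are hypothetical placeholders, not real measurements.
    instruments = 100_000       # "the whole market", rounded way up
    updates_per_sec = 10        # average updates per instrument per second
    bytes_per_update = 100      # generous record size, including overhead

    ingest = instruments * updates_per_sec * bytes_per_update    # bytes/sec
    print(f"ingest rate: {ingest / 1e6:.0f} MB/sec")              # -> 100 MB/sec

    hdd_write = 50e6            # conservative sequential write per drive, bytes/sec
    print(f"drives needed to keep up: {ingest / hdd_write:.1f}")  # -> 2.0

If your measured growth rate stays orders of magnitude below a bound like that, a single Postgres box is the right answer.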
"How much data do you expect to have?"
"We don't know, but we want it to scale up to be able to cover the whole market."
"Okay, so let's make some massive overestimates about the size of the market and scope of the problem... and that works out to about 100Mb/sec. That's about the speed at which you can write data to two hard drives. This is small data even in the most absurdly extreme scaling that I can think of. Use postgres."
Even experienced people do not have meaningful intuitions about what things are big or small on modern hardware. Always work out the actual numbers. If you don't know what they are then work out an upper bound. Write all these numbers down and compare them to your measured growth rates. Make your plans based on data. Anything that you've read about in the news is rare or it wouldn't be news, so it is unlikely to be relevant to your problem space.