Your comment was at the top, so I continued reading expecting to find a bunch of ignorant groupthink about how git is awesome and Facebook is dumb, but that's not really what's going on down below.
I don't know what Facebook's use case is, so I have no idea whether their repositories are optimally structured. However, I've used git on a very large repository and ran into some of the same performance issues they did (30+ seconds to run git status), so it's not hard to imagine they're in a similar situation.
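For what it's worth, before we split anything we got some relief from a couple of real git config knobs aimed at exactly that working-tree scan cost. A rough sketch against a throwaway repo (in practice you'd set these on the big repository itself):

```shell
# Throwaway repo just to demonstrate the settings.
tmp=$(mktemp -d)
git init -q "$tmp"
# Cache the results of untracked-file scans between runs of git status.
git -C "$tmp" config core.untrackedCache true
# Since git 2.24, this umbrella flag enables index format v4 plus the
# untracked cache, both aimed at repositories with very many files.
git -C "$tmp" config feature.manyFiles true
```

Your mileage will vary with filesystem and git version; these don't make a huge monorepo free, but they can take "30+ seconds" down considerably.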
What we did to solve it is exactly what you're excoriating the people below for suggesting: we split the repos and used other tools to manage multiple git repos, 'Repo' in some situations, git submodules in others.
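To make that concrete, the submodule side of the workflow looks roughly like this (the 'app'/'lib' layout here is invented for illustration; Repo works similarly but is driven by a manifest file instead):

```shell
# Invented layout: a parent 'app' repository vendoring a 'lib' repository
# as a git submodule.
tmp=$(mktemp -d)
git init -q "$tmp/lib"
git -C "$tmp/lib" -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "initial commit"
git init -q "$tmp/app"
# Newer git (2.38+) blocks file-based submodule clones by default,
# hence the protocol override for this local demo.
git -C "$tmp/app" -c protocol.file.allow=always \
    submodule add "$tmp/lib" vendor/lib
# Clone-side counterpart: fetch and check out all submodules.
git -C "$tmp/app" submodule update --init --recursive
```

The day-to-day cost is that contributors have to learn submodule semantics, which is part of why we only kept this where the split bought us something beyond speed.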
However, we moved to that workflow mainly because it had a number of other advantages, not just because it made day-to-day git operations faster.
I hope git gets faster; some of the performance problems described are things we saw too. But things are always more complicated than they appear, and I see nothing below that looks like the knee-jerk, ignorant consensus you're describing.
Sometimes the answer to "it hurts when I do this" is "don't do that... because there are other ways to solve the same issue that work better for a number of other reasons, and we haven't bothered fixing that particular one because most of the time the other way works better anyway."
On a similar note, I've heard of people who would hit the limit on Fortran files, so they put every variable into a function call to the next file, which itself contained one function and a function call to the next file after that (if necessary).
It is intuitively obvious that it is better to be rich and healthy than poor and ill. Sadly, real-world choices are rarely that clean. And you cannot always just split a repo. Hearing some ideas for that case would have been interesting.
Solving a scaling problem by splitting it is, well, obvious.
And, yes, I've also run GitHub on a couple of projects at $work, and the issues are real; I've seen them.
So, if it hurts when I try to use git, the answer will be "don't use git"... But the conveniences are so tempting...