Interesting, because I almost think of it the opposite way. LLMs are like System 1 thinking: fast, intuitive, driven by what seems most probable given what you know, have experienced, or have been trained on. System 2 thinking is different: slower, more careful, logical, deductive, more like symbolic reasoning. And then you'd need some metasystem to tie the two together and make them work cohesively.