Saturday, February 15, 2025

"Chain of Thought" Reasoning Requires 10X the Compute of Simple Inference

One reason some observers are skeptical of DeepSeek's cost claims (for both training and inference) is that its architecture, including the use of "chain of thought" for inference, requires more processing, not less, compared to the "simple inference" models used by ChatGPT, for example.


Granted, chain of thought models break down problems into smaller steps, showing the reasoning at each step. So one might argue that each of the smaller steps requires less processing. On the other hand, more processing steps must occur.


So chain of thought models require more processing power and time than simple inference approaches, though CoT is viewed as better suited to more complex problems.
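A rough back-of-envelope sketch makes the trade-off concrete. The model size, token counts, and the common ~2 FLOPs-per-parameter-per-token rule of thumb below are illustrative assumptions, not DeepSeek's or Cerebras's actual figures: decode cost scales roughly linearly with tokens generated, so a chain-of-thought answer that emits ten times the tokens of a direct answer costs roughly ten times the compute.

```python
# Illustrative sketch only; all numbers are assumptions, not measured values.
# Decoder inference cost scales roughly linearly with generated tokens,
# at approximately 2 FLOPs per model parameter per token.

def inference_flops(output_tokens, params=70e9, flops_per_token_per_param=2):
    """Rough decode-cost estimate for a dense transformer."""
    return output_tokens * params * flops_per_token_per_param

direct = inference_flops(output_tokens=100)    # short, direct answer
cot = inference_flops(output_tokens=1000)      # step-by-step reasoning trace

print(f"direct answer: {direct:.2e} FLOPs")
print(f"CoT answer:    {cot:.2e} FLOPs")
print(f"ratio: {cot / direct:.0f}x")
```

Under these assumed numbers, the longer reasoning trace alone accounts for a 10x compute gap, even before considering any per-step differences.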


source: Cerebras Systems 


Still, the CoT "more processing" profile is hard to square with claims that DeepSeek requires less compute.

