Friday, February 21, 2025

Perplexity Activates "Deep Research" Using Chain of Thought

Perplexity has released its Deep Research feature, using several techniques to improve results for complex queries. 


Deep Research combines chain-of-thought (CoT) prompting, dynamic computation allocation, and hybrid neural-symbolic processing. In short, Deep Research reasons about how to solve a problem and dynamically adjusts the abstraction level of its reasoning steps to match the problem's complexity. 
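Perplexity has not published these internals, so as a rough illustration only, here is a hedged Python sketch of what dynamic computation allocation might look like: pick a reasoning budget from a crude complexity heuristic. The function name, heuristic, and thresholds are all invented for illustration.

```python
def choose_reasoning_budget(query: str) -> int:
    """Return a number of reasoning steps to allocate for a query.

    A toy heuristic: longer, multi-clause questions get more steps.
    The weights and thresholds are invented for illustration; they are
    not Perplexity's actual mechanism.
    """
    words = len(query.split())
    clauses = query.count(",") + query.count(" and ") + query.count("?")
    complexity = words + 5 * clauses
    if complexity < 15:
        return 1    # simple lookup: answer directly
    elif complexity < 40:
        return 4    # moderate: a short chain of steps
    return 10       # complex: allocate a deeper reasoning chain
```

A real system would spend that budget on extra model calls or longer reasoning traces; the point here is only that simple queries get less compute than complex ones.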


Answers take longer, since the model does more reasoning and therefore more computing than with the (currently) more common linear approach, so I would not recommend using it for simple queries. You’ll get answers faster using the legacy approach. 


Chain-of-thought prompting guides language models to break down complex problems into a series of logical steps, sometimes improving answer accuracy. 
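As a minimal sketch of the idea (the instruction wording below is my own, not Perplexity's), chain-of-thought prompting can be as simple as asking the model to write out its intermediate steps before committing to an answer:

```python
def cot_prompt(question: str) -> str:
    """Wrap a question in a chain-of-thought style instruction.

    The instruction text is illustrative; production systems tune this
    wording and often prepend worked few-shot examples before the question.
    """
    return (
        "Solve the following problem. Think through it step by step, "
        "writing out each intermediate step, then state the final answer "
        "on its own line prefixed with 'Answer:'.\n\n"
        f"Question: {question}"
    )

# Example: the model would be sent this string instead of the bare question.
print(cot_prompt("A train travels 120 km in 1.5 hours. What is its average speed?"))
```

The bare question and the wrapped prompt go to the same model; only the instruction changes, which is why the technique is called prompting rather than fine-tuning.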



