LLMs need focus, because attention isn't enough. (A thread on the Focus Chain, Cline's approach to persistent context.)