Context...
- Andrew J Calvert

I recently wrote about derived prompts and got instant feedback from several readers. Most was along the lines of "all of these assume there is context already present - how do you suggest someone give context ahead of each prompt?"
Context matters because you have knowledge the model doesn't: your audience, your history, your constraints, your definition of "good." Context is the mechanism for injecting that judgment; without it, the model substitutes its own generic defaults. Context also prevents confident wrongness (models don't say "I don't know what you want," they produce something anyway). And context shapes the entire reasoning path, not just the surface: it changes what the model prioritizes, what it treats as a constraint, and what it considers relevant. "Write a summary" for a leader, a coach, or someone early in their career produces structurally different outputs.
So I thought about it, and asked 3 different models how to do this - their responses are below.
Simple questions need simpler context; bigger questions need more. Let me know how you get on!
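One way to make this habit concrete is to keep a small reusable preamble and prepend it to each prompt. A minimal sketch, in Python; the field names (audience, constraints, what "good" looks like) are illustrative choices for this example, not any model's API:

```python
def build_prompt(task, audience=None, constraints=None, good_looks_like=None):
    """Prepend whatever context you have to the task itself.

    Simple questions can skip the context fields entirely;
    bigger questions fill in more of them.
    """
    parts = []
    if audience:
        parts.append(f"Audience: {audience}")
    if constraints:
        parts.append(f"Constraints: {constraints}")
    if good_looks_like:
        parts.append(f"A good answer: {good_looks_like}")
    parts.append(f"Task: {task}")
    return "\n".join(parts)

# A simple question needs little context...
print(build_prompt("What does HTTP 404 mean?"))

# ...a bigger one needs more, and the context changes the output's shape.
print(build_prompt(
    "Write a summary of this quarter's results",
    audience="a senior leader with two minutes to read",
    constraints="one paragraph, no jargon",
    good_looks_like="leads with the decision that needs making",
))
```

The point isn't the helper function; it's that the context travels with every prompt instead of being assumed.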
ChatGPT

Claude

Notebook LM*

*This is from my AI insight notebook, which has context built into it from 20+ white papers, research reports and articles pertaining to AI, its use, shortcomings and utility
