Context Engineering
Context Engineering is the art and science of filling an LLM's context window with the right information at each step of a task. It is the natural evolution of prompt engineering.
Prompt vs. Context Engineering: while prompt engineering works with a static, hand-written prompt, context engineering assembles the context dynamically from multiple sources: user input, tool-call results, conversation history, and developer instructions.
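As a minimal sketch of the idea (all function and variable names below are hypothetical, not from any particular framework), dynamic context assembly rebuilds the window from live sources at each step, trimming the oldest history when a budget is exceeded:

```python
# Minimal sketch of dynamic context assembly (all names hypothetical).
# Instead of one static prompt, the context window is rebuilt each turn
# from several live sources, dropping the oldest history if over budget.

def build_context(instructions, history, tool_results, user_input,
                  max_chars=8000):
    """Assemble the context window; trim oldest history first if needed."""
    history = list(history)

    def render():
        parts = [f"[system] {instructions}"]
        parts += [f"[{role}] {text}" for role, text in history]
        parts += [f"[tool] {result}" for result in tool_results]
        parts.append(f"[user] {user_input}")
        return "\n".join(parts)

    context = render()
    while len(context) > max_chars and history:
        history.pop(0)  # drop the oldest turn and re-render
        context = render()
    return context

context = build_context(
    instructions="You are a coding assistant.",
    history=[("user", "What is OAuth?"), ("assistant", "OAuth is ...")],
    tool_results=['{"file": "auth.py", "lines": 120}'],
    user_input="Now add token refresh.",
)
```

The point of the sketch is the shape of the loop: the "prompt" is recomputed every step from whatever the task currently knows, which is the distinction drawn above.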
Tutorials
- Prompt Engineering Guide by Matthew Berman and Nick Wentz
- Google Prompt Engineering White Paper
Prompt Generators
- ChatPRD AI for Product Managers
- SnapPrompt Turn a screenshot into a ready-to-use prompt that recreates the UI
- Anthropic Prompt Generator/Improver Generate and improve prompts
- OpenAI Playground A wizard for creating prompts
- meta-prompt-workflow An iterative prompt-refinement approach
- Products like Bolt also include a built-in prompt wizard
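The iterative meta-prompt approach listed above can be sketched as a loop in which one prompt asks the model to critique and rewrite another. This is a sketch under the assumption of a generic `llm` callable; all names are hypothetical, and a stub stands in for a real model:

```python
# Sketch of an iterative meta-prompt workflow (all names hypothetical).
# A meta-prompt asks the model to improve a draft prompt for N rounds.

META_PROMPT = (
    "You are a prompt engineer. Improve the following prompt so it is "
    "clearer and more specific. Return only the improved prompt.\n\n{draft}"
)

def refine_prompt(llm, draft, rounds=3):
    """Repeatedly feed the current draft back through the meta-prompt."""
    for _ in range(rounds):
        draft = llm(META_PROMPT.format(draft=draft))
    return draft

# Stub LLM for demonstration: echoes the draft with a marker appended.
def fake_llm(prompt):
    return prompt.split("\n\n", 1)[1] + " (refined)"

result = refine_prompt(fake_llm, "Summarize this article.")
# → "Summarize this article. (refined) (refined) (refined)"
```

With a real model behind `llm`, each round tends to tighten wording and add constraints; the loop is usually capped at a few rounds because improvements plateau quickly.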
Prompt Libraries
- God of Prompt Your AI Superpowers In One Click
- LangChain Hub Explore and contribute prompts to the community hub
- Prompt Hub Community-driven prompt platform
- Prompt Library Spec-to-code templates; an o1 pro template system
- SDLC Prompts GitHub repo of prompts for software development
Best practices
Customizing ChatGPT
- ChatGPT Custom Instructions Collection of custom instructions
Hosted Models
- Openrouter Unified interface for LLMs
- Requesty Intelligent LLM Routing
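Routers like OpenRouter expose an OpenAI-compatible chat-completions endpoint, so switching hosted models is mostly a matter of changing the model string. Below is a stdlib-only sketch that builds (but does not send) such a request; the API key and model name are illustrative placeholders:

```python
# Sketch of a request to an OpenAI-compatible router endpoint
# (OpenRouter-style). Standard library only; nothing is sent here.
import json
import urllib.request

def build_chat_request(api_key, model, user_message,
                       base_url="https://openrouter.ai/api/v1"):
    """Return a ready-to-send POST Request for a chat completion."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("sk-placeholder", "openai/gpt-4o-mini", "Hello!")
# urllib.request.urlopen(req) would actually send it; omitted here.
```

Because the payload shape is the OpenAI one, the same code targets any router that speaks that protocol by changing `base_url`.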
LLM Rankings
Community
- X.com Voices
    - God of Prompt Sharing AI prompts, tips, and tricks
- Reddit
    - r/aipromptprogramming A community focused on using AI tools for AI programming and prompt engineering
    - r/ChatGPTCoding The coding side of ChatGPT