Google AI Prompt Engineering (English)
Prompt Engineering

Author: Lee Boonstra

February 2025

Acknowledgements

Content contributors
Michael Sherman
Yuan Cao
Erick Armbrust
Anant Nawalgaria
Antonio Gulli
Simone Cammel

Curators and Editors
Antonio Gulli
Anant Nawalgaria
Grace Mollison

Technical Writer
Joey Haymaker

Designer
Michael Lanning

Table of contents

Introduction
Prompt engineering
LLM output configuration
  Output length
  Sampling controls
  Temperature
  Top-K and top-P
  Putting it all together
Prompting techniques
  General prompting / zero shot
  One-shot & few-shot
  System, contextual and role prompting
  System prompting
  Role prompting
  Contextual prompting
  Step-back prompting
  Chain of Thought (CoT)
  Self-consistency
  Tree of Thoughts (ToT)
  ReAct (reason & act)
  Automatic Prompt Engineering
Code prompting
  Prompts for writing code
  Prompts for explaining code
  Prompts for translating code
  Prompts for debugging and reviewing code
What about multimodal prompting?
Best Practices
  Provide examples
  Design with simplicity
  Be specific about the output
  Use Instructions over Constraints
  Control the max token length
  Use variables in prompts
  Experiment with input formats and writing styles
  For few-shot prompting with classification tasks, mix up the classes
  Adapt to model updates
  Experiment with output formats
  JSON Repair
  Working with Schemas
  Experiment together with other prompt engineers
  CoT Best practices
  Document the various prompt attempts
Summary
Endnotes

Introduction

When thinking about a large language model input and output, a text prompt (sometimes accompanied by other modalities such as image prompts) is the input the model uses to predict a specific output. You don’t need to be a data scientist or a machine learning engineer – everyone can write a prompt. However, crafting the most effective prompt can be complicated. Many aspects of your prompt affect its efficacy: the model you use, the model’s training data, the model configurations, your word choice, style and tone, structure, and context all matter. Therefore, prompt engineering is an iterative process. Inadequate prompts can lead to ambiguous or inaccurate responses and can hinder the model’s ability to provide meaningful output.

When you chat with the Gemini chatbot,1 you are essentially writing prompts. However, this whitepaper focuses on writing prompts for the Gemini model within Vertex AI or through the API, because prompting the model directly gives you access to configuration settings such as temperature.

This whitepaper discusses prompt engineering in detail. We will look into the various prompting techniques to help you get started, and we will share tips and best practices to become a prompting expert. We will also discuss some of the challenges you can face while crafting prompts.
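To make the difference concrete, here is a minimal sketch of what prompting Gemini programmatically with explicit configuration might look like. This example is not from the whitepaper: the google-generativeai Python SDK, the model name, and the sampling values are assumptions chosen for illustration; later chapters of the whitepaper explain what these settings do.

import google.generativeai as genai

# Assumed setup: the google-generativeai package is installed and an API key
# is available. "YOUR_API_KEY" and the model name below are placeholders.
genai.configure(api_key="YOUR_API_KEY")

model = genai.GenerativeModel("gemini-1.5-flash")

# Prompting via the API exposes configuration that a chatbot UI hides,
# such as temperature, top-p, top-k, and the output length cap.
response = model.generate_content(
    "Explain prompt engineering in one sentence.",
    generation_config={
        "temperature": 0.2,        # lower values give more deterministic output
        "top_p": 0.95,             # nucleus sampling cutoff
        "top_k": 40,               # sample only from the 40 most likely tokens
        "max_output_tokens": 256,  # cap the response length
    },
)

print(response.text)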