Playbook: Alex for Engineers
Your reference for applying Alex to design decisions, cross-functional communication, failure analysis, regulatory documentation, and technical client work. Ready-to-run prompts — built around the hard realities of engineering practice, not the textbook.
What This Guide Is Not
This is not a habit formation guide (see Self-Study Guide for that). This is a domain use-case library — the specific ways Alex supports professional engineering work across disciplines.
Where to Practice These Prompts
Every prompt in this guide works with any AI assistant you already use — GitHub Copilot, ChatGPT, Claude, Gemini, or others. The prompts are the skill; the tool is just where you type them. If you already have a preferred tool, start there.
For the deepest experience, the Alex VS Code extension (free) was built for these workflows. It understands engineering context, lets you save what works with /saveinsight, and keeps your playbook and exercises right inside the editor where you already work.
You don’t need a specific tool to benefit. You need the discipline of reaching for AI when the work is genuinely hard — not just when it’s repetitive.
Core Principle for Engineers
Engineering is the systematic management of consequences under uncertainty. The hardest decisions are not the ones with clear right answers — they are the ones where every option has failure modes and the question is which failure mode is most acceptable in this context.
Alex is most powerful for engineers not as a calculator or library substitute, but as a thinking discipline: a way to force yourself to articulate assumptions, pressure-test reasoning, and anticipate second-order effects before they materialize in hardware, software, infrastructure, or safety cases.
Do not use Alex as a primary source for design values, standards, or code compliance. Verify every technical claim against the authoritative standards for your discipline and jurisdiction.
The Seven Use Cases
1. Design Decision Documentation and Justification
The engineer’s documentation challenge: Engineering design decisions involve tradeoffs that are clear to the engineer who made them and opaque to everyone else — including the engineer revisiting the same drawing two years later. The discipline of design documentation is not recording what was chosen; it is recording why, what was ruled out, and what conditions would require revisiting the decision.
Prompt pattern:
I am making a design decision: [describe the engineering decision].
Discipline: [structural / electrical / mechanical / systems / civil / software / other].
Design context: [project description, functional requirements, relevant constraints].
Options I considered:
Option A: [description, key characteristics]
Option B: [description, key characteristics]
Option C: [description if applicable]
My preferred option and primary rationale: [your reasoning].
Help me:
1. Challenge my rationale — what am I not seeing?
2. Identify failure modes of my preferred option that should be in the design record
3. State the performance assumptions the design depends on
4. Draft a decision record that is useful to a future engineer, not just a project deliverable
Follow-up prompts:
What does this design decision look like under [different loading condition / material change / operational environment]? Does the preferred option still hold?
What is the first sign that this design decision was wrong and I should revisit it?
What assumptions does this design make about operating conditions that I should document explicitly in the design record?
Try this now: You are deciding between welded steel and bolted connections for a pedestrian bridge retrofit. The bridge is 40 years old, in a seismic zone, and the client wants minimal traffic disruption during construction. Paste the constraints into the prompt above and ask for a design justification that captures why you chose one approach over the other. The output becomes a living document your team can revisit.
2. Design Review Preparation
The engineer’s review challenge: Design reviews miss flaws not because the reviewers are unintelligent, but because review is a social process whose professional dynamics inhibit the hardest questions. The most valuable question a review can surface is “what have I not thought of?”, and it is best asked before the review, while the design can still change, not during it, when the design is already partly committed.
Prompt pattern:
I am presenting a design for review: [describe what you are presenting].
Audience: [discipline expertise of reviewers, seniority, any known concerns].
What I am most uncertain about: [honest self-assessment].
What I am hoping does not come up: [the thing you are not sure about].
Key decisions I need the reviewers to accept: [what success looks like from the review].
Help me prepare:
1. The three most likely hard questions and how I should answer them
2. The question I am hoping they do not ask — and whether I am prepared if they do
3. How to present the design so the decisions are clear without being defensive
4. Anything in my design rationale that a skeptical senior reviewer would probe
Follow-up prompts:
Generate the review questions a domain expert in [sub-discipline] would ask about this design.
I disagree with the review feedback: [describe feedback]. How do I present a technical rebuttal respectfully and effectively?
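Try this now: You are presenting a pump station control system design to a panel that includes a senior engineer known for probing redundancy assumptions, and you are privately unsure your single-supplier architecture is defensible. Put that uncertainty into the prompt above and rehearse the hardest question before the panel asks it.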
3. Failure Mode and Root Cause Analysis
The engineer’s failure analysis challenge: Failure investigation is a discipline with a systematic structure, and the most common deviation from that structure is jumping from observed symptom to cause assignment before the investigation has actually established the causal chain. The FMEA framework and root cause discipline exist precisely because human cognition is unreliable at distinguishing “what went wrong” from “what we think went wrong based on what we already believed.”
Prompt pattern:
I am investigating a failure:
Discipline: [mechanical / electrical / structural / system / process].
Observed failure: [what happened, including description of the physical or functional evidence].
Component/system: [what failed or malfunctioned].
Operating conditions at time of failure: [load, environment, duration, any anomalies].
What I currently believe happened: [your hypothesis].
Evidence for and against my hypothesis: [be honest about both].
Generate:
1. Alternative root cause hypotheses I should investigate before settling on mine
2. The physical evidence that would distinguish between these competing hypotheses
3. Contributing factors that may have enabled the root cause (addressing these often prevents recurrence)
4. What I need to preserve or document before the investigation goes further
Follow-up prompts:
My root cause is [X]. What design change or operational control would most reliably prevent recurrence?
I need to write a corrective action report. Help me structure it so it is technically precise, not defensive, and actionable.
What contributing factors beyond the immediate root cause made this failure possible — and which ones should I address to prevent the entire class of failure?
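Try this now: A gearbox bearing failed at roughly half its expected life, and your first instinct is lubrication breakdown. Before committing to that hypothesis, run the prompt above with the operating history and ask for the competing explanations (misalignment, contamination, overload) and the physical evidence that would distinguish between them.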
4. Client and Stakeholder Communication
The engineer’s communication challenge: Technical precision is a professional virtue and a communication liability at the same time. The stakeholder or client who receives a report written at engineering precision rather than at their level of technical literacy will either misread it or not read it, and will then make decisions based on someone else's verbal summary. Engineering communication that does not reach its intended audience has not communicated.
Prompt pattern:
I need to communicate [technical matter: finding / recommendation / risk / design option] to [audience: client / project owner / regulatory body / public / executive leadership].
Technical content: [describe what you need to convey].
What they need to decide or understand: [the specific outcome you need from this communication].
Their likely technical background: [specialist / informed non-specialist / non-technical].
Format: [report / letter / presentation / executive summary].
Generate:
1. The communication at the right level for this audience
2. The technical detail I can include vs. what needs to go to an appendix
3. How to make the recommendation or finding clear without under- or overstating certainty
Follow-up prompts:
The client has misread my report and is drawing an incorrect conclusion. Help me draft a clarification that is diplomatic, not defensive.
I need to explain [complex engineering concept] to a non-technical project sponsor in under three minutes. Draft the verbal explanation.
What is the one figure or diagram that would make this report immediately clear to the intended audience?
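Try this now: A geotechnical finding on your site will delay the client's program by two months. Use the prompt above to draft the message to a non-technical project sponsor: what was found, why it matters, and what the options are, without burying the decision under the site investigation detail.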
5. Engineering Document Scaffolding
The engineer’s blank-page challenge: Engineering documents — specifications, method statements, calculation packs, inspection and test plans — follow conventions that vary by discipline, firm, jurisdiction, and client, and the overhead of starting each one from a blank page is substantial. Alex can produce a well-structured scaffold for you to populate, but the technical content and compliance review must remain yours.
Prompt pattern:
I need to produce a [document type: specification / method statement / calculation report / inspection and test plan / engineering assessment].
Discipline: [your engineering discipline].
Project context: [brief description of the project].
Key content it must address: [the specific technical items this document needs to cover].
Applicable standards or client requirements: [list any relevant standards — but verify compliance yourself].
Generate:
1. A section structure appropriate for this document type and discipline
2. Suggested content for each section at the outline level (I will write the actual technical content)
3. The sections that are most often incomplete or weakly written in this document type
4. Any content I should include that the typical template misses
Follow-up prompts:
For the section you flagged as weakest, show me what a strong version of it looks like in this document type.
How do I structure the assumptions section so a future engineer knows which assumptions to re-validate if conditions change?
Generate the review checklist for this document type — what should a peer reviewer verify before sign-off?
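Try this now: You need an inspection and test plan for a fabrication package and your firm's template is a decade old. Run the prompt above to scaffold the document, then compare the result against the template; the gaps in each direction tell you what to add and what to retire.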
6. Procurement, Specifications, and Vendor Technical Evaluation
The engineer’s procurement challenge: Tender specifications that are insufficiently specific produce non-comparable bids. Specifications that are overly prescriptive eliminate competitive options that could reduce cost or improve performance. The discipline of specification writing is one of the most important and least taught aspects of engineering practice, and the evaluation of vendor technical responses requires the same rigor as design review.
When to use: Writing technical specifications for procurement, evaluating tender responses, conducting vendor technical capability assessments, or negotiating technical requirements.
Prompt pattern:
I am procuring [system/component/service] for [project].
Functional requirements: [what it must do].
Performance requirements: [measurable targets].
Environment and operational conditions: [context].
Applicable standards: [relevant codes or standards — verify these yourself].
Known failure modes I want to spec against: [honest list of what could go wrong with a poor solution].
Help me:
1. Build a specification structure that will produce comparable bids
2. Identify the requirements I have left vague enough to enable non-compliance in plain sight
3. Design the key assessment criteria for technical evaluation of bids
4. Draft the technical questions I should ask each bidder to surface capability gaps
Follow-up prompts:
Vendor A has submitted a bid that technically meets my specification but does not do what I need. Help me identify whether this is a specification failure or a vendor interpretation issue — and how to resolve it.
I am evaluating three bids. Here are the key differences: [describe]. Help me build a weighted technical evaluation scorecard.
The procurement process requires I treat all bids equally, but one vendor is clearly less technically competent. How do I construct the evaluation so this is visible in the scoring?
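Try this now: You are specifying a standby generator for a water treatment facility. List the functional requirements, the site conditions, and the failure modes of a poor installation, then ask the prompt above where your draft specification is still vague enough for a low bid to comply on paper and fail in service.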
7. Regulatory Submissions and Safety Case Development
The engineer’s regulatory challenge: Safety cases and regulatory submissions fail for two common reasons: claims that are not fully supported by the evidence presented, and presentation at a level of technical detail that the regulator cannot evaluate efficiently. A submission that requires the regulator to go back for clarification wastes time, signals inadequate preparation, and often results in more scrutiny than a clean submission would have. The discipline is not just technical correctness but demonstrable traceability from claim to evidence.
When to use: Preparing regulatory submissions, developing safety cases, responding to regulator requests for information, or building compliance arguments for new or modified systems.
Prompt pattern:
I am preparing a [submission type: safety case / approval application / design acceptance / environmental statement / regulatory notification].
Regulatory body: [name the regulator or regulatory framework].
Subject: [what is being approved, certified, or notified].
Core safety or compliance argument: [what I am claiming and on what basis].
Evidence I have: [types and quality of evidence available].
Help me:
1. Structure the submission so the claim-evidence chain is visible and traceable
2. Identify where I am asserting something I have not yet fully evidenced
3. Anticipate the regulator's most likely technical objections
4. Flag where I am relying on assumptions rather than demonstrated compliance
Follow-up prompts:
The regulator has issued a request for information: [describe the question]. Help me draft a response that answers it precisely without volunteering unnecessary information.
I need to write the argument section of a safety case for [system/hazard]. Help me build the structured argument — I will supply the evidence and technical claims.
My safety case has a gap between [claim] and [evidence]. What options do I have to close this gap: additional testing, engineering justification, or operational control?
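Try this now: You are preparing a design acceptance submission for a modified system, and one claim rests on engineering judgment rather than test evidence. Run the prompt above, flag that claim honestly, and ask for options to close the gap before the regulator finds it for you.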
What Great Looks Like
After consistent use, you should notice:
- Design decisions are documented with reasoning, not just conclusions
- Design reviews surface harder questions because you have pre-loaded them yourself
- Failure investigations are more systematic and less anchored on first hypotheses
- Technical reports are calibrated to their audience — executives get summaries, regulators get traceability
- Procurement specifications produce comparable bids instead of surprises
The engineers who will be most effective in an AI-augmented practice are not those who offload technical judgment — they are those who use AI to think more rigorously, document more completely, and communicate more clearly while keeping the professional responsibility where it belongs: with themselves.
Your AI toolkit: These prompts work in ChatGPT, Claude, Copilot, Gemini — and in the Alex VS Code extension, which was designed around them. Start with whatever you have. The skill transfers across all of them.
Your First Week Back: Practice Plan
| Day | Task | Time |
|---|---|---|
| Day 1 | Use the Design Decision Documentation pattern on a recent design choice | 25 min |
| Day 2 | Run the Design Review Preparation pattern before your next review | 25 min |
| Day 3 | Use the Failure Analysis pattern on an open technical problem | 25 min |
| Day 4 | Try the Client Communication pattern on a report you are writing | 25 min |
| Day 5 | Review the week’s prompts — save your three best with /saveinsight | 25 min |
Month 2–3: Advanced Applications
Track Your Growth
Project Technical Decisions Archive
Keep a queryable record of engineering decisions across projects:
/saveinsight title="Design decision: [brief description]" insight="Context: [project, system]. Chose: [option]. Rejected: [alternatives]. Rationale: [key reasoning]. Assumptions: [what must be true]. Revisit if: [trigger conditions]. Proved correct/incorrect: [post-implementation learning]." tags="engineering,design-decisions"
Lessons Learned Library
/saveinsight title="Engineering lesson: [topic]" insight="What happened: [brief]. Root cause: [specific]. What we should have done: [specific]. What to check next time: [specific test or review step]. Applicable to: [project type or system type]." tags="engineering,lessons-learned"
Continue your practice: Self-Study Guide — the 30/60/90-day habit guide.
Show the world you've mastered AI for engineering. Add your verified certificate of completion to LinkedIn.