AI-Powered Generation

Use conversational AI to generate compliant requirement candidates from natural language.

The AIRGen Chat

Navigate to the Ask AIRGen page from the left-hand sidebar. The chat interface works like a conversation: you describe your system need, constraint, or objective in plain language, and AIRGen responds with structured requirement candidates.

There is no special syntax to learn. Write the way you would explain a requirement to a colleague. For example:
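  "We need the battery management system to prevent cell overcharging in every operating mode, including charging from an external power supply."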

Each message you send becomes part of the conversation history. AIRGen uses the full conversation thread to maintain context across follow-up messages, so you can refine and iterate without repeating yourself.

Generating Candidates

When you submit a message, AIRGen uses your project context — existing requirements, uploaded documents, and architecture data stored in the Neo4j project graph — to generate 1 to 5 candidate requirements.

Each candidate is formatted as a proper requirement with three parts:
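  1. Title: a short, descriptive name for the requirement.
  2. Description: the requirement statement itself.
  3. Rationale: an explanation of why the requirement is needed.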

How many candidates? AIRGen generates between 1 and 5 candidates per prompt, depending on the complexity and ambiguity of your input. A narrow, specific prompt typically produces 1-2 candidates. A broader system need may produce up to 5 variations for you to choose from.
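If it helps to picture the structure, the sketch below models a candidate as a plain record with those three parts, plus the review actions described in the next section. The interface name and types are illustrative assumptions, not AIRGen's actual data model.

    // Illustrative sketch only: the interface name and types are assumptions;
    // only the three field names (title, description, rationale) come from this page.
    interface RequirementCandidate {
      title: string;        // short, descriptive name for the requirement
      description: string;  // the requirement statement itself
      rationale: string;    // why the requirement is needed
    }

    // The three review actions available for every candidate.
    type CandidateAction = "edit" | "accept" | "reject";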

Reviewing Candidates

Candidates appear in the Smart Candidates panel on the Ask AIRGen page. Each candidate is displayed as an editable card. You have three actions available for every candidate:

  1. Edit — Modify the title, description, or rationale directly in the card. Fix wording, adjust thresholds, or tighten scope before accepting.
  2. Accept — Add the candidate to your project as a new requirement. It immediately becomes part of your requirement set and is available for QA scoring, tracing, and baselining.
  3. Reject — Discard the candidate entirely. Rejected candidates are removed from the panel and are not added to your project.

Edit before accepting. AI-generated candidates are starting points, not final requirements. Take a moment to sharpen the language, add measurable criteria, and verify the scope before you accept. This small step significantly improves your QA scores.

Context-Aware Generation

AIRGen does not generate requirements in a vacuum. The AI uses data from your Neo4j project graph to produce contextually relevant candidates. Specifically, it considers:
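  - Existing requirements already in your project
  - Uploaded documents, such as specifications, concept of operations, and interface documents
  - Architecture data stored in the project graph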

Because of this context awareness, the quality of the candidates improves as your project grows. A project with uploaded specifications and a defined architecture will produce more precise candidates than an empty project.
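To make this concrete, the sketch below shows the kind of lookup the backend might run against the project graph, written with the TypeScript neo4j-driver package. The node labels, relationship type, property names, and connection details are all assumptions for illustration; AIRGen performs this retrieval for you, and its actual graph schema is not documented here.

    // Hypothetical sketch: labels, relationships, properties, and credentials
    // are assumed for illustration, not AIRGen's actual schema.
    import neo4j from "neo4j-driver";

    async function fetchExistingRequirements(projectId: string) {
      const driver = neo4j.driver(
        "bolt://localhost:7687",
        neo4j.auth.basic("neo4j", "password")
      );
      const session = driver.session();
      try {
        // Pull requirements already linked to the project so new candidates
        // can be phrased consistently with what is already there.
        const result = await session.run(
          `MATCH (p:Project {id: $projectId})-[:HAS_REQUIREMENT]->(r:Requirement)
           RETURN r.title AS title, r.description AS description`,
          { projectId }
        );
        return result.records.map((rec) => ({
          title: rec.get("title") as string,
          description: rec.get("description") as string,
        }));
      } finally {
        await session.close();
        await driver.close();
      }
    }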

Privacy note. Your project data is used only to generate candidates within your session. It is not shared across tenants or used to train models. Self-hosted deployments keep all data on your own infrastructure.

Best Practices

Follow these guidelines to get the most out of AI-powered generation:

Be Specific in Your Prompts

Include the system boundary, performance targets, environmental conditions, and any safety constraints that apply. The more concrete your input, the more precise the output.
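For instance, instead of "The drone should have good endurance," write "The drone shall provide at least 40 minutes of hover time at sea level in winds up to 8 m/s."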

Iterate with Follow-Up Messages

Use the conversation thread to refine candidates. If the first set of candidates is close but not right, send a follow-up message explaining what to change:
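  "Make the response time 2 seconds instead of 10, and split the second candidate into two separate requirements: one for detection and one for operator notification."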

Upload Documents First

Before generating requirements, upload your system specifications, concept of operations, and interface documents. The richer your project context, the better the AI can align its output with your actual system.

Review QA Scores After Accepting

After accepting a candidate, run QA scoring immediately. This catches quality issues — ambiguous language, missing measurability, or compound statements — while the requirement is still fresh in your mind and easy to fix.