Enter Your Prompt
Type any question or prompt you want to analyze. Complex queries with multiple aspects work best.
Enter a prompt to discover the actual search queries that Gemini runs internally when answering your question. Understand how AI expands your query into multiple searches, and see which searches were actually performed: not model-generated guesses, but queries measured from real execution logs.
When you ask an AI assistant a question, it doesn't just search for your exact query. Instead, it "fans out" your single question into multiple related searches to gather comprehensive information. This tool extracts and visualizes the actual internal search queries that Gemini (with Google Search grounding enabled) used, showing you the real queries that were executed rather than speculative, model-generated ones.
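To make the mechanism concrete, here is a minimal sketch, assuming the google-genai Python SDK and a model that supports Google Search grounding; the model name and API key are placeholders, and this tool's actual internals may differ. When grounding runs, the response carries grounding metadata that lists the web search queries the model really executed.

```python
# Minimal sketch: call Gemini with Google Search grounding enabled and
# read back the search queries it actually executed. The model name and
# API key are placeholders; details may vary by model and SDK version.
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Compare the battery life of the latest flagship phones",
    config=types.GenerateContentConfig(
        tools=[types.Tool(google_search=types.GoogleSearch())],
    ),
)

# web_search_queries holds the queries the model ran, as opposed to
# queries it merely considered or could have generated.
metadata = response.candidates[0].grounding_metadata
if metadata and metadata.web_search_queries:
    for query in metadata.web_search_queries:
        print(query)
```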
Gemini with Google Search grounding processes your prompt and executes multiple search queries while generating its answer. This tool extracts those queries from the execution logs.
See all the individual search queries that were used to gather information for your answer.
Understanding which queries the AI actually executed, not which ones it might plausibly generate, is the most important insight for LLMO and AI search optimization. By basing your strategy on the queries LLMs really ran rather than on speculation, you can design content that gets cited and recommended by AI.
One prompt triggers 5-15 different searches on average. Your content needs to be discoverable across all these variations.
Discover the exact phrases AI uses to find information. Use these queries to inform your content strategy.
See which sources are cited for each fan-out query. Identify opportunities where your competitors are being mentioned but you're not.
Identify sub-topics within a main query that you may not be covering. Each fan-out query represents a potential content opportunity.
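To expand on the cited-sources point above, the same grounding metadata also lists the sources the answer drew on. In the Gemini API these appear as grounding chunks with a web URI and title; how this tool maps each source back to an individual fan-out query is an assumption here, since the raw metadata groups sources per response rather than per query.

```python
# Sketch: list the sources the grounded response cited. grounding_chunks
# come from the same metadata object as the queries above; the URIs may
# be Google redirect links rather than the original page addresses.
for chunk in (metadata.grounding_chunks or []):
    if chunk.web:
        print(chunk.web.title, chunk.web.uri)
```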
Query fan-out is the process where AI assistants like Gemini expand a single user query into multiple related search queries to gather comprehensive information. Instead of searching for just what you asked, AI searches for many related terms to provide a more complete answer. This tool extracts and visualizes the actual search queries that Gemini executed during answer generation: a measurement-based analysis of real queries, not a list of generated guesses.
Research shows that Gemini uses an average of 9 fan-out queries per prompt, with some complex queries generating up to 28 different searches. This varies based on the complexity and specificity of your original question.
Use the fan-out queries to understand what terms AI uses to find information about your topic. Ensure your content covers these variations. If a fan-out query returns competitor content, that's a content gap you should address.
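As a hypothetical gap check building on the sketches above, you could compare the cited source titles against your own domain. The domain string and the idea of treating every other cited source as a potential competitor are illustrative assumptions, not part of the Gemini API or this tool.

```python
# Hypothetical gap check: which cited sources are not yours?
# "example.com" stands in for your own domain; grounding chunk titles
# often carry the source domain, but treat this as a rough heuristic.
my_domain = "example.com"
cited = [
    chunk.web.title
    for chunk in (metadata.grounding_chunks or [])
    if chunk.web and chunk.web.title
]
gaps = [title for title in cited if my_domain not in title]
print("Cited sources that aren't yours:", gaps)
```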
Simple or conversational prompts may not trigger web search. Gemini may answer directly from its training data. Try more specific, factual, or recent-event questions that require current web information.
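In the sketch above, this case would show up as missing grounding metadata; the exact check is an assumption and may vary by SDK version.

```python
# If no web search was triggered, the grounded call may return no
# web_search_queries (or no grounding metadata at all).
metadata = response.candidates[0].grounding_metadata
if not metadata or not metadata.web_search_queries:
    print("No web searches were executed; try a more specific prompt.")
```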