AI Setup & Usage

Grep Badger includes an AI panel so you can ask questions about the log you are viewing.

You can use a local AI provider such as Ollama, or an external provider such as OpenAI or Anthropic.

By default, Ollama is added as a provider. Many of the models available through Ollama run on CPU alone.

1. Open the AI panel

The right panel contains an AI interface.

When you open a log, you can click lines in the log viewer to select context before sending a prompt.

This is useful when you want AI to focus on a specific error, stack trace, or section of the log.

2. Default AI setup

By default, Ollama is configured as a provider.

For privacy, we recommend using a local AI model when possible.

The app assumes Ollama is installed and a model has been pulled; if not, the Ollama provider will not work. Install Ollama if you want to use local AI.
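If you are unsure whether local AI will work, a quick reachability check can help. The sketch below assumes Ollama's default endpoint of http://localhost:11434 (adjust if your installation differs) and simply reports whether the server answers:

```python
# Sketch: check whether a local Ollama server is reachable.
# The default endpoint (http://localhost:11434) is an assumption;
# change it if your installation uses a different host or port.
from urllib.request import urlopen
from urllib.error import URLError

def ollama_is_running(base_url: str = "http://localhost:11434",
                      timeout: float = 2.0) -> bool:
    """Return True if the Ollama server answers at base_url."""
    try:
        with urlopen(base_url, timeout=timeout) as resp:
            # A running Ollama server answers its root URL with HTTP 200.
            return resp.status == 200
    except (URLError, OSError):
        return False
```

If this returns False, start Ollama (or install it) before trying the AI panel.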

3. Choose a provider

Grep Badger supports multiple AI providers.

Common options include:

  • Ollama for local AI
  • OpenAI for OpenAI-compatible APIs
  • Anthropic for Claude models

If you have more than one provider configured, choose the one you want before sending your message.
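Conceptually, a multi-provider setup looks something like the sketch below. The field names, endpoints, and model names are illustrative placeholders, not Grep Badger's actual configuration schema:

```python
# Illustrative provider settings; NOT Grep Badger's real config format.
providers = {
    "ollama": {
        "kind": "local",
        "base_url": "http://localhost:11434",
        "model": "llama3.2",           # must already be pulled locally
    },
    "openai": {
        "kind": "external",
        "base_url": "https://api.openai.com/v1",
        "api_key": "sk-...",           # keep keys out of version control
        "model": "gpt-4o-mini",
    },
    "anthropic": {
        "kind": "external",
        "base_url": "https://api.anthropic.com",
        "api_key": "...",
        "model": "claude-...",
    },
}

def active_provider(name: str) -> dict:
    """Pick the provider to use before sending a message."""
    return providers[name]
```

The useful point is the shape: each provider pairs an endpoint with a model, and exactly one pair is active when you send a message.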

4. Choose a model

After selecting a provider, choose the model you want to use.

Different models may vary in speed, cost, and response quality.

If the selected provider has no usable model, Grep Badger cannot send your request.
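With Ollama, you can verify which models are actually available by querying its `/api/tags` endpoint. This is a small sketch against Ollama's public HTTP API, assuming the default local endpoint; it returns an empty list if the server is unreachable:

```python
# Sketch: list locally pulled Ollama models via the /api/tags endpoint.
import json
from urllib.request import urlopen
from urllib.error import URLError

def list_ollama_models(base_url: str = "http://localhost:11434",
                       timeout: float = 2.0) -> list[str]:
    """Return names of pulled Ollama models, or [] if the server is unreachable."""
    try:
        with urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError, ValueError):
        return []
```

An empty result usually means either Ollama is not running or no model has been pulled yet.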

5. Send log context to AI

You can select lines in the log viewer and include them with your prompt.

This helps the AI answer questions such as:

  • What error is happening here?
  • What might have caused this failure?
  • What patterns appear in these lines?
  • What should I investigate next?

The more relevant your selected context is, the better the response is likely to be.
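Combining your question with the selected lines can be pictured as a simple prompt-assembly step. The sketch below is illustrative only; the exact formatting Grep Badger uses may differ. Capping the number of lines keeps the context small and relevant:

```python
def build_prompt(question: str, selected_lines: list[str],
                 max_lines: int = 50) -> str:
    """Combine a question with selected log lines into one prompt.

    Trimming to a small, relevant window keeps the context focused,
    which tends to produce better answers.
    """
    context = "\n".join(selected_lines[:max_lines])
    return (
        "Here are selected lines from a log file:\n\n"
        f"{context}\n\n"
        f"Question: {question}"
    )
```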

6. Prefer local AI for privacy

A local provider such as Ollama keeps your requests on your own machine.

External AI providers will see any data you send to them.

Never send secrets, private data, or sensitive logs to an external AI provider unless you fully understand and accept that provider's data handling practices.
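If you must send log data to an external provider, consider masking obvious secrets first. The patterns below are illustrative examples of this idea, not an exhaustive or guaranteed filter; extend them for the secret formats that appear in your own logs:

```python
# Sketch: mask likely secrets before log lines leave your machine.
# These patterns are examples only, not a complete safeguard.
import re

SECRET_PATTERNS = [
    # key=value style credentials, e.g. "api_key=abc123" or "password: hunter2"
    re.compile(r"(?i)(api[_-]?key|token|password)\s*[=:]\s*\S+"),
    # OpenAI-style key prefixes
    re.compile(r"\bsk-[A-Za-z0-9]{10,}\b"),
]

def redact(line: str) -> str:
    """Replace likely secrets in a log line with a placeholder."""
    for pattern in SECRET_PATTERNS:
        line = pattern.sub("[REDACTED]", line)
    return line
```

Redaction reduces risk but does not eliminate it; when in doubt, keep the request local.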

7. Common setup tips

  • Make sure your provider is enabled and selected.
  • Make sure a valid model is selected before sending a message.
  • If you use Ollama, make sure Ollama is running and the model has already been pulled.
  • If you use OpenAI or Anthropic, make sure your API key and endpoint settings are correct.
  • Start with a small, relevant set of selected log lines for better results.

Important notes

  • Ollama is the default AI provider created by the app.
  • There are many models capable of running on your CPU.
  • External AI providers will see any data you send to them.
  • AI responses may be helpful, but they are not guaranteed to be correct. Always verify important conclusions against the log data.
  • Some OpenAI-compatible providers implement the standard differently. We cannot guarantee that every OpenAI-compatible endpoint will work correctly.