AI CI Healer
Release notes
🚀 v1.0.0: The Launch of AI CI Healer
We are thrilled to announce the first official release of AI CI Healer! 🚑
This action turns your CI/CD pipeline into a self-healing system. When a build fails, it doesn’t just stop—it analyzes the logs, finds the root cause, and writes a fix directly to your PR or commit.
“Stop Googling build errors. Let AI fix them for you.”
✨ Key Features
- 🧠 Multi-Brain Support: Use the speed of Groq, the intelligence of Gemini, or the privacy of Ollama (Self-Hosted).
- 🛡️ Smart Fallback: If one AI provider is down, it automatically retries with others to ensure you get a fix.
- 👁️ Chain of Thought: Expand the “View AI Reasoning” dropdown to see exactly why the AI chose a specific fix.
- ⚡ Blazing Fast: Analyzing logs and posting a fix takes seconds.
- 🔒 Enterprise Ready: Support for private repos and self-hosted runners using Ollama.
📦 Quick Start
Add this to your workflow file immediately after your build steps:
```yaml
- name: AI CI Healer
  if: failure()
  uses: mariorazo97/ai-ci-healer@v1.0.0
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    llm-provider: "groq"
    groq-api-key: ${{ secrets.GROQ_API_KEY }}
```
🛠️ Configuration
| Input | Description | Default |
|---|---|---|
| llm-provider | Choose groq, gemini, or ollama. | groq |
| model | Specific model ID (e.g., llama-3.3-70b-versatile). | Auto-selected |
| enable-comments | Toggle PR commenting on/off. | true |
| custom-context | Inject team-specific coding rules. | "" |
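Putting the inputs together, a fuller step might look like the sketch below. It extends the Quick Start snippet with the optional inputs from the table above; the model ID comes from the table's example, and the custom-context text is an illustrative placeholder, not a recommended value.

```yaml
- name: AI CI Healer
  if: failure()
  uses: mariorazo97/ai-ci-healer@v1.0.0
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    llm-provider: "groq"
    groq-api-key: ${{ secrets.GROQ_API_KEY }}
    # Pin a specific model instead of the auto-selected default
    model: "llama-3.3-70b-versatile"
    # Set to "false" to skip PR comments and keep the fix only
    enable-comments: "true"
    # Team-specific rules the AI should respect when proposing fixes (example text)
    custom-context: "We use pnpm, not npm. Never edit lockfiles by hand."
```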
☕ Support the Project
If this tool saved your build (and your sanity), consider buying me a coffee to keep the updates coming!