
Guides

These guides walk you through common workflows and advanced use cases in ProxifAI.

Setting Up a Development Workflow

A complete setup for a team using ProxifAI for the full development lifecycle:

Create an Organization

Set up your organization, invite team members, and configure roles. See Creating an Organization.

Create Projects and Repositories

Create projects for planning and repositories for code. Link them together so issues connect to PRs automatically.

Set Up CI/CD Pipelines

Define YAML-based pipelines in your repositories. Configure secrets, approval gates, and deploy environments.
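As an illustrative sketch only (the stage and field names below are assumptions, not ProxifAI's actual pipeline schema), a pipeline with a test stage and an approval-gated deploy might look like:

```yaml
# Hypothetical pipeline sketch — field names are illustrative.
name: build-and-test
on:
  push:
    branches: [main]
stages:
  - name: test
    image: node:20
    script:
      - npm ci
      - npm test
  - name: deploy
    environment: production
    requires_approval: true   # approval gate before deploying
    secrets: [DEPLOY_TOKEN]   # injected from repository secrets
    script:
      - ./deploy.sh
```

The key ideas, regardless of exact syntax, are the ones listed above: secrets are referenced rather than inlined, and deploy environments can require a manual approval.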

Configure AI Agents

Set up AI models under Settings → AI Configuration. Configure agent images and compute resources.

Build Automation Workflows

Create workflows in the Flows section to automate repetitive tasks — issue triage, code review, testing, and more.
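To make the shape of a workflow concrete, here is a purely hypothetical sketch of an issue-triage flow (the trigger, step, and agent names are assumptions for illustration, not the real Flows schema):

```yaml
# Hypothetical Flows sketch — schema shown here is an assumption.
name: issue-triage
trigger:
  event: issue.created
steps:
  - name: analyze
    agent: triage-agent        # an AI agent configured under AI Configuration
  - name: apply
    actions:
      - set_labels
      - set_priority
```

A flow like this pairs an event trigger with one or more agent or action steps; the Automated Issue Triage workflow below follows exactly this pattern.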

Common Workflows

Issue-to-PR Flow

  1. Create an issue describing the work
  2. Create a branch referencing the issue
  3. Push code and open a pull request
  4. CI pipeline runs automatically
  5. Team reviews and approves the PR
  6. Merge closes the linked issue

AI-Powered Code Review

  1. Open a pull request
  2. The AI agent is triggered via a workflow
  3. Agent clones the repo, reviews the diff, and leaves comments
  4. Developer addresses feedback
  5. Agent re-reviews and approves

Automated Issue Triage

  1. New issues arrive via the API, CLI, or web
  2. A workflow trigger fires on issue creation
  3. AI agent analyzes the issue content
  4. Agent assigns priority, labels, and a suggested assignee
  5. Team reviews and adjusts as needed

Sprint Planning

  1. Create a new sprint with start and end dates
  2. Review the backlog and drag issues into the sprint
  3. Assign issues to team members
  4. Track progress throughout the sprint
  5. Close the sprint and review completion stats

Integrations

Slack Integration

Connect Slack to receive notifications and interact with ProxifAI:

  • Receive alerts for PR reviews, pipeline failures, and workflow approvals
  • Use slash commands to create issues and check status
  • Interactive buttons for quick actions

Webhook Integration

Set up outbound webhooks for custom integrations:

  • Subscribe to specific event types
  • View delivery logs and retry failed deliveries
  • Test webhooks before going live
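The exact delivery payload depends on the event type you subscribe to; as a hedged illustration (these field names are assumptions, not a documented schema), a pull-request event might deliver something like:

```json
{
  "event": "pull_request.opened",
  "timestamp": "2025-01-15T10:30:00Z",
  "data": {
    "repository": "my-org/my-repo",
    "number": 42,
    "title": "Add login form"
  }
}
```

Use the delivery logs to inspect the real payload shape for each event type before wiring up your consumer.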

AI Gateway as API

Use the AI Gateway as a drop-in replacement for OpenAI-compatible APIs. The gateway is mounted at /api/v1/llm/* on your deployment host:

curl -X POST https://your-host/api/v1/llm/v1/chat/completions \
  -H "Authorization: Bearer pfai_your_token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4.6",
    "messages": [{"role": "user", "content": "Hello"}]
  }'

The AI Gateway ships with three provider types out of the box — openai, anthropic, gemini — plus a generic openai-compatible driver for any OpenAI-shaped endpoint (Ollama, vLLM, OpenRouter, your own proxy). See AI Gateway for the full default model catalog.
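Because the gateway speaks the OpenAI wire format, any OpenAI-compatible client can target it by overriding the base URL. As a minimal, dependency-free sketch of the same call as the curl example above (the host and token are placeholders):

```python
import json
import urllib.request

def chat_request(host: str, token: str, model: str, prompt: str) -> urllib.request.Request:
    """Build the same chat-completions call as the curl example above."""
    url = f"https://{host}/api/v1/llm/v1/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Send with urllib.request.urlopen(chat_request(...)) once host and token are real.
```

In practice you would more likely point an existing OpenAI SDK at `https://your-host/api/v1/llm/v1` as its base URL and keep your application code unchanged; that is what "drop-in replacement" means here.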