
# Bug Reports & Feature Requests

We value your feedback and contributions to make ProxAI better. This guide will help you effectively report bugs and suggest new features.

## Reporting Bugs

### Before Reporting

1. **Check existing issues** on GitHub to see if your bug has already been reported
2. **Update to the latest version** — your issue may already be fixed
3. **Isolate the problem** so you can provide a minimal, reliable reproduction case

### How to Submit a Bug Report

1. Go to our GitHub Issues page
2. Select the “Bug Report” template
3. Fill out all the required information:
   - Clear, descriptive title
   - Steps to reproduce
   - Expected behavior
   - Actual behavior
   - Screenshots (if applicable)
   - Environment details (OS, Python version, etc.)
   - Any additional context

### What Makes a Good Bug Report

A helpful bug report includes:

**Bug**: Authentication fails when using special characters in password
 
**Steps to Reproduce**:
1. Install proxai v1.2.3
2. Create client with `px.Client(api_key="my-key")`
3. Attempt to authenticate with password containing `!@#$`
 
**Expected**: Authentication succeeds
**Actual**: Authentication fails with error `Invalid character in password`
 
**Environment**:
- ProxAI version: 1.2.3
- Python version: 3.10.4
- OS: macOS 13.1
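A quick way to collect the environment details above is a short, stdlib-only snippet like the following. It assumes the installed distribution is named `proxai`; adjust the name if yours differs.

```python
import platform
from importlib import metadata


def environment_report() -> str:
    """Return the environment block requested by the bug-report template."""
    try:
        # "proxai" is the assumed distribution name on PyPI.
        proxai_version = metadata.version("proxai")
    except metadata.PackageNotFoundError:
        proxai_version = "not installed"
    return (
        f"- ProxAI version: {proxai_version}\n"
        f"- Python version: {platform.python_version()}\n"
        f"- OS: {platform.system()} {platform.release()}"
    )


print(environment_report())
```

Paste the printed block directly into the **Environment** section of your report.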

## Requesting Features

### Before Requesting

1. **Search existing issues** to see if someone has already suggested a similar feature
2. **Check the roadmap** to see if it’s already planned
3. **Consider the scope** of your request and how it fits with ProxAI’s goals

### How to Submit a Feature Request

1. Go to our GitHub Issues page
2. Select the “Feature Request” template
3. Fill out the template with:
   - Clear description of the feature
   - Problem it solves
   - Proposed implementation (if you have ideas)
   - Alternatives you’ve considered
   - Additional context or examples

### What Makes a Good Feature Request

A helpful feature request includes:

**Feature**: Add support for streaming responses from Anthropic Claude
 
**Problem**: Currently, users cannot stream responses when using Claude models, leading to long wait times for complete responses.
 
**Proposed Solution**: Implement streaming support for Claude models similar to the existing streaming for OpenAI.
 
**Example Usage**:
```python
response = px.generate_text(
    "Write a long story about a journey",
    provider="anthropic",
    model="claude-3-opus",
    stream=True
)
for chunk in response:
    print(chunk, end="", flush=True)

```

**Alternatives Considered**: Using a separate method specifically for streaming.


## Issue Lifecycle

1. **Triage**: New issues are reviewed by maintainers
2. **Labeling**: Issues are labeled by type, priority, and status
3. **Discussion**: Community and maintainers discuss the issue
4. **Implementation**: Issue is assigned and worked on
5. **Review & Merge**: Changes are reviewed and merged
6. **Release**: Fixed/implemented in a release

## Community Voting

We use GitHub reactions to gauge community interest:
- 👍 Support for a feature or confirming a bug
- 👎 Disagreement with a feature direction

The number of reactions helps us prioritize what to work on next.

## Stay Updated

To follow progress on your reported issues:
- Enable notifications on the GitHub issue
- Join our [Discord](https://discord.gg/QhrDkzMHrP) for real-time updates
- Check our release notes for fixed issues

Thank you for helping improve ProxAI!