
Intermittent httpx.ReadError with high concurrency #3348

Open

mmeendez8 opened this issue Oct 15, 2024 · 1 comment

mmeendez8 commented Oct 15, 2024

Issue Description

I've encountered intermittent httpx.ReadError exceptions when using high concurrency values with httpx.AsyncClient. This issue does not occur when using aiohttp with similar concurrency levels.

Reproduction Steps

  1. Set up a simple FastAPI server with a 3-second delay (code provided below).
  2. Run the httpx client script (provided below) with high concurrency (CONCURRENT_REQUESTS = 300, TOTAL_REQUESTS = 1000).
  3. Observe intermittent httpx.ReadError exceptions.
  4. Replace the httpx client with the aiohttp client (code provided) and run with the same concurrency settings.
  5. Observe that the aiohttp client completes without errors.

Environment

aiohttp                           3.9.5
httpcore                          1.0.5
httpx                             0.27.0

Server code

from fastapi import FastAPI
from pydantic import BaseModel, Field
import asyncio
import uvicorn

app = FastAPI()

class ImagePayload(BaseModel):
    image: str = Field(..., description="Base64 encoded image")

@app.post("/process_image")
async def process_image(payload: ImagePayload):
    await asyncio.sleep(3)

    return {
        "message": "Image processed successfully after a delay"
    }

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)

httpx client code

import asyncio
import httpx
import base64
import numpy as np

SERVER_URL = "http://localhost:8000/process_image"
CONCURRENT_REQUESTS = 300
TOTAL_REQUESTS = 1000
TIMEOUT = 30

async def send_request(client, image_data):
    response = await client.post(SERVER_URL, json={"image": image_data})
    return response.json()

async def main():
    image = np.random.randint(0, 255, (100, 100, 3), dtype=np.uint8)
    image_data = base64.b64encode(image.tobytes()).decode()
    
    async with httpx.AsyncClient(limits=httpx.Limits(max_connections=CONCURRENT_REQUESTS), verify=False, timeout=TIMEOUT) as client:
        tasks = [send_request(client, image_data) for _ in range(TOTAL_REQUESTS)]
        results = await asyncio.gather(*tasks)

    for i, result in enumerate(results, 1):
        print(f"Response {i}:", result)

if __name__ == "__main__":
    asyncio.run(main())

aiohttp client code

import asyncio
import aiohttp
import base64
import numpy as np

SERVER_URL = "http://localhost:8000/process_image"
CONCURRENT_REQUESTS = 300
TOTAL_REQUESTS = 1000
TIMEOUT = 30

async def send_request(session, image_data):
    async with session.post(SERVER_URL, json={"image": image_data}) as response:
        return await response.json()

async def main():
    image = np.random.randint(0, 255, (100, 100, 3), dtype=np.uint8)
    image_data = base64.b64encode(image.tobytes()).decode()
    
    timeout = aiohttp.ClientTimeout(total=TIMEOUT)
    connector = aiohttp.TCPConnector(limit=CONCURRENT_REQUESTS)
    
    async with aiohttp.ClientSession(timeout=timeout, connector=connector) as session:
        tasks = [send_request(session, image_data) for _ in range(TOTAL_REQUESTS)]
        
        results = await asyncio.gather(*tasks)

    for i, result in enumerate(results, 1):
        print(f"Response {i}:", result)

if __name__ == "__main__":
    asyncio.run(main())
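
In case it is useful to anyone hitting this before the root cause is found, a minimal retry sketch that can be dropped in for send_request in the httpx script above is shown below. MAX_ATTEMPTS and the backoff are arbitrary values, and this only papers over the error rather than explaining it.

import asyncio

import httpx

SERVER_URL = "http://localhost:8000/process_image"
MAX_ATTEMPTS = 3  # arbitrary retry budget, not a recommended value

async def send_request(client, image_data):
    # Retry only on httpx.ReadError; let any other exception propagate.
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            response = await client.post(SERVER_URL, json={"image": image_data})
            return response.json()
        except httpx.ReadError:
            if attempt == MAX_ATTEMPTS:
                raise
            await asyncio.sleep(0.5 * attempt)  # crude linear backoff before retrying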

mhdzumair commented Oct 27, 2024

+1, I observed similar random httpx.ReadError exceptions.
