[Discussion] Integrating with GitHub Copilot via copilot-api results in an empty {} response #35

@kehao-chen

Description

Hi there,

First off, I want to say thank you for creating claude-code-router. It's a fantastic project that provides great flexibility in using different AI providers within the Claude Code environment.

I currently have an active GitHub Copilot subscription and was inspired to see if I could route its API through claude-code-router. To achieve this, I'm using another helpful project, ericc-ch/copilot-api, which exposes the GitHub Copilot API as a local, OpenAI-compatible endpoint.

I noticed a similar integration discussion in the copilot-api repository (issue #43), which made me optimistic that this setup could work.

The Problem

However, when I connect claude-code-router to the local endpoint provided by copilot-api, I'm encountering a peculiar issue.

  1. copilot-api shows a valid response: The verbose logs from the copilot-api server clearly show it successfully receiving a streaming response from GitHub. The chunks below carry the generated text in delta.content.
ℹ Current token count: { input: 3887, output: 27 }                                          11:35:09 AM
⚙ Set max_tokens to: 16384                                                                  11:35:09 AM
⚙ Streaming response                                                                        11:35:10 AM
[11:35:10 AM] ⚙ Streaming chunk: {"data":"{\"choices\":[],\"created\":0,\"id\":\"\",\"prompt_filter_results\":[{\"content_filter_results\":{\"hate\":{\"filtered\":false,\"severity\":\"safe\"},\"self_harm\":{\"filtered\":false,\"severity\":\"safe\"},\"sexual\":{\"filtered\":false,\"severity\":\"safe\"},\"violence\":{\"filtered\":false,\"severity\":\"safe\"}},\"prompt_index\":0}]}"}
--> POST /chat/completions 200 2s
[11:35:10 AM] ⚙ Streaming chunk: {"data":"{\"choices\":[{\"index\":0,\"content_filter_offsets\":{\"check_offset\":18806,\"start_offset\":18806,\"end_offset\":18808},\"content_filter_results\":{\"hate\":{\"filtered\":false,\"severity\":\"safe\"},\"self_harm\":{\"filtered\":false,\"severity\":\"safe\"},\"sexual\":{\"filtered\":false,\"severity\":\"safe\"},\"violence\":{\"filtered\":false,\"severity\":\"safe\"}},\"delta\":{\"content\":\"\",\"role\":\"assistant\"}}],\"created\":1750390510,\"id\":\"chatcmpl-BkMowfAsar7wRdm2kvUMYhXX5WBGi\",\"model\":\"gpt-4.1-2025-04-14\",\"system_fingerprint\":\"fp_07e970ab25\"}"}
[11:35:10 AM] ⚙ Streaming chunk: {"data":"{\"choices\":[{\"index\":0,\"content_filter_offsets\":{\"check_offset\":18806,\"start_offset\":18806,\"end_offset\":18808},\"content_filter_results\":{\"hate\":{\"filtered\":false,\"severity\":\"safe\"},\"self_harm\":{\"filtered\":false,\"severity\":\"safe\"},\"sexual\":{\"filtered\":false,\"severity\":\"safe\"},\"violence\":{\"filtered\":false,\"severity\":\"safe\"}},\"delta\":{\"content\":\"Hi\"}}],\"created\":1750390510,\"id\":\"chatcmpl-BkMowfAsar7wRdm2kvUMYhXX5WBGi\",\"model\":\"gpt-4.1-2025-04-14\",\"system_fingerprint\":\"fp_07e970ab25\"}"}
[11:35:10 AM] ⚙ Streaming chunk: {"data":"{\"choices\":[{\"finish_reason\":\"stop\",\"index\":0,\"content_filter_offsets\":{\"check_offset\":18806,\"start_offset\":18806,\"end_offset\":18808},\"content_filter_results\":{\"hate\":{\"filtered\":false,\"severity\":\"safe\"},\"self_harm\":{\"filtered\":false,\"severity\":\"safe\"},\"sexual\":{\"filtered\":false,\"severity\":\"safe\"},\"violence\":{\"filtered\":false,\"severity\":\"safe\"}},\"delta\":{\"content\":null}}],\"created\":1750390510,\"id\":\"chatcmpl-BkMowfAsar7wRdm2kvUMYhXX5WBGi\",\"usage\":{\"completion_tokens\":3,\"completion_tokens_details\":{\"accepted_prediction_tokens\":0,\"rejected_prediction_tokens\":0},\"prompt_tokens\":11403,\"prompt_tokens_details\":{\"cached_tokens\":9984},\"total_tokens\":11406},\"model\":\"gpt-4.1-2025-04-14\",\"system_fingerprint\":\"fp_07e970ab25\"}"}
⚙ Streaming chunk: {"data":"[DONE]"}                                                        11:35:10 AM
<-- POST /chat/completion
  2. claude-code-router shows an empty response: Despite the backend receiving data, the response displayed in the Claude Code UI is consistently an empty object: {}. This happens with every model I've tried to proxy (gpt-4o, gpt-4o-mini, etc.).

[Screenshot: the empty {} response in the Claude Code UI]

This leads me to believe that the data stream sent by copilot-api, while nominally OpenAI-compatible, has a subtle difference that claude-code-router isn't parsing correctly. One candidate is the very first chunk in the log above, which carries prompt_filter_results with an empty choices array; a transformer that unconditionally reads choices[0] could trip on it. Either way, the data is clearly flowing but gets lost or misinterpreted on the claude-code-router side.

I'm opening this issue as a discussion. Has anyone else attempted a similar setup, or does anyone have insight into what might be causing this discrepancy? Could it be how the stream terminator ([DONE]) is handled, or a slight variation in the JSON structure of the stream events?
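To make the suspicion concrete, here is a minimal sketch (not the actual claude-code-router transformer; all names are illustrative) of collecting text deltas from the SSE payloads in the log above. It shows the two hazards mentioned: the first Copilot chunk has choices: [] (only prompt_filter_results), so an unguarded choices[0].delta access fails, and [DONE] is a plain string, not JSON.

```typescript
// Illustrative sketch only -- mirrors the copilot-api log chunks shown above.
interface ChatChunk {
  choices: Array<{
    delta?: { content?: string | null; role?: string };
    finish_reason?: string | null;
  }>;
}

// Accumulate the assistant text from a sequence of SSE "data:" payloads.
function collectContent(events: string[]): string {
  let out = "";
  for (const data of events) {
    if (data === "[DONE]") break; // stream terminator: a bare string, not JSON
    const chunk: ChatChunk = JSON.parse(data);
    // Copilot's first chunk carries prompt_filter_results with choices: [],
    // so choices[0] must be guarded or the whole stream aborts here.
    const delta = chunk.choices[0]?.delta;
    if (delta?.content) out += delta.content;
  }
  return out;
}

// Payloads trimmed from the log in this report:
const events = [
  '{"choices":[],"created":0,"id":"","prompt_filter_results":[{"prompt_index":0}]}',
  '{"choices":[{"index":0,"delta":{"content":"","role":"assistant"}}]}',
  '{"choices":[{"index":0,"delta":{"content":"Hi"}}]}',
  '{"choices":[{"finish_reason":"stop","index":0,"delta":{"content":null}}]}',
  "[DONE]",
];

console.log(collectContent(events)); // prints "Hi"
```

If claude-code-router's OpenAI-to-Anthropic translation makes stricter assumptions than this (e.g. expecting every chunk to contain a non-empty choices array), the empty {} result would be consistent with what I'm seeing.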
