4 changes: 4 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -11,3 +11,7 @@ yarn-error.log
/sentry-cli.exe

.vscode/


HackerNews.xcarchive.zip
HackerNews_arm64
2 changes: 1 addition & 1 deletion Cargo.lock

Some generated files are not rendered by default.

12 changes: 12 additions & 0 deletions Cargo.toml
@@ -6,6 +6,18 @@ version = "2.46.0"
edition = "2021"
rust-version = "1.86"

[lib]
name = "sentry_cli"
path = "src/lib.rs"

[[bin]]
name = "test_full_upload"
path = "test_full_upload.rs"

[[bin]]
name = "test_mobile_app_upload"
path = "test_mobile_app_upload.rs"

[dependencies]
anylog = "0.6.3"
anyhow = { version = "1.0.69", features = ["backtrace"] }
81 changes: 81 additions & 0 deletions README_TEST.md
@@ -0,0 +1,81 @@
# Sentry CLI Upload Tests

This directory contains test programs that reuse the existing sentry-cli codebase to test different upload flows against a local Sentry service.

## Test Programs

### 1. Mobile App Upload Test (`test_mobile_app_upload`)

Tests uploading mobile app archives (like `.xcarchive.zip` files) to the preprod artifacts endpoint.

**Build and run:**
```bash
cargo run --bin test_mobile_app_upload
```

**What it tests:**
- Authentication with local Sentry service
- Chunk upload capabilities check (specifically `PreprodArtifacts` capability)
- Full chunk upload flow for large mobile app archives
- Assembly via `/projects/{org}/{project}/files/preprodartifacts/assemble/` endpoint

**Configuration:**
- Target file: `/Users/nicolashinderling/TestUploads/HackerNews.xcarchive.zip`
- Endpoint: `/projects/sentry/internal/files/preprodartifacts/assemble/`

### 2. Debug Files Upload Test (`test_full_upload`)

Tests uploading debug files (like dSYM files) to the debug info files endpoint.

**Build and run:**
```bash
cargo run --bin test_full_upload
```

**What it tests:**
- Authentication with local Sentry service
- Chunk upload capabilities check
- Full chunk upload + missing chunk detection flow
- Assembly via `/projects/{org}/{project}/files/difs/assemble/` endpoint

**Configuration:**
- Target file: `/Users/nicolashinderling/TestUploads/HackerNews_arm64`
- Endpoint: `/projects/sentry/internal/files/difs/assemble/`

## Common Configuration

Both tests are configured for:
- **Base URL:** `http://localhost:8000`
- **Organization:** `sentry`
- **Project:** `internal`
- **Auth Token:** hardcoded in each test file (replace it with your own token)

You can modify these values in the `main()` function of each test file.

## Common Output

Both tests will show:
- ✅/❌ Status indicators for each step
- Server capabilities (which ChunkUploadCapability features are supported)
- Chunk upload progress and statistics
- Assembly request/response details
- Detailed error messages if anything fails

## API Endpoints Tested

### Chunk Upload (Common)
- **GET** `/api/0/organizations/{org}/chunk-upload/` - Server capabilities
- **POST** `/api/0/organizations/{org}/chunk-upload/` - Upload chunks

### Mobile App Assembly
- **POST** `/api/0/projects/{org}/{project}/files/preprodartifacts/assemble/`

### Debug Files Assembly
- **POST** `/api/0/projects/{org}/{project}/files/difs/assemble/`
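The two assemble endpoints above differ only in the artifact-kind path segment. A minimal sketch of that path construction (the helper name is hypothetical, not part of this PR):

```rust
// Hypothetical helper: both assemble endpoints share the shape
// /api/0/projects/{org}/{project}/files/{kind}/assemble/
fn assemble_path(org: &str, project: &str, kind: &str) -> String {
    format!("/api/0/projects/{org}/{project}/files/{kind}/assemble/")
}

fn main() {
    // The two kinds exercised by these tests.
    println!("{}", assemble_path("sentry", "internal", "preprodartifacts"));
    println!("{}", assemble_path("sentry", "internal", "difs"));
}
```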

## Notes

- Both tests reuse the exact same code paths as the real `sentry-cli` commands
- They exercise the full chunk upload and assembly flow end to end
- The mobile app test is for the new preprod artifacts endpoint
- The debug files test uses the existing DIF upload flow
72 changes: 71 additions & 1 deletion src/api/data_types/chunking/artifact.rs
@@ -1,5 +1,6 @@
use serde::{Deserialize, Serialize};
use sha1_smol::Digest;
use std::collections::HashMap;

use super::ChunkedFileState;

@@ -13,16 +14,85 @@ pub struct ChunkedArtifactRequest<'a> {
pub version: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
pub dist: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
pub filename: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub project_id: Option<String>,
}

#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct AssembleArtifactsResponse {
pub struct ChunkedArtifactResponse {
pub state: ChunkedFileState,
pub missing_chunks: Vec<Digest>,
pub detail: Option<String>,
}

#[derive(Debug, Serialize)]
#[serde(transparent)]
pub struct AssembleArtifactsRequest<'a>(HashMap<Digest, ChunkedArtifactRequest<'a>>);

impl<'a, T> FromIterator<T> for AssembleArtifactsRequest<'a>
where
T: Into<ChunkedArtifactRequest<'a>>,
{
fn from_iter<I>(iter: I) -> Self
where
I: IntoIterator<Item = T>,
{
Self(
iter.into_iter()
.map(|obj| obj.into())
.map(|r| (r.checksum, r))
.collect(),
)
}
}
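The `FromIterator` impl above collects requests into a map keyed by checksum. A self-contained sketch of the same pattern, with `String` checksums standing in for `sha1_smol::Digest` and a trimmed-down request type:

```rust
use std::collections::HashMap;

// Simplified mirror of the AssembleArtifactsRequest pattern: any iterator
// of convertible items collects into a checksum-keyed map.
#[derive(Debug, Clone)]
struct Req {
    checksum: String,
}

struct AssembleRequest(HashMap<String, Req>);

impl<T: Into<Req>> FromIterator<T> for AssembleRequest {
    fn from_iter<I: IntoIterator<Item = T>>(iter: I) -> Self {
        Self(
            iter.into_iter()
                .map(Into::into)
                .map(|r| (r.checksum.clone(), r))
                .collect(),
        )
    }
}

fn main() {
    let reqs = vec![Req { checksum: "abc123".into() }];
    // `collect()` drives the FromIterator impl.
    let assembled: AssembleRequest = reqs.into_iter().collect();
    println!("{}", assembled.0.len());
}
```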

pub type AssembleArtifactsResponse = ChunkedArtifactResponse;

#[derive(Debug, Serialize)]
pub struct ChunkedPreprodArtifactRequest<'a> {
pub checksum: Digest,
pub chunks: &'a [Digest],
// Optional metadata fields that the server supports
#[serde(skip_serializing_if = "Option::is_none")]
pub build_version: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
pub build_number: Option<i32>,
#[serde(skip_serializing_if = "Option::is_none")]
pub build_configuration: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
pub date_built: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
pub extras: Option<serde_json::Value>,
}

impl<'a> ChunkedPreprodArtifactRequest<'a> {
/// Create a new ChunkedPreprodArtifactRequest with the required fields.
pub fn new(checksum: Digest, chunks: &'a [Digest]) -> Self {
Self {
checksum,
chunks,
build_version: None,
build_number: None,
build_configuration: None,
date_built: None,
extras: None,
}
}
}
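The constructor above takes only the required fields and defaults all optional metadata to `None`. A simplified, self-contained mirror (field set trimmed, names hypothetical) shows how callers fill the metadata in afterwards:

```rust
// Hypothetical, trimmed mirror of ChunkedPreprodArtifactRequest: required
// fields in the constructor, optional build metadata settable afterwards.
#[derive(Debug)]
struct PreprodRequest<'a> {
    checksum: &'a str,
    chunks: &'a [&'a str],
    build_version: Option<&'a str>,
    build_number: Option<i32>,
}

impl<'a> PreprodRequest<'a> {
    fn new(checksum: &'a str, chunks: &'a [&'a str]) -> Self {
        Self {
            checksum,
            chunks,
            build_version: None,
            build_number: None,
        }
    }
}

fn main() {
    let chunks = ["chunk1", "chunk2"];
    let mut req = PreprodRequest::new("abc123", &chunks);
    req.build_version = Some("1.0.0");
    println!("{} chunks, version {:?}", req.chunks.len(), req.build_version);
}
```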

#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ChunkedPreprodArtifactResponse {
pub state: ChunkedFileState,
pub missing_chunks: Vec<Digest>,
pub detail: Option<String>,
}

pub type AssemblePreprodArtifactsResponse = HashMap<Digest, ChunkedPreprodArtifactResponse>;

fn version_is_empty(version: &Option<&str>) -> bool {
match version {
Some(v) => v.is_empty(),
5 changes: 4 additions & 1 deletion src/api/data_types/chunking/mod.rs
@@ -8,7 +8,10 @@ mod file_state
mod hash_algorithm;
mod upload;

pub use self::artifact::{AssembleArtifactsResponse, ChunkedArtifactRequest};
pub use self::artifact::{
AssembleArtifactsRequest, AssembleArtifactsResponse, ChunkedArtifactRequest, ChunkedArtifactResponse,
ChunkedPreprodArtifactRequest, ChunkedPreprodArtifactResponse,
};
pub use self::compression::ChunkCompression;
pub use self::dif::{AssembleDifsRequest, AssembleDifsResponse, ChunkedDifRequest};
pub use self::file_state::ChunkedFileState;
4 changes: 4 additions & 0 deletions src/api/data_types/chunking/upload/capability.rs
@@ -30,6 +30,9 @@ pub enum ChunkUploadCapability {
/// Upload of il2cpp line mappings
Il2Cpp,

/// Upload of preprod artifacts (mobile app archives, etc.)
PreprodArtifacts,

/// Any other unsupported capability (ignored)
Unknown,
}
@@ -49,6 +52,7 @@ impl<'de> Deserialize<'de> for ChunkUploadCapability {
"sources" => ChunkUploadCapability::Sources,
"bcsymbolmaps" => ChunkUploadCapability::BcSymbolmap,
"il2cpp" => ChunkUploadCapability::Il2Cpp,
"preprod_artifacts" => ChunkUploadCapability::PreprodArtifacts,
_ => ChunkUploadCapability::Unknown,
})
}
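The `_ => Unknown` arm is what keeps older clients working against newer servers. A self-contained sketch of that forward-compatible mapping (the enum is redeclared locally here, not the crate's type):

```rust
// Sketch of forward-compatible capability parsing: unrecognized server
// strings map to Unknown instead of failing deserialization.
#[derive(Debug, PartialEq)]
enum ChunkUploadCapability {
    PreprodArtifacts,
    Unknown,
}

fn parse_capability(s: &str) -> ChunkUploadCapability {
    match s {
        "preprod_artifacts" => ChunkUploadCapability::PreprodArtifacts,
        // New capabilities the client doesn't know about are tolerated.
        _ => ChunkUploadCapability::Unknown,
    }
}

fn main() {
    println!("{:?}", parse_capability("preprod_artifacts"));
    println!("{:?}", parse_capability("some_future_capability"));
}
```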
26 changes: 23 additions & 3 deletions src/api/mod.rs
@@ -988,6 +988,8 @@ impl<'a> AuthenticatedApi<'a> {
projects: &[],
version: None,
dist: None,
filename: None,
project_id: None,
})?
.send()?
.convert_rnf(ApiErrorKind::ReleaseNotFound)
@@ -1011,11 +1013,32 @@ projects,
projects,
version,
dist,
filename: None,
project_id: None,
})?
.send()?
.convert_rnf(ApiErrorKind::ReleaseNotFound)
}

/// Request preprod artifact assembling and processing from chunks.
pub fn assemble_preprod_artifact(
&self,
org: &str,
project: &str,
request: &ChunkedPreprodArtifactRequest<'_>,
) -> ApiResult<ChunkedPreprodArtifactResponse> {
let url = format!(
"/projects/{}/{}/files/preprodartifacts/assemble/",
PathArg(org),
PathArg(project)
);

self.request(Method::Post, &url)?
.with_json_body(request)?
.send()?
.convert_rnf(ApiErrorKind::ProjectNotFound)
}
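Callers of an assemble endpoint typically re-poll until the reported state is terminal. A hedged, self-contained sketch of that loop's core check (the state names mirror `ChunkedFileState` but are assumptions redeclared locally):

```rust
// Hypothetical poll helper: Created and Assembling mean "ask again later";
// Ok and Error are terminal.
#[derive(Debug, Clone, Copy, PartialEq)]
enum FileState {
    Created,
    Assembling,
    Ok,
    Error,
}

fn is_pending(state: FileState) -> bool {
    matches!(state, FileState::Created | FileState::Assembling)
}

fn main() {
    // Simulated sequence of server responses across successive polls.
    let responses = [FileState::Created, FileState::Assembling, FileState::Ok];
    let terminal = responses.iter().copied().find(|s| !is_pending(*s));
    println!("{:?}", terminal);
}
```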

pub fn associate_proguard_mappings(
&self,
org: &str,
@@ -1929,7 +1952,6 @@ pub struct AuthDetails {
#[derive(Deserialize, Debug)]
pub struct User {
pub email: String,
#[expect(dead_code)]
pub id: String,
}

@@ -2011,7 +2033,6 @@ pub struct UpdatedRelease {
#[derive(Debug, Deserialize)]
pub struct ReleaseInfo {
pub version: String,
#[expect(dead_code)]
pub url: Option<String>,
#[serde(rename = "dateCreated")]
pub date_created: DateTime<Utc>,
@@ -2077,7 +2098,6 @@ pub struct DebugInfoData {
#[serde(default, rename = "type")]
pub kind: Option<ObjectKind>,
#[serde(default)]
#[expect(dead_code)]
pub features: Vec<String>,
}

11 changes: 11 additions & 0 deletions src/lib.rs
@@ -0,0 +1,11 @@
#![warn(clippy::allow_attributes)]
#![warn(clippy::unnecessary_wraps)]

pub mod api;
pub mod config;
pub mod constants;
pub mod utils;

// Re-export commonly used types
pub use api::{Api, ChunkUploadCapability};
pub use config::{Auth, Config};