How to Integrate External USB Camera for Live Streaming with LiveKit SDK (Kiosk Device Use Case) #712

@maksood5010

Description

Hi,

I'm currently working on an Android kiosk application. The device has no built-in camera, so I'm using an external USB camera connected via OTG. To handle the USB camera I'm successfully using this library:

🔗 AndroidUSBCamera by jiangdongguo

The camera feed is rendered correctly onto a TextureView or SurfaceView using this library.

Now I want to live stream the USB camera feed to AWS IVS (an RTMPS endpoint), using the LiveKit SDK or any other recommended approach.

My main questions:
How can I integrate the USB camera feed with the LiveKit SDK?

Is there a way to pass the external camera stream (from AndroidUSBCamera) into LiveKit as a video source?

Can I provide a custom video track from a SurfaceView or TextureView?

Is there any recommended way to capture frames from the USB camera and push them to LiveKit's video track system?

If LiveKit does not support this directly, is there an alternative method using your SDK that allows pushing raw video frames to a broadcast service like AWS IVS via RTMPS?
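To make the question concrete, here is the kind of bridge I have in mind. This is an untested sketch: it assumes the USB camera library can deliver raw NV21 preview bytes (e.g. via AUSBC's `IPreviewDataCallBack`), and that LiveKit's custom-capturer path ultimately hands frames to a libwebrtc `CapturerObserver`. The class name and the callback wiring are my own assumptions, not confirmed API.

```kotlin
import org.webrtc.CapturerObserver
import org.webrtc.NV21Buffer
import org.webrtc.VideoFrame

// Hypothetical bridge: forwards NV21 preview frames from the USB camera
// library into a libwebrtc CapturerObserver (which is what I understand
// a custom LiveKit video capturer feeds). Untested sketch.
class UsbCameraFrameBridge(private val observer: CapturerObserver) {

    // Call this from the USB camera's preview-data callback
    // (e.g. AUSBC's IPreviewDataCallBack.onPreviewData).
    fun onPreviewFrame(nv21: ByteArray, width: Int, height: Int) {
        val timestampNs = System.nanoTime()
        // NV21Buffer wraps the raw bytes without copying; the release
        // callback (null here) would run once webrtc is done with them.
        val buffer = NV21Buffer(nv21, width, height, /* releaseCallback = */ null)
        val frame = VideoFrame(buffer, /* rotation = */ 0, timestampNs)
        observer.onFrameCaptured(frame)
        frame.release()
    }
}
```

Is something along these lines the intended way to plug an external source into LiveKit, or is there a higher-level API for custom video tracks that I should use instead?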

Any guidance, code examples, or best practices would be greatly appreciated.

Thanks in advance!

Labels

Enhancement (New feature or request)
