
Video Processing API - processing inside docker on Jetson is noticeably slower than bare-metal #687

Open
PawelPeczek-Roboflow opened this issue Sep 27, 2024 · 0 comments
Labels
bug Something isn't working Video Management API issues

Comments


Search before asking

  • I have searched the Inference issues and found no similar bug report.

Bug

The same workflow was tested in both setups, measuring only the latency of single-frame processing inside the WorkflowRunner.run_workflow(...) function:

  • Jetson Nano Orin, bare metal, in a script using InferencePipeline directly - not measured precisely this time, but older tests indicated the same performance as on a MacBook when yolov8n-640 was used (the model used in this test case)
  • Jetson Nano Orin, inside a Docker container, behind the API - ~50 ms

There is some Docker overhead; I am not 100% sure how visible it is on Jetson devices, but on a MacBook it causes a drop from 27 FPS to under 10 FPS 😢

I also measured that running 2 streams at the same time gives over 30 FPS in total, so the compute is there - something makes the pipeline slower than it could potentially be inside Docker on Jetson.
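To make the bare-metal vs. in-Docker comparison reproducible, a small harness like the one below can time the per-frame call in both environments. This is a hedged sketch: `process_frame` stands in for whatever wraps `WorkflowRunner.run_workflow(...)`, and the function names and parameters here are illustrative, not part of the inference API.

```python
import statistics
import time


def measure_frame_latency(process_frame, frames, warmup=5):
    """Time per-frame processing over an iterable of frames.

    Skips the first `warmup` frames to exclude start-up cost (model
    load, CUDA context creation), then returns (mean_ms, p95_ms, fps)
    computed from wall-clock latencies of the remaining frames.
    """
    latencies = []
    for i, frame in enumerate(frames):
        start = time.perf_counter()
        process_frame(frame)
        elapsed = time.perf_counter() - start
        if i >= warmup:
            latencies.append(elapsed)
    mean_s = statistics.mean(latencies)
    p95_s = statistics.quantiles(latencies, n=20)[-1]  # 95th percentile
    return mean_s * 1000.0, p95_s * 1000.0, 1.0 / mean_s
```

Running the same harness on identical input bare metal and inside the container would separate steady-state per-frame cost from warm-up effects, which the ~50 ms figure above does not yet distinguish.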

Environment

Jetson Nano Orin, Jetpack 5.1.1

Minimal Reproducible Example

No response

Additional

No response

Are you willing to submit a PR?

  • Yes I'd like to help by submitting a PR!