Labels
P2 (Important issue, but not time-critical), core (Issues that should be addressed in Ray Core), core-scheduler, enhancement (Request for new feature and/or capability), jira-core
Description
When running Ray on machines with different types of GPU accelerators, the fractional-GPU placement strategy is not suitable. Instead, allow resource requests to specify an amount of GPU memory, for example in MB.
Additionally, the current approach is not accelerator-agnostic: it requires boilerplate code to determine the GPU fraction to request, even when all accelerators on a given machine are identical.
@ray.remote(gpu_mem="20MB")  # proposed API: reserve 20 MB of GPU memory
def some_fn():
    return True
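For context, a minimal sketch of the boilerplate the current fractional-GPU API forces on users: the desired memory budget must be converted by hand into a fractional `num_gpus` value for one specific device size. The helper name and the 16 GiB device size are hypothetical, chosen only to illustrate the conversion.

```python
def gpu_fraction(mem_bytes: int, device_total_bytes: int) -> float:
    """Convert a GPU memory budget into the fractional value Ray's
    num_gpus option expects, capped at one whole device.

    This is the per-device-size arithmetic the proposal would make
    unnecessary: the result is only correct for one GPU model, so
    heterogeneous clusters need this recomputed per machine.
    """
    if mem_bytes <= 0 or device_total_bytes <= 0:
        raise ValueError("sizes must be positive")
    return min(1.0, mem_bytes / device_total_bytes)


# 20 MB on a hypothetical 16 GiB card:
frac = gpu_fraction(20 * 1024**2, 16 * 1024**3)  # → 0.001220703125
```

The same 20 MB budget yields a different fraction on a 24 GiB or 80 GiB card, which is exactly the portability problem a `gpu_mem` option would remove.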
Use case
This would both make Ray remote code more portable and improve GPU utilization for various applications when GPU types are inconsistent across a cluster.