Arnold GPU FAQ

Why aren't my GPUs listed?

If you do not see your GPUs listed as available devices, then either you don't have a supported card or you don't have the required drivers installed.
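
If you want to check programmatically why GPU rendering is unavailable on a machine, the Arnold SDK exposes a device-support query. The following is a minimal C++ sketch, assuming the AiDeviceTypeIsSupported() call from ai_device.h (Arnold 6 and later); the reported reason string typically points at an unsupported card or missing driver.

    // Minimal sketch (assumes Arnold SDK 6+ and the ai_device.h API).
    #include <ai.h>
    #include <cstdio>

    int main()
    {
        AiBegin();

        AtString reason;
        if (AiDeviceTypeIsSupported(AI_DEVICE_TYPE_GPU, reason))
            printf("GPU rendering is available on this machine\n");
        else
            printf("GPU rendering is unavailable: %s\n", reason.c_str());

        AiEnd();
        return 0;
    }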

Can I mix different GPUs?

You can have a mix of different RTX cards and use them all. What you can't do is mix RTX and non-RTX cards: in that case, you won't be able to render on all of them at the same time (only on all of the RTX cards at the same time).

How many GPUs can I have?

Arnold GPU supports up to eight GPUs.

Does the entire scene have to fit into GPU memory?

Yes, the scene and the textures have to fit into GPU memory. Note that textures are loaded on demand.

If I have multiple GPUs, will Arnold use them all?

Yes, Arnold will use all the GPUs at full capacity. It will also pool their memory if you have NVLink. However, you can only NVLink two cards together at a time, so your memory is bound by these "islands". One possible limitation of mixing different RTX GPUs at render time is that the less powerful GPU may end up throttling the more powerful one, because the work is divided evenly between the GPUs.
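
If a slower card is holding back faster ones, you can limit which GPUs Arnold uses. Below is a hypothetical C++ sketch, assuming the AiDeviceSelect() call from ai_device.h with the Arnold 6 signature (later versions also take an AtRenderSession* parameter); the device IDs used here (0 and 1) are placeholders for whichever cards you want to keep rendering on.

    // Hypothetical sketch: restrict GPU rendering to a subset of devices.
    // Assumes AiDeviceSelect() from ai_device.h (Arnold 6 signature).
    #include <ai.h>

    void select_fast_gpus()
    {
        // Device IDs 0 and 1 are assumed here; match them to the devices
        // reported in the Arnold render log on your machine.
        AtArray* ids = AiArray(2, 1, AI_TYPE_UINT, 0u, 1u);
        AiDeviceSelect(AI_DEVICE_TYPE_GPU, ids);
    }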

If I have three cards, how much memory will I have?

You can NVLink two cards, so your memory limit is determined by the third card. For example, if you have three cards with 16GB each, the scene and textures have to fit in 16GB: the two NVLinked cards pool to 32GB, but the third card still has only 16GB, so the limit is still 16GB. Three cards are still faster than two, so it's a tradeoff.

If I have two cards that are not NVLinked, is a scene split across both cards?

No, the scene is not split: it must fit on each card. For example, if you have two cards, each with 16GB, then the scene must fit in 16GB. Each card loads its own copy of the scene.

Can I use Arnold GPU on macOS?

No, unfortunately, Arnold GPU is not available for macOS. Arnold GPU uses OptiX 6.0 from NVIDIA, and there is no macOS version of OptiX 6.0; the driver versions required by Arnold GPU are also not available for macOS.

Why does Arnold GPU take so long to start rendering?

The very first time you render with the GPU, the GPU renderer has to create a cache of compiled shaders. This can delay the time to first pixel for your first render. To avoid this one-time delay, we recommend that you pre-populate the cache before you do any renders.
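
One way to pre-populate the cache is from a small program run once per machine, for example after installing a new driver or a new Arnold version. This is a minimal C++ sketch, assuming the AiGPUCachePopulate() call from ai_device.h; the blocking mode used here simply waits until the cache is fully built before returning.

    // Minimal sketch: build the GPU shader/program cache ahead of time
    // (assumes AiGPUCachePopulate() from ai_device.h, Arnold 6+).
    #include <ai.h>

    int main()
    {
        AiBegin();

        // AI_GPU_CACHE_POPULATE_BLOCKING waits until pre-population finishes;
        // the second argument is the number of parallel processes to use.
        AiGPUCachePopulate(AI_GPU_CACHE_POPULATE_BLOCKING, 4);

        AiEnd();
        return 0;
    }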

Why don't I see Arnold using my GPU in the Task Manager?

The Windows Task Manager is not reliable for monitoring Arnold's GPU usage. Try a tool like GPU-Z instead.
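
If you prefer to monitor GPU usage programmatically rather than with a GUI tool, NVIDIA's NVML library (the same library nvidia-smi is built on) reports per-device utilization. The sketch below is not Arnold-specific; it just polls the overall utilization of the first GPU, which you could run while a render is in progress.

    // Sketch: query GPU utilization through NVML (link against the NVML library).
    // Error checking is omitted for brevity.
    #include <nvml.h>
    #include <cstdio>

    int main()
    {
        nvmlInit();

        nvmlDevice_t device;
        nvmlDeviceGetHandleByIndex(0, &device);  // first GPU

        nvmlUtilization_t util;
        nvmlDeviceGetUtilizationRates(device, &util);
        printf("GPU core: %u%%, memory controller: %u%%\n", util.gpu, util.memory);

        nvmlShutdown();
        return 0;
    }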

 



