UE4 performance - viewport drain

#1

We're attempting to output a double-wide 1920x1080 (so 3840x1080) render target from UE4 and split it between two projectors in LA – we got it working, but it dropped our FPS from ~90+ to ~10.

One thing I noticed is that because our player controller is still outputting to the UE4 viewport, we're effectively rendering the 1920x1080 and the 3840x1080 views at the same time… is there any way to turn off the main UE4 viewport render for the player?? With the SceneCapture2D output going to LA, we don't need UE4 outputting a separate view of the player camera to the viewport…

I assume there’s no way to send the player camera to LA from UE, since the tutorials only mention render targets? I was also beginning to try two 1920x1080 SceneCapture2Ds, one per projector, but figured that would likely be even less performant…

help!

#2

Hello,

Hmm, that sounds like a very big FPS drop indeed. It makes me think the cause might be somewhere else, but obviously I can't be sure. Does the performance drop also happen if you are not actually sending Spout out and are just using the Scene Capture with a render target? Also, what's your VRAM usage?

Just as a comparison, we ran this scene on an NVIDIA P6000, 32 GB of RAM and an Intel Xeon Gold 6126. I don’t remember the exact framerate, but it definitely didn’t drop that much. We used a Scene Capture component with a render target at 4K resolution (3840x2160).

To your questions: I wouldn’t consider myself a UE4 expert and you might be better off asking this in the UE4 forums, but AFAIK there is no way to ‘disable rendering’ as such. You might be able to achieve something similar by playing with the settings of the active camera. Alternatively, if you are using a packaged or standalone game, you can decrease the performance cost of viewport rendering by decreasing its resolution. There is a console command for that (I think it’s something like r.SetRes 640x480, but you should check that).
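As a minimal sketch of how that command could be issued automatically, assuming a custom player controller class (the class name and header below are just illustrative, and you should verify the exact command on your engine version):

```cpp
// Illustrative sketch only: lower the game viewport resolution at startup so the
// main view costs less, while the SceneCapture2D keeps rendering at full size.
#include "Kismet/KismetSystemLibrary.h"
#include "MyPlayerController.h" // hypothetical custom player controller header

void AMyPlayerController::BeginPlay()
{
    Super::BeginPlay();

    // Only really takes effect in packaged / standalone builds, not in the editor viewport.
    UKismetSystemLibrary::ExecuteConsoleCommand(this, TEXT("r.SetRes 640x480"), this);
}
```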

Lastly, UE4 has very good profiling tools (for example, the stat unit and stat gpu console commands, or the GPU Visualizer) that might help you pinpoint the biggest load.

Let me know what you find out.

Good luck!
Mitja

#3

Hey there, following up on this – the performance drop from SceneCapture2D / render target usage is still a major hurdle to using LightAct with Unreal.

Just having the SceneCapture2D running drops a 50 fps scene to around 25 to 30 fps (this time outputting a 1080p texture rather than 4K), and I’m wondering if there’s any way to avoid this?

I’m not sure of the VRAM usage (any suggestion on how to view it?), but I see 6144 MB of dedicated video memory, if that’s relevant.

These are my settings for the capture / render target:
[screenshot of the SceneCapture2D and render target settings]

Thank you!

#4

Hi Sam,

Sorry for the late reply. I was out of the office.

The performance drop happens because UE has to render everything twice - once for the viewport and once for the Scene Capture.

Therefore, the goal is to limit how much resources viewport rendering consumes.

As suggested by this thread, when reproducing your problem, I created a box with a black unlit material and placed it just in front of the viewport camera.
This helped increase the FPS of the game; however, it didn’t reach the same level as when the SceneCapture2D was switched off.
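For reference, here is a rough sketch of that idea as a small actor. The class name and loading the engine cube mesh here are illustrative assumptions, not an official workflow – in practice you can simply drop a cube with a black unlit material in front of the camera in the editor.

```cpp
// Sketch of a "viewport blocker": a cube placed just in front of the viewport
// camera, intended to carry a black unlit material so the main view has almost
// nothing to shade while the SceneCapture2D keeps rendering the real scene.
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "UObject/ConstructorHelpers.h"
#include "ViewportBlocker.generated.h"

UCLASS()
class AViewportBlocker : public AActor
{
    GENERATED_BODY()

public:
    AViewportBlocker()
    {
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;

        // Engine cube mesh; assign a black unlit material to it in the editor.
        static ConstructorHelpers::FObjectFinder<UStaticMesh> CubeMesh(
            TEXT("/Engine/BasicShapes/Cube.Cube"));
        if (CubeMesh.Succeeded())
        {
            Mesh->SetStaticMesh(CubeMesh.Object);
        }

        // No collision and no shadows: the box is purely a view blocker.
        Mesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);
        Mesh->SetCastShadow(false);
    }

private:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh;
};
```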

I hope this helps.

#5

That does help, thank you – so it sounds like this will be a performance hurdle for the foreseeable future, since there aren’t any plans (or any ability) to take output from a UE4 camera instead of a render target?

#6

Hi,

There are several approaches we are discussing internally and with Epic. When any one of them reaches a sufficiently developed stage, I’ll let you know. In the meantime, please use the suggestions above.

Best regards,
Mitja