Emulator Issues #12988


Metal - Lower FPS when disabling 24-bit Color and enabling Per-Pixel Lighting

Added by neirene over 1 year ago. Updated over 1 year ago.

Status:
Questionable
Priority:
Normal
Assignee:
-
% Done:

0%

Operating system:
N/A
Issue type:
Bug
Milestone:
Regression:
No
Relates to usability:
No
Relates to performance:
Yes
Easy:
No
Relates to maintainability:
No
Regression start:
Fixed in:

Description

While using the Metal renderer with the game Phantasy Star Online Episode 1 & 2 (GPOJ8P), the FPS will be slightly lower if you disable 24-bit Color and enable Per-Pixel Lighting (it can be either one, or both).

To reproduce this issue:

- On an M1 Mac, run the attached FIFO, set the game to specialised shaders (the default ones), and set the resolution upscale to 5x or beyond (just enough to make the GPU struggle and drop the FPS below the native 30 FPS). Then toggle the 24-bit Color / Per-Pixel Lighting settings; you will notice the FPS is lower or higher depending on whether these settings are enabled or disabled.

This was tested on an M1 Mac mini with 16 GB of RAM, a 512 GB disk, and Monterey 12.5.

Use this FIFO to test
https://drive.google.com/file/d/1dYJoq6Qf6Sheno7xydmOKJ8TymOtuP7f/view?usp=sharing


Files

status01.png (11.4 KB) status01.png specialised, 5x, per-pixel disabled, 24bit enabled neirene, 07/24/2022 11:39 AM
status02.png (12.8 KB) status02.png specialised, 5x, per-pixel enabled, 24bit disabled neirene, 07/24/2022 11:40 AM
Actions #1

Updated by JosJuice over 1 year ago

  • Status changed from New to Questionable
  • Relates to performance changed from No to Yes

That sounds normal.

Actions #2

Updated by pokechu22 over 1 year ago

For both of these, the GPU has to do more work.

  • Disabling 24-bit color, although more accurate, means that when the game isn't using 24-bit color itself we have to do extra work in pixel shaders to reduce the quality to what would normally be possible. (The GameCube's embedded framebuffer only has enough space to store 24 bits per pixel, which could be 8 bits each for red/green/blue with no alpha (transparency), or 6 bits each for red/green/blue and 6 bits for alpha (transparency). Force 24-bit color uses 8 bits for red/green/blue and also allows 6 bits for alpha, since we're not memory constrained in the same way). I wouldn't expect this to have a significant impact, though, as the extra work is very small (even at 5x IR, where there are 25 times more pixels) (rgba6_format is always false when using 24-bit color).
  • Per-pixel lighting just requires more work in general. With vertex lighting (which is what real hardware does), a single triangle needs to have lighting calculated at 3 points (each of its vertices). With per-pixel lighting, this instead happens at the pixels that the triangle covers; this is generally hundreds of pixels (and at higher IRs, could easily be thousands). Modern GPUs can usually handle this, but it's still doing a lot more work.
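To make the two bullets above concrete, here is a minimal, hypothetical C++ sketch (my own illustration, not Dolphin's actual shader or lighting code; the function name, pixel counts, and triangle size are assumptions). It shows the kind of per-pixel truncation done when 24-bit color is disabled and the game requested a 6-bit-per-channel framebuffer, plus a rough count of how many lighting evaluations per-pixel lighting performs compared with vertex lighting at 5x IR.

    // Hypothetical illustration only -- not Dolphin's actual shader or lighting code.
    #include <cstdint>
    #include <cstdio>

    // Extra work done per pixel when "Force 24-bit Color" is disabled and the game
    // requested a 6-bit-per-channel (RGBA6) framebuffer: truncate each 8-bit
    // channel to 6 bits, then expand it back for display.
    static uint8_t QuantizeTo6Bits(uint8_t channel)
    {
        const uint8_t six = channel >> 2;                      // keep the top 6 bits
        return static_cast<uint8_t>((six << 2) | (six >> 4));  // re-expand to 8 bits
    }

    int main()
    {
        std::printf("0xFF quantized to 6 bits -> 0x%02X\n", QuantizeTo6Bits(0xFF));

        // Rough cost comparison for lighting one triangle that covers ~200 pixels
        // at native resolution (the 200 is an arbitrary assumption).
        const long native_pixels_covered = 200;
        const long ir = 5;                                     // 5x internal resolution
        const long vertex_lighting_evals = 3;                  // once per vertex
        const long per_pixel_lighting_evals = native_pixels_covered * ir * ir;
        std::printf("vertex lighting: %ld evaluations; per-pixel at %ldx IR: %ld evaluations\n",
                    vertex_lighting_evals, ir, per_pixel_lighting_evals);
        return 0;
    }

Even in this toy form, the quantization is only a few bit operations per channel, while per-pixel lighting evaluates lighting thousands of times for the same triangle, which matches the relative impact described above.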

How big is the difference with just changing the 24-bit color option while per-pixel lighting is off?

Actions #3

Updated by neirene over 1 year ago

pokechu22 wrote:

For both of these, the GPU has to do more work.

  • Disabling 24-bit color, although more accurate, means that when the game isn't using 24-bit color itself we have to do extra work in pixel shaders to reduce the quality to what would normally be possible. (The GameCube's embedded framebuffer only has enough space to store 24 bits per pixel, which could be 8 bits each for red/green/blue with no alpha (transparency), or 6 bits each for red/green/blue and 6 bits for alpha (transparency). Force 24-bit color uses 8 bits for red/green/blue and also allows 6 bits for alpha, since we're not memory constrained in the same way). I wouldn't expect this to have a significant impact, though, as the extra work is very small (even at 5x IR, where there are 25 times more pixels) (rgba6_format is always false when using 24-bit color).
  • Per-pixel lighting just requires more work in general. With vertex lighting (which is what real hardware does), a single triangle needs to have lighting calculated at 3 points (each of its vertices). With per-pixel lighting, this instead happens at the pixels that the triangle covers; this is generally hundreds of pixels (and at higher IRs, could easily be thousands). Modern GPUs can usually handle this, but it's still doing a lot more work.

How big is the difference with just changing the 24-bit color option while per-pixel lighting is off?

It's a very small difference of 2~3 FPS; I just decided to report it because the description of the option itself says it doesn't affect performance.

Maybe the description could be updated to something like “Slightly decreases performance while being more accurate to the original hardware”, or something along those lines, for both Per-Pixel Lighting and 24-bit Color…

Actions #4

Updated by phire over 1 year ago

Fair enough.

Technically both statements are accurate; Per-Pixel Lighting says it "rarely causes slowdowns", which I think assumes the user isn't trying to run at stupidly high IRs. It should probably be changed to be more explicit about this.

And when 24-bit Color says it won't impact performance, what it really means is that enabling it won't negatively impact performance.

There is a general theme on the "Enhancements" page that enhancements might have negative performance impacts, so the descriptions only point out the ones that won't. Disabling fog and the copy filter probably also gives a small performance improvement. Perhaps we should change them to be more explicit about potential performance advantages.
