
View Full Version : Idea: A BENCHMARK application for DREAMCAST



kohan69
May 28th, 2006, 09:35
Greetz,

In conjunction with overclocking Dreamcasts, I was wondering if someone has coded an app that can be used for testing and comparing performance on a Dreamcast.

Putting in Metropolis Street Racer and waiting for the demo, or loading your Shenmue save and walking to "that spot in the city where it lags", is very tedious with an open Dreamcast, an oscilloscope, and a lot of wiring and tools on your table.

My suggestion is for the program to do the following, while displaying FPS (frames per second):

Spinning 3D models with variations in texture size, number of light sources, and number of objects rendered.
A total of 3-5 tests to show performance, min/max FPS, and maybe estimated CPU/GPU clock speeds too :)
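
Something along these lines, roughly. This is only a hypothetical skeleton: render_scene() and now_ms() are made-up placeholders for real PVR rendering and a hardware timer, so it shows the structure of the tests rather than working Dreamcast code.

    /* Skeleton of the proposed benchmark: a table of test cases (texture
     * size, lights, objects) and a timing loop reporting min/avg/max FPS.
     * render_scene() and now_ms() are placeholders. */
    #include <stdio.h>
    #include <time.h>

    typedef struct {
        const char *name;
        int texture_size;   /* pixels per side, e.g. 64, 128, 256 */
        int light_count;
        int object_count;
    } bench_case_t;

    static const bench_case_t cases[] = {
        { "small textures, 1 light, 8 objects",     64, 1,   8 },
        { "medium textures, 2 lights, 32 objects", 128, 2,  32 },
        { "large textures, 4 lights, 128 objects", 256, 4, 128 },
    };

    /* Placeholder: on hardware this would submit the spinning models to the
     * PVR and wait for the frame to finish. */
    static void render_scene(const bench_case_t *c, int frame) {
        (void)c; (void)frame;
    }

    /* Placeholder millisecond clock; on the DC you'd read a hardware timer. */
    static double now_ms(void) {
        return (double)clock() * 1000.0 / CLOCKS_PER_SEC;
    }

    int main(void) {
        const int frames_per_test = 600;   /* roughly 10 seconds at 60 Hz */

        for (size_t i = 0; i < sizeof(cases) / sizeof(cases[0]); i++) {
            double min_fps = 1e9, max_fps = 0.0, total_ms = 0.0;

            for (int f = 0; f < frames_per_test; f++) {
                double start = now_ms();
                render_scene(&cases[i], f);
                double dt = now_ms() - start;
                if (dt <= 0.0) dt = 0.001;      /* guard against zero */
                double fps = 1000.0 / dt;
                if (fps < min_fps) min_fps = fps;
                if (fps > max_fps) max_fps = fps;
                total_ms += dt;
            }

            printf("%s: min %.1f / avg %.1f / max %.1f FPS\n",
                   cases[i].name, min_fps,
                   frames_per_test * 1000.0 / total_ms, max_fps);
        }
        return 0;
    }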

Thanks,
KoHaN

Eric
May 28th, 2006, 11:30
I know there is a benchmark program available, but not for what you're looking for. The one I'm talking about was meant to pop right into the system and see what it's running at. It was mostly used by people who wanted to overclock their Dreamcast. Overclocking the Dreamcast is not the best way to run games either, as most games won't run and some will, so you pretty much end up with a Dreamcast that has been poorly upgraded. A wicked idea would have been if the system could have been fully upgraded with a new video card and such, but that's pretty much impossible. Even doing that would make no sense; who is going to make games that look that amazing on the Dreamcast compared to what's available today?

quzar
May 28th, 2006, 15:55
There is a benchmark program, maybe even two. It measures data read/write speeds for different kinds of memory, then does a lot of different sorts of CPU calculations. Just search the Dreamcast forum for benchmark or something.

EDIT: here are the two different benchmarking programs for the DC: http://www.dcemu.co.uk/vbulletin/showthread.php?t=6980&highlight=benchmark

http://www.dcemu.co.uk/vbulletin/showthread.php?t=4144&highlight=benchmark
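
If anyone is curious, the guts of that kind of benchmark boil down to something like this minimal plain-C sketch. Buffer sizes and loop counts here are arbitrary, and clock() stands in for the Dreamcast's hardware timer; it just times a block copy for memory read/write bandwidth, then a floating-point loop for the CPU.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define BUF_SIZE (1 << 20)   /* 1 MB test buffer */
    #define PASSES   64

    int main(void) {
        unsigned char *src = malloc(BUF_SIZE);
        unsigned char *dst = malloc(BUF_SIZE);
        if (!src || !dst) return 1;
        memset(src, 0xAA, BUF_SIZE);

        /* Memory test: copy the buffer repeatedly and report MB/s. */
        clock_t t0 = clock();
        for (int i = 0; i < PASSES; i++)
            memcpy(dst, src, BUF_SIZE);
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
        printf("memcpy: %.1f MB/s\n", PASSES * (BUF_SIZE / 1048576.0) / secs);

        /* CPU test: a tight floating-point divide/add loop. */
        volatile double acc = 0.0;
        t0 = clock();
        for (long i = 1; i <= 10000000L; i++)
            acc += 1.0 / (double)i;
        secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
        printf("10M float ops: %.2f s (acc=%f)\n", secs, acc);

        free(src);
        free(dst);
        return 0;
    }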

kohan69
May 28th, 2006, 23:27
There is a benchmark program, maybe even two. It measures data read/write speeds for different kinds of memory, then does a lot of different sorts of CPU calculations. Just search the Dreamcast forum for benchmark or something.

EDIT: here are the two different benchmarking programs for the DC: http://www.dcemu.co.uk/vbulletin/showthread.php?t=6980&highlight=benchmark

http://www.dcemu.co.uk/vbulletin/showthread.php?t=4144&highlight=benchmark


Thanks, that's a great start!

I'll try those out.
However, my idea is for the benchmark to reflect a realistic in-game workload, while displaying the FPS.

For example:
demo1 = the graphical intensity of Shenmue in the rain: at stock clocks the framerate will be ~15 FPS; at 240 MHz it will be 26 FPS
- to simulate the realistic effect of the overclock.


The DMAC demo could maybe be used for this purpose, but that has to be tested (comparing the framerates to the ms results and to the clocks).
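
For what it's worth, the arithmetic behind 'mimicking' an overclock in software is simple if you assume performance scales linearly with clock speed. That's a rough assumption (VRAM and bus speeds don't change in step, so real games won't scale like this), but it shows the idea; the 15 FPS Shenmue figure above is used as the hypothetical baseline.

    #include <stdio.h>

    /* Naive estimate: scale a measured baseline FPS by the clock ratio. */
    static double estimated_fps(double baseline_fps, double stock_mhz,
                                double oc_mhz) {
        return baseline_fps * (oc_mhz / stock_mhz);
    }

    int main(void) {
        /* 15 FPS measured at the stock 200 MHz, estimated at 215 and 240. */
        printf("215 MHz: ~%.1f FPS\n", estimated_fps(15.0, 200.0, 215.0));
        printf("240 MHz: ~%.1f FPS\n", estimated_fps(15.0, 200.0, 240.0));
        return 0;
    }

With that naive model 240 MHz only gets you to about 18 FPS, so a real benchmark would want measured curves for each clock speed rather than a straight ratio.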

Epicenter
January 4th, 2007, 15:37
It sounds to me like the thing that's really lagging the system there is the GPU, not the CPU: it isn't getting its work done in time for the new frame to be drawn. IIRC no one's overclocked the PowerVR chip yet, probably because it would cause a rise in the frequency of the video output and potentially throw the timing of the whole system out of whack (that is, unless its reference frequency and its operating frequency are two distinct entities).

As for running a framerate meter in-game, you'd have to manually edit the software (difficult, nearly impossible). I'd look for a hardware event that you can attribute to a unique frame being drawn (60 frames per second will be drawn, or 50 in some regions, regardless of whether the game is running at full speed or not; the frames drawn, however, won't all be unique). Perhaps a line is pulled high or low when a new frame is completed and you could hook up some sort of a counter to it.

Probably better off just using a comprehensive benchmark program!

quzar
January 4th, 2007, 18:08
It sounds to me like the thing that's really lagging the system there is the GPU, not the CPU: it isn't getting its work done in time for the new frame to be drawn. IIRC no one's overclocked the PowerVR chip yet, probably because it would cause a rise in the frequency of the video output and potentially throw the timing of the whole system out of whack (that is, unless its reference frequency and its operating frequency are two distinct entities).

As for running a framerate meter in-game, you'd have to manually edit the software (difficult, nearly impossible). I'd look for a hardware event that you can attribute to a unique frame being drawn (60 frames per second will be drawn, or 50 in some regions, regardless of whether the game is running at full speed or not; the frames drawn, however, won't all be unique). Perhaps a line is pulled high or low when a new frame is completed and you could hook up some sort of a counter to it.

Probably better off just using a comprehensive benchmark program!

No, it's the CPU. Almost regardless of how much you tax the CPU to have it send data to the GPU, the GPU never really lags behind, and even when it does, it works on 3 frames at a time, so the frame just gets drawn the next time around. The PVR is by far the most powerful part of the system (although maybe the AICA is more complex, even though you can't quite get access to it like you should).

Epicenter
January 18th, 2007, 01:20
No, it's the CPU. Almost regardless of how much you tax the CPU to have it send data to the GPU, the GPU never really lags behind, and even when it does, it works on 3 frames at a time, so the frame just gets drawn the next time around. The PVR is by far the most powerful part of the system (although maybe the AICA is more complex, even though you can't quite get access to it like you should).

If I recall correctly the PVR has direct paths to video memory and DMA to main SH-4 RAM. Why would the CPU be heavily taxed communicating with the PVR, except perhaps for data decompression or conversion to a VRAM-friendly image format? ...

The PVR is extremely powerful, no doubt about it. But that doesn't mean it can't slow down. Sure, it has to draw 60 (or 50..) frames per second, but if it's overworked it will just duplicate frames and slow down the game. It's not like the SH-4 is doing 3D transformation math in software, so all it's running is game logic, physics, decompression of media to be copied to the GPU, music/audio handling code, and input, for the most part. Why would a scene like the one described in Shenmue cause intense CPU slowdown...? I could be way off base; if so, please correct me.

BlueCrab
January 18th, 2007, 03:17
It's not like the SH-4 is doing 3D transformation math in software, so all it's running is game logic, physics, decompression of media to be copied to the GPU, music/audio handling code, and input, for the most part.

Actually, the SH4 is doing the 3D transformation math. Granted, it has a few instructions for doing most of it, but it still takes a bit of time for those instructions to run (more than normal instructions). Plus, decompressing media is a VERY CPU-intensive task. Decompressing anything is pretty intensive; look how long it takes for a large bzip2'd tarball to untar (take the Linux kernel, or GCC, for a good example). Complex compression algorithms take a lot of time to run.
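
For reference, here is the per-vertex work being talked about, written out in plain C (illustrative only, not KOS code). A 4x4 matrix transform is 16 multiplies and 12 adds per vertex; the SH4's ftrv instruction collapses that into a single vector operation against a matrix pre-loaded into its back bank of FPU registers, which is what those "few instructions" refer to.

    #include <stdio.h>

    typedef struct { float x, y, z, w; } vec4;

    /* Transform one vertex by a 4x4 matrix (what ftrv does in hardware). */
    static vec4 mat_transform(const float m[4][4], vec4 v) {
        vec4 r;
        r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w;
        r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w;
        r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w;
        r.w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w;
        return r;
    }

    int main(void) {
        /* Simple translation matrix: moves a point by (1, 2, 3). */
        const float m[4][4] = {
            { 1, 0, 0, 1 },
            { 0, 1, 0, 2 },
            { 0, 0, 1, 3 },
            { 0, 0, 0, 1 },
        };
        vec4 p = { 5, 5, 5, 1 };
        vec4 q = mat_transform(m, p);
        printf("(%g, %g, %g)\n", q.x, q.y, q.z);   /* prints (6, 7, 8) */
        return 0;
    }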

Epicenter
January 18th, 2007, 13:38
Decompression should be a one-shot thing for each loaded element, though. You shouldn't be decompressing content on the fly every frame unless the programmer is psychotic... :)

If the transformation is all happening on the SH-4, what is the PVR's power being put towards? Mostly filling polys, putting down raw pixels, and performing its specific 3D functions (filtering, fog effects, fun things like that)? It seems like a big waste to task the CPU with that; although its floating-point performance is absurd, it still seems unorthodox to do.

OneThirty8
January 20th, 2007, 12:53
Decompression should be a one-shot thing for each loaded element, though. You shouldn't be decompressing content on the fly every frame unless the programmer is psychotic... :)
That depends on what you're decompressing. The mention above was "decompression of media," which could be interpreted a number of ways. If you're talking about decompressing a sprite or a skin for a 3d model from a compressed image format, you're right. It wouldn't make sense to do that every frame if you can do it once and store it in memory somewhere. When talking about commercial games, think about things like cut-scenes. Sofdec movies use MPEG-1 compression for the video. It wouldn't make sense to even attempt to decompress this all at once and store it. You'd run out of memory very quickly. You really have no choice but to decompress each frame of video right before it's displayed.
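
A minimal sketch of that streaming pattern, with decode_next_frame() and display_frame() as made-up stand-ins for the real Sofdec/MPEG-1 decoder and the blit to the screen: only one frame's worth of decoded video is kept in memory, produced right before it's shown.

    #include <stdio.h>
    #include <string.h>

    #define FRAME_W      320
    #define FRAME_H      240
    #define TOTAL_FRAMES  30

    /* Stand-in for the video decoder: fills one frame's worth of pixels. */
    static int decode_next_frame(unsigned char *dst, int frame_no) {
        if (frame_no >= TOTAL_FRAMES)
            return 0;                                        /* end of stream */
        memset(dst, frame_no & 0xFF, FRAME_W * FRAME_H * 2); /* fake RGB565 */
        return 1;
    }

    /* Stand-in for blitting the decoded frame to the display. */
    static void display_frame(const unsigned char *src, int frame_no) {
        printf("showing frame %d (first byte 0x%02X)\n", frame_no, src[0]);
    }

    int main(void) {
        /* One decoded frame in memory at a time, nothing more. */
        static unsigned char framebuf[FRAME_W * FRAME_H * 2];
        int n = 0;

        while (decode_next_frame(framebuf, n)) {   /* decode a frame... */
            display_frame(framebuf, n);            /* ...then show it */
            n++;
        }
        return 0;
    }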

If the transformation is all happening on the SH-4, what is the PVR's power being put towards? Mostly filling polys, putting down raw pixels, and performing its specific 3D functions (filtering, fog effects, fun things like that)? It seems like a big waste to task the CPU with that; although its floating-point performance is absurd, it still seems unorthodox to do.
Yeah, it fills polygons with the color or texture you've told it to use, it does depth-sorting of polygons, resizes graphics on the fly and applies bilinear filtering... it does handle quite a bit for you.

kohan69
February 11th, 2007, 22:30
As for running a framerate meter in-game, you'd have to manually edit the software (difficult, nearly impossible).

No. I meant writing software that mimics the effects of overclocking.

So this 'Shenmue 1 city rain benchmark' would run at around 15 FPS on a stock 200 MHz system, 23 FPS at 215 MHz, and so on and so forth.

I believe overclocking the Dreamcast raises both the GPU and CPU clocks.

update:

Apparently there is a benchmark with a golden tea kettle called "Dreamcast Debug and Benchmark", however the framerate never drops from 60 FPS. It's a great way to test whether all your buttons work though :D