Assetto Corsa Benchmark Thread

  • Thread starter Punknoodle
  • 59 comments
  • 27,775 views
For those asking what performance they can expect from certain hardware, AC now has an in-built benchmarking tool.

I'll go first: I ran the benchmark with my current settings, which I use all the time.

AC VERSION: 1.3.4 (x64)
POINTS: 9315
FPS: AVG=63 MIN=26 MAX=131 VARIANCE=10 CPU=89%

LOADING TIME: 13s
GPU: NVIDIA GeForce GTX 780 (5760x1080)
OS-Version: 6.2.9200 () 0x300-0x1
CPU CORES: 4
FULLSCREEN: ON
AA:2X AF:8X SHDW:2048 BLUR:0
WORLD DETAIL: 5 SMOKE:3
PP: QLT:5 HDR:1 FXAA:0 GLR:5 DOF:5 RAYS:1 HEAT:1
 
AC VERSION: 1.3.4 (x64)
POINTS: 25270
FPS: AVG=172 MIN=31 MAX=281 VARIANCE=41 CPU=74%

LOADING TIME: 13s
GPU: NVIDIA GeForce GTX 760 (1920x1080)
OS-Version: 6.1.7601 (Service Pack 1) 0x100-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:4X AF:16X SHDW:2048 BLUR:0
WORLD DETAIL: 5 SMOKE:5
PP: QLT:4 HDR:1 FXAA:1 GLR:4 DOF:4 RAYS:1 HEAT:1

(i7-3770, Asus ROG MARS GTX-760)
 
AC VERSION: 1.3.4 (x86)
POINTS: 5027
FPS: AVG=34 MIN=2 MAX=47 VARIANCE=1 CPU=53%

LOADING TIME: 85s
GPU: AMD Radeon R9 200 Series (1920x1058)
OS-Version: 6.1.7601 (Service Pack 1) 0x100-0x1
CPU CORES: 4
FULLSCREEN: OFF
AA:4X AF:16X SHDW:4096 BLUR:0
WORLD DETAIL: 5 SMOKE:0
PP: QLT:3 HDR:0 FXAA:1 GLR:3 DOF:3 RAYS:1 HEAT:1
 
AC VERSION: 1.3.4 (x64)
POINTS: 8433
FPS: AVG=57 MIN=37 MAX=69 VARIANCE=0 CPU=34%

LOADING TIME: 13s
GPU: NVIDIA GeForce GTX 980 Ti (5900x1080)
OS-Version: 6.2.9200 () 0x100-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:4X AF:16X SHDW:4096 BLUR:4
WORLD DETAIL: 5 SMOKE:5
PP: QLT:5 HDR:1 FXAA:1 GLR:5 DOF:5 RAYS:1 HEAT:1
 
AC VERSION: 1.3.4 (x64)
POINTS: 19805
FPS: AVG=135 MIN=12 MAX=228 VARIANCE=5 CPU=61%

LOADING TIME: 37s
GPU: NVIDIA GeForce GTX 970 (1920x1080)
OS-Version: 6.1.7601 (Service Pack 1) 0x300-0x1
CPU CORES: 4
FULLSCREEN: ON
AA:4X AF:8X SHDW:2048 BLUR:0
WORLD DETAIL: 5 SMOKE:2
PP: QLT:5 HDR:1 FXAA:1 GLR:5 DOF:5 RAYS:1 HEAT:1

^ Running my normal settings. Might bump up the AA and AF and see how much it affects the FPS. Also CPU is a Core i5 4690.
 
I see a GTX 760, which is a 2GB card if I'm not mistaken, with a bench of 25,270, but a GTX 970 comes in at 19,805 and a 980 Ti at only 8,433 :confused::confused:. Is this like a reverse benchmark scale?
 
I see a GTX 760, which is a 2GB card if I'm not mistaken, with a bench of 25,270, but a GTX 970 comes in at 19,805 and a 980 Ti at only 8,433 :confused::confused:. Is this like a reverse benchmark scale?

My 770 2GB was basically on par with the 980 O.o
 
I see a GTX 760, which is a 2GB card if I'm not mistaken, with a bench of 25,270, but a GTX 970 comes in at 19,805 and a 980 Ti at only 8,433 :confused::confused:. Is this like a reverse benchmark scale?

The MARS 760 has two GPUs on one card; each GPU has 2GB.
 
The MARS 760 has two GPUs on one card; each GPU has 2GB.
That might put it at 8-9k, not 25k. 25k is essentially 2x 980 Ti performance. There's also the fact that the 980 Ti got out-benched by a GTX 770, and a GTX 970 doubled up on its performance as well. Something doesn't make sense to me, but I'm no expert.
 
Yeah, my score was with triple-screen rendering as well. Running single-screen rendering will increase the score, and obviously going back to 1920x1080 would increase it greatly, but I wanted to show what a single 780 does running triple screens, as there's a misconception out there that you need two cards to run triples when you clearly don't. My settings aren't low; sure, more AA would probably look nicer on larger displays, but on my 24s I don't notice the difference and don't think it's worth the performance hit. I'm very happy with my single-card performance, and it's a one-generation-old card, too.
 
Your score seems to be based on the FPS you're getting, which obviously depends on the settings you're running. My GTX 970 got 14197 points at my usual settings and 8548 points with everything at the highest settings.
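Pairing the POINTS and AVG fps figures posted so far in this thread, the ratio between them comes out nearly constant at roughly 147, which would be consistent with the score simply being the total number of frames rendered over a fixed-length run. That's pure speculation, not anything confirmed by Kunos, but it's easy to check with a quick script (numbers taken from the results above):

```python
# Rough sanity check on how POINTS might relate to average fps, using
# (points, avg fps) pairs posted earlier in this thread. Speculation only.

results = [
    (9315, 63),    # GTX 780, triples
    (25270, 172),  # MARS 760
    (5027, 34),    # R9 200 series
    (8433, 57),    # GTX 980 Ti, triples
    (19805, 135),  # GTX 970
]

ratios = [points / fps for points, fps in results]
for (points, fps), r in zip(results, ratios):
    print(f"{points:6d} pts / {fps:3d} fps = {r:.1f}")

# Every ratio lands between roughly 146.7 and 148.0, i.e. points is about
# avg fps x 147 - which would fit a score that is just the frame count of
# a benchmark run lasting around 147 seconds.
print(f"spread: {max(ratios) - min(ratios):.1f}")
```

If that guess is right, the score tells you nothing a plain average fps figure doesn't.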
 
Here is mine with an i5 and a 7950 3GB GPU.
 
In order for results to be comparable in this thread, we would need to run benches at agreed settings. I would suggest some "middle ground" settings that most people can run, e.g. resolution at 1920x1080.

We could of course go all in and put everything at max, but the last time I checked, those reflection settings at max completely choked my GPUs, so for me that's not really relevant. I have Reflection Quality at Low and Reflection Rendering Frequency at Static, IIRC.

As @Huks said, the score seems to be based on FPS, so that number is irrelevant unless everyone has equal settings.

EDIT: It might also be worth remembering that VRAM doesn't stack. If you have two cards (SLI or CrossFire) with 4GB each, you effectively have 4GB of VRAM: 4+4=4, not 8. There was a rumor that DX12 might support VRAM stacking, but I haven't heard any more about it.
 
Last edited:
What is the variance value telling us?
Stefano explained: "variance is the average fps variation from frame to frame. You want it as low as possible, a low variance is an indication of stable "smooth" frame rate.".

EDIT: If I'm not mistaken, this function measures the frame time, i.e. how long each frame is displayed on the screen.
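Based on Stefano's description, a minimal sketch of how such a figure could be computed from per-frame times. The function name and the sampling are assumptions, not Kunos' actual implementation:

```python
# Hypothetical sketch of the VARIANCE figure as Stefano describes it:
# the average fps change from one frame to the next. Not AC's real code.

def fps_variance(frame_times_ms):
    """Average frame-to-frame fps variation, from a list of frame times (ms)."""
    fps = [1000.0 / t for t in frame_times_ms]
    deltas = [abs(b - a) for a, b in zip(fps, fps[1:])]
    return sum(deltas) / len(deltas)

# A perfectly smooth 60 fps run (every frame takes the same time) gives 0:
smooth = [1000.0 / 60] * 100
print(fps_variance(smooth))  # 0.0

# Alternating 16 ms / 20 ms frames (62.5 fps / 50 fps) feels far less smooth,
# even though the average fps is still decent:
stutter = [16.0, 20.0] * 50
print(fps_variance(stutter))  # 12.5
```

This matches the intuition in the thread: two runs can have the same average fps, but the one with the lower variance will feel smoother.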
 
Last edited:
AC VERSION: 1.3.4 (x64)
POINTS: 10036
FPS: AVG=68 MIN=21 MAX=132 VARIANCE=2 CPU=42%

LOADING TIME: 14s
GPU: NVIDIA GeForce GTX 965M (1920x1080)
OS-Version: 6.2.9200 () 0x300-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:4X AF:16X SHDW:4096 BLUR:0
WORLD DETAIL: 5 SMOKE:5
PP: QLT:5 HDR:1 FXAA:1 GLR:5 DOF:5 RAYS:1 HEAT:1

Everything is maxed out. My laptop has 2 GTX 965M GPUs running in SLI.
 
On a Dell Latitude E6450 laptop, with a 24in monitor connected via HDMI. Not sure whether I'm actually using the onboard Intel graphics (listed in the GPU section below) or the add-on AMD Radeon HD 8790M w/ 2GB. I'm getting 60 fps in hotlap conditions, so that's all I care about. Graphics settings are toward the low end, but that's fine because work paid for the laptop. :P

AC VERSION: 1.3.4 (x64)
POINTS: 6131
FPS: AVG=41 MIN=19 MAX=55 VARIANCE=1 CPU=86%

LOADING TIME: 42s
GPU: Intel(R) HD Graphics 4600 (1280x720)
OS-Version: 6.1.7601 (Service Pack 1) 0x100-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:1X AF:2X SHDW:512 BLUR:0
WORLD DETAIL: 2 SMOKE:2
PP: QLT:0 HDR:0 FXAA:0 GLR:0 DOF:0 RAYS:0 HEAT:0
 
It doesn't make much sense to me for the benchmark to base its score on fps alone. Since my fps is capped at 60, I got the same score at both 1440p and 4K [8,765 pts, most settings maxed, 2xAA].

Tomorrow I'm going to uncap the frame rate, run everything maxed out at 4k, then do a lowest settings run at minimum resolution to see what my new 980ti can get.
 
It doesn't make much sense to me for the benchmark to base its score on fps alone. Since my fps is capped at 60, I got the same score at both 1440p and 4K [8,765 pts, most settings maxed, 2xAA].

Tomorrow I'm going to uncap the frame rate, run everything maxed out at 4k, then do a lowest settings run at minimum resolution to see what my new 980ti can get.
I might be wrong, but even if you cap your FPS at 60, the benchmark uncaps the limit when it runs. I thought I read that on the official forums. I have mine capped at 60 fps for gameplay, but my system averaged 68 fps during the benchmark test.
 
I might be wrong, but even if you cap your FPS at 60, the benchmark uncaps the limit when it runs. I thought I read that on the official forums. I have mine capped at 60 fps for gameplay, but my system averaged 68 fps during the benchmark test.

Yes, the benchmark measures potential performance. Running at higher frame rates than your monitor can display won't produce better graphics, and minimum values below your monitor's refresh rate mean that some frames won't be updated, creating a micro-stutter.
 
1440p and 4K both registered 60 fps max and a 59 average at the same graphical settings when I did the benchmarks, hence the same score. 1440p should have had a much higher frame rate. Might it be RivaTuner running in the background, overriding the Assetto Corsa benchmark?
 
1440p and 4K both registered 60 fps max and a 59 average at the same graphical settings when I did the benchmarks, hence the same score. 1440p should have had a much higher frame rate. Might it be RivaTuner running in the background, overriding the Assetto Corsa benchmark?

Something else is happening with your results. I am running an EVGA GTX 970 SSC with a single 4K/60Hz monitor at its native 3840x2160 resolution. I ran a benchmark on it.

3840 X 2160
AC VERSION: 1.3.4 (x64)
POINTS: 7998
FPS: AVG=54 MIN=43 MAX=99 VARIANCE=0 CPU=27%

LOADING TIME: 16s
GPU: NVIDIA GeForce GTX 970 (3840x2160)
OS-Version: 6.2.9200 () 0x100-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:4X AF:4X SHDW:2048 BLUR:0
WORLD DETAIL: 4 SMOKE:3
PP: QLT:5 HDR:1 FXAA:0 GLR:5 DOF:5 RAYS:1 HEAT:1

When I saw this result, I was struck by the fact that the GPU was not able to keep the monitor fed at its full capacity while the rest of the computer was loafing. So I tried 1080p for the output.

1920 X 1080
AC VERSION: 1.3.4 (x64)
POINTS: 22911
FPS: AVG=156 MIN=72 MAX=241 VARIANCE=25 CPU=78%

LOADING TIME: 12s
GPU: NVIDIA GeForce GTX 970 (1920x1080)
OS-Version: 6.2.9200 () 0x100-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:4X AF:4X SHDW:2048 BLUR:0
WORLD DETAIL: 4 SMOKE:3
PP: QLT:5 HDR:1 FXAA:0 GLR:5 DOF:5 RAYS:1 HEAT:1

The GPU is now really pumping out the frames and the CPU is working a lot harder, BUT the picture on the monitor is significantly less detailed. It is easy to see the quality difference. So I selected an intermediate resolution between 1080p and 2160p with the same aspect ratio.

2560 X 1440
AC VERSION: 1.3.4 (x64)
POINTS: 15517
FPS: AVG=105 MIN=77 MAX=148 VARIANCE=0 CPU=53%

LOADING TIME: 11s
GPU: NVIDIA GeForce GTX 970 (2560x1440)
OS-Version: 6.2.9200 () 0x100-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:4X AF:4X SHDW:2048 BLUR:0
WORLD DETAIL: 4 SMOKE:3
PP: QLT:5 HDR:1 FXAA:0 GLR:5 DOF:5 RAYS:1 HEAT:1

I am going to be using this setting for a while, as it seems to be in the sweet spot for my rig, but it has me thinking about a new GPU that can comfortably perform with minimums above 60 fps at full 4K resolution. Is anyone currently running a GPU on a single 4K monitor that meets this objective?
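A quick back-of-the-envelope on the three runs above: multiplying resolution by average fps gives the pixel throughput, which stays in the same rough ballpark at every resolution. That is what you'd expect when the GPU rather than the CPU is the bottleneck, and the falling CPU% at higher resolutions points the same way.

```python
# Pixel throughput for the three GTX 970 runs above (identical settings,
# different resolutions). Figures taken from the posted benchmark results.

runs = [
    (1920, 1080, 156),  # 78% CPU
    (2560, 1440, 105),  # 53% CPU
    (3840, 2160, 54),   # 27% CPU
]

for w, h, avg_fps in runs:
    mpix = w * h * avg_fps / 1e6
    print(f"{w}x{h} @ {avg_fps} fps -> {mpix:.0f} Mpix/s")
```

Throughput actually rises somewhat with resolution (roughly 323 to 448 Mpix/s), so the card gets slightly more efficient per pixel as the CPU stops being a factor; either way, frame rate here is clearly bound by pixel count.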
 
I am going to be using this setting for a while, as it seems to be in the sweet spot for my rig, but it has me thinking about a new GPU that can comfortably perform with minimums above 60 fps at full 4K resolution. Is anyone currently running a GPU on a single 4K monitor that meets this objective?

I found the 970 could pretty much max the game out at 1440p and stay above 60 fps. My Zotac could overclock beyond 1500MHz, though, so your mileage may vary. Like you, however, once I got a taste of the image quality that 4K delivers, even 1440p can leave something to be desired, although it is a great compromise between image quality and frame rate.

I upgraded to the 980 Ti, which handles this game at 4K very well; I have no problem locking in a solid 60 fps with 2xAA and most settings maxed, the other few on high.


Edit 2 - Seems like it was RivaTuner overriding AC. Anyway, I tried everything on minimum at 640x480 but was completely CPU-limited: an average of 239 fps and a max of 355, with the GPU at 26-29% utilization. That gave a score of 34,000.
 
Last edited:
I was messing around with some settings today, running the test, and discovered something that's been bugging me in AC. I have always had a little bit of stutter in certain corners on a couple of tracks. It didn't matter if I ran triple or single screen, 250 fps or 50, vsync or not; it always happened in the same spots on the same tracks. Today I turned the shadow resolution from High to Medium and it was like night and day. The game runs so much smoother now. It was good before except for those few corners, but now it's just so much better. I didn't notice much difference in the shadows, but the fps boost allowed me to bump AF up one notch and world detail up one to Very High. Just thought I'd share in case others are having stuttering issues; it might work for you as well.
 
Does anyone know what the scores mean? Like, what is the minimum score a system should achieve in order to provide an enjoyable experience? Of course, some systems will have to turn down some settings (maybe even all of them) to achieve that baseline score, whereas some people own systems that let them run everything fully maxed out and still obtain a great five-figure score. Is 9,000 good, or just okay? Is 10,000 good, or outstanding?

One thing I've noticed (maybe it was on the official forums) is that some people achieve benchmark scores of ~14,000 but keep several settings turned down quite low. Now, I'm sure their systems could run everything maxed out and probably still score 11,000, yet they don't... That blows my mind. Why keep settings turned down if it's not hindering your performance?
 
I was messing around with some settings today, running the test, and discovered something that's been bugging me in AC. I have always had a little bit of stutter in certain corners on a couple of tracks. It didn't matter if I ran triple or single screen, 250 fps or 50, vsync or not; it always happened in the same spots on the same tracks. Today I turned the shadow resolution from High to Medium and it was like night and day. The game runs so much smoother now. It was good before except for those few corners, but now it's just so much better. I didn't notice much difference in the shadows, but the fps boost allowed me to bump AF up one notch and world detail up one to Very High. Just thought I'd share in case others are having stuttering issues; it might work for you as well.
Were those corners the two sharp right-handers at the back of Magione? I had trouble with those for months, with horrible stuttering. I noticed in my benchmark that my frames went as low as 34 :guilty: They certainly can't have something like that when the game makes it to console.
 
Like, what is minimum score a system should achieve in order to provide an enjoyable experience?
An enjoyable experience is a very subjective thing. For me it's about keeping the frame rate above my refresh rate (since I'm using vsync) while still having a decent amount of eye candy, for smooth and stutter-free gameplay.

I guess you'll just have to figure it out for yourself: what's the minimum frame rate you can accept, with how many cars on track, and at which graphical detail? :) Once you figure that out, it's just a matter of driving some laps, watching the replay while keeping an eye on the fps, and adjusting settings accordingly. Then repeat the process until you are happy. That's what I did, but of course others might have a completely different approach.

The score is irrelevant at the moment, until we know exactly how it's calculated, so I wouldn't worry too much about it.
 
Last edited: