Commit

New consistency factor calculation
TheDark98 committed Jan 17, 2025
1 parent 91b28ea commit e7dbde5
Showing 1 changed file with 10 additions and 9 deletions.
19 changes: 10 additions & 9 deletions osu.Game.Rulesets.Taiko/Difficulty/TaikoDifficultyCalculator.cs
@@ -4,6 +4,7 @@
using System;
using System.Collections.Generic;
using System.Linq;
using osu.Framework.Extensions.ObjectExtensions;

Check failure on line 7 in osu.Game.Rulesets.Taiko/Difficulty/TaikoDifficultyCalculator.cs

GitHub Actions / Code Quality: Using directive is unnecessary. (https://learn.microsoft.com/dotnet/fundamentals/code-analysis/style-rules/ide0005)

using osu.Game.Beatmaps;
using osu.Game.Rulesets.Difficulty;
using osu.Game.Rulesets.Difficulty.Preprocessing;
@@ -203,25 +204,25 @@ private double combinedDifficultyValue(Rhythm rhythm, Reading reading, Colour co
peaks.Add(peak);
}

// We are taking the top 20% of weighted strain spikes to achieve a good perception of the map's peak sections.
List<double> hardStrains = peaks.OrderDescending().ToList().GetRange(0, peaks.Count / 10 * 2);

// We are taking the middle 20% of weighted strain spikes to achieve a good perception of the map's overall progression.
List<double> midStrains = peaks.OrderDescending().ToList().GetRange(peaks.Count / 10 * 4, peaks.Count / 10 * 2);

// We can calculate the consistency factor by dividing the average of the middle-weight spikes by the average of the top-weight spikes.
// The result is a value from 0 to 1 that represents the consistency across all peaks.
totalConsistencyFactor = midStrains.Average() / hardStrains.Average();
List<double> hardStrains = new List<double>();

double difficulty = 0;
double weight = 1;

foreach (double strain in peaks.OrderDescending())
{
// We only take spikes above 80% of the highest strain spike to achieve a good perception of the map's peak sections.
if (strain / peaks.Max() > 0.8)
hardStrains.Add(strain);

difficulty += strain * weight;
weight *= 0.9;
}

// We can calculate the consistency factor by dividing the average of all peaks by the average of the hardest spikes.
// The result is a value from 0 to 1 that represents the consistency across all peaks.
totalConsistencyFactor = peaks.Average() / hardStrains.Average();

return difficulty;
}
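
For reference, here is a minimal standalone sketch (in C#) of how the reworked calculation hangs together, assuming `peaks` already holds the per-section strain peaks collected above. The `ConsistencyFactorSketch` class, the `Combine` method and the tuple return type are illustrative only, not part of the actual calculator:

using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only; names and structure are assumptions, not osu! code.
internal static class ConsistencyFactorSketch
{
    public static (double difficulty, double consistencyFactor) Combine(IReadOnlyList<double> peaks)
    {
        List<double> hardStrains = new List<double>();

        double difficulty = 0;
        double weight = 1;
        double maxPeak = peaks.Max();

        foreach (double strain in peaks.OrderDescending())
        {
            // Only spikes above 80% of the highest strain count as "hard" strains.
            if (strain / maxPeak > 0.8)
                hardStrains.Add(strain);

            // Exponentially decaying weights: the hardest sections dominate the total.
            difficulty += strain * weight;
            weight *= 0.9;
        }

        // The closer the average of all peaks is to the average hard strain,
        // the more evenly difficult the map is; the ratio falls between 0 and 1.
        double consistencyFactor = peaks.Average() / hardStrains.Average();

        return (difficulty, consistencyFactor);
    }
}

For example, with peaks of 10, 9, 5, 5 and 1, only 10 and 9 clear the 0.8 threshold, so the consistency factor is 6 / 9.5 ≈ 0.63, whereas a map whose peaks are all equal scores exactly 1.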
