Add HuntBugs static analyzer and fix warnings
HuntBugs is the unofficial successor to FindBugs, rebuilt on a clean code base.
The remaining warnings are all false positives, and I am not sure how to
suppress them individually. It did find a few real mistakes where the array was
checked for null rather than its element (the weak key null guard).

See https://github.com/amaembo/huntbugs for details.
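A minimal sketch of the flagged pattern (hypothetical names; the real code reads a weakly referenced key into a one-element holder array). The holder array itself is never null, so testing the array instead of its element silently disables the guard:

```java
// Hypothetical sketch: `nodeKey` is a one-element out-parameter array, so
// `nodeKey == null` is always false. The guard must test the element, which
// is null once the weak key has been garbage collected.
public class WeakKeyGuard {
    static boolean collectedBuggy(Object[] nodeKey) {
        return nodeKey == null; // always false: the holder array exists
    }

    static boolean collectedFixed(Object[] nodeKey) {
        return nodeKey[0] == null; // true iff the weak key was reclaimed
    }

    public static void main(String[] args) {
        Object[] nodeKey = { null }; // simulates WeakReference.get() returning null
        System.out.println(collectedBuggy(nodeKey)); // false: collection missed
        System.out.println(collectedFixed(nodeKey)); // true: cause is COLLECTED
    }
}
```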
ben-manes committed Nov 6, 2016
1 parent 182f7a5 commit 4750631
Showing 9 changed files with 29 additions and 14 deletions.
@@ -1634,7 +1634,7 @@ V putSlow(K key, V value, int newWeight, boolean notifyWriter, boolean onlyIfAbs
nodeKey[0] = n.getKey();
oldValue[0] = n.getValue();
oldWeight[0] = n.getWeight();
if ((nodeKey == null) || (oldValue[0] == null)) {
if ((nodeKey[0] == null) || (oldValue[0] == null)) {

@johnou commented on Nov 6, 2016

Could this have theoretically caused missing removal event callbacks?

@ben-manes (author) commented on Nov 6, 2016

No, I don't think so. It would have had a slightly negative effect, though.

putSlow:
The removal cause was mislabeled as REPLACED. A consumer might then NPE by assuming the key was non-null.

computeIfAbsent & remap:
Retains the node (believing it is present), so a stale value is returned. But that seems unlikely since keys are looked up by identity, so the caller must have the key instance. Therefore I think this check is for consistency, as the key can never be GC'd at this point.

So only one observable mistake, I think. Guava had a similar bug, so I presume most users check for nullness if they use weak keys, regardless of the cause.

@ben-manes (author) commented on Nov 6, 2016

Also, putSlow is only called for the weight=0 case, so that's very uncommon. It's either an AsyncLoadingCache with an incomplete future or a weighted cache with a weigher that returns zero. Both are uncommon, as we try to promote loading from (not inserting into) the cache. So I think it was benign.

@johnou commented on Nov 6, 2016

If the removal event is mislabeled, that may cause a leak with how local objects are stored in Orbit (see the code snippet below). I am keen to see how it behaves with this change.


    private final ConcurrentMap<Object, LocalObjectEntry> localObjects = new ConcurrentHashMap<>();
    private final Cache<Object, LocalObjectEntry> objectMap = Caffeine.newBuilder().weakKeys()
            .removalListener(this::onRemoval).build();

    private void onRemoval(final Object key, final LocalObjectEntry value, final RemovalCause removalCause)
    {
        if (removalCause == RemovalCause.COLLECTED)
        {
            if (value != null && value.getRemoteReference() != null)
            {
                localObjects.remove(value.getRemoteReference());
            }
        }
    }

@ben-manes (author) commented on Nov 6, 2016

You should be fine. You don't use a weighted cache in that configuration, so all entries have a weight of 1. If you did, and used zero weights, and called put explicitly, then it might be observed.

@johnou commented on Nov 6, 2016

Thanks for explaining that, I'll investigate further to see what else might be the issue.

@ben-manes (author) commented on Nov 6, 2016

I'll have a release out this evening for #130, so if I'm wrong you'll have this soon anyway. Let me know if you see anything suspicious.

@ben-manes (author) commented on Nov 7, 2016

Released.

I think all 3 of these are benign and the key check is unnecessary. Since weak keys are compared by identity and this can only occur when the entry was found, the caller must hold a strong reference to the key. Therefore the key should not be eligible for collection in any of these three cases (put, compute, merge). I'm leaving the check in as a safe pattern, since it's too subtle and too cheap to be worth optimizing away, but I don't think the oversight could have led to any bugs.

Regardless, I do love that static analysis can catch these typos!
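The identity argument above can be illustrated with a plain IdentityHashMap rather than Caffeine's internals (a sketch, not the actual implementation): an equal-but-distinct key misses, so any caller that reaches the entry necessarily holds the exact key instance, keeping it strongly reachable for the duration of the call.

```java
import java.util.IdentityHashMap;
import java.util.Map;

public class IdentityLookup {
    public static void main(String[] args) {
        Map<String, Integer> cache = new IdentityHashMap<>();
        String key = new String("k");
        cache.put(key, 1);

        // An equal but distinct instance misses under identity comparison.
        System.out.println(cache.get(new String("k"))); // null

        // Only the original instance hits; because the caller holds this
        // strong reference during the call, the weak key cannot be
        // collected while the entry is being updated.
        System.out.println(cache.get(key)); // 1
    }
}
```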

cause[0] = RemovalCause.COLLECTED;
} else if (hasExpired(n, now)) {
cause[0] = RemovalCause.EXPIRED;
@@ -1994,7 +1994,7 @@ V doComputeIfAbsent(K key, Object keyRef,
nodeKey[0] = n.getKey();
weight[0] = n.getWeight();
oldValue[0] = n.getValue();
if ((nodeKey == null) || (oldValue[0] == null)) {
if ((nodeKey[0] == null) || (oldValue[0] == null)) {
cause[0] = RemovalCause.COLLECTED;
} else if (hasExpired(n, now)) {
cause[0] = RemovalCause.EXPIRED;
@@ -2141,7 +2141,7 @@ V remap(K key, Object keyRef, BiFunction<? super K, ? super V, ? extends V> rema
synchronized (n) {
nodeKey[0] = n.getKey();
oldValue[0] = n.getValue();
if ((nodeKey == null) || (oldValue[0] == null)) {
if ((nodeKey[0] == null) || (oldValue[0] == null)) {
cause[0] = RemovalCause.COLLECTED;
} else if (hasExpired(n, now)) {
cause[0] = RemovalCause.EXPIRED;
@@ -182,7 +182,7 @@ void configure(String key, @Nullable String value) {
maximumWeight(key, value);
return;
case "weakKeys":
weakKeys(key, value);
weakKeys(value);
return;
case "weakValues":
valueStrength(key, value, Strength.WEAK);
@@ -200,7 +200,7 @@ void configure(String key, @Nullable String value) {
refreshAfterWrite(key, value);
return;
case "recordStats":
recordStats(key, value);
recordStats(value);
return;
default:
throw new IllegalArgumentException("Unknown key " + key);
@@ -233,7 +233,7 @@ void maximumWeight(String key, String value) {
}

/** Configures the keys as weak references. */
void weakKeys(String key, @Nullable String value) {
void weakKeys(@Nullable String value) {
requireArgument(value == null, "weak keys does not take a value");
requireArgument(keyStrength == null, "weak keys was already set");
keyStrength = Strength.WEAK;
@@ -268,7 +268,7 @@ void refreshAfterWrite(String key, String value) {
}

/** Configures the value as weak or soft references. */
void recordStats(String key, @Nullable String value) {
void recordStats(@Nullable String value) {
requireArgument(value == null, "record stats does not take a value");
requireArgument(!recordStats, "record stats was already set");
recordStats = true;
7 changes: 7 additions & 0 deletions gradle/code_quality.gradle
@@ -2,6 +2,7 @@
* Configurations for code quality analysis
*/
apply plugin: 'org.kordamp.gradle.stats'
apply plugin: 'one.util.huntbugs'
apply plugin: 'checkstyle'
apply plugin: 'findbugs'
apply plugin: 'jacoco'
@@ -32,6 +33,12 @@ checkstyle {
config = resources.text.fromArchiveEntry(configurations.checkstyleConfig, 'google_checks.xml')
}

huntbugs {
classSimpleFilter { classname ->
!classname.contains('Header') && !classname.contains('Mpsc') && !classname.contains('Factory')
}
}

findbugs {
effort = 'max'
excludeFilter = file("${rootDir}/config/findbugs/exclude.xml")
6 changes: 6 additions & 0 deletions gradle/dependencies.gradle
@@ -73,6 +73,8 @@ ext {
coveralls: '2.6.3',
extra_conf: '3.1.0',
error_prone: '0.0.8',
huntbugs_core: '0.0.11',
huntbugs_plugin: '0.3.5',
jmh: '0.3.0',
nexus: '2.3.1',
pmd: '5.5.1',
@@ -164,6 +166,10 @@ ext {
coveralls: "org.kt3k.gradle.plugin:coveralls-gradle-plugin:${plugin_versions.coveralls}",
extra_conf: "com.netflix.nebula:gradle-extra-configurations-plugin:${plugin_versions.extra_conf}",
error_prone: "net.ltgt.gradle:gradle-errorprone-plugin:${plugin_versions.error_prone}",
huntbugs: [
"one.util:huntbugs:${plugin_versions.huntbugs_core}",
"gradle.plugin.one.util.huntbugs:huntbugs-gradle-plugin:${plugin_versions.huntbugs_plugin}",
],
jmh: "gradle.plugin.me.champeau.gradle:jmh-gradle-plugin:${plugin_versions.jmh}",
nexus: "com.bmuschko:gradle-nexus-plugin:${plugin_versions.nexus}",
semantic_versioning: "io.ehdev:gradle-semantic-versioning:${plugin_versions.semantic_versioning}",
@@ -81,7 +81,7 @@ public void increment(long e) {
// just a model.
if (currSum == maxSum) {
List<Long> array = new ArrayList<>(table.keySet());
long itemToRemove = array.get(random.nextInt(array.size()));
Long itemToRemove = array.get(random.nextInt(array.size()));
value = table.remove(itemToRemove);

if (value > 1) {
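The Long change above avoids a needless unbox/rebox round-trip: the list already holds boxed Long keys, and Map.remove takes an Object, so reading into a primitive long would only unbox the value and then autobox a fresh Long for the removal. A small sketch with hypothetical names:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BoxingRoundTrip {
    public static void main(String[] args) {
        Map<Long, Integer> table = new HashMap<>();
        table.put(42L, 7);
        List<Long> array = new ArrayList<>(table.keySet());

        // Before: `long item = array.get(0)` unboxes, and table.remove(item)
        // autoboxes a brand-new Long for the Object parameter.
        // After: reuse the Long the list already holds.
        Long itemToRemove = array.get(0);
        Integer removed = table.remove(itemToRemove);
        System.out.println(removed); // 7
    }
}
```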
@@ -50,7 +50,7 @@ public HashedItem createHash(long item) {
h ^= item;
h *= m;

fpaux.fingerprint = (byte) (h & fpMask);
fpaux.fingerprint = (byte) h;
// the next line is a dirty fix as I do not want the value of 0 as a fingerprint.
// It can be eliminated if we want very short fingerprints.
fpaux.fingerprint = (fpaux.fingerprint == 0L) ? 1 : fpaux.fingerprint;
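The mask dropped above is redundant because the narrowing cast to byte already keeps only the low 8 bits. Assuming fpMask is the 8-bit mask 0xFF (its definition is outside this hunk), (byte) (h & fpMask) and (byte) h are bit-for-bit identical:

```java
public class FingerprintCast {
    public static void main(String[] args) {
        long fpMask = 0xFFL; // assumption: an 8-bit fingerprint mask
        long[] samples = { 0L, 0x1234L, -1L, Long.MIN_VALUE, 0xDEADBEEFL };
        for (long h : samples) {
            byte masked = (byte) (h & fpMask);
            byte cast = (byte) h; // narrowing already discards the high 56 bits
            if (masked != cast) {
                throw new AssertionError("differs for h=" + h);
            }
        }
        System.out.println("mask is redundant for an 8-bit fpMask");
    }
}
```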
@@ -27,8 +27,6 @@
public final class TinyCacheAdapter implements Frequency {
// the actual data structure.
TinyCacheSketch tcs;
// number of (independent sets)
int nrSets;
// size between cache and sample.
static final int sampleFactor = 10;
// max frequency estimation of an item.
@@ -39,7 +37,7 @@ public final class TinyCacheAdapter implements Frequency {
*/
public TinyCacheAdapter(Config config) {
BasicSettings settings = new BasicSettings(config);
nrSets = sampleFactor * settings.maximumSize() / 64;
int nrSets = sampleFactor * settings.maximumSize() / 64; // number of (independent sets)
tcs = new TinyCacheSketch(nrSets, 64,settings.randomSeed());
}

@@ -26,13 +26,14 @@
*/
@SuppressWarnings("PMD.AvoidDollarSigns")
public final class TinyCacheWithGhostCache {
private static final int sampleSize = 10;

public final long[] chainIndex;
public final long[] isLastIndex;
private final HashFunctionParser hashFunc;
private final int itemsPerSet;
private final long[] cache;
private final Random rnd;
private final int sampleSize;
private final TinyCacheSketch ghostCache;

public TinyCacheWithGhostCache(int nrSets, int itemsPerSet, int randomSeed) {
@@ -41,7 +42,6 @@ public TinyCacheWithGhostCache(int nrSets, int itemsPerSet, int randomSeed) {
hashFunc = new HashFunctionParser(nrSets);
this.itemsPerSet = itemsPerSet;
cache = new long[nrSets * itemsPerSet];
sampleSize = 10;
ghostCache = new TinyCacheSketch(nrSets * sampleSize, itemsPerSet, randomSeed + 1);
rnd = new Random(randomSeed);
}
@@ -138,6 +138,10 @@ private void evict() {
break;
}
}
if (victim == null) {
return;
}

victim.remove();
data.remove(victim.key);
out.put(victim.key, victim);