Lint all the code #323

Merged (31 commits) on Jun 10, 2022

Commits:
- 0197942 Add lint job to CI (Jun 1, 2022)
- 9174beb Ignore lint step failure (Jun 1, 2022)
- c540649 Don't need checkout to run lint (Jun 1, 2022)
- e5ca2a9 ignore unused :as binding when linting (Jun 1, 2022)
- c6a53fb [lint] Remove unused imports (Jun 1, 2022)
- 067364b [lint] Fix namespace required but never used (Jun 1, 2022)
- 33e7d44 [lint] Fix Prefer placing return type hint on arg vector (Jun 1, 2022)
- 2fc57ae Fix wrongly named namespace (Jun 1, 2022)
- 1de71d1 [lint] Remove all usage of :refer :all (Jun 1, 2022)
- 6b89c67 [lint] Add linter aliases for jackdaw.test.transports/deftransport an… (Jun 1, 2022)
- 6b8f3a5 [lint] Fix redundant let expression (Jun 1, 2022)
- 33a73e4 [lint] Fix empty or misplaced docstrings (Jun 1, 2022)
- c7164c4 [lint] Fix unresolved symbol (Jun 1, 2022)
- a6fc7fa [lint] Fix missing else branch (Jun 1, 2022)
- 431448b [lint] Fix unresolved symbols (Jun 1, 2022)
- bcc4c56 [lint] Add linter alias for clojure.test.check.clojure-test/defspec (Jun 1, 2022)
- d35d652 [lint] Fix incorrect format string (Jun 1, 2022)
- ac6b51c [lint] Fix redundant do (Jun 1, 2022)
- 32753ed [lint] Fix redundant let (Jun 1, 2022)
- df11e1c [lint] Remove unused referred symbol (Jun 1, 2022)
- 5e92ee3 [lint] Add missing require (Jun 1, 2022)
- 6a2aa1d [lint] Remove unused require (Jun 1, 2022)
- aff8e0e [lint] Remove unused bindings (Jun 7, 2022)
- ac04de9 [lint] Remove trickier unused bindings (Jun 7, 2022)
- 3c84e08 [lint] Locally ignore unresolved namespace for Datafy backport (Jun 8, 2022)
- 335f55b [lint] Remove unused private var (Jun 8, 2022)
- 8b53e30 [lint] Fix unresolved symbol (Jun 8, 2022)
- 033fa9c [lint] Fix redefined vars (Jun 8, 2022)
- 385560a Update changelog (Jun 8, 2022)
- a360d09 Remove extra whitespace (Jun 8, 2022)
- 1de7249 [lint] allow :refer :all in specific places (Jun 10, 2022)
11 changes: 11 additions & 0 deletions .circleci/config.yml
Original file line number Diff line number Diff line change
@@ -100,6 +100,14 @@ jobs:
key: v1-jackdaw-repo-{{ .Branch }}-{{ .Revision }}
paths:
- .
lint:
@jbropho (Collaborator) commented on Jun 6, 2022

(Contributor Author) replied:
Jackdaw is still using circleci, not drone.

executor: machine
working_directory: /home/circleci/jackdaw
steps:
- checkout
- run: ls -la
- run: docker run --volume `pwd`:/project --rm --workdir /project cljkondo/clj-kondo sh -c 'clj-kondo --lint src test' || true

deps:
<<: *build_config
steps:
@@ -111,6 +119,8 @@ jobs:
key: *mvn_cache_key
paths:
- /home/circleci/.m2


test:
<<: *test_config
steps:
@@ -159,6 +169,7 @@ workflows:
version: 2
build_and_test:
jobs:
- lint
- checkout_code
- deps:
requires:
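The lint job above runs clj-kondo from the official Docker image and deliberately ignores failures (`|| true`) while existing warnings are being burned down. The same check can be reproduced locally; this sketch assumes Docker is installed and is run from the repository root:

```shell
# Lint src and test with the same image the CI job uses.
docker run --rm --volume "$PWD":/project --workdir /project \
  cljkondo/clj-kondo clj-kondo --lint src test
```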
10 changes: 10 additions & 0 deletions .clj-kondo/config.edn
@@ -0,0 +1,10 @@
{:linters {:unused-binding { ;; ignore unused :as binding.
:exclude-destructured-as true}
:unresolved-symbol { ;; `thrown-with-msg-and-data?` is a legit extension to the `is` macro
;; via an `assert-expr` defmethod (see clojure.test doc)
:exclude [(clojure.test/is [thrown-with-msg-and-data?])]}}

:lint-as {clojure.test.check.clojure-test/defspec clojure.core/def
jackdaw.data/defn->data clojure.core/defn
jackdaw.test.transports/deftransport clojure.core/defn
manifold.deferred/loop clojure.core/let}}
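The `:lint-as` entries above teach clj-kondo to analyze custom macros as if they were core forms it already understands. A minimal sketch of why this matters, using a hypothetical test namespace (names are illustrative):

```clojure
(ns example.props-test
  (:require [clojure.test.check.clojure-test :refer [defspec]]
            [clojure.test.check.generators :as gen]
            [clojure.test.check.properties :as prop]))

;; Without the {defspec clojure.core/def} mapping, clj-kondo does not
;; know that `defspec` introduces the var `int-roundtrip`, so any later
;; reference to it would be flagged as an unresolved symbol.
(defspec int-roundtrip 100
  (prop/for-all [x gen/int]
    (= x x)))
```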
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,9 @@
# Changelog

### Unreleased

- Add clj-kondo and fix all lint warnings and errors [#323](https://github.com/FundingCircle/jackdaw/pull/323)

### [0.9.5] - [2022-05-26]

* Move away from deprecated class ConsumerRecordFactory (to prepare migration to Kafka Streams 3.2.0)
2 changes: 1 addition & 1 deletion src/jackdaw/admin.clj
@@ -158,7 +158,7 @@
{:pre [(client? client)
(sequential? topics)]}
(->> @(describe-topics* client (map :topic-name topics))
(every? (fn [[topic-name {:keys [partition-info]}]]
(every? (fn [[_topic-name {:keys [partition-info]}]]
(every? (fn [part-info]
(and (boolean (:leader part-info))
(seq (:isr part-info))))
22 changes: 11 additions & 11 deletions src/jackdaw/client.clj
@@ -26,16 +26,16 @@

;;;; Producer

(defn ^KafkaProducer producer
(defn producer
"Return a producer with the supplied properties and optional Serdes."
([config]
(^KafkaProducer [config]
(KafkaProducer. ^java.util.Properties (jd/map->Properties config)))
([config {:keys [^Serde key-serde ^Serde value-serde]}]
(^KafkaProducer [config {:keys [^Serde key-serde ^Serde value-serde]}]
(KafkaProducer. ^java.util.Properties (jd/map->Properties config)
(.serializer key-serde)
(.serializer value-serde))))

(defn ^Callback callback
(defn callback
"Return a kafka `Callback` function out of a clojure `fn`.

The fn must be of 2-arity, being `[record-metadata?, ex?]` where the
@@ -44,9 +44,9 @@
the record.

Callbacks are `void`, so the return value is ignored."
[on-completion]
^Callback [on-completion]
(reify Callback
(onCompletion [this record-meta exception]
(onCompletion [_this record-meta exception]
(on-completion record-meta exception))))

(defn send!
@@ -89,11 +89,11 @@

;;;; Consumer

(defn ^KafkaConsumer consumer
(defn consumer
"Return a consumer with the supplied properties and optional Serdes."
([config]
(^KafkaConsumer [config]
(KafkaConsumer. ^java.util.Properties (jd/map->Properties config)))
([config {:keys [^Serde key-serde ^Serde value-serde] :as t}]
(^KafkaConsumer [config {:keys [^Serde key-serde ^Serde value-serde] :as t}]

(when-not (or key-serde
(get config "key.deserializer"))
@@ -134,15 +134,15 @@
topic-configs))
consumer)

(defn ^KafkaConsumer subscribed-consumer
(defn subscribed-consumer
"Given a broker configuration and topics, returns a consumer that is
subscribed to all of the given topic descriptors.

WARNING: All topics subscribed to by a single consumer must share a
single pair of key and value serde instances. The serdes of the
first requested topic are used, and all other topics are expected to
be able to use same serdes."
[config topic-configs]
^KafkaConsumer [config topic-configs]
(when-not (sequential? topic-configs)
(throw (ex-info "subscribed-consumer takes a seq of topics!"
{:topic-configs topic-configs})))
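Several hunks above address clj-kondo's "Prefer placing return type hint on arg vector" warning by moving the `^KafkaProducer`/`^KafkaConsumer` hints off the var name and onto each arity's argument vector. A minimal sketch of the two styles, with a hypothetical function:

```clojure
;; Flagged by clj-kondo: the return type hint decorates the var name,
;; where the Clojure compiler can apply it inconsistently across arities.
(defn ^String greet [name] (str "hi " name))

;; Preferred: hint the argument vector of each arity instead.
(defn greet ^String [name] (str "hi " name))
```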
25 changes: 13 additions & 12 deletions src/jackdaw/client/partitioning.clj
@@ -34,7 +34,8 @@
in `jackdaw.client` but backed by the partitioning machinery."
{:license
"BSD 3-Clause License <https://github.com/FundingCircle/jackdaw/blob/master/LICENSE>"}
(:require [jackdaw.client :as jc]
(:require [clojure.string :as str]
[jackdaw.client :as jc]
[jackdaw.data :as jd])
(:import org.apache.kafka.clients.producer.Producer
org.apache.kafka.common.serialization.Serde
@@ -49,9 +50,9 @@
[{:keys [record-key] :as t}]
(let [record-key (as-> record-key %
(-> %
(clojure.string/replace "$." "")
(clojure.string/replace "_" "-")
(clojure.string/split #"\."))
(str/replace "$." "")
(str/replace "_" "-")
(str/split #"\."))
(mapv keyword %))]
(assoc t ::key-fn #(get-in % record-key))))

@@ -63,7 +64,7 @@

(defn default-partition
"The kafka default partitioner. As a `::partition-fn`"
[{:keys [topic-name key-serde]} key value partitions]
[{:keys [topic-name key-serde]} key _value partitions]
(let [key-bytes (.serialize (.serializer ^Serde key-serde) topic-name key)]
(default-partitioner* key-bytes partitions)))

@@ -91,11 +92,11 @@
(partition-fn t key value %)
(->ProducerRecord producer t % key value))
(jd/->ProducerRecord t key value)))
([^Producer producer topic partition key value]
([^Producer _producer topic partition key value]
(jd/->ProducerRecord topic (int partition) key value))
([^Producer producer topic partition timestamp key value]
([^Producer _producer topic partition timestamp key value]
(jd/->ProducerRecord topic partition timestamp key value))
([^Producer producer topic partition timestamp key value headers]
([^Producer _producer topic partition timestamp key value headers]
(jd/->ProducerRecord topic partition timestamp key value headers)))

(defn produce!
@@ -107,15 +108,15 @@
([producer topic value]
(jc/send! producer
(->ProducerRecord producer topic value)))
([producer topic key value]
([producer topic _key value]
(jc/send! producer
(->ProducerRecord producer topic value)))
([producer topic partition key value]
([producer topic partition _key value]
(jc/send! producer
(->ProducerRecord producer topic partition topic value)))
([producer topic partition timestamp key value]
([producer topic partition timestamp _key value]
(jc/send! producer
(->ProducerRecord producer topic partition timestamp topic value)))
([producer topic partition timestamp key value headers]
([producer topic partition timestamp _key value headers]
(jc/send! producer
(->ProducerRecord producer topic partition timestamp topic value headers))))
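The `_producer`/`_key` renames above follow the Clojure convention that a leading underscore marks a binding as deliberately unused, which satisfies clj-kondo's `:unused-binding` linter without changing arities or call sites. A small illustrative example (hypothetical function):

```clojure
;; clj-kondo warns: unused binding `producer` and `topic`.
(defn drop-record [producer topic value] value)

;; Prefixing with `_` documents the intent and silences the warning;
;; callers still pass the same arguments.
(defn drop-record [_producer _topic value] value)
```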
3 changes: 2 additions & 1 deletion src/jackdaw/data.clj
@@ -34,7 +34,8 @@
(datafy [o] o)))))

;;; Just vendor this - not worth the footwork to import the "real" one

;; Ignore clj-kondo's warning: Unresolved namespace clojure.core.protocols. Are you missing a require?
#_{:clj-kondo/ignore [:unresolved-namespace]}
(defn datafy
"Attempts to return x as data.
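The `#_{:clj-kondo/ignore [...]}` reader form in the hunk above suppresses a single linter warning for the immediately following form only, leaving the project-wide configuration in `.clj-kondo/config.edn` untouched. The same pattern works for any linter keyword; a hedged sketch mirroring the Datafy backport case:

```clojure
;; Suppresses the :unresolved-namespace warning for the next form only;
;; occurrences elsewhere in the codebase are still checked.
;; (Call shown is illustrative.)
#_{:clj-kondo/ignore [:unresolved-namespace]}
(clojure.core.protocols/datafy {:a 1})
```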
6 changes: 0 additions & 6 deletions src/jackdaw/data/admin.clj
@@ -16,7 +16,6 @@
(ConfigEntry. k (:value v))))

(defn->data ConfigEntry->data
""
[^ConfigEntry e]
{:name (.name e)
:value (.value e)
@@ -27,13 +26,11 @@
;;; Config

(defn map->Config
""
^Config [m]
(Config.
(map (partial apply ->ConfigEntry) m)))

(defn->data Config->data
""
[^Config c]
(into {}
(comp (map ConfigEntry->data)
@@ -44,15 +41,13 @@
;;; TopicDescription

(defn->data TopicDescription->data
""
[^TopicDescription td]
{:is-internal? (.isInternal td)
:partition-info (map datafy (.partitions td))})

;;; NewTopic

(defn map->NewTopic
""
[{:keys [:topic-name
:partition-count
:replication-factor
@@ -71,7 +66,6 @@
;;;; Result types

(defn->data DescribeClusterResult->data
""
[^DescribeClusterResult dcr]
{:cluster-id (-> dcr .clusterId .get)
:controller (-> dcr .controller .get datafy)
13 changes: 4 additions & 9 deletions src/jackdaw/data/common.clj
@@ -10,7 +10,6 @@
;;; Node

(defn->data Node->data
""
[^Node node]
{:host (.host node)
:port (.port node)
@@ -31,7 +30,6 @@
;;; TopicPartitionInfo

(defn->data TopicPartitionInfo->data
""
[^TopicPartitionInfo tpi]
{:isr (mapv datafy (.isr tpi))
:leader (datafy (.leader tpi))
@@ -40,26 +38,23 @@

;;; Topic partition tuples

(defn ^TopicPartition ->TopicPartition
(defn ->TopicPartition
"Given unrolled ctor-style arguments, create a Kafka `TopicPartition`."
[{:keys [:topic-name]} partition]
^TopicPartition [{:keys [:topic-name]} partition]
(TopicPartition. topic-name (int partition)))

(defn map->TopicPartition
"Given a `::topic-parititon`, build an equivalent `TopicPartition`.
"Given a `topic-partition`, build an equivalent `TopicPartition`.

Inverts `(datafy ^TopicPartition tp)`."
[{:keys [topic-name
partition]
:as m}]
[{:keys [partition] :as m}]
(->TopicPartition m partition))

(defn->data TopicPartition->data [^TopicPartition tp]
{:topic-name (.topic tp)
:partition (.partition tp)})

(defn as-TopicPartition
""
^TopicPartition [o]
(cond (instance? TopicPartition o)
o
8 changes: 0 additions & 8 deletions src/jackdaw/data/common_config.clj
@@ -8,15 +8,12 @@
;;; ConfigResource.Type

(def +broker-config-resource-type+
""
ConfigResource$Type/BROKER)

(def +topic-config-resource-type+
""
ConfigResource$Type/TOPIC)

(def +unknown-config-resource-type+
""
ConfigResource$Type/UNKNOWN)

(defn ->ConfigResourceType [o]
@@ -26,7 +23,6 @@
+unknown-config-resource-type+))

(defn->data ConfigResourceType->data
""
[^ConfigResource$Type crt]
(cond (= +broker-config-resource-type+ crt)
:config-resource/broker
@@ -40,22 +36,18 @@
;;; ConfigResource

(defn ->ConfigResource
""
[^ConfigResource$Type type ^String name]
(ConfigResource. type name))

(defn ->topic-resource
""
[name]
(->ConfigResource +topic-config-resource-type+ name))

(defn ->broker-resource
""
[name]
(->ConfigResource +broker-config-resource-type+ name))

(defn->data ConfigResource->data
""
[^ConfigResource cr]
{:name (.name cr)
:type (datafy (.type cr))})
14 changes: 7 additions & 7 deletions src/jackdaw/data/consumer.clj
@@ -6,15 +6,16 @@

(import '[org.apache.kafka.clients.consumer
ConsumerRecord OffsetAndTimestamp]
'org.apache.kafka.common.header.Headers)
'org.apache.kafka.common.header.Headers
'org.apache.kafka.common.record.TimestampType)

(set! *warn-on-reflection* true)

(defn ^ConsumerRecord ->ConsumerRecord
(defn ->ConsumerRecord
"Given unrolled ctor-style arguments create a Kafka `ConsumerRecord`.

Convenient for testing the consumer API and its helpers."
[{:keys [:topic-name]} partition offset ts ts-type
^ConsumerRecord [{:keys [:topic-name]} partition offset ts ts-type
key-size value-size key value ^Headers headers]
(ConsumerRecord. topic-name
(int partition)
@@ -72,16 +73,15 @@

;;; OffsetAndTimestamp tuples

(defn ^OffsetAndTimestamp ->OffsetAndTimestamp
[{:keys [offset timestamp]}]
(defn ->OffsetAndTimestamp
^OffsetAndTimestamp [{:keys [offset timestamp]}]
(OffsetAndTimestamp. offset (long timestamp)))

(defn->data OffsetAndTimestamp->data [^OffsetAndTimestamp ots]
{:offset (.offset ots)
:timestamp (.timestamp ots)})

(defn map->OffsetAndTimestamp
[{:keys [offset timestamp] :as m}]
(defn map->OffsetAndTimestamp [m]
(->OffsetAndTimestamp m))

(defn as-OffsetAndTimestamp