Lint all the code #323

Merged: 31 commits, Jun 10, 2022
Changes from 22 commits

Commits (31)
0197942
Add lint job to CI
Jun 1, 2022
9174beb
Ignore lint step failure
Jun 1, 2022
c540649
Don't need checkout to run lint
Jun 1, 2022
e5ca2a9
ignore unused :as binding when linting
Jun 1, 2022
c6a53fb
[lint] Remove unused imports
Jun 1, 2022
067364b
[lint] Fix namespace required but never used
Jun 1, 2022
33e7d44
[lint] Fix Prefer placing return type hint on arg vector
Jun 1, 2022
2fc57ae
Fix wrongly named namespace
Jun 1, 2022
1de71d1
[lint] Remove all usage of :refer :all
Jun 1, 2022
6b89c67
[lint] Add linter aliases for jackdaw.test.transports/deftransport an…
Jun 1, 2022
6b8f3a5
[lint] Fix redundant let expression
Jun 1, 2022
33a73e4
[lint] Fix empty or misplaced docstrings
Jun 1, 2022
c7164c4
[lint] Fix unresolved symbol
Jun 1, 2022
a6fc7fa
[lint] Fix missing else branch
Jun 1, 2022
431448b
[lint] Fix unresolved symbols
Jun 1, 2022
bcc4c56
[lint] Add linter alias for clojure.test.check.clojure-test/defspec
Jun 1, 2022
d35d652
[lint] Fix incorrect format string
Jun 1, 2022
ac6b51c
[lint] Fix redundant do
Jun 1, 2022
32753ed
[lint] Fix redundant let
Jun 1, 2022
df11e1c
[lint] Remove unused referred symbol
Jun 1, 2022
5e92ee3
[lint] Add missing require
Jun 1, 2022
6a2aa1d
[lint] Remove unused require
Jun 1, 2022
aff8e0e
[lint] Remove unused bindings
Jun 7, 2022
ac04de9
[lint] Remove trickier unused bindings
Jun 7, 2022
3c84e08
[lint] Locally ignore unresolved namespace for Datafy backport
Jun 8, 2022
335f55b
[lint] Remove unused private var
Jun 8, 2022
8b53e30
[lint] Fix unresolved symbol
Jun 8, 2022
033fa9c
[lint] Fix redefined vars
Jun 8, 2022
385560a
Update changelog
Jun 8, 2022
a360d09
Remove extra whitespace
Jun 8, 2022
1de7249
[lint] allow :refer :all in specific places
Jun 10, 2022
11 changes: 11 additions & 0 deletions .circleci/config.yml
@@ -100,6 +100,14 @@ jobs:
key: v1-jackdaw-repo-{{ .Branch }}-{{ .Revision }}
paths:
- .
lint:
@jbropho (Collaborator) commented Jun 6, 2022

Contributor Author replied:
Jackdaw is still using circleci, not drone.

executor: machine
working_directory: /home/circleci/jackdaw
steps:
- checkout
- run: ls -la
- run: docker run --volume `pwd`:/project --rm --workdir /project cljkondo/clj-kondo sh -c 'clj-kondo --lint src test' || true

deps:
<<: *build_config
steps:
@@ -111,6 +119,8 @@ jobs:
key: *mvn_cache_key
paths:
- /home/circleci/.m2


test:
<<: *test_config
steps:
@@ -159,6 +169,7 @@ workflows:
version: 2
build_and_test:
jobs:
- lint
- checkout_code
- deps:
requires:
7 changes: 7 additions & 0 deletions .clj-kondo/config.edn
@@ -0,0 +1,7 @@
{:linters {:unused-binding { ;; ignore unused :as binding.
:exclude-destructured-as true}}

:lint-as {clojure.test.check.clojure-test/defspec clojure.core/def
jackdaw.data/defn->data clojure.core/defn
jackdaw.test.transports/deftransport clojure.core/defn
manifold.deferred/loop clojure.core/let}}
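
The :lint-as entries tell clj-kondo to analyze jackdaw's own macros as the core forms they behave like, so the vars and bindings they introduce resolve instead of being reported as unresolved symbols or unused bindings. A minimal sketch of the pattern, with a made-up namespace and macro standing in for jackdaw.data/defn->data:

(ns example.core)

;; A defn-like macro, analogous to jackdaw.data/defn->data or
;; jackdaw.test.transports/deftransport.
(defmacro defwrapped
  [name args & body]
  `(defn ~name ~args ~@body))

;; With {:lint-as {example.core/defwrapped clojure.core/defn}} in
;; .clj-kondo/config.edn, clj-kondo knows this form defines `greet`
;; and that `who` is a real binding.
(defwrapped greet [who]
  (str "hello, " who))

(greet "world") ;; => "hello, world"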
20 changes: 10 additions & 10 deletions src/jackdaw/client.clj
@@ -26,16 +26,16 @@

;;;; Producer

(defn ^KafkaProducer producer
(defn producer
"Return a producer with the supplied properties and optional Serdes."
([config]
(^KafkaProducer [config]
(KafkaProducer. ^java.util.Properties (jd/map->Properties config)))
([config {:keys [^Serde key-serde ^Serde value-serde]}]
(^KafkaProducer [config {:keys [^Serde key-serde ^Serde value-serde]}]
(KafkaProducer. ^java.util.Properties (jd/map->Properties config)
(.serializer key-serde)
(.serializer value-serde))))

(defn ^Callback callback
(defn callback
"Return a kafka `Callback` function out of a clojure `fn`.

The fn must be of 2-arity, being `[record-metadata?, ex?]` where the
@@ -44,7 +44,7 @@
the record.

Callbacks are `void`, so the return value is ignored."
[on-completion]
^Callback [on-completion]
(reify Callback
(onCompletion [this record-meta exception]
(on-completion record-meta exception))))
@@ -89,11 +89,11 @@

;;;; Consumer

(defn ^KafkaConsumer consumer
(defn consumer
"Return a consumer with the supplied properties and optional Serdes."
([config]
(^KafkaConsumer [config]
(KafkaConsumer. ^java.util.Properties (jd/map->Properties config)))
([config {:keys [^Serde key-serde ^Serde value-serde] :as t}]
(^KafkaConsumer [config {:keys [^Serde key-serde ^Serde value-serde] :as t}]

(when-not (or key-serde
(get config "key.deserializer"))
@@ -134,15 +134,15 @@
topic-configs))
consumer)

(defn ^KafkaConsumer subscribed-consumer
(defn subscribed-consumer
"Given a broker configuration and topics, returns a consumer that is
subscribed to all of the given topic descriptors.

WARNING: All topics subscribed to by a single consumer must share a
single pair of key and value serde instances. The serdes of the
first requested topic are used, and all other topics are expected to
be able to use same serdes."
[config topic-configs]
^KafkaConsumer [config topic-configs]
(when-not (sequential? topic-configs)
(throw (ex-info "subscribed-consumer takes a seq of topics!"
{:topic-configs topic-configs})))
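
The hint moves above follow clj-kondo's "Prefer placing return type hint on arg vector" warning: rather than tagging the var name, each arity's argument vector carries the return hint, which is also where the compiler looks when resolving interop calls on the returned value. A rough sketch with a plain JDK class standing in for the Kafka types (illustrative names, not jackdaw code):

(set! *warn-on-reflection* true)

;; Old style: hint on the name; this is what the linter flags.
(defn ^java.util.ArrayList make-list-old []
  (java.util.ArrayList.))

;; New style, as in `producer`, `consumer`, and `callback` above:
;; the hint sits on each arity's arg vector.
(defn make-list
  (^java.util.ArrayList []
   (java.util.ArrayList.))
  (^java.util.ArrayList [^java.util.Collection coll]
   (java.util.ArrayList. coll)))

(.size (make-list [1 2 3])) ;; => 3, resolved without reflection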
9 changes: 5 additions & 4 deletions src/jackdaw/client/partitioning.clj
@@ -34,7 +34,8 @@
in `jackdaw.client` but backed by the partitioning machinery."
{:license
"BSD 3-Clause License <https://github.com/FundingCircle/jackdaw/blob/master/LICENSE>"}
(:require [jackdaw.client :as jc]
(:require [clojure.string :as str]
[jackdaw.client :as jc]
[jackdaw.data :as jd])
(:import org.apache.kafka.clients.producer.Producer
org.apache.kafka.common.serialization.Serde
@@ -49,9 +50,9 @@
[{:keys [record-key] :as t}]
(let [record-key (as-> record-key %
(-> %
(clojure.string/replace "$." "")
(clojure.string/replace "_" "-")
(clojure.string/split #"\."))
(str/replace "$." "")
(str/replace "_" "-")
(str/split #"\."))
(mapv keyword %))]
(assoc t ::key-fn #(get-in % record-key))))

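
Requiring clojure.string :as str fixes the namespace being used without a :require and shortens the calls. For reference, the rewritten threading turns a JSON-path-style record key into a vector of keywords; the input string below is made up for illustration:

(require '[clojure.string :as str])

(let [record-key "$.shipment_id.warehouse"]
  (as-> record-key %
    (-> %
        (str/replace "$." "")
        (str/replace "_" "-")
        (str/split #"\."))
    (mapv keyword %)))
;; => [:shipment-id :warehouse]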
6 changes: 0 additions & 6 deletions src/jackdaw/data/admin.clj
@@ -16,7 +16,6 @@
(ConfigEntry. k (:value v))))

(defn->data ConfigEntry->data
""
[^ConfigEntry e]
{:name (.name e)
:value (.value e)
@@ -27,13 +26,11 @@
;;; Config

(defn map->Config
""
^Config [m]
(Config.
(map (partial apply ->ConfigEntry) m)))

(defn->data Config->data
""
[^Config c]
(into {}
(comp (map ConfigEntry->data)
@@ -44,15 +41,13 @@
;;; TopicDescription

(defn->data TopicDescription->data
""
[^TopicDescription td]
{:is-internal? (.isInternal td)
:partition-info (map datafy (.partitions td))})

;;; NewTopic

(defn map->NewTopic
""
[{:keys [:topic-name
:partition-count
:replication-factor
@@ -71,7 +66,6 @@
;;;; Result types

(defn->data DescribeClusterResult->data
""
[^DescribeClusterResult dcr]
{:cluster-id (-> dcr .clusterId .get)
:controller (-> dcr .controller .get datafy)
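
The deleted "" lines were empty docstrings, which clj-kondo reports alongside misplaced ones (see the "Fix empty or misplaced docstrings" commit). A small sketch of both warnings, with made-up functions:

;; Blank docstring: conveys nothing, so the PR simply deletes it.
(defn no-op ""
  [x]
  x)

;; Misplaced docstring: the string sits after the arg vector, so it is just
;; an expression in the body and never becomes documentation.
(defn add-one
  [x]
  "Increment x."
  (inc x))

;; Fixed form.
(defn add-one*
  "Increment x."
  [x]
  (inc x))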
7 changes: 2 additions & 5 deletions src/jackdaw/data/common.clj
@@ -10,7 +10,6 @@
;;; Node

(defn->data Node->data
""
[^Node node]
{:host (.host node)
:port (.port node)
@@ -31,7 +30,6 @@
;;; TopicPartitionInfo

(defn->data TopicPartitionInfo->data
""
[^TopicPartitionInfo tpi]
{:isr (mapv datafy (.isr tpi))
:leader (datafy (.leader tpi))
@@ -40,9 +38,9 @@

;;; Topic partition tuples

(defn ^TopicPartition ->TopicPartition
(defn ->TopicPartition
"Given unrolled ctor-style arguments, create a Kafka `TopicPartition`."
[{:keys [:topic-name]} partition]
^TopicPartition [{:keys [:topic-name]} partition]
(TopicPartition. topic-name (int partition)))

(defn map->TopicPartition
@@ -59,7 +57,6 @@
:partition (.partition tp)})

(defn as-TopicPartition
""
^TopicPartition [o]
(cond (instance? TopicPartition o)
o
8 changes: 0 additions & 8 deletions src/jackdaw/data/common_config.clj
@@ -8,15 +8,12 @@
;;; ConfigResource.Type

(def +broker-config-resource-type+
""
ConfigResource$Type/BROKER)

(def +topic-config-resource-type+
""
ConfigResource$Type/TOPIC)

(def +unknown-config-resource-type+
""
ConfigResource$Type/UNKNOWN)

(defn ->ConfigResourceType [o]
@@ -26,7 +23,6 @@
+unknown-config-resource-type+))

(defn->data ConfigResourceType->data
""
[^ConfigResource$Type crt]
(cond (= +broker-config-resource-type+ crt)
:config-resource/broker
@@ -40,22 +36,18 @@
;;; ConfigResource

(defn ->ConfigResource
""
[^ConfigResource$Type type ^String name]
(ConfigResource. type name))

(defn ->topic-resource
""
[name]
(->ConfigResource +topic-config-resource-type+ name))

(defn ->broker-resource
""
[name]
(->ConfigResource +broker-config-resource-type+ name))

(defn->data ConfigResource->data
""
[^ConfigResource cr]
{:name (.name cr)
:type (datafy (.type cr))})
11 changes: 6 additions & 5 deletions src/jackdaw/data/consumer.clj
@@ -6,15 +6,16 @@

(import '[org.apache.kafka.clients.consumer
ConsumerRecord OffsetAndTimestamp]
'org.apache.kafka.common.header.Headers)
'org.apache.kafka.common.header.Headers
'org.apache.kafka.common.record.TimestampType)

(set! *warn-on-reflection* true)

(defn ^ConsumerRecord ->ConsumerRecord
(defn ->ConsumerRecord
"Given unrolled ctor-style arguments create a Kafka `ConsumerRecord`.

Convenient for testing the consumer API and its helpers."
[{:keys [:topic-name]} partition offset ts ts-type
^ConsumerRecord [{:keys [:topic-name]} partition offset ts ts-type
key-size value-size key value ^Headers headers]
(ConsumerRecord. topic-name
(int partition)
@@ -72,8 +73,8 @@

;;; OffsetAndTimestamp tuples

(defn ^OffsetAndTimestamp ->OffsetAndTimestamp
[{:keys [offset timestamp]}]
(defn ->OffsetAndTimestamp
^OffsetAndTimestamp [{:keys [offset timestamp]}]
(OffsetAndTimestamp. offset (long timestamp)))

(defn->data OffsetAndTimestamp->data [^OffsetAndTimestamp ots]
34 changes: 17 additions & 17 deletions src/jackdaw/data/producer.clj
@@ -13,27 +13,27 @@

;;; Producer record

(defn ^ProducerRecord ->ProducerRecord
(defn ->ProducerRecord
"Given unrolled ctor-style arguments creates a Kafka `ProducerRecord`."
([{:keys [topic-name]} value]
(^ProducerRecord [{:keys [topic-name]} value]
(ProducerRecord. ^String topic-name value))
([{:keys [topic-name]} key value]
(^ProducerRecord [{:keys [topic-name]} key value]
(ProducerRecord. ^String topic-name key value))
([{:keys [topic-name]} partition key value]
(let [partition-or-nil (if partition (int partition))]
(^ProducerRecord [{:keys [topic-name]} partition key value]
(let [partition-or-nil (when partition (int partition))]
(ProducerRecord. ^String topic-name
^Integer partition-or-nil
key value)))
([{:keys [topic-name]} partition timestamp key value]
(let [partition-or-nil (if partition (int partition))
timestamp-or-nil (if timestamp (long timestamp))]
(^ProducerRecord [{:keys [topic-name]} partition timestamp key value]
(let [partition-or-nil (when partition (int partition))
timestamp-or-nil (when timestamp (long timestamp))]
(ProducerRecord. ^String topic-name
^Integer partition-or-nil
^Long timestamp-or-nil
key value)))
([{:keys [topic-name]} partition timestamp key value headers]
(let [partition-or-nil (if partition (int partition))
timestamp-or-nil (if timestamp (long timestamp))]
(^ProducerRecord [{:keys [topic-name]} partition timestamp key value headers]
(let [partition-or-nil (when partition (int partition))
timestamp-or-nil (when timestamp (long timestamp))]
(ProducerRecord. ^String topic-name
^Integer partition-or-nil
^Long timestamp-or-nil
@@ -87,26 +87,26 @@
offset 0 ;; Force absolute offset
timestamp
nil ;; No checksum, it's deprecated
^Integer (if key-size (int key-size))
^Integer (if value-size (int value-size))))
^Integer (when key-size (int key-size))
^Integer (when value-size (int value-size))))
([{:keys [:topic-name] :as t} partition base-offset relative-offset timestamp
key-size value-size]
(RecordMetadata. (->TopicPartition t partition)
base-offset
relative-offset ;; Full offset control
timestamp
nil ;; No checksum, it's deprecated
^Integer (if key-size (int key-size))
^Integer (if value-size (int value-size))))
^Integer (when key-size (int key-size))
^Integer (when value-size (int value-size))))
([{:keys [:topic-name] :as t} partition base-offset relative-offset timestamp checksum
key-size value-size]
(RecordMetadata. (->TopicPartition t partition)
base-offset
relative-offset ;; Full offset control
timestamp
checksum ;; Have fun I guess
^Integer (if key-size (int key-size))
^Integer (if value-size (int value-size)))))
^Integer (when key-size (int key-size))
^Integer (when value-size (int value-size)))))

(defn map->RecordMetadata
"Given a `::record-metdata`, build an equivalent `RecordMetadata`.
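
The if → when swaps address the missing-else warning: a one-armed if already returns nil on the false branch, and when states that intent explicitly. A tiny check with made-up values:

(let [partition nil
      timestamp 1654646400000]
  [(when partition (int partition))     ;; => nil, same as the old one-armed if
   (when timestamp (long timestamp))])  ;; => 1654646400000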
8 changes: 4 additions & 4 deletions src/jackdaw/serdes/avro.clj
@@ -68,12 +68,12 @@
KafkaAvroSerializer KafkaAvroDeserializer]
java.lang.CharSequence
java.nio.ByteBuffer
[java.io ByteArrayOutputStream ByteArrayInputStream]
[java.io ByteArrayOutputStream]
[java.util Collection Map UUID]
[org.apache.avro
AvroTypeException Schema$Parser Schema$ArraySchema Schema Schema$Field]
[org.apache.avro.io
EncoderFactory DecoderFactory JsonEncoder]
EncoderFactory DecoderFactory]
[org.apache.avro.generic
GenericDatumWriter GenericDatumReader
GenericContainer GenericData$Array GenericData$EnumSymbol
@@ -91,10 +91,10 @@
(when schema-str
(.parse (Schema$Parser.) ^String schema-str)))))

(defn- ^String mangle [^String n]
(defn- mangle ^String [^String n]
(str/replace n #"-" "_"))

(defn- ^String unmangle [^String n]
(defn- unmangle ^String [^String n]
(str/replace n #"_" "-"))

(defn- dispatch-on-type-fields
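
Same return-hint move for the single-arity private helpers: the ^String hint now precedes the argument vector. mangle and unmangle only swap dashes (fine in Clojure names, not allowed in Avro names) for underscores; a standalone repeat of them as a quick check:

(require '[clojure.string :as str])

(defn- mangle ^String [^String n]
  (str/replace n #"-" "_"))

(defn- unmangle ^String [^String n]
  (str/replace n #"_" "-"))

(mangle "key-serde")   ;; => "key_serde"
(unmangle "key_serde") ;; => "key-serde"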
1 change: 0 additions & 1 deletion src/jackdaw/serdes/edn.clj
@@ -11,7 +11,6 @@
(:require [clojure.edn]
[jackdaw.serdes.fn :as jsfn])
(:import java.nio.charset.StandardCharsets
org.apache.kafka.common.serialization.Serde
org.apache.kafka.common.serialization.Serdes))

(set! *warn-on-reflection* true)