[PLAT-33341] Add Caffeine interface to limit Sjsonnet worker cache size #128

Merged
merged 29 commits on Oct 27, 2021
Changes from 11 commits
Commits (29)
5f1c4db
Draft - 1: Defined the interface and interaction between caffeine and…
tanmay-db Sep 1, 2021
0c975b3
added mapping function
tanmay-db Sep 1, 2021
b260882
typo edit
tanmay-db Sep 1, 2021
ee1766a
passing function name instead of whole expression
tanmay-db Sep 1, 2021
6db24c5
Merge remote-tracking branch 'sjsonnet_fork/PLAT-33341' into PLAT-33341
tanmay-db Sep 1, 2021
dd9323e
extend interface for new hashmap class
tanmay-db Sep 3, 2021
8288a59
removing changes from src-js
tanmay-db Sep 3, 2021
d3f8048
Resolved comments about class name and removed bugs
tanmay-db Sep 9, 2021
cd31540
Removed comments
tanmay-db Sep 9, 2021
5b6cb8c
Removed space
tanmay-db Sep 9, 2021
9399365
clean up
tanmay-db Sep 9, 2021
53cc9b6
remove creating default cache
tanmay-db Sep 17, 2021
9d210c1
merge commit
tanmay-db Sep 22, 2021
13a4385
Fixed the JS build
tanmay-db Sep 27, 2021
9474382
remove redundant import
tanmay-db Sep 27, 2021
354b717
add UTF-8 encoding as standard
tanmay-db Sep 29, 2021
45a7998
fix sjsonnet server
tanmay-db Sep 29, 2021
8d0c663
cleanup
tanmay-db Sep 29, 2021
3031867
warning to check
tanmay-db Sep 30, 2021
278e37a
testing PR https://github.com/databricks/sjsonnet/pull/132 for no sta…
tanmay-db Oct 6, 2021
d462de0
resolve bugs
tanmay-db Oct 6, 2021
cad3c82
Merge branch 'changes-containing-no-static-errors-2' into PLAT-33341
tanmay-db Oct 6, 2021
842d937
Merge remote-tracking branch 'databricks-sjsonnet/master' into PLAT-3…
tanmay-db Oct 15, 2021
2db27b7
updating for new master
tanmay-db Oct 15, 2021
558961a
resolving the reviews
tanmay-db Oct 15, 2021
30ad578
bug
tanmay-db Oct 18, 2021
be733ed
Merge remote-tracking branch 'databricks-sjsonnet/master' into PLAT-3…
tanmay-db Oct 21, 2021
c794cc9
resolve profiler and benchmarking
tanmay-db Oct 25, 2021
78cb4c0
remove valuesIterator and keySet methods
tanmay-db Oct 27, 2021
2 changes: 1 addition & 1 deletion sjsonnet/src-js/sjsonnet/SjsonnetMain.scala
@@ -6,7 +6,7 @@ import scala.scalajs.js.annotation.{JSExport, JSExportTopLevel}

@JSExportTopLevel("SjsonnetMain")
object SjsonnetMain {
- def createParseCache() = collection.mutable.HashMap[(Path, String), Either[String, (Expr, FileScope)]]()
+ def createParseCache() = new DefaultParseCache
Collaborator

Can we deprecate this method? We have the default encapsulated nicely in new DefaultParseCache now and it is used directly in several places, so this method seems superfluous.

Contributor Author

Yes, that makes sense. Updated.
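
For reference, the suggested deprecation might look roughly like the following (the annotation message and version string are illustrative, not taken from this PR):

@deprecated("Construct a DefaultParseCache (or a custom ParseCache) directly", "0.4.x")
def createParseCache() = new DefaultParseCache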

@JSExport
def interpret(text: String,
extVars: js.Any,
22 changes: 18 additions & 4 deletions sjsonnet/src-jvm-native/sjsonnet/SjsonnetMain.scala
@@ -7,8 +7,22 @@ import java.nio.file.NoSuchFileException
import scala.util.Try
import scala.util.control.NonFatal

// Trait extended by JsonnetWorker (in universe) so that it can pass the cache based on Caffeine to main0 here
trait ParseCache {
def getOrElseUpdate(key: (Path, String), defaultValue: => Either[Error, (Expr, FileScope)]): Either[Error, (Expr, FileScope)]
}

// A default implementation based on a mutable HashMap. This implementation is not thread-safe.
class DefaultParseCache extends ParseCache {
val cache = new collection.mutable.HashMap[(Path, String), Either[Error, (Expr, FileScope)]]()

override def getOrElseUpdate(key: (Path, String), defaultValue: => Either[Error, (Expr, FileScope)]): Either[Error, (Expr, FileScope)] = {
cache.getOrElseUpdate(key, defaultValue)
}
}

object SjsonnetMain {
- def createParseCache() = collection.mutable.HashMap[(Path, String), Either[Error, (Expr, FileScope)]]()
+ def createParseCache() = new DefaultParseCache
Collaborator

Same here.


def resolveImport(searchRoots0: Seq[Path], allowedInputs: Option[Set[os.Path]] = None) = new Importer {
def resolve(docBase: Path, importName: String): Option[Path] =
@@ -33,7 +47,7 @@ object SjsonnetMain {
case Array(s, _*) if s == "-i" || s == "--interactive" => args.tail
case _ => args
},
- collection.mutable.HashMap.empty,
+ new DefaultParseCache,
System.in,
System.out,
System.err,
@@ -44,7 +58,7 @@ }
}

def main0(args: Array[String],
- parseCache: collection.mutable.HashMap[(Path, String), Either[Error, (Expr, FileScope)]],
+ parseCache: ParseCache,
stdin: InputStream,
stdout: PrintStream,
stderr: PrintStream,
@@ -127,7 +141,7 @@

def mainConfigured(file: String,
config: Config,
- parseCache: collection.mutable.HashMap[(Path, String), Either[Error, (Expr, FileScope)]],
+ parseCache: ParseCache,
wd: os.Path,
allowedInputs: Option[Set[os.Path]] = None,
importer: Option[(Path, String) => Option[os.Path]] = None): Either[String, String] = {
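
The ParseCache trait introduced above is the hook that lets a long-lived caller (such as the Bazel worker mentioned in the code comment) plug in a size-bounded cache. As a rough sketch only — the class name, the entry bound, and the Caffeine wiring below are illustrative and not part of this PR — a Caffeine-backed implementation sitting in the sjsonnet package (so that Path, Expr, FileScope and Error are in scope) could look like this:

import com.github.benmanes.caffeine.cache.{Cache, Caffeine}

// Illustrative only: a ParseCache bounded by entry count. Caffeine evicts
// entries once maximumSize is exceeded, which is what keeps a long-running
// worker's memory use in check.
class CaffeineParseCache(maxEntries: Long) extends ParseCache {
  private val underlying: Cache[(Path, String), Either[Error, (Expr, FileScope)]] =
    Caffeine.newBuilder()
      .maximumSize(maxEntries)
      .build[(Path, String), Either[Error, (Expr, FileScope)]]()

  override def getOrElseUpdate(key: (Path, String),
                               defaultValue: => Either[Error, (Expr, FileScope)]): Either[Error, (Expr, FileScope)] = {
    // Cache.get(key, mappingFunction) computes and stores the value only when
    // the key is absent. An explicit java.util.function.Function is used here
    // rather than a Scala lambda to avoid depending on SAM conversion.
    underlying.get(key, new java.util.function.Function[(Path, String), Either[Error, (Expr, FileScope)]] {
      def apply(k: (Path, String)): Either[Error, (Expr, FileScope)] = defaultValue
    })
  }
}

A maximumWeight/weigher configuration would work just as well if eviction should be based on source size rather than entry count.
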
4 changes: 2 additions & 2 deletions sjsonnet/src/sjsonnet/Importer.scala
@@ -43,8 +43,8 @@ class CachedImporter(parent: Importer) extends Importer {

class CachedResolver(
parentImporter: Importer,
- val parseCache: mutable.HashMap[(Path, String), Either[Error, (Expr, FileScope)]] = new mutable.HashMap
- ) extends CachedImporter(parentImporter) {
+ val parseCache: ParseCache = new DefaultParseCache
Contributor

Let's remove this default value so we can make sure we pass the right parse cache from all code paths.

This means you'll have to find all places where this method is called and decide if it should pass a fresh DefaultParseCache (if it's on a code path that's NOT coming from Bazel, meaning mostly tests) or make sure we have an external parse cache.

+ ) extends CachedImporter(parentImporter) {

def parse(path: Path, txt: String)(implicit ev: EvalErrorScope): Either[Error, (Expr, FileScope)] = {
parseCache.getOrElseUpdate((path, txt), {
4 changes: 2 additions & 2 deletions sjsonnet/src/sjsonnet/Interpreter.scala
@@ -18,8 +18,8 @@ class Interpreter(extVars: Map[String, ujson.Value],
preserveOrder: Boolean = false,
strict: Boolean = false,
storePos: Position => Unit = null,
- val parseCache: mutable.HashMap[(Path, String), Either[Error, (Expr, FileScope)]] = new mutable.HashMap,
- ) { self =>
+ val parseCache: ParseCache = new DefaultParseCache
Contributor

Same here, let's not have a default argument for the parse cache.

+ ) { self =>

val resolver = new CachedResolver(importer, parseCache) {
override def process(expr: Expr, fs: FileScope): Either[Error, (Expr, FileScope)] =
4 changes: 2 additions & 2 deletions sjsonnet/test/src-jvm-native/sjsonnet/MainTests.scala
@@ -16,9 +16,9 @@ object MainTests extends TestSuite {
val outF = File.createTempFile("sjsonnet", ".json")
val out = new ByteArrayOutputStream()
val pout = new PrintStream(out)
- SjsonnetMain.main0(Array(source), collection.mutable.HashMap.empty, System.in, pout, System.err, os.pwd, None)
+ SjsonnetMain.main0(Array(source), new DefaultParseCache, System.in, pout, System.err, os.pwd, None)
pout.flush()
SjsonnetMain.main0(Array("-o", outF.getAbsolutePath, source), collection.mutable.HashMap.empty, System.in, System.out, System.err, os.pwd, None)
SjsonnetMain.main0(Array("-o", outF.getAbsolutePath, source), new DefaultParseCache, System.in, System.out, System.err, os.pwd, None)
val stdoutBytes = out.toByteArray
val fileBytes = os.read(os.Path(outF)).getBytes
// stdout mode uses println so it has an extra platform-specific line separator at the end
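
For illustration only (not part of this PR): with main0 now accepting any ParseCache, a long-lived caller can share one bounded cache across invocations instead of passing a fresh map each time. Using the hypothetical CaffeineParseCache sketched earlier, and the same argument order as the test above:

// Illustrative only: one size-bounded cache reused across many runs, so a
// file that is imported repeatedly is parsed only once until evicted.
val sharedParseCache: ParseCache = new CaffeineParseCache(maxEntries = 10000)

def render(source: String) =
  SjsonnetMain.main0(
    Array(source),
    sharedParseCache,
    System.in,
    System.out,
    System.err,
    os.pwd,
    None
  )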