Create SparkConf/SparkContext/SparkSession for Jobs #49
Labels: difficult, distribution, enhancement, help wanted
Allow a `SparkConf` and/or `SparkContext` to be created once and shared amongst `Job`s (not `Task`s). This will allow a `Job` to be submitted to a running server, wait, and then kick off multiple `Task`s based on the DAG. This should also apply to `SparkSession`, as that is where the context is created.

This will require a lot of testing, as it can get very complex very quickly.
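For reference, this is the pattern upstream Apache Spark exposes. A minimal sketch (app name and data are illustrative) showing one `SparkSession`/`SparkContext` created once and reused across two Jobs, each of which the scheduler splits into Tasks:

```scala
import org.apache.spark.sql.SparkSession

object SharedSessionExample {
  def main(args: Array[String]): Unit = {
    // Create the SparkSession (and its underlying SparkContext) exactly once.
    val spark = SparkSession.builder()
      .appName("shared-session-example")
      .master("local[*]")
      .getOrCreate()

    val sc = spark.sparkContext

    // Job 1: one action triggers a Job, which fans out into Tasks.
    val sum = sc.parallelize(1 to 1000).sum()

    // Job 2: a second action reuses the same context; nothing is re-created.
    val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()

    println(s"sum=$sum evens=$evens")
    spark.stop()
  }
}
```

The key property to replicate is that both Jobs run against the same long-lived context rather than each constructing its own.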