Common Exceptions and possible reasons
This usually means you are missing a dependency, or have a binary-incompatible version of a dependency on your classpath. Some libraries required at run time, such as Hadoop and many of its transitive dependencies, are expected to already be on the classpath and are not bundled by Scalding itself, so check for that first.
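For example, a hedged sbt sketch (artifact versions are illustrative only) that treats Hadoop as a provided dependency, so it is resolved from the cluster classpath rather than bundled into your jar:

```scala
// build.sbt -- versions here are examples; Hadoop is marked "provided" because
// it is expected to be on the cluster classpath, not shipped inside your jar.
libraryDependencies ++= Seq(
  "com.twitter" %% "scalding-core" % "0.17.4",
  "org.apache.hadoop" % "hadoop-client" % "2.7.3" % "provided"
)
```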
This is probably also a jar version problem.
- Your classpath or jar is missing a needed class. Check your build, or perhaps the name of the job you are trying to run.
- Make sure you specify the fully qualified name of the class you wish to run (package.JobName), not just JobName; see the sketch below.
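As a minimal sketch (package, class, and argument names are purely illustrative), a job defined like this must be submitted as com.example.wordcount.WordCountJob, not WordCountJob:

```scala
package com.example.wordcount

import com.twitter.scalding._

// Submit this as "com.example.wordcount.WordCountJob", not "WordCountJob",
// or the runner will fail with ClassNotFoundException.
class WordCountJob(args: Args) extends Job(args) {
  TypedPipe.from(TextLine(args("input")))
    .flatMap(_.toLowerCase.split("\\s+"))
    .filter(_.nonEmpty)
    .groupBy(identity)
    .size
    .write(TypedTsv[(String, Long)](args("output")))
}
```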
You have compiled your code with a later version of javac/scalac than the JVM you are running on supports. Either downgrade the compiler target or upgrade the Java installation; see the sketch below.
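A hedged sbt sketch (the exact flags depend on your Scala and Java versions; Java 8 here is only an example) that pins the emitted bytecode to the JVM you actually run on:

```scala
// build.sbt -- make the compile target match the runtime JVM (Java 8 in this example)
scalacOptions += "-target:jvm-1.8"
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
```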
java.util.zip.ZipException: invalid CEN header (bad signature)
This happens when the jar file exceeds the zip format's 64k limit. The solution is to use the --bundle option of scald.rb.
Caused by: java.lang.NullPointerException
at cascading.scheme.Scheme.setSourceFields(Scheme.java:166)
at cascading.scheme.Scheme.<init>(Scheme.java:108)
at ...my class initialization ...
at ...my class initialization ...
at ...my class initialization ...
... 10 more
This null pointer occurs during class initialization: Scheme.<init> refers to the constructor of the Scheme class, which runs while your own class is being initialized. A NullPointerException at this point often happens because the class being initialized references static fields in another class that has not been initialized yet. This can be difficult to solve because Java has no way to specify the order in which classes are initialized. In my case I had:

import cascading.tuple.Fields;
var fields = Fields.ALL

and I was passing fields into the Scheme initialization. Writing Fields.ALL directly instead of using the variable fields solved the problem.
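As a self-contained sketch of the same trap (the objects below are hypothetical, not part of Scalding or Cascading), two classes that read each other's fields during initialization can observe a null field and fail with a NullPointerException wrapped in an ExceptionInInitializerError:

```scala
object A {
  // Reading B.text here triggers B's initializer while A is still initializing...
  val text: String = B.text.trim
}

object B {
  // ...so this read of A.text sees the not-yet-assigned (null) field.
  val text: String = A.text
}

object InitOrderDemo extends App {
  // Touching A first throws ExceptionInInitializerError caused by a NullPointerException,
  // because B.text came back null and .trim was called on it.
  println(A.text)
}
```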
Cascading requires all sources to have final sinks on disk. This exception happens when you read an input but never write a corresponding output (see the sketch below). It could also signify an attempt to write an unserializable datatype, or be triggered by creating a non-lazy Finagle client.
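As a minimal sketch (job, source, and argument names are illustrative), every pipe you read must eventually reach a sink; removing the write below would reproduce the planner error:

```scala
import com.twitter.scalding._

class SinkEverySourceJob(args: Args) extends Job(args) {
  val lines = TypedPipe.from(TextLine(args("input")))

  // Each source must flow into a sink on disk; a pipe that is read
  // but never written makes the Cascading planner reject the flow.
  lines
    .map(line => (line.length, line))
    .write(TypedTsv[(Int, String)](args("output")))
}
```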
This happens when data is missing from the path you provided.
Try putting quotes around your input value.
This is a common exception when running hadoop commands with sudo.
Check that you have given permissions to the user (root) in HDFS:
sudo addgroup supergroup; sudo adduser root supergroup