
---
layout: global
title: How to Contribute to Spark
---

# Contributing to Spark

The Spark team welcomes contributions in the form of GitHub pull requests. Here are a few tips to get your contribution in:

- Break your work into small, single-purpose patches if possible. It's much harder to merge in a large change with a lot of disjoint features.
- Submit the patch as a GitHub pull request. For a tutorial, see the GitHub guides on forking a repo and sending a pull request.
- Follow the style of the existing codebase. Specifically, we use the [standard Scala style guide](http://docs.scala-lang.org/style/), but with the following changes:
  - Maximum line length of 100 characters.
  - Always import packages using absolute paths (e.g. `scala.collection.Map` instead of `collection.Map`).
  - No "infix" syntax for methods other than operators. For example, don't write `table containsKey myKey`; replace it with `table.containsKey(myKey)`.
- Add unit tests for your new code. We use ScalaTest for testing. Just add a new `Suite` in `core/src/test`, or methods to an existing `Suite`.
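The style rules above can be sketched in a short snippet. `StyleExample` and `lookup` are hypothetical names chosen for illustration, not part of Spark:

```scala
// Absolute import path, not the relative `import collection.Map`:
import scala.collection.Map

object StyleExample {
  // Keep lines under 100 characters, and use dot-call syntax for
  // non-operator methods: `table.contains(key)`, not `table contains key`.
  def lookup(table: Map[String, Int], key: String): Option[Int] =
    if (table.contains(key)) Some(table(key)) else None
}
```

A ScalaTest `Suite` exercising code like this would live under `core/src/test`, alongside the existing suites.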

If you'd like to report a bug but don't have time to fix it, you can still post it to our issues page. Also, feel free to email the mailing list.