
Scaladoc fails for overloaded constructor with default arguments #8479

Closed
scabug opened this issue Apr 6, 2014 · 12 comments

scabug commented Apr 6, 2014

There is a bug in scaladoc generation that is preventing us from making an important API change in Spark.

We are using constructor overloading in Spark's flagship class "SparkContext", and when generating our docs the compilation phase fails even though a normal compile is fine. Weirdly, if I change our class constructor to have a default argument, the compilation succeeds.

Our class SparkContext has the following constructors:

// Class constructor
class SparkContext(config: SparkConf)

// Constructor 1
def this(config: SparkConf, preferredNodeLocationData: Map[String, Set[SplitInfo]])

// Constructor 2
def this(master: String, appName: String, conf: SparkConf)

// Constructor 3
def this(
      master: String,
      appName: String,
      sparkHome: String = null,
      jars: Seq[String] = Nil,
      environment: Map[String, String] = Map(),
      preferredNodeLocationData: Map[String, Set[SplitInfo]] = Map())

When compiling docs, all uses of Constructor 3 that rely on the default arguments fail. An example failure message:

[error] /home/patrick/Documents/spark/core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala:70: not enough arguments for constructor SparkContext: (master: String, appName: String, sparkHome: String, jars: Seq[String], environment: scala.collection.Map[String,String], preferredNodeLocationData: scala.collection.Map[String,scala.collection.Set[org.apache.spark.scheduler.SplitInfo]])org.apache.spark.SparkContext
[error]     this(new SparkContext(master, appName, sparkHome, Seq(jarFile)))

Strangely, if I change the primary class constructor to have a default argument, the doc compile starts working:

// Existing class constructor
class SparkContext(config: SparkConf)
// Doing this makes the compile succeed
class SparkContext(config: SparkConf, foo: Int = 1)

This is not an acceptable workaround for us though, as it obfuscates our API.

scabug commented Apr 6, 2014

Imported From: https://issues.scala-lang.org/browse/SI-8479?orig=1
Reporter: Patrick Wendell (pwendell)
Affected Versions: 2.10.4
Other Milestones: 2.11.1

scabug commented Apr 6, 2014

@retronym said:
I'm not able to reproduce this with Scala 2.10.4 or 2.10.3 with:

  ~/code/incubator-spark git show
commit 4d880304867b55a4f2138617b30600b7fa013b14
Author: Bryn Keller <bryn.keller@intel.com>
Date:   Mon Feb 24 17:35:22 2014 -0800

Could you please point to a commit SHA that can reproduce the problem?

scabug commented Apr 6, 2014

Patrick Wendell (pwendell) said (edited on Apr 6, 2014 7:55:45 PM UTC):
Thanks for looking at this. This exists only in a topic branch in my repo:

git clone https://github.com/pwendell/spark.git
cd spark
git checkout 7fb13b21864ac030ae3bd09d0a4262bdb5bf66f3
sbt/sbt compile # works
sbt/sbt doc # compile fails

By the way - that build uses Scala 2.10.3. I tested with 2.10.4 as well and it also failed, so that's the version I reported in the issue, since it's the most recent one in which I saw the failure.

scabug commented Apr 6, 2014

@retronym said (edited on Apr 6, 2014 9:26:03 PM UTC):
Standalone reproduction:

object Test {
  val x = new SparkContext(master = "")
}

class SparkContext(config: Any) {

  /** Scaladoc comment */
  def this(
      master: String,
      appName: String = "") = this(null)
}

warning: dropping dependency on node with no phase object: pickler
test.scala:2: error: not enough arguments for constructor SparkContext: (master: String, appName: String)SparkContext
  val x = new SparkContext(master = "")
          ^

Removing scaladoc from the constructor with defaults seems to hide the bug.
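
For contrast, a minimal sketch (same file, with the hypothetical edit just described) where the doc comment is downgraded to a plain comment, which per the above should get past scaladoc without the spurious error:

object Test {
  val x = new SparkContext(master = "")
}

class SparkContext(config: Any) {

  // plain line comment instead of a /** scaladoc */ comment on the constructor
  def this(
      master: String,
      appName: String = "") = this(null)
}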

scabug commented Apr 6, 2014

Patrick Wendell (pwendell) said:
Based on your comment I did some more debugging. Unfortunately none of these made any difference:

  • Removed scaladoc from Constructor 3 (no change)
  • Removed scaladoc from all non-class constructors (no change)
  • Removed scaladoc from all constructors (no change)

scabug commented Apr 7, 2014

@retronym said:
Maybe it is scaladoc anywhere in SparkContext.scala.

scabug commented Apr 7, 2014

@retronym said:
Fix for 2.10.5: scala/scala#3678

scabug commented Apr 7, 2014

@retronym said:
A reliable, if unpalatable, workaround until 2.10.5 is to avoid using the constructor defaults from within the core module of Spark itself.
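
In practice (a hypothetical call site, not taken from the Spark sources), that means spelling out every argument of Constructor 3 inside the core module instead of leaning on its defaults:

// Instead of relying on the defaults like this...
new SparkContext(master, appName)

// ...pass each argument explicitly:
new SparkContext(master, appName, null, Nil, Map(), Map())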

scabug commented Apr 7, 2014

Patrick Wendell (pwendell) said (edited on Apr 7, 2014 6:27:37 PM UTC):
We created package-private constructors that select specific combinations of the parameters that have defaults. Once this bug gets fixed, we'll just remove them, with no change for consumers of the Spark API. The only thing I'm concerned about is that I'm surprised Scala doesn't throw any ambiguity errors here. In the example below the private constructor is used even though either constructor could technically apply. Just wondering: why is this not ambiguous? Is the approach we've taken safe?

package mypackage

class Example {

  // This is our main constructor involving defaults
  def this(a: String, b: String = "foo", c: String = "bar") {
    this()
  }

  private[mypackage] def this(a: String, b: String) {
    this(a, b, "bar")
    println("called internal constructor")
  }
}

object Example {
  def main(args: Array[String]) {
    val x = new Example("a", "b")
  }
}

scabug commented Apr 8, 2014

@retronym said:
I generally wouldn't recommend overloading like that; the rules are tricky to remember, even for compiler writers: scala/scala@a5cd601dac.

But it might be okay until Scala 2.10.5 is released in a couple of months.
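
For reference, a minimal sketch of the resolution rule at play (illustrative names, not from the thread; it relies on Scala 2's overloading resolution preferring alternatives that are applicable without using any default argument):

class Overloads {
  def f(a: String): String = "no default needed"
  def f(a: String, b: String = "b"): String = "default used"
}

object Demo extends App {
  // Both alternatives are applicable here, but the one that needs no
  // default argument wins, so this prints "no default needed" rather
  // than triggering an ambiguity error.
  println(new Overloads().f("x"))
}

That matches the behavior in the Example above: new Example("a", "b") picks the package-private two-argument constructor because it applies without defaults.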

scabug commented Apr 9, 2014

Patrick Wendell (pwendell) said:
I think we'll go this way, because we don't want to change the public API, which has, in the past, included the constructor with defaults. The other constructors are package-private, so no one but us should run into the potential ambiguities. And in every case of ambiguity it's actually valid to call either constructor: the ones we've added exist only to avoid this bug, and don't differ in functionality in any way from the existing constructor.

scabug commented Feb 26, 2015

@adriaanm said:
Looks like this was fixed by scala/scala#3678
