Scaladoc fails for overloaded constructor with default arguments #8479
Imported From: https://issues.scala-lang.org/browse/SI-8479?orig=1
@retronym said:
Could you please point to a commit SHA that can reproduce the problem?
Patrick Wendell (pwendell) said (edited on Apr 6, 2014 7:55:45 PM UTC):

git clone https://github.com/pwendell/spark.git
cd spark
git checkout 7fb13b21864ac030ae3bd09d0a4262bdb5bf66f3
sbt/sbt compile # works
sbt/sbt doc     # compile fails

By the way, that build is on Scala 2.10.3. I tested 2.10.4 as well and it also failed, so I reported the bug against 2.10.4 since it is the most recent version in which I saw the failure.
@retronym said (edited on Apr 6, 2014 9:26:03 PM UTC):

object Test {
  val x = new SparkContext(master = "")
}

class SparkContext(config: Any) {
  /** Scaladoc comment */
  def this(
      master: String,
      appName: String = "") = this(null)
}

Removing the Scaladoc comment from the constructor with defaults seems to hide the bug.
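That observation suggests a stopgap, sketched here as a hypothetical variant of the repro above (not verified against any particular scalac release): demote the Scaladoc comment on the defaulted auxiliary constructor to a plain line comment, which reportedly keeps the doc run from crashing.

```scala
object Test {
  val x = new SparkContext(master = "")
}

class SparkContext(config: Any) {
  // Plain line comment instead of `/** ... */`: with no Scaladoc comment
  // attached to the defaulted auxiliary constructor, the doc phase
  // reportedly no longer fails.
  def this(
      master: String,
      appName: String = "") = this(null)
}
```

Regular compilation behaves identically either way; only the scaladoc phase is affected by the presence of the doc comment.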
Patrick Wendell (pwendell) said:
@retronym said:
@retronym said:
@retronym said:
Patrick Wendell (pwendell) said (edited on Apr 7, 2014 6:27:37 PM UTC):

package mypackage

class Example {
  // This is our main constructor involving defaults
  def this(a: String, b: String = "foo", c: String = "bar") {
    this()
  }

  private[mypackage] def this(a: String, b: String) {
    this(a, b, "bar")
    println("called internal constructor")
  }
}

object Example {
  def main(args: Array[String]) {
    val x = new Example("a", "b")
  }
}
@retronym said: But it might be okay until Scala 2.10.5 is released in a couple of months.
Patrick Wendell (pwendell) said:
@adriaanm said:
There is a bug in Scaladoc generation that is preventing us from making an important API change in Spark.
We use constructor overloading in Spark's flagship class "SparkContext", and when generating our docs the compilation phase fails even though a normal compile is fine. Weirdly, if I change our class constructor to have a default argument, the compilation succeeds.
Our class SparkContext has the following constructors:
When compiling docs, all uses of Constructor 3 that make use of the default arguments fail. An example of a failure message is:
Strangely, if I change the default constructor to have a default argument, the doc compile starts working:
This is not an acceptable workaround for us though, as it obfuscates our API.