indexOfSlice() hangs when working on a largish stream #9830
Imported From: https://issues.scala-lang.org/browse/SI-9830?orig=1
Jasper-M said:
ImNotTellingYouThat (intyt) said: `java.lang.OutOfMemoryError: GC overhead limit exceeded`

So, with `scala> source.toSeq`, the obvious question is: is indexOfSlice hanging on to the head of the stream as it works its way along? Because how else is memory being retained? What do you think should be happening? (I'm asking because I genuinely don't know.) Cheers, jan
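A minimal sketch of the retention pattern the question is asking about, assuming ordinary Scala 2 `Stream` semantics (the names here are illustrative, not from the report): a `Stream` memoizes every cell it forces, so any live reference to the head keeps the entire forced prefix reachable, regardless of what `indexOfSlice` itself does:

```scala
// Hypothetical illustration: a `val` reference to a Stream's head pins
// every memoized cell behind it, even after traversal has moved on.
val head: Stream[Int] = Stream.from(1) // lazy tail, memoizing cells
head(999)                              // forces 1000 cells
// All 1000 cells remain reachable through `head`, so none can be
// garbage collected; with 6-7 million Chars that cost adds up fast.
assert(head.take(3).toList == List(1, 2, 3)) // cells are reused, not recomputed
```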
@SethTisue said: This doesn't have anything to do with […]. The underlying issue here is that at runtime, calling `.toSeq` on `Iterator` returns a `scala.collection.immutable.Stream`, which is a very expensive data structure (in both space and time), though the fact that its tail is lazy means that you don't pay the cost until you actually traverse it. In general, […]

So, I've responded here on JIRA, but not to all of your questions. They are good questions, but I suggest asking them on scala-user, the scala/scala Gitter channel, or Stack Overflow. (If you have followup questions about what I've said here, same recommendation about where to take the discussion.)
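To make that cost model concrete, here is a small sketch (plain Scala 2 `Stream`, independent of `Source`; a hypothetical example, not from the thread) showing that a `Stream` has a strict head and a lazy, memoizing tail: each element is computed at most once and then cached, which is exactly the space cost described above:

```scala
var evaluations = 0
// Stream.fill has a strict head and a lazy tail: only the first cell's
// body runs at construction time.
val s = Stream.fill(5) { evaluations += 1; evaluations }
assert(evaluations == 1)   // only the head was forced eagerly
s.foreach(_ => ())         // first traversal forces the remaining cells
s.foreach(_ => ())         // second traversal reuses the memoized cells
assert(evaluations == 5)   // each element was computed exactly once
```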
This takes a few secs but works:

```scala
val source = scala.io.Source.fromChars(("x" * 6000000).toArray)
source.toSeq.indexOfSlice("tteesstt")
```

Modify the 6000000 to 7000000 and it hangs, eating CPU (though not memory). It seems that it's the indexOfSlice that's failing.
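A hedged workaround sketch (the sizes and names here are illustrative, not from the report): drain the `Source` into a strict collection once, then search that, so no lazy cells are built or retained while `indexOfSlice` scans:

```scala
// Build input shaped like the report's, but small enough to run quickly.
val text = ("x" * 100000) + "tteesstt"
val source = scala.io.Source.fromChars(text.toCharArray)

// toVector is strict: it drains the iterator into a Vector up front,
// avoiding Stream's per-cell allocation and memoization entirely.
val chars = source.toVector
val idx = chars.indexOfSlice("tteesstt".toVector)
assert(idx == 100000)
```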