
An existing connection was forcibly closed by the remote host #169

Closed

aysuays opened this issue Jul 10, 2019 · 7 comments
Labels
question Further information is requested

Comments


aysuays commented Jul 10, 2019

Hello,
I installed .NET for Apache Spark correctly by following the Microsoft tutorial. I tried the basic code from that tutorial and from the [Windows Instructions](https://github docs/getting-started/windows-instructions.md).
In both cases I see the program's results, but then I get this error:

**An existing connection was forcibly closed by the remote host**
```
19/07/10 14:34:58 ERROR DotnetBackendHandler: Exception caught:
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1106)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:343)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)
19/07/10 14:34:58 INFO DotnetRunner: Closing DotnetBackend
19/07/10 14:34:58 INFO DotnetBackend: Requesting to close all call back sockets
19/07/10 14:34:58 INFO DotnetRunner: .NET application exited successfully
19/07/10 14:35:01 INFO SparkContext: Invoking stop() from shutdown hook
19/07/10 14:35:01 INFO SparkUI: Stopped Spark web UI at http://BTEKINT35.TEKNET.LOCAL:4040
19/07/10 14:35:01 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/07/10 14:35:01 INFO MemoryStore: MemoryStore cleared
19/07/10 14:35:01 INFO BlockManager: BlockManager stopped
19/07/10 14:35:01 INFO BlockManagerMaster: BlockManagerMaster stopped
19/07/10 14:35:01 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/07/10 14:35:01 WARN SparkEnv: Exception while deleting Spark temp dir: C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f\userFiles-b6086b0f-164b-4aa1-896b-374a9bb8beba
java.io.IOException: Failed to delete: C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f\userFiles-b6086b0f-164b-4aa1-896b-374a9bb8beba\microsoft-spark-2.4.x-0.3.0.jar
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
        at org.apache.spark.SparkEnv.stop(SparkEnv.scala:103)
        at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1974)
        at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1973)
        at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
19/07/10 14:35:01 INFO SparkContext: Successfully stopped SparkContext
19/07/10 14:35:01 INFO ShutdownHookManager: Shutdown hook called
19/07/10 14:35:01 INFO ShutdownHookManager: Deleting directory C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f\userFiles-b6086b0f-164b-4aa1-896b-374a9bb8beba
19/07/10 14:35:01 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f\userFiles-b6086b0f-164b-4aa1-896b-374a9bb8beba
java.io.IOException: Failed to delete: C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f\userFiles-b6086b0f-164b-4aa1-896b-374a9bb8beba\microsoft-spark-2.4.x-0.3.0.jar
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
19/07/10 14:35:01 INFO ShutdownHookManager: Deleting directory C:\Users\tekint35\AppData\Local\Temp\spark-29523fe8-4024-495f-8fce-4422f5b78961
19/07/10 14:35:01 INFO ShutdownHookManager: Deleting directory C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f
19/07/10 14:35:01 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f
java.io.IOException: Failed to delete: C:\Users\tekint35\AppData\Local\Temp\spark-4c26bd8b-8b0b-4eae-b677-c1bf5e79110f\userFiles-b6086b0f-164b-4aa1-896b-374a9bb8beba\microsoft-spark-2.4.x-0.3.0.jar
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
```

I have tried to solve this error but failed. I hope you can help me. Thank you so much.

@aysuays aysuays changed the title from "existing" to "An existing connection was forcibly closed by the remote host" on Jul 10, 2019
@imback82
Contributor

@fightlikeagirl can you share your program?

@imback82 imback82 added the question Further information is requested label Jul 10, 2019
@aysuays
Author

aysuays commented Jul 12, 2019

The first program I tried:

```csharp
using Microsoft.Spark.Sql;

namespace MySparkApp
{
    class Program
    {
        static void Main(string[] args)
        {
            // Create a Spark session
            var spark = SparkSession
                .Builder()
                .AppName("word_count_sample")
                .GetOrCreate();

            // Create initial DataFrame
            DataFrame dataFrame = spark.Read().Text("input.txt");

            // Count words
            var words = dataFrame
                .Select(Functions.Split(Functions.Col("value"), " ").Alias("words"))
                .Select(Functions.Explode(Functions.Col("words"))
                .Alias("word"))
                .GroupBy("word")
                .Count()
                .OrderBy(Functions.Col("count").Desc());

            // Show results
            words.Show();
        }
    }
}
```

@aysuays
Author

aysuays commented Jul 12, 2019

And the second one:

```csharp
using Microsoft.Spark.Sql;

namespace HelloSpark
{
    class Program
    {
        static void Main(string[] args)
        {
            var spark = SparkSession.Builder().GetOrCreate();
            var df = spark.Read().Json("people.json");
            df.Show();
        }
    }
}
```

Both of them produced the correct results and then threw the errors above.

@imback82
Contributor

Can you try calling spark.Stop() at the end? Also, for the errors about deleting files, please refer to #49 (comment).
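
For reference, a minimal sketch of that suggestion applied to the second program quoted above; the only change is the explicit spark.Stop() call at the end, the idea being that an explicit stop lets the backend connection shut down cleanly:

```csharp
using Microsoft.Spark.Sql;

namespace HelloSpark
{
    class Program
    {
        static void Main(string[] args)
        {
            var spark = SparkSession.Builder().GetOrCreate();
            var df = spark.Read().Json("people.json");
            df.Show();

            // Stop the session explicitly so the socket to the JVM backend
            // is closed cleanly instead of being severed when the .NET
            // process exits.
            spark.Stop();
        }
    }
}
```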

@imback82
Contributor

@fightlikeagirl Any update on this?

@imback82
Contributor

Closing this due to inactivity. Feel free to reopen it.

@leeichang

Hello,
I also installed .NET for Apache Spark correctly by following the Microsoft tutorial, and I applied the fix from #49 (comment), but I still have the same problem:

```
ERROR DotnetBackendHandler: Exception caught:
java.io.IOException: An existing connection was forcibly closed by the remote host
```

and my source code does call spark.Stop() at the end.
How can I solve this problem?
