Hi Kaarthik. I just cloned your Git repo. I am not really a Spark developer, so my question might be fairly elementary. I got everything building, and the unit tests run just fine. What I want to do is debug the Pi sample from within Visual Studio 2015. Following your debug instructions, I first opened a Visual Studio Developer Command Prompt and ran the command "sparkclr-submit.cmd debug". I got the message "[CSharpRunner.main] Backend running debug mode. Press enter to exit" .... so far, so good. Then I opened the SparkCLR.sln solution in Visual Studio, set the startup project to "Samples", and in the project properties set the arguments to "--torun pi*". I then started the debug session. After the 300,000-element integer array was initialized, I got these errors:
JVM method execution failed: Static method collectAndServe failed for class org.apache.spark.api.python.PythonRDD when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=13], )
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.net.SocketException: Connection reset by peer: socket write error
Do you have any idea what might be wrong? I suspect I need to set a jar file path or classpath somewhere.
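For context, my understanding is that the core of the Pi sample boils down to a Parallelize/Map/Reduce pipeline. A minimal sketch, assuming the Mobius RDD API mirrors Spark's (method names below are from memory, not verified against the repo):

```csharp
// Hypothetical sketch of the Pi sample's core logic (not verbatim from the repo).
// Assumes Microsoft.Spark.CSharp.Core exposes Parallelize/Map/Reduce as in Spark.
using System;
using System.Linq;
using Microsoft.Spark.CSharp.Core;

class PiSketch
{
    static void Main(string[] args)
    {
        var conf = new SparkConf().SetAppName("Pi");
        var sc = new SparkContext(conf);

        const int n = 300000; // matches the 300,000-element array mentioned above
        var count = sc.Parallelize(Enumerable.Range(0, n), 2)
                      .Map(i =>
                      {
                          // Monte Carlo: sample a point in [-1, 1] x [-1, 1],
                          // count it if it falls inside the unit circle
                          var rnd = new Random(i);
                          double x = rnd.NextDouble() * 2 - 1;
                          double y = rnd.NextDouble() * 2 - 1;
                          return (x * x + y * y) <= 1 ? 1 : 0;
                      })
                      .Reduce((a, b) => a + b);

        Console.WriteLine("Pi is roughly " + 4.0 * count / n);
        sc.Stop();
    }
}
```

If that sketch is roughly right, the collectAndServe failure above would occur when the Reduce triggers the first actual job, not during setup, which might narrow down where to look.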
Hi all, I'm looking to try out Mobius as a way of trying out Spark to build a simple ETL pipeline. Our data sources are mostly SQL Server, a little bit of Oracle, and some CSV files. I have just started reading up on Mobius, but I was curious whether this is a solution that could work. I should also mention that we use Entity Framework extensively; however, my understanding is that Mobius/Spark typically works with plain SQL or JSON files.
Many thanks in advance!
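For what it's worth, here is a hedged sketch of what such an ETL might look like in Mobius, assuming its SqlContext/DataFrameReader mirrors Spark's (the Jdbc and TextFile method names, the connection string, and the paths are all placeholders/assumptions, not verified against the Mobius API):

```csharp
// Hypothetical ETL sketch; method names assume Mobius mirrors Spark's
// DataFrameReader API and may not match the actual bindings exactly.
using System.Collections.Generic;
using Microsoft.Spark.CSharp.Core;
using Microsoft.Spark.CSharp.Sql;

class EtlSketch
{
    static void Main(string[] args)
    {
        var sc = new SparkContext(new SparkConf().SetAppName("Etl"));
        var sqlContext = new SqlContext(sc);

        // SQL Server via JDBC (connection string and table are placeholders)
        var orders = sqlContext.Read()
            .Jdbc("jdbc:sqlserver://myhost;databaseName=mydb", "dbo.Orders",
                  new Dictionary<string, string>());

        // CSV read as plain text lines (path is a placeholder)
        var lines = sc.TextFile("hdfs:///data/customers.csv");

        // ... transform with DataFrame/RDD operations, then write results out
        sc.Stop();
    }
}
```

Note that Entity Framework would not be involved here: Spark talks to SQL Server and Oracle through JDBC drivers on the JVM side, not through a .NET ORM.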
I'm having trouble using the Mobius Spark package for C#. I'm fairly new to C# and am using .NET Core. I'm getting the error
Unhandled Exception: System.MissingMethodException: Method not found: 'System.AppDomainSetup System.AppDomain.get_SetupInformation()'.
at System.Lazy`1.ExecutionAndPublication(LazyHelper executionAndPublication, Boolean useDefaultConstructor)
at Microsoft.Spark.CSharp.Core.SparkConf..ctor(Boolean loadDefaults)
at Microsoft.Spark.CSharp.Examples.SparkProcessor.Main(String args) in /Users/jokim/workspace/IFFParserSpark/SparkProcessor/SparkProcessor/Program.cs:line 20
I tried using Mono, but I keep getting
Program.cs(5,17): error CS0234: The type or namespace name 'Spark' does not exist in the namespace 'Microsoft' (are you missing an assembly reference?)
Program.cs(6,17): error CS0234: The type or namespace name 'Spark' does not exist in the namespace 'Microsoft' (are you missing an assembly reference?)
when compiling with csc Program.cs.
I added the package and it exists... so compilation-wise nothing is red and everything builds in the IDE, but when I try to run it using Mono I keep getting that error. I'm on a Mac.
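When invoking csc directly, the Mobius adapter assembly usually has to be referenced explicitly to avoid CS0234; a sketch, where the DLL path is a guess at where NuGet restored the package:

```shell
# Paths are illustrative; point /r: at wherever NuGet restored the adapter DLL.
csc /r:packages/Microsoft.SparkCLR/lib/net45/Microsoft.Spark.CSharp.Adapter.dll Program.cs
mono Program.exe
```

Separately, the earlier MissingMethodException is consistent with running on .NET Core: AppDomain.SetupInformation does not exist there, while Mobius targets the full .NET Framework/Mono.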