Jijo Sunny
@jijosg
@AmirSarvestani Yes, I can. What is your query?
@abhisam You don't need to write return; whatever the last value in the method is gets returned by default in Scala. What are you finding difficult about doing that?
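A minimal illustration of that point:

def double(x: Int): Int = x * 2          // x * 2 is the last expression, so it is the result
def parity(n: Int): String =
  if (n % 2 == 0) "even" else "odd"      // whichever branch evaluates becomes the result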
Manjeru
@Manjeru

Dear all,

We are currently looking for a senior level Scala Developer available to begin work immediately. This is a contract position lasting until the end of the year; however, this assignment could go longer or have the possibility of permanent placement.

Please find more details at the link below:
https://curaweb2.mindscope.com/TECHNA04436_CURA/aspx/JobDetails.aspx?lang=en&Job_ID=90

Kindly let me know if you are interested in applying for this position; if yes, please share your CV, along with your availability date and expected salary, to my email diana.m@technaura.com

We look forward to hearing from you soon.
Please do not hesitate to contact me if you have any questions.

prassee
@prassee
Please avoid posting Scala job ads here.
Freppn
@freppn_twitter

Getting a coffee with an open source buddy you haven't talked to, or going to a music concert with Bill Gates, or any one of the 100+ activities is that easy now. Plans are completely anonymous and secure unless both of you agree. Freppn (Friends Happen): make plans with literally anyone.
Start using the Freppn app, live on the Play Store at https://play.google.com/store/apps/details?id=com.freppn.codeiatio.freppn
Website: www.freppn.com

We are a new idea app reducing the distance between different social media. At the moment we support 9+ verified logins (Facebook, Twitter, Google+, Github, Pinterest, Tumblr, Phone, Codeforces, Linkedin), plus chat apps like Whatsapp, Snapchat, Hangouts, Messenger, KiK, Telegram etc. We are currently working on marketing, patenting and trademarking this unique idea, adding other websites, and removing bugs. Please share our app so that we can take it to as many people as possible.

Please help us share this app; it helps you meet your open source buddies and maybe learn open source over a cup of coffee. No need to ask directly. Just anonymously use Freppn.

arunsumbria
@arunsumbria
Hi, I am new to Spark. I installed Spark on Windows 10; when I use the command below it says the path doesn't exist, even though I have winutils under the C drive:
 val df = spark.read.csv("file:///C:/spark/myprog/emp")
org.apache.spark.sql.AnalysisException: Path does not exist: file:/C:/spark/myprog/emp;
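One thing worth checking, as a hedged sketch (the directory name C:/spark/myprog/emp is taken from the message above): confirm the path exists on the machine running the driver before handing it to spark.read, since "AnalysisException: Path does not exist" is exactly what Spark raises for a missing or mistyped location.

import java.nio.file.{Files, Paths}

val path = "C:/spark/myprog/emp"
if (Files.exists(Paths.get(path)))
  spark.read.csv(s"file:///$path").show()   // assumes a live SparkSession named spark
else
  println(s"$path does not exist on this machine")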
Ghost
@ghost~5b767f25d73408ce4fa4e2e5
hi
import scala.concurrent.Await
import scala.concurrent.duration._
import slick.jdbc.MySQLProfile.api._

val dbConfig = Database.forURL("jdbc:mysql://localhost:3306/equineapp?user=root&password=123456", driver = "com.mysql.jdbc.Driver")
val setup1 = sql"call HorsePrfile ($HorseId);".as[(Int, String)]
val res = Await.result(dbConfig.run(setup1), 1000.seconds)
// val json = Json.toJson(res)
Ok(Json.toJson(res.toList))
How do I return the JSON response with a header? I am using Scala Play Framework with Slick.
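A minimal sketch of the Play side, assuming a standard controller (the payload and header name below are hypothetical stand-ins): Ok(Json.toJson(...)) already produces an application/json response, and withHeaders attaches any extra response headers.

import play.api.libs.json.Json
import play.api.mvc._

// inside a Play controller (Action comes from the controller's components):
def horseProfile(horseId: Int) = Action {
  val payload = Json.obj("id" -> horseId, "status" -> "ok")  // stand-in for Json.toJson(res.toList)
  Ok(payload).withHeaders("X-App" -> "equineapp")            // hypothetical header
}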
AmirSarvestani
@AmirSarvestani
Hey guys, I have a sequence of objects. How can I filter the sequence based on one attribute?
Like this: I have a Seq[CompanyApplicant] and I want to filter it based on a property of CompanyApplicant.
The property is a String.
I am new to Scala.
Sabuj Kumar Jena
@sabujkumarjena
@AmirSarvestani Yes, you can: given s: Seq[CompanyApplicant], call s.filter(p) where p is a Boolean predicate over a property of the object.
For example, if you want to keep all company applicants whose names start with the letter S, the query would be s.filter(_.name.startsWith("S"))
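Spelled out as a runnable snippet (note the predicate is called on the String property, not on the applicant itself):

case class CompanyApplicant(name: String)

val applicants = Seq(CompanyApplicant("Smith"), CompanyApplicant("Jones"), CompanyApplicant("Sawyer"))
applicants.filter(_.name.startsWith("S"))   // Seq(CompanyApplicant(Smith), CompanyApplicant(Sawyer))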
Ghost
@ghost~5b8a94aed73408ce4fa690fd

If anybody would like to translate #fpmortals into any Indian language, please contact me. We have begun versions for French, Russian and Singapore

You can keep any profits, but the minimum price must be $0

https://leanpub.com/fpmortals

Sabuj Kumar Jena
@sabujkumarjena
@pradeepert:
def addAll(args: Int*): Unit = args.foreach(add)
then you can call ints.addAll(1, 2, 3, 4, 5)
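A self-contained version of the idea, assuming ints is a mutable buffer (the original add and ints were not shown, so these names are stand-ins):

import scala.collection.mutable.ListBuffer

val ints = ListBuffer.empty[Int]
def addAll(args: Int*): Unit = args.foreach(ints += _)

addAll(1, 2, 3, 4, 5)   // ints is now ListBuffer(1, 2, 3, 4, 5)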
premgc
@premgc
Hi,
I need some help.
In one of my Spark Scala jobs I'm getting this error:
ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2) at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession
I'm creating one RDD and looping through it; inside the loop I call a function, passing in the values.
Here is the pseudo code:

import scala.collection.Map
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{Row, SQLContext, SaveMode}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.HiveContext

object dataingest {

  def main(args: Array[String]) {

    def procIngestBAMData(url: String, username: String, passwd: String,
                          qrySql: String, hivetbl: String, sqlContext: SQLContext): Unit = {
      println("URL : " + url)
      println("username : " + username)
      println("pwd : " + passwd)
      println("sql : " + qrySql)
      println("hivetable : " + hivetbl)
      val driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
      val dfDynSrc = sqlContext.read.format("jdbc").
        option("url", url.trim).
        option("user", username.trim).
        option("password", passwd.trim).
        option("dbtable", qrySql).
        option("driver", driver).
        load()
    }

    val driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    val getDriverDataDF = spark.read.
      format("jdbc").
      option("url", s_url.trim).
      option("user", s_username.trim).
      option("password", s_passwd.trim).
      option("dbtable", getdriverfileSQL).
      option("driver", driver).
      load()

    val driverDatatoRDD = getDriverDataDF.rdd

    if (driverDatatoRDD.count > 1) {
      for (e <- driverDatatoRDD) {
        val a = e(0)
        val b = e(1)
        val p_ActivityName = a.asInstanceOf[String]
        val p_TableName = b.asInstanceOf[String]
        log_message("procData processing names : " + p_ActivityName + " .../" + p_TableName)

        val strtblName = p_TableName.toLowerCase
        val strActivityName = p_ActivityName.toLowerCase

        println("Activity name to process ====> " + strActivityName)
        println("Table name ====> " + strtblName)

        val sc2 = spark.sqlContext
        procIngestBAMData(p_url, p_username, p_passwd, p_qry2, p_hivetbl, sc2)
      }
    } else {
      log_message("No records found for the given date range " + process_date + ".")
    }
  }
}

I'm not sure why I'm getting this error:
ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2) at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession
It happens when calling the function procIngestBAMData.
Can anyone help me, please?
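A likely cause, offered as a hedged sketch: SparkSession/SQLContext is only usable on the driver, but for (e <- driverDatatoRDD) runs its body on the executors, where the lazy sessionState initialization blows up with exactly this error. Collecting the (presumably small) driver rows first keeps the JDBC reads on the driver:

// bring the driver-table rows to the driver before looping,
// so procIngestBAMData (which uses the SQLContext) never runs on an executor
val driverRows = getDriverDataDF.collect()
if (driverRows.length > 1) {
  for (e <- driverRows) {
    // ... same body as above, including procIngestBAMData(...)
  }
}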
vivekn1986
@vivekn1986
Hello, can anyone help me convert an Integer to a Long in Scala?

scala> val i: Integer = 33
i: Integer = 33

scala> i.asInstanceOf[Long]
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:110)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
at $iwC$$iwC$$iwC.<init>(<console>:42)
at $iwC$$iwC.<init>(<console>:44)
at $iwC.<init>(<console>:46)
at <init>(<console>:48)
at .<init>(<console>:52)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Vanam
@vanamraghu
@vivekn1986, you can use i.toLong: asInstanceOf only casts, and a boxed java.lang.Integer is not a java.lang.Long, whereas toLong actually converts the value.
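For instance, in the REPL (Predef's Integer2int conversion unboxes the java.lang.Integer, and toLong then widens the value):

scala> val i: Integer = 33
i: Integer = 33

scala> i.toLong
res0: Long = 33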
AmirSarvestani
@AmirSarvestani
Hey guys, I have a problem I would appreciate help with: I have a Seq[Income], and Income has attributes "type", "amount" and "frequency".
I want to create an XML node like this:
<Income ${income.`type`}Amount="${income.amount}" ${income.`type`}Frequency="${income.frequency}"/>
The income type is an enum, and each income type has an income frequency.
How would I be able to turn the sequence of Income into one Income node in XML?
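One way to approach it, as a hedged sketch with scala.xml (the Income fields are assumed from the description, and incomeNode is a hypothetical name): fold each income's two attributes onto a single <Income/> element.

import scala.xml.{Elem, Null, UnprefixedAttribute}

case class Income(`type`: String, amount: BigDecimal, frequency: String)

def incomeNode(incomes: Seq[Income]): Elem =
  incomes.foldLeft(<Income/>) { (node, inc) =>
    node % new UnprefixedAttribute(s"${inc.`type`}Amount", inc.amount.toString,
            new UnprefixedAttribute(s"${inc.`type`}Frequency", inc.frequency, Null))
  }

// incomeNode(Seq(Income("Salary", 5000, "Monthly"))) yields something like
// <Income SalaryAmount="5000" SalaryFrequency="Monthly"/>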
ehsanshah
@ehsanshah
Hi everyone, I need to use an RPC protocol but I don't know which Scala library has good performance. Can anyone guide me?
Thanks.
Abdhesh Kumar
@abdheshkumar
@ehsanshah Use finagle https://twitter.github.io/finagle/
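For reference, Finagle's hello-world server looks roughly like this (Thrift is its usual RPC face; the HTTP form below is the shortest self-contained example):

import com.twitter.finagle.{Http, Service}
import com.twitter.finagle.http.{Request, Response}
import com.twitter.util.{Await, Future}

object PingServer extends App {
  val service = new Service[Request, Response] {
    def apply(req: Request): Future[Response] = {
      val rep = Response()
      rep.contentString = "pong"
      Future.value(rep)
    }
  }
  Await.ready(Http.serve(":8080", service))   // serve until the process is killed
}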
Emperor-less
@Emperor-less
hello
who are you
anbutech17
@anbutech17
hello
ramesh004
@ramesh004

Can anyone help me get rid of the "task not serializable" exception?
I get it even though I implemented the Serializable trait.

var structFieldArr = SchemaFileRDD.map { m =>
  StructField(m.split(Pattern.quote("|"))(1), types(m.split(Pattern.quote("|"))(2)))
}.collect()
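A hedged sketch of the usual fix (assuming SchemaFileRDD is an RDD[String] and types maps type names to Spark DataTypes): the closure often captures the enclosing non-serializable class through types, so copy what the closure needs into a local val, and split each line only once while you're at it.

import java.util.regex.Pattern
import org.apache.spark.sql.types.{DataType, StructField}

val localTypes: Map[String, DataType] = types   // local copy: the closure no longer drags in the outer class
val structFieldArr = SchemaFileRDD.map { m =>
  val cols = m.split(Pattern.quote("|"))
  StructField(cols(1), localTypes(cols(2)))
}.collect()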

Karan Barai
@BaraiKaran
Hi, I am trying to unit test some functions which use Slick to interact with the database. Can anyone explain the right way to do this? Also, it would be great if you could share some mocking examples; I have looked for examples but didn't find a good solution. Thanks!
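One common pattern, sketched with hypothetical names: hide the Slick queries behind a small repository trait, then unit-test against an in-memory stub instead of mocking Slick's DBIO machinery directly.

import scala.concurrent.Future

case class User(id: Long, name: String)

trait UserRepository {
  def find(id: Long): Future[Option[User]]
}

// the production implementation would run a Slick query via db.run(...);
// tests substitute a stub with canned data:
class StubUserRepository(data: Map[Long, User]) extends UserRepository {
  def find(id: Long): Future[Option[User]] = Future.successful(data.get(id))
}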
Pradeep Sonaimuthu
@pradeepert
I've come across a use case: I want to write a function to which I pass two types, and it should return whether the first type can be converted to the second. Any suggestions?
E.g., if I pass Int and String to the function as input parameters, it should return true.
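One possible reading, as a hedged sketch (convertible is a hypothetical name, and the question is ambiguous, as the follow-up below notes): ask the compiler whether an implicit conversion from the first type to the second exists, using a null default as the not-found fallback.

def convertible[A, B](implicit ev: A => B = null): Boolean = ev != null

convertible[Int, Long]     // true: scala.Int.int2long is in implicit scope
convertible[String, Int]   // false: no implicit String => Int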
Karan Barai
@BaraiKaran
Can you elaborate a little bit? I'm not quite sure what you are trying to achieve. @pradeepert
Pradeep Sonaimuthu
@pradeepert
I want to pack the Avro library dependencies along with my project jar. What do I have to do for this? What changes do I need to make at the SBT level?
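A minimal sketch using the sbt-assembly plugin (version numbers are indicative, not prescribed): a fat jar built by sbt assembly bundles Avro and every other dependency alongside your classes.

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt
libraryDependencies += "org.apache.avro" % "avro" % "1.9.0"

// then run `sbt assembly` to produce target/scala-2.12/<name>-assembly-<version>.jar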
agamjain14
@agamjain14

I have a sort function defined as below:

case class Country(name: String, id: Int)
def sort[T](compare: (T, T) => Boolean): List[T] = {

}
My question is: how do I sort the list of Country based on the name?

And how do I call this function when it has multiple parameter lists?

case class Country(name: String, id: Int)
def sort[T](list: List[T])(compare: (T, T) => Boolean): List[T] = {

}
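A hedged sketch answering both questions, delegating to the standard library's sortWith: the second parameter list is just another argument list at the call site.

case class Country(name: String, id: Int)

def sort[T](list: List[T])(compare: (T, T) => Boolean): List[T] =
  list.sortWith(compare)

val countries = List(Country("India", 91), Country("Australia", 61))
sort(countries)((a, b) => a.name < b.name)   // call with two argument lists
countries.sortBy(_.name)                     // stdlib equivalent for the name sort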

Deshbandhu Mishra
@deshbandhumishra
I am solving and explaining the famous 99 Scala problems over voice chat on Skype with 4-5 Scala beginners. You may also join this voice group chat.
Pradeep Sonaimuthu
@pradeepert
I'm getting a character issue when reading a Spark DataFrame from CSV:
'\uFEFF'. As business logic I want to check df.columns.contains("Id"), but even though the Id column is there, the check fails because of this special character.
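A hedged sketch of one workaround: '\uFEFF' is a UTF-8 byte-order mark that gets glued to the first header name, so stripping it from the column names lets the contains check succeed (df is the DataFrame from the message above).

val cleaned = df.toDF(df.columns.map(_.replace("\uFEFF", "")): _*)
cleaned.columns.contains("Id")   // now true when the Id column is present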