AmirSarvestani
@AmirSarvestani
I am new to Scala.
Sabuj Kumar Jena
@sabujkumarjena
@AmirSarvestani Yes, you can. Given val s: Seq[CompanyApplicant], call s.filter(...) with a boolean function over a property of the object.
For example, if you want to filter all company applicants whose names start with the letter S, the query would be s.filter(_.name.startsWith("S")).
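A minimal, self-contained sketch of that advice (CompanyApplicant and its name field are hypothetical):

case class CompanyApplicant(name: String, company: String)

val s = Seq(
  CompanyApplicant("Sabuj", "Acme"),
  CompanyApplicant("Amir", "Acme"),
  CompanyApplicant("Sonia", "Globex")
)

// filter takes a predicate CompanyApplicant => Boolean
val startsWithS = s.filter(_.name.startsWith("S"))
// startsWithS: Seq(CompanyApplicant(Sabuj,Acme), CompanyApplicant(Sonia,Globex))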
Ghost
@ghost~5b8a94aed73408ce4fa690fd

If anybody would like to translate #fpmortals into any Indian language, please contact me. We have begun versions for French, Russian and Singapore

You can keep any profits, but the minimum price must be $0

https://leanpub.com/fpmortals

Sabuj Kumar Jena
@sabujkumarjena
@pradeepert:
def addAll(args: Int*): Unit = args.foreach(add)
Then you can call ints.addAll(1, 2, 3, 4, 5).
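A self-contained sketch of the same idea (the Ints class and its add method are hypothetical stand-ins for pradeepert's collection):

import scala.collection.mutable.ListBuffer

class Ints {
  private val buf = ListBuffer.empty[Int]
  def add(i: Int): Unit = buf += i
  def addAll(args: Int*): Unit = args.foreach(add)   // varargs, so any arity works
  override def toString = buf.mkString("Ints(", ", ", ")")
}

val ints = new Ints
ints.addAll(1, 2, 3, 4, 5)   // Ints(1, 2, 3, 4, 5)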
premgc
@premgc
Hi
I need some help.
In one of my Spark Scala jobs I'm getting this error:
ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2) at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession
I'm creating one RDD and looping through it; inside the loop I call a function, passing in the values.
Here is the pseudo code:

import scala.collection.Map
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{Row, SQLContext, SaveMode}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.HiveContext

object dataingest {

  def main(args: Array[String]) {

    def procIngestBAMData(url: String, username: String, passwd: String,
                          qrySql: String, hivetbl: String, sqlContext: SQLContext): Unit = {
      println("URL : " + url)
      println("username : " + username)
      println("pwd : " + passwd)
      println("sql : " + qrySql)
      println("hivetable : " + hivetbl)
      val driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
      val dfDynSrc = sqlContext.read.format("jdbc")
        .option("url", url.trim)
        .option("user", username.trim)
        .option("password", passwd.trim)
        .option("dbtable", qrySql)
        .option("driver", driver)
        .load()
    }

    val driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    val getDriverDataDF = spark.read
      .format("jdbc")
      .option("url", s_url.trim)
      .option("user", s_username.trim)
      .option("password", s_passwd.trim)
      .option("dbtable", getdriverfileSQL)
      .option("driver", driver)
      .load()

    val driverDatatoRDD = getDriverDataDF.rdd

    if (driverDatatoRDD.count > 1) {
      for (e <- driverDatatoRDD) {      // runs as driverDatatoRDD.foreach, i.e. on the executors
        val p_ActivityName = e(0).asInstanceOf[String]
        val p_TableName = e(1).asInstanceOf[String]
        log_message("procData processing names : " + p_ActivityName + " .../" + p_TableName)

        val strtblName = p_TableName.toLowerCase
        val strActivityName = p_ActivityName.toLowerCase

        println("Activity name to process ====> " + strActivityName)
        println("Table name ====> " + strtblName)

        val sc2 = spark.sqlContext
        procIngestBAMData(p_url, p_username, p_passwd, p_qry2, p_hivetbl, sc2)
      }
    } else {
      log_message("No records found for the given date range " + process_date + ".")
    }
  }
}

I'm not sure why I'm getting this error:
ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2) at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession
when calling the function procIngestBAMData.
Can anyone help me, please?
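A hedged reading of the symptom, not a confirmed diagnosis: for (e <- driverDatatoRDD) desugars to driverDatatoRDD.foreach, so the loop body, including the SQLContext it captures, runs on the executors, and SparkSession/SQLContext can only be used on the driver; that is the classic trigger for sessionState$lzycompute failing inside a task. A sketch of the usual fix, keeping the pseudo code's names, is to bring the (small) driver table to the driver first:

val driverRows = getDriverDataDF.collect()   // Array[Row], local to the driver

if (driverRows.length > 1) {
  for (e <- driverRows) {                    // plain local loop, no task is shipped
    val p_ActivityName = e.getString(0)
    val p_TableName = e.getString(1)
    procIngestBAMData(p_url, p_username, p_passwd, p_qry2, p_hivetbl, spark.sqlContext)
  }
} else {
  log_message("No records found for the given date range " + process_date + ".")
}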
vivekn1986
@vivekn1986
Hello, can anyone help me convert an Integer to a Long in Scala?

scala> val i: Integer = 33
i: Integer = 33

scala> i.asInstanceOf[Long]
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:110)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
at $iwC$$iwC$$iwC.<init>(<console>:42)
at $iwC$$iwC.<init>(<console>:44)
at $iwC.<init>(<console>:46)
at <init>(<console>:48)
at .<init>(<console>:52)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Vanam
@vanamraghu
@vivekn1986, you can use i.toLong
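For the record, the cast fails because asInstanceOf[Long] asks the JVM to cast the boxed java.lang.Integer to java.lang.Long, and the two box classes are unrelated; toLong performs a numeric conversion instead. A minimal REPL sketch:

scala> val i: Integer = 33
i: Integer = 33

scala> i.toLong        // Integer is implicitly unboxed to Int, then widened
res0: Long = 33

scala> i.longValue     // the plain java.lang.Number API also works
res1: Long = 33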
AmirSarvestani
@AmirSarvestani
Hey guys, I have a problem I would appreciate help with. I have a Seq[Income]; Income has attributes "type", "amount" and "frequency".
I want to create an XML node like this:
<Income ${income.`type`}Amount="${income.amount}" ${income.`type`}Frequency="${income.frequency}"/>
So income type is an enum, and each income type has an income frequency.
How would I be able to turn the sequence of Income into one Income node in XML?
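A minimal sketch of one way to do this with scala.xml, building the attribute names from the income type (the Income case class here is a hypothetical stand-in for the real enum-based model):

import scala.xml.{Elem, MetaData, Null, TopScope, UnprefixedAttribute}

case class Income(`type`: String, amount: BigDecimal, frequency: String)

def incomeNode(incomes: Seq[Income]): Elem = {
  // fold the incomes into a chain of attributes: <type>Amount, <type>Frequency
  val attrs = incomes.foldLeft(Null: MetaData) { (md, i) =>
    new UnprefixedAttribute(s"${i.`type`}Frequency", i.frequency,
      new UnprefixedAttribute(s"${i.`type`}Amount", i.amount.toString, md))
  }
  Elem(null, "Income", attrs, TopScope, minimizeEmpty = true)
}

// incomeNode(Seq(Income("Salary", 5000, "Monthly")))
// => <Income SalaryAmount="5000" SalaryFrequency="Monthly"/>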
ehsanshah
@ehsanshah
Hi everyone, I need to use an RPC protocol in Scala, but I don't know which library has good performance. Can anyone guide me?
Thanks.
Abdhesh Kumar
@abdheshkumar
@ehsanshah Use finagle https://twitter.github.io/finagle/
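For reference, roughly the hello-world Service from the Finagle documentation (a Service is just a function Request => Future[Response]):

import com.twitter.finagle.{Http, Service}
import com.twitter.finagle.http.{Request, Response}
import com.twitter.util.{Await, Future}

val service = new Service[Request, Response] {
  def apply(req: Request): Future[Response] = {
    val rep = Response()
    rep.contentString = "hello from finagle"
    Future.value(rep)
  }
}

val server = Http.serve(":8080", service)  // bind and start serving
Await.ready(server)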
Emperor-less
@Emperor-less
hello
who are you
anbutech17
@anbutech17
hello
ramesh004
@ramesh004

Can anyone help me get rid of a Task not serializable exception? I get it even though I implemented the Serializable trait.

val structFieldArr = SchemaFileRDD.map(m => StructField(m.split(Pattern.quote("|"))(1), types(m.split(Pattern.quote("|"))(2)))).collect()
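A hedged sketch of the usual cause: Spark serializes the whole closure, so a reference to a member of the enclosing (non-serializable) object, such as the types lookup here (assumed to be a Map[String, DataType]), drags that object into the task. Copying the member into a local val keeps the closure self-contained:

import java.util.regex.Pattern
import org.apache.spark.sql.types.{DataType, StructField}

val typesLocal: Map[String, DataType] = types   // local copy, not this.types
val sep = Pattern.quote("|")

val structFieldArr = SchemaFileRDD
  .map { m =>
    val cols = m.split(sep)                     // split once instead of twice
    StructField(cols(1), typesLocal(cols(2)))
  }
  .collect()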

Karan Barai
@BaraiKaran
Hi, I am trying to unit test some functions which use Slick to interact with a database. Can anyone explain the right way to do this? Also, if you can share some mocking examples, that would be great. I have looked for examples but didn't find any good solution. Thanks!
Pradeep Sonaimuthu
@pradeepert
I have come across a use case. I want to write a function to which I pass two types; it should return whether the first type is convertible to the second type or not. Any suggestions?
E.g., if I pass Int and String to the function as the input parameters, it should return true.
Karan Barai
@BaraiKaran
Can you elaborate a little bit? Not quite sure what you are trying to achieve. @pradeepert
Pradeep Sonaimuthu
@pradeepert
I want to pack the Avro library dependencies along with my project jar. What do I have to do for this? What changes do I have to make at the sbt level?
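A minimal sketch using the sbt-assembly plugin (version numbers are illustrative):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt: keep avro as a normal compile dependency so it gets bundled
libraryDependencies += "org.apache.avro" % "avro" % "1.8.2"

// then `sbt assembly` builds a fat jar containing all non-"provided" dependencies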
agamjain14
@agamjain14

I have a sort function defined as below:

case class Country(name: String, id: Int)
def sort[T](compare: (T, T) => Boolean): List[T] = {
}

My question is: how do I sort a list of Country based on the name?

And how do I call this function when it has multiple parameter lists?

case class Country(name: String, id: Int)
def sort[T](list: List[T])(compare: (T, T) => Boolean): List[T] = {
}
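A minimal sketch of both questions, with the standard library's sortWith standing in for the body the snippet leaves empty:

case class Country(name: String, id: Int)

def sort[T](list: List[T])(compare: (T, T) => Boolean): List[T] =
  list.sortWith(compare)

val countries = List(Country("India", 1), Country("Brazil", 2))

// the second parameter list takes the comparison function; sorting by name:
val byName = sort(countries)((a, b) => a.name < b.name)
// byName: List(Country(Brazil,2), Country(India,1))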

Deshbandhu Mishra
@deshbandhumishra
I am solving and explaining the famous Ninety-Nine Scala Problems over voice chat on Skype with 4-5 Scala freshers. You may also join this voice group chat.
Pradeep Sonaimuthu
@pradeepert
I'm getting a character issue when reading a Spark dataframe from CSV:
'\uFEFF'. As business logic I want to check df.columns.contains("Id"); even though the Id column is there, the check fails because of this special character.
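'\uFEFF' is the UTF-8 byte-order mark; when a CSV file starts with one, Spark keeps it in the first header name, so the first column is "\uFEFFId" rather than "Id". A minimal sketch of stripping it from all column names:

val cleaned = df.columns.foldLeft(df) { (d, c) =>
  d.withColumnRenamed(c, c.replace("\uFEFF", ""))
}
cleaned.columns.contains("Id")   // true once the BOM is gone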
Pradeep Sonaimuthu
@pradeepert

In Spark I am not able to filter by an existing column:
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'Inv. Pty' given input columns: [ο»ΏPstng Date, Name 1, Inv. Pty, Year]

I am trying to filter by Inv. Pty. This column is already in the dataframe's input file, as you can see inside the [].
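A minimal sketch: Spark parses a dot in an unquoted column name as field access, so a name like Inv. Pty has to be wrapped in backticks (and the ο»Ώ prefix on Pstng Date in that error looks like the same \uFEFF BOM issue from above):

import org.apache.spark.sql.functions.col

// backticks keep "Inv. Pty" as one column name instead of field access on Inv
val filtered = df.filter(col("`Inv. Pty`") === "some value")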

Pradeep Sonaimuthu
@pradeepert

Here I have used dense_rank() as the analytical function. I want to pass this function in as an argument, in case I want to pass some other function tomorrow:

val indexLatName = df.withColumn("LatestNameOrder", dense_rank().over(Window.partitionBy("`" + partBy + "`").orderBy(desc("`" + ordBy + "`"))))

How do I pass the argument if a variable holds the function's name, e.g.
val winFunc = "dense_rank"?
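A minimal sketch: since dense_rank() just returns a Column, you can pass a Column-producing function as the argument and map a name string such as "dense_rank" onto one (withOrderColumn and fns are hypothetical names):

import org.apache.spark.sql.{Column, DataFrame}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

def withOrderColumn(df: DataFrame, partBy: String, ordBy: String,
                    rankFn: () => Column): DataFrame =
  df.withColumn("LatestNameOrder",
    rankFn().over(Window.partitionBy(col(s"`$partBy`")).orderBy(desc(s"`$ordBy`"))))

// resolving the function from a string, as in val winFunc = "dense_rank":
val fns: Map[String, () => Column] = Map(
  "dense_rank" -> (() => dense_rank()),
  "rank"       -> (() => rank()),
  "row_number" -> (() => row_number())
)

val indexLatName = withOrderColumn(df, partBy, ordBy, fns(winFunc))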
Harsh Maheshwari
@harshmah

Hi 👋 everyone, I am new to Scala. I am struggling with an existing piece of Scala code, so I need your help.

We (i.e. my org) are using Gatling for perf testing. We are using the following piece of code:

ScenarioBuilder = scenario(scn).during(config.duration) {
  exec(session => {
    val daysFromNow = nextInt(180) + 1 // avoid daysFromNow = 0 because current date is invalid for some locale half a world away
    val siteAndLang = chooseFrom(siteLangs)
    session.set("siteId", siteAndLang.substring(0, siteAndLang.indexOf("/")))
      .set("locale", siteAndLang.substring(siteAndLang.indexOf("/") + 1))
      .set("hotelId", hotelIds)
      .set("startDate", computeDateString(daysFromNow))
      .set("endDate", computeDateString(daysFromNow + lengthOfStay(nextInt(lengthOfStay.length))))
      .set("regionId", regionIds)
      .set("adults", chooseFrom(adultCounts))
  }).exec(addCookie(Cookie("DUAID", DUAID)))
    .exec(http(scn)
      .get(s"$endpoint/$path")
      .header("Trace-ID", session => getTraceID())
      .signatureCalculator(new CommonSignatureCalculator(DUAID))
      .check(status is 200)
      .check(validator))
}

------------------------------------------------------------

This is what we are passing as validator:

private def validator = (jsonPath("$.hotelId").exists)

The API which this script is hitting started sending compressed data, and then the script gets the following error:

19:44:50 14:14:50.084 [gatling-http-thread-1-3] ERROR io.gatling.http.response.StringResponseBody$ - Response body is not valid UTF-8 bytes

My question: how do I unzip the gzip response before passing it to the validator?
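A heavily hedged sketch (Gatling 3.x): Gatling's HTTP client normally gunzips responses automatically when the server sets Content-Encoding: gzip, so this error usually means that header is missing. If the raw bytes really are gzip, one option is to check bodyBytes and decompress them yourself before looking for the field (gunzip is a hypothetical helper; the exact check-DSL chaining may need adjusting):

import java.io.ByteArrayInputStream
import java.util.zip.GZIPInputStream
import scala.io.Source

// decode a raw gzip payload into a UTF-8 string
def gunzip(bytes: Array[Byte]): String =
  Source.fromInputStream(
    new GZIPInputStream(new ByteArrayInputStream(bytes)), "UTF-8"
  ).mkString

// instead of jsonPath on the compressed body, check the decoded string:
.check(
  bodyBytes
    .transform(bytes => gunzip(bytes).contains("hotelId"))
    .is(true)
)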

Harsh Maheshwari
@harshmah
Here is the correct code snippet, which is cleaner:
import java.time.LocalDate
import java.time.format.DateTimeFormatter._
import java.util.UUID

import config.RunConfiguration
import io.gatling.core.Predef._
import io.gatling.core.structure.ScenarioBuilder
import io.gatling.http.Predef._
import io.gatling.http.check.HttpCheck
import signature.CommonSignatureCalculator

import scala.io.Source
import scala.util.Random._

def buildScenario(
    endpoint: String,
    path: String,
    scn: String,
    validator: HttpCheck
): ScenarioBuilder =
  scenario(scn).during(config.duration) {
    exec(session => {
      val daysFromNow = nextInt(180) + 1 //avoid daysFromNow = 0 because current date is invalid for some locale half a world away
      val siteAndLang = chooseFrom(siteLangs)
      session
        .set("siteId", siteAndLang.substring(0, siteAndLang.indexOf("/")))
        .set("locale", siteAndLang.substring(siteAndLang.indexOf("/") + 1))
        .set("htlId", hotelIds)
        .set("startDate", computeDateString(daysFromNow))
        .set(
          "endDate",
          computeDateString(
            daysFromNow + lengthOfStay(nextInt(lengthOfStay.length))
          )
        )
        .set("regionId", regionIds)
        .set("adults", chooseFrom(adultCounts))
    }).exec(addCookie(Cookie("DUAID", DUAID)))
      .exec(
        http(scn)
          .get(s"$endpoint/$path")
          .header("Trace-ID", session => getTraceID())
          .signatureCalculator(new CommonSignatureCalculator(DUAID))
          .check(status is 200)
          .check(validator)
      )
  }

private def validator = (jsonPath("$.hotelId").exists)

buildScenario(
  "a_get_end_point",
  "some/path",   // hypothetical placeholder: the original call was missing the path argument
  "some scenario",
  validator
)
sachin407
@sachin407
Hello developers, I'm here to join the room.
Deshbandhu Mishra
@deshbandhumishra
Hi @sachin407, welcome to this Scala room.
Archit Kapoor
@archit47
Hey guys!
It's almost the end of 2019. Just wanted to catch up with y'all: who is still using Scala here in India?
How have things been at your end? Have you tried building a community of Scala developers outside of this channel?
Deshbandhu Mishra
@deshbandhumishra
Hi all,
Can we have a regular group discussion over voice call using Skype/GoToMeeting etc.? Please suggest.
I have created a group to discuss Scala over Skype calls. You may also join: https://join.skype.com/dLdTEMogQwrq