Rob Norris
@tpolecat
:+1: indeed
idk-kid
@idk-kid

I have a DataFrame with a nested schema.
I want to traverse it and get some information out.
I was writing this function:

def func(struct: StructType) = {
  struct match {
    case StructType(arr) =>
      arr.map { fields =>
        fields match {
          case StructField(field, dataType, _, _) =>
            dataType match {
              case StructType(_) => func(dataType)
              case ArrayType(structTyp: StructType, _) => func(structTyp)
              case _ => .....
            }
        }
      }
  }
}

In the line case StructType(_) => func(dataType), why is dataType of type DataType by default?
I tried annotating it as StructType, but it said it cannot upcast.

Can anybody help?

Luis Miguel Mejía Suárez
@BalmungSan
Try with case struct @ StructType(_) => func(struct)
idk-kid
@idk-kid
So, how does this help? I've never used that before.
Luis Miguel Mejía Suárez
@BalmungSan
foo @ Bar means: match the Bar pattern, and if it matches, bind the matched value to foo.
In this case it helps because struct is then of type StructType, which is what you need.
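
A minimal sketch of that fix, assuming Spark's org.apache.spark.sql.types; the println is just a placeholder for whatever information the traversal is meant to collect:

import org.apache.spark.sql.types.{ArrayType, StructField, StructType}

// Binding the matched StructType to a name (s @ StructType(_)) gives the
// recursive call a StructType instead of the wider DataType it would
// otherwise have from the enclosing StructField pattern.
def func(struct: StructType): Unit =
  struct.fields.foreach {
    case StructField(_, s @ StructType(_), _, _)           => func(s)
    case StructField(_, ArrayType(s: StructType, _), _, _) => func(s)
    case StructField(name, dataType, _, _)                 => println(s"$name: $dataType")
  }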
Eric K Richardson
@ekrich

I am really confused by this piece of code and Scala 3 braceless format:

object Foo:
  def joe(): List[(Int, Int)] =
    List((2, 3), (3, 4)).filter { case (a, b) => b > a }

If I remove the {} I get a parser error. If I then wrap the case onto the next line and indent it, it doesn't like that either. I can replace the {} with (), but then it doesn't accept the case; if I also remove the case it seems fine. So I can't wrap the lambda, with or without the case, onto the next line. It feels like () is required, but is that going to work with a multi-line lambda?
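
For reference, a sketch of the variants described above as they behave on recent Scala 3 (the colon-argument form is standard from 3.3; on earlier 3.x it sat behind -language:experimental.fewerBraces); the joe1/joe2/joe3 names are only for illustration:

object Foo:
  // braces still work for a pattern-matching lambda
  def joe1(): List[(Int, Int)] =
    List((2, 3), (3, 4)).filter { case (a, b) => b > a }

  // parentheses only work without case, via parameter untupling
  def joe2(): List[(Int, Int)] =
    List((2, 3), (3, 4)).filter((a, b) => b > a)

  // the indented colon-argument form accepts a multi-line case lambda
  def joe3(): List[(Int, Int)] =
    List((2, 3), (3, 4)).filter:
      case (a, b) => b > a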

Dan Sokolsky
@dansok
Hi, how do I do this properly?
val logicalPlans: Seq[LogicalPlan] = parsedQuery.collect {
  case logicalPlan >: org.apache.spark.sql.catalyst.plans.logical => logicalPlan
}
I.e., I want to check whether the logicalPlan is a subtype of org.apache.spark.sql.catalyst.plans.logical
Rob Norris
@tpolecat
case logicalPlan: org.apache.spark.sql.catalyst.plans.logical => ...
Dan Sokolsky
@dansok
doesn't like it
Rob Norris
@tpolecat
What does it say?
idk-kid
@idk-kid
@BalmungSan it still complains
Luis Miguel Mejía Suárez
@BalmungSan
Really? What is the error this time?
Rob Norris
@tpolecat
Are you sure org.apache.spark.sql.catalyst.plans.logical is a type? The capitalization is suspicious.
Dan Sokolsky
@dansok
IntelliJ doesn't recognize logical. It highlights it in red and gives you one of three options along the lines of "create case class logical"
although the import statement import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan does in fact import LogicalPlan
Rob Norris
@tpolecat
Are you sure you don't want case logicalPlan: LogicalPlan => ...?
Dan Sokolsky
@dansok
I am
Rob Norris
@tpolecat
Why don't you want that?
Dan Sokolsky
@dansok
I want to filter all logical statements and make sure they are all SELECTs
(of type org.apache.spark.sql.catalyst.plans.logical.Project)
Rob Norris
@tpolecat
Ok then case logicalPlan: Project => ...
And your return type can be Seq[Project]
idk-kid
@idk-kid
@BalmungSan it's the same complaint
Dan Sokolsky
@dansok
but I want to throw if there is, say, an insert
so I want to get all logical statements, not just the selects
so I need to filter for org.apache.spark.sql.catalyst.plans.logical.*, not Project
Rob Norris
@tpolecat
Oh I see. You want to filter for anything in that package?
You can't do that.
You would need to check for each type, or for their common supertype if they have one.
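
A sketch along those lines, assuming parsedQuery is the Catalyst LogicalPlan produced by the SQL parser: LogicalPlan itself is the common supertype of the nodes in that package, and InsertIntoStatement (the type that shows up in the JSON below) is one concrete type to reject:

import org.apache.spark.sql.catalyst.plans.logical.{InsertIntoStatement, LogicalPlan, Project}

// Collect every node of the parsed plan; LogicalPlan is their common supertype.
val logicalPlans: Seq[LogicalPlan] = parsedQuery.collect { case plan: LogicalPlan => plan }

// Throw on statements we do not want, e.g. an INSERT.
if (logicalPlans.exists(_.isInstanceOf[InsertIntoStatement]))
  throw new IllegalArgumentException("only SELECT statements are supported")

// The SELECTs themselves are the Project nodes.
val projects: Seq[Project] = logicalPlans.collect { case p: Project => p }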
Dan Sokolsky
@dansok
alright plan b -- convert to JSON and parse strings 🤘
yolo
"class" : "org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement"
I was trying to be a good citizen
Luis Miguel Mejía Suárez
@BalmungSan
I am sure my attempt to reproduce the code has many differences; try playing with the code until you get the same error.
Another possibility is that the error you are seeing is not real, like it is just an IDE thing that a real compiler will not produce?
Dan Sokolsky
@dansok
how do I get only the match from a regex?
I'm getting a hit, but it's returning the entire line

so,

val regex: Regex = "\"class\":\"org.apache.spark.sql.catalyst.plans.logical.*\"".r
println(regex.findAllMatchIn(queryString).toList)

gives back

List("class":"org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement","num-children":1,"table":[{"class":"org.apache.spark.sql.catalyst.analysis.UnresolvedRelation","num-children":0,"multipartIdentifier":"[table_1]"}],"partitionSpec":null,"query":0,"overwrite":false,"ifPartitionNotExists":false},{"class":"org.apache.spark.sql.catalyst.analysis.UnresolvedInlineTable","num-children":0,"names":"[col1]","rows")

where I am only looking for

"class":"org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement"

to be returned

Derek Wickern
@dwickern
For my Play plugin to work with any Play version [2.8.0,2.9), should the plugin depend on 2.8.0? Or should it depend on a range like [2.8.0,2.9.0[ or 2.8.+? Does it matter?
Martijn
@martijnhoekstra:matrix.org [m]
@dansok: values in Scala have types. findAllMatchIn returns a value of type Iterator[Match]. You can then use those matches to get what you want, but it's not clear to me exactly what you want. Is it the matched text? Then you can use findAllIn, which returns the matched strings rather than Match objects,
or select the matched member on the Match object, which will give you the matched text.
@dwickern: you can just depend on 2.8.0, and everything should work as expected.
you can also do the range, or at least, I've read that before, but I've never seen anyone do that in the wild
Derek Wickern
@dwickern
yeah, I haven't seen anyone use version ranges either
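
In sbt terms, a sketch of the two options being discussed; the group/artifact here is Play core purely as an illustration, so adjust it to whatever module the plugin actually needs:

// Usual approach: pin the lowest supported version; a downstream build that
// already uses a newer 2.8.x will evict this one.
libraryDependencies += "com.typesafe.play" %% "play" % "2.8.0"

// Ivy-style range, also accepted, but rarely seen in the wild:
// libraryDependencies += "com.typesafe.play" %% "play" % "[2.8.0,2.9.0["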
som-snytt
@som-snytt

@dansok your regex is greedy

scala> val r = raw"""(".*")""".r
val r: scala.util.matching.Regex = (".*")

scala> r.findFirstIn(""""one" and "two"""")
val res0: Option[String] = Some("one" and "two")

scala> val r = raw"""("[^"]*")""".r
val r: scala.util.matching.Regex = ("[^"]*")

scala> r.findFirstIn(""""one" and "two"""")
val res1: Option[String] = Some("one")

Nobody can tell, but Dotty is my default snippet REPL now. Also, I notice that using raw for a regex spells raw.r.
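
Putting the two suggestions together for the earlier snippet: a negated character class instead of the greedy .* keeps the match inside the quoted value (with the dots escaped, since . is a regex wildcard), and findAllIn returns the matched strings directly. queryString is the same value as above:

import scala.util.matching.Regex

val regex: Regex =
  raw""""class":"org\.apache\.spark\.sql\.catalyst\.plans\.logical\.[^"]*"""".r

// prints only the "class":"...InsertIntoStatement" fragment, not the rest of the line
println(regex.findAllIn(queryString).toList)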