#!/usr/local/bin/amm
import $ivy.`com.lihaoyi::requests:0.1.5`
val r = requests.post("http://httpbin.org/post", data = Map("key" -> "value"))
println(r.statusCode)
println(r.headers("content-type"))
println(r.text)
$ ./HttpClient-requests.scala
Exception in thread "main" java.lang.ClassNotFoundException: $file.`HttpClient-requests`$$routes$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at ammonite.interp.SpecialClassLoader.findClass(ClassLoaders.scala:113)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
Hi, I think this is not directly related to requests-scala, and I did find a simple way to do it; let me explain.
Today, for a given environment qualifEnv (let us say baseUrl: https://qualif.gatway), I write something like:

val qualifEnvSession = requests.Session(/* some specific config for qualif */)
qualifEnvSession.get(s"$baseUrl/resource", ...)

It would be cool to have a scoped requests instance allowing qualifEnvSession.get("resource", ...), which takes care of adding the baseUrl.
I did not find a simple way to do that; if you have any ideas, thank you.
With geny.Readable and geny.Writable, plus a requests instance scoped by environment, we could have a smoother, less error-prone tool for data migration between environments.
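A possible sketch of such a wrapper on top of the existing Session API (the EnvClient name and the config are made up for illustration; this is not a requests-scala feature):

// Hypothetical helper: a Session scoped to one environment that prefixes
// every path with that environment's base URL.
class EnvClient(baseUrl: String, session: requests.Session = requests.Session()) {
  def get(path: String, params: Iterable[(String, String)] = Nil): requests.Response =
    session.get(s"$baseUrl/$path", params = params)
  def post(path: String, data: String): requests.Response =
    session.post(s"$baseUrl/$path", data = data)
}

val qualifEnv = new EnvClient("https://qualif.gatway")
// qualifEnv.get("resource")   // resolves to https://qualif.gatway/resource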
Hi,
I feel that the way this library is going forces me to use var more than it should.
When reading a stream from a GET I have to use a var to store the StreamHeaders value, and then only read from the stream if the response is SC_OK, like this:
var responseHeaders: Option[StreamHeaders] = None
requests.get.stream(
  "https://some.url",
  check = false,
  onHeadersReceived = streamHeaders => {
    responseHeaders = Some(streamHeaders)
    if (streamHeaders.statusCode == HttpServletResponse.SC_OK) {
      // stuff
    } else {
      // other stuff
    }
  }
).readBytesThrough { inputStream =>
  if (responseHeaders.exists(_.statusCode == HttpServletResponse.SC_OK)) {
    // read stream
  } else {
    // don't
  }
}
It would be great if one could do this instead (access the streamHeaders via the Readable):
val readable = requests.get.stream(
  "https://some.url",
  check = false,
  onHeadersReceived = streamHeaders => {
    if (streamHeaders.statusCode == HttpServletResponse.SC_OK) {
      // stuff
    } else {
      // other stuff
    }
  }
)
if (readable.streamHeaders.statusCode == HttpServletResponse.SC_OK) {
  readable.readBytesThrough { inputStream =>
    // read stream
  }
}
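In the meantime, a possible workaround under the current API is to hide the mutable slot inside a small helper, so call sites receive the headers and the InputStream together (withStream is a made-up name; the stream parameters are the ones used above):

import java.util.concurrent.atomic.AtomicReference
import requests.StreamHeaders

// Sketch only: by the time readBytesThrough hands over the body,
// onHeadersReceived has already fired, so the reference is populated.
def withStream[T](url: String)(f: (StreamHeaders, java.io.InputStream) => T): T = {
  val captured = new AtomicReference[StreamHeaders]()
  requests.get.stream(url, check = false, onHeadersReceived = h => captured.set(h))
    .readBytesThrough { inputStream =>
      f(captured.get, inputStream)
    }
}

// withStream("https://some.url") { (headers, in) =>
//   if (headers.statusCode == HttpServletResponse.SC_OK) { /* read in */ } else { /* skip */ }
// }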
I've been working with requests-scala in an sbt 1.3.7 / Scala 2.13.1 app with uTest without any problems, as per:
"com.lihaoyi" %% "requests" % "0.5.0",
However, when I run an Ammonite script mimicking the successful uTest with Ammonite 1.6.3 (Scala 2.12.8 on macOS 10.15.3):
import $ivy.`com.lihaoyi::requests:0.5.0`
import $ivy.`com.lihaoyi::pprint:0.5.6`, pprint._
// requires fsm-interface.FsmInterface to be up and running
val host = "http://localhost:8080"
val resp = requests.get(s"$host/echo")
pprintln(resp.statusCode)
pprintln(resp.text())
I get:
Compiling /Users/psc/Documents/Development/Scala/fsm/interface/src/main/sc/fsm-interface.sc
fsm-interface.sc:6: Symbol 'type geny.ByteData' is missing from the classpath.
This symbol is required by 'class requests.Response'.
Make sure that type ByteData is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'Response.class' was compiled against an incompatible version of geny.
val resp = requests.get(s"$host/echo")
^
fsm-interface.sc:8: value text is not a member of requests.Response
val res_5 = pprintln(resp.text())
^
Compilation Failed
(The pprint import works fine and can be used.)
scala> val r = requests.get("https://www.freebsd.org/fsdkfjsdkfd")
scala> println(r.toString)
<console>:12: error: not found: value r
       println(r.toString)
Passing check = false will give the old behavior.
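For reference, a minimal sketch of that non-throwing style against the same URL (non-2xx statuses then have to be checked by hand):

// With check = false the request no longer throws on a non-2xx status;
// inspect statusCode yourself instead.
val r = requests.get("https://www.freebsd.org/fsdkfjsdkfd", check = false)
if (r.statusCode == 404) println("not found")
else println(r.text())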
Hello! Any idea if I'm missing an import? I'm getting this error just by running a command from requests-scala's README in the amm REPL, without any additional imports.
@ requests.post("http://localhost:9000", data = ujson.Obj("hello" -> "world"))
cmd1.sc:1: overloaded method value apply with alternatives:
(r: requests.Request,data: requests.RequestBlob)requests.Response <and>
(url: String,auth: requests.RequestAuth,params: Iterable[(String, String)],headers: Iterable[(String, String)],data: requests.RequestBlob,readTimeout: Int,connectTimeout: Int,proxy: (String, Int),cookies: Map[String,java.net.HttpCookie],cookieValues: Map[String,String],maxRedirects: Int,verifySslCerts: Boolean,autoDecompress: Boolean,compress: requests.Compress,keepAlive: Boolean)requests.Response
cannot be applied to (String, data: ujson.Obj)
val res1 = requests.post("http://localhost:9000", data = ujson.Obj("hello" -> "world"))
^
Compilation Failed
I tried importing the library explicitly, with similar results:
import $ivy.{`com.lihaoyi::requests:0.6.5`, `com.lihaoyi::ujson:0.9.5`}
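One workaround that sidesteps the implicit conversion entirely, assuming the goal is simply to get the JSON onto the wire, is to render the ujson.Obj to a String first (this does not explain the overload error itself):

// The String form is always accepted as the data argument, so no
// ujson-to-RequestBlob implicit is required.
val body = ujson.write(ujson.Obj("hello" -> "world"))
requests.post("http://localhost:9000", data = body)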
I am trying to import and use requests
in a Databricks notebook.
I am getting this error:
command-1361810570373404:40: error: object post is not a member of package requests
requests.post(
Also, if I try to import the package with import com.lihaoyi.requests._, it is not found:
command-497441197922137:2: error: object lihaoyi is not a member of package com
import com.lihaoyi.requests._
I have downloaded the jar from Maven: https://mvnrepository.com/artifact/com.lihaoyi/requests_2.13/0.5.2
Databricks runtime 7.5 (includes Apache Spark 3.0.1, Scala 2.12).
Any idea how to properly import and use the library in Databricks?
Cheers.
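Two things stand out here, hedged since I cannot test on that cluster: the library's top-level package is plain requests (not com.lihaoyi.requests), and a Scala 2.12 runtime needs the requests_2.12 artifact rather than requests_2.13. A minimal sketch, assuming the coordinate com.lihaoyi:requests_2.12:0.5.2 is attached to the cluster:

// No wildcard import is needed: the API lives directly in the `requests` package.
val r = requests.get("https://httpbin.org/get")
println(r.statusCode)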
>>> payload = {'key1': 'value1', 'key2': ['value2', 'value3']}
>>> r = requests.get('https://httpbin.org/get', params=payload)
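The closest requests-scala equivalent I can see, given that params is an Iterable[(String, String)], is to repeat the key rather than pass a list value; this is a sketch, not a statement that the library supports list-valued params:

// Each pair becomes one query-string entry:
// https://httpbin.org/get?key1=value1&key2=value2&key2=value3
val r = requests.get(
  "https://httpbin.org/get",
  params = Seq("key1" -> "value1", "key2" -> "value2", "key2" -> "value3")
)
println(r.url)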
Given that I send gzip using:
def sendString(txt: String): Unit = {
  requests.post(
    "http://localhost:8000/test",
    compress = requests.Compress.Gzip,
    data = txt
  )
}
How do I read the stream on the server?
If I send gzipped data as an Array[Byte] I can use:

override def handle(t: HttpExchange) {
  val is = new GZIPInputStream(t.getRequestBody)
  ...
}

and just stream it into a StringBuilder or something, but that doesn't work here.
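For comparison, a sketch of the server side with com.sun.net.httpserver that decompresses the request body into a String; it assumes the body really does arrive gzip-compressed (worth verifying with a raw dump of the request), and the handler name is made up:

import com.sun.net.httpserver.{HttpExchange, HttpHandler}
import java.util.zip.GZIPInputStream
import scala.io.Source

// Decompress the gzipped request body and read it as UTF-8 text.
class GzipTextHandler extends HttpHandler {
  override def handle(t: HttpExchange): Unit = {
    val is = new GZIPInputStream(t.getRequestBody)
    val body = Source.fromInputStream(is, "UTF-8").mkString
    is.close()
    // ... use `body` here, then acknowledge without a response body
    t.sendResponseHeaders(200, -1)
    t.close()
  }
}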