    Michael Butler
    and yeah, maxHandles is misleading. That's just for curl connections, not the response body handles, which are handled somewhere else.
    Philip Ardery
    Do you think there is some way to get your approach to work in __destruct, or do you think it's impossible given the logic? Also, I should take the time to THANK YOU immensely for engaging with me on this. Sadly, the Guzzle devs don't seem interested.
    Michael Butler
    if I have a chance I can try using __destruct again for the cleanup instead of manually calling it. should be provable in the gist
    and yeah, no problem.
    Philip Ardery
    Many thanks, Michael. I will continue to watch this. I think a __destruct solution would be ideal because then it would happen behind the scenes without the implementing user having to worry about it. But if it's impossible, your approach is nice.
    Michael Butler
    Hi all, new to Guzzle and to PHP in general. I am working on creating GET/POST requests using sendAsync. I am confused about how to add headers and a body to the request. `$headers = array('Api-Timestamp' => $timeStamp, 'Api-Key' => $this->apiKey, 'Accept' => 'application/json');
    $body = $content;
    $options = array('headers' => $headers, 'body' => $body);
    // $request = new \GuzzleHttp\Psr7\Request('GET', $uri, $options);
    $client = new \GuzzleHttp\Client();
        // Send an asynchronous request.
        $request = new \GuzzleHttp\Psr7\Request($method, $uri, $options);
        $promise = $client->sendAsync($request)->then(function ($response) {
            echo 'I completed! ' . $response->getBody();
        });` I am not getting the right response with the headers, but once I remove the headers I get the response back. Is there something I am missing? Your help will be much appreciated.
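A sketch of what the constructor expects may help here: GuzzleHttp\Psr7\Request does not take a Guzzle options array as its third argument; it takes a headers array and a body as separate arguments (options arrays only apply to Client methods such as send()/sendAsync()). Assuming the same $timeStamp, $apiKey, $content, $method, and $uri as in the question:

```php
<?php
use GuzzleHttp\Client;
use GuzzleHttp\Psr7\Request;

// Psr7\Request signature: ($method, $uri, array $headers = [], $body = null)
$headers = [
    'Api-Timestamp' => $timeStamp,
    'Api-Key'       => $apiKey,
    'Accept'        => 'application/json',
];
$request = new Request($method, $uri, $headers, $content);

$client  = new Client();
$promise = $client->sendAsync($request)->then(function ($response) {
    echo 'I completed! ' . $response->getBody();
});
$promise->wait(); // block until the async request completes
```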
    adira gasada
    How do I install it on Windows 10?
    Morning folks. I'm having difficulty getting my head around Pool requests, and how I can catch their exceptions. I put a thread on Stack Overflow, but it hasn't drawn much attention and I don't have enough points to raise a bounty. Could someone please have a look and let me know their thoughts? https://stackoverflow.com/questions/56183592/handling-exceptions-on-a-pool-of-requests
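For anyone landing here later: with Guzzle 6 a Pool delivers per-request failures to its 'rejected' callback rather than throwing them, so that is where exceptions can be caught. A minimal sketch (the URLs are placeholders):

```php
<?php
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;
use GuzzleHttp\Exception\RequestException;

$client = new Client();

// Placeholder requests for illustration.
$requests = function () {
    yield new Request('GET', 'http://example.com/a');
    yield new Request('GET', 'http://example.com/b');
};

$pool = new Pool($client, $requests(), [
    'concurrency' => 5,
    'fulfilled' => function ($response, $index) {
        // Called on each success.
    },
    'rejected' => function ($reason, $index) {
        // Failures arrive here instead of bubbling up as exceptions.
        if ($reason instanceof RequestException) {
            echo "Request #$index failed: " . $reason->getMessage() . PHP_EOL;
        }
    },
]);
$pool->promise()->wait();
```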
    Hayden Young

    Hi. I am attempting to use Guzzle to consume this service: https://github.com/orbitdb/orbit-db-http-api.

    However, when passing stream = true to the client get method, I receive the error:

    <GET https://localhost:3000/db/my-feed/events/load> [CONNECT] 
    <GET https://localhost:3000/db/my-feed/events/load> [AUTH_RESULT] severity: "2" message: "HTTP/1.0 403 Forbidden
    " message_code: "403" 
    <GET https://localhost:3000/db/my-feed/events/load> [MIME_TYPE_IS] message: "text/plain" 
    <GET https://localhost:3000/db/my-feed/events/load> [PROGRESS] 
    PHP Fatal error:  Uncaught GuzzleHttp\Exception\ClientException: Client error: `GET https://localhost:3000/db/my-feed/events/load` resulted in a `403 Forbidden` response in /home/html/orbitdb/vendor/guzzlehttp/guzzle/src/Exception/RequestException.php:113

    However, if I remove stream true, it looks fine (debug=true enabled on request):

    *   Trying
    * TCP_NODELAY set
    * Connected to localhost ( port 3000 (#0)
    * ALPN, offering h2
    * ALPN, offering http/1.1
    * successfully set certificate verify locations:
    *   CAfile: /etc/ssl/certs/ca-certificates.crt
      CApath: /etc/ssl/certs
    * SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256
    * ALPN, server accepted to use h2
    * Server certificate:
    *  subject: C=AU; ST=A; L=B; O=Organization; OU=OrganizationUnit; CN=localhost; emailAddress=demo@example.com
    *  start date: May 25 14:56:35 2019 GMT
    *  expire date: Oct  6 14:56:35 2020 GMT
    *  common name: localhost (matched)
    *  issuer: C=AU; ST=A; L=B; O=Internet Widgits Pty Ltd; CN=Local Certificate
    *  SSL certificate verify ok.
    * Using HTTP2, server supports multi-use
    * Connection state changed (HTTP/2 confirmed)
    * Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
    * Using Stream ID: 1 (easy handle 0x558b8007c8a0)
    > GET /db/zdpuAoZQznotzijaUCrgUEQkPVCH64RCmGHPXUnRPX5EQn3L1%2Fmy-feed2/events/replicate.progress HTTP/2
    Host: localhost:3000
    Accept: text/event-stream
    Cache-Control: no-cache
    User-Agent: GuzzleHttp/6.3.3 curl/7.58.0 PHP/7.2.17-0ubuntu0.18.04.1
    * Connection state changed (MAX_CONCURRENT_STREAMS updated)!
    < HTTP/2 200 
    < content-type: text/event-stream; charset=utf-8
    < content-encoding: identity
    < cache-control: no-cache
    < vary: accept-encoding
    < date: Sat, 25 May 2019 15:10:38 GMT

    But without stream true I can't access the network stream. I'm using a self-signed cert but have generated a local CA cert which I have added to the local CA store. curl and Guzzle seem to be able to verify the cert without problems (I'm not disabling verification using flags like curl -k).

    ^ Seems like 'stream' => true forces HTTP/1.1, which the server does not accept. Where would I look for that piece of code, I wonder?
    Hayden Young
    @phillmac seems like there is an option stream_context which can be passed to the request. Wondering if this can be used to force http/2 by the client? https://guzzle.readthedocs.io/en/latest/faq.html?highlight=stream_context#how-can-i-add-custom-stream-context-options
    It looks to me like the http context has a protocol_version that accepts either 1.0 or 1.1
    It might accept 2? But I cannot find any mention in the docs.
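A sketch of both knobs, assuming Guzzle 6: PHP's http stream wrapper only documents protocol_version values of 1.0 and 1.1, so the stream handler most likely cannot speak HTTP/2. The default curl handler, however, can be asked for HTTP/2 via a raw curl option (provided curl was built with HTTP/2 support):

```php
<?php
use GuzzleHttp\Client;

$client = new Client();
$url    = 'https://localhost:3000/db/my-feed/events/load';

// Stream handler: protocol_version accepts 1.0 or 1.1 only;
// HTTP/2 is not supported by PHP's stream wrapper.
$response = $client->get($url, [
    'stream' => true,
    'stream_context' => [
        'http' => ['protocol_version' => 1.1],
    ],
]);

// Curl handler (no 'stream' => true): request HTTP/2 directly.
$response = $client->get($url, [
    'curl' => [CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_2_0],
]);
```

Note the trade-off: the second variant keeps HTTP/2 but loses the PHP stream wrapper, so whether it fits depends on why 'stream' => true was needed in the first place.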
    Kieron Wiltshire
    Guys, Guzzle doesn't actually work? It doesn't send form_params.
    Hi. I have problems with Guzzle 6.3.3. I’m getting segfaults when I’m trying to make a request.
    (new Client())->post('https://example.com');
    I’m getting: [1] 19546 segmentation fault php artisan test.
    Running PHP 7.2.19 on MacOS Mojave.
    I’ve already googled, but did not find anything related to my case.
    Hi there, is it possible to use Guzzle without a scheme? My use case is this: I am running several PHP microservices on Kubernetes. The k8s internal DNS reveals FQDNs such as 'service.namespace.cluster.local'. When I try to point Guzzle at that FQDN without a scheme (none provided by DNS), it keeps the first letter of the domain and strips out the rest.
    For example, REQ: 'users-service.users.cluster.local/auth' results in 'uauth'.
    @martinmike2 Are you trying to make just plain HTTP requests? Or naked TCP connections? What are you trying to achieve by omitting the scheme?
    Márk Sági-Kazár
    Sounds weird to me. Which version of Guzzle do you use? Do you manually provide the URL? Can you try using plain cURL to see the answer? (BTW I think the correct URL is service.namespace.svc.cluster.local)
    Hi there
    Zerox Millienium
    Is it possible to construct raw HTTP requests using Guzzle so they can be used to build OData $batch requests?
    @yelena870513 I have the same problem: cURL error 0: The cURL request was retried 3 times and did not succeed. The most likely reason for the failure is that cURL was unable to rewind the body of the request and subsequent retries resulted in the same error. Turn on the debug option to see what went wrong. See https://bugs.php.net/bug.php?id=47204 for more information. (see http://curl.haxx.se/libcurl/c/libcurl-errors.html). Can somebody help me with this issue? env: (GuzzleHttp/6.3.3 PHP/5.6.23, Red Hat)
    Abishek R Srikaanth
    Could anybody advise on how I can convert this to use Guzzle? https://gist.github.com/abishekrsrikaanth/49ddecc1df2bedf576e0f53e66b54bf9
    Delf Tonder
    Ahoi! Does anyone know/understand/can explain why Guzzle turns HTTP errors (the 400 range) into exceptions?
    Delf Tonder
    Ah, found that I can set 'http_errors' => false to bypass this behavior :)
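For anyone else hitting this, a minimal sketch of the option (the URL is a placeholder): with 'http_errors' => false, 4xx/5xx responses are returned instead of raising ClientException/ServerException, so you inspect the status yourself.

```php
<?php
use GuzzleHttp\Client;

$client = new Client();

// No exception on 4xx/5xx; the response object is returned as-is.
$response = $client->get('http://example.com/missing', [
    'http_errors' => false,
]);

if ($response->getStatusCode() >= 400) {
    // Handle the error response yourself.
    echo 'Got status ' . $response->getStatusCode() . PHP_EOL;
}
```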

    any idea why we are getting this error after the December updates:
    [28-Dec-2019 17:57:20 UTC] PHP Fatal error: Uncaught Error: Call to undefined function GuzzleHttp\_idn_uri_convert() in wp-content/plugins/q4vrplugin/vendor/guzzlehttp/guzzle/src/Client.php:220
    Stack trace:
    #0 guzzlehttp/guzzle/src/Client.php(155): GuzzleHttp\Client->buildUri(Object(GuzzleHttp\Psr7\Uri), Array)
    #1 guzzlehttp/guzzle/src/Client.php(183): GuzzleHttp\Client->requestAsync('GET', 'units', Array)
    #2 plugin/Models/ApiClient.php(97): GuzzleHttp\Client->request('GET', 'units', Array)

    Is this a place I can ask for help with Guzzle?
    PHP Fatal error: Uncaught Error: Call to undefined method Guzzle\Http\Message\EntityEnclosingRequest::getStatusCode()
    I was looking for some help with the above error. My code looks like this:
    $response = $client->post($path, $arr);
    $code = $response->getStatusCode();
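The Guzzle\Http namespace in that error suggests Guzzle 3, where post() builds a request object rather than sending it, so getStatusCode() is being called on the request, not a response. A sketch under that assumption (argument order per the Guzzle 3 signature, where headers come before the body):

```php
<?php
// Guzzle 3 style: post() returns an EntityEnclosingRequest;
// send() performs the request and returns the response.
$request  = $client->post($path, null, $arr);
$response = $request->send();
$code     = $response->getStatusCode();
```

In Guzzle 5+ the client's post() does return a response directly, so upgrading would also make the original two lines work.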
    I know this place is called "Gitter" but I think "Grizzle" might be a better name...
    It's not really a forum if I'm the only one here...
    Hi guys! Any idea why Guzzle hangs forever when I upload a big (255 MB) file? If I decrease the size of the file, I get the server's response about the upload; with the bigger file, Guzzle hangs forever. When I use curl, everything is perfect with the same request.
    @sagikazarmark Are you around? :)
    Hey, and thanks for Guzzle ;) I thought I'd ask here before creating an issue. Probably there is an easy way to fix this. Someone reported issues with adding a feed. The app uses feed-io to parse RSS feeds, and feed-io uses Guzzle. There is a method isModified to tell whether a feed is modified or not. Internally it checks the status code and whether the body size is greater than 0. And now Guzzle comes into the game. A request using StreamHandler to http://rss.kicker.de/news/2bundesliga will return a valid body size greater than 0. The same request using CurlHandler returns 0. Here is a MWE: https://gist.github.com/kesselb/34a522300f9e1ad772241b4cda66bb85 Any ideas? ;)
    @kesselb I think this is the intended behaviour for Guzzle, at least. If the RSS feed does not implement some way of requesting feed items after a specific date, then isModified will always be true, because the RSS feed will always have content.
    I'd recommend instead checking whether getLastModified() has changed since the last request.
    In any case this isn't a bug in Guzzle. PM me if you want to discuss further.
    Seems like Guzzle uses fstat() to determine the size of the stream, but the handle doesn't report a size because it isn't a file handle. I'm not sure if that's something that needs to be fixed.
    Do you want me to report that as an issue? getSize should return null if the body size is unknown, according to the doc block.
    Yeah sure why not
    Mauro Spivak
    Hey there! I'm working with a Pool of requests. When a request is successfully fulfilled, I want to be able to know the URL of the request so I can (say) print a message, but all I get is a response object and an index.
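One common workaround, sketched below with placeholder URLs: since the fulfilled callback's $index corresponds to the position in the request iterator, keeping the URLs in a plain indexed array lets you map the index back to its URL:

```php
<?php
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Placeholder URL list; the Pool index follows this order.
$urls = [
    'http://example.com/a',
    'http://example.com/b',
];

$requests = function () use ($urls) {
    foreach ($urls as $url) {
        yield new Request('GET', $url);
    }
};

$pool = new Pool($client, $requests(), [
    'fulfilled' => function ($response, $index) use ($urls) {
        // $index matches the position in $urls.
        echo 'Fetched ' . $urls[$index] . PHP_EOL;
    },
]);
$pool->promise()->wait();
```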
    Does anyone happen to know of a code example or way I could run Guzzle as a service, i.e. essentially have it check against, say, Redis for new URLs to fetch? Currently it stops running once all promises are fulfilled in the setup I have. I mainly need it for retrying timed-out requests with modified request options (the latter part being the problem), i.e. changing the proxy or a header, so having it daemonized would help in sending requests back into the pool to scrape.
    What is the fastest way to send 100,000 HTTP requests from PHP and retrieve data from them? I need to get numerical values from a large list of profiles via a REST API with JWT auth and aggregate the data.

    @oleksandr-roskovynskyi fastest way isn't by using Guzzle, as great as Guzzle actually is.. If absolute maximum speed efficiency is desired, you're best off using an approach utilizing fsockopen and pcntl_fork or similar. Or simply use curl-multi (which Guzzle does too) directly. There's quite a few different approaches to these things, and I think stackoverflow is the best place to find the niche solutions you might look for. It also depends on whether or not your target server is expecting your multitude of connections.. if so, then you can just go nuts and open up 1000 threads. If not, then do keep in mind that webmasters often optimise their setups based on average traffic and your sudden flood of requests could cause problems for them.

    If you're set on using Guzzle, the Pool/Batch mode is what you're looking for. Keep in mind though that the result set will grow in memory until all results have been returned. If you want async return, look into using iterators and each_limit.. @alexeyshockov has an awesome code example for that here: https://github.com/alexeyshockov/guzzle-dynamic-pool .. It creates the requests sequentially though, so it's not optimised for speed, but it's still fairly fast once it has opened all the threads it's allowed to.
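A minimal sketch of that iterator approach, assuming Guzzle 6 and its bundled promise library: each_limit() consumes a generator lazily, so the generator can keep yielding new promises (e.g. URLs popped from Redis) for as long as work exists. next_url() below is a hypothetical stand-in for that source:

```php
<?php
use GuzzleHttp\Client;
use function GuzzleHttp\Promise\each_limit;

$client = new Client();

// Lazily yield one promise per pending URL. next_url() is a
// hypothetical function that returns the next URL to fetch,
// or null when the queue (e.g. a Redis list) is empty.
$promises = function () use ($client) {
    while (($url = next_url()) !== null) {
        yield $client->getAsync($url);
    }
};

// At most 25 requests in flight at once; responses are discarded
// here, but each_limit() also accepts an onFulfilled callback.
each_limit($promises(), 25)->wait();
```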

    I'm a bit confused about PSR-7 and why adding Guzzle to my project suggests the following: guzzlehttp/psr7 suggests installing zendframework/zend-httphandlerrunner (Emit PSR-7 responses)
    The other suggestion, about the intl extension, at least explained what the impact was: performance. But I'm not sure how Guzzle would be different with or without that other Zend thing.
    It got more confusing from there because I looked at their GitHub page, https://github.com/zendframework/zend-httphandlerrunner, and it says it has moved to https://github.com/laminas/laminas-httphandlerrunner
    But I don't know if that newer project is compatible with Guzzle or not, or if installing either of those would help Guzzle in any way.