byounes
@byounes
I get your point, I'll try the git release then.
Thanks Jeskew for your time.
Jonathan Eskew
@jeskew
Np happy to help
byounes
@byounes
regarding the Ubuntu package, it has no autoload file, I searched for "auto", and got zero hits
Jonathan Eskew
@jeskew
hmmm well then they might expect you to call require_once for every class in the SDK
We don’t maintain the Ubuntu package, so I’m not really sure how it’s supposed to be used
Zxurian
@Zxurian
Is there a way to stage multiple transfers to s3, or fork/queue the upload process so the script isn't waiting for a transfer to complete before moving onto the next section of the code?
Witold Cyrek
@wcyrek-comrise
Yes!
Zxurian
@Zxurian
ex. I have multiple separate directories that need to be uploaded that I can loop on, however I'd rather not wait for the transfer to complete before moving onto the next iteration.
@wcyrek-comrise excellent, do you have a link referencing that feature, or API?
Witold Cyrek
@wcyrek-comrise
I mean, I am sure there is a way, but you probably have to put it in your script using some sort of concurrency library.
I do not think there is a built-in way.
Let me just make sure by rummaging through the docs, since there might be an async way of doing that
Witold Cyrek
@wcyrek-comrise
@Zxurian https://aws.amazon.com/articles/1904 suggests using multi-threading to increase throughput
which would make me think that it is not built in, but like I said you can create multiple threads to process the requests concurrently
Zxurian
@Zxurian
okay, so there's no built-in method with the aws-sdk-php library to fork/queue uploads?
Witold Cyrek
@wcyrek-comrise
That appears to be the case. But you can roll one yourself, or do a little more digging and find one. I found one in Python; someone no doubt did something similar in PHP.
Zxurian
@Zxurian
thanks. and just to double check, is there another library of aws-sdk besides php that would support pre-generating a list of transfers, and then just pass it to an sdk api and it would take care of uploading everything?
Witold Cyrek
@wcyrek-comrise
I don't think so. The guy I found used the Python SDK, and he still had to write his own multithreaded code: https://gist.github.com/jeffaudi/f62a57f11a594e41d11d
The only built-in concurrency I've seen is for uploading big files where the source is a single file
Zxurian
@Zxurian
thanks, I'll look around and see what's available. I'd rather create a single thread that can parse a list of downloads and let it run, rather than having the active code have to loop over it
Jonathan Eskew
@jeskew
@Zxurian have you taken a look at Aws\S3\MultipartUploader?
It has configurable concurrency (managed by curl multi, not threads) and can return a promise if you want the upload to resolve in the background
Jonathan Eskew
@jeskew
There's a similarly asynchronous way to upload directories as well
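To make the suggestion above concrete, here is a minimal sketch of using `Aws\S3\MultipartUploader` with a promise so the upload resolves in the background. The region, bucket, key, and file path are placeholders, and error handling is kept to a bare minimum:

```php
<?php
// Sketch only: assumes the AWS SDK for PHP v3 is installed via Composer
// and credentials are available from the environment.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;

$s3 = new S3Client([
    'region'  => 'us-east-1',   // placeholder
    'version' => 'latest',
]);

$uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
    'bucket'      => 'my-bucket',        // placeholder
    'key'         => 'backups/file.zip', // placeholder
    'concurrency' => 5,                  // parts sent concurrently via curl_multi, not threads
]);

// promise() starts the upload and returns immediately, so the script
// can continue doing other work while parts transfer.
$promise = $uploader->promise();
$promise->then(
    function ($result) { echo "Upload complete: {$result['ObjectURL']}\n"; },
    function ($reason) { echo "Upload failed: {$reason->getMessage()}\n"; }
);

// ... other work here ...

$promise->wait(); // block only when the result is finally needed
```

For whole directories, `Aws\S3\Transfer` exposes a similar `promise()` method and a `concurrency` option.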
Zxurian
@Zxurian
@jeskew awesome, that's exactly what I was looking for. Thanks for the info
YohaanYoon
@YohaanYoon
Using the AWS SDK for PHP, I am trying to send email asynchronously through AWS SES on my local machine (Mac, Apache)
While $promise->wait() works well, $promise->then() doesn't work
Do you have any ideas to fix it? Such as requirement issues, environment issues, and so on.
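A likely explanation, assuming standard Guzzle promise semantics (this is an editor's sketch, not a confirmed diagnosis of the question above): registering a `then()` callback does not by itself drive the HTTP transfer; the callback only fires once something resolves the promise, e.g. a call to `wait()`. Parameter values below are placeholders:

```php
<?php
// Sketch only: assumes the AWS SDK for PHP v3 and valid SES credentials.
require 'vendor/autoload.php';

use Aws\Ses\SesClient;

$ses = new SesClient(['region' => 'us-east-1', 'version' => 'latest']);

$promise = $ses->sendEmailAsync([
    'Source'      => 'sender@example.com',      // placeholder
    'Destination' => ['ToAddresses' => ['recipient@example.com']],
    'Message'     => [
        'Subject' => ['Data' => 'Hello'],
        'Body'    => ['Text' => ['Data' => 'Hi there']],
    ],
]);

// Registering the callback does not send the request by itself...
$promise->then(function ($result) {
    echo 'Sent: ' . $result['MessageId'] . PHP_EOL;
});

// ...the transfer is actually performed (and then() fires) when the
// promise is resolved, for example by calling wait():
$promise->wait();
```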
Robert Casanova
@robertcasanova
Hi guys, sorry to disturb you. I'm having a problem uploading an image to S3. I'm using Sonata Media Bundle inside a PHP project with a multi-container Docker configuration, and I'm getting this error:
cURL resource: Resource id 54; cURL error: Could not resolve host: s3-eu-west-1.amazonaws.com (cURL error code 6). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.
does somebody have the same problem or some suggestion for this?
Jonathan Eskew
@jeskew
@YohaanYoon could you open an issue on GitHub with a code sample?
@robertcasanova that looks like an outbound networking issue. Are you sure the docker container hosting the app can contact the internet?
Robert Casanova
@robertcasanova
@jeskew thx for your answer. I tried rebooting and then it works; it happens randomly. Probably, as you said, it's something with Docker. Thx.
mikemherron
@mikemherron
Hi - I've got a quick question about the automatic handling of ProvisionedThroughputExceededExceptions when writing items to DynamoDB. My understanding is that the SDK handles these automatically, but on my test server I can see them being thrown from the SDK. I'm intentionally trying to test this behaviour, so I've provisioned a table with 1 Unit of Read Capacity and 1 Unit of Write Capacity, and have a script that will definitely exceed this. I suspect I'm reaching some internal retry limit, but I've not been able to find documentation on this - can anyone point me to whereabouts in the code the retry logic is handled so I can confirm?
Jonathan Eskew
@jeskew
Dynamo calls are retried 11 times by default
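For reference, the retry count can be overridden via the `retries` option of the SDK v3 client constructor. A minimal sketch (region is a placeholder, and the value shown is the DynamoDB default mentioned above):

```php
<?php
// Sketch only: assumes the AWS SDK for PHP v3 is installed via Composer.
require 'vendor/autoload.php';

use Aws\DynamoDb\DynamoDbClient;

$dynamoDb = new DynamoDbClient([
    'region'  => 'us-west-2',   // placeholder
    'version' => 'latest',
    'retries' => 11,            // raise or lower the retry ceiling as needed
]);
```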
mikemherron
@mikemherron
That's great, thank you for the pointer.
mikemherron
@mikemherron
@jeskew Another quick question if you have time? So I added the below line to RetryMiddleware.php:

$command['@http']['delay'] = $delay($retries++);
\Log::info("Retrying with delay:" . $command['@http']['delay']);
return $handler($command, $request)->then($g, $g);

and observed the below output:
[2015-11-05 16:28:48] test.INFO: Retrying with delay:0
[2015-11-05 16:28:48] test.INFO: Retrying with delay:0.05
[2015-11-05 16:28:48] test.INFO: Retrying with delay:0.1
[2015-11-05 16:28:48] test.INFO: Retrying with delay:0.2
[2015-11-05 16:28:48] test.INFO: Retrying with delay:0.4
[2015-11-05 16:28:48] test.INFO: Retrying with delay:0.8
[2015-11-05 16:28:48] test.INFO: Retrying with delay:1.6
[2015-11-05 16:28:48] test.INFO: Retrying with delay:3.2
[2015-11-05 16:28:48] test.INFO: Retrying with delay:6.4
[2015-11-05 16:28:48] test.INFO: Retrying with delay:12.8
[2015-11-05 16:28:48] test.INFO: Retrying with delay:25.6
I assume the delays are in ms?
Jonathan Eskew
@jeskew
I believe they're in seconds, though I'll have to check to be sure
The requests have an exponentially increasing delay between them
So that throttled requests will try again after the throttle has been lifted
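The logged values above are consistent with a base delay of 0.05 that doubles on each retry. A hypothetical reconstruction of that schedule (the base and doubling are inferred from the log, and the real RetryMiddleware may differ between SDK versions, e.g. by adding jitter):

```php
<?php
// Hypothetical reconstruction of the delay schedule seen in the log above.
function exponentialDelay(int $retries): float
{
    return $retries === 0 ? 0.0 : 0.05 * pow(2, $retries - 1);
}

for ($n = 0; $n <= 10; $n++) {
    echo "Retrying with delay:" . exponentialDelay($n) . PHP_EOL;
}
// Prints 0, 0.05, 0.1, 0.2, ... up to 25.6, matching the logged values.
```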
mikemherron
@mikemherron
Ok thanks - from the timestamps recorded in my logs, it looks like the retries are all being logged at the same second.
In fact, thinking about it now, a retry of "0.05" ms probably doesn't make sense :)