Jonathan Eskew
@jeskew
You can download a phar file from GitHub and require that
It will include all dependencies
They would be on the
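A minimal sketch of the phar approach, assuming an SDK v3 phar downloaded from the GitHub releases page (the path and region below are placeholders):

<?php
// Requiring the phar registers the SDK's autoloader, so all SDK
// classes and their bundled dependencies become available.
require '/path/to/aws.phar';

$s3 = new Aws\S3\S3Client([
    'region'  => 'eu-west-1',
    'version' => 'latest',
]);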
byounes
@byounes
does that mean that the Ubuntu package is not very useful? It has the following dependencies:
php5-common (>= 5.3.3), php-pear (>= 1.4.0), php-guzzle (>= 3.0.3), php5-curl
So Guzzle is there as well; I can see it under /usr/share/php/Guzzle
Jonathan Eskew
@jeskew
I'm not familiar with the Ubuntu package but that would be an uncommon way to install a PHP library
Does it have autoloader instructions?
byounes
@byounes
I get your point, I'll try the GitHub release then.
Thanks, jeskew, for your time.
Jonathan Eskew
@jeskew
Np happy to help
byounes
@byounes
regarding the Ubuntu package, it has no autoload file; I searched for "auto" and got zero hits
"auto"
Jonathan Eskew
@jeskew
hmmm, well then they might expect you to call require_once for every class in the SDK
we don't maintain the Ubuntu package, so I'm not really sure how it's supposed to be used
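Without an autoloader, usage would look something like the sketch below (the class path is hypothetical; this is exactly why the phar or Composer route is preferable):

<?php
// One require_once per class the script touches, which is
// impractical for a library the size of the SDK.
require_once '/usr/share/php/Aws/S3/S3Client.php';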
Zxurian
@Zxurian
Is there a way to stage multiple transfers to S3, or fork/queue the upload process so the script isn't waiting for a transfer to complete before moving on to the next section of the code?
Witold Cyrek
@wcyrek-comrise
Yes!
Zxurian
@Zxurian
e.g. I have multiple separate directories that need to be uploaded that I can loop on, but I'd rather not wait for each transfer to complete before moving on to the next iteration.
@wcyrek-comrise excellent, do you have a link referencing that feature, or API?
Witold Cyrek
@wcyrek-comrise
I mean, I am sure there is a way, but you probably have to put it in your script using some sort of concurrency library.
I do not think there is a built-in way.
Let me just make sure by rummaging through the docs, since there might be an async way of doing that
Witold Cyrek
@wcyrek-comrise
@Zxurian https://aws.amazon.com/articles/1904 suggests using multi-threading to increase throughput
which would make me think that it is not built in, but like I said you can create multiple threads to process the requests concurrently
Zxurian
@Zxurian
okay, so there's no built-in method with the aws-sdk-php library to fork/queue uploads?
Witold Cyrek
@wcyrek-comrise
That appears to be the case. But you can roll one yourself, or do a little more digging and find one. I found one in Python; someone no doubt did something similar somewhere in PHP.
Zxurian
@Zxurian
thanks. And just to double-check, is there another AWS SDK besides PHP that supports pre-generating a list of transfers and then just passing it to an SDK API that takes care of uploading everything?
Witold Cyrek
@wcyrek-comrise
I don't think so; the guy I found used the Python SDK, and he still had to write his own multithreaded code: https://gist.github.com/jeffaudi/f62a57f11a594e41d11d
The only concurrency that I've seen built in is for uploading big files where the source is a single file
Zxurian
@Zxurian
thanks, I'll look around and see what's available. I'd rather create a single thread that can parse a list of downloads and let it run, rather than having the active code have to loop over it
Jonathan Eskew
@jeskew
@Zxurian have you taken a look at Aws\S3\MultipartUploader?
It has configurable concurrency (managed by curl_multi, not threads) and can return a promise if you want the upload to resolve in the background
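A minimal sketch of the promise-based usage, assuming SDK v3 (bucket, key, and paths are placeholders):

<?php
require 'vendor/autoload.php';

use Aws\S3\MultipartUploader;
use Aws\S3\S3Client;

$s3 = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

// 'concurrency' controls how many parts are in flight at once;
// the transfer is multiplexed over curl_multi, not threads.
$uploader = new MultipartUploader($s3, '/path/to/large-file.bin', [
    'bucket'      => 'my-bucket',
    'key'         => 'large-file.bin',
    'concurrency' => 5,
]);

// promise() starts the upload without blocking, so other work can
// run while the parts transfer.
$promise = $uploader->promise();

// ... other work ...

$result = $promise->wait(); // block only when the result is required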
Jonathan Eskew
@jeskew
There's a similarly asynchronous way to upload directories as well
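This presumably refers to Aws\S3\Transfer, which uploads whole directories and also exposes a promise; a minimal sketch, reusing the $s3 client above (bucket and paths are placeholders):

<?php
use Aws\S3\Transfer;

// Recursively uploads /local/dir to s3://my-bucket/backup;
// 'concurrency' caps the number of parallel requests, and nothing
// blocks until wait() is called on the promise.
$transfer = new Transfer($s3, '/local/dir', 's3://my-bucket/backup', [
    'concurrency' => 10,
]);
$transfer->promise()->wait();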
Zxurian
@Zxurian
@jeskew awesome, that's exactly what I was looking for. Thanks for the info
YohaanYoon
@YohaanYoon
Using the AWS SDK for PHP, I am trying to send email asynchronously through AWS SES on my local machine (Mac, Apache)
While $promise->wait() works well, $promise->then() doesn't work
Do you have any ideas to fix it? Such as requirement issues, environment issues, and so on.
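For what it's worth, with the Guzzle promises the SDK uses, then() only registers callbacks; plain PHP has no background event loop, so the callbacks fire only when something resolves the promise, such as a wait() call. A minimal sketch, assuming SDK v3 and placeholder addresses:

<?php
use Aws\Ses\SesClient;

$ses = new SesClient([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

$promise = $ses->sendEmailAsync([
    'Source'      => 'sender@example.com',
    'Destination' => ['ToAddresses' => ['recipient@example.com']],
    'Message'     => [
        'Subject' => ['Data' => 'Hello'],
        'Body'    => ['Text' => ['Data' => 'Sent asynchronously.']],
    ],
]);

// then() queues callbacks but does not itself send the request.
$promise->then(
    function ($result) { error_log('Sent: ' . $result['MessageId']); },
    function ($reason) { error_log('Failed: ' . $reason->getMessage()); }
);

// Something must drive the transfer for the callbacks to run:
$promise->wait();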
Robert Casanova
@robertcasanova
Hi guys, sorry to disturb you. I'm having a problem uploading images to S3. I'm using Sonata Media Bundle inside a PHP project with a multi-container Docker configuration. I'm getting this error:
cURL resource: Resource id 54; cURL error: Could not resolve host: s3-eu-west-1.amazonaws.com (cURL error code 6). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.
Does somebody have the same problem or a suggestion for this?
Jonathan Eskew
@jeskew
@YohaanYoon could you open an issue on GitHub with a code sample?
@robertcasanova that looks like an outbound networking issue. Are you sure the Docker container hosting the app can reach the internet?
Robert Casanova
@robertcasanova
@jeskew thx for your answer. I tried rebooting and then it worked; it happens randomly. Probably, as you said, it's something with Docker. Thx.
mikemherron
@mikemherron
Hi - I've got a quick question about the automatic handling of ProvisionedThroughputExceededExceptions when writing items to DynamoDB. My understanding is that the SDK handles these automatically, but on my test server I can see them being thrown from the SDK. I'm intentionally trying to test this behaviour, so I've provisioned a table with 1 unit of read capacity and 1 unit of write capacity, and I have a script that will definitely exceed this. I suspect I'm hitting some internal retry limit, but I've not been able to find documentation on it. Can anyone point me to whereabouts in the code the retry logic is handled so I can confirm?
Jonathan Eskew
@jeskew
Dynamo calls are retried 11 times by default
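The retry logic lives in Aws\RetryMiddleware (mentioned below), and the cap is configurable per client via the 'retries' option; a minimal sketch, with an illustrative region and value:

<?php
use Aws\DynamoDb\DynamoDbClient;

$dynamo = new DynamoDbClient([
    'region'  => 'us-east-1',
    'version' => 'latest',
    'retries' => 5, // lower or raise the automatic retry cap
]);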
mikemherron
@mikemherron
That's great, thank you for the pointer.
mikemherron
@mikemherron
@jeskew Another quick question if you have time? So I added the \Log::info line below to RetryMiddleware.php:

$command['@http']['delay'] = $delay($retries++);
\Log::info("Retrying with delay: " . $command['@http']['delay']);
return $handler($command, $request)->then($g, $g);

and observed the below output: