byounes
@byounes
I've tried adding:
require 'Aws/Common/Aws.php';
use Aws\S3\S3Client;
but it doesn't work
Fatal error: Class 'Guzzle\Service\Builder\ServiceBuilder' not found in /usr/share/php/Aws/Common/Aws.php on line 27
Jonathan Eskew
@jeskew
I don't think a system-wide aptitude install is what you want with a PHP library.
You can download a phar file from GitHub and require that
It will include all dependencies
They would be on the
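A minimal sketch of the phar approach Jonathan describes (the phar filename, path, and region below are assumptions; check the SDK's GitHub releases page for the actual asset name):

```php
<?php
// Assumes aws.phar was downloaded from the SDK's GitHub releases page
// into the same directory as this script; adjust the path as needed.
require __DIR__ . '/aws.phar';

use Aws\S3\S3Client;

// The phar registers its own autoloader, so SDK classes (and bundled
// dependencies such as Guzzle) resolve without further require calls.
// Exact instantiation depends on your SDK version; factory() is the
// v2-era style that matches the Aws\Common paths in the error above.
$s3 = S3Client::factory([
    'region' => 'eu-west-1',  // placeholder region
]);
```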
byounes
@byounes
does that mean the Ubuntu package is not very useful? It has the following dependencies:
php5-common (>= 5.3.3), php-pear (>= 1.4.0), php-guzzle (>= 3.0.3), php5-curl
So Guzzle is there too, I can see it under /usr/share/php/Guzzle
Jonathan Eskew
@jeskew
I'm not familiar with the Ubuntu package but that would be an uncommon way to install a PHP library
Does it have autoloader instructions?
byounes
@byounes
I get your point, I'll try the git release then.
Thanks Jeskew for your time.
Jonathan Eskew
@jeskew
Np happy to help
byounes
@byounes
regarding the Ubuntu package, it has no autoload file; I searched for "auto" and got zero hits
Jonathan Eskew
@jeskew
hmmm well then they might expect you to call require_once for every class in the SDK
we don’t maintain the Ubuntu package, so I’m not really sure how it’s supposed to be used
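For context, the installation method the SDK team does document is Composer, which generates exactly the autoloader the Ubuntu package lacks (package name as published on Packagist):

```shell
# Install the SDK into your project; Composer resolves Guzzle and the
# other dependencies and writes vendor/autoload.php for you.
composer require aws/aws-sdk-php

# Then a single require wires up autoloading for every SDK class:
#   require __DIR__ . '/vendor/autoload.php';
```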
Zxurian
@Zxurian
Is there a way to stage multiple transfers to s3, or fork/queue the upload process so the script isn't waiting for a transfer to complete before moving onto the next section of the code?
Witold Cyrek
@wcyrek-comrise
Yes!
Zxurian
@Zxurian
ex. I have multiple separate directories that need to be uploaded that I can loop on, however I'd rather not wait for the transfer to complete before moving onto the next iteration.
@wcyrek-comrise excellent, do you have a link referencing that feature, or API?
Witold Cyrek
@wcyrek-comrise
I mean I am sure there is a way, but you'd probably have to put it in your script using some sort of concurrency library.
I do not think there is a built-in way.
Let me just make sure by rummaging through the docs, since there might be an async way of doing that
Witold Cyrek
@wcyrek-comrise
@Zxurian https://aws.amazon.com/articles/1904 suggests using multi-threading to increase throughput
which would make me think it is not built in, but like I said you can create multiple threads to process the requests concurrently
Zxurian
@Zxurian
okay, so there's no built-in method with the aws-sdk-php library to fork/queue uploads?
Witold Cyrek
@wcyrek-comrise
That appears to be the case. But you can roll one yourself, or do a little more digging and find one. I found one in Python; someone no doubt did something similar in PHP.
Zxurian
@Zxurian
thanks. and just to double check, is there another library of aws-sdk besides php that would support pre-generating a list of transfers, and then just pass it to an sdk api and it would take care of uploading everything?
Witold Cyrek
@wcyrek-comrise
I don't think so; the guy I found used the Python SDK, and he still had to write his own multithreaded code: https://gist.github.com/jeffaudi/f62a57f11a594e41d11d
The only built-in concurrency I've seen is for uploading big files, where the source is a single file
Zxurian
@Zxurian
thanks, I'll look around and see what's available. I'd rather create a single thread that can parse a list of downloads and let it run, rather than having the active code have to loop over it
Jonathan Eskew
@jeskew
@Zxurian have you taken a look at Aws\S3\MultipartUploader?
It has configurable concurrency (managed by curl multi, not threads) and can return a promise if you want the upload to resolve in the background
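A sketch of the promise-based flow Jonathan mentions, using the aws-sdk-php v3 `Aws\S3\MultipartUploader` API (region, bucket, key, and file path are placeholders):

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;

$s3 = new S3Client([
    'region'  => 'eu-west-1',  // placeholder region
    'version' => 'latest',
]);

$uploader = new MultipartUploader($s3, '/path/to/large-file.zip', [
    'bucket'      => 'my-bucket',       // placeholder bucket
    'key'         => 'large-file.zip',  // placeholder key
    'concurrency' => 5,  // parts uploaded in parallel via curl_multi
]);

// promise() starts the upload without blocking; wait() resolves it later,
// so other work can proceed while parts transfer in the background.
$promise = $uploader->promise();
// ... do other work here ...
$result = $promise->wait();
```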
Jonathan Eskew
@jeskew
There's a similarly asynchronous way to upload directories as well
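The directory equivalent Jonathan alludes to is `Aws\S3\Transfer`, which also exposes a promise. A sketch for Zxurian's multiple-directories case (paths and bucket are placeholders; the promise-combining helper name varies across guzzlehttp/promises versions):

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Transfer;

$s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);

$promises = [];
foreach (['/data/dir-a', '/data/dir-b'] as $dir) {
    // Each Transfer uploads one local directory to the bucket concurrently,
    // instead of blocking the loop on each directory in turn.
    $transfer = new Transfer($s3, $dir, 's3://my-bucket/' . basename($dir), [
        'concurrency' => 10,
    ]);
    $promises[] = $transfer->promise();
}

// Resolve all directory uploads together.
\GuzzleHttp\Promise\all($promises)->wait();
```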
Zxurian
@Zxurian
@jeskew awesome, that's exactly what I was looking for. Thanks for the info
YohaanYoon
@YohaanYoon
Using AWS SDK PHP, I am trying to send email asynchronously through AWS SES in my local machine (mac, apache)
While $promise->wait() works well, $promise->then doesn't work
Do you have any ideas to fix it? Such as requirement issues, environment issues, and so on.
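One common cause worth checking (an assumption about the setup, not a confirmed diagnosis): Guzzle promises have no background event loop, so a `then()` callback only fires once something drives the promise, such as a `wait()`. A sketch, with the email parameters elided:

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Aws\Ses\SesClient;

$ses = new SesClient(['region' => 'us-east-1', 'version' => 'latest']);

$promise = $ses->sendEmailAsync([
    /* ... Source, Destination, Message ... */
]);

$promise->then(
    function ($result) { echo "sent: {$result['MessageId']}\n"; },
    function ($reason) { echo "failed: {$reason->getMessage()}\n"; }
);

// Without this (or some other call that runs Guzzle's task queue),
// the script can exit before the then() callbacks ever execute.
$promise->wait();
```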
Robert Casanova
@robertcasanova
Hi guys, sorry to disturb you. I'm having a problem with uploading image to S3. I'm using Sonata Media Bundle inside a PHP project with a multi docker container configuration. I'm having this error
cURL resource: Resource id 54; cURL error: Could not resolve host: s3-eu-west-1.amazonaws.com (cURL error code 6). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.
does somebody have the same problem or some suggestion for this?
Jonathan Eskew
@jeskew
@YohaanYoon could you open an issue on GitHub with a code sample?
@robertcasanova that looks like an outbound networking issue. Are you sure the docker container hosting the app can contact the internet?
Robert Casanova
@robertcasanova
@jeskew thx for your answer. I tried rebooting and then it works; it happens randomly. Probably, as you said, it's something with Docker. Thx.
mikemherron
@mikemherron
Hi - I've got a quick question about the automatic handling of ProvisionedThroughputExceededExceptions when writing items to DynamoDB. My understanding is that the SDK handles these automatically, but on my test server I can see they are being thrown from the SDK. I'm intentionally trying to test this behaviour, so I've provisioned a table with 1 unit of read capacity and 1 unit of write capacity and have a script that will definitely exceed this. I suspect I'm hitting some internal retry limit, but I've not been able to find documentation on this - can anyone point me to whereabouts in the code the retry logic is handled so I can confirm?
Jonathan Eskew
@jeskew
DynamoDB calls are retried 11 times by default
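The retry count is also configurable per client via the v3 `retries` client option, which makes a throttling test like mikemherron's reproducible (region is a placeholder):

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Aws\DynamoDb\DynamoDbClient;

$dynamo = new DynamoDbClient([
    'region'  => 'us-east-1',  // placeholder region
    'version' => 'latest',
    // Set retries to 0 to see ProvisionedThroughputExceededException
    // surface immediately instead of being retried away by the SDK.
    'retries' => 0,
]);
```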