Daniel Hensby
@dhensby
thanks so much @jeskew testReturnCallback is what I needed
Daniel Hensby
@dhensby
tests passing now :)
just in time for me to go home
Miguel Vieira
@apoca
I want to change my local files to S3, is it possible?
like files uploaded by users of the platform
Miguel Vieira
@apoca
Is it possible in the latest version, AWS SDK V3, to upload files directly from a POST?
I can't find documentation about it..
Jonathan Eskew
@jeskew
@apoca I’m not sure what you mean
Do you mean from $_FILES?
Miguel Vieira
@apoca
@jeskew oh thanks, I already solved it :)
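For context, a minimal sketch of one way to send a form-uploaded file ($_FILES) to S3 with the v3 SDK; the region, bucket name, and form field name below are placeholder assumptions, not taken from this conversation:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Region and bucket are placeholders; adjust for your account.
$s3 = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

// 'userfile' is a hypothetical <input type="file" name="userfile"> field.
$upload = $_FILES['userfile'];

$result = $s3->putObject([
    'Bucket'     => 'example-bucket',
    'Key'        => 'uploads/' . basename($upload['name']),
    'SourceFile' => $upload['tmp_name'],
]);

echo $result['ObjectURL'] . "\n";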
Zxurian
@Zxurian
so I'm looking at the Command Pool examples, is it possible to add commands to an existing command pool that has already started processing?
Zxurian
@Zxurian
i.e., build a command pool, then afterwards have a separate loop that feeds commands into it, to be processed according to the concurrency rules of the command pool (so that the loop can continue while the promises execute asynchronously), and then just wait until the command pool is empty?
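A rough sketch of one way to approximate that: CommandPool accepts any iterator of commands, so a generator can keep yielding new commands while the pool drains them at the configured concurrency. The bucket and keys below are placeholders, not from this conversation:
<?php
require 'vendor/autoload.php';

use Aws\CommandPool;
use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-west-2', 'version' => 'latest']);

// The pool pulls commands lazily, so this generator keeps "feeding" it
// while earlier commands are still in flight.
function commandSource(S3Client $s3)
{
    foreach (range(1, 100) as $i) {
        yield $s3->getCommand('HeadObject', [
            'Bucket' => 'example-bucket',   // placeholder
            'Key'    => "object-{$i}",      // placeholder
        ]);
    }
}

$pool = new CommandPool($s3, commandSource($s3), [
    'concurrency' => 5,
    'fulfilled'   => function ($result, $i) { echo "done {$i}\n"; },
    'rejected'    => function ($reason, $i) { echo "failed {$i}\n"; },
]);

// Blocks until the generator is exhausted and all in-flight commands settle.
$pool->promise()->wait();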
richardudovich
@richardudovich
hey all... can I please ask an AWS PowerShell CodeDeploy-related question here?
this command creates a deployment and pushes to s3 in a zip file... >>>> aws deploy push --application-name shop.auf.com.au --description "Test code deploy" --ignore-hidden-files --s3-location s3://code-deploy-test-30112015/shop.auf.com.au.zip --source .
the equivalent in powershell seems to be >>>> New-CDDeployment -ApplicationName shop.auf.com.au -S3Location_Bucket code-deploy-test-30112015 -S3Location_BundleType zip -DeploymentGroupName shop.auf.com.au -Description "Shop code deploy" -S3Location_Key shop.auf.com.au.zip  -Revision_RevisionType S3  -DeploymentConfigName CodeDeployDefault.OneAtATime
however... the PowerShell one doesn't seem to offer an option for the source...
and even though it doesn't complain... there is no updated .zip on S3
Jonathan Eskew
@jeskew
@richardudovich I can really only answer questions about the PHP SDK… Are you using the Powershell tools or the AWS CLI?
richardudovich
@richardudovich
well, the AWS CLI works fine
the PowerShell tools don't seem to have the source option
does that mean... I gotta zip up the source myself and push it to S3?
Jonathan Eskew
@jeskew
I think your best bet for finding support on this would be the AWS Powershell tools forum
richardudovich
@richardudovich
do you know if there's a room for that?
and anyway, I'm not asking anything PowerShell-related anymore
Jonathan Eskew
@jeskew
I don’t think they have a gitter room
But they are very responsive on the forums
Patrik Patie Gmitter
@patie
hi, I'm trying to do a synchronous command pool, but I get: Catchable Fatal Error: Argument 1 passed to Aws\Handler\GuzzleV5\GuzzleHandler::Aws\Handler\GuzzleV5\{closure}() must be an instance of Exception, string given, called in ......../vendor/guzzlehttp/promises/src/Promise.php on line 199 and defined
any idea? :)
I'm using Guzzle 5.3, which is listed as supported on Packagist
Patrik Patie Gmitter
@patie
OK, with Guzzle 6.1 there is no problem.
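For reference, a small sketch of a synchronous pool using CommandPool::batch(), which sends the commands concurrently but blocks until all of them have settled; the bucket and keys are placeholders:
<?php
require 'vendor/autoload.php';

use Aws\CommandPool;
use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-west-2', 'version' => 'latest']);

$commands = [
    $s3->getCommand('HeadObject', ['Bucket' => 'example-bucket', 'Key' => 'a.txt']),
    $s3->getCommand('HeadObject', ['Bucket' => 'example-bucket', 'Key' => 'b.txt']),
];

// batch() executes the commands with the given concurrency and returns an
// array of results; entries for failed commands are exception objects.
$results = CommandPool::batch($s3, $commands, ['concurrency' => 2]);

foreach ($results as $i => $result) {
    echo $i . ': ' . ($result instanceof \Exception ? 'failed' : 'ok') . "\n";
}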
Zxurian
@Zxurian
when using the uploadAsync() method, should it start uploading as soon as the method is called, or is there something additional I have to call for it to start uploading?
Jonathan Eskew
@jeskew
@Zxurian It should start as soon as the method is called
It returns a promise, so you can call ->wait() on that if you need to block until the upload is complete
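A tiny sketch of that pattern; the bucket, key, and file path are placeholders:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-west-2', 'version' => 'latest']);

// uploadAsync() starts the upload and returns a promise immediately.
$promise = $s3->uploadAsync(
    'example-bucket',                    // placeholder
    'example-key',                       // placeholder
    fopen('/path/to/local-file', 'r')    // placeholder
);

// wait() blocks until the upload (including any multipart parts) completes.
$result = $promise->wait();
echo $result['ObjectURL'] . "\n";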
Zxurian
@Zxurian
that's what I thought as well; however, when I called ->uploadAsync(), nothing was getting uploaded. I had a separate timer instead of calling wait, but I checked the value resolved by the promise's ->then() method, and also checked S3 directly, and the file never uploaded
Jonathan Eskew
@jeskew
I’m going to run a couple tests on that
Zxurian
@Zxurian
@jeskew I'll see if I can strip my code down to a simple version for testing
Jonathan Eskew
@jeskew
@Zxurian How big is the file you’re uploading? >= 16MB?
Zxurian
@Zxurian
@jeskew some are, yes
in fact most of them
Jonathan Eskew
@jeskew
Those will be handled as multipart uploads
So it looks like all uploads will start
But multipart uploads may not finish before the end of a script
this is the test I was running:
require 'vendor/autoload.php';

// Client picks up credentials from the environment; 'debug' => true dumps
// wire-level information for each request.
$s3 = new \Aws\S3\S3Client([
    'region' => 'us-west-2',
    'version' => 'latest',
    'debug' => true,
]);

$uploads = [
    // Small body: sent as a single PutObject request.
    'small' => $s3->uploadAsync(
        'testing-testing-testing',
        'small-file-async-upload',
        fopen(__FILE__, 'r')
    ),
    // Large body (>= 16 MB): handled as a multipart upload driven by a
    // Guzzle coroutine; before_upload fires before each part is sent.
    'large' => $s3->uploadAsync(
        'testing-testing-testing',
        'large-file-async-upload',
        fopen(__DIR__ . '/phparchitect-2015-06.pdf' , 'r'),
        'private',
        [
            'before_upload' => function (\Aws\CommandInterface $command) {
                echo "uploading chunk {$command['PartNumber']}\n";
            }
        ]
    )
];

echo "uploading\n";
// Runs the global promise task queue once; coroutine-driven work may not
// have finished when this returns.
\GuzzleHttp\Promise\queue()->run();
The second upload is large enough to trigger a multipart upload
Which is handled by a Guzzle coroutine
And coroutines might not be complete before the call to \GuzzleHttp\Promise\queue()->run() finishes
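One way to make the script block until both uploads (including the coroutine-driven multipart one) have finished is to wait on the combined promise instead of only running the task queue once; this sketch assumes the $uploads array from the snippet above:
// Continuing from the snippet above: all() combines the two promises and
// ->wait() keeps ticking the underlying task queue until both settle.
echo "uploading\n";
\GuzzleHttp\Promise\all($uploads)->wait();
echo "done\n";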
Zxurian
@Zxurian
hm...
creating a test script now
Jonathan Eskew
@jeskew
does that sound like it’s the same error you’re seeing?