Dimitrios Desyllas
Hello fellows, I have some trouble understanding the documentation for configuring Gaufrette for S3 using the Symfony 3.0 framework.
Andrew Kovalyov
hi, which troubles are you experiencing?
basically the installation should be the same
Dimitrios Desyllas
It's not an issue of installation but of how to use the library, as I asked over there: http://stackoverflow.com/questions/35672016/how-to-use-gaufrette-and-symfony-3-0 The documentation says that I have to instantiate a new AWS S3 client. But what is the reason to do that if I have already configured the connection to S3 in settings.yml as described in the documentation?
I think the documentation needs a usage example with Symfony.
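The usual answer to this confusion is that with the Symfony bundle you define the S3 client as a service and point the adapter at it, rather than instantiating it in controller code. A hedged sketch, assuming KnpGaufretteBundle; the adapter name, service id, and bucket name below are placeholders, not from the chat:

```yaml
# app/config/config.yml -- hypothetical sketch, names are placeholders
knp_gaufrette:
    adapters:
        acme_uploads:
            aws_s3:
                service_id: acme.aws_s3.client   # an Aws\S3\S3Client service you define yourself
                bucket_name: my-bucket
    filesystems:
        acme_uploads:
            adapter: acme_uploads
```

With this in place you fetch the filesystem from the container (e.g. service `knp_gaufrette.filesystem_map`) instead of constructing an S3 client by hand.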
Jaime Domínguez
I'm pretty sure this is a recurring question, but... what are the best practices when it comes to uploading large files to S3 using the AwsS3 adapter?
My problem is the memory usage :(
(I'm new at this gitter thing, please excuse me if I'm rude asking directly) :)
Andrew Kovalyov
hi Jaime and welcome to gitter :)
Jaime Domínguez
Andrew Kovalyov
I assume that you're passing the whole content to the write method, right?
for the S3 adapter it should work fine if you pass a resource (however, I haven't tried it yet and don't have an opportunity to test it right now)
so if you could try it and report whether it works, we can improve the docs and save some time and nerves for Gaufrette users.
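The point being made here can be shown with a standalone sketch (this is not Gaufrette's API, just plain PHP streams): reading a whole file into a string costs memory proportional to the file size, while copying through a stream resource keeps memory flat.

```php
<?php
// Hypothetical standalone demo: whole-content read vs. stream copy.
$src = tempnam(sys_get_temp_dir(), 'gau');
file_put_contents($src, str_repeat('x', 5 * 1024 * 1024)); // 5 MiB sample file

// Whole-content approach: peak memory grows with the file size.
$before = memory_get_usage();
$content = file_get_contents($src);
$wholeDelta = memory_get_usage() - $before;
unset($content);

// Stream approach: data moves through a small internal buffer.
$dst = tempnam(sys_get_temp_dir(), 'gau');
$before = memory_get_usage();
$in  = fopen($src, 'rb');
$out = fopen($dst, 'wb');
stream_copy_to_stream($in, $out);
$streamDelta = memory_get_usage() - $before;
fclose($in);
fclose($out);

echo $wholeDelta > $streamDelta ? "stream uses less memory\n" : "no win\n";
unlink($src);
unlink($dst);
```

This is why passing an already-opened resource to an adapter that supports it avoids the "6gb memory_limit" situation described later in the chat.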
Jaime Domínguez
Yes, I'm quite sure I am passing the whole content (I use the Gaufrette wrapped in a cakephp plugin)

"pass the resource": I will happily try it, but my question is where?

I think the first memory kaputt-point is in the _storeFile of the abstractListener. Should I try to rewrite that function, or should I create a new listener and override _storeFile in it?

(excuse my english, it's not my native language :) )
(but I believe that the memory-kaputt-point is a pretty descriptive concept :D )
Andrew Kovalyov
no worries. To be honest, I haven't tried either cakephp or that gaufrette plugin.
Are you talking about this ?
I think fopen should be there instead of file_get_contents. If it works, please report it here; in that case we can improve both the docs and that plugin.
Jaime Domínguez
wow, yes. That exact line :)
fopen... ok, i'll try and will be back with the results :)
thanks a lot
Andrew Kovalyov
my pleasure.
Jaime Domínguez
I just realized that I was talking about the abstractListener as if it came with the gaufrette library and you already knew it. My bad. Sorry.
I have spent several hours digging from my controller through the burzum library down to gaufrette, hitting rock bottom in the aws sdk, and I'm mixing it all up in my mind :)
thanks again for your patience :)
Andrew Kovalyov
no worries, sometimes debugging can be really hard. And I am really interested in writing down all the memory troubles, because it is actually a common problem of filesystem abstractions. We are experimenting with ways to fix it, but it is still highly WIP.
Jaime Domínguez
Win!!! It seems to work!
Andrew Kovalyov
Jaime Domínguez
I tried 2 things: passing a resource with fopen as you suggested, and passing a Guzzle\Http\EntityBodyInterface object
the latter turned out too slow (200kb upload out of a 20mbps connection)
but the fopen approach works
for fopen to work it's required to pass the ContentLength to the putObject method, so I added this to the gaufrette S3Client
    if (is_resource($options['Body'])) {
        $stats = fstat($options['Body']);
        $options['ContentLength'] = $stats['size'];
    }
in the getOptions function
I'll run more tests, but it looks like the memory issue is over for me :) Now I can lower that 6gb memory_limit cap in php.ini :)
thanks again!
(I added it to the gaufrette S3 adapter, not the S3 client, sorry) :D
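The fix Jaime describes can be exercised standalone. This sketch mimics the AWS SDK's putObject option names ('Body', 'ContentLength') from the snippet above; the surrounding file setup is illustrative, not part of Gaufrette:

```php
<?php
// Standalone sketch: when the 'Body' option is a stream resource, derive
// ContentLength via fstat() instead of loading the whole file into memory.
$path = tempnam(sys_get_temp_dir(), 'gau');
file_put_contents($path, str_repeat('x', 1024)); // 1 KiB sample file

$options = ['Body' => fopen($path, 'rb')];

if (is_resource($options['Body'])) {
    $stats = fstat($options['Body']);
    $options['ContentLength'] = $stats['size'];
}

echo $options['ContentLength'], "\n"; // prints 1024

fclose($options['Body']);
unlink($path);
```

fstat() reads the size from the file descriptor, so the stream itself is never consumed; the SDK can then stream the body in chunks while still sending a correct Content-Length header.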
Lance Bailey
Hi Gaufrette people :)
I was just wondering when there will be a new tagged release from the master branch?
Andrew Kovalyov
probably @NiR- could answer.
Christophe Coevoet
@NiR- is there any place where you described your plans for the new structure splitting the adapters? I don't see any issue with the discussion about it
btw, there is one weird thing: many PRs are opened for each adapter, but none of them is merged, as if the work on the new structure was not meant to move forward
and btw, this will hurt you if fixes are done in the master branch in the meantime, as you will have to re-apply them to the moved code...