Dimitrios Desyllas
@pc-magas
Hello fellows, I have some trouble understanding the documentation for configuring Gaufrette for S3 using the Symfony 3.0 framework
Andrew Kovalyov
@akovalyov
hi, what troubles are you experiencing?
basically the installation should be the same
Dimitrios Desyllas
@pc-magas
It's not an issue of installation but of how to use the library. As I asked over there: http://stackoverflow.com/questions/35672016/how-to-use-gaufrette-and-symfony-3-0 The documentation says that I have to instantiate a new AWS S3 client. But what is the reason to do that if I have already configured the connection to S3 in settings.yml as described in the documentation?
I think the documentation needs a usage example with Symfony.
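(For reference, a minimal standalone sketch of what the Gaufrette docs describe, assuming aws-sdk-php v2 with placeholder credentials and bucket name; with KnpGaufretteBundle the same adapter and filesystem are normally built from the bundle configuration and fetched from the container rather than instantiated by hand:)

    <?php
    use Aws\S3\S3Client;
    use Gaufrette\Adapter\AwsS3 as AwsS3Adapter;
    use Gaufrette\Filesystem;

    // build the S3 client by hand (in Symfony the bundle builds it from config)
    $s3client = S3Client::factory(array(
        'key'    => 'your-aws-key',    // placeholder
        'secret' => 'your-aws-secret', // placeholder
        'region' => 'eu-west-1',
    ));

    $adapter    = new AwsS3Adapter($s3client, 'your-bucket-name');
    $filesystem = new Filesystem($adapter);

    // write and read a key through the abstraction
    $filesystem->write('path/to/hello.txt', 'Hello!', true);
    $content = $filesystem->read('path/to/hello.txt');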
Jaime Domínguez
@jdominguezpaz
Hello,
I'm pretty sure this is a recurrent question, but... what are the best practices when it comes to uploading large files to S3 using the AwsS3 adapter?
My problem is the memory usage :(
(I'm new to this Gitter thing, please excuse me if I'm being rude by asking directly) :)
Andrew Kovalyov
@akovalyov
hi Jaime and welcome to gitter :)
Jaime Domínguez
@jdominguezpaz
:-)
Andrew Kovalyov
@akovalyov
I assume that you're passing the whole content to write method, right?
for the S3 adapter it should work fine if you pass a resource (however, I haven't tried it yet and don't have an opportunity to test it right now)
so if you could try it and report whether it works, we can improve the docs and save some time and nerves for Gaufrette users.
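(A rough sketch of what "passing a resource" could look like, untested as noted above; $adapter stands for the AwsS3 adapter instance and the paths are placeholders:)

    // open the large file as a stream instead of loading it into memory
    $stream = fopen('/path/to/large-upload.bin', 'rb');

    // hand the resource to the S3 adapter, which forwards it as the request Body
    $adapter->write('uploads/large-upload.bin', $stream);

    fclose($stream);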
Jaime Domínguez
@jdominguezpaz
Yes, I'm quite sure I am passing the whole content (I use Gaufrette wrapped in a CakePHP plugin)

I'll be happy to try "passing the resource", but my question is: where?

I think the first memory kaputt-point is in the _storeFile of the AbstractListener. Should I try to rewrite that function, or should I create a new listener and override _storeFile in it?

(excuse my English, it's not my native language :) )
(but I believe that the memory-kaputt-point is a pretty descriptive concept :D )
Andrew Kovalyov
@akovalyov
no worries. To be honest, I haven't tried either CakePHP or its Gaufrette plugin.
Are you talking about this?
I think fopen should be used there instead of file_get_contents. If it works, please report it here; in that case we can improve both the docs and that plugin.
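(Sketching the suggested change against a hypothetical _storeFile(); the getFilesystem() helper and the variable names are illustrative, not the plugin's actual API:)

    // before: the whole file is read into memory before being handed over
    // $this->getFilesystem()->write($path, file_get_contents($tmpFile), true);

    // after: pass a stream resource so the contents are not buffered in PHP memory
    $stream = fopen($tmpFile, 'rb');
    $this->getFilesystem()->write($path, $stream, true);
    fclose($stream);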
Jaime Domínguez
@jdominguezpaz
wow, yes. That exact line :)
fopen... ok, I'll try and will be back with the results :)
thanks a lot
Andrew Kovalyov
@akovalyov
my pleasure.
Jaime Domínguez
@jdominguezpaz
I just realized I was talking about the AbstractListener as if it comes with the Gaufrette library and you already knew it. My bad. Sorry.
I have spent several hours digging from my controller into the burzum library, down to Gaufrette, and hitting rock bottom in the AWS SDK, and I'm mixing it all up in my mind :)
thanks again for your patience :)
Andrew Kovalyov
@akovalyov
no worries, sometimes debugging can be really hard. And I am really interested in writing down all the memory troubles, because it is actually a common problem with filesystem abstractions. We are experimenting with ways to fix that, but it is still highly WIP.
Jaime Domínguez
@jdominguezpaz
Win!!! It seems to work!
Andrew Kovalyov
@akovalyov
:thumbsup:
Jaime Domínguez
@jdominguezpaz
I tried two things: passing a resource with fopen as you suggested, and passing a Guzzle\Http\EntityBodyInterface object
the latter turned out to be slow (200 kb upload out of a 20 Mbps connection)
but the fopen works
but!
for fopen to work, the ContentLength has to be passed to the putObject method, so I added this to the Gaufrette S3Client
    // when the Body is a stream resource, pass its size as ContentLength
    if (is_resource($options['Body'])) {
        $stats = fstat($options['Body']);
        $options['ContentLength'] = $stats['size'];
    }
in the getOptions function
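(A hypothetical sketch of where that check could sit; the getOptions() signature and the merging of defaults are assumed here, only the resource check comes from the snippet above:)

    // hypothetical shape of the adapter's option-building step; the defaults
    // merge and the Key assignment are assumptions
    protected function getOptions($key, array $options = array())
    {
        $options = array_merge($this->options, $options);
        $options['Key'] = $key;

        // a stream Body needs an explicit ContentLength before putObject()
        if (isset($options['Body']) && is_resource($options['Body'])) {
            $stats = fstat($options['Body']);
            $options['ContentLength'] = $stats['size'];
        }

        return $options;
    }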
I'll run more tests, but it looks like the memory issue is over for me :) Now I can lower that 6 GB memory_limit cap in php.ini :)
thanks again!
(to the Gaufrette S3 adapter, not the S3 client. Sorry) :D
Lance Bailey
@beyerz
Hi Gaufrette people :)
was just wondering when there will be a new tagged release from the master branch?
Andrew Kovalyov
@akovalyov
probably @NiR- could answer.
Christophe Coevoet
@stof
@NiR- is there any place where you described your plans for the new structure splitting adapters? I don't see any issue with the discussion about it
btw, there is one weird thing: many PRs are opened for each adapter, but none of them is merged, as if the work on the new structure was not meant to move forward
and btw, this will hurt you if fixes are done in the master branch in the meantime, as you will have to re-apply them to the moved code...