Dimitrios Desyllas
@pc-magas
Hello fellows, I have some trouble understanding the documentation for configuring Gaufrette for S3 using the Symfony 3.0 framework
Andrew Kovalyov
@akovalyov
hi, what troubles are you experiencing?
basically the installation should be the same
Dimitrios Desyllas
@pc-magas
It's not an issue with installation but with how to use the library, as I asked over there: http://stackoverflow.com/questions/35672016/how-to-use-gaufrette-and-symfony-3-0 The documentation says that I have to instantiate a new AWS S3 client, but what is the reason to do that if I already have the connection to S3 configured in settings.yml as described in the documentation?
I think the documentation needs a usage example with Symfony.
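For what it's worth, a minimal sketch of what such a Symfony usage example could look like, assuming the KnpGaufretteBundle is installed and a filesystem named my_s3_fs is defined against the S3 adapter in the bundle config; the bundle, service id, and filesystem name are assumptions, not something confirmed in this discussion:

    // Hypothetical usage in a Symfony 3.0 controller, assuming KnpGaufretteBundle
    // already built the S3 client and filesystem from config, so there is no
    // need to instantiate a new Aws\S3\S3Client by hand.
    $filesystem = $this->get('knp_gaufrette.filesystem_map')->get('my_s3_fs');

    // Read and write through the abstraction.
    $filesystem->write('uploads/report.pdf', $pdfContents, true);
    $data = $filesystem->read('uploads/report.pdf');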
Jaime Domínguez
@jdominguezpaz
Hello,
I'm pretty sure this is a recurring question but... what are the best practices when it comes to uploading large files to S3 using the AwsS3 adapter?
My problem is the memory usage :(
(I'm new at this Gitter thing, please excuse me if I'm being rude by asking directly) :)
Andrew Kovalyov
@akovalyov
hi Jaime and welcome to gitter :)
Jaime Domínguez
@jdominguezpaz
:-)
Andrew Kovalyov
@akovalyov
I assume that you're passing the whole content to the write method, right?
for the S3 adapter it should work fine if you pass a resource (however, I haven't tried it yet and don't have an opportunity to test it right now)
so if you could try it and report whether it works, we can improve some docs and save some time and nerves for Gaufrette users.
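To make that concrete, a rough and untested sketch of the difference being discussed; the key and path names are invented for illustration:

    // Loads the entire file into memory before handing it to Gaufrette:
    // $filesystem->write('backups/dump.sql', file_get_contents($localPath), true);

    // Untested alternative suggested above: pass a stream resource instead,
    // so the adapter/SDK can read the file in chunks.
    $stream = fopen($localPath, 'rb');
    $filesystem->write('backups/dump.sql', $stream, true);
    fclose($stream);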
Jaime Domínguez
@jdominguezpaz
Yes, I'm quite sure I am passing the whole content (I use Gaufrette wrapped in a CakePHP plugin)

"Pass the resource": I will be happy to try it, but my question is: where?

I think the first memory kaputt-point is in the _storeFile of the AbstractListener. Should I try to rewrite that function, or should I try to create a new listener and rewrite _storeFile in it?

(excuse my English, it's not my native language :) )
(but I believe that "memory-kaputt-point" is a pretty descriptive concept :D )
Andrew Kovalyov
@akovalyov
no worries. To be honest, I haven't tried either CakePHP or the Gaufrette plugin for it.
Are you talking about this?
I think fopen should be there instead of file_get_contents. If it works, please report it here; in that case we can improve both the docs and that plugin.
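In other words, something along these lines inside the plugin's _storeFile; the method and variable names below are placeholders, and this is an untested sketch of the suggested swap, not the plugin's actual code:

    // Current behaviour (roughly): the whole upload is read into a string,
    // so memory usage grows with the file size.
    // $this->getFilesystem()->write($key, file_get_contents($tmpFile), true);

    // Suggested change: hand over a read-only stream instead.
    $handle = fopen($tmpFile, 'rb');
    $this->getFilesystem()->write($key, $handle, true);
    fclose($handle);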
Jaime Domínguez
@jdominguezpaz
wow, yes. That exact line :)
fopen... ok, I'll try it and will be back with the results :)
thanks a lot
Andrew Kovalyov
@akovalyov
my pleasure.
Jaime Domínguez
@jdominguezpaz
I just realized that I was talking about the AbstractListener as if it came with the Gaufrette library and you already knew it. My bad. Sorry.
I have spent several hours digging from my controller through the burzum library down to Gaufrette, hitting rock bottom in the AWS SDK, and I'm mixing it all up in my mind :)
thanks again for your patience :)
Andrew Kovalyov
@akovalyov
no worries, sometimes debugging can be really hard. And I am really interested in writing down all the memory troubles, because it is actually a common problem with filesystem abstractions. We are experimenting with ways to fix it, but it is still highly WIP.
Jaime Domínguez
@jdominguezpaz
Win!!! It seems to work!
Andrew Kovalyov
@akovalyov
:thumbsup:
Jaime Domínguez
@jdominguezpaz
I tried two things: passing a resource with fopen as you suggested, and passing a Guzzle\Http\EntityBodyInterface object
the latter turned out to be slow (200 kb upload out of a 20 Mbps connection)
but the fopen approach works
but!
for fopen to work it's required to pass the ContentLength to the putObject method, so I added this to the Gaufrette S3Client:
    if (is_resource($options['Body'])) {
        $stats = fstat($options['Body']);
        $options['ContentLength'] = $stats['size'];
    }
in the getOptions function
I'll run more tests, but it looks like the memory issue is over for me :) Now I can lower that 6 GB memory_limit cap in php.ini :)
thanks again!
(to the Gaufrette S3 adapter, not the S3 client. Sorry) :D
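For context, a sketch of how that check could sit inside the adapter's getOptions; only the is_resource/fstat block above comes from this chat, while the surrounding method body is a paraphrase and untested:

    protected function getOptions($key, array $options = array())
    {
        // Merge per-call options over the adapter defaults (paraphrased).
        $options = array_replace($this->options, $options);

        // When the body is a stream, the AWS SDK needs an explicit
        // ContentLength, so derive it from the file descriptor's stats.
        if (isset($options['Body']) && is_resource($options['Body'])) {
            $stats = fstat($options['Body']);
            $options['ContentLength'] = $stats['size'];
        }

        return $options;
    }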
Lance Bailey
@beyerz
Hi Gaufrette people :)
was just wondering when there will be a new tagged release from the master branch?
Andrew Kovalyov
@akovalyov
probably @NiR- could answer.
Christophe Coevoet
@stof
@NiR- is there any place where you described your plans for the new structure splitting the adapters? I don't see any issue with the discussion about it
btw, there is one weird thing: many PRs are opened for each adapter, but none of them is merged, as if the work on the new structure was not meant to move forward
and btw, this will hurt you if fixes land in the master branch in the meantime, as you will have to re-apply them to the moved code...