Eduardo Pareja Tobes
@eparejatobes
we're fine with anyone selling whatever
as long as it's open too :)
anyway
for private projects
in a lot of cases AGPLv3 is not a big issue
the source-sharing requirement applies to the users of your app
so for something internal it often doesn't kick in
Ryan Means
@rmmeans
Yeah, I'm specifically bringing up private projects here, where the project might be some API for our business that customers hit. All the other licenses used by our API allow us to keep the project private and not share our API source with the world, but that last step in our build process of publishing this thing to a private S3 Maven repo now forces me to open source my whole API, which is a little disappointing, because using your s3-resolver is just part of the build management process. I wasn't profiting off of the project for my API itself. So it's not that I really have a problem with y'all using AGPLv3 for all of your other stuff, because that makes sense based on your business goals for your other intellectual work, but for a build process tool it just kind of bugs me :smile: given that your work didn't assist me in creating the intellectual work I'm trying to keep private; it only assisted in the process part of efficiently storing or retrieving my compiled work. For now, I've just been using the other library from Frugal Mechanic, but part of my developer soul is bleeding knowing there is a better project out there that I can't use :smile:
Eduardo Pareja Tobes
@eparejatobes
@rmmeans yep, I understand. It's just that we decided to keep everything AGPLv3; dealing with different licenses, compatibility, whether this counts as linking or not, etc. takes a significant amount of time that we think is better spent elsewhere :)
Eduardo Pareja Tobes
@eparejatobes
Anyway, if there were someone else maintaining this
we could maybe move it to the sbt org
relicense it etc
that could be an option
Ryan Means
@rmmeans
Now I totally get that! In fact, I figured that's what it was :smile: It'd be awesome to get some wider community effort behind it and move it over to the sbt org, etc. Anyways, thanks for the good conversation. I'll be continuing to watch the project over time. Good work!
Mark Pierotti
@mopierotti
Hi, thanks a ton for the useful package, but it looks like it's no longer available from your repo at https://s3-eu-west-1.amazonaws.com/releases.era7.com/, is that intended?
Eduardo Pareja Tobes
@eparejatobes
Hey @mopierotti
That sounds strange, we resolve from there daily
Maybe a misconfiguration, or something region related?
Proxies?
Alexey Alekhin
@laughedelic
@mopierotti if you still have the problem, show us your sbt config (the one in the project/ folder)
Benjamin Rizkowsky
@benoahriz
I was hoping to find some info about what bucket policies are needed to make this work. I keep getting a 403 from Amazon, even though I am able to use the aws-cli to put files in my bucket using the same profile/creds:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::xxxx"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::xxxx/*"
        }
    ]
}
that is an example of the bucket policy i'm currently using
Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied;
Benjamin Rizkowsky
@benoahriz
Using a separate account works fine but has access to everything. I'm trying to lock the account down to the minimum needed permissions:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::xxxx"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": "arn:aws:s3:::xxxx/*"
        }
    ]
}
this seemed to work fine
Alexey Alekhin
@laughedelic
hi @benoahriz.
As I can see from the code in ivy-s3-resolver (which is used in this plugin), there's not much happening besides the listObjects/putObject actions..
probably s3:PutObjectAcl is needed? (here)
on the other hand, s3:DeleteObject is not needed, because the resolver cannot delete a published artifact
Benjamin Rizkowsky
@benoahriz
cool thanks @laughedelic I’ll try it
Alexey Alekhin
@laughedelic
I've just tested it and I think s3:PutObjectAcl is the missing piece. Here is the policy that I successfully used for publishing and resolving:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::bucket.name"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::bucket.name/*"
        }
    ]
}
In theory s3:CreateBucket may also be needed if you publish to a non-existing bucket.
@benoahriz thanks for raising this concern, I should add this to the readme. Let me know if it works for you now.
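For context, here is a rough build.sbt sketch of the publishing side that this policy has to cover ("bucket.name" is a placeholder; the s3resolver/s3 keys are the ones the plugin provides, as seen in configs elsewhere in this chat):

```scala
// build.sbt fragment (sketch): publish ivy-style artifacts to the bucket
// covered by the policy above. Credentials come from the default chain.
publishMavenStyle := false
publishTo := Some(
  s3resolver.value("My S3 releases", s3("bucket.name")).withIvyPatterns
)
```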
Benjamin Rizkowsky
@benoahriz
yeah, I’ll test it and let you know
Alexey Alekhin
@laughedelic
:+1: :shipit: :satisfied:
Benjamin Rizkowsky
@benoahriz
I tested it and it worked fine. I sent a PR for the readme.
Also added an example of using the AWS credentials and env vars if you set them, which is how I'm implementing it.
Alexey Alekhin
@laughedelic
Cool. Thanks @benoahriz :+1:
Benjamin Rizkowsky
@benoahriz
np
Pishen Tsai
@pishen

Hello, I tried to use the plugin like this

import com.amazonaws.auth._

lazy val root = (project in file(".")).settings(
  name := "helpers",
  version := "0.6.0",
  scalaVersion := "2.10.6",
  organization := "net.pishen",
  libraryDependencies ++= Seq(
    "com.typesafe.play" %% "play-json" % "2.3.10",
    "com.github.nscala-time" %% "nscala-time" % "2.6.0"
  ),
  publishMavenStyle := false,
  s3credentials := new EnvironmentVariableCredentialsProvider(),
  s3region := Region.fromValue("us-west-2"),
  publishTo := Some(s3resolver.value("S3 Repository", s3("my-bucket")).withIvyPatterns)
)

But it said that it can't find the value Region

error: not found: value Region
  s3region := Region.fromValue("us-west-2"),
              ^
sbt.compiler.EvalException: Type error in expression
    at sbt.compiler.Eval.checkError(Eval.scala:384)
    at sbt.compiler.Eval.compileAndLoad(Eval.scala:183)
    at sbt.compiler.Eval.evalCommon(Eval.scala:152)
    at sbt.compiler.Eval.evalDefinitions(Eval.scala:122)
    at sbt.EvaluateConfigurations$.evaluateDefinitions(EvaluateConfigurations.scala:271)
    at sbt.EvaluateConfigurations$.evaluateSbtFile(EvaluateConfigurations.scala:109)
    at sbt.Load$.sbt$Load$$loadSettingsFile$1(Load.scala:712)
    at sbt.Load$$anonfun$sbt$Load$$memoLoadSettingsFile$1$1.apply(Load.scala:717)
...

And if I add this line import com.amazonaws.services.s3.model._ to my build.sbt it will work. Seems that it can't see the type aliases in autoImport?

Alexey Alekhin
@laughedelic
Hi @pishen! Sorry, I was on vacation and missed the notification. This behaviour is correct. A type alias just provides a shorter name for the type; it doesn't "reexport" the Region object from the amazonaws lib. So yes, you need to do the corresponding import.
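A tiny self-contained sketch of the point (with made-up stand-in names — the real Region lives in com.amazonaws.services.s3.model): re-exporting a type alias compiles fine, but the object of the same name still has to be imported from where it is defined.

```scala
// Illustration with hypothetical stand-ins: `awslib.Region` plays the role
// of com.amazonaws.services.s3.model.Region.
object awslib {
  object Region {                        // the object with the factory method
    def fromValue(v: String): String = v
  }
  type Region = String                   // and a type of the same name
}

object autoImport {
  // A plugin's autoImport can alias only the TYPE; writing
  // `Region.fromValue(...)` against this alias would not compile,
  // because the alias doesn't bring the `Region` object with it.
  type Region = awslib.Region
}

object demo {
  import awslib.Region                   // importing the object itself works
  def region: String = Region.fromValue("us-west-2")
}
```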
Alexey Alekhin
@laughedelic

@/all v0.17.0 is out:

  • #51: Upgraded to SBT 1.x (by @macalinao)
  • #25: Published to Bintray community repository
  • #35: Added storage class setting
  • Upgraded to ivy-s3-resolver v0.11.0

https://twitter.com/laughedelic/status/910148754065457152

Alexey Alekhin
@laughedelic

@/all Thanks to the effort of Michael Ahlers @michaelahlers, we've got a new bugfix release that works around the problem with redundant delimiters in the ivy-style patterns introduced in sbt 1.0 (see #52 and sbt/sbt#3573).

Everybody is recommended to update to v0.17.1

Michael Ahlers
@michaelahlers
:+1:
Alexey Alekhin
@laughedelic
@/all Forgot to announce that a week ago we had a new release: v0.18.0 with a contribution from @hkupty :tada:
  • #55: Changed s3acl type to Option[S3ACL]: when it's unset, artifacts will be published inheriting the bucket ACL
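In build.sbt terms this means a fragment like the following (sketch, based on the release note above — the s3acl key is the plugin's setting):

```scala
// build.sbt fragment (sketch): leave s3acl unset so published artifacts
// inherit the bucket's ACL, the new default behaviour in v0.18.0 (#55).
s3acl := None
```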
Michael Ahlers
@michaelahlers
:clap:
Alexey Alekhin
@laughedelic
@/all New release: v0.19.0 with a nice contribution from @tsuyoshizawa :tada:
Fernando
@nandotorterolo
Hi, could you provide an example for 'sbt.version=0.13.15' and 'scalaVersion := "2.12.3"'? Is this line in plugins.sbt ok? resolvers += Resolver.jcenterRepo
addSbtPlugin("ohnosequences" % "sbt-s3-resolver" % "0.16.0")
Alexey Alekhin
@laughedelic
Hi @nandotorterolo. If you are using sbt 0.13, the right version of sbt-s3-resolver is 0.16.0, so that line is correct. But you have to check the readme for that version: it was published to a different repository:
resolvers += "Era7 maven releases" at "https://s3-eu-west-1.amazonaws.com/releases.era7.com"
addSbtPlugin("ohnosequences" % "sbt-s3-resolver" % "0.16.0")
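For comparison, on sbt 1.x you would use a current plugin version instead (a sketch — check the readme for the exact latest version; v0.19.0 is the most recent one announced in this chat):

```scala
// project/plugins.sbt for sbt 1.x (sketch): v0.17.0+ was published to the
// Bintray community repository (#25), so an extra resolver is usually
// not needed.
addSbtPlugin("ohnosequences" % "sbt-s3-resolver" % "0.19.0")
```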