These are chat archives for CommBank/maestro
uber.scoring repo, I'll see if there's anything interesting in our settings
We do have this in all projects:
concurrentRestrictions in Global := Seq(
  Tags.limit(Tags.CPU, 2),
  Tags.limit(Tags.Network, 10),
  Tags.limit(Tags.Test, 1),
  Tags.limitAll(15)
)
But according to the docs it won't have any effect unless you tag tasks.
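For reference, tagging looks something like this in sbt 0.13 (the task name and body are made up, just to show the mechanism):
// Hypothetical task, only here to show how tagging hooks into the limits above.
val slowFetch = taskKey[Unit]("a network-heavy task")

def slowFetchImpl = Def.task {
  // ...do the network-heavy work here...
} tag (Tags.Network)   // now counts against Tags.limit(Tags.Network, 10)

slowFetch := slowFetchImpl.value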
javaOptions are fairly crazy:
javaOptions ++= Seq("-Xms2048M", "-Xmx8192M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
(Scoping to Global and adding the Test limitation means that only 1 test over everything can run at once.)
javaOptions are just to expand resources for SBT. I'm not 100% confident that CMS is the default GC for Java 7 anymore; I have heard rumours that JRE7 uses the G1 collector by default (can anyone confirm?). So, you might have to add
-XX:+UseConcMarkSweepGC just to be sure CMS is enabled.
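If you do add it, it's just another entry in the same setting, e.g. (untested sketch of the combined flags):
javaOptions ++= Seq(
  "-Xms2048M", "-Xmx8192M",
  "-XX:MaxPermSize=2048M",
  "-XX:+UseConcMarkSweepGC",        // make CMS explicit instead of relying on the JVM default
  "-XX:+CMSClassUnloadingEnabled"   // class unloading only applies with CMS
)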
el.util. But there’s at least one exception, and probably others:
CommBank/eventually seems to depend directly - at least it has a
maestroVersion set directly in its
"commbank-releases-private" at "https://commbank.artifactoryonline.com/commbank/libs-releases-local"
def uniformDependencySettings: Seq[Sett] = uniformPublicDependencySettings ++ uniformPrivateDependencySettings
uniformPublicDependencySettings in maestro instead. Any thoughts?
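Something like this for maestro's project definition is what I mean (sketch only; the project wiring is assumed, only the setting names come from uniform):
// Hypothetical: drop the private half so resolution never touches the
// credential-protected Artifactory repository.
lazy val maestro = Project("maestro", file("."))
  .settings(uniformPublicDependencySettings: _*)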
Unable to find credentials for [Artifactory Realm @ commbank.artifactoryonline.com] is the error we're getting, yeah?
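If it really is just missing credentials, the usual sbt-level fix is something like this (the env var names are placeholders, not what we actually use):
// Supply credentials for the realm/host named in the error.
credentials += Credentials(
  "Artifactory Realm",
  "commbank.artifactoryonline.com",
  sys.env.getOrElse("ARTIFACTORY_USER", ""),
  sys.env.getOrElse("ARTIFACTORY_PASSWORD", "")
)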
maestro-example code running on a local CommBank/cdh5.3.0 vagrant vm
org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "datanucleus" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification.
This happens at CustomerJob.scala:59 when trying to instantiate HiveMetaStoreClient
HADOOP_CLASSPATH=/etc/hbase/conf:/etc/hive/conf:<the-jar-I-am-running> hadoop jar <the-jar-I-am-running> blabla...
Has anyone seen this sort of error before when using
[error] No files found under </tmp/hadoop/test-186e22ec-2964-48de-aabd-f9cf7f1c277d/hive/warehouse/scoresTestTable/partition_model_id=modelId1/partition_score_date=20140805/*.parquet>. (PathFact.scala:37)
I have a test which works perfectly locally, but fails when run on a TeamCity build agent.