Mandar Chandorkar
@mandar2812
Hi @michellemay, glad you liked DynaML! Yes, you are very welcome to implement BayesOptimization in DynaML; pull requests and contributions are most welcome!
So Bayes Opt is not implemented anywhere in DynaML
there is a global optimization API
which has GridSearch and CoupledSimulatedAnnealing
The top-level traits are GloballyOptimizable and GloballyOptimizableWithGrad
These are implemented by the models themselves (GP and others)
Mandar Chandorkar
@mandar2812
GlobalOptimizer, on the other hand, is the trait which GridSearch and CoupledSimulatedAnnealing implement
So your proposed BayesOptimization class can extend the GlobalOptimizer trait
So you can extend the GPRegression class and override the energy() method to compute the Expected Improvement acquisition
def energy(h: Map[String, Double],
             options: Map[String, String] = Map()): Double
here h is a particular assignment of the hyper-parameters, and options can be used to pass other configuration items (not needed in many cases)
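For concreteness, here is a minimal sketch of what an Expected Improvement computation could look like, independent of DynaML's classes; the object and helper names below are purely illustrative, not library API.

object ExpectedImprovement {

  // standard normal pdf
  private def phi(z: Double): Double =
    math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.Pi)

  // standard normal cdf, Abramowitz & Stegun approximation 26.2.17
  private def Phi(z: Double): Double = {
    val t = 1.0 / (1.0 + 0.2316419 * math.abs(z))
    val series = t * (0.319381530 + t * (-0.356563782 + t * (1.781477937 +
      t * (-1.821255978 + t * 1.330274429))))
    val p = 1.0 - phi(z) * series
    if (z >= 0.0) p else 1.0 - p
  }

  // EI in the maximisation form: (mu - best) * Phi(z) + sigma * phi(z), with z = (mu - best) / sigma
  def apply(mu: Double, sigma: Double, bestSoFar: Double): Double =
    if (sigma <= 0.0) 0.0
    else {
      val improvement = mu - bestSoFar
      val z = improvement / sigma
      improvement * Phi(z) + sigma * phi(z)
    }
}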
Mandar Chandorkar
@mandar2812
If you want your algorithm to use gradient information, then your model must extend GloballyOptimizableWithGrad and also implement the gradEnergy() method.
def gradEnergy(h: Map[String, Double])
  : Map[String, Double] = Map()
Hope this is not too much detail too soon! Let me know if you have any more questions.
Mandar Chandorkar
@mandar2812
Bayes Optimization was an idea I was interested in some time ago, but I never got the time to implement it; it would be great if you are interested in working on it.
Michel Lemay
@michellemay
Thanks for the details.. I was thinking along the same lines.
In Bayes Opt, we have a multi-step process: 1) sample the target function, 2) train GPRegression, 3) optimize the acquisition function, 4) evaluate the target function at the best point from 3), 5) loop until convergence or max steps.
One thing I want to clarify: GlobalOptimizer is strongly dependent on the energy landscape, and this is a grid search approach.
How would you break that assumption?
Mandar Chandorkar
@mandar2812

Okay, I guess I was a bit confused. But your description helped to clear my thinking!

So you want to optimise f(x) by sampling it at some points Seq[I], then training an AbstractGPRegressionModel[I] on these points and using that model to propose new areas in the x: I domain

Michel Lemay
@michellemay
Another thing: the relation between energy(h) and the training data DenseVectors is not quite clear. Is it only a direct mapping of the keys of the map onto indices of the dense vector?
Mandar Chandorkar
@mandar2812
Now I am thinking that you might not need to extend the GlobalOptimizer trait
I think your implementation can look as follows: BayesOptimization[I](f: I => Double)
internally it can train an AbstractGPRegressionModel[I] on the samples xs: Seq[I]
and use the predictiveDistribution() method of GP to figure out which new samples are promising
so here f:I => Double is the potentially unknown black box function to optimize
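A rough structural sketch of that idea (and of the loop Michel outlined earlier). Sample, Surrogate, fitSurrogate and proposeNext are hypothetical stand-ins, not DynaML API; a real implementation would plug in AbstractGPRegressionModel and its predictiveDistribution() at those points.

case class Sample[I](x: I, y: Double)

// hypothetical surrogate interface: posterior mean and std at a point
trait Surrogate[I] {
  def meanAndStd(x: I): (Double, Double)
}

class BayesOptimization[I](
  f: I => Double,                               // black-box objective
  fitSurrogate: Seq[Sample[I]] => Surrogate[I], // e.g. train a GP on the samples
  proposeNext: Surrogate[I] => I,               // maximise the acquisition function
  maxSteps: Int = 50) {

  def optimize(initial: Seq[I]): Sample[I] = {
    var samples = initial.map(x => Sample(x, f(x))) // 1) sample the target function
    for (_ <- 1 to maxSteps) {
      val model = fitSurrogate(samples)             // 2) train the surrogate
      val next  = proposeNext(model)                // 3) optimise the acquisition
      samples = samples :+ Sample(next, f(next))    // 4) evaluate f at the proposal
    }                                               // 5) loop until max steps
    samples.maxBy(_.y)                              // best sample found
  }
}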
Michel Lemay
@michellemay
This is pretty much what I've prototyped so far..
Mandar Chandorkar
@mandar2812
sounds good to me!
so the type I can be anything really, but DenseVector[Double] would be one of the common use cases
If you think your prototype is ready, you can submit a pull request adding it to DynaML's dynaml.optimization package
I would love to review it and merge if all goes well!
Michel Lemay
@michellemay
it's not ready at all.. It's just some piece of code in the shell ;)
Mandar Chandorkar
@mandar2812
Okay, no problem :)
Michel Lemay
@michellemay
there are lots of details to iron out. For starters, the optimization of the acquisition function could be done with L-BFGS-B, and I don't see that in your library. Maybe I can resort to regular gradient search.
Also, I need to define clear bounds and constraints on the input parameter ranges
In my test implementation, I was using predictionWithErrorBars to get the mean and stddev at test points. It might be somewhat inefficient to call that for a single data point at each step of (3)
Mandar Chandorkar
@mandar2812

In my test implementation, I was using predictionWithErrorBars to get the mean and stddev at test points. It might be somewhat inefficient to call that for a single data point at each step of (3)

Yes, I agree; predictionWithErrorBars accepts a Seq of inputs, so it might make more sense to compute them on a batch of points every few iterations or something?
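As a sketch of that batching idea: score a whole batch of candidate points in one call and keep the best one. meanAndStdBatch is a hypothetical stand-in for a call such as predictionWithErrorBars (whose exact signature isn't shown here), and acquisition could be Expected Improvement.

def bestCandidate[I](
  candidates: Seq[I],
  meanAndStdBatch: Seq[I] => Seq[(Double, Double)], // e.g. wraps predictionWithErrorBars
  acquisition: (Double, Double) => Double           // e.g. EI(mu, sigma)
): I =
  candidates
    .zip(meanAndStdBatch(candidates))
    .maxBy { case (_, (mu, sigma)) => acquisition(mu, sigma) }
    ._1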

Mandar Chandorkar
@mandar2812
Hi @/all, two major announcements:
  • DynaML v1.5.3 onward is now available on the Maven Central repository
  • DynaML v2.0-SNAPSHOT will be compatible with Scala 2.11.x and Scala 2.12.x, with cross-building enabled in PR #81
Anastasios
@hellene
hi fellas, are we dead in here?
Anastasios
@hellene
I seemed to be doing OK with the DynaML "demo", until the middle showed some spurious repeats and what have you; not sure what is going on, pasting below:

val trainData =
tf_dataset.training_data
trainData: ops.io.data.Dataset[(Tensor, Tensor), (tensorDataHelper.OutputType, tensorDataHelper.OutputType), (tensorDataHelper.DataTypes, tensorDataHelper.DataTypes), (tensorDataHelper.Shapes, tensorDataHelper.Shapes)] = ZipDataset(
TensorSlicesDataset(UINT8[50000, 32, 32, 3], "TensorSlicesDataset"),
TensorSlicesDataset(UINT8[50000], "TensorSlicesDataset"),
"Zip"
)

DynaML> .repeat()
SyntaxError: found ".repeat()", expected Import | Prelude ~ BlockDef | Expr | End at index 4
.repeat()
^

DynaML> .shuffle(10000)
SyntaxError: found ".shuffle(10000)", expected Import | Prelude ~ BlockDef | Expr | End at index 4
.shuffle(10000)
^

DynaML> .batch(128)
SyntaxError: found ".batch(128)", expected Import | Prelude ~ BlockDef | Expr | End at index 4
.batch(128)
^

DynaML> .prefetch(10)

Mandar Chandorkar
@mandar2812
Hi @hellene, you can enclose the entire code in {} before pasting it into the terminal
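For reference, wrapping the chained calls from the paste above in a single {} block (reconstructed from the snippets Anastasios pasted earlier) would look like this, so the shell parses it as one expression:

{
  val trainData = tf_dataset.training_data
    .repeat()
    .shuffle(10000)
    .batch(128)
    .prefetch(10)
}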
Anastasios
@hellene
you mean to improve formatting here, or to avoid errors during execution?
Mandar Chandorkar
@mandar2812
yes, it's a problem with formatting when pasting multiline commands into the shell
Anastasios
@hellene
lol I would never have guessed, can't check right now but hope it is the answer!
Mandar Chandorkar
@mandar2812
Something seems wrong in the first few lines as well
val trainData =
tf_dataset.training_data
trainData: ops.io.data.Dataset[(Tensor, Tensor), (tensorDataHelper.OutputType, tensorDataHelper.OutputType), (tensorDataHelper.DataTypes, tensorDataHelper.DataTypes), (tensorDataHelper.Shapes, tensorDataHelper.Shapes)] = ZipDataset(
TensorSlicesDataset(UINT8[50000, 32, 32, 3], "TensorSlicesDataset"),
TensorSlicesDataset(UINT8[50000], "TensorSlicesDataset"),
"Zip"
)
I don't think what you pasted here is the complete code dump; it seems to be a combination of your code and the terminal's response
Anastasios
@hellene
I will redo the whole thing in one-liners, first of all
like I said, half of the copy-and-pastes went fine