Lj Miranda
@ljvmiranda921
Hi @DewaldDeJager , the Gitter badge in the README links to this room. So perhaps people who have questions will drop here first. :smile:
while i.alive: i.love(you)
@saeedsltm
Hi, why are the final results different when trying to obtain hyper-parameters using utils.search? c1, c2, and w come out different in every run under the same conditions!
Lj Miranda
@ljvmiranda921
Any ideas on this @SioKCronin ?
Siobhán K Cronin
@SioKCronin
Hi there! Is the question why grid search or random search might return different parameters? If there is a single optimum (one particular combination of parameters that yields the best score) and n_selection_iters is set high enough, you should see both of these methods converge consistently to the same parameters. If there are multiple optima, then it makes sense that different parameters might be returned for random search, regardless of the n_selection_iters on random search, and that the same parameters would be selected for grid search, since the core operation is to gather the enumerated scores and return the min (min selects the first encountered in the case of ties, at least in Python 2.7; I'll check this for 3.6). Were you encountering this with grid search or random search @saeedsltm ?
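A minimal sketch of how hyper-parameter combinations can be compared by hand with GlobalBestPSO, averaging several runs per combination to smooth out the run-to-run variation discussed above; the grid values are arbitrary, and fx.sphere (named sphere_func in older pyswarms releases) stands in for the actual misfit function.

```python
import itertools
import numpy as np
import pyswarms as ps
from pyswarms.utils.functions import single_obj as fx

# Candidate hyper-parameter values (arbitrary, for illustration only)
grid = {"c1": [0.5, 1.0, 2.0], "c2": [0.5, 1.0, 2.0], "w": [0.4, 0.7, 0.9]}

results = {}
for c1, c2, w in itertools.product(grid["c1"], grid["c2"], grid["w"]):
    scores = []
    for _ in range(5):  # repeat runs to average out PSO's stochasticity
        opt = ps.single.GlobalBestPSO(
            n_particles=20, dimensions=4,
            options={"c1": c1, "c2": c2, "w": w},
        )
        cost, _ = opt.optimize(fx.sphere, iters=100)  # replace with your misfit function
        scores.append(cost)
    results[(c1, c2, w)] = np.mean(scores)

best = min(results, key=results.get)
print("best (c1, c2, w):", best, "mean cost:", results[best])
```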
while i.alive: i.love(you)
@saeedsltm
OK, here I tried only GridSearch. My problem is quite simple (the misfit function is a norm function with 4 model parameters). Every time I run the code to find the best combination of hyper-parameters, I get different values! So the question is which combination I should select in a real case.
Siobhán K Cronin
@SioKCronin
@saeedsltm I see. Yes, it makes sense you would see different output (the same parameters will produce different results each pass through a stochastic algorithm). I found the following article by Alexandra Johnson helpful in explaining how we can confidently select the best (specifically Mann Whitney Test and Area Under the Curve): https://blog.sigopt.com/posts/evaluating-hyperparameter-optimization-strategies
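A small sketch of the statistical comparison suggested above: collect the final best costs from repeated runs of two candidate configurations, then apply a Mann-Whitney U test to see whether one is systematically better. The score lists here are made-up placeholders.

```python
from scipy.stats import mannwhitneyu

# Final best costs from repeated PSO runs of two candidate configurations
# (hypothetical numbers, for illustration only)
scores_a = [0.012, 0.015, 0.011, 0.020, 0.013]
scores_b = [0.030, 0.026, 0.041, 0.028, 0.033]

# One-sided test: is configuration A's cost distribution shifted below B's?
stat, p_value = mannwhitneyu(scores_a, scores_b, alternative="less")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```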
while i.alive: i.love(you)
@saeedsltm
@SioKCronin Well, thanks. It seems some trial and error is needed to find the best estimated (not absolute) values for the hyper-parameters.
Aaron
@whzup
Hey guys, before opening an issue for this I thought I'd ask here. I wrote a little script to analyze a simple non-linear electronic circuit with PySwarms (here is the code: https://pastebin.com/BX6Q8sr3). I get a weird error message and, to be honest, I don't see why it would not work. It might be a "bug" because I use one-dimensional optimization? I found this Stack Overflow question when I googled the error message: https://stackoverflow.com/questions/47493559/valueerror-non-broadcastable-output-operand-with-shape-3-1-doesnt-match-the . Maybe you can have a look at it. If it is just something I did wrong, maybe we should include an exception for it so we can give a more meaningful error message.
Siobhán K Cronin
@SioKCronin
@whzup I was able to replicate your error message, and am looking into it now.
Siobhán K Cronin
@SioKCronin
Looking at this post, numpy/numpy#9031, it seems this was a conscious NumPy design choice, and that the quickest thing to do would be to change the in-place operation on line 159 of 'compute_position'. @ljvmiranda921 I can change this line and include 'compute_position' in our test coverage (is there any other backend test coverage I can tackle at the same time?)
Lj Miranda
@ljvmiranda921
I think this is a common bug that I must state clearly in the documentation. A lot of issues are being reported in relation to this. Can you first try making the output of your cost_function return a shape of (n_particles,) rather than (n_particles,dimensions)? If this remedy doesn't work, I think @SioKCronin 's solution should. But try changing the output shape of your cost_function first.
Whoops, sorry, just read this now. Thanks for reporting this @whzup and for responding so quickly @SioKCronin !
I tried solving this via numpy.ravel(), please check it out: https://repl.it/@ljvmiranda921/PySwarms-Replicate-Broadcasting
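In short, the objective function receives the whole swarm as an (n_particles, dimensions) array and must return one cost per particle, i.e. shape (n_particles,). A minimal sketch of a conforming cost function:

```python
import numpy as np
import pyswarms as ps

def my_cost(x):
    # x has shape (n_particles, dimensions); return shape (n_particles,).
    # Reducing over axis=1 (or calling np.ravel on a (n_particles, 1)
    # result) avoids the broadcasting error discussed above.
    return np.sum(x ** 2, axis=1)

optimizer = ps.single.GlobalBestPSO(
    n_particles=30, dimensions=1,
    options={"c1": 0.5, "c2": 0.3, "w": 0.9},
)
best_cost, best_pos = optimizer.optimize(my_cost, iters=50)
```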
Lj Miranda
@ljvmiranda921
If you guys want, maybe we can add a check on the output of a custom objective function and emit a logger.warn if the shape isn't (n_particles, )? At the same time, I think it's really important that we state this clearly in the docs (re custom objective functions)
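A sketch of the kind of check being proposed; the helper name and placement are hypothetical, not existing pyswarms code:

```python
import logging

logger = logging.getLogger(__name__)

def check_cost_shape(costs, n_particles):
    """Hypothetical helper: warn if a custom objective function
    returns something other than shape (n_particles,)."""
    shape = getattr(costs, "shape", None)
    if shape != (n_particles,):
        logger.warning(
            "Objective function returned shape %s, expected (%d,). "
            "See the docs on writing custom objective functions.",
            shape, n_particles,
        )
```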
Aaron
@whzup
Yes thanks it works when I change the shape!
It's a nice idea to check the input. I find the error message you get at the moment very confusing for a mistake that is so easy to make.
Aaron
@whzup
By the way: now that my script is working, it actually optimizes the cost below zero. I'd like to have a cost of exactly zero. I didn't find any way to change this, so how about an optional argument to specify the minimum, maximum, or exact value that the optimization should aim for?
Lj Miranda
@ljvmiranda921
Ok, I'll try to add the warning since a lot of people are reporting it. About the other one, hmmm, I'm a bit on the fence, since it defeats the purpose of optimization if we have an explicit "target" value. I suggest you engineer your cost function to minimize the "distance" between your target and the swarm's global best, similar to the inverse kinematics problem :smile:
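A minimal sketch of that suggestion: make the cost the distance to the target, so the new minimum is exactly zero when the target is hit. circuit_response is a hypothetical stand-in for the user's circuit model.

```python
import numpy as np

TARGET = 0.0  # value the quantity of interest should reach

def circuit_response(x):
    # Hypothetical placeholder for the user's circuit model;
    # x has shape (n_particles, dimensions).
    return np.sum(x, axis=1)

def cost(x):
    # Distance to the target: minimised (at 0) exactly when the
    # response equals TARGET, so the optimum can never go below zero.
    return np.abs(circuit_response(x) - TARGET)
```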
Lj Miranda
@ljvmiranda921

Hi friends @SioKCronin and @whzup , just a heads up, I might lie low for a while on PySwarms from July until August. I can still review code and do some patches or bug fixes, but may be passive in developing new features. I appreciate that @whzup and some students at St. Thomas University are planning to contribute. Again, I'll still review PRs and answer issues, so there's no problem with that.

I'll be moving to Tokyo this August for a research fellowship, and will go back to the Philippines by the end of September. Thanks to @SioKCronin for answering queries here in Gitter and to @whzup for actively thinking of solutions to stale issues and new features. Perhaps we can add you as a collaborator after releasing v.0.3.0 if you're interested.

Siobhán K Cronin
@SioKCronin
Hi @ljvmiranda921 Congrats on the fellowship! I'm going to be off my computer the next two weeks for some much needed reconnection with nature (have to gather inspiration for all those algorithms ;), but when I get back I'd like to double down on my commitment to the project: helping out here on Gitter, responding to issues, and hopefully providing some helpful PRs.
Aaron
@whzup
Hey @ljvmiranda921, Congrats from me too! And @SioKCronin have a nice reconnection :yum: Sure is necessary from time to time! Thank you @ljvmiranda921 a lot for creating this welcoming atmosphere! I'm learning a lot working on this library and it's a lot of fun working with you. I'd love to be a collaborator! Although, I'm not quite sure how much time I can spend on the project after the holidays. I heard the second year of university is going to be the hardest :smile: but I'll surely be able to contribute regularly.
Lj Miranda
@ljvmiranda921

Hi friends! Thank you so much for all your support.

Good luck and enjoy your well-deserved rest @SioKCronin ! I'm also happy to see your project going strong! It's nice to have a good ecosystem of swarm libraries!

No problem, @whzup . Just responding to issues and perhaps triaging PRs is already good. I'm happy that you are enjoying! Contribute and comment whenever you have time!

I'll respond to your issues and PRs next week; I'm getting swamped by logistics (moving out, filing documents, etc.) right now. The good thing is that we have a lot of things in the pipeline, and we'll just need to review, merge, and do releases.

Lj Miranda
@ljvmiranda921

Hi @whzup , I can add you as collaborator right now if you're interested. I believe you can triage the issues and projects, but cannot push directly to development or master (requires code review from one collaborator or owner).

I really appreciate you being proactive and I'm also sorry if I can't help out in actual development for the following weeks. Hopefully I can reply to your issues and PRs just like today.

There's no pressure here. We're not a very large project with sponsors or institutions pressuring us to deliver, so just dive in whenever you have time. I'm happy that we have people on the team to collaborate and bounce ideas with.

Aaron
@whzup
Thanks a lot @ljvmiranda921 ! Although, to be honest, I'd rather have someone look through my work before it goes into the main project :smile:. I'm trying to be a bit productive during my holidays; I start a 6-week job next week and may not be able to contribute too much because I'm probably going to work long hours (hourly wage) :yum:. I'm just enjoying myself a lot working on this project :smiley:. It's so much fun working with others and finding solutions together!
Lj Miranda
@ljvmiranda921
No problem, the master and dev branches are protected so it requires a review from the owner (me) for commits to be pushed. Glad you are enjoying, I appreciate your commitment!
Lj Miranda
@ljvmiranda921

The docs for our dev branch seem to be failing since the last merge commit. The error says:

Problem parsing YAML configuration. Invalid "requirements_file": path ./requirements_dev.txt does not exist

Which is weird, because we definitely have a requirements_dev.txt file, and it was not changed during the latest commit. Anyway, I searched around and figured out that this could be related to a ReadTheDocs bug: rtfd/readthedocs.org#4378 (posted today, still no fix yet)

Lj Miranda
@ljvmiranda921
Idk why but merging whenever the master branch is different from the development branch has been very tricky with me lately... hah
*tricky for me
Siobhán K Cronin
@SioKCronin
Did this proposed solution provide any help? rtfd/readthedocs.org#4379
Lj Miranda
@ljvmiranda921
Hi @SioKCronin , it seems it fixed itself, so now our docs are working well :+1:
MinaTugraz
@MinaTugraz
Hi everyone, I want to use PySwarms to optimize a parameter (which is neither a weight nor a bias) in an ANN, and I use Keras to train my network. How can I pass the Keras loss function to optimizer.optimize in PySwarms?
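One possible bridge between the two libraries, sketched under assumptions: build_and_evaluate is a hypothetical user-supplied function that applies the candidate parameter(s), evaluates the Keras model's loss (e.g. via model.evaluate), and returns a scalar; PySwarms only needs a function mapping the (n_particles, dimensions) swarm to a (n_particles,) array of losses.

```python
import numpy as np
import pyswarms as ps

def build_and_evaluate(params):
    # Hypothetical placeholder: in practice, set the non-weight
    # parameter(s) from `params`, train/evaluate the Keras model,
    # and return its scalar loss.
    return float(np.sum(params ** 2))

def swarm_cost(x):
    # x: (n_particles, dimensions) -> one Keras loss per particle
    return np.array([build_and_evaluate(p) for p in x])

optimizer = ps.single.GlobalBestPSO(
    n_particles=10, dimensions=1,
    options={"c1": 0.5, "c2": 0.3, "w": 0.9},
)
best_loss, best_params = optimizer.optimize(swarm_cost, iters=20)
```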
Gabriel
@gmichelassi
Hi everyone, I want to use PySwarms to optimize some parameters of my scikit-learn classifier (Random Forest)... I want to know if this is possible and what's the best way to do it, since I could not find anything about it in the documentation
khaliltalib277
@khaliltalib277
I want to use PySwarms to optimize some parameters of SVR (for regression problems)... I want to know if this is possible and what's the best way to do it?
Patrick Bezzina
@pbezz1
Any luck using PySwarms to optimize RF or SVR parameters, @gmichelassi / @khaliltalib277 ? Can you share your views? I'd like to get some ideas, please
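A sketch of the general recipe for tuning scikit-learn estimators with PySwarms (SVR here; a Random Forest would work the same way): each particle encodes a hyper-parameter vector, and the cost is the negated cross-validated score. The encoded parameters, bounds, and dataset are illustrative assumptions.

```python
import numpy as np
import pyswarms as ps
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

def cost(particles):
    # Each particle encodes (log10 C, log10 gamma); cost = -mean CV score
    losses = []
    for log_c, log_gamma in particles:
        model = SVR(C=10 ** log_c, gamma=10 ** log_gamma)
        score = cross_val_score(model, X, y, cv=3).mean()
        losses.append(-score)
    return np.array(losses)

# Search log10(C) in [-2, 3] and log10(gamma) in [-4, 1] (illustrative bounds)
bounds = (np.array([-2, -4]), np.array([3, 1]))
optimizer = ps.single.GlobalBestPSO(
    n_particles=15, dimensions=2,
    options={"c1": 0.5, "c2": 0.3, "w": 0.9}, bounds=bounds,
)
best_cost, best_pos = optimizer.optimize(cost, iters=25)
print("best log10(C), log10(gamma):", best_pos)
```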
AbbasThajeel
@AbbasThajeel
Hello, my friends. I want to write code for Particle Swarm Optimization with dynamic and static topologies.
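pyswarms ships several swarm topologies that can be plugged into GeneralOptimizerPSO (available from the 0.3.x releases); on the Ring and Random topologies, the static flag controls whether neighbourhoods are fixed or recomputed every iteration, which roughly maps onto the static/dynamic distinction asked about here. A hedged sketch assuming that API:

```python
import pyswarms as ps
from pyswarms.backend.topology import Ring
from pyswarms.utils.functions import single_obj as fx

# Ring topology needs 'k' (neighbours) and 'p' (Minkowski norm) besides c1/c2/w
options = {"c1": 0.5, "c2": 0.3, "w": 0.9, "k": 3, "p": 2}

# static=True  -> neighbourhoods fixed after the first iteration (static topology)
# static=False -> neighbourhoods recomputed every iteration (dynamic topology)
topology = Ring(static=False)

optimizer = ps.single.GeneralOptimizerPSO(
    n_particles=30, dimensions=4, options=options, topology=topology,
)
best_cost, best_pos = optimizer.optimize(fx.sphere, iters=100)
```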
pierremifasol
@pierremifasol

Hello,

I am interested in using PySwarms for 2D geometric optimisation. All my variables are, and need to be, integers, so I would like to use the discrete method. However, it seems that it handles neither bounds nor constraints on the variables the way the continuous method does. Is that something you plan to add?
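At the time of this discussion the discrete optimizer is binary-only, so a common workaround for bounded integer variables is to run a continuous optimizer with bounds and round inside the cost function; a sketch under that assumption:

```python
import numpy as np
import pyswarms as ps

def cost(x):
    # Round each candidate to integers before evaluating, so the swarm
    # effectively searches the integer lattice within the given bounds.
    xi = np.rint(x)
    return np.sum((xi - np.array([3, 7])) ** 2, axis=1)  # toy objective

# Integer design variables constrained to [0, 10] x [0, 20] (illustrative)
bounds = (np.array([0, 0]), np.array([10, 20]))

optimizer = ps.single.GlobalBestPSO(
    n_particles=20, dimensions=2,
    options={"c1": 0.5, "c2": 0.3, "w": 0.9}, bounds=bounds,
)
best_cost, best_pos = optimizer.optimize(cost, iters=50)
print("best integer solution:", np.rint(best_pos))
```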

houhoubema
@houhoubema
hello, I want to add PSO as a new optimizer in optimizers.py (a Python file). Please help
Ashish Upadhyay
@panditu2015
can someone point me towards some materials explaining the speed of running PSO in pyswarms? thanks in advance