    Ivan Eggel
    @ieggel

    Dear ImageCLEF Caption - Concept detection participants,

    We are happy to announce the launch of the submission phase. You can now submit runs by clicking the "Create Submission" button next to the tabs on the challenge page. The official submission deadline is May 1st 2018.

    Good luck to all of you! :) :four_leaf_clover:

    Ivan Eggel
    @ieggel
    Dear participants, the deadline was extended to Tuesday May 8th.
    Eduardo Pinho
    @Enet4
    Good day! Is the evaluation tool for this challenge the same as the one provided at imageclef.org? The link I found seems to point to the one used in the 2017 edition.
    Ivan Eggel
    @ieggel
    Hi @Enet4, it is the same; we only adjusted it to make it work on CrowdAI.
    Eduardo Pinho
    @Enet4
    @ieggel The ground truth concepts file isn't exactly in the same format, though. Has this been considered as well?
    Ivan Eggel
    @ieggel
    @Enet4 Yes, this has been considered. Please find the source code here: https://github.com/crowdAI/CLEF_evaluators_2018
    Eduardo Pinho
    @Enet4
    All right, thank you.
    Ivan Eggel
    @ieggel
    I put the link to the evaluation source code above.
    Eduardo Pinho
    @Enet4
    Oh, but compute_f1 in that code is still assuming a comma-separated list of concepts: https://github.com/crowdAI/CLEF_evaluators_2018/blob/master/concept_detection/concept_detection_evaluator.py#L153-L163
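    [Editor's note] The bug reported here comes down to how the concept field is split before the per-image F1 is computed. A minimal sketch of that computation follows; the function names, the semicolon default, and the "both sets empty scores 1.0" convention are illustrative assumptions, not the actual evaluator code:

    ```python
    def parse_concepts(field, sep=";"):
        """Split a concept field into a set of concept IDs, tolerating an empty field."""
        field = field.strip()
        return {c for c in field.split(sep) if c} if field else set()

    def image_f1(predicted, truth):
        """F1 between predicted and ground-truth concept sets for one image."""
        if not predicted and not truth:
            return 1.0  # assumed convention: both empty counts as a perfect match
        tp = len(predicted & truth)  # true positives: concepts found in both sets
        if tp == 0:
            return 0.0
        precision = tp / len(predicted)
        recall = tp / len(truth)
        return 2 * precision * recall / (precision + recall)
    ```

    Splitting "C1;C2" on a comma yields the single bogus concept "C1;C2", so even a fully correct run scores zero on that image, which would explain the wrong scores before the fix.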
    Ivan Eggel
    @ieggel
    @Enet4 I will have a look at that tomorrow morning as soon as I can. Will keep you up to date.
    Eduardo Pinho
    @Enet4
    @ieggel Thanks. I can leave you a pull request with the fix in the meantime.
    Ivan Eggel
    @ieggel
    @Enet4 I just had a quick look at the code and it seems that you are right. You can create a PR and I will have a deeper look at it tomorrow morning.
    @Enet4 Thank you very much
    Eduardo Pinho
    @Enet4
    Glad I could help! :)
    Ivan Eggel
    @ieggel
    BTW, can you send me a submission file at ivan.eggel@hevs.ch? That way I can test with your file.
    Eduardo Pinho
    @Enet4
    Yes, I can do that. :+1:
    Ivan Eggel
    @ieggel
    @Enet4 I merged your PR; there really was a bug. Thank you very much.
    NOBODY HAS TO RESUBMIT. WE RE-RAN YOUR SUBMISSIONS AND THE SCORES SHOULD BE CORRECT NOW.
    Ivan Eggel
    @ieggel
    I'm currently blocking new submissions for the concept detection challenge until the new code has been pushed to production. I'm just waiting for the CrowdAI team. I will let you know as soon as we reopen. Sorry for the inconvenience.
    Ivan Eggel
    @ieggel
    Updated code pushed to production. Submissions reopened.
    Ivan Eggel
    @ieggel
    The challenge has officially ended. Thanks to all participants!
    Ivan Eggel
    @ieggel
    Due to various requests, we will reopen submissions and leave them open until tomorrow 11:59 PM UTC.
    This will be the last deadline extension.
    Ivan Eggel
    @ieggel
    Dear participants,
    Post-challenge submissions are enabled from now on. Please find more information at the top of the challenge description page.
    Eduardo Pinho
    @Enet4
    Good day again. Could there be a double-check of the results presented on the official website (http://imageclef.org/2018/caption)? I noticed that two of the graded runs from CrowdAI are not listed.
    Ivan Eggel
    @ieggel
    Hi Eduardo, can you please specify which subtask?
    Ivan Eggel
    @ieggel
    I forwarded your question to the person in charge of those results.
    Eduardo Pinho
    @Enet4
    Thanks. I only checked the concept detection subtask. 28 runs were graded by the system, but the list on the website shows 26 runs.
    WCSEE Essex
    @wcsee_essex_twitter
    Hello Eduardo, I found we were missing one run and added it now. Can you send me the other run that is missing? I don't know which one it is.
    Thanks
    Eduardo Pinho
    @Enet4
    Hmm, if I'm not mistaken, the other one is ID 6026, from the AILAB team.
    WCSEE Essex
    @wcsee_essex_twitter
    I see now: the run name was the same for both, and I guess they got mixed up. I will update it now. Thanks for noticing!