Dominik
@kaidowei
where can I see the results of the cram test?
it just says it failed
Franklin Mathieu
@Snaipe
oh, right, I'm assuming you don't have cram installed on your system
install cram 0.6 with sudo pip install cram==0.6 (or with --user if you don't want to install it in /usr)
Dominik
@kaidowei
I have... if I remember correctly, we had this conversation some time ago
Franklin Mathieu
@Snaipe
okay
when in doubt, run make cram_tests
and to troubleshoot with ctest, you need to pass it an option
let me check which one
--output-on-failure
Dominik
@kaidowei
[100%] Built target criterion_samples
CMake Error at /*beep*/Criterion/.cmake/Modules/Cram.cmake:66 (message):
  Cram tests failed


make[3]: *** [cram_tests] Error 1
make[2]: *** [test/CMakeFiles/cram_tests.dir/all] Error 2
make[1]: *** [test/CMakeFiles/cram_tests.dir/rule] Error 2
make: *** [cram_tests] Error 2
Franklin Mathieu
@Snaipe
huh.
let me check on my end
oh, it's a bug in the cmake module, somehow
try export PYTHON_BIN=python3
and re-run the tests
I'll push a fix to address the case where PYTHON_BIN isn't set
Dominik
@kaidowei
yeah, works
Dominik
@kaidowei
mkay, did my part :)
Franklin Mathieu
@Snaipe
Thanks!
Dominik
@kaidowei
uuuh, just noticed that the locale-aware printing sometimes prints "0,05" instead of "0.05"
do you think XML parsers mind?
Franklin Mathieu
@Snaipe
oh, right
Dominik
@kaidowei
damn it
Franklin Mathieu
@Snaipe
well, according to the standard, xs:decimal needs a dot
actually, let me rephrase: the internet is telling me that the standard says that it needs a dot
Let me check for sure
Dominik
@kaidowei
so how do we fix that?
Franklin Mathieu
@Snaipe
well, you could swap the locale, but you'll have to restore it afterwards
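(For illustration: a minimal sketch in C of the swap-and-restore approach Franklin describes, assuming single-threaded output; the helper name print_decimal is hypothetical and not Criterion's actual code.)

    #include <locale.h>
    #include <stdio.h>

    /* Print a double with a '.' decimal separator regardless of the
       current locale, by temporarily switching LC_NUMERIC to "C" and
       restoring it afterwards. */
    static int print_decimal(FILE *f, double value)
    {
        /* setlocale returns a pointer to static storage that later
           calls may overwrite, so copy the current name first. */
        char saved[64];
        const char *cur = setlocale(LC_NUMERIC, NULL);
        snprintf(saved, sizeof (saved), "%s", cur ? cur : "C");

        setlocale(LC_NUMERIC, "C");   /* "C" locale always uses '.' */
        int rc = fprintf(f, "%.3f", value);
        setlocale(LC_NUMERIC, saved); /* restore the caller's locale */
        return rc;
    }

(Note that setlocale mutates process-global state, so this variant is unsafe if other threads format numbers concurrently.)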
Dominik
@kaidowei
that really stinks...
how did you do that for the other output providers?
Franklin Mathieu
@Snaipe
Other output providers don't have this restriction
and I didn't have any floats to print anyway, so that's why
Dominik
@kaidowei
that's not entirely true, the --tap option also prints floats
Franklin Mathieu
@Snaipe
I guess the cleanest way (as in, reusable for other output providers) would be to implement a compatibility function for this
yes, but TAP doesn't have the restriction, iirc the time is part of a comment string
so using the locale here is the right thing to do
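(For illustration: a sketch of such a compatibility function using the POSIX.1-2008 per-thread locale API, newlocale/uselocale, which avoids mutating the process-global locale; the function name is hypothetical and this is not Criterion's actual implementation.)

    #define _POSIX_C_SOURCE 200809L
    #include <locale.h>
    #include <stdio.h>

    /* Print a double with a '.' separator using a thread-local "C"
       numeric locale; other threads keep their own locale. */
    static int print_decimal_c_locale(FILE *f, double value)
    {
        /* Build a locale object whose LC_NUMERIC category is "C",
           leaving every other category untouched. */
        locale_t c_loc = newlocale(LC_NUMERIC_MASK, "C", (locale_t) 0);
        if (c_loc == (locale_t) 0)
            return -1;

        locale_t old = uselocale(c_loc); /* affects only this thread */
        int rc = fprintf(f, "%.3f", value);
        uselocale(old);                  /* restore the previous locale */
        freelocale(c_loc);
        return rc;
    }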
Dominik
@kaidowei
okay...
the Jenkins TAP parser had problems finding the time, but I guess that is a problem of the TAP format
(and the locale, maybe)
Franklin Mathieu
@Snaipe
this is precisely why I'm not testing the timestamps in cram
because the way to print time just isn't consistent
Dominik
@kaidowei
so the xml.t should be removed? (which is also not a good thing)
Franklin Mathieu
@Snaipe
no, but xml.t will probably not have time measurements anyway
I mean, time is disabled for tests because you can't really compare two times textually
well, you can, but I mean it's not reliable
because one test might take 1ms to complete one time, and 2ms the other time
Dominik
@kaidowei
okay... amend this pull request or make another one?