These are chat archives for dropbox/pyston
tokenizer.c's diff patch,
`make run_test_tokenize` used 40+ seconds in debug mode, so the unit test is broken because of a timeout (CPython 2.7.8 used 6s, Pyston without this patch used 8s, same test code). I'm compiling the code in release mode to see how long it takes, but still have no idea what to do; should we raise the timeout limit in the test code?
`make run_release_test_tokenize` used 8s like before. I think we can increase the unit test timeout, since the patch does not slow down the code in release mode?
`test_tokenize` is much slower than before, and it caused a timeout. I thought it was because of the changes in
`tokenizer.c`, but then I saw this code in the test:
```python
if not test_support.is_resource_enabled("cpu"):
    testfiles = random.sample(testfiles, 10)
```
`test_support.is_resource_enabled` always returns True in the current Pyston test environment, whereas before it always returned False.
`network`, I just disabled it with a hard-coded change. But I think this is not a good choice ...