Robyn Speer
When sources like WordNet and Wiktionary give us disambiguating information, we attempt to preserve it.
A custom script used in building ConceptNet decides how to assign WordNet senses to human-readable categories like "communication". It comes from the WordNet properties called "domain_category" and "lexical_domain".
Stanislav Shelemekh
Okay, thank you so much. What about the big CSV file in the Downloads ( https://github.com/commonsense/conceptnet5/wiki/Downloads )? Does it have any "domain_category" or "lexical_domain" info? I was trying to find any such cases but didn't succeed :(
I do not know the answer, but a way to tell where "communication" comes from is to look at the Python code behind the given page
Pranav Pawar

Hi, I am new to ConceptNet; I have set up the Amazon AMI, following the instructions from here: https://github.com/commonsense/conceptnet5/wiki/Running-your-own-copy

As mentioned there, there is a known timeout issue when using the AMI; I tried the solution for this problem given in the documentation:

Here's a command to warm up the entire disk, by accessing every byte of data on the disk:

sudo dd if=/dev/xvda of=/dev/null bs=16M

But it doesn't seem to work; I'm facing an issue of a missing /dev/xvda device.

Can somebody help me resolve this? Any other solution for the timeout issue is also welcome!

Zhang Cheng
Hi, everyone! I'm new to ConceptNet, and I want to know how to calculate relatedness based on Numberbatch. Is it just cosine similarity or a dot product?
Zhang Cheng
s = a1·a2 + b1·b2 + w1·(b2 − a2)·(b1 − a1) + w2·(b2 − b1)·(a2 − a1)
Is this the formula for relatedness? w1 = 0.2 and w2 = 0.6 were the weights for Numberbatch 16.09; what are the weights for the latest Numberbatch (19.08)?
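For reference, the relatedness scores ConceptNet's API reports are, as far as the documentation describes them, cosine similarities between Numberbatch term vectors; the weighted interpolation formula above is a separate question about release-specific constants. A minimal sketch of cosine relatedness (the vectors below are toy stand-ins, not real Numberbatch rows):

```python
import numpy as np

def relatedness(v1, v2):
    """Cosine similarity between two term vectors; 0.0 for a zero vector."""
    denom = np.linalg.norm(v1) * np.linalg.norm(v2)
    if denom == 0:
        return 0.0
    return float(np.dot(v1, v2) / denom)

# Toy vectors standing in for Numberbatch rows (hypothetical values)
cat = np.array([0.1, 0.3, -0.2])
feline = np.array([0.09, 0.28, -0.22])
print(relatedness(cat, feline))
```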
Hi, wondering if anyone would be so kind as to assist in installing ConceptNet with the database on a machine. I've spent a week on it using virtual machines, configuring Puppet, resetting my machine, etc., and pulling my hair out... super frustrated. :-(
@christat13 Same thing all over again: I put online a simple way to load ConceptNet into a database that has a REST interface, but nobody gave any feedback, so I do not know how to proceed. See https://github.com/amirouche/easy-conceptnet
Does the ConceptNet API accept spelling mistakes?
Anna Rogers
@christat13 if a local database would suffice, here's a Python library that makes things simple: https://pypi.org/project/conceptnet-lite/

Error in rule convert_opensubtitles_ft:
jobid: 0
output: data/vectors/fasttext-opensubtitles.h5

CalledProcessError in line 674 of /home/conceptnet5/Snakefile:
Command 'set -euo pipefail; CONCEPTNET_DATA=data cn5-vectors convert_fasttext -n 2000000 data/raw/vectors/ft-opensubtitles.vec.gz data/vectors/fasttext-opensubtitles.h5' returned non-zero exit status 137.
File "/home/conceptnet5/Snakefile", line 674, in __rule_convert_opensubtitles_ft
File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Exiting because a job execution failed. Look above for error message
[Fri Nov 27 21:43:20 2020]
Finished job 183.
11 of 94 steps (12%) done
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /home/conceptnet5/.snakemake/log/2020-11-27T213538.909921.snakemake.log

I'm getting this error. It would be great if anyone could help.
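One hint on the error above: exit status 137 means the process was killed by signal 9 (SIGKILL), since shells report a signal death as 128 plus the signal number. During a build like this it is most often the kernel's OOM killer, because converting the full fastText vectors needs a lot of RAM (this is my reading of the status code, not something from the ConceptNet docs). A quick check of the arithmetic:

```python
import signal

# Shells encode "killed by signal N" as exit status 128 + N, so 137
# corresponds to signal 9 (SIGKILL), typically sent by the kernel's
# OOM killer when a process exhausts available memory.
exit_status = 137
sig = exit_status - 128
print(sig)  # prints 9
```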
@amirouche Hi... when trying the 'make database-download' step I'm facing the error: make: *** No rule to make target 'database-download'. Stop. Can you help?
@subhadeepdas7593-prog I just figured out that I committed the wrong thing
It will take a few days to fix
I need to go through my backups
Hello,
Is ConceptNet version 5 from 2012 available to download? Unfortunately I couldn't find a download source for this version.
Alistair Nottle (Personal)
Hello! I am trying to install a local instance of ConceptNet using the Puppet method. It initially executes fine, but towards the end I'm getting a consistent error. Has anybody got any hints? Thank you.
[Tue Dec 22 15:02:54 2020]
rule join_propagate:
    input: data/vectors/numberbatch-biased.h5.shard0, data/vectors/numberbatch-biased.h5.shard1, data/vectors/numberbatch-biased.h5.shard2, data/vectors/numberbatch-biased.h5.shard3, data/vectors/numberbatch-biased.h5.shard4, data/vectors/numberbatch-biased.h5.shard5
    output: data/vectors/numberbatch-biased.h5
    jobid: 18
    resources: ram=24

Unable to open/create file 'data/vectors/numberbatch-biased.h5.shard5'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/conceptnet/env/bin/cn5-vectors", line 33, in <module>
    sys.exit(load_entry_point('ConceptNet', 'console_scripts', 'cn5-vectors')())
  File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/conceptnet/conceptnet5/conceptnet5/vectors/cli.py", line 339, in run_join_shard_files
    join_shards(filename, nshards, sort=sort)
  File "/home/conceptnet/conceptnet5/conceptnet5/vectors/retrofit.py", line 58, in join_shards
    shard = load_hdf(output_filename + '.shard%d' % i)
  File "/home/conceptnet/conceptnet5/conceptnet5/vectors/formats.py", line 21, in load_hdf
    return pd.read_hdf(filename, 'mat', encoding='utf-8')
  File "/home/conceptnet/env/lib/python3.8/site-packages/pandas/io/pytables.py", line 389, in read_hdf
    store = HDFStore(path_or_buf, mode=mode, errors=errors, **kwargs)
  File "/home/conceptnet/env/lib/python3.8/site-packages/pandas/io/pytables.py", line 553, in __init__
    self.open(mode=mode, **kwargs)
  File "/home/conceptnet/env/lib/python3.8/site-packages/pandas/io/pytables.py", line 729, in open
    raise IOError(str(err)) from err
OSError: HDF5 error back trace

  File "H5F.c", line 509, in H5Fopen
    unable to open file
  File "H5Fint.c", line 1400, in H5F__open
    unable to open file
  File "H5Fint.c", line 1709, in H5F_open
    unable to read root group
  File "H5Groot.c", line 239, in H5G_mkroot
    can't check if symbol table message exists
  File "H5Omessage.c", line 883, in H5O_msg_exists
    unable to protect object header
  File "H5Oint.c", line 1100, in H5O_protect
    unable to load object header
  File "H5AC.c", line 1625, in H5AC_protect
    H5C_protect() failed
  File "H5C.c", line 2362, in H5C_protect
    can't load entry
  File "H5C.c", line 6712, in H5C_load_entry
    incorrect metadata checksum after all read attempts
  File "H5Ocache.c", line 219, in H5O__cache_get_final_load_size
    can't deserialize object header prefix
  File "H5Ocache.c", line 1231, in H5O__prefix_deserialize
    bad object header version number

End of HDF5 error back trace

Unable to open/create file 'data/vectors/numberbatch-biased.h5.shard5'
Is there a lightweight, English-only version of ConceptNet for integrating into other apps?
Wellington Franco

Is anybody else getting this error? It would be great if anyone could help.

return _wf_cache[args]
KeyError: ('##', 'ja', 'best', 0.0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/anaconda3/bin/cn5-vectors", line 11, in <module>
    load_entry_point('ConceptNet', 'console_scripts', 'cn5-vectors')()
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/dados/conceptnet5/conceptnet5/vectors/cli.py", line 288, in run_miniaturize
    mini = miniaturize(frame, other_vocab=other_vocab, k=k)
  File "/dados/conceptnet5/conceptnet5/vectors/miniaturize.py", line 51, in miniaturize
    term for term in frame.index if '#' not in term and term_freq(term) >= 1e-8
  File "/dados/conceptnet5/conceptnet5/vectors/miniaturize.py", line 51, in <listcomp>
    term for term in frame.index if '#' not in term and term_freq(term) >= 1e-8
  File "/dados/conceptnet5/conceptnet5/vectors/miniaturize.py", line 25, in term_freq
    return wordfreq.word_frequency(term, lang)
  File "/root/anaconda3/lib/python3.7/site-packages/wordfreq/__init__.py", line 301, in word_frequency
    _wf_cache[args] = _word_frequency(*args)
  File "/root/anaconda3/lib/python3.7/site-packages/wordfreq/__init__.py", line 244, in _word_frequency
    tokens = lossy_tokenize(word, lang)
  File "/root/anaconda3/lib/python3.7/site-packages/wordfreq/tokens.py", line 313, in lossy_tokenize
    tokens = tokenize(text, lang, include_punctuation, external_wordlist)

Hello everyone! I apologize if this has been asked before, but is there a way to report incorrect connections on ConceptNet? When I searched "robot", it listed that robots are capable of "[crying] salt water" and "[playing] the ocarina". Neither of these seems accurate to me, but then again, I'm not a robot expert. Any advice would be appreciated :)
Hanna L Tischer
Hey, I was wondering what the best way to parse the ConceptNet CSV file is? Also, is there a file that only has English words?
Rejnald Lleshi
Hi everyone! Whenever I query the REST API like so: 'http://api.conceptnet.io/c/en/' + word + '?limit=2000', I get a variable number of results even though I am setting limit=2000. Any idea why this is happening?
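A likely explanation: the API serves edges a page at a time and will not return an arbitrarily large limit in one response, so the edge count per reply varies; each JSON-LD response carries a 'view' object with a 'nextPage' path to follow. This is a sketch of following it, with the fetching function injected so it is easy to test; the field names are my reading of the public api.conceptnet.io responses, so verify them against a live reply:

```python
def all_edges(uri, fetch_json, limit=1000):
    """Collect all edges for a concept URI, following JSON-LD pagination.

    fetch_json is any callable mapping a URL to a parsed JSON dict,
    e.g. lambda url: requests.get(url).json().
    """
    edges = []
    url = 'http://api.conceptnet.io%s?limit=%d' % (uri, limit)
    while url is not None:
        obj = fetch_json(url)
        edges.extend(obj.get('edges', []))
        # 'view' is absent on the last (or only) page
        next_page = obj.get('view', {}).get('nextPage')
        url = 'http://api.conceptnet.io' + next_page if next_page else None
    return edges
```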
@hannalt Hi Hanna, I wrote a simple Python script that filters English assertions from the raw assertions.csv; you can take a look here:
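For anyone parsing the raw file: assertions.csv is tab-separated, with the assertion URI, relation, start concept, end concept, and a JSON metadata column, so an English-only filter can key off the '/c/en/' prefix of the concept URIs. A minimal sketch along those lines (column layout as described above; worth double-checking against your dump):

```python
import csv

def english_only(in_path, out_path):
    """Keep only assertions whose start and end concepts are English.

    Assumes the ConceptNet assertions CSV layout: tab-separated columns
    of assertion URI, relation, start concept, end concept, JSON metadata.
    """
    with open(in_path, encoding='utf-8') as fin, \
         open(out_path, 'w', encoding='utf-8', newline='') as fout:
        reader = csv.reader(fin, delimiter='\t')
        writer = csv.writer(fout, delimiter='\t')
        for row in reader:
            if len(row) < 4:
                continue  # skip malformed lines
            start, end = row[2], row[3]
            if start.startswith('/c/en/') and end.startswith('/c/en/'):
                writer.writerow(row)
```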
Hi guys, I wonder how you can get the links to WordNet from the assertions.csv?
Hello everyone, I am trying to import the ConceptNet graph into Neo4j. I found some instances on GitHub: https://github.com/redsk/neo_concept but they were all done a long time ago and the ConceptNet structure is different now. Can anyone help me extract ConceptNet into nodes and relationships CSV files that I can later import into a Neo4j database? Thank you in advance.
Hi Kiko, I wrote a reader for it that you could use to dump a CSV file with minor modifications: https://github.com/curiosity-ai/catalyst-conceptnet - or just ingest it directly using the Neo4j driver for C#
Hi, I want to download all the relations with surfaceText in CSV format. Is there a URL to do that directly?
@rspeer I want to get results with relation=IsA and my language preference is English.
So, I am using a python request:
response = requests.get('http://api.conceptnet.io/query?rel=/r/IsA&lang=en')
Still, I am getting results in Japanese.
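One thing to check here: I am not certain the /query endpoint honors a lang parameter at all, so a client-side guard on each returned edge is the safe fallback. A sketch, assuming the edge structure of api.conceptnet.io responses ('start'/'end' dicts carrying a 'language' field and an '@id' URI):

```python
def english_edges(edges):
    """Keep only edges whose start and end nodes are both English.

    Falls back to the '/c/en/' URI prefix when an explicit 'language'
    field is missing from the node dict.
    """
    def is_english(node):
        return (node.get('language') == 'en'
                or node.get('@id', '').startswith('/c/en/'))

    return [edge for edge in edges
            if is_english(edge.get('start', {}))
            and is_english(edge.get('end', {}))]
```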
Hello everyone!!
This is a program extracting relations using the ConceptNet API. I am new and have little knowledge of ConceptNet. I have been getting an error for a very long time.
This is the error:
multiprocessing.pool.MaybeEncodingError: Error sending result: '<multiprocessing.pool.ExceptionWithTraceback object at 0x7f9d326b5940>'. Reason: 'TypeError("cannot pickle '_io.BufferedReader' object")'
I appreciate your help. Thanks in advance.

Has anyone else gotten an error like mine?
return _wf_cache[args]
KeyError: ('##', 'ja', 'best', 0.0)

During handling of the above exception, another exception occurred:

Robyn Speer
I think the above is an issue I responded to over e-mail as well, but it's a problem with the dependencies of wordfreq, which is a dependency of ConceptNet. I've been working on untangling those dependencies. pip install --upgrade wordfreq should solve it.
@92komal I'd need a lot more context in your case, such as why multiprocessing and pickle are involved. Neither of those modules is involved in ConceptNet as far as I know.
Oh, now I see the code. Anyway, right: that's a problem with something you're trying to send between processes with multiprocessing, and I believe you'd encounter it no matter what API you're using.