Hi, wondering if anyone would be so kind as to assist in installing ConceptNet with the database on a machine. I've spent a week on it using virtual machines, configuring Puppet, resetting my machine, etc., and pulling my hair out... super frustrated. :-(
@christat13 Same thing all over again; I put online a simple way to load ConceptNet into a database that has a REST interface, but nobody gave any feedback, so I don't know how to proceed. See https://github.com/amirouche/easy-conceptnet
Does the ConceptNet API accept spelling mistakes?
Anna Rogers
@christat13 if a local database would suffice, here's a Python library that makes things simple: https://pypi.org/project/conceptnet-lite/

Error in rule convert_opensubtitles_ft:
jobid: 0
output: data/vectors/fasttext-opensubtitles.h5

CalledProcessError in line 674 of /home/conceptnet5/Snakefile:
Command 'set -euo pipefail; CONCEPTNET_DATA=data cn5-vectors convert_fasttext -n 2000000 data/raw/vectors/ft-opensubtitles.vec.gz data/vectors/fasttext-opensubtitles.h5' returned non-zero exit status 137.
File "/home/conceptnet5/Snakefile", line 674, in __rule_convert_opensubtitles_ft
File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Exiting because a job execution failed. Look above for error message
[Fri Nov 27 21:43:20 2020]
Finished job 183.
11 of 94 steps (12%) done
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /home/conceptnet5/.snakemake/log/2020-11-27T213538.909921.snakemake.log

Getting this error??
It would be great if anyone could help
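For what it's worth, exit status 137 in the log above is 128 + 9, i.e. the process was killed with SIGKILL, which on Linux usually means the kernel's out-of-memory killer stopped the memory-hungry fastText conversion. A small sketch decoding shell-style exit statuses (the helper name is illustrative, not part of ConceptNet):

```python
import signal

def describe_exit_status(status: int) -> str:
    """Decode a shell-style exit status: values above 128 mean the
    process was terminated by signal number (status - 128)."""
    if status > 128:
        sig = signal.Signals(status - 128)
        return f"killed by signal {sig.name} ({sig.value})"
    return f"exited normally with code {status}"

print(describe_exit_status(137))  # killed by signal SIGKILL (9)
```

If the OOM killer is indeed the cause, reducing the `-n 2000000` vector count or adding RAM/swap are typical workarounds (an assumption, not verified against this exact build).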
@amirouche Hi... when trying the 'make database-download' step I'm facing "make: *** No rule to make target 'database-download'. Stop." Can you help?
@subhadeepdas7593-prog I just figured out that I committed the wrong thing
It will take a few days to fix
I need to go through my backups
Hello,
Is ConceptNet version 5 from 2012 available to download? Unfortunately I couldn't find a download source for this version.
Alistair Nottle (Personal)
Hello! I am trying to install a local instance of ConceptNet, using the Puppet method. It initially executes fine, but towards the end I'm getting a consistent error. Anybody got any hints?! Thank you.
[Tue Dec 22 15:02:54 2020]
rule join_propagate:
    input: data/vectors/numberbatch-biased.h5.shard0, data/vectors/numberbatch-biased.h5.shard1, data/vectors/numberbatch-biased.h5.shard2, data/vectors/numberbatch-biased.h5.shard3, data/vectors/numberbatch-biased.h5.shard4, data/vectors/numberbatch-biased.h5.shard5
    output: data/vectors/numberbatch-biased.h5
    jobid: 18
    resources: ram=24

Unable to open/create file 'data/vectors/numberbatch-biased.h5.shard5'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/conceptnet/env/bin/cn5-vectors", line 33, in <module>
    sys.exit(load_entry_point('ConceptNet', 'console_scripts', 'cn5-vectors')())
  File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/conceptnet/conceptnet5/conceptnet5/vectors/cli.py", line 339, in run_join_shard_files
    join_shards(filename, nshards, sort=sort)
  File "/home/conceptnet/conceptnet5/conceptnet5/vectors/retrofit.py", line 58, in join_shards
    shard = load_hdf(output_filename + '.shard%d' % i)
  File "/home/conceptnet/conceptnet5/conceptnet5/vectors/formats.py", line 21, in load_hdf
    return pd.read_hdf(filename, 'mat', encoding='utf-8')
  File "/home/conceptnet/env/lib/python3.8/site-packages/pandas/io/pytables.py", line 389, in read_hdf
    store = HDFStore(path_or_buf, mode=mode, errors=errors, **kwargs)
  File "/home/conceptnet/env/lib/python3.8/site-packages/pandas/io/pytables.py", line 553, in __init__
    self.open(mode=mode, **kwargs)
  File "/home/conceptnet/env/lib/python3.8/site-packages/pandas/io/pytables.py", line 729, in open
    raise IOError(str(err)) from err
OSError: HDF5 error back trace

  File "H5F.c", line 509, in H5Fopen
    unable to open file
  File "H5Fint.c", line 1400, in H5F__open
    unable to open file
  File "H5Fint.c", line 1709, in H5F_open
    unable to read root group
  File "H5Groot.c", line 239, in H5G_mkroot
    can't check if symbol table message exists
  File "H5Omessage.c", line 883, in H5O_msg_exists
    unable to protect object header
  File "H5Oint.c", line 1100, in H5O_protect
    unable to load object header
  File "H5AC.c", line 1625, in H5AC_protect
    H5C_protect() failed
  File "H5C.c", line 2362, in H5C_protect
    can't load entry
  File "H5C.c", line 6712, in H5C_load_entry
    incorrect metadata checksum after all read attempts
  File "H5Ocache.c", line 219, in H5O__cache_get_final_load_size
    can't deserialize object header prefix
  File "H5Ocache.c", line 1231, in H5O__prefix_deserialize
    bad object header version number

End of HDF5 error back trace

Unable to open/create file 'data/vectors/numberbatch-biased.h5.shard5'
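The HDF5 back trace above ("bad object header version number") points at a truncated or corrupted shard file rather than a code bug. One hedged way to confirm is to check the 8-byte HDF5 signature at the start of the file; a shard failing this check can be deleted so Snakemake regenerates it (the helper name is illustrative):

```python
# The magic bytes every valid HDF5 file starts with (per the HDF5 format spec).
HDF5_SIGNATURE = b"\x89HDF\r\n\x1a\n"

def looks_like_hdf5(path):
    """Return True if the file begins with the HDF5 superblock signature.

    A shard such as data/vectors/numberbatch-biased.h5.shard5 that fails
    this check is truncated or corrupted on disk.
    """
    with open(path, "rb") as f:
        return f.read(8) == HDF5_SIGNATURE
```

Note this only catches corruption at the very start of the file; the checksum error in the trace could also come from damage deeper in the file, so re-running the failing rule after deleting the shard is the safer fix either way.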
Is there a lightweight, English-only version of ConceptNet for integrating into other apps?
5 replies
Wellington Franco

Anybody getting this error??
Please, it would be great if anyone could help

return _wf_cache[args]
KeyError: ('##', 'ja', 'best', 0.0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/anaconda3/bin/cn5-vectors", line 11, in <module>
    load_entry_point('ConceptNet', 'console_scripts', 'cn5-vectors')()
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/root/anaconda3/lib/python3.7/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/dados/conceptnet5/conceptnet5/vectors/cli.py", line 288, in run_miniaturize
    mini = miniaturize(frame, other_vocab=other_vocab, k=k)
  File "/dados/conceptnet5/conceptnet5/vectors/miniaturize.py", line 51, in miniaturize
    term for term in frame.index if '_' not in term and term_freq(term) >= 1e-8
  File "/dados/conceptnet5/conceptnet5/vectors/miniaturize.py", line 51, in <listcomp>
    term for term in frame.index if '_' not in term and term_freq(term) >= 1e-8
  File "/dados/conceptnet5/conceptnet5/vectors/miniaturize.py", line 25, in term_freq
    return wordfreq.word_frequency(term, lang)
  File "/root/anaconda3/lib/python3.7/site-packages/wordfreq/__init__.py", line 301, in word_frequency
    _wf_cache[args] = _word_frequency(*args)
  File "/root/anaconda3/lib/python3.7/site-packages/wordfreq/__init__.py", line 244, in _word_frequency
    tokens = lossy_tokenize(word, lang)
  File "/root/anaconda3/lib/python3.7/site-packages/wordfreq/tokens.py", line 313, in lossy_tokenize
    tokens = tokenize(text, lang, include_punctuation, external_wordlist)

Hello everyone! I apologize if this has been asked before, but is there a way to report incorrect connections on ConceptNet? When I searched "robot", it listed that robots are capable of "[crying] salt water" and "[playing] the ocarina". Neither of these seems accurate to me, but then again, I'm not a robot expert. Any advice would be appreciated :)
Hanna L Tischer
Hey, I was wondering what the best way to parse the CSV file for ConceptNet is? Also, is there a file that only has English words?
Rejnald Lleshi
Hi everyone! Whenever I query the REST API like so: 'http://api.conceptnet.io/c/en/' + word + '?limit=2000', I get a variable number of results even though I am setting limit=2000. Any idea why this is happening?
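One likely explanation: the public API caps the page size server-side and paginates, with each response carrying a `view` object whose `nextPage` field links to the next slice (field names follow the API's JSON-LD pagination scheme; the exact cap is not verified here). A sketch that walks the pages, with fetching injected so the page-walking logic works on plain dicts:

```python
def collect_edges(page, fetch):
    """Accumulate 'edges' across paginated ConceptNet responses.

    `page` is a decoded JSON response; `fetch` maps a path like
    '/c/en/word?offset=20&limit=20' to the next decoded page.
    Stops when a page has no 'view'/'nextPage' link.
    """
    edges = list(page.get("edges", []))
    while "view" in page and "nextPage" in page["view"]:
        page = fetch(page["view"]["nextPage"])
        edges.extend(page.get("edges", []))
    return edges
```

With the requests library this might be driven as `collect_edges(requests.get(url).json(), lambda p: requests.get('http://api.conceptnet.io' + p).json())`.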
@hannalt Hi Hanna, I wrote a simple Python script that filters English assertions from the raw assertions.csv; you can take a look here:
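For reference, the raw assertions file is tab-separated with five fields per row: assertion URI, relation, start, end, and a JSON metadata blob. A minimal English-only filter along the lines described, assuming that layout, might look like:

```python
def english_rows(path):
    """Yield rows of a ConceptNet assertions file (tab-separated:
    URI, relation, start, end, JSON metadata) whose start and end
    are both English concepts (URIs beginning with /c/en/)."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            uri, rel, start, end, meta = line.rstrip("\n").split("\t", 4)
            if start.startswith("/c/en/") and end.startswith("/c/en/"):
                yield (uri, rel, start, end, meta)
```

Plain `str.split` is used instead of the csv module so the double quotes inside the JSON field are never treated as CSV quoting.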
Hi guys, I wonder how you can get the links to WordNet from the assertions.csv?
Hello everyone, I am trying to import the ConceptNet graph into Neo4j. I found some examples on GitHub: https://github.com/redsk/neo_concept but they were all written a long time ago and the ConceptNet structure is different now. Can anyone help me extract ConceptNet into node and relationship CSV files that I can later import into a Neo4j database? Thank you in advance.
Hi Kiko - I wrote a reader for it that you could use to dump a CSV file with minor modifications: https://github.com/curiosity-ai/catalyst-conceptnet - or just ingest directly using the Neo4j driver for C#
Hi, I want to download all the relations with surfaceText in CSV format. Is there a URL to do that directly?
@rspeer I want to get results with relation=IsA and a language preference of English.
So, I am using a Python request:
response = requests.get('http://api.conceptnet.io/query?rel=/r/IsA&lang=en')
Still, I am getting results in Japanese.
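I'm not certain the /query endpoint honors a `lang` parameter at all, which would explain why other languages come back. A safe workaround is to filter the returned edges client-side on the endpoint IDs (the edge shape below follows the API's JSON output, where each endpoint carries an `@id` like `/c/en/dog`):

```python
def english_only(edges):
    """Keep only edges whose start and end are both English concepts.
    `edges` is the 'edges' list from a decoded API response."""
    return [
        e for e in edges
        if e["start"]["@id"].startswith("/c/en/")
        and e["end"]["@id"].startswith("/c/en/")
    ]
```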
Hello everyone!!
This is a program that extracts relations using the ConceptNet API. I am new and don't know much about ConceptNet. I have been getting an error for a very long time.
This is the error:
multiprocessing.pool.MaybeEncodingError: Error sending result: '<multiprocessing.pool.ExceptionWithTraceback object at 0x7f9d326b5940>'. Reason: 'TypeError("cannot pickle '_io.BufferedReader' object")
I appreciate your help. Thanks in advance.

Has anyone gotten an error like mine?
return _wf_cache[args]
KeyError: ('##', 'ja', 'best', 0.0)

During handling of the above exception, another exception occurred:

Robyn Speer
I think the above is an issue I responded to over e-mail as well, but it's a problem with the dependencies of wordfreq, which is a dependency of ConceptNet. I've been working on untangling those dependencies. pip install --upgrade wordfreq should solve it.
@92komal I'd need a lot more context in your case, such as why multiprocessing and pickle are involved. Neither of those modules are involved in ConceptNet as far as I know.
oh, now I see the code. Anyway, right, that's a problem with something you're trying to send between processes in multiprocessing, and I believe you'd encounter it no matter what API you're using.
Anmol Kalia
Can someone point me to the best way to get all concepts that have an /r/PartOf relation with a given concept, e.g. airplane?
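One approach, assuming the /query endpoint's `end` and `rel` parameters work as they appear to elsewhere in this thread, is to ask for edges whose end is the given concept and whose relation is /r/PartOf. A tiny URL builder (the helper name is illustrative):

```python
from urllib.parse import urlencode

def part_of_query(concept, lang="en"):
    """Build a /query URL for edges asserting '<X> PartOf <concept>'.
    The given concept appears as the 'end' of each PartOf edge."""
    params = {"end": f"/c/{lang}/{concept}", "rel": "/r/PartOf"}
    # safe='/' keeps the URI slashes readable instead of %2F-escaping them
    return "http://api.conceptnet.io/query?" + urlencode(params, safe="/")
```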
Hello guys, are the servers down today? I keep getting error 500
Hi. How can I get the embedding vector for a sentence using ConceptNet? Thanks
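ConceptNet's Numberbatch vectors are per-term, not per-sentence. A common crude approach (an approximation, not an official ConceptNet feature) is to average the vectors of the sentence's in-vocabulary words:

```python
def sentence_vector(sentence, vectors):
    """Crude sentence embedding: mean of the word vectors found in
    `vectors`, a dict mapping words to equal-length lists of floats
    (e.g. loaded from numberbatch-en.txt). Out-of-vocabulary words
    are skipped; returns None if no word is in the vocabulary."""
    words = [w for w in sentence.lower().split() if w in vectors]
    if not words:
        return None
    dim = len(next(iter(vectors.values())))
    return [sum(vectors[w][i] for w in words) / len(words)
            for i in range(dim)]
```

More careful versions weight words by frequency (e.g. SIF weighting) or use a dedicated sentence encoder, but the mean is a serviceable baseline.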
Alan Hogue
I'm seeing the same server errors.
I have a question. I see in the CSV download that there are a lot of relations between (to-from?) WordNet lexemes, but I don't see WordNet sense keys or anything that I recognize as pointing uniquely to something in WordNet. We'd like to use the two together, so my question is: does a unique WN identifier exist anywhere in the data, perhaps accessible through the API that is currently down? Thanks!
Navneet Agarwal

Hello guys, are servers down today? i keep getting error 500

Hello guys, are the servers still down? (accessed at 6:24 am, 3 June 2021)


Yes, I still get the error, unfortunately...

Navneet Agarwal
Yep, I also keep getting the error.
Jean-Louis Villecroze
Getting error 500 as well (both for the HTTP and HTTPS server) :(
Jean-Louis Villecroze
Left an issue on GitHub for the server error we have been seeing for a while now...
Jean-Louis Villecroze
It got fixed (once they saw the issue on GitHub) :)
Hello guys, I hope you're doing well. Can I use ConceptNet through Python?
Same question as above. Is there a step-by-step guide to downloading the ConceptNet dataset?