Dat
@dat_project_twitter
nettle p2p on the long tail is tricky
okdistribute yeah
rangermauve Having a DHT to discover feeds related to pogs and then using a materialized view for the actual data seems a bit easier
okdistribute Then it's like AOL
okdistribute 😂 A/S/L?
rangermauve Hell yeah
okdistribute lots of issues with that though in real life.
nettle unless you have superpeers (like internet archive or something) with many resources seeding that long tail
rangermauve nettle: could you elaborate on what you mean by long tail?
nettle rangermauve: the huge amount of relatively unpopular stuff
nettle like the 10ks of npm modules almost nobody uses
nettle lots of resources needing to host them
nettle but little demand
rangermauve Yeah, I think it'd be cool to have stuff be more ephemeral by default. Like, in this world I'd like to see people actively advertising that they're into pogs, and have them remove their entries from the DHT afterwards. It kinda works with hypercore and hyperswarm right now if you create fake hypercores by hashing topics for keys and discovering peers that way. After you close the swarm it'll remove itself from the DHT to prevent false positives
rangermauve And then finding data about others through links and hopefully being able to load data from the peers you got the links through
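A minimal sketch of the topic-hashing trick rangermauve describes, assuming the 2019-era hyperswarm module: hash a human-readable topic into a 32-byte key as if it were a hypercore discovery key, join the swarm on it, and leave when done so the DHT entry disappears. The topic string is invented for illustration:

```js
const hyperswarm = require('hyperswarm')
const crypto = require('crypto')

const swarm = hyperswarm()

// Derive a 32-byte key from a human-readable topic, as if it were a
// hypercore discovery key. 'interested-in-pogs' is a made-up example topic.
const topic = crypto.createHash('sha256').update('interested-in-pogs').digest()

swarm.join(topic, {
  lookup: true,   // find peers announcing this topic
  announce: true  // advertise ourselves on the DHT
})

swarm.on('connection', (socket, info) => {
  // Peers here hashed the same topic; exchange feed keys / links with them.
  console.log('found a peer for this topic', info.peer)
})

// Later: stop announcing so stale DHT entries don't cause false positives.
// swarm.leave(topic)
// swarm.destroy()
```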
substack rangermauve: this problem of coming up with groups of peers while reducing noise (amount of traffic peers not interested in the topic need to relay) is well researched in the literature around pubsub
rangermauve substack: So if I understand correctly, this would be a layer sitting on top of the raw peer discovery / connection code and people would need to do some sort of query to pull data from the overlay?
rangermauve Or is this mostly useful for getting updates from the network?
substack connecting to peers based on topics would happen separately from queries i think
substack but also, implementing some of those papers is a large amount of work
substack the combination of techniques is going to vary a lot based on the different types of apps people make
rangermauve Hmmm, do you have any reading recommendations on how to pull data from overlay networks? I think I get the gist of subscribing to data that's coming in
substack rangermauve: if you think of a p2p version of an app like twitter, each peer is interested in receiving updates from a set of peers and perhaps also from hashtags
substack and these algorithms let you calculate a set of peers to connect to who are likely interested in the same topics
substack and with those sets you can also generate meta-topics to use with tools like hyperswarm
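The pubsub papers aren't named here, but the scoring intuition substack describes can be sketched naively: rank candidate peers by how much their advertised topic sets overlap yours, then connect to the top k. Jaccard similarity is an assumption standing in for whatever a real overlay algorithm would use, and all identifiers below are hypothetical:

```js
// Overlap between two Sets of topic strings, as a score in [0, 1].
function jaccard (a, b) {
  const inter = [...a].filter(t => b.has(t)).length
  const union = new Set([...a, ...b]).size
  return union === 0 ? 0 : inter / union
}

// candidates: Map of peerId -> Set of topics that peer advertises.
// Returns the k peers most likely interested in the same topics as us.
function pickPeers (myTopics, candidates, k = 5) {
  return [...candidates.entries()]
    .map(([peerId, topics]) => [peerId, jaccard(myTopics, topics)])
    .sort((x, y) => y[1] - x[1]) // highest overlap first
    .slice(0, k)
    .map(([peerId]) => peerId)
}

const mine = new Set(['pogs', 'maps'])
const candidates = new Map([
  ['peerA', new Set(['pogs', 'zines'])],
  ['peerB', new Set(['maps', 'pogs'])],
  ['peerC', new Set(['cats'])]
])
console.log(pickPeers(mine, candidates, 2)) // -> [ 'peerB', 'peerA' ]
```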
rangermauve substack: So then you'd connect to them and pull data from those peers through kappa sparse queries or something?
substack rangermauve: you could do that, yes. but these are separate techniques
substack you can do each without the other
substack kappa-view-query was mentioned earlier, but i think kappa-sparse-query was meant?
substack for some applications, the payload for topic-discovery may be small enough to publish bits of topic metadata to the swarm tech
substack like for example, for peermaps each peer could publish a quadtree bitmap to the swarm with a 1 set for each geographical bucket that the peer has some data for
substack with 64 bytes you could partition the earth into 64*8=512 buckets
substack i haven't looked at dat swarm tech in a while but i remember with signal-hub you could publish small metadata payloads
substack looking forward to doing these kinds of experiments soon, after i finish up a few database things and get the ingest working
rangermauve Oooo, quadtree bitmap is a great term. 😍
rangermauve 512 layers of buckets, right?
substack 512 buckets
substack for 64 bytes
rangermauve Like, quadtrees have levels of precision in the tree, right. So by 512 buckets do you mean 512 levels of precision in the quadtree or total buckets? Or are the two the same? Sorry, I'm kinda dumb today. 😅
substack rangermauve: ah yeah i guess i was thinking just 1-level
substack multiple levels could make sense but i'd need to think about it more
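As clarified, this is one level: 512 buckets total, one bit each, in 64 bytes. A sketch of building such a bitmap; the 32x16 longitude/latitude grid is an assumed layout, since any mapping from coordinates to bucket indices 0..511 would do:

```js
const COLS = 32 // longitude divisions
const ROWS = 16 // latitude divisions; 32 * 16 = 512 buckets = 64 bytes * 8 bits

// Map a lat/lon coordinate to a bucket index in 0..511.
function bucketIndex (lat, lon) {
  const col = Math.min(COLS - 1, Math.floor(((lon + 180) / 360) * COLS))
  const row = Math.min(ROWS - 1, Math.floor(((lat + 90) / 180) * ROWS))
  return row * COLS + col
}

// Set the bit for a geographic bucket this peer has data for.
function setBucket (bitmap, lat, lon) {
  const i = bucketIndex(lat, lon)
  bitmap[i >> 3] |= 1 << (i & 7)
}

const bitmap = Buffer.alloc(64) // zero-filled: no buckets held yet
setBucket(bitmap, 52.52, 13.405)  // e.g. data near Berlin
setBucket(bitmap, -33.87, 151.21) // e.g. data near Sydney
// Publish `bitmap` as a small metadata payload when announcing to the swarm.
```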
rangermauve This might be relevant to folks here. Found it in the dat comm comm notes. :P https://blog.mozilla.org/blog/2020/03/30/were-fixing-the-internet-join-us/
rangermauve Basically, Mozilla is offering to pay teams of up to 4 people 2500 each to work on some sort of distributed web project
rangermauve I wanna propose a local-network website discovery thing based on dat-webext fix-the-internet/spring-mvp-lab#36
Diego
@dpaez
cool
Dat
@dat_project_twitter
decentral1se it's been tidal-wave flooded by projects... 600 in one night or something...
decentral1se just shows the lack of funding opportunities once again... too little cash to spread around...
decentral1se (cash gone elsewhere no doubt)