Dat
@dat_project_twitter
rangermauve @geekodour Usually I log the archive.key using archive.ready(() => console.log(archive.key.toString('hex'))) and then copy-paste it to the other application somewhere. (like a CLI argument)
Hrishikesh Barman
@geekodour
@RangerMauve thanks, it helped. :)
hss
@hss_gitlab
i accidentally deleted some files in my dat-directory. how to restore the last version?
Phoenix Fox
@Bluebie
It seems like most of the work on beaker is happening in the blue or blue-hyperdrive10 branches. I am wondering, what is "blue"? is that like, an internal name for a big update? seemed like beaker was getting monthly releases up until about six months ago and now everything's just a bit stalled and very broken on the macOS platform, from the public release side of things
I'd also like to know, i'm trying to get my head around what Dat Archives look like, as a web hosting thing. For example, if I delete a "file" from a Dat Archive, and someone chooses to cohost that archive, are they downloading and mirroring old obsolete data, or just the current version? Like, in a social media app for example, it would be important that when users opt to delete some media they previously shared, that all the other nodes (at least ones that aren't dodgy modified ones) would genuinely try to erase and reduce availability of that deleted content when they sync up. as well as disk space concerns with, for example, an ephemeral blog where content expires after a year or two
Dat
@dat_project_twitter
rangermauve @hss_gitlab did you have your archive backed up anywhere else? By default the CLI doesn't keep the history to save on space IIRC.
rangermauve pfrazee: Ping! ^
pfrazee @Bluebie yeah that's correct, "blue" is the codename for a major next release
rangermauve @Bluebie paul can probably speak to the Beaker related questions. Regarding files and deletion, you can set an archive to be in "latest" mode (default in the CLI) which doesn't download obsolete data or keep history.
rangermauve @Bluebie so if everyone is using latest mode then old data will get forgotten.
Phoenix Fox
@Bluebie
Ok! Is that true for the metadata as well? Like, I have one website I’m experimentally publishing thru dat which has a dataset that is rewritten every day, and I wonder if the metadata log will get longer and longer until one day it becomes slow to load and needs to be reinitialised as a new Dat archive?
Phoenix Fox
@Bluebie
Also question, how does one go about migrating a dat share from one server to another? Where do the private keys live? Are they in the .dat subfolder in the folder containing the shared data on the writer machine?
Dat
@dat_project_twitter
okdistribute yeah they are there.
okdistribute if you transfer i'd recommend using the dat keys command, but only update ever from one place -- once you transfer your keys, the old one should be discarded. otherwise your dat will be broken.
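The transfer okdistribute describes might look like this with the dat CLI's keys subcommand (a sketch; check `dat keys --help` for your version, and remember the warning above: only ever write from one machine):

```shell
# On the old (writer) machine: print the secret key for this dat
cd my-site
dat keys export

# On the new machine: clone the dat, then import the secret key
dat clone dat://<archive-key> my-site
cd my-site
dat keys import
# (paste the secret key when prompted, then stop using the old machine)
```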
Phoenix Fox
@Bluebie
thanks for the tip @okdistribute
Dat
@dat_project_twitter
jboy If I want to just grab a single file from a dat archive rather than the whole thing, what's the best way to do that from the command line?
jboy none of the dat subcommands seem to fit the bill
Dat
@dat_project_twitter
jboy I guess I'm asking what's the equivalent to wget/snarf/curl in the dat world.
Dat
@dat_project_twitter
deka New comm-comm meeting tomorrow: dat-land/comm-comm#112 see you there ;-)
Phoenix Fox
@Bluebie
@jboy one way to do it would be to clone a dat with the --sparse option, then use dat sync --sparse --http, which makes a little web server on port 8080 and you can use curl or whatever to load out any files you need on demand I believe?
@jboy or if you want something nicer, there are some NPM packages for Dat to HTTP gateways, which let you load stuff off dat without caching them to your actual filesystem in a dat folder
or if you want to get real fancy, you can wait a bit and Dat’s going to be getting some nice FUSE compatibility so you can just mount a dat right in to your actual filesystem and lazy load in data as needed through regular filesystem operations
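Putting Bluebie's sparse-clone suggestion together, the workflow might look like this (a sketch; the archive key and file path are placeholders):

```shell
# Clone only the metadata, not all the content (sparse mode)
dat clone --sparse dat://<archive-key> my-archive

# Serve the archive over HTTP on port 8080
cd my-archive
dat sync --sparse --http

# In another terminal: fetch just the file you need, on demand
curl -O http://localhost:8080/path/to/file.txt
```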
Dat
@dat_project_twitter
rangermauve @jboy You can have a .datdownload file to specify which files you wish to load: https://github.com/datproject/dat#selecting-files
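Per the CLI docs rangermauve links, a .datdownload file placed in the archive folder lists which files or folders to fetch when sparse. A minimal sketch (these paths are hypothetical, one entry per line):

```
index.html
photos/
data/latest.json
```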
Dat
@dat_project_twitter
jboy thanks, that should do the trick rangermauve
jboy thanks also Bluebie
Phoenix Fox
@Bluebie
question: internal to one dat archive, is content deduplicated at all? like if i have two files containing identical content but different filenames and paths, will their content be stored twice?
Dat
@dat_project_twitter
rangermauve @Bluebie There's no deduplication at the moment. The tradeoff is there so you can stream data into the archive faster and don't need to go through the original data twice (once to hash and check for dupes, once to upload).
Dat
@dat_project_twitter
okdistribute jhand: can we move docs.datproject.org to docs.dat.foundation & create a redirect?
okdistribute jboy: or docs.datprotocol.com..
okdistribute jhand: or just keep docs.datproject.org for now.. but the link is broken in the new site
okdistribute jhand: rangermauve also whats the difference between 'community-> get started' and 'learn->docs' ?
rangermauve okdistribute: I think "get started" is going to have some content that todrobbins is working on that relates to the MOSS grant. Like a developer onboarding thing
Dat
@dat_project_twitter
okdistribute ok
Phoenix Fox
@Bluebie
No dedup isn't a big deal for me. I'm happy with the tradeoff, just trying to think more about how the data structures work so I can invent good data formats in the future. Coming from the IPFS world, Dat is pretty different (in good ways!)
if my application really needs deduplication I can implement it myself using a folder full of content-hashed filenames in archive :)
Dat
@dat_project_twitter
rangermauve @bluebie You might also be interested in checking out this book about how Dat's internal structures work: https://datprotocol.github.io/how-dat-works/
Dat
@dat_project_twitter
okdistribute rangermauve: do you want to add an example for the sdk in the new cookbook section of docs?
rangermauve okdistribute: sure! I'm just going to take a break for like four to six hours and then I'll get on it
okdistribute no worries or rush on that
okdistribute just thought it'd be a nice add
Dat
@dat_project_twitter
okdistribute i added the kappa workshop here and created the 'cookbook' section https://docs.datproject.org/docs/kappa
Dat
@dat_project_twitter
nettle heck yeah okdistribute!!!
Dat
@dat_project_twitter
decentral1se boring licensing question about the dat modules. A lot of core is MIT licensed
decentral1se I'm working from those while porting to python and wondering if I have to maintain that license
decentral1se so far I am doing that. Curious though if anyone has a lead on what are the options. IANAL :)
Dat
@dat_project_twitter
okdistribute MIT means you can do whatever you want, even relicense
okdistribute (of course it means more things, but ... for sake of simplicity..)
decentral1se aha, right
decentral1se thanks, I better read up on this ;)
Dat
@dat_project_twitter
decentral1se (motivated by reading about the http://commonstransition.org/commons-based-reciprocity-licenses approach)