These are chat archives for anacrolix/torrent

7th
Jan 2018
Matt Joiner
@anacrolix
Jan 07 2018 00:43
you could call File.State, and sum the Bytes for each PieceState that is Completed=true
I should start a FAQ for this
any suggestions about style, and where to run it?
maybe a gist?
let people openly modify it
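A hedged sketch of the trick above, using stand-in types that approximate the File.State / PieceState shapes described here (the real field names in anacrolix/torrent may differ):

```go
package main

import "fmt"

// Stand-ins approximating the shapes discussed above; the real
// anacrolix/torrent types may differ slightly.
type PieceState struct {
	Complete bool
}

type FilePieceState struct {
	PieceState
	Bytes int64 // bytes of the file covered by this piece
}

// completedBytes sums Bytes over the states whose piece is complete,
// giving the number of downloaded bytes for the file.
func completedBytes(states []FilePieceState) (n int64) {
	for _, s := range states {
		if s.Complete {
			n += s.Bytes
		}
	}
	return
}

func main() {
	states := []FilePieceState{
		{PieceState{true}, 1 << 18},
		{PieceState{false}, 1 << 18},
		{PieceState{true}, 1 << 10},
	}
	fmt.Println(completedBytes(states)) // 262144 + 1024 = 263168
}
```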
Matt Joiner
@anacrolix
Jan 07 2018 07:42
there's a bug in the new file reader
just reported downstream
Matt Joiner
@anacrolix
Jan 07 2018 08:52
should be fixed, but i'm writing tests for it
doesn't that work?
Denis
@elgatito
Jan 07 2018 09:56
changing the limit does work, but the burst can't be changed
I was testing different situations and could not make it work properly; maybe it works great when you need to limit speeds in terms of MBs, not KBs
Matt Joiner
@anacrolix
Jan 07 2018 09:57
what was your burst set to?
for limits on the order of KBs
Denis
@elgatito
Jan 07 2018 09:59
I don't remember now. The problem is that with one burst, limiting to KBs works great, but when you set Limit to rate.Inf it does not use all the bandwidth
Matt Joiner
@anacrolix
Jan 07 2018 09:59
i think it defaults to rate.Inf, but you're saying changing it from a finite limit to rate.Inf doesn't work?
it seems like burst should be settable, try submitting an issue to them
Denis
@elgatito
Jan 07 2018 10:01
nope, the burst can't be changed on an existing limiter; you have to re-create the limiter with a different burst, that's by design
I'll take another shot, with more investigation into how the limiters influence downloads in the library
Denis
@elgatito
Jan 07 2018 10:09
I only remember now that the limiter rate.NewLimiter(rate.Inf, 0) was giving a different speed compared to rate.NewLimiter(rate.Inf, 1<<15), which makes no sense when Limit is rate.Inf
Denis
@elgatito
Jan 07 2018 12:16
have just tested my implementation with reuse of file handles, and it's much slower :)
Denis
@elgatito
Jan 07 2018 14:57
error reading torrent "Wayward Pines (S01) LostFilm" piece 628 offset 774558, 0 bytes: %!s(<nil>)
Matt Joiner
@anacrolix
Jan 07 2018 14:58
yeah, pull the next commit
should fix it
i'm doing a big rewrite of the tests to test multi-file torrents that will catch this kind of thing in the future
96261342119b979205bc7c8fd19295ee22a7746f
Denis
@elgatito
Jan 07 2018 15:00
i'm at this one: "Improve comments on Config.{Upload,Download}RateLimiter"
Matt Joiner
@anacrolix
Jan 07 2018 15:00
is it reproducible?
Denis
@elgatito
Jan 07 2018 15:00
memory storage is now broken, trying to understand why. The piece is marked as completed, but when a reader is attached it cannot read the first piece of that file
that is with a multi-file torrent. for a single file it's fine
Matt Joiner
@anacrolix
Jan 07 2018 15:02
damn, reset to 9b718566ba5d19eeac4b68d4f9d5a2351ec02a7d for now
Denis
@elgatito
Jan 07 2018 15:03
that's fine:)
Matt Joiner
@anacrolix
Jan 07 2018 15:03
i reset in my prod stuff, i haven't pushed anything else yet
Denis
@elgatito
Jan 07 2018 15:06
I create readers from File, set the offset and readahead, and the download goes fine; the piece is read fine from the Client side
so it's probably something with the offset on the reader side
Matt Joiner
@anacrolix
Jan 07 2018 15:07
yes
Denis
@elgatito
Jan 07 2018 15:27
huh, i've made a fix :)
Matt Joiner
@anacrolix
Jan 07 2018 15:27
to master of torrent?
Denis
@elgatito
Jan 07 2018 15:28
```go
func (r *reader) available(off, max int64) (ret int64) {
    off += r.offset
    for max > 0 {
```
added the line with off += r.offset
because it was querying the global Torrent with reader-local offset values
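A toy illustration of the bug fixed above (all data hypothetical): a torrent's data is one contiguous byte range, and each file starts at some offset within it. Using a file-local offset where a torrent-global one is expected reads the wrong bytes; adding the file's start offset (the `off += r.offset` fix) corrects it.

```go
package main

import "fmt"

func main() {
	torrent := []byte("AAAABBBBCCCC") // three 4-byte "files" in one torrent
	fileStart := int64(4)             // second file begins at torrent offset 4
	localOff := int64(2)              // byte 2 within that file, i.e. a 'B'

	wrong := torrent[localOff]           // file-local offset used globally
	right := torrent[fileStart+localOff] // converted to a torrent offset

	fmt.Printf("%c %c\n", wrong, right) // A B
}
```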
Matt Joiner
@anacrolix
Jan 07 2018 15:31
well done!
that's right
i missed that one
Denis
@elgatito
Jan 07 2018 15:31
i'm not so familiar with the code, maybe there's another place like that
Matt Joiner
@anacrolix
Jan 07 2018 15:32
if it works i'd say you found it
:P
i added Reader.torrentOffset to track the points of conversion
Denis
@elgatito
Jan 07 2018 15:36
:+1:
Matt Joiner
@anacrolix
Jan 07 2018 15:40
that Reader.available function doesn't look very efficient. but i've never seen it come up on any profile before
it looks like a holdover from when responsive was the default
Denis
@elgatito
Jan 07 2018 15:41
that is a reader, and it's coming once at each data byte, so you'd need concurrent readers to see any load
Matt Joiner
@anacrolix
Jan 07 2018 15:42
nah, more that it's O(len of read), which is normally only ~32KB I think
so it only scans 2 chunks
no point optimizing it tho if it works and isn't slow
if u want to commit that directly, go ahead
Denis
@elgatito
Jan 07 2018 15:45
which one? with offset?
Matt Joiner
@anacrolix
Jan 07 2018 15:45
yep
deranjer
@deranjer
Jan 07 2018 15:48
@anacrolix Thanks, I was doing it without checking piece.Complete, which was always giving me 100% complete
Matt Joiner
@anacrolix
Jan 07 2018 15:49
i feel like that trick should go into the faq
Denis
@elgatito
Jan 07 2018 15:50
@anacrolix, not sure I want to make a commit, since it can break something :D
not sure this will work as intended: `len1 := int64(req.Length) - (off - r.t.requestOffset(req))`
Matt Joiner
@anacrolix
Jan 07 2018 15:51
maybe run the unit tests and stuff to be sure
deranjer
@deranjer
Jan 07 2018 15:51
@anacrolix I'll look into readthedocs, not sure if they allow open editing
Matt Joiner
@anacrolix
Jan 07 2018 15:51
i usually run stuff in prod for a bit if i'm not super confident about it and there are no tests lol
@deranjer do you know if github wiki is ok? https://github.com/anacrolix/torrent/wiki
deranjer
@deranjer
Jan 07 2018 15:52
Yeah that should work fine
Matt Joiner
@anacrolix
Jan 07 2018 15:53
looks like any github user can modify the wiki which is cool
Denis
@elgatito
Jan 07 2018 15:53
```
PASS
ok      github.com/anacrolix/torrent    95.595s
```
does that mean it looks fine?
Matt Joiner
@anacrolix
Jan 07 2018 15:54
try `go test -race github.com/anacrolix/...` in case you have any of my other stuff that depends on it too; that will cover the fs and other modules as well
i think only confluence is public, so `github.com/anacrolix/torrent/...` would be fine
Denis
@elgatito
Jan 07 2018 16:10
my tests fail with network errors
so i've made a PR
Denis
@elgatito
Jan 07 2018 18:38
I have made a test with limiting to KBs; these limiters are fine for changing the limits later if you need to:

```go
DownloadLimiter: rate.NewLimiter(rate.Inf, 2<<16),
UploadLimiter:   rate.NewLimiter(rate.Inf, 2<<16),
```