These are chat archives for rust-lang/rust

1st
Nov 2017
Guillaume Fraux
@Luthaf
Nov 01 2017 09:56
Hi! Does anyone else see weird failures on Travis with the Rust beta builder in the last few weeks?
Like this: https://travis-ci.org/lumol-org/lumol/jobs/294151334#L1786. The compiler gets killed while compiling doc tests.
It sometimes goes away when restarting the build, and I could not reproduce it locally, so I assume this is an issue with Travis somehow ...
Fra ns
@snarf95_twitter
Nov 01 2017 10:59

Is unpacking an Option and returning a reference to the Some(t) variant possible?

pub fn all_pieces<'a>(&'a self) -> impl Iterator<Item = Piece> + 'a {
    self.map.iter().flat_map(|r| r.iter()).filter_map(|&t| t)
}

This is what I'm doing right now but it basically clones the piece upon iteration afaik.

Aleksey Kladov
@matklad
Nov 01 2017 11:01
Hm, this does not clone anything: there are no .clone() or .cloned() calls, and there are no implicit clones in Rust.
Fra ns
@snarf95_twitter
Nov 01 2017 11:05
My piece struct looks like this:
#[derive(Debug, Copy, Clone, PartialEq)]
struct Piece {
    tile: Tile,
    piece_type: PieceType,
    owner: Player,
}
i.e. it implements the Copy trait, so it should clone it "automagically"
Jonas Platte
@jplatte
Nov 01 2017 11:07
Copy doesn't just mean that it clones automatically
But also that the cloning / copying only requires copying the bits that make up the structure
Aleksey Kladov
@matklad
Nov 01 2017 11:08
Copying is cheap (unless Piece is really huge, but then it should probably not be Copy) and it's usually better to avoid references in the API
That said, I think you can do .filter_map(|opt| opt.as_ref())
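For reference, a minimal sketch of the .as_ref() version (the surrounding Board type and map layout are guesses; only the iterator chain is taken from the discussion):
#[derive(Debug, Copy, Clone, PartialEq)]
struct Piece { tile: (i8, i8), piece_type: u8, owner: u8 } // stand-in field types

struct Board { map: Vec<Vec<Option<Piece>>> } // hypothetical layout

impl Board {
    // as_ref turns &Option<Piece> into Option<&Piece>, so the iterator
    // yields references instead of copies.
    pub fn all_pieces<'a>(&'a self) -> impl Iterator<Item = &'a Piece> + 'a {
        self.map.iter().flat_map(|row| row.iter()).filter_map(|opt| opt.as_ref())
    }
}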
Fra ns
@snarf95_twitter
Nov 01 2017 11:10
@matklad .as_ref() did the trick, thanks :-)
a quick ~15% speedup
Aleksey Kladov
@matklad
Nov 01 2017 11:12
wowsers, I wonder if something in your Piece is too big then...
Are Tile and Player just small ints, or are they complex structs themselves?
Fra ns
@snarf95_twitter
Nov 01 2017 11:12
well it's relatively simple, piece_type and owner are just simple enums and tile is (i8,i8)
Aleksey Kladov
@matklad
Nov 01 2017 11:13
Guess I'll need to re-evaluate my guidelines about references to copy types :sweat_smile:
Fra ns
@snarf95_twitter
Nov 01 2017 11:14
hmm maybe... guess only way to tell is by profiling
Aleksey Kladov
@matklad
Nov 01 2017 11:20
@snarf95_twitter out of curiosity, what is ::std::mem::size_of::<Piece>()?
Fra ns
@snarf95_twitter
Nov 01 2017 11:22
4
kinda weird why it would be faster then ...
Aleksey Kladov
@matklad
Nov 01 2017 11:23
Most curious! size_of::<&Piece>() would be 8 (you are on x64, right?), so references are actually larger than Pieces themselves. I have no idea why they turn out to be faster =)
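A quick way to check the sizes being discussed (u32 stands in for the 4-byte Piece; a 64-bit target is assumed):
use std::mem::size_of;

fn main() {
    // A shared reference is one pointer wide, regardless of how small the pointee is.
    assert_eq!(size_of::<u32>(), 4);
    assert_eq!(size_of::<&u32>(), 8);
}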
Fra ns
@snarf95_twitter
Nov 01 2017 11:24
x64 yup
maybe something with codegen?
maybe inaccurate profiler?
think its the profiler tbh, most likely
profiling method
Aleksey Kladov
@matklad
Nov 01 2017 11:27

Hm, are you getting the 15% speedup under a profiler? Any profiler skews results, so benchmarks should be done without one.

I personally use /usr/bin/time as a final check for "do my optimizations actually make the code faster?"

Fra ns
@snarf95_twitter
Nov 01 2017 11:28
hmm I'm running the old faithful windoge... benching using cargo bench
Aleksey Kladov
@matklad
Nov 01 2017 11:31
Ah, so you get 15% improvement in cargo bench, which is not run under profiler? Then it's a legit speedup! :)
Fra ns
@snarf95_twitter
Nov 01 2017 11:34
If I run the bench in a loop then each time it scores worse? What is happening? Thermal downclocking?
My bench methods are using all cores btw
Sergei Pepyakin
@pepyakin
Nov 01 2017 14:45
Hello all! Is there any way to run tests from rust-skeptic?
(I have #[test] fns in my README and I want to test them)
TatriX
@TatriX
Nov 01 2017 15:14
What is the best rust book to read after trpl?
Sean Perry
@shaleh
Nov 01 2017 18:07
If I want to have a macro that makes new identifier names, do I have to use a procedural macro? For instance, I have an XFoo and a YFoo, and I would like a macro that expands Foo into XFoo and YFoo. The naive ($f:ident) and then X$f is not working for me.
Jonas Platte
@jplatte
Nov 01 2017 18:11
@shaleh It is possible outside of proc macros, but only on nightly: https://doc.rust-lang.org/nightly/std/macro.concat_idents.html
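A rough nightly-only sketch of concat_idents! (xfoo is a hypothetical function; note that the macro can only refer to an existing identifier in expression position, it cannot define new items, which limits how far it helps with the XFoo/YFoo case):
#![feature(concat_idents)]

fn xfoo() -> u32 { 1 }

fn main() {
    // Expands to the existing identifier xfoo.
    let f = concat_idents!(x, foo);
    println!("{}", f());
}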
Sean Perry
@shaleh
Nov 01 2017 18:15
@jplatte thanks, let me try that. You can't really use Rust without using nightly so that is not a burden currently.
Robyn Speer
@rspeer
Nov 01 2017 18:15
That's an odd sentiment, I've never used nightly.
Steve Klabnik
@steveklabnik
Nov 01 2017 18:16
the only thing i use nightly for are developer tools, and only for the tools
well there's one project that i have that uses nightly
but it's also because it uses tools
Sean Perry
@shaleh
Nov 01 2017 18:20
It seems that nightly is the only place that has enough pieces for everything to work
Steve Klabnik
@steveklabnik
Nov 01 2017 18:20
like what?
just curious
Denis Lisov
@tanriol
Nov 01 2017 18:21
For me, stable is already a big improvement on C/C++ so I use nightly only for (a) tools, (b) embedded (microcontroller).
Antonin Carette
@k0pernicus
Nov 01 2017 18:23
I am using nightly mainly for tools (bench), and to develop my crates...
getynge
@getynge
Nov 01 2017 18:29
Hello world!
Sean Perry
@shaleh
Nov 01 2017 18:33
@steveklabnik concat_idents is our 3rd feature use. 2nd for macros.
Steve Klabnik
@steveklabnik
Nov 01 2017 18:34
i thought that was stable
Sean Perry
@shaleh
Nov 01 2017 18:34
nope
Steve Klabnik
@steveklabnik
Nov 01 2017 18:34
huh
getynge
@getynge
Nov 01 2017 18:35
I've been out of the loop on this and did want to ask: does anybody know what the stabilization plans for procedural macros are? I remember reading at one point that a revamp was planned, but I checked the issue tracker on the procedural macro RFC and it looks like progress is still being made on the current version of procedural macros
Steve Klabnik
@steveklabnik
Nov 01 2017 18:35
@getynge implementation is still underway
getynge
@getynge
Nov 01 2017 18:36
@steveklabnik ah that's good. Do you think it'd look roughly like what we see on nightly?
Steve Klabnik
@steveklabnik
Nov 01 2017 18:36
i'm not sure what has been implemented yet honestly
so i can't say
getynge
@getynge
Nov 01 2017 18:36
Honestly I'm just praying Rocket can be moved to stable someday
Steve Klabnik
@steveklabnik
Nov 01 2017 18:36
the RFCs should have what it looks like
getynge
@getynge
Nov 01 2017 18:36
fair enough
Sean Perry
@shaleh
Nov 01 2017 18:37
yeah, I would enjoy getting off the nightly train as well
proc macros were our first hard nightly requirement
we also have const_size_of, global_allocator, and now concat_idents
Steve Klabnik
@steveklabnik
Nov 01 2017 18:38
yeah, that explains it then; I don't use macros much at all
getynge
@getynge
Nov 01 2017 18:39
@shaleh although the use of procedural macros is one of the things that makes rocket so nice, so I think it was a good implementation decision
I just have some trepidations using rocket in production because I just can't feel at ease using nightly rust in prod
Sean Perry
@shaleh
Nov 01 2017 18:40
same here. Implementing some of our bits without macros would be icky. We would probably run an external code gen/template.
steveklabnik @steveklabnik nods
Sean Perry
@shaleh
Nov 01 2017 18:40
Now that I know about rust-toolchain I have a little less trepidation. We can lock it down to a "known working" nightly and go forward when needed.
getynge
@getynge
Nov 01 2017 18:41
@shaleh do you work on rocket? I don't see you on the list of contributors but idk how GitHub determines that. Not to be rude or anything, just curious
Sean Perry
@shaleh
Nov 01 2017 18:41
No, I do not. I am in here asking about macros for another project.
getynge
@getynge
Nov 01 2017 18:42
@shaleh ah, makes sense. What are you working on if you don't mind my asking?
Sean Perry
@shaleh
Nov 01 2017 18:42
Emacs ported to Rust :-)
Wilfred/remacs
getynge
@getynge
Nov 01 2017 18:42
holy crap that's a huge job
Sean Perry
@shaleh
Nov 01 2017 18:43
Rust makes it surprisingly easy
getynge
@getynge
Nov 01 2017 18:43
I'm pretty sure by the end of that project you'll have more LOC than redox
Sean Perry
@shaleh
Nov 01 2017 18:43
we can call back and forth between the two without faffing about with FFI
extern "C" and on we go
getynge
@getynge
Nov 01 2017 18:43
@shaleh yeah rust makes a lot of stuff like that a breeze.
Sean Perry
@shaleh
Nov 01 2017 18:43
so we can port it function by function
along the way we report bugs and inconsistencies
since most of what emacs users care about is the Lisp, it is functional quickly
We have a procedural macro which turns a Rust function into one usable for the Lisp engine (still in C)
getynge
@getynge
Nov 01 2017 18:45
@shaleh that's pretty neato
Sean Perry
@shaleh
Nov 01 2017 18:45
this means generating a C struct and exporting it. foo -> Ffoo and Sfoo
getynge
@getynge
Nov 01 2017 18:45
I just now wrapped my head around normal macros
Sean Perry
@shaleh
Nov 01 2017 18:46
yeah, the procedural one is definitely a head scratcher
getynge
@getynge
Nov 01 2017 18:46
I'm hyped for macros 2.0, properly namespaced macros will be great
Sean Perry
@shaleh
Nov 01 2017 18:46
I did not expect -> "S"$thing to be a hard macro expansion. But here I am learning about unstable concat_idents.
getynge
@getynge
Nov 01 2017 18:47
@shaleh I mean I've written compilers before so at least the ideas are not lost on me, but procedural macros basically take a stream of tokens as input and spit out a transformed stream of tokens right?
or do they take an AST?
or am I just being a silly goose and they actually do something totally different
Sean Perry
@shaleh
Nov 01 2017 18:47
in tokens -> out tokens
#[proc_macro_attribute] pub fn lisp_fn(attr_ts: TokenStream, fn_ts: TokenStream) -> TokenStream
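A minimal skeleton of such an attribute macro, as a sketch rather than remacs' actual lisp_fn (the crate needs proc-macro = true in its [lib] section, and 2017-era nightlies also required #![feature(proc_macro)]):
extern crate proc_macro;
use proc_macro::TokenStream;

#[proc_macro_attribute]
pub fn lisp_fn(attr_ts: TokenStream, fn_ts: TokenStream) -> TokenStream {
    // A real implementation would parse fn_ts and emit the extra export glue;
    // this sketch just returns the annotated function unchanged.
    let _ = attr_ts; // attribute arguments, unused here
    fn_ts
}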
getynge
@getynge
Nov 01 2017 18:48
is there something I could do to help? I haven't been contributing nearly enough in the time I've had a GitHub so I've been making it a goal of mine to make at least one code contribution to some project a day
Sean Perry
@shaleh
Nov 01 2017 18:49
Always
issues are marked for newbies
you need rustfmt-nightly to pass layout
We are using 2017-10-22 compiler suite currently
getynge
@getynge
Nov 01 2017 18:53
ah okay
Sean Perry
@shaleh
Nov 01 2017 18:53
rustup makes this easy
you can specify a version for a project
getynge
@getynge
Nov 01 2017 18:54
yeah I have several toolchains installed. Though I actually didn't know about the per-project thing
can you specify to rustup to change toolchains when in specific directories or something?
Sean Perry
@shaleh
Nov 01 2017 18:54
yes
Steve Klabnik
@steveklabnik
Nov 01 2017 18:54
rustup override add NAME
Sean Perry
@shaleh
Nov 01 2017 18:54
thanks @steveklabnik, I was about to look up the exact syntax
Steve Klabnik
@steveklabnik
Nov 01 2017 18:54
:)
Sean Perry
@shaleh
Nov 01 2017 18:56
so in this case rustup install nightly-2017-10-22 && rustup override add nightly-2017-10-22
rustup show in the project versus elsewhere to double check
getynge
@getynge
Nov 01 2017 18:58
Also one thing I want to build up to, rust experience wise, is to be able to contribute to rust targeting mobile devices
Antonin Carette
@k0pernicus
Nov 01 2017 18:59
@getynge If you just want to use a tool (like bench) and you don’t want to change your current toolchain, you can just execute the tool from the toolchain with rustup run nightly <my_command>
Like rustup run nightly cargo bench
getynge
@getynge
Nov 01 2017 18:59
@k0pernicus thanks, that's actually really helpful
which reminds me I need to install clippy
I think being able to use rust to develop for mobile devices would be a huge boon to the community, but idk
Android has the NDK, but there are no real safe bindings to the NDK in rust, there are generated bindings but cargo does not currently support building to APKs
Steve Klabnik
@steveklabnik
Nov 01 2017 19:01
you can also do
cargo +nightly bench
getynge
@getynge
Nov 01 2017 19:01
although there are tools to package to APKs, the situation is less than ideal
Steve Klabnik
@steveklabnik
Nov 01 2017 19:01
and it's the same as the rustup run invocation
Antonin Carette
@k0pernicus
Nov 01 2017 19:04
Thanks @steveklabnik, I didn't know :-)
Steve Klabnik
@steveklabnik
Nov 01 2017 19:10
:)
Antonin Carette
@k0pernicus
Nov 01 2017 19:48
Anybody here made some experiments with calculation at compile time?
I know the Rust compiler can automatically do optimizations in the code by itself
Evaluation of static primitives at compile time also...
Steve Klabnik
@steveklabnik
Nov 01 2017 19:49
so
rust-lang/rust#24111
is the big tracking issue
Antonin Carette
@k0pernicus
Nov 01 2017 19:51
Oh cool, thanks a lot!
Steve Klabnik
@steveklabnik
Nov 01 2017 19:53
no problem
once we get that going, tons of compile-time shenanigans are going to be possible :)
Antonin Carette
@k0pernicus
Nov 01 2017 19:53
:ok_hand:
Steve Klabnik
@steveklabnik
Nov 01 2017 19:54
it's going to take a while until all of this is stable though; there's some big PRs that are open right now with the foundations
Antonin Carette
@k0pernicus
Nov 01 2017 19:54
I understand
Steve Klabnik
@steveklabnik
Nov 01 2017 19:54
basically, rust-lang/rust#43628 embeds a rust interpreter into the rust compiler
so you can use that to do all sorts of fun stuff
Antonin Carette
@k0pernicus
Nov 01 2017 19:57
I'll take a look at it :-)
Steve Klabnik
@steveklabnik
Nov 01 2017 19:57
oh wait that's the wrong one
rust-lang/rust#45002
Antonin Carette
@k0pernicus
Nov 01 2017 20:15
:+1:
Antonin Carette
@k0pernicus
Nov 01 2017 22:01
There is something I don’t understand very well…
I am playing with size_of and size_of_val with vectors and fixed size arrays
The doc for Vec says "If capacity is 0, the vector will not allocate."
So, it means that the vector actually exists, right? The size of this vector will be != 0
I wrote this simple code to validate my hypothesis:
fn size_of_test() {
    use std::mem;
    let v0: Vec<u8> = Vec::with_capacity(0);
    let v128: Vec<u8> = Vec::with_capacity(128);
    let w: Vec<u8> = (0..128).collect();
    let x = [0u8; 128];
    println!("Size of <usize> type is {}", mem::size_of::<usize>());
    println!("Size of empty vec (capacity of 0) is {}", mem::size_of_val(&v0));
    println!("Size of empty vec (capacity of 128) is {}", mem::size_of_val(&v128));
    println!("Size of initialized vec is {}", mem::size_of_val(&w));
    println!("Size of initialized array is {}", mem::size_of_val(&x));
}
The result is:
Size of <usize> type is 8
Size of empty vec (capacity of 0) is 24
Size of empty vec (capacity of 128) is 24
Size of initialized vec is 24
Size of initialized array is 128
It seems my hypothesis is correct.
So, the full size of v0 is 24 (3 * size_of<usize>), the full size of v128 is 128 x 24, the full size of w is 128 x 24 and the full size of x is 128, right?
Jonas Platte
@jplatte
Nov 01 2017 22:06
@k0pernicus You are querying the size of Vec<_> here. IIRC a Vec is made up of a pointer to the actual memory, the capacity and the actual length
So one pointer field and two usize fields = 3 * 8 bytes
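That layout is easy to check (a sketch, assuming a 64-bit target):
use std::mem::size_of;

fn main() {
    // pointer + capacity + length
    assert_eq!(size_of::<Vec<u8>>(), 3 * size_of::<usize>()); // 24 on x86_64
}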
Antonin Carette
@k0pernicus
Nov 01 2017 22:07
@jplatte Yes, so 3 * size_of<usize>
Jonas Platte
@jplatte
Nov 01 2017 22:08
You probably want to explicitly convert them to slices: https://play.rust-lang.org/?gist=795a47196c44995dab310b4fc9f06a1b&version=stable
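Roughly what the slice version looks like (a guess at the linked playground's contents, but the calls are standard):
use std::mem;

fn main() {
    let w: Vec<u8> = (0..128).collect();
    println!("{}", mem::size_of_val(&w));     // the Vec handle itself: 24 on x86_64
    println!("{}", mem::size_of_val(&w[..])); // the initialized elements: 128
}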
Antonin Carette
@k0pernicus
Nov 01 2017 22:08
@jplatte So « the vector will not allocate » is different than « the vector will not be instantiated », right?
Jonas Platte
@jplatte
Nov 01 2017 22:09
uhh.. Don't really understand the question
Antonin Carette
@k0pernicus
Nov 01 2017 22:12
Sorry, it’s ok with your example :-)
Thanks a lot!
Jonas Platte
@jplatte
Nov 01 2017 22:13
Does it...? What do you need this for, anyway?
Antonin Carette
@k0pernicus
Nov 01 2017 22:14
Just to experiment
Jonas Platte
@jplatte
Nov 01 2017 22:16
Okay. I hope I didn't cause any misconceptions ^^°
Antonin Carette
@k0pernicus
Nov 01 2017 22:17
No no, it’s okay
Actually, I was wondering what is the full size, in memory, of a vector in comparison with a fixed-size array
And also how a non-fixed-capacity vector grows: 2 > 4 > 8 > 16 > 32 > 64 > 128 > …?
I will crawl the web/official doc for that :-)
Jonas Platte
@jplatte
Nov 01 2017 22:21
Oh, you can easily query the vector's capacity I think
Antonin Carette
@k0pernicus
Nov 01 2017 22:21
Yep
Jonas Platte
@jplatte
Nov 01 2017 22:22
The slice is just the initialized memory, not the capacity / reserved memory
the amount of space a Vec takes up is 24 bytes on the stack (with the current implementation) + capacity bytes on the heap
^ easiest way of checking how a Vec grows
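A small sketch of that check (the exact growth sequence is an implementation detail; doubling is what you typically observe):
fn main() {
    let mut v: Vec<u8> = Vec::new();
    let mut last_cap = v.capacity();
    for i in 0..200u8 {
        v.push(i);
        if v.capacity() != last_cap {
            last_cap = v.capacity();
            println!("len = {:3}, capacity = {}", v.len(), last_cap);
        }
    }
}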
Antonin Carette
@k0pernicus
Nov 01 2017 22:25
Thanks, I wrote this code too ^^
it’s like C++ vectors actually
Jonas Platte
@jplatte
Nov 01 2017 22:26
I don't think the C++ standard specifies it actually
But yeah, it makes sense for most implementations to do something like this
The initial capacity / capacity after adding the first element might be different
Antonin Carette
@k0pernicus
Nov 01 2017 22:28
Yep
Antonin Carette
@k0pernicus
Nov 01 2017 22:33
So, the total allocated size of a vector is equal to: (capacity * sizeof(T) + sizeof(usize) * 2)
Jonas Platte
@jplatte
Nov 01 2017 22:34
Not two times usize, three times usize
pointer to the heap data, capacity, length
Antonin Carette
@k0pernicus
Nov 01 2017 22:35
Oh ok, so (capacity * sizeof(T) + sizeof(usize) * 3)
Jonas Platte
@jplatte
Nov 01 2017 22:35
yup
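Putting the formula together (a sketch; the handle is the 3-usize part on the stack, the buffer is capacity * sizeof(T) on the heap):
use std::mem::size_of;

fn main() {
    let v: Vec<u64> = Vec::with_capacity(10);
    let handle = size_of::<Vec<u64>>();           // 3 * size_of::<usize>() = 24
    let buffer = v.capacity() * size_of::<u64>(); // 80 here if capacity is exactly 10
    println!("total = {} bytes", handle + buffer);
}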
Antonin Carette
@k0pernicus
Nov 01 2017 22:35
Makes sense
Jonas Platte
@jplatte
Nov 01 2017 22:35
although I don't know how much sense it makes to add up heap and stack size
seeing as the stack is almost always much more limited
Thinking about it, heap and stack size maybe isn't even a good distinction
since you can obviously also have a Vec<Vec<_>>, where the 24 bytes of each inner Vec are on the heap too
Antonin Carette
@k0pernicus
Nov 01 2017 22:41
That’s right