Péter Szilágyi
@karalabe
we can see this on rinkeby a lot when we reach 1500mgas/sec block processing :D
Peter (bitfly)
@peterbitfly
ok, I need to check how many such txs there are and will try to exclude them from the calculation
Martin Holst Swende
@holiman
Was just going to say that -- when an ICO is stopped, all txs may be aborted early, wasting a lot of gas that isn't really proportional to the delay it causes the miner
5chdn
@5chdn
Is the Youtube timestamp incorrect? It says 2 more hours, but I was under the impression it should start in 15 minutes?
(The dev call)
Hudson Jameson
@Souptacular
Oh it is!
Let me fix
chriseth
@chriseth
did someone switch from winter to summer time on the weekend?
Eth-Gitter-Bridge
@Eth-Gitter-Bridge
[Hudson Jameson] Did I make a mistake re: timezone?
[Hudson Jameson] Oh got it. On the youtube yes.
Hudson Jameson
@Souptacular
@/all Meeting now
Daniel Ellison
@zigguratt
Sorry I can't be there this time, @Souptacular. I'm in an airport in Lisbon with crummy connectivity.
Hudson Jameson
@Souptacular
No problem @zigguratt !
Have safe travels
Daniel Ellison
@zigguratt
Thanks!
Hudson Jameson
@Souptacular
ethereum/pm#31
Alex Van de Sande
@alexvandesande
Hey all! I’ve created an EIP that I would love your comments on:
ethereum/EIPs#877
It’s proposing a simpler way to get contracts to pay their own gas, or token transactions to be paid in tokens themselves
Peter (bitfly)
@peterbitfly
I have now adapted my analysis of the uncle rate vs. tx count & gas usage to exclude all failed transactions, as suggested by @karalabe and @holiman. The results are available at https://imgur.com/a/jMYYi . There is still a good correlation between the uncle rate and the total number of txs processed, but even after excluding all failed transactions there is no visible correlation between the amount of gas processed and the uncle rate.
Failed txs were identified using the Parity tracing API
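Flagging failed transactions from Parity trace output could be sketched roughly as below. This is not Peter's actual tooling; a trace entry from Parity's `trace_block` carries an `"error"` field when execution failed, and the sample data here is invented:

```python
# Hedged sketch: collect failed transactions from Parity trace_block output.
# A trace entry has an "error" field (e.g. "Reverted", "Out of gas") when
# execution failed; the exact pipeline used for the analysis is not shown.

def failed_tx_hashes(traces):
    """Return the set of tx hashes whose trace reports an error."""
    failed = set()
    for t in traces:
        if t.get("error"):
            failed.add(t["transactionHash"])
    return failed

# Toy sample shaped like a trace_block response (values invented):
sample = [
    {"transactionHash": "0xaa", "error": "Reverted"},
    {"transactionHash": "0xbb"},
    {"transactionHash": "0xcc", "error": "Out of gas"},
]
print(sorted(failed_tx_hashes(sample)))  # ['0xaa', '0xcc']
```

The resulting set can then be used to drop the failed txs from per-block gas totals before correlating with the uncle rate.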
ledgerwatch
@AlexeyAkhunov
@ppratscher Great analysis, I like it. It might suggest that each transaction has a fixed cost that translates into the uncle rate, regardless of what it does. As a result, the gas cost of making a transaction (currently 21000 gas) might be underpriced relative to other operations
Peter (bitfly)
@peterbitfly
Thanks! Yes, or that block propagation time, not block processing time, is in fact the main driver of the high uncle rate.
ledgerwatch
@AlexeyAkhunov
Conceivably, this could be learned in a simulation (similar to Hive) where artificial latency is added between the communicating nodes
Martin Holst Swende
@holiman
Well, obviously latency will impact the uncle rate; the question is to what degree block processing adds to the 'natural' latency caused by e.g. block size, right?
During attacks, when block processing reached 5s levels, the uncle rate increased.
But maybe it's negligible while in the subsecond range
Peter (bitfly)
@peterbitfly
It looks like that. This would mean that a dedicated block relay network for miners would help decrease the uncle rate
ledgerwatch
@AlexeyAkhunov
This paper (you have probably seen it): http://hackingdistributed.com/2018/01/15/decentralization-bitcoin-ethereum/ estimates Ethereum's median peer-to-peer latency as 152 ms (table in chapter 4.2, Network Structure)
Eth-Gitter-Bridge
@Eth-Gitter-Bridge
[Vitalik Buterin] I'd like to see a multidimensional linear regression getting coefficients for uncle rate from (i) gas, (ii) tx count, (iii) block size
[Vitalik Buterin] it feels to me like the contribution from tx count is quite high but that seems.... so weird
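The regression Vitalik describes could be sketched as an ordinary least-squares fit of uncle rate against (gas, tx count, block size). The data below is synthetic, constructed so that tx count dominates; real inputs would come from chain data:

```python
# Sketch of a multidimensional linear regression: uncle rate vs.
# (i) gas used, (ii) tx count, (iii) block size. All values synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 500
gas = rng.uniform(1e6, 8e6, n)          # gas used per block
txs = rng.uniform(0, 300, n)            # tx count per block
size = rng.uniform(1e3, 3e4, n)         # block size in bytes

# Synthetic ground truth: uncle rate driven mostly by tx count.
uncle_rate = 0.01 + 2e-4 * txs + rng.normal(0, 0.005, n)

# Design matrix with an intercept column; solve by least squares.
X = np.column_stack([np.ones(n), gas, txs, size])
coef, *_ = np.linalg.lstsq(X, uncle_rate, rcond=None)
print(dict(zip(["intercept", "gas", "tx_count", "block_size"], coef)))
```

On this synthetic data the fitted tx-count coefficient comes back near 2e-4 while the gas and size coefficients stay near zero, which is the shape of result the chat is speculating about for the real chain.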
Martin Holst Swende
@holiman

One thing I've been pondering... Has anyone considered doing a multidimensional benchmark? Essentially the following:

  • Record block processing times for N blocks (N might be e.g. 1000)
  • For each of those blocks, create a distribution of the op-count for each opcode.
  • Correlate/estimate the times for each opcode.

In theory, I think this could be written as a large polynomial, but in practice, since opcode processing varies, the method needs to be based on estimation to find the best match.
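The estimation Martin sketches can be phrased as a least-squares problem: stack per-block opcode counts into a matrix, treat measured block processing time as the observation, and solve for per-opcode times. The counts and costs below are purely illustrative:

```python
# Hedged sketch of Martin's multidimensional benchmark: recover
# per-opcode processing costs from per-block opcode counts and
# total block processing times. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_blocks, n_opcodes = 1000, 5
true_cost_ns = np.array([50.0, 400.0, 120.0, 2000.0, 80.0])  # invented costs

# counts[i, j] = how many times opcode j ran in block i
counts = rng.integers(0, 10_000, size=(n_blocks, n_opcodes)).astype(float)

# Observed block time = counts @ costs plus measurement noise,
# standing in for the "opcode processing varies" caveat.
times = counts @ true_cost_ns + rng.normal(0, 1e4, n_blocks)

# Best-match estimate of per-opcode cost by least squares.
est_cost, *_ = np.linalg.lstsq(counts, times, rcond=None)
print(np.round(est_cost, 1))
```

With N = 1000 blocks the noisy estimates land close to the true costs; on real data the fit quality would indicate how well a linear per-opcode model explains processing time at all.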

Peter (bitfly)
@peterbitfly
I have added the correlation with the block size; it looks kind of similar to the one with the tx count: https://imgur.com/a/HlJdP
Eth-Gitter-Bridge
@Eth-Gitter-Bridge
[Vitalik Buterin] can someone construct the data for this?
[Vitalik Buterin] would be good to construct once
[Vitalik Buterin] so we can then use it for analysis
Martin Holst Swende
@holiman
@ppratscher I wonder how you measure 'uncle-rate' here.. Does it mean "the chance that this block will become non-canonical", or does it measure the "uncle-rate" in a more general manner?
Eth-Gitter-Bridge
@Eth-Gitter-Bridge
[Vitalik Buterin] so if it's a regression analysis the y value would be "1 if this block is an uncle and 0 if it's not"
[Vitalik Buterin] though there are two ways to do it
[Vitalik Buterin] one way is that
[Vitalik Buterin] the other way is to look at periods of, say, 100 blocks
[Vitalik Buterin] if you're going to do correlation analyses then only the latter really makes sense
[Vitalik Buterin] because the correlations with "1 if this block is an uncle and 0 if it's not" are going to be low
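The second approach Vitalik describes, averaging the per-block 0/1 uncle flags over fixed windows so correlation analysis has a continuous y value, could be sketched like this (window size and toy data are illustrative):

```python
# Sketch: turn per-block uncle flags (1 if uncle, 0 if not) into
# uncle rates over consecutive fixed-size windows of blocks.

def windowed_rates(flags, window=100):
    """Average 0/1 uncle flags over consecutive non-overlapping windows."""
    return [
        sum(flags[i:i + window]) / window
        for i in range(0, len(flags) - window + 1, window)
    ]

# Toy data: 200 blocks, 5 uncles in the first window, 10 in the second.
flags = [0] * 95 + [1] * 5 + [0] * 90 + [1] * 10
print(windowed_rates(flags))  # [0.05, 0.1]
```

The per-window rates can then be correlated against per-window averages of gas, tx count, and block size, which is where Vitalik notes the correlations become meaningful.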
Martin Holst Swende
@holiman
@vbuterin when you said "can someone construct the data for this?" -- which 'this' did you mean?
Eth-Gitter-Bridge
@Eth-Gitter-Bridge
[Vitalik Buterin] for each block (including uncle), get: total gas used, gas used not including burned from reverts, tx count, block size, number of times each opcode is used, 1 if uncle else 0
[Vitalik Buterin] just make a table out of that in some nice format (eg. CSV or JSON)
[Vitalik Buterin] so we can do whatever analysis on it
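The table Vitalik asks for could be serialized as CSV along these lines. The rows below are invented placeholders; real values would come from a node's RPC and trace APIs, and the opcode columns shown (`op_ADD`, `op_SSTORE`) stand in for one column per opcode:

```python
# Sketch of the per-block dataset as CSV: gas used, gas excluding
# reverted txs, tx count, block size, opcode counts, uncle flag.
# All row values are hypothetical sample data.
import csv
import io

fields = ["block", "gas_used", "gas_used_ex_reverts", "tx_count",
          "block_size", "op_ADD", "op_SSTORE", "is_uncle"]

rows = [
    {"block": 5000000, "gas_used": 7900000, "gas_used_ex_reverts": 7500000,
     "tx_count": 120, "block_size": 21000, "op_ADD": 50000,
     "op_SSTORE": 900, "is_uncle": 0},
    {"block": 5000001, "gas_used": 7800000, "gas_used_ex_reverts": 7800000,
     "tx_count": 95, "block_size": 18500, "op_ADD": 42000,
     "op_SSTORE": 750, "is_uncle": 1},
]

buf = io.StringIO()  # a file path would be used in practice
writer = csv.DictWriter(buf, fieldnames=fields)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

One flat row per block (uncles included) keeps the format trivial to load into any analysis tool afterwards.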
Martin Holst Swende
@holiman
@ppratscher -- do you already have that data, or can you extract it from whatever tooling you've used already? And @vbuterin, what can "number of times each opcode is used" be used for without block processing time too?
Eth-Gitter-Bridge
@Eth-Gitter-Bridge
[Vitalik Buterin] ah, we could use block processing time