exsdaemon
@exsdaemon
I found it! The problem was the lack of ETH for transferring the token (calling the smart contract method)
xuwwww678
@xuwwww678
2020-03-10 15:05:54 Syncing #2393374 0x6e65…e844 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 9/25 peers 245 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:05:59 Syncing #2393374 0x6e65…e844 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 9/25 peers 245 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:04 Syncing #2393374 0x6e65…e844 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 9/25 peers 245 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:09 Syncing #2393383 0x5757…68df 1.80 blk/s 5.6 tx/s 0.4 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:14 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:19 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:24 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:29 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:34 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:39 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:44 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:49 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:54 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 7/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:06:59 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 7/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:07:04 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 7/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:07:09 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:07:14 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:07:19 Syncing #2393383 0x5757…68df 0.00 blk/s 0.0 tx/s 0.0 Mgas/s 0+ 5100 Qed #2398486 8/25 peers 254 KiB chain 67 MiB db 42 MiB queue 1 MiB sync RPC: 0 conn, 0 req/s, 0 µs
2020-03-10 15:07:24 Syncing #2393383 0x5757…6
It gets stuck like this every time, and it's been going on for half a month
Sascha Goebel
@saschagoebel
Hey folks, I know this is not the right chat for this question, but there are a lot of bright people with much insight in here, so, does anyone know if the Classic Mordor Network is dead? It stopped syncing at Block 943610 and others seem to have the same problem. I was not able to find any information on it online :-o
Kyle From Ohio
@KyleFromOhio
QUESTION: trying to sync a minimal full node, not using “warp”. It took 3 days to get to 5M blocks and now it's only going at <1 block a second w/ 30+ peers. At this pace it's going to take months to sync to 9M+ blocks. If I turn on “warp” now to get it to sync in a few days and make it usable, can I turn off warp after it's “warp synced” to let it catch up on the rest of the 4M+ blocks in the background?
parity_settings2.png
^ we are shooting for level5.
Sascha Goebel
@saschagoebel
Answer to myself: Apparently it's broken right now, but in the process of being fixed.
Muhammad Taimoor
@Taimoor10
which EVM version is Parity configured for by default?
Joaquin Diaz
@joadr
is anybody else having problems with the Ethereum network?
our transactions have been waiting for confirmation for hours
at 20 Gwei S:
joshua-mir
@joshua-mir
@joadr safe minimum is currently like, 100gwei
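you can also just ask your node what it would suggest right now, e.g. (assuming the default RPC on localhost:8545):

curl --data '{"method":"eth_gasPrice","params":[],"id":1,"jsonrpc":"2.0"}' \
  -H "Content-Type: application/json" -X POST localhost:8545
# returns the node's suggested gas price in wei, hex-encoded
# (e.g. "0x174876e800" would be 100 gwei)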
Joaquin Diaz
@joadr
wowowowowow
Kyle From Ohio
@KyleFromOhio
last 2 i had go out in minutes were… gas_price: 453,000,000,000, gas: 22000
$1.30 fee. was like 10 cents yesterday.
Joaquin Diaz
@joadr
Yeah, you put 453 Gwei
Kyle From Ohio
@KyleFromOhio
correct.
QUESTION: can you switch back and forth between “warp” and “no warp”? we want to sync fast but then move back to no-warp settings for a minimal full node.
joshua-mir
@joshua-mir
when you leave warp naturally, you drop into a normal sync
Kyle From Ohio
@KyleFromOhio
so in the long run warp and no-warp are the same?
joshua-mir
@joshua-mir
correct. after you drop out of warp, you begin downloading "ancient blocks" in the background, and after that is complete your node is indistinguishable from one that wasn't warp-synced. a common misconception is that warp sync is progressive, but you are literally downloading a snapshot of the chain state at a given block height and then continuing a normal sync from there
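in CLI terms it's roughly this (illustrative; the barrier value is just an example):

# warp: restore a recent snapshot, sync forward from it, and backfill
# "ancient blocks" in the background once the head is reached
parity --warp-barrier 9000000

# no-warp: verify every block from genesis
parity --no-warp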
Kyle From Ohio
@KyleFromOhio
sweet. thanks.
petehall73
@petehall73

@joshua-mir I hope you might be able to shed some light on something for me. A transaction is submitted to a PoA chain but is never mined into a block. I have trace logging enabled and see the following:
2020-03-12 16:00:50 UTC http.worker00 TRACE own_tx Importing transaction: PendingTransaction { transaction: SignedTransaction { transaction: UnverifiedTransaction { unsigned: Transaction { nonce: 242853, gas_price: 0, gas: 707296, action: Call(0x09df18f41b2919a971a931271125380e19bc8470), value: 0, data: [43, 86, 89, 207, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 131, 212, 104, 90, 54, 117, 245, 163, 17, 137, 164, 235, 145, 22, 213, 213, 93, 181, 32, 98, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 11, 246, 75, 165, 105, 172, 202, 191, 159, 173, 169, 245, 183, 131, 16, 164, 59, 130, 169, 190, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 128, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 96, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 164, 247, 211, 209, 104, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 131, 212, 104, 90, 54, 117, 245, 163, 17, 137, 164, 235, 145, 22, 213, 213, 93, 181, 32, 98, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 8, 215, 198, 158, 136, 116, 179, 76, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 124, 230, 108, 80, 226, 132, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 88, 138, 80, 50, 130, 254, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 94, 106, 134, 224, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 65, 105, 236, 238, 246, 209, 71, 82, 73, 235, 32, 19, 180, 54, 19, 16, 52, 217, 204, 214, 107, 149, 171, 244, 122, 203, 36, 47, 9, 42, 239, 39, 192, 31, 47, 232, 223, 13, 67, 240, 10, 219, 29, 58, 72, 240, 0, 128, 7, 106, 176, 189, 136, 7, 199, 169, 34, 79, 33, 25, 36, 27, 205, 234, 87, 27, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] }, v: 11425, r: 65241909667955742846573997075134853704950105840118719332093726271303452572585, s: 24953106305542008821292879060670806566449831376663929157906445963509097927181, hash: 0xaf21fbbef89c0debb121dda52aa72512e653f52f4366b36684f3340b9355e24d }, sender: 0x080e98a5a1f2b9d52c64174bacd26d0583502a9c, public: Some(0xe1b93c303a33b5f0287ef0799f27d88b274dc4a58bc35b4e98d2bbd95ff15372b806bcdad604e38a68ff4798c060b0c19618d7cf5e0191f93a0073cc8fe9c282) }, condition: None }
2020-03-12 16:00:50 UTC http.worker00 DEBUG own_tx Imported to the pool (hash 0xaf21fbbef89c0debb121dda52aa72512e653f52f4366b36684f3340b9355e24d)
2020-03-12 16:00:50 UTC http.worker00 DEBUG txqueue [0xaf21fbbef89c0debb121dda52aa72512e653f52f4366b36684f3340b9355e24d] Added to the pool.
2020-03-12 16:00:50 UTC http.worker00 DEBUG txqueue [0xaf21fbbef89c0debb121dda52aa72512e653f52f4366b36684f3340b9355e24d] Sender: 0x080e…2a9c, nonce: 242853, gasPrice: 0, gas: 707296, value: 0, dataLen: 484 ))

I then never see that tx hash again in the logs. What might be the possible reasons for a tx being quietly dropped? I only see 'PendingTransaction' occasionally; some transactions are mined, others are not.
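Next thing I plan to try is asking the node directly what happened to it, something like this (assuming the parity RPC namespace is enabled on the node):

curl --data '{"method":"parity_localTransactions","params":[],"id":1,"jsonrpc":"2.0"}' \
  -H "Content-Type: application/json" -X POST localhost:8545
# parity_localTransactions reports a status per locally submitted tx hash
# (e.g. pending, mined, dropped, rejected), which should at least narrow
# down where it disappeared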

Kyle From Ohio
@KyleFromOhio
i switched from no-warp to warp and it gets to the end of the snapshot, then just crashes and starts all over again. endless loop...
2020-03-13 15:52:00 UTC IO Worker #0 INFO import Syncing snapshot 1397/3593 #5427283 32/50 peers

back to no-warp and now it at least is syncing again, but painfully slowly…

2020-03-13 15:58:07 UTC IO Worker #2 INFO import Syncing #5427356 0x7ea6…4a01 0.80 blk/s 106.0 tx/s 4.9 Mgas/s 0+ 789 Qed #5428157 36/50 peers 3 MiB chain 80 MiB db 84 MiB queue 12 MiB sync RPC: 0 conn, 0 req/s, 0 µs

Yury Sobolev
@ysobolev

Hi, I recently set up parity v2.7.2-stable on a DigitalOcean droplet (shared instance) with 6 CPUs and 16GB RAM. I am running it in Docker. After it finishes syncing, the CPU and memory usage skyrocket. Here is my configuration:

version: "3.3"
services:
  parity:
    image: "parity/parity:stable"
    command:
      - "--jsonrpc-interface=all"
      - "--jsonrpc-apis=all"
      - "--jsonrpc-threads=2"
      - "--base-path=/data"
    ports:
      - "30303:30303"
      - "30303:30303/udp"
    volumes:
      - "./blockchain_data:/data:rw"

Eventually it gets OOM killed with
[126912.424952] Out of memory: Kill process 18671 (parity) score 970 or sacrifice child
[126912.426895] Killed process 18671 (parity) total-vm:28361984kB, anon-rss:15893636kB, file-rss:0kB, shmem-rss:0kB
Any ideas?
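In the meantime I will probably just cap it, roughly like this (a guess that the internal caches are what balloon; the numbers are arbitrary):

# hard-cap the container and bound parity's caches instead of letting
# them grow with available RAM (--cache-size is in MB)
docker run --memory=8g \
  -v "$PWD/blockchain_data:/data:rw" \
  -p 30303:30303 -p 30303:30303/udp \
  parity/parity:stable \
  --base-path=/data --cache-size=1024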

bubbleboy14
@bubbleboy14
hello. does anyone know why a parity node would time out on incoming requests from truffle and web3?
jacekv
@jacekv

Hey guys, I am currently trying to run a dockerized Parity container, where I used volumes to put a key file into the container. But parity gives me the following error message:

2020-03-19 08:57:15 UTC Error reading key file: Os { code: 21, kind: Other, message: "Is a directory" }
2020-03-19 08:57:15 UTC Error creating key file: Os { code: 21, kind: Other, message: "Is a directory" }

I checked the key file but it is not a directory.

-rw-r--r--. 1 root root 520 Mar 19 08:32 UTC--2020-03-19T08-32-00Z--96870c50-c003-ed12-eeec-527e2d95f369

Any idea what else to look for?
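One thing I still need to rule out: apparently if the source path given to -v does not exist on the host (typo, wrong working directory), docker silently creates it as a directory, and the container then sees a directory at that path, which would produce exactly this error. Quick check (the container-side path below is a placeholder, adjust to your mount):

# on the host: confirm the exact path handed to -v is a regular file
ls -ld ./keys/UTC--2020-03-19T08-32-00Z--96870c50-c003-ed12-eeec-527e2d95f369

# inside the running container: see what parity actually sees
docker exec <container-name> ls -ld /path/given/on/the/container/side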

Miguel Cabeza
@miguel567
when I start a new node and it starts syncing the snapshot, is it actually storing something in the DB? I check the directory size and it doesn't change at all
Liting
@DongLiting_92_twitter
hey guys, when using Parity for data synchronization I have been stuck on this block for more than a day, but I don't know what the problem is. I adjusted all possible parameters with reference to the wiki
image.png
--base-path /parity --mode=active --warp-barrier=8950000 --num-verifiers=8 --db-compaction=ssd --cache-size=28000 --scale-verifiers --no-color --no-ipc --no-ws --min-peers=100 --max-peers=150 --no-ancient-blocks --pruning "fast"
Miguel Cabeza
@miguel567
@DongLiting_92_twitter I'm not on that block yet. I'm syncing mainnet, but I'm behind you, I'll let you know if I hit that mark and if I have the same problem

QUESTION: I'm running 2 parity clients from their Docker image. 1 client Kovan, 1 client Mainnet.
I try to call the below endpoint on any of the clients and I get the below error:

curl --trace-ascii dump.txt --data '{"method":"web3_clientVersion","params":[],"id":1,"jsonrpc":"2.0"}' -H "Content-Type: application/json" -X POST localhost:8555

Note: Unnecessary use of -X or --request, POST is already inferred.
curl: (56) Recv failure: Connection reset by peer
Any ideas? I have set different ports.
docker run -ti -v /home/miguel/kovan:/home/parity/.local/share/io.parity.ethereum/ --name=kovan -p 8555:8555 -p 8556:8556 -p 30313:30313 -p 30313:30313/udp parity/parity --config=/home/parity/.local/share/io.parity.ethereum/config.toml

I have same ports above in the config.toml
Kovan is on 8555, 30313, 8556
Mainnet is on 8565, 30323, 8566

joshua-mir
@joshua-mir
@miguel567 this is most likely due to the rpc-hosts setting, if you are making curl requests from your docker host
Miguel Cabeza
@miguel567
rpc-hosts setting: can you elaborate a bit?
@joshua-mir
joshua-mir
@joshua-mir
the setting is --rpc-hosts, and only the hosts provided can make rpc requests to your node; the default is localhost only @miguel567
in the config file, it is hosts in the [rpc] section
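for example (illustrative; "all" turns the host check off entirely, so only do that on a firewalled box):

# append an [rpc] section to config.toml; values here are examples
cat >> config.toml <<'EOF'
[rpc]
port = 8555
interface = "0.0.0.0"   # listen beyond loopback so the docker host can reach it
hosts = ["all"]         # accept any Host header; restrict this on public nodes
EOF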
Miguel Cabeza
@miguel567
ok
I don't have any hosts defined in my config.toml, only the port change in the [rpc] section. I am calling the endpoint through curl from the host directly, specifying the port for each docker client. So I thought I could always call localhost:8555 (kovan port) and localhost:8565 (mainnet port).
fama
@nagarajmanjunath
Hi, in Parity inside the VM, action_params has the struct pub enum ParamsType, which has Embedded and Separate types. But whenever I deploy a contract or call a contract using web3 it is always Embedded. May I know the difference, and under what circumstances it will be Separate or Embedded?
Liting
@DongLiting_92_twitter
@miguel567 Yesterday I tried to resynchronize from a few dozen blocks before the problem block, and now I have reached the latest block height, which means that there is no problem with the block itself. Is this a probabilistic event during the synchronization process? There seems to be no better solution other than restarting.
joshua-mir
@joshua-mir
FYI everyone, if you weren't aware, https://discord.io/openethereum is the new home of discussion about OpenEthereum (f.k.a. parity-ethereum)
Miguel Cabeza
@miguel567
@DongLiting_92_twitter Same here, I already passed that block and no problems.
will move to discord then
Kael Shipman
@kael-shipman
Hey all. Just got an error (on Rinkeby) that I've never seen before, and apparently google hasn't either. It says "Insufficient trust level". See here: https://rinkeby.etherscan.io/tx/0x7badb350a48c9e2d332f724d6460eb5e44e02bea89812f27bde58d612a7b2e78. Any thoughts?