AusIV
@AusIV
@victorwiebe - I don't have an Azure template, but you're not going to connect to Kovan from Geth - only Parity/OE supports Kovan.
Oleg Kubrakov
@yellowred
Hi, has anyone seen an error like "got null header for uncle 0 of block 2a4824edbd712d3cd56a3b85cc538d884a887cce059fb44a5c0d34782353dce4" when querying a block? Any idea what it means?
Saber Fallah
@kooshabelt_gitlab
Hello, please help me: I'm getting a "cookie has not been sent" error
Lana Davis
@Lana_davis_gitlab
Hello
Ethan Waldo
@Dishwasha
Hello. In geth I'm trying to figure out how the value returned by algorithm.hashimotoFull is used to generate the next block during mining. It takes the "hash" and returns the digest, which is assigned to header.MixDigest in sealer.Seal. Where does the new "hash" for the next block come from?
The digest/header.MixDigest returned by hashimotoFull doesn't seem to match anything in the block or next block if I use the block's hash and solved nonce as the input.
I've tried first and latest epoch using 1.9.20-stable
fernandezpaco
@fernandezpaco
Hi, can anyone please help me with this issue? ethereum/go-ethereum#21551
Alex Coventry
@coventry
Is there an easy way to run abigen -solc with compiler optimizations turned on? Or should I use the solc JSON output with abigen --combined-json?
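For anyone else going the combined-JSON route, a hedged sketch (file and package names are placeholders; assumes `solc` and `abigen` are on the PATH):

```shell
# Compile with optimizations and emit the combined JSON that abigen consumes:
solc --optimize --combined-json abi,bin MyContract.sol > combined.json

# Generate Go bindings from the combined output:
abigen --combined-json combined.json --pkg mycontract --out mycontract.go
```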
0xNick
@0xEther_twitter

Hi all, if I can abruptly ask a question here:

geth --syncmode "light" getting stuck at:

INFO [09-11|19:11:43.053] Mapped network port proto=udp extport=30303 intport=30303 interface="UPNP IGDv1-IP1"

Have tried resyncing clock, deleting chain data, double and triple checking 30303 is open. No luck. Any ideas?

0xNick
@0xEther_twitter
Ok so digging deeper it looks like it's tricky to connect to peers as a light client. I'm using a list of 12 enode addresses but even that is failing me. Is this just something light clients have to deal with? I just want to fork mainnet and test some contracts.
Marius van der Wijden
@MariusVanDerWijden
@0xEther_twitter Unfortunately yes, there are currently too few full nodes that serve light clients.
If you run a full node, you can use --light.serve 200 to enable the light server, which lets light clients sync off your node
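A sketch of that setup, using the value suggested above (200 is the maximum percentage of time the node may spend serving light-client requests; values over 100 are allowed with multi-threaded processing):

```shell
# Run a full node that also serves light clients.
geth --syncmode fast --light.serve 200
```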
0xNick
@0xEther_twitter
Thanks @MariusVanDerWijden
dev12391
@dev12391
Hi,
Does anyone know what these item numbers mean?
Initializing fast sync bloom items=534467793
PetrMixayloff
@PetrMixayloff
Hi, everybody. My task is to create a system on a blockchain in which you can exchange different goods. I am currently studying what Ethereum offers. The idea is that instead of nodes there will be warehouses, and instead of currency there will be different goods, which the warehouses exchange. Is it possible to build such a system on Ethereum, or is it better to look in the direction of Hyperledger? So far, everything I find about Ethereum is about currencies and mining, but I don't need mining at all. If anyone has seen an example or tutorial like this, I would be glad of any useful information. The topic of blockchain is still new to me
Aleš Ferlan
@Alko89
hi, we are running a private chain on an old geth version. The VPS ran out of space and now we've lost a few blocks. Reading through the issues I found issue #19124, which describes the same problem; it was fixed in a PR for 1.9, but there is no definitive answer on getting the blocks back
would a restore onto an updated geth solve the issue?
Aleš Ferlan
@Alko89
I'm trying to import an RLP file with a newer version of geth and get an "invalid difficulty" error. Is there a way to get around it?
François-René Rideau
@fare
Do you have only one node, or is there at least one node that didn't run out of space from which to recover?
Saber Fallah
@kooshabelt_gitlab
0xf5Cca70EbCF5f34b8921aA64dBBEC75F6169546f
Kurth Alexandre
@kurthalex_twitter
Hi! Running a geth node, I need to keep only the states of the last 2 weeks. I synchronize it in fast mode; once done, my understanding is that only the most recent states are kept. Looking at the docs, it seems possible to set this value:
"Geth will regenerate the desired state by replaying blocks from the closest point in time before B where it has full state. This defaults to 128 blocks max, but you can specify more in the actual call ... "reexec":1000 .. } to the tracer."
Has anyone already tried changing this value? I can't find example commands
Also, the next line in the doc says: "Sorry, can't be done without replaying from genesis."
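For what it's worth, a hedged sketch of passing reexec to a tracer from the geth console via `geth attach --exec` (the transaction hash below is a placeholder, not a real transaction; assumes the default IPC endpoint of a running node):

```shell
# Trace a transaction, allowing geth to replay up to 1000 blocks back
# to regenerate the state it needs (instead of the default 128).
geth attach --exec 'debug.traceTransaction(
  "0x0000000000000000000000000000000000000000000000000000000000000000",
  { reexec: 1000 })'
```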
Kurth Alexandre
@kurthalex_twitter
does it mean that I need to sync in archive mode from the beginning? Or is it compatible with fast sync?
So my question was about keeping the states of the last 2 weeks, but I also have another one, maybe simpler. Is it possible to start keeping all the states of a node from now on? I mean, I don't need the past 5 years of states, I just need to keep states from now.
Can I fast sync, then stop the node and launch it again in archive mode, so that states will be kept? I just don't want the node to restart processing and storing state from the beginning
thanks :-)
Aleš Ferlan
@Alko89

> Do you have only one node, or is there at least one node that didn't run out of space from which to recover?

nope, it's actually a project we took over, and there was only one node, which we intend to replace with authtrail eventually

AusIV
@AusIV
@kurthalex_twitter - You can fast sync, then restart your node with --gcmode=archive - I do it all the time
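A sketch of that workflow (flag names as of geth 1.9.x; both steps use the default data directory):

```shell
# 1. Initial sync in fast mode (the default sync mode in geth 1.9.x):
geth --syncmode fast

# 2. Once synced, stop the node (Ctrl-C), then restart in archive mode
#    so that all states from this point on are retained:
geth --syncmode full --gcmode archive
```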
Kurth Alexandre
@kurthalex_twitter
Thank you :-)
btclinux
@btclinux
Hi! I have several pending transactions from the same account with a low gas price. What if I send one more transaction with the next nonce but a high gas price? Can that help get the whole queue of pending transactions behind the last "expensive" transaction mined?
If that doesn't work, then my next question is: how do I include "self" transactions in a block mined with geth?
Marius van der Wijden
@MariusVanDerWijden
No, you need to resend the pending transactions with a higher gas price. A transaction can only be executed once all previous transactions have been executed.
What do you mean by a "self" transaction?
Geth automatically assembles the block for you and sets the coinbase address if you mine with it.
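A minimal sketch of replacing a stuck transaction from the console, assuming an unlocked account on the node; the nonce and gas price are placeholders. Re-sending with the same nonce and a higher gas price replaces the pending transaction:

```shell
# Resend with the stuck transaction's nonce (42 is a placeholder)
# and a higher gas price; sending to yourself works for a replacement.
geth attach --exec 'eth.sendTransaction({
  from: eth.accounts[0], to: eth.accounts[0], value: 0,
  nonce: 42, gasPrice: web3.toWei(50, "gwei")})'
```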
zeoio
@zeoio
block lost number=1834749 hash="aaf9e1…21e72a"
Does anyone know the cause of this error?
Bob P
@snek1_gitlab
So I have a fully synced geth node running right now, but I can't geth attach to the websocket. Has anyone had an issue like this/knows how to resolve it? When I first started geth I set up the ws flags: --ws --ws.addr localhost --ws.port 8546 --ws.api personal,eth,web3,admin,txpool,debug
Makes no sense to me! If anyone can shed some light on this issue that would be great and much appreciated. I can't connect to ws over web3 on my node either!
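For reference, a sketch of both sides, using the flags quoted above; one common gotcha is that the attach URL must use the ws:// scheme, since a bare host:port is treated as an IPC path:

```shell
# Start geth with the WebSocket server enabled:
geth --ws --ws.addr localhost --ws.port 8546 \
     --ws.api personal,eth,web3,admin,txpool,debug

# In another terminal, attach over WebSocket (note the ws:// scheme):
geth attach ws://localhost:8546
```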
Bob P
@snek1_gitlab
If I stop geth to change command flags, can I restart it to sync again quickly?
ldeffenb
@ldeffenb
@snek1_gitlab If you restart geth, it should pick up at whatever sync level it is and continue towards getting fully synced.
Provided, of course, that you don't reset the DB or anything like that!
Bob P
@snek1_gitlab
Ok cool, thanks
Kike de la Calle
@KikedelaCalle1_twitter
Hello, someone have some rinkeby eth to send me? none of the faucets are working
0xf0Ae4377bbA67801A8a8CaB2bF40777700d8350E
Baron Wilhelm Stein
@WilhelmStein
Is the memory JSON object returned by the debug_traceTransaction JSON-RPC call a list whose items are 32-byte values?
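In case it helps anyone reproduce this, a sketch of the call over HTTP JSON-RPC (the transaction hash is a placeholder; assumes the debug API is exposed on the node's HTTP port 8545):

```shell
# Trace a transaction and inspect the returned structLogs/memory fields.
curl -s -X POST -H 'Content-Type: application/json' \
  --data '{"jsonrpc":"2.0","method":"debug_traceTransaction","params":["0x0000000000000000000000000000000000000000000000000000000000000000",{}],"id":1}' \
  http://localhost:8545
```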
DeepalUsyD
@DeepalUsyD
Hi, Is there a way to retrieve raw transactions from a geth node?
btclinux
@btclinux
@MariusVanDerWijden Thanks.
Let me explain. Is it possible to include my own outgoing transactions in the next block that will be mined on my node?
I want to save on the fees for pool payouts
Like big pools do
In other words, how do I include a specific transaction in a block?
Marius van der Wijden
@MariusVanDerWijden
Yes, that is in theory possible; however, stock geth does not work that way. You would need to modify the source code to make it work.
Unfortunately I can't help you with that. Also, a change like this is very unlikely to make it into the upstream repo.