    Anna
    @annafralberts
    upload ahum packager
    JD Bothma
    @jdbothma_twitter
    @annafralberts it might mean you have duplicate classifying values
    Here's how I check for them
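    The actual check JD shared isn't captured in this log. A minimal sketch of what a duplicate-classifier check might look like (the column name `classifier` and the sample rows are purely illustrative, not from the chat):

    ```python
    from collections import Counter

    def find_duplicates(rows, column):
        """Return the values that appear more than once in the given column."""
        counts = Counter(row[column] for row in rows)
        return sorted(value for value, n in counts.items() if n > 1)

    # Illustrative data: two rows share the same classifying value.
    rows = [
        {"classifier": "A1", "amount": "10"},
        {"classifier": "A2", "amount": "20"},
        {"classifier": "A1", "amount": "30"},
    ]
    print(find_duplicates(rows, "classifier"))  # ['A1']
    ```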
    JD Bothma
    @jbothma
    I'm struggling to get a new dataset online: adjusted-estimates-of-national-expenditure-2015-16 and the retry adjusted-estimates-of-national-expenditure-2015-16-v2 - I think it was the same error as openspending/openspending#1413 under account b9d2af843f3a7ca223eea07fb608e62a but I didn't check in detail
    it's not showing up after a couple of hours like before
    Brook Elgie
    @brew
    How did this go @jbothma? Did it upload in the end?
    JD Bothma
    @jbothma
    yeah - I tried again an hour ago and just checked and looks like it was successful this time
    only thing I changed was the ID :)
    Oscar Montiel
    @tlacoyodefrijol
    Hello everyone, the OpenSpending team will be spending some time with their loved ones from December 21st to January 2nd. This means we will take some more time to respond to inquiries. Happy holidays everyone!
    Jaron Rademeyer
    @otterbotter
    Hi. I've been uploading datasets quite frequently the past few weeks and I keep running into the same issue (openspending/openspending#1442). I've uploaded another dataset almost 3 hours ago but I have no idea if it's still in progress or if it's failed. It's somewhat critical that I have access to this dataset on OS soon, so if anyone's able to help that would be great
    The dataset should be under b9d2af843f3a7ca223eea07fb608e62a:budgeted-and-actual-national-expenditure-v6
    Jaron Rademeyer
    @otterbotter
    I've managed to upload a second one, the first one most likely failed but I have no idea why
    JD Bothma
    @jbothma
    @brew I finally got a chance to use the schema type-setting in OS Packager - it's really handy!
    thanks for that addition
    since we're using goodtables with a custom schema for the pre-upload data checks, I can just add types to that so the same schema file used for data checks can be used for setting types in packager
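    The shared schema JD describes would be a Frictionless Table Schema with explicit `type` entries per field. A sketch of such a fragment and how types can drive a simple pre-upload cast step (field names here are made up for illustration, not taken from the vulekamali schema):

    ```python
    # Illustrative Table Schema fragment: the same "type" entries can be used
    # by goodtables for validation and by OS Packager for type-setting.
    schema = {
        "fields": [
            {"name": "department", "type": "string"},
            {"name": "programme", "type": "string"},
            {"name": "value", "type": "number"},
        ],
    }

    # Map schema types to Python casters for a simple pre-upload cast step.
    casters = {"string": str, "number": float, "integer": int}

    def cast_row(row, schema):
        """Cast each cell of a row according to the schema's field types."""
        return {
            f["name"]: casters[f["type"]](row[f["name"]])
            for f in schema["fields"]
        }

    print(cast_row({"department": "Health", "programme": "HIV", "value": "12.5"}, schema))
    ```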
    Brook Elgie
    @brew
    Nice! Hope it saves you some time.
    Cristhian Parra
    @cdparra
    Hi everyone! I started testing OpenSpending this week and I'm looking forward to getting a local instance running for a gov. office in Paraguay to test. Do you have any recommendations in terms of specs for a cloud server that would be good enough to host the platform for testing purposes?
    JD Bothma
    @jbothma
    Any babbage experts around?
    I just realised my facts table is absolutely massive because I'm using text columns for the joins with label tables!!
    Has anyone set babbage up so there's a text code and label in the label table, and the join column is different from the code column?
    JD Bothma
    @jbothma
    ah, I think I got it - for anyone curious, I think babbage joins the fact and label tables by automatically figuring out the primary key of the label table, so I just need to add an integer-based primary key and update the foreign key on the facts table to use those IDs
    https://github.com/OpenUpSA/municipal-data/blob/master/models/incexp.json#L12
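    The pattern JD lands on can be sketched in SQLite: give the label table an integer primary key and point the facts table's foreign key at it, so the join column is a small integer rather than a wide text code. Table and column names below are illustrative, not babbage's actual conventions:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE incexp_label (
        id    INTEGER PRIMARY KEY,  -- integer surrogate key to join on
        code  TEXT UNIQUE,          -- the original wide text code
        label TEXT
    );
    CREATE TABLE facts (
        amount    REAL,
        incexp_id INTEGER REFERENCES incexp_label(id)  -- int FK, not text
    );
    INSERT INTO incexp_label VALUES (1, 'REV-PROPERTY-RATES-0100', 'Property rates');
    INSERT INTO facts VALUES (1000.0, 1);
    """)

    # The join now goes through the compact integer key.
    row = conn.execute("""
        SELECT l.label, f.amount
        FROM facts f JOIN incexp_label l ON f.incexp_id = l.id
    """).fetchone()
    print(row)  # ('Property rates', 1000.0)
    ```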
    lolziac
    @lolziac
    hello to all, is the API working for you guys???
    or does the page https://openspending.org/admin load OK for you??!
    Adam Kariv
    @akariv
    Hey @lolziac - thanks for reporting, I’ll let the team know.
    lolziac
    @lolziac
    no problem...as always :)
    Adam Kariv
    @akariv
    cc @tlacoyodefrijol
    JD Bothma
    @jbothma
    What do you guys think is the ETA on getting it up and running again?
    JD Bothma
    @jbothma
    We have a hackathon/data viz event this weekend so it would be amazing to get it up and running
    I also need to load some more data for the event
    in the meantime I'm spinning up our own openspending instance but packager tries to request http://localhost:8080/packager/proxy?url=undefined which fails
    haven't dug into why url=undefined yet
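    The `url=undefined` in that request is the literal string a JS client sends when the variable was never set. One defensive guard a proxy endpoint could apply is sketched below; this is a generic illustration, not the actual `/packager/proxy` handler:

    ```python
    from urllib.parse import urlparse

    def valid_proxy_url(url):
        """Reject missing, 'undefined', or non-HTTP(S) proxy targets."""
        if not url or url == "undefined":
            return False
        parsed = urlparse(url)
        return parsed.scheme in ("http", "https") and bool(parsed.netloc)

    print(valid_proxy_url("undefined"))            # False
    print(valid_proxy_url("https://example.com"))  # True
    ```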
    JD Bothma
    @jbothma
    ah, the url undefined thing seems to be openspending/openspending#1378
    let's try os-data-importers - shouldn't rely on goodtables
    JD Bothma
    @jbothma
    I've managed to process enough of the Colombian datapackage in the simple branch using os-data-importers to see the default bar chart, but when I try to filter, the API doesn't respond - any ideas why that might be?
    Oscar Montiel
    @tlacoyodefrijol
    @jbothma Elasticsearch is down. This is causing problems with all of the OS components. We're working on bringing the services back up
    JD Bothma
    @jbothma
    awesome - thanks - wasn't sure what's happening - I'll let you focus on that
    JD Bothma
    @jbothma
    :clap: looks like it's alive again!
    JD Bothma
    @jbothma
    Hey guys, it looks like OS Packager is still failing to do the initial dataset check, which was previously a problem with goodtables
    Any chance you can look into that? Or is it faster to just add https://github.com/vulekamali/os-source-specs#vulekamali to os-data-importers?
    I'm pretty sure the pipelines in that branch don't work properly, but if it's OK to add them for now, I can get the new pipeline I need working in there later today
    lolziac
    @lolziac
    It looks live...but OS Packager is not working :(
    when trying to upload the file I get this error: Can't get a report from API. Reason: "Error: Reached maximum attemtps"
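    That "Reached maximum attempts" message suggests a capped retry loop around the report API. A generic sketch of the pattern (function and error names are illustrative, not Packager's actual code):

    ```python
    import time

    def fetch_with_retries(fetch, max_attempts=3, delay=0.01):
        """Call fetch() up to max_attempts times, raising after the last failure."""
        for attempt in range(1, max_attempts + 1):
            try:
                return fetch()
            except Exception:
                if attempt == max_attempts:
                    raise RuntimeError("Error: Reached maximum attempts")
                time.sleep(delay)

    # Simulated flaky endpoint: fails twice, then succeeds on the third call.
    calls = {"n": 0}
    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("API not ready")
        return "report"

    result = fetch_with_retries(flaky)
    print(result)  # 'report'
    ```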
    JD Bothma
    @jbothma
    Same
    lolziac
    @lolziac
    hope somebody from the team can check this
    JD Bothma
    @jbothma
    in the network log I then see a request to /packager/proxy?url=undefined which has come up before openspending/openspending#1378
    lolziac
    @lolziac
    same here
    Oscar Montiel
    @tlacoyodefrijol
    Hi @jbothma @lolziac. I'll open a ticket to look into the Packager issue. For now it's a good idea to add the data to the pipeline
    lolziac
    @lolziac
    Hi @tlacoyodefrijol, thanks for looking into this Packager issue. Regarding adding data to the pipeline that you mentioned, where can I read more or try it out?
    lolziac
    @lolziac
    everything seems OK now