Pi Delport
@PiDelport
@rufuspollock: Hey, I just wanted to remind you about the organisation creation request. :)
Rufus Pollock
@rufuspollock
@PiDelport we hope to get to it next week via @anuveyatsu, but we are a bit limited on time for datahub.io atm :wink:
Pi Delport
@PiDelport
Cool, thanks for the update. :)
Tito Jankowski
@titojankowski

@rufuspollock Interested?

Hi, this is Lizzie Sinnott with API World 2019, the world’s largest API event.

We’ve read CarbonDoomsDay’s coverage in ProgrammableWeb about your technology: CarbonDoomsDay Web API, and we want to feature it in our Top 250 API Industry Updates in 2019.

Who would be the right person to speak with about interviewing CarbonDoomsDay regarding your new API initiatives this year? We’ll be publishing these updates in our upcoming Top 250 API Industry Updates Report @ API World 2019 (October 8-10, San Jose Convention Center), as well as our newsletter that reaches 60,000+ subscribers.

Would love to get all the work of the datahub team some coverage!
Tito Jankowski
@titojankowski
@rufuspollock bump
Ted
@tedchou12
is there a limit on the api request for spy composition?
seenuvasan
@seenuva97541036_twitter
hi
Rufus Pollock
@rufuspollock

is there a limit on the api request for spy composition?

What do you mean?

@titojankowski can you send an email to me at rufus [dot] pollock [at] datopian.com with the intro re api world ... (sorry for the delay!)
biarm
@biarm
Hi, is there a way to use Datahub on a ChromeOS operating system?
Tito Jankowski
@titojankowski
@rufuspollock sent!
suressh
@suresh_trader_twitter
hello
Anyone know where to get historical company name changes (new and old ticker name details)?
D01110788
@D01110788
Hi there,
I am trying to publish a dataset as a zip file but I am getting the following error > Error! "size" argument must not be larger than 2147483647. Does this relate to the size of the file itself, which is 2.31 GB?
If it does, is there any option to publish files of this size?
Thanks
Anuar Ustayev
@anuveyatsu
Hi @D01110788 let us check this and get back to you.
cc/ @svetozarstojkovic
D01110788
@D01110788
If the service is not available for very large datasets, could you recommend somewhere I could host large research datasets completed as part of an MSc, relating to Twitter streamed data, Wikidata revision XML files, and their parsed derivatives?
Rufus Pollock
@rufuspollock
@D01110788 it should work
Rizvan Multani
@golden_hearts92_twitter
Hello team datahub,
I want to know about premium data.
Rizvan Multani
@golden_hearts92_twitter
I am a software programmer and I want to use your data (country (with full name, currency name and code, calling code, ISO code 2 and 3 digit, flag and TLD), state, city).
iman jazan
@iman_j_twitter
Hey guys, does anyone know where I can get the column header definitions for the EPL Season-1919_csv?
Anuar Ustayev
@anuveyatsu
@golden_hearts92_twitter Hey, yes, feel free to use it. If you have any additional requests, please email us regarding premium data.
@iman_j_twitter Hi, could you please link to the page you’re looking at?
iman jazan
@iman_j_twitter
Anuar Ustayev
@anuveyatsu
OK @iman_j_twitter, I got it now. Would you mind creating an issue here https://github.com/datopian/datahub-qa/issues/new so that we can track it there?
Frazer McLean
@RazerM
Anuar Ustayev
@anuveyatsu
@RazerM yes
Khalid Almallahi
@AbuKotsh_twitter
season-1920? Where can I get it?
Frazer McLean
@RazerM
@anuveyatsu it seems to be out of date (e.g. North Macedonia was added to https://en.wikipedia.org/wiki/ISO_3166-1 in March)
Cristiano
@Criscito88_gitlab
Hey! Do you know where I can get data about the most running teams in the Premier League over the last seasons?
Anuar Ustayev
@anuveyatsu
@AbuKotsh_twitter hey, can you please open a ticket here and describe your question with links etc so we can have a look? https://github.com/datopian/datahub-qa/issues
@RazerM hi, thank you for flagging it! Can you please open an issue here https://github.com/datopian/datahub-qa/issues ?
Nick Green
@ngreen28
I'm looking for a worldwide ISP (internet service provider) dataset; anyone know where I can find one?
danielvianna
@danielvianna
Hey guys. I'm trying to test the FedEx shipping (worldwide) API. I'm looking for a dataset that includes country codes and postal codes. I essentially need to cover the major cities in the world, because my product would not sell on a remote island in the Pacific. Any ideas how I can find this?
I'm going to write a script that sends postal code + country code pairs and checks for errors. I'm not using the FedEx public API, so I wonder if there is a glitch on their side or in my custom ecommerce code.
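The probing script described above can be sketched as a simple loop. Here `check_shipping` is a hypothetical placeholder, not a real FedEx endpoint; the real rate-request call from the ecommerce integration would be swapped in:

```python
# Probe (country_code, postal_code) pairs and record which ones the
# shipping backend rejects. `check_shipping` is a stand-in validator:
# replace it with the actual FedEx rate request being tested.

def check_shipping(country_code, postal_code):
    """Placeholder validator; replace with the real API call."""
    # Illustrative rule only: treat empty postal codes as invalid.
    return bool(postal_code)

def probe(pairs, validator=check_shipping):
    """Return the subset of pairs the validator rejects."""
    failures = []
    for country, postal in pairs:
        if not validator(country, postal):
            failures.append((country, postal))
    return failures
```

Collecting the failures in one list makes it easy to diff runs against the public API and spot whether the errors come from FedEx or from the custom integration.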
praveenkumar
@praveenkumar2295_gitlab
Hi guys, I need to integrate the Western Union API into my Laravel website. Can anyone explain how to integrate Western Union money transfers?

I will also create a business account.

DeveloperDanny
@danny_developer_twitter
Hey, I'm trying to use the airport codes resource, but the CSV is coming down with malformed UTF-8 content. No matter how I handle the data it is malformed; is anyone able to confirm this is an issue? I've tried opening it in Excel (after reading the specific instructions) and also Notepad++, EmEditor, and importing directly into a UTF-8 MySQL DB; all come up with malformed data.
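One quick check for a file like this is whether the bytes are genuinely corrupt or just encoded differently; airport names contain accented characters, so Latin-1/Windows-1252 content is a plausible culprit (an assumption, not a confirmed diagnosis). A minimal sketch:

```python
# Try strict UTF-8 first; fall back to Latin-1, which maps every byte
# value, so the second decode always succeeds and accented characters
# survive instead of appearing as mojibake.

def decode_with_fallback(raw):
    """Return (text, encoding_used) for the given bytes."""
    try:
        return raw.decode("utf-8"), "utf-8"
    except UnicodeDecodeError:
        return raw.decode("latin-1"), "latin-1"
```

If the fallback produces readable text, re-encoding it as UTF-8 before the MySQL import should clear the "malformed" symptoms.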
mikkelangaju
@mikkelangaju
It looks like the core UN/LOCODE dataset has not been updated to the newest released version. Any chance this dataset will be updated or even potentially added to the list of automatically updating datasets?
Zane Selvans
@zaneselvans
Hey there. Catalyst is finally getting close to releasing data packages for the Public Utility Data Liberation Project, and we're wondering what the current state of datahub is. It seems like it's been pretty quiet since last fall as far as blog posts and updates to, say, the data-cli repository on GitHub. The command line interface and direct API access from within Python/R etc. would be very useful for giving our users the easiest possible access to the data. Is it all working well? Are there things going on in the background?
Zane Selvans
@zaneselvans
When I install the data CLI tool from npm (or yarn) I get a slew of deprecation messages, and then when I try to download my test datapackage I get:
$ data get https://datahub.io/zaneselvans/pudl-msha 
(node:20955) [DEP0066] DeprecationWarning: OutgoingMessage.prototype._headers is deprecated
raven@2.6.4 alert: failed to send exception to sentry: Connection timed out on request to sentry.io
> Error! primordials is not defined
I'm attempting to pull one of the tabular data resources inside that package into a pandas dataframe, as per the example on the page, but it's been half an hour and it still hasn't returned...
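If the datapackage helper hangs like this, fetching the resource's raw CSV URL directly and parsing it yourself sidesteps the CLI entirely. A stdlib-only sketch of that pattern (pandas' `read_csv` would accept the same URL; the sample rows below are made up, not real pudl-msha data):

```python
import csv
import io

def rows_from_csv(text):
    """Parse CSV text into a list of dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(text)))

# In practice `sample` would come from downloading the resource's CSV URL,
# e.g. urllib.request.urlopen(url).read().decode("utf-8").
sample = "mine_id,year,tons\n100012,2017,5400\n100013,2017,3100\n"
records = rows_from_csv(sample)
```

Reading the raw file also separates network problems (a slow or stalled download) from client problems (the Node error above), which helps narrow down where the half-hour hang is coming from.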
Pattamapong Rattana
@MickeyyGot_twitter
Does anyone have data dictionaries for the free datasets?
Or descriptions of the columns?