FZFalzar
@FZFalzar
Anyway @danmactough are there any plans to support the <lastBuildDate> tag for feedparser?
Dan MacTough
@danmactough
@FZFalzar Not sure what you mean by “support”, but we do parse lastBuildDate as one of the date fields https://github.com/danmactough/node-feedparser/blob/master/main.js#L430
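(A quick illustration of where that lands: a minimal sketch, assuming lastBuildDate surfaces through the normalized date fields on the parsed meta object; the logging and the piping comment are illustrative.)

```js
var FeedParser = require('feedparser');

var feedparser = new FeedParser();

feedparser.on('error', console.error);

feedparser.on('readable', function () {
  // The channel-level dates (pubDate, lastBuildDate, ...) are normalized onto
  // this.meta rather than exposed as a separate field per tag.
  console.log('meta date:', this.meta.date, this.meta.pubdate);

  // Drain the items so the stream can finish.
  while (this.read() !== null) {}
});

// Pipe fetched feed XML into feedparser here, e.g. from an HTTP response.
```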
Doug Clark
@douglas-b-clark
@danmactough Is there any way to get feedparser to give me a JSON object that includes the metadata and an array of articles? I really don't need any of the events - I'm parsing a static string of XML previously pulled from an RSS feed. Is there any way to get at that data without going through n number of on.readable callbacks?
Doug Clark
@douglas-b-clark
alternatively, is there a way to know that the last article of a feed has been processed? I'm not getting an 'end' event.
Dan MacTough
@danmactough
@douglas-b-clark I’ve heard other reports of the end event not firing. Can you open an issue with some reproducible code?
regarding the built-up JSON, one of the key features of feedparser is that it is a streaming parser, eliminating the need to build up an arbitrarily large result object. If you need that object, you can build it up yourself, however.
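For Doug's static-string case, a minimal sketch of building that object yourself might look like this (the function name and callback shape are illustrative, not part of feedparser):

```js
var FeedParser = require('feedparser');

// Parse a static XML string into a single { meta, items } object.
function parseFeedString(xml, callback) {
  var feedparser = new FeedParser();
  var result = { meta: null, items: [] };

  feedparser.on('error', callback);

  feedparser.on('readable', function () {
    result.meta = this.meta; // channel metadata is always available on the instance
    var item;
    while ((item = this.read()) !== null) {
      result.items.push(item);
    }
  });

  feedparser.on('end', function () {
    callback(null, result);
  });

  // feedparser is a writable stream, so a whole string can be written in one go.
  feedparser.end(xml);
}
```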
Doug Clark
@douglas-b-clark
@danmactough Thanks for the response. I will try to log a bug on Monday. I have found, however, that the "drain" event is being fired after all of the articles have been run through the "readable" event, so I am able to build my own result object as you suggest.
Doug Clark
@douglas-b-clark
Bug logged. Thanks for your help @danmactough .
Dan MacTough
@danmactough
thanks, @douglas-b-clark I will take a look.
Josh Finnie
@joshfinnie
@danmactough qq
does node-feedparser accept etag and last-modified args?
Dan MacTough
@danmactough
@joshfinnie no. feedparser doesn’t do anything but feed parsing. you’re responsible for fetching the feed etc. For example: https://github.com/danmactough/node-feedparser/blob/master/examples/compressed.js
Josh Finnie
@joshfinnie
k, thx
Dan MacTough
@danmactough
If you’re going to perform conditional GETs (and you should!), then instead of treating anything non-200 as an error https://github.com/danmactough/node-feedparser/blob/master/examples/compressed.js#L20 you would just also check for 304 and skip.
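A rough sketch of that conditional-GET flow, loosely following the linked compressed.js example (the request library mirrors that example; the cache object and its field names are assumptions for illustration):

```js
var request = require('request');
var FeedParser = require('feedparser');

// cache holds the validators saved from the previous successful fetch,
// e.g. { etag: '...', lastModified: '...' }.
function fetchFeed(url, cache) {
  var req = request(url, {
    headers: {
      'If-None-Match': cache.etag || '',
      'If-Modified-Since': cache.lastModified || ''
    }
  });

  req.on('error', console.error);

  req.on('response', function (res) {
    if (res.statusCode === 304) {
      return; // Not modified since the last fetch: nothing new to parse, just skip.
    }
    if (res.statusCode !== 200) {
      return this.emit('error', new Error('Bad status code: ' + res.statusCode));
    }

    // Remember the validators for the next conditional GET.
    cache.etag = res.headers['etag'];
    cache.lastModified = res.headers['last-modified'];

    var feedparser = new FeedParser();
    feedparser.on('error', console.error);
    feedparser.on('readable', function () {
      var item;
      while ((item = this.read()) !== null) {
        console.log(item.title);
      }
    });

    this.pipe(feedparser); // with request, piping the request streams the response body
  });
}
```

(The real compressed.js example also deals with gzip/deflate responses; that part is omitted here.)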
Josh Finnie
@joshfinnie
Yeah, looking at something like this: https://pythonhosted.org/feedparser/http-etag.html
Dan MacTough
@danmactough
yup. The Python feedparser library is canonical, as it was basically co-developed with RSS/Atom (and conditional GET, for that matter)
Dimitris Karittevlis
@dka09
Hi, is there a way to use the library without streams? Wait for the full response before handling it?
Dan MacTough
@danmactough
@dka09 you do need to handle the stream, but you can just buffer it (push each item onto an array) and then handle the buffered results when the stream is done. I recommend you use streams, though, to avoid the unknown memory consumption caused by buffering the entire result
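A sketch of that buffer-then-handle approach, wrapped in a Promise so the caller only sees the finished array (the helper name is illustrative):

```js
var FeedParser = require('feedparser');

// Consume the stream internally; resolve with every parsed item once parsing ends.
function parseAll(feedStream) {
  return new Promise(function (resolve, reject) {
    var feedparser = new FeedParser();
    var items = [];

    feedparser.on('error', reject);

    feedparser.on('readable', function () {
      var item;
      while ((item = this.read()) !== null) {
        items.push(item);
      }
    });

    feedparser.on('end', function () {
      resolve(items);
    });

    feedStream.on('error', reject);
    feedStream.pipe(feedparser);
  });
}
```

As Dan says, this is fine for feeds of known, modest size; memory use grows with the feed when you buffer everything.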
Thierry Schellenbach
@tschellenbach
Any tips for debugging the library when parsing fails? danmactough/node-feedparser#183
Kevin Minehart
@kminehart
Hey guys, anyone here?
Kevin Minehart
@kminehart
having trouble with my readstream: the readable event is never getting fired. Any ideas? I picked this app back up from September; I can't imagine much has changed, right?
Dan MacTough
@danmactough
@kminehart can you link to a gist with the code?
Kevin Minehart
@kminehart
It's possible that the feeds only contain metadata, which would point to a different problem then :joy:
cat feed_34320_2016-11-15T14-* | grep body returns 0 results. So that looks like what's happening.
Dan MacTough
@danmactough
Ah. Only metadata. That’s not the first time I’ve heard that. I gotta say, stream parsing is crucial, but mapping the stream API to the structure of RSS feeds is difficult.
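For anyone else chasing the same symptom: a feed that contains only channel metadata produces no items, but the metadata is still reachable on the parser instance. A rough sketch, assuming the stream still ends cleanly when no entries are pushed (the empty-feed bookkeeping is illustrative):

```js
var FeedParser = require('feedparser');

var feedparser = new FeedParser();
var itemCount = 0;

feedparser.on('error', console.error);

feedparser.on('readable', function () {
  var item;
  while ((item = this.read()) !== null) {
    itemCount++;
  }
});

feedparser.on('end', function () {
  // this.meta holds the channel metadata even when the feed had no entries.
  if (itemCount === 0) {
    console.log('metadata only, no entries:', this.meta && this.meta.title);
  } else {
    console.log(itemCount + ' entries from:', this.meta.title);
  }
});

// someFeedStream.pipe(feedparser); // the fetched feed body
```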
Kevin Minehart
@kminehart
Yeah. It works pretty well when the inputs are fine! There are so many weird options in the Associated Press' WebFeedsAgent application.
hard to get it all correct
Good thing there's a way to only download feeds with entries :)
royalcrown28
@royalcrown28
I'm having the hardest time parsing mmo-champions RSS feed descriptions
EMEHINOLA Idowu
@hydeenoble
Good day everyone
I'm just trying to use node-feedparser for the first time... please, is there any documentation or something?
William
@Wonko7
Hi all
I'm basing my code on the compressed example
when I try to read http://rss.slashdot.org/Slashdot/slashdotMainatom I only get the first 3 entries
not having this problem on https://xkcd.com/atom.xml
any idea what I could be messing up?
Dan MacTough
@danmactough
@Wonko7 Can you point to a gist with your code?
William
@Wonko7
are you up for some clojurescript?
I won't annoy you with that, but there's no reason it should stop, right? There's no date limit on the entries or anything?
Dan MacTough
@danmactough
feedparser is just a streaming parser. as long as you keep feeding it a feed, it will keep parsing it and emitting items. no limits or sorting or anything like that.
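The practical consequence, which William works out just below, is that 'readable' fires repeatedly as chunks of the feed arrive, so the handler has to drain whatever is currently buffered each time rather than assuming one event per feed. A minimal sketch of the consuming side (the counters are illustrative):

```js
var FeedParser = require('feedparser');

var feedparser = new FeedParser();
var readableFires = 0;
var entries = 0;

feedparser.on('error', console.error);

feedparser.on('readable', function () {
  readableFires++; // fires once per buffered batch, not once per feed
  var item;
  while ((item = this.read()) !== null) {
    entries++;
  }
});

feedparser.on('end', function () {
  // A long feed typically produces several 'readable' events
  // but still yields every entry the feed contains.
  console.log(entries + ' entries across ' + readableFires + ' readable events');
});

// Pipe the fetched feed in, as in the compressed example.
```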
William
@Wonko7
ooh got it, new to node too, readable can be fired multiple times
everything works, thanks
Dan MacTough
@danmactough
:+1:
William
@Wonko7
danmactough: hey, thanks for this, in like 3 days I managed to fill the rss reader shaped hole in my life
Dan MacTough
@danmactough
That's awesome to hear @Wonko7
Cameron Panagrosso
@Cagrosso
Hi Dan, I'm having some trouble understanding what's going on with your Usage example for feedparser.
Why is it that in req.on('response') you are piping the 'stream' to feedparser instead of the response? When I do that, I get an error saying that 'pipe' is not a function. On a whim, I decided to try to pipe the response in and it worked!
Dan MacTough
@danmactough
@Cagrosso if you link to a gist I can take a look, but it's likely a copy paste error on your end. The example code is correct.
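For anyone else puzzled by the same line: in the usage example, `stream` is assigned from `this` inside the 'response' handler, so it refers to the request object, and with the request library piping the request streams the response body. Roughly this pattern (a paraphrase under that assumption, not the exact example; the URL is illustrative):

```js
var request = require('request');
var FeedParser = require('feedparser');

var req = request('http://somefeed.example/feed.xml');
var feedparser = new FeedParser();

req.on('error', console.error);

req.on('response', function (res) {
  var stream = this; // `this` is `req`; with request, it streams the response body

  if (res.statusCode !== 200) {
    this.emit('error', new Error('Bad status code'));
  } else {
    stream.pipe(feedparser); // effectively the same as piping the response body
  }
});

feedparser.on('error', console.error);

feedparser.on('readable', function () {
  var item;
  while ((item = this.read()) !== null) {
    console.log(item.title);
  }
});
```

If the fetch uses Node's plain http/https instead of the request library, the request object is not a readable stream, so piping the response (`res`) is the way to go.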
betaredex
@betaredex
hey