LilSpazJoekp
@LilSpazJoekp

Okay so I need some help with writing tests for my pull request. When I run python3.7 setup.py test it fails with a bunch of errors similar to:

prawcore.exceptions.RequestException: error with request A request was made that could not be handled.
E           
E           A request was made to https://oauth.reddit.com/r/all/new?limit=100&raw_json=1 that could not be found in TestSubredditStreams.submissions.
E           
E           The settings on the cassette are:
E           
E               - record_mode: once
E               - match_options {'method', 'uri'}.

/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/prawcore/requestor.py:49: RequestException

I have no experience with betamax or writing unit tests. Any help would be greatly appreciated. Thanks!

Joe RH
@jarhill0
Do you understand how Betamax and its cassettes work?
LilSpazJoekp
@LilSpazJoekp
I have a basic understanding
cassettes play back requests
Joe RH
@jarhill0

This error happens when you modify the test that corresponds to a particular cassette. The fix is to delete the relevant cassette.

However, the error you've posted seems to be from submissions streams (which I don't believe you were modifying in my PR, but correct me if I'm wrong). My guess is that you copied and pasted a with line that invokes a cassette without changing the cassette name.

It's difficult to diagnose something like this without seeing the code though.
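To see why a modified test breaks an existing cassette: Betamax only replays a recorded response when the live request matches a cassette entry on the configured match options, here `{'method', 'uri'}`. This is a plain-Python sketch of that matching rule, not Betamax's actual implementation; the cassette contents are made up for illustration:

```python
def find_recorded_response(cassette, method, uri):
    """Return the recorded response whose method and URI both match,
    or None if no interaction matches (Betamax raises RequestException
    in that case under record_mode 'once')."""
    for interaction in cassette:
        if interaction["method"] == method and interaction["uri"] == uri:
            return interaction["response"]
    return None

# Hypothetical cassette contents for TestSubredditStreams.submissions:
cassette = [
    {
        "method": "GET",
        "uri": "https://oauth.reddit.com/r/all/new?limit=100&raw_json=1",
        "response": "recorded JSON body",
    },
]

# The original request replays fine:
assert find_recorded_response(
    cassette, "GET",
    "https://oauth.reddit.com/r/all/new?limit=100&raw_json=1",
) == "recorded JSON body"

# But if the code under test now sends even one extra query parameter,
# the URI no longer matches and Betamax refuses the request:
assert find_recorded_response(
    cassette, "GET",
    "https://oauth.reddit.com/r/all/new?limit=100&raw_json=1&before=abc",
) is None
```

That mismatch is exactly the "could not be found in TestSubredditStreams.submissions" error above, and why deleting and re-recording the cassette fixes it.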
LilSpazJoekp
@LilSpazJoekp
this is the commit my code is at: LilSpazJoekp/praw@cce4180
I haven't changed anything in the tests folder
Joe RH
@jarhill0
Huh. I'm going to pull your branch and try running the tests on my end.
LilSpazJoekp
@LilSpazJoekp
That commit contains modifications that were made by @bboe, and it modified stream_generator
Joe RH
@jarhill0
I have reproduced the failure on my end.
One second, let me poke around. My guess is that bboe's commit required a cassette to be re-recorded but that didn't get pushed.
LilSpazJoekp
@LilSpazJoekp
I think you're right.
I deleted the cassettes that have stream in the name and it is failing less
Joe RH
@jarhill0
My guess is incorrect, because those cassettes haven't been touched in 3 years in master, yet tests pass there.
LilSpazJoekp
@LilSpazJoekp
hm
Joe RH
@jarhill0
Oh, wait a second.
This change ("Remove use of before parameter in streams") isn't in master at all. Where did that come from? Why did bboe push it to your branch?
LilSpazJoekp
@LilSpazJoekp
that's a good question
I don’t know
Joe RH
@jarhill0
Why is that being included (via bboe's commit) in your PR, which is ostensibly about moderation streams only?
I guess I'm a little out of the loop since I haven't been following this PR. I guess these are questions for @bboe.
Bottom line is: since that commit changes how streaming works, the cassettes will need to be deleted and re-recorded. That should solve your testing issue. But why are those changes being made in the first place?
LilSpazJoekp
@LilSpazJoekp
I think I know why
There was talk of removing the before aspect from the stream generator praw-dev/praw#1050
LilSpazJoekp
@LilSpazJoekp
LilSpazJoekp
@LilSpazJoekp
In my PR, modmail_conversation() is one of the streams it adds, and it relies on stream_generator not passing params to the function, because modmail.conversations() will not accept params from stream_generator.
Joe RH
@jarhill0
I see that discussion, but it's confusing to me why bboe included those changes in your PR. I see that part of your PR relies on them, but I still think they should be a separate PR.
LilSpazJoekp
@LilSpazJoekp
Yeah I agree
Bryce Boe
@bboe
I'm not sure how that commit got back on your branch @LilSpazJoekp. Perhaps when you pulled it didn't reset the branch?
(screenshot of the branch's commit history)
That shows that commit being added after I force pushed to update. Either way you can fix it via git reset 8e53498d4fe75048d35a6ff654bb0c9c318f5137 and then force push the branch.
Let's take modmail_conversation out of the PR for now.
The before param will need to stay in stream_generator for now.
LilSpazJoekp
@LilSpazJoekp
can do
Joshua Ashkinaze
@josh-ashkinaze
Hey, I posted a question on reddit and I might as well ask here: is there any way to get the comment level depth for comments on a post? Strikes me as odd that comment objects don't have a depth attribute
Joe RH
@jarhill0
I don't think there is. You could find it manually by counting parents, but that's fairly time-intensive.
Joshua Ashkinaze
@josh-ashkinaze

Well how would I even count parents?

There is a replies attribute. Though as I understand it, that lists the comments below a comment.

Joe RH
@jarhill0
Use .parent with an incrementor inside a loop
Joshua Ashkinaze
@josh-ashkinaze
Like this? This throws an error though
def get_post_comments(post, comment_limit):
  comments = []
  post.comments.replace_more(limit=comment_limit)
  for comment in post.comments.list():
    depth = 1 # assume a parent comment 
    for comment in comment.parent:
      depth+=1
    return comments.append([comment.body, depth])
Joe RH
@jarhill0
Try something like this (I'm on my phone, so sorry for mistakes):
tracker = comment
while hasattr(tracker, 'parent'):
    depth += 1
    tracker = tracker.parent
Joe RH
@jarhill0
@josh-ashkinaze idk if this is you or not but I just wrote an answer to the same question on StackOverflow https://stackoverflow.com/questions/57243140/how-to-store-a-comments-depth-in-praw-python-reddit-api-wrapper/57279550#57279550
Joshua Ashkinaze
@josh-ashkinaze
yes yes that was me
def parse_depth(comment):
  """
  Return the depth of a comment.

  Keep checking for a parent of a comment, 
  until a comment has no parent, 
  at which point the root is reached. 
  """
  depth = 0
  tracker = comment
  while hasattr(tracker, "parent"):
    parent = tracker.parent()
    tracker = parent
    depth+=1
  return depth
worked for me
I think your code did not work because it did not update parent correctly. To be fair, you did warn me you were on your phone so my bad
Joe RH
@jarhill0
@josh-ashkinaze Did you read my answer on StackOverflow? Based on the context here, I didn't realize that you were iterating over all the comments in a single thread. The approach I shared in this chat, which you posted another version of above, is going to be very inefficient for your use case.
Every call to parse_depth makes one network request per level of depth, which is highly inefficient
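For the whole-thread case, all the depths can be computed in a single pass over the flattened comment list using each comment's parent_id, with no network requests at all. This is a sketch, assuming (as I believe PRAW's `post.comments.list()` guarantees after `replace_more`) that parents appear before their children, and it stands in plain objects for real Comment instances:

```python
from types import SimpleNamespace


def depths_from_flat_list(comments):
    """Map each comment's id to its depth, in one pass.

    Assumes Reddit-style parent_id values: "t3_..." when the parent is
    the submission itself (depth 0), "t1_<id>" when the parent is
    another comment, and that parents precede children in `comments`.
    """
    depth_by_id = {}
    for comment in comments:
        if comment.parent_id.startswith("t3_"):
            depth_by_id[comment.id] = 0  # top-level comment
        else:
            parent_id = comment.parent_id.split("_", 1)[1]
            depth_by_id[comment.id] = depth_by_id[parent_id] + 1
    return depth_by_id


# Tiny demo with stand-in objects (real PRAW comments expose the same
# id and parent_id attributes):
demo_thread = [
    SimpleNamespace(id="aaa", parent_id="t3_post"),
    SimpleNamespace(id="bbb", parent_id="t1_aaa"),
    SimpleNamespace(id="ccc", parent_id="t1_bbb"),
    SimpleNamespace(id="ddd", parent_id="t3_post"),
]
```

Calling `depths_from_flat_list(demo_thread)` gives depth 0 for "aaa" and "ddd", 1 for "bbb", and 2 for "ccc", without ever touching the network.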
Joshua Ashkinaze
@josh-ashkinaze
ah got it