    got it.. this was bad on my part.. I was hoping to see results as I was typing..
    some sort of typeahead.. but it seems like I have to press enter..
    Kelly Norton
    @rkothari007 ah yeah, been thinking about adding that in some form ... but haven't because some searches can still take a bit.
    Thanks for the debugging!
    Kelly Norton
    np, sorry for the confusion
    Sam Jordan
    Hi! What versions of jsx, node, and other frontend stuffs are required for the web GUI?
    (I'm running into some warnings and errors on the web GUI, so I assume my setup must be weird/wrong)
    i.e.:
    • Warning: React.renderComponent will be deprecated in a future version. Use React.render instead. react-0.12.2.min.js:18108
    • Warning: Unknown DOM property autocomplete. Did you mean autoComplete? react-0.12.2.min.js:9673
    • Each child in an array should have a unique "key" prop. Check the render method of SearchBar. See http://fb.me/react-warning-keys for more information.
    (and after a couple of searches: Uncaught Error: Invariant Violation: ReactMount: Two valid but unequal nodes with the same `data-reactid`: .0.1.0)
    Kelly Norton
    @sfjordan the versions should be bundled in https://github.com/etsy/Hound/tree/master/ui/assets/js
    @sfjordan I can't reproduce the warning, can you tell me more about the steps you took
    Sam Jordan
    Thanks! I found them eventually after poking around more. I traced the issue back to a chrome extension I had, which was manipulating the DOM somehow in a way react didn't like. Sorry to bother you!
    Hello, we have set up Hound to index ~50 GB of source files. Some searches, however, run 30s+ while others return within milliseconds. Basically, searches with more results take a lot longer than searches with few results. A search with ~200 files in the result takes around 30 seconds, for example. While the search is running we don't see a lot of I/O or I/O wait, but 100% CPU.. Is this normal?

    To add to what @Spacefish is saying: I just built a plain codesearch csearchindex and used csearch for the same string (on the same machine), and hound for the same lookup (around 240 hits, in ~50 GB of source files):

    • hound: ~ 33s
    • codesearch: ~1s
    • csearch -n <SEARCHSTRING> | awk -F':' '{ system("< "$1" tail -n $(("$2"-5)) | head -n 10") }' > resultfile.txt: ~2.5-5s

    The last part was just to check whether I/O is the problem (since hound has to open all the files and grep the resulting lines too); while it takes a few seconds, it's much faster than hound. I'm looking at the code, but I have to learn Go along the way, so this may take a while. It looks like the part reading/grepping the files is causing this.

    Just noticed there's a "+" missing in the tail-call.. just for sake of completeness :)
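    For reference, the missing "+" matters because tail -n N prints the last N lines, while tail -n +N starts at line N. A minimal standalone sketch of the context-extraction step, using a generated file with known line numbers instead of real csearch output:

    ```shell
    #!/bin/sh
    # Show 10 lines of context starting 5 lines before a match,
    # as the awk pipeline above does for each csearch hit.
    seq 1 100 > /tmp/ctx_demo.txt    # stand-in source file; line N contains "N"
    match_line=50                    # hypothetical match line number
    tail -n +$((match_line - 5)) /tmp/ctx_demo.txt | head -n 10
    # prints lines 45 through 54
    ```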
    Rohan Kapadia
    is there any method for me to add auth on top of my deployment?
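    As far as I know, Hound doesn't handle authentication itself, so a common approach is to put a reverse proxy with basic auth in front of it. A minimal nginx sketch (hostname and htpasswd path are placeholders, assuming Hound on its default port 6080):

    ```nginx
    # Hypothetical nginx front-end adding HTTP basic auth in front of Hound.
    server {
        listen 80;
        server_name hound.example.com;

        location / {
            auth_basic           "Hound";
            auth_basic_user_file /etc/nginx/hound.htpasswd;
            proxy_pass           http://127.0.0.1:6080;
            proxy_set_header     Host $host;
        }
    }
    ```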
    Seshumadhav Chaturvedula
    I am unable to get the search results updating as I type.. Is anyone facing the same problem?
    Andrew Grossnickle
    Hello. I am running into an issue where hound is dying and I am seeing Message: can't create pivot_root dir, error mkdir /var/lib/docker/devicemapper/mnt/1bcaf52a1d9fdf2102f148b1f9e79d7adf28b0552e8f095afce8e9cdb662cf47/rootfs/.pivot_root109700853: in the logs
    Kelly Norton
    @TeraInferno does hound have write access?
    I assume you are running via docker.
    Andrew Grossnickle
    @kellegous I am running hound via docker. Hound does indeed have write access. Some details on the setup:
    I am running hound on a CentOS VM. It is using the devicemapper docker storage option (currently the loopback setting). I have 16 GB allocated for docker data and 2 GB allocated for docker metadata.
    Breahna David
    Hi guys,
    I tried to configure hound to catch my eslint from jsx files,
    but had some problems :worried:
    Can anyone help me?
    Seth Reno
    Is it possible to include the username and password for a git repo in the config.json? I've done this for svn, but can't seem to get it working for a bitbucket repo.
    Alex Ortiz-Rosado
    @sethreno: It doesn't look like the GitDriver supports username and password (unless you include them in the URL itself, which is a BAD idea). You should instead set up SSH access to your BitBucket repo and have a local SSH key configured - https://confluence.atlassian.com/bitbucket/set-up-ssh-for-git-728138079.html
    @sethreno: For reference, here's the SVNDriver code https://gowalker.org/github.com/etsy/hound/vcs#SVNDriver and here's the GitDriver code: https://gowalker.org/github.com/etsy/hound/vcs#GitDriver
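    Concretely, the SSH approach would look something like this in config.json (repo name and path are placeholders; it assumes the SSH key on the Hound host has read access to the repo):

    ```json
    {
        "dbpath": "data",
        "repos": {
            "my-repo": {
                "url": "git@bitbucket.org:myteam/my-repo.git"
            }
        }
    }
    ```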
    Alex Ortiz-Rosado
    Does anyone know what the "# files" in the upper-right corner (under the search box) represents? I assumed it was the total number of files that match the search query, but I'm getting "25 files" while the results only display 1 file. #whassupwitdat?
    Julian Brendl
    hey, so I was wondering: where are the repositories that hound searches specified? Would it, for instance, be possible to search across all github repositories?

    I'm trying to set up hound to search our internal repositories. These repositories are accessible via ssh urls that look like the ones below:

    In the hound quick start manual I read:

    Use SSH style URLs in the config: "url" : "git@github.com:foo/bar.git". As long as you have your SSH keys set up on the box where Hound is running this will work.

    So I added the following to my config.json file:

    "MyProjectA" : {
    "url" : "my_user@git.my_company.com/pkg/MyProjectA"
    But I don't feel it is quite right: in the quick-start example the url points to a bar.git file, but I don't have a corresponding MyProjectA.git file, so in fact I'm pointing to the root dir of my repository, under which one can find a .git directory.

    And, as expected, it errors:

    2017/01/25 15:05:00 Failed to clone my_user@git.my_company.com/pkg/MyProjectA, see output below fatal: repository 'my_user@git.my_company.com/pkg/MyProjectA' does not exist

    Does anyone have a clue whether it is possible to get this working, and how? I can clone everything locally, but then the search engine doesn't pick up changes, which is a pity.
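    One detail worth checking (an observation, not a confirmed fix): scp-style git URLs, like the git@github.com:foo/bar.git example in the quick start, separate host and path with a colon rather than a slash. With a colon the entry would look like:

    ```json
    "MyProjectA" : {
        "url" : "my_user@git.my_company.com:pkg/MyProjectA"
    }
    ```

    A quick way to test is to run git clone with that URL by hand on the Hound host; if the manual clone works, the same URL should work in config.json.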

    I am trying to add SSL/TLS support to this application. How can I add that?
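    Hound doesn't appear to terminate TLS itself, so one common approach is to terminate TLS in a reverse proxy in front of it. A minimal nginx sketch (hostname and certificate paths are placeholders, assuming Hound on its default port 6080):

    ```nginx
    # Hypothetical nginx TLS termination in front of Hound.
    server {
        listen 443 ssl;
        server_name hound.example.com;

        # Placeholder certificate paths
        ssl_certificate     /etc/ssl/certs/hound.example.com.crt;
        ssl_certificate_key /etc/ssl/private/hound.example.com.key;

        location / {
            proxy_pass       http://127.0.0.1:6080;
            proxy_set_header Host $host;
        }
    }
    ```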
    how does hound compare to livegrep.com?
    is it still developed?
    @jklein ping