    MUBI06
    @MUBI06
    does someone know what the problem is? it seems it's not possible to specify a path to the file
    blotus
    @blotus
    and by default, data files go into /var/lib/crowdsec/data/
    MUBI06
    @MUBI06
    I've tried with the data section and got the same issue
    data:
    blotus
    @blotus
    how do you install the scenario?
    do you manually copy the file into the container, or do you use cscli?
    if you manually copy the scenario, you will also need to manually copy the sqli_probe_patterns.txt file into your container (it's actually cscli that downloads the data files, not crowdsec itself)
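    (For context, a scenario's data section typically looks like the sketch below; the exact entry for this scenario is an assumption, only the field names are the usual ones. cscli downloads each source_url into /var/lib/crowdsec/data/<dest_file>, so a manual install has to put the file there by hand.)
    # Sketch only - the URL is illustrative, not the real hub entry
    data:
      - source_url: https://example.com/sqli_probe_patterns.txt
        dest_file: sqli_probe_patterns.txt
        type: string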
    MUBI06
    @MUBI06
    Okay thanks blotus
    MUBI06
    @MUBI06
    the params login?id=2%20and%201=2&information_schema.tables should activate the scenario, right blotus?
    blotus
    @blotus
    yes
    but you need at least 10 requests with different parameter values
    so you can try to just change your 2%20 to 3%20 and so on
    and the scenario will trigger
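    (Roughly, such a scenario is a leaky bucket that only counts distinct parameter values. The sketch below is an approximation of that shape, not the actual hub scenario; the filter, the evt.Parsed.http_args field and all values are assumptions.)
    # Approximate shape only - not the real crowdsecurity scenario
    type: leaky
    name: me/http-sqli-probing-sketch
    description: "sketch: overflow after many distinct suspicious query strings from one IP"
    filter: "evt.Meta.log_type in ['http_access-log', 'http_error-log']"   # the real scenario also matches SQLi patterns from its data file
    groupby: evt.Meta.source_ip
    distinct: evt.Parsed.http_args    # only requests with a *different* query string fill the bucket (assumed field name)
    capacity: 10                      # the 11th distinct probe overflows the bucket and raises the alert
    leakspeed: 10s
    blackhole: 1m
    labels:
      service: http
      remediation: true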
    MUBI06
    @MUBI06
    Ah, so refreshing the page 11 times with these params id=2%20and%201=2&information_schema.tables
    will not block the IP?
    blotus
    @blotus
    yes
    LtSich
    @LtSich
    Hi, quick question: can I use an external program / API in my scenario, like (for example) sending a request to ipqualityscore to check the reputation of an IP and eventually block it?
    atm I know how to do a check against a custom IP list in a file, but no idea if we can do that with an external API
    AlteredCoder
    @AlteredCoder
    Hello @LtSich
    Currently it is not possible to do it with crowdsec :/
    but it is a good idea
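    (For reference, the file-based check LtSich mentions is possible today. A minimal sketch, assuming a trigger scenario and a hypothetical my_bad_ips.txt data file; everything here is illustrative:)
    # Sketch only - names, URL and file are assumptions
    type: trigger
    name: me/local-ip-list-sketch
    description: "sketch: raise an alert when the source IP is in a locally maintained list"
    filter: "evt.Meta.source_ip in File('my_bad_ips.txt')"   # File() loads the declared data file as a list of lines
    blackhole: 1m
    labels:
      remediation: true
    data:
      - source_url: https://example.org/my_bad_ips.txt       # illustrative; the file can also be copied into /var/lib/crowdsec/data/ by hand
        dest_file: my_bad_ips.txt
        type: string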
    LtSich
    @LtSich
    thx @AlteredCoder for the answer, yes that could be useful, to avoid the need to reload CS each time we want to update the whitelist / blacklist for example
    but that's for later, I'm sure the team will find a solution ;)
    AlteredCoder
    @AlteredCoder
    Yes :)
    gsonetic
    @gsonetic
    Is there some documentation on how a Kafka topic can be a data source?
    Klaus Agnoletti
    @klausagnoletti_twitter
    Do you have a specific need, since you ask? And what is that need? Just trying to understand your use case a bit
    blotus
    @blotus
    Hello @gsonetic
    unfortunately, kafka is not yet supported
    gsonetic
    @gsonetic

    Okay, I just find it a bit misleading on https://doc.crowdsec.net/docs/concepts

    A stream of information can be a file, a journald event log, a cloudwatch stream, and more or less any kind of stream, such as a kafka topic.

    @klausagnoletti_twitter I'm trying to find a way to stream data into Crowdsec, I'm using vector.dev as log parser.
    blotus
    @blotus
    looking at the vector.dev available sinks
    we do support AWS cloudwatch logs, so depending on whether you use AWS or not, it might be a solution
    the biggest drawback is the lag between log ingestion and actual availability; also, with our current implementation it's pretty easy to run into CloudWatch API quotas if you try to monitor a lot of streams
    gsonetic
    @gsonetic
    I do not use AWS.
    Klaus Agnoletti
    @klausagnoletti_twitter
    @blotus Would it be a lot of work to do a Kafka datasource + parser?
    blotus
    @blotus
    we would not need a parser in this case, as kafka would "only" be the way we read logs
    as for the datasource itself, I had a very very quick look some time ago, it did not look that hard, the official go library seems pretty easy to use
    Klaus Agnoletti
    @klausagnoletti_twitter
    Alright. We should put it on the (long) list of features we'd like :-)
    gsonetic
    @gsonetic
    I'm actually surprised CrowdSec doesn't have a better data source solution for streaming. Right now my hack is that I have the Vector service exposed, which writes to a file, which CrowdSec reads. This is not a good solution
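    (For anyone copying this workaround: on the CrowdSec side the Vector-written file just needs an acquisition entry. A minimal sketch, assuming a hypothetical path and label:)
    # /etc/crowdsec/acquis.yaml (excerpt) - path and label are assumptions
    filenames:
      - /var/log/vector/http_out.log    # file written by the Vector "file" sink
    labels:
      type: apache2                     # must match whatever your installed parsers expect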
    glebkhil
    @glebkhil
    # Don't change this
    type: http
    
    name: http_default # this must match with the registered plugin in the profile
    log_level: info # Options include: trace, debug, info, warn, error, off
    
    format: |
      {"chat_id": "YOUR_CHAT_ID", "text": "{{- range . -}}
      {{- range .Decisions -}} 🏴☠️ {{.Value}} wird für {{.Duration}} geblockt aufgrund von {{.Scenario}}.
      {{end -}}
      {{end -}}"}'
    
    url: "https://api.telegram.org/YOUR_BOT_TOKEN/sendMessage" # plugin will make requests to this url. Eg value https://www.example.com/
    
    method: POST # e.g. "POST", "GET", "PUT" or any other HTTP verb is a valid value.
    
    headers:
        Content-Type: application/json
    thks
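    (For completeness, the name: in that plugin config has to be listed in a profile for notifications to fire. A trimmed sketch of the stock profile with the plugin enabled; treat it as an illustration, not your exact profiles.yaml:)
    # /etc/crowdsec/profiles.yaml (excerpt)
    name: default_ip_remediation
    filters:
      - Alert.Remediation == true && Alert.GetScope() == "Ip"
    decisions:
      - type: ban
        duration: 4h
    notifications:
      - http_default        # must match the name: field of the plugin config above
    on_success: break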
    MUBI06
    @MUBI06
    Hello all
    we are trying to parse an Apache log like this one
    2021-10-18T09:09:53.415Z {name=service-target-1} test.entreprise.io 172.16.17.0 - - [18/Oct/2021:09:09:53 +0000] "GET /login/360Spider HTTP/1.1" 404 6609 "-" "user_agent" 11508
    with this grok: %{IPORHOST:httpd_request_host} %{IPORHOST:httpd_request_source_ip} %{NOTSPACE:httpd_request_owner} %{NOTSPACE:httpd_request_username} [%{DATA:httpd_request_received}] "%{WORD:httpd_request_action} %{URIPATH:httpd_request_uri}(?:%{URIPARAM:httpd_request_params})? HTTP/%{DATA:httpd_request_version}" %{NUMBER:httpd_response_code} (%{NUMBER:httpd_response_size}|-) "%{DATA:httpd_request_referer}" "%{DATA:httpd_user_agent}" %{NUMBER:httpd_timing}
    it's working well, but with crowdsec we get an empty user_agent
    although it works when we test it here: https://grokdebug.herokuapp.com/
    we have tried passing the QS type to the user_agent
    but it's not working
    Thibault "bui" Koechlin
    @buixor
    can you try to enable debug in the apache2 parser?
    it will output what it captured
    MUBI06
    @MUBI06
    yes, in the debug output the user_agent field has the value of the whole URL: https://test.entreprise.io/
    which is not correct
    all other fields work well
    Shivam Sandbhor
    @sbs2001

    @MUBI06 I see a couple of problems in your grok expression:

    1. The square brackets ("[" and "]") around %{DATA:httpd_request_received} are not escaped.
    2. The initial part "2021-10-18T09:09:53.415Z {name=service-target-1} " of the log is not parsed at all. I don't know whether that's useful in your context.
    3. Minor issue, but in the same expression from (1), instead of using the DATA grok pattern, use something more specific. HTTPDATE works here.

    Summing these up, the following expression should work for you:

    %{IPORHOST:httpd_request_host} %{IPORHOST:httpd_request_source_ip} %{NOTSPACE:httpd_request_owner} %{NOTSPACE:httpd_request_username} \[%{HTTPDATE:httpd_request_received}\] "%{WORD:httpd_request_action} %{URIPATH:httpd_request_uri}(?:%{URIPARAM:httpd_request_params})? HTTP/%{DATA:httpd_request_version}" %{NUMBER:httpd_response_code} (%{NUMBER:httpd_response_size}|-) "%{DATA:httpd_request_referer}" "%{DATA:httpd_user_agent}" %{NUMBER:httpd_timing}
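    (A sketch of how that corrected expression could sit in a custom parser node, with the leading "2021-10-18T09:09:53.415Z {name=service-target-1} " prefix from point (2) consumed by an extra grok prefix. The node name, the filter and the TIMESTAMP_ISO8601/DATA captures for the prefix are assumptions:)
    # Illustrative custom parser node - names and filter are assumptions
    onsuccess: next_stage
    debug: true                        # dumps every capture while testing, as suggested above
    name: me/apache2-custom-logs
    filter: "evt.Line.Labels.type == 'apache2'"
    grok:
      apply_on: message
      pattern: '%{TIMESTAMP_ISO8601:timestamp} \{name=%{DATA:service_name}\} %{IPORHOST:httpd_request_host} %{IPORHOST:httpd_request_source_ip} %{NOTSPACE:httpd_request_owner} %{NOTSPACE:httpd_request_username} \[%{HTTPDATE:httpd_request_received}\] "%{WORD:httpd_request_action} %{URIPATH:httpd_request_uri}(?:%{URIPARAM:httpd_request_params})? HTTP/%{DATA:httpd_request_version}" %{NUMBER:httpd_response_code} (%{NUMBER:httpd_response_size}|-) "%{DATA:httpd_request_referer}" "%{DATA:httpd_user_agent}" %{NUMBER:httpd_timing}'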