    Felipe Martins
    @felipsmartins
    Hi there! How's it going? So I have a "simple" question: can I get the Elastic host's IP from the Kibana interface? Is that even possible? I've been struggling with this for hours and got nothing :(
    I mean the Elastic host which Kibana is connected to.
    amilcarwong
    @amilcar34706591_twitter
    Hi, I am new to Kibana. Can I put some controls in place and take actions from the Kibana UI? Thanks in advance.
    saradaguruge
    @saradaguruge
    hi
    I'm getting this error when I try to start Kibana:
    Error: Setup lifecycle of "apm" plugin wasn't completed in 30sec. Consider disabling the plugin and re-start.
    at Timeout.setTimeout (/home/elastic/kibana-7.6.0-linux-x86_64/src/core/utils/promise.js:31:90)
    at ontimeout (timers.js:436:11)
    at tryOnTimeout (timers.js:300:5)
    at listOnTimeout (timers.js:263:5)
    at Timer.processTimers (timers.js:223:10)
    amilcarwong
    @amilcar34706591_twitter

    Hi All,

    I need to create a custom dashboard and use Elasticsearch.
    Question: is it possible to use Elasticsearch only with Angular and create good visualizations with query filters etc.,

    or do I have to use a server-side technology (the Elasticsearch clients)?

    thanks

    Anish Aggarwal
    @anishagg17
    Hello everyone, I'm Anish, a second-year engineering student from NIT Hamirpur. I work with React and Redux and have a strong grasp of web development. I would like to work on "Playground / Toggle System for Live Documentation". It would be great if someone could walk me through the right channel to join!
    Ghost
    @ghost~5d8d1747d73408ce4fcc29fc
    Hello everyone! Can anyone help me set up the eui repo locally on my computer?
    tomicarsk6
    @tomicarsk6
    @amilcar34706591_twitter Yes you can; there is an elasticsearch.js library that you can use directly from an Angular project. You can read more about it here.
    Daksh Verma
    @Dakkshverma_twitter
    Hello people,
    How do I do multi-select faceting in Elasticsearch?
    Example from Solr: http://yonik.com/multi-select-faceting/
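The usual Elasticsearch analogue of Solr's multi-select faceting combines post_filter (so a selection narrows the hits but not the facet counts) with a filtered aggregation for the other facets. A minimal sketch; the index name and the color/brand fields are hypothetical:

```json
GET /products/_search
{
  "query": { "match_all": {} },
  "post_filter": { "term": { "color": "red" } },
  "aggs": {
    "colors": { "terms": { "field": "color" } },
    "brands_for_red": {
      "filter": { "term": { "color": "red" } },
      "aggs": { "names": { "terms": { "field": "brand" } } }
    }
  }
}
```

Here "colors" still counts all colors (the selection is applied only as a post_filter on the hits), while "brands_for_red" reflects the current color selection.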
    Aneesh12369
    @Aneesh12369

    Hi All,
    Logstash is throwing an error when there is a "message" field in the JSON. It works fine when there is no "message" field. Below is my conf file. How do I handle this even when a "message" field is present in the JSON?

    input {
      kafka {
        bootstrap_servers => "xxxx.xxx.xxx.xxx:9092"
        topics => "graylog-gelf-input"
        codec => "json"
        tags => "logstash-message"
        }
    }
    
    
    filter {
        json {
          source => "message"
        }
    
        mutate {
          add_field => {
            "message" => "%{[full_message]}"
    
          }
    
          remove_field => [ "version" ]
    
        }
    }
    
    
    output {
         gelf {
             host => "xxx.xxx.xx.xx"
             port => xxxx
             protocol => "TCP"
         }
         file {
             path => "/var/log/logstash/outlog.log"
         }
    
    }

    test message

    {"version":"1.1","host":"kafka-input","short_message":"test short message 123","full_message":"test full message 123","message":"test message 123","timestamp":1582870694,"level":1,"website_count":"0","website":"https://bellasouljewelry.com/","account_id":"207159823","last_active":"20200228T06:18:13Z","fail_reason":"Widget not found","website_domain":"bellasouljewelry.com","start_date":"20200228T06:18:13Z","status":"failed"}
    Jibu Chacko
    @jibsonline
    filter {
      if [message] {
        mutate { remove_field => [ "field1", "field2", "field3", ... "fieldN" ] }
      }
    }
    filter {
      if [message] {
        mutate { remove_field => [ "message" ] }
      }
    }
    Ignore the first snippet.
    mayurmehta
    @mayurmehta
    Hello people. I want to match a phrase inside a boolean query. How can I do that?
    For example: Manager "Corporate Bank Technology"
    Results should match 'Manager' OR 'Corporate Bank Technology'.
    mayurmehta
    @mayurmehta
    I am using
    "bool" : {
          "should" : [
            { 
                  "query_string" : [
                           "fields":  [
                                  'position_title^4',
                                  'parent_organization_name^2',
                                  'organization_name^2',  
                            ],
                            "query": "Corporate Bank Technology",
                            "boost": 4
                    ],
                    "query_string" : [
                           "fields":  [
                                  'position_title^4',
                                  'parent_organization_name^2',
                                  'organization_name^2',  
                            ],
                            "query": "Manager",
                            "boost": 2
                    ]
          ],
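An alternative sketch: query_string understands OR and quoted phrases natively, so both clauses can collapse into a single query (reusing the fields from the message above):

```json
GET /_search
{
  "query": {
    "query_string": {
      "fields": ["position_title^4", "parent_organization_name^2", "organization_name^2"],
      "query": "Manager OR \"Corporate Bank Technology\""
    }
  }
}
```

The quoted part is matched as an exact phrase, the unquoted term as a regular match.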
    faguilera85
    @faguilera85
    hello, can anyone please tell me how I can search for documents where one of the fields matches an array of strings?
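Matching one field against a list of exact values is what the terms query does — a minimal sketch with hypothetical index and field names:

```json
GET /my-index/_search
{
  "query": {
    "terms": { "tags": ["red", "green", "blue"] }
  }
}
```

A document matches if its "tags" field contains any of the listed values; note that terms matches exact (unanalyzed) values.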
    sayeedc
    @sayeedc
    Hi, does filebeat support collecting logs from a selection of k8s/openshift namespaces? E.g. I have 30 namespaces, but I only need logs for pods in just a few of them.
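Filebeat's Kubernetes autodiscover provider can restrict collection with per-namespace conditions — a minimal sketch (the namespace names are hypothetical):

```yaml
filebeat.autodiscover:
  providers:
    - type: kubernetes
      templates:
        - condition:
            or:
              - equals:
                  kubernetes.namespace: app-prod
              - equals:
                  kubernetes.namespace: app-staging
          config:
            - type: container
              paths:
                - /var/log/containers/*-${data.kubernetes.container.id}.log
```

Pods in namespaces that match no template simply get no input configured, so their logs are not collected.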
    rishi360
    @rishi360
    Hi, can anyone help me filter Tomcat access logs and catalina logs, including the source IP address / location from which Tomcat was accessed? Also, please help me with what I have to add in filebeat.yml.
    rishi360
    @rishi360
    can anyone please help me
    Aneesh12369
    @Aneesh12369
    Hi Team,
    This is my configuration file.
    I'm facing issues in production: some logs are missing. I have captured a tcpdump; it seems Logstash didn't make any request to Graylog. No errors in the logs.
    Logstash simply dropped the message. When I pushed the same event manually, it was processed successfully. Please help.
    input {
      kafka {
        bootstrap_servers => "xxxx:9092,xxxx:9092"
        group_id => "logstash-graylog"
        topics => "gelf-kafka-log-events"
        codec => "json"
        tags => "logstash-message"
        poll_timeout_ms => 5000
        }
    }
    
    
    filter {
        json {
          source => "message"
        }
    
        mutate {
          add_field => {
            "message" => "%{[full_message]}"
    
          }
          remove_field => [ "protocol","version" ]
    
        }
    }
    
    
    output {
    
         gelf {
            host => "xx.xx.xx.xx"
            port => xxxx
            protocol => "TCP"
         }
    
         gelf {
             host => "xxxxx"
             port => xxxx
             protocol => "TCP"
         }
    
    }
    JoaquinHervasFarru
    @JoaquinHervasFarru
    Hi everyone! I’m a newbie with Logstash and I’m trying to parse a Radware log with grok. I was wondering how to specify the delimiter between my key=value pairs; thanks in advance for any help!!
    CliveJL
    @CliveJL
    @JoaquinHervasFarru If the logs are structured already with keys and values, then have a look into the Logstash "kv" filter. I don't know what Radware logs look like, but the "kv" filter may be better suited than trying to Grok them.
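For reference, a minimal kv filter sketch, assuming space-separated key=value pairs (both separators are configurable):

```
filter {
  kv {
    source      => "message"   # field holding the key=value text
    field_split => " "         # delimiter between pairs
    value_split => "="         # delimiter between key and value
  }
}
```

Each recognized pair becomes its own event field, with no grok pattern to maintain.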
    Marc Siebeneicher
    @msiebeneicher
    hey folks - one question about the Elastic index: if I already have a field in the index like foo.bar which is used/filled by an ingest pipeline with strings, can I simply change it to foo.bar.string?
    I tried something like that and got no new messages after the change, but my grok pattern worked fine in the debugger(s).
    JoaquinHervasFarru
    @JoaquinHervasFarru
    @CliveJL thanks for your help, I tried using kv but in the end the solution was to change the output since the recipient was not able to understand the logs that logstash was sending, anyhow thanks for your support 😄👍🏻
    JoaquinHervasFarru
    @JoaquinHervasFarru
    Hi everyone! I’m trying to find a beginner to intermediate course or tutorial besides just the documentation, since the one I took seems to be a bit light and I would like something a little more comprehensive
    sayeedc
    @sayeedc
    Hi, I'm using apm-server and wondered if apm-server can have multiple transaction indices, i.e. one for each application environment. We have different app environments running on the same openshift cluster. The idea is to have one apm server for all them with separate indices in Elasticsearch.
    tomicarsk6
    @tomicarsk6
    @sayeedc You can use one apm-server but more than one apm-agent, and you can configure the environment name; I think the flag is called ELASTIC_APM_ENVIRONMENT. You can also use the flag ELASTIC_APM_SERVICE_NAME to name a service.
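Those settings can come from environment variables (ELASTIC_APM_ENVIRONMENT, ELASTIC_APM_SERVICE_NAME) or, for the Java agent, from an elasticapm.properties file — a sketch with hypothetical values:

```
# one file per deployment; each environment shows up separately in the APM UI
service_name=my-webapp
environment=staging
server_url=http://apm-server:8200
```

With a distinct environment per deployment, one apm-server can keep the data separable per application environment.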
    Bishal Jaiswal
    @Bishalj

    How do I trigger a case-sensitive wildcard search with the Elasticsearch Java client?
    boolQueryBuilder.should(
        QueryBuilders
            .queryStringQuery(""+formattedSearchString+"")
            .defaultField(searchCriteria)
            .analyzeWildcard(false)
    );

    The above query tokenizes the word if a space character is present.

    Can anyone help me with this?
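One common workaround, sketched as query DSL: wildcard queries are not analyzed, so pointing them at an unanalyzed keyword (sub-)field avoids both the tokenization on spaces and the lowercasing (the index and field names here are hypothetical):

```json
GET /my-index/_search
{
  "query": {
    "wildcard": {
      "title.keyword": { "value": "Corporate Bank*" }
    }
  }
}
```

The same idea applies from the Java client by targeting the ".keyword" sub-field instead of the analyzed text field.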
    shailers
    @shailers
    hi, I am using pipeline-to-pipeline communication with the pipeline output/input plugins, and the pipeline keeps pushing the same message to Elasticsearch again and again.

    input {
      beats {
        port => "5050"
      }
    }

    output {
      if [message] =~ /value/ {
        pipeline {
          send_to => [reciving_pipeline]
        }
      }
    }

    and this pipeline is writing to elasticsearch:

    input {
      pipeline {
        address => reciving_pipeline
      }
    }

    filter {
      json {
        source => "message"
      }
    }

    output {
      elasticsearch {
        hosts => ["elasticsearch:9200"]
        index => "reciving_pipeline_index-%{+YYYY.MM}"
      }
    }

    carrow
    @carrowheap
    Hi everyone, I am using Elasticsearch 5.6 on Docker, but I cannot access it from PHP. I get this error: "missing authentication token for REST request [/app_dev/product/_search]". When I test with curl (curl -uelastic 172.20.0.2:9200), it works well.
    Alina
    @alina_alamzeb_twitter
    Hey!
    Does anyone here have experience with Elasticsearch Curator?
    suryaZivame
    @skant09
    did anyone face problems with Elasticsearch APM?
    tomicarsk6
    @tomicarsk6
    @skant09 What kind of problems?
    suryaZivame
    @skant09
    @tomicarsk6 We are trying to integrate the apm-agent into a Java webapp. Though the service got registered in Kibana, it doesn't show any data or logs. On the same APM server we have another Node.js service registered which is working fine.
    @tomicarsk6 we integrated it according to the provided document.
    teahyuk
    @teahyuk
    @alina_alamzeb_twitter I have
    Ashwin
    @ashwinsoni

    Hello everyone. I need help with an Elasticsearch query that should ignore the _score value.
    For example, if I pass the query below:

    GET /products/_search
    {
      "query": {
        "terms": {
          "_id": [
            "5e5e3330bc3e53271a3a0d3c",
            "5e5e3332bc3e53271a3a0d70",
            "5e5e3336f65d1b191c35294f",
            "5e5e3336f65d1b191c35293b",
            "5e5e3330f65d1b191c3528c5",
            "5e5e3336bc3e53271a3a0db9"
          ]
        }
      }
    }

    then I receive the response in a different order for the given ids, based on the _score value.
    Please let me know if we can avoid the default _score behaviour and get the response in the same order in which it was requested.
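When documents are fetched purely by id and the requested order matters, the Multi Get API returns results in exactly the order the ids are listed — a sketch using the first ids from the question:

```json
GET /products/_mget
{
  "ids": [
    "5e5e3330bc3e53271a3a0d3c",
    "5e5e3332bc3e53271a3a0d70",
    "5e5e3336f65d1b191c35294f"
  ]
}
```

Unlike _search, _mget involves no scoring at all, so no sort or client-side reordering is needed.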

    Karshil sheth
    @karshil2309
    Hello everyone, can anyone suggest a channel for Solr? I am not finding a good community.
    rbavandl
    @rbavandl
    Hello Everyone!
    I keep getting a shard_not_found_exception 404 on (bulk) index requests, any thoughts?
    David Gomez
    @nan140114
    Hello "
    I'm trying to set up a ELK cluster hosted by elastic ( 14 days trial ) to receive my metrics from AWS Cloudwatch. After some research, I noticed that the right tool is metricbeat. metricbeat has an AWS module to specify the CW metrics and send it to the cluster. It works on my workstation, I mean I downloaded the metricbeat binary and I configured the AWS module and started the metricbeat process. My doubt is: does metricbeat need an agent running separately from the ELK cluster?
    Does the ELK cluster provide a plugin/agent/approach to run metricbeat on the cluster itself?
    MicroMicroZhang
    @MicroMicroZhang
    Hello, I am trying to use ElastAlert but I could not build the Docker image successfully. Does anyone have the same problem?
    khushboo2908
    @khushboo2908

    Reporting via Elastic:
    I have multiple indices in elastic and I need to generate reports using all of them. Description of indices:
    Ticket:
    Fields: Ticket_id, campaign_id, customer_id, <other_ticket_attributes>
    Campaign
    Fields: campaign_id, campaign_name
    Customer:
    Fields: customer_id, customer_name, customer_email, etc

    A campaign can have billions of tickets in it, also millions of tickets can be associated with a single customer.

    The report (say, a ticket detail report) that needs to be generated should have the following information:
    Ticket_id, campaign_id, campaign_name, customer_name, customer_email, etc

    While going through the documentation, I realized that this was not feasible either through elastic DQL, Kibana reporting module, or ElasticSearch SQL. One of the ways to achieve this was by maintaining a denormalized index containing all the above fields.
    In my application’s domain, the campaign name and customer details like name, etc. are editable. This means editing any of these fields will result in the updating of millions of docs. Is this the correct way of implementing such reporting use-cases, or should I switch to SQL DBs (for joins and nested queries)? Any suggestions are welcome.