    SenalChen
    @SenalChen
    why can't I get the connection?
    @vikpandey Hi, my application.yml configuration is:
    image.png
    SenalChen
    @SenalChen
    Maybe something wrong with your hosts file
    And I used the class org.springframework.data.elasticsearch.core.ElasticsearchTemplate to connect to Elasticsearch
    I didn't need to build a Config class at all
    Ramo Karahasan-Riechardt
    @ramo-karahasan-riechardt
    Hello
    白玉堂
    @baiyutang
    hi
    LemonAndroid
    @LemonAndroid
    v
    When you want to show someone something cool and they ask "What's cool about it?", don't try to e x p l a i n. Try to ask them what other things they could find cool that you could show them. It's annoying to explain why things are cool over and over again. And why is that annoying? Because it's annoying to understand what's cool about something. Hence: how will it be cool for the person you are showing it to when they are trying to understand it, and the understanding part is already uncool for you?
    baselai
    @baselai

    Hello guys,
    I have a service that outputs two files into a file system, a csv and a manifest; both have the same file name but different extensions.

    I need to build a logstash config file that does the following:

    1. Once the files are written, it reads both files (csv and manifest), whether they are located in the main directory or in subdirectories (nested folders)
    2. Don't re-read previously added files when new pairs are added; it should only read the newly added ones, in any location under the main root.

    Note: both files, csv and manifest, should be read together because the manifest has metadata that helps me index the csv file when I push it to Elasticsearch.

    Question: sometimes the csv file takes 30 seconds to be written (it is a huge file), so I'm wondering whether Logstash will start reading the file as soon as it's created OR only once it's closed and the service has finished filling it.

    Here is the code I'm using; I managed to read a csv file on its own, but I'm not sure how to do that for both files as described above.

    input {
      file {
        path => "/usr/share/input/**/*.*"   # recursive glob: main directory and all subdirectories
        start_position => "beginning"       # value must be a quoted string
        sincedb_path => "/dev/null"         # no sincedb: files are treated as new on every restart
        discover_interval => 2
        stat_interval => "1 s"
      }
    }
    
    filter {
        ...
           .... Code goes here ....
    }
    
    output {
        stdout { codec => rubydebug }
        elasticsearch {
            index => "%{blockId}"
            hosts => ["${HOSTS}"]
        }
    }
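
    A sketch of one way to branch on the two file types, assuming the default path field added by the file input (non-ECS mode) and placeholder csv column names. On the timing question: the file input tails files, so it picks up csv lines as they are appended rather than waiting for the file to be closed.

    filter {
      if [path] =~ /\.csv$/ {
        csv {
          separator => ","
          columns => ["col1", "col2", "col3"]   # placeholder column names
        }
        mutate { add_tag => ["csv"] }
      } else if [path] =~ /\.manifest$/ {
        mutate { add_tag => ["manifest"] }
      }
    }

    Joining each csv row with the metadata from its manifest is not something the file input does on its own; that pairing usually needs an enrichment step (for example a translate filter or an external lookup), which is beyond this sketch.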
    Aneesh Katla
    @bluefishtweets_twitter
    Hi, I've set up a new Elasticsearch + Kibana 7.2.0 installation.
    Elasticsearch is in cluster mode with 2 nodes, and a single Kibana node is connected to the FQDN provided by the load balancer in front of the Elasticsearch cluster. I see indices on one node while the other node shows no indices. I've set the right properties in the yml files on both nodes, and I'm not sure what's causing the nodes to be out of sync.
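
    Not a fix, but the usual first checks with the standard cluster/cat APIs: _cat/nodes shows whether both nodes have actually joined the same cluster, and _cat/shards shows where the primaries and replicas landed, which is often all that "indices on one node" really means.

    GET localhost:9200/_cat/nodes?v
    GET localhost:9200/_cluster/health?pretty
    GET localhost:9200/_cat/shards?v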
    v1rt
    @v1rt
    Is it true, as I was told, that as much as possible you shouldn't use Logstash and should use an ES ingest node instead?
    Rajesh Doddi
    @rajeshdoddi
    Hello, can anybody show me how to set up kibana.yml on a local machine?
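
    A minimal kibana.yml sketch for a local 7.x setup, assuming Elasticsearch is running on the same machine on port 9200; all values are placeholders to adjust.

    # kibana.yml (local development sketch)
    server.port: 5601
    server.host: "localhost"
    elasticsearch.hosts: ["http://localhost:9200"]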
    Alina
    @alina_alamzeb_twitter
    Hi
    Can anyone direct me to an Elasticsearch group? Or can I use this group to chat?
    udayKumar Bommala
    @udaykumar1995
    GET localhost:9200/test/_search
    {
      "query": {
        "query_string": {
          "query": "uday",
          "fields": ["field1^2", "*"]
        }
      }
    }
    This works well directly against Elasticsearch.
    Can someone guide me on how I can achieve the same thing using an Elasticsearch client API?
    udayKumar Bommala
    @udaykumar1995
    Elasticsearch version: 6.7
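
    One way to run the same search from code is the official JavaScript client; a sketch in TypeScript, assuming a locally reachable node and a client release line that matches the 6.7 cluster. The node URL is a placeholder.

    // Sketch: the same query_string search via the official JavaScript client.
    import { Client } from "@elastic/elasticsearch";

    const client = new Client({ node: "http://localhost:9200" });  // placeholder node URL

    async function run() {
      const response = await client.search({
        index: "test",
        body: {
          query: {
            query_string: {
              query: "uday",
              fields: ["field1^2", "*"]
            }
          }
        }
      });
      console.log(response.body.hits.hits);
    }

    run().catch(console.error);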
    NomNomSu
    @NomNomSu

    Hello, can someone tell me what I did wrong here?

    grok {
        match => { "message" => "(?<os_version>((?<=OS_VERSION=)(.*)(?=&SOFTWARE)))"}
        match => { "message" => "(?<software>((?<=SOFTWARE=)(.*)(?=&BLA)))"}
        match => { "message" => "(?<bla>((?<=BLA=)(.*)(?=&FOO)))"}
    }

    Is it really impossible to make all matches in 1 grok?
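
    A sketch of one way to do it in a single grok, assuming (as the lookarounds above suggest) that the message is shaped like OS_VERSION=...&SOFTWARE=...&BLA=...&FOO; the combined pattern and the &FOO terminator are inferred from those lookaheads.

    filter {
      grok {
        # one pattern capturing all three fields in a single pass
        match => {
          "message" => "OS_VERSION=(?<os_version>[^&]*)&SOFTWARE=(?<software>[^&]*)&BLA=(?<bla>[^&]*)&FOO"
        }
      }
    }

    Another option is keeping the three patterns as an array under one match and setting break_on_match => false so grok applies all of them; and since the data looks like URL-style key=value pairs, the kv filter with field_split => "&" and value_split => "=" may be simpler still.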

    gary
    @garyglitter_gitlab
    Hello guys, I am new to ELK. I am trying to read a txt file on my desktop (Windows) and pass it to Elasticsearch through Logstash. When I use stdin {} in the pipeline it works, but I'm not able to read the file from my local machine. Please help.
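
    A minimal sketch for that, assuming a Windows desktop path and a local Elasticsearch; the path and index name are placeholders. On Windows the file input path uses forward slashes, and sincedb_path => "NUL" is the usual testing trick to re-read the file from the start.

    input {
      file {
        path => "C:/Users/your-user/Desktop/*.txt"   # placeholder path, forward slashes on Windows
        start_position => "beginning"
        sincedb_path => "NUL"                        # Windows equivalent of /dev/null, testing only
      }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]           # placeholder host
        index => "desktop-logs"                      # placeholder index name
      }
    }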
    khushboo2908
    @khushboo2908

    I am new to Elasticsearch (version 7.4), and even after a lot of reading it is still not clear how many shards/nodes are preferred for a particular index. As of now, I have configured 3 shards and 2 replicas on 3 nodes (each with 8GB RAM and a 500GB HDD), with 55GB of data in one index. So I need your views/suggestions on the following points.

    Is the above number of shards, nodes, and replicas sufficient?
    For the CAP theorem I would prefer CP, i.e. consistency and partition tolerance, in this 3-node cluster.
    For consistency I configured write_consistency=all.
    For partition tolerance I set the number of master-eligible nodes to (N/2) + 1, which in my case is 3.
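
    No sizing verdict here, but for reference this is how those numbers are declared when the index is created (index name is a placeholder). Also worth noting for 7.4: there is no write_consistency setting any more (the related request option is wait_for_active_shards), and 7.x clusters no longer use a minimum_master_nodes-style (N/2)+1 setting; voting is managed automatically once the nodes have joined.

    PUT localhost:9200/my_index
    {
      "settings": {
        "number_of_shards": 3,
        "number_of_replicas": 2
      }
    }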

    Mostafa Magdy
    @mosafa697

    Hey there, I want to know how to fix this error:

    #<LogStash::Event:0xc4e5a0>], :response=>{"index"=>
        {"_index"=>"node_test", "_type"=>"redis_logs", "_id"=>"nMbqvW8BVbQcP1IUoW0V", "status"=>400, "error"=>
            {   
                "type"=>"illegal_argument_exception", 
                "reason"=>"Rejecting mapping update to [node_test] as the final mapping would have more than 1 type: [_doc, redis_logs]"}
            }
        }

    The Logstash input is redis:

    input {
      redis {
        batch_count => 1
        data_type => "list"
        host => "localhost"
        key => "node_test"
        codec => "plain"
      }
    }
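
    The [_doc, redis_logs] part of the error means the node_test index already has a _doc mapping type while the output side is indexing with type redis_logs, and an index can only hold one type on 6.x/7.x. If the elasticsearch output sets document_type (or an older Logstash default is supplying it), dropping that and reindexing into a fresh index is the usual way out. A sketch of the output side with a placeholder host:

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]   # placeholder
        index => "node_test"
        # no document_type set: the output's default keeps the index single-typed
      }
    }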
    gwasky
    @gwasky

    Hello there, I've been at this forever... I need some help. Using ES 7.5.
    I created my mapping as below:

    {
        "mappings": {
                "properties": {
                    "transactionDate": {
                        "type": "date",
                        "format": "yyyy-MM-dd HH:mm:ss"
                    },
                    "transactionNumber": {
                        "type": "text"
                    },
                    "trnType": {
                        "type": "text"
                    },
                    "memo": {
                        "type": "text"
                    },
                    "srcDeviceNumber": {
                        "type": "text"
                    },
                    "trgDeviceNumber": {
                        "type": "text"
                    },
                    "status": {
                        "type": "text"
                    },
                    "grossAmount": {
                        "type": "long"
                    },
                    "feeAmount": {
                        "type": "long"
                    },
                    "srcEndBalance": {
                        "type": "long"
                    },
                    "tgtEndBalance": {
                        "type": "long"
                    }
                }
        }
    }

    When adding the document below:

    PUT {{url}}/am/txn/102241647
    {
                "TRANSACTION_DATE": "2015-01-06 13:15:34",
                "TRANSNUMBER": "102241647",
                "TRNTYPE": "CASHIN",
                "MEMO": "",
                "SRC_DEVICENUMBER": "256xxx",
                "TGT_DEVICENUMBER": "256xxxx",
                "STATUS": "MREVERSED",
                "GROSS_AMOUNT": "5000",
                "FEE_AMOUNT": "0",
                "SRC_ENDBALANCE": "11029",
                "TGT_ENDBALANCE": "5000"
    }

    I get the error below

    Rejecting mapping update to [am] as the final mapping would have more than 1 type: [_doc, txn]"

    I can't seem to find where this issue is

    gwasky
    @gwasky
    seen my issue...
    I wasn't including the TYPE when MAPPING:
    PUT {{url}}/am
    instead of
    PUT {{url}}/am/txn
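
    For reference, with the mapping created as PUT {{url}}/am the index is typeless, so on 7.x the document goes through the _doc endpoint rather than a custom type; a sketch with an abbreviated body is below. Worth noting too that the posted field names (TRANSACTION_DATE, TRANSNUMBER, ...) don't match the mapping (transactionDate, transactionNumber, ...), so those values would be dynamically mapped rather than using the explicit date format.

    PUT {{url}}/am/_doc/102241647
    {
        "transactionDate": "2015-01-06 13:15:34",
        "transactionNumber": "102241647",
        "trnType": "CASHIN"
    }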
    Felipe Martins
    @felipsmartins
    Hi there! How's it going? So I have a "simple" question: can I get the Elasticsearch IP from the Kibana interface? Is that even possible? I've been struggling with this for hours and have got nothing :(
    I mean, the Elasticsearch host that Kibana is connected to.
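
    As far as I know that endpoint isn't shown directly in the Kibana UI; it lives in the Kibana server's own configuration. A sketch of the setting to look for in kibana.yml (the host value is a placeholder; older 6.x releases use elasticsearch.url instead):

    # kibana.yml on the Kibana server: the Elasticsearch endpoint Kibana connects to
    elasticsearch.hosts: ["http://your-es-host:9200"]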
    amilcarwong
    @amilcar34706591_twitter
    Hi, I am new to Kibana. Can I put in some controls and take actions from the Kibana UI? Thanks in advance.
    saradaguruge
    @saradaguruge
    hi
    I'm getting this error when I try to start kibana
    Error: Setup lifecycle of "apm" plugin wasn't completed in 30sec. Consider disabling the plugin and re-start.
    at Timeout.setTimeout (/home/elastic/kibana-7.6.0-linux-x86_64/src/core/utils/promise.js:31:90)
    at ontimeout (timers.js:436:11)
    at tryOnTimeout (timers.js:300:5)
    at listOnTimeout (timers.js:263:5)
    at Timer.processTimers (timers.js:223:10)
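
    Following the error's own suggestion, the plugin can be turned off in kibana.yml and Kibana restarted; the flag below is the 7.x-era setting for the default distribution (treat it as an assumption if your build differs).

    # kibana.yml: disable the APM UI plugin that is timing out during setup
    xpack.apm.enabled: false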
    amilcarwong
    @amilcar34706591_twitter

    Hi All,

    I need to create a custom dashboard and use Elasticsearch.
    Question: is it possible to use Elasticsearch with only Angular and create good visualizations with query filters etc.?

    Or do I have to use a server-side technology (Elasticsearch clients)?

    thanks

    Anish Aggarwal
    @anishagg17
    Hello everyone, I'm Anish, a second-year engineering student from NIT Hamirpur. I work with React and Redux and have a strong grasp of web development. I would like to work on "Playground / Toggle System for Live Documentation".
    It would be great if someone could walk me through the right channel to join!
    walter-ind
    @walter-ind
    Hello everyone! Can anyone help me set up the eui repo locally on my computer?
    tomicarsk6
    @tomicarsk6
    @amilcar34706591_twitter Yes you can, there is an elasticsearch.js library that you can use directly from an Angular project; you can read more about that here
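
    For what that can look like in practice, here is a minimal TypeScript sketch that calls Elasticsearch's REST _search endpoint straight from an Angular service with HttpClient (a plain-HTTP alternative to the client library mentioned above); the cluster URL, index, and field names are placeholders. Exposing the cluster directly to the browser also needs CORS enabled on Elasticsearch (http.cors.enabled / http.cors.allow-origin) and is usually put behind a proxy in production.

    // Sketch: querying Elasticsearch's _search endpoint from an Angular service.
    import { Injectable } from "@angular/core";
    import { HttpClient } from "@angular/common/http";
    import { Observable } from "rxjs";

    @Injectable({ providedIn: "root" })
    export class SearchService {
      private esUrl = "http://localhost:9200/products/_search";  // placeholder cluster URL and index

      constructor(private http: HttpClient) {}

      search(text: string): Observable<any> {
        // match query against a hypothetical "name" field
        return this.http.post<any>(this.esUrl, {
          query: { match: { name: text } },
        });
      }
    }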
    Daksh Verma
    @Dakkshverma_twitter
    Hello people,
    How do you do multi-select faceting in Elasticsearch?
    Example from Solr: http://yonik.com/multi-select-faceting/
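
    For reference, the usual Elasticsearch equivalent is aggregations combined with post_filter, so a selected facet value narrows the hits without changing the facet counts. A sketch, assuming a hypothetical products index with a brand.keyword field:

    GET localhost:9200/products/_search
    {
      "query": { "match_all": {} },
      "aggs": {
        "brands": {
          "terms": { "field": "brand.keyword" }
        }
      },
      "post_filter": {
        "term": { "brand.keyword": "acme" }
      }
    }

    With several independent facets, each aggregation is typically wrapped in its own filter aggregation that applies every selection except its own.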
    Aneesh12369
    @Aneesh12369

    Hi All,
    Logstash is throwing an error when there is a message field in the JSON. It works fine when there is no message field. Below is my conf file. How do I handle this even when a message field is present in the JSON?

    input {
      kafka {
        bootstrap_servers => "xxxx.xxx.xxx.xxx:9092"
        topics => "graylog-gelf-input"
        codec => "json"
        tags => "logstash-message"
        }
    }
    
    
    filter {
        json {
          source => "message"
        }
    
        mutate {
          add_field => {
            "message" => "%{[full_message]}"
    
          }
    
          remove_field => [ "version" ]
    
        }
    }
    
    
    output {
         gelf {
             host => "xxx.xxx.xx.xx"
             port => xxxx
             protocol => "TCP"
         }
         file {
             path => "/var/log/logstash/outlog.log"
         }
    
    }

    test message

    {"version":"1.1","host":"kafka-input","short_message":"test short message 123","full_message":"test full message 123","message":"test message 123","timestamp":1582870694,"level":1,"website_count":"0","website":"https://bellasouljewelry.com/","account_id":"207159823","last_active":"20200228T06:18:13Z","fail_reason":"Widget not found","website_domain":"bellasouljewelry.com","start_date":"20200228T06:18:13Z","status":"failed"}
    Jibu Chacko
    @jibsonline
    filter {
      if [message] {
        mutate { remove_field => [ "field1", "field2", "field3", ... "fieldN" ] }
      }
    }
    filter {
      if [message] {
        mutate { remove_field => [ "message" ] }
      }
    }
    Ignore the first
    mayurmehta
    @mayurmehta
    Hello people. I want to match a phrase within boolean queries. How can I do that?
    For example: Manager "Corporate Bank Technology"
    Results should match 'Manager' OR 'Corporate Bank Technology'
    mayurmehta
    @mayurmehta
    I am using
    "bool" : {
          "should" : [
            { 
                  "query_string" : [
                           "fields":  [
                                  'position_title^4',
                                  'parent_organization_name^2',
                                  'organization_name^2',  
                            ],
                            "query": "Corporate Bank Technology",
                            "boost": 4
                    ],
                    "query_string" : [
                           "fields":  [
                                  'position_title^4',
                                  'parent_organization_name^2',
                                  'organization_name^2',  
                            ],
                            "query": "Manager",
                            "boost": 2
                    ]
          ],
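
    As an alternative, query_string already understands quoted phrases and OR, so (assuming the same fields as above) a single clause can express 'Manager' OR the exact phrase 'Corporate Bank Technology':

    {
      "query": {
        "query_string": {
          "fields": ["position_title^4", "parent_organization_name^2", "organization_name^2"],
          "query": "Manager OR \"Corporate Bank Technology\""
        }
      }
    }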
    faguilera85
    @faguilera85
    Hello, can anyone please tell me how I can search for documents where one of their fields matches any value in an array of strings?
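
    If "match" means exact values on a keyword field, the terms query accepts an array directly and matches documents whose field contains any of the listed values; the index, field, and values below are placeholders.

    GET localhost:9200/my_index/_search
    {
      "query": {
        "terms": {
          "tags": ["red", "green", "blue"]
        }
      }
    }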