AndreevaP
@AndreevaP
Hi, everyone! I'm new to the TheHive project; I'm doing a university project and trying to understand how TheHive works. I downloaded thehive-training-3.3.1.ova (the latest-version VMs aren't available, 404 error). I want to connect ELK to TheHive so TheHive can receive logs, but I don't know how to do that. Can you help me with this? Maybe I could connect a different source to TheHive that would send it logs? Right now I just have the TheHive and Cortex interfaces and don't know what I should do next. I would be very grateful for your help
Jac Diamond
@angrynipples_twitter
Hey all, is there any documentation around how to run your responders in containers? I've seen responders in the Cortex-Analyzers repo with Dockerfiles but I can't seem to find any documentation on how to containerise your responders and have them work with Cortex
Sohan BASSAVA
@Sbassava
Hello, does anybody know how to add a self-signed CA certificate to TheHive4 for MISP? It's not covered in the documentation; it's only there for TheHive3
I already have the JKS file, but I'm unable to find the syntax for adding the truststore.jks to the application.conf file for TheHive4
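For what it's worth, here is a sketch of what that might look like, carrying TheHive3's Play-WS TLS syntax over into TheHive4's MISP server block. The URL, path, and password are placeholders, and I haven't verified this against the TheHive4 docs, so treat it as a starting point rather than the confirmed syntax:

```hocon
misp {
  interval: 1 hour
  servers: [
    {
      name = "MISP"                      # connector name shown in TheHive
      url = "https://misp.example.com"   # placeholder MISP URL
      auth {
        type = key
        key = "YOUR_MISP_API_KEY"        # placeholder
      }
      # Play-WS TLS settings: point the trust store at your JKS file
      wsConfig {
        ssl {
          trustManager {
            stores = [
              {
                type = "JKS"
                path = "/etc/thehive/truststore.jks"  # placeholder path
                password = "changeit"                 # placeholder
              }
            ]
          }
        }
      }
    }
  ]
}
```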
Marc Hörsken
@mback2k
Is there a way to debug Cortex performance issues (it becomes unresponsive every now and then)? I am on the latest version, 3.1.1-1
kashyap412
@kashyap412
Hi, everyone! I'm new to the TheHive project. I created 2 organisations with the same email, and I'm using an API key to send alerts from Postman. The alerts were going to the 1st organisation, but I need to route them by organisation: is there any parameter for sending an alert to a different org?
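In case it helps: an API key belongs to a single user, so the simplest route is one key per organisation. TheHive4 also appears to let a user who is a member of several organisations pick the acting org per request via an `X-Organisation` header. A stdlib-only sketch (the URL, key, and org name are placeholders, and the header behaviour should be checked against your TheHive4 version):

```python
# Sketch: send a TheHive4 alert into a chosen organisation.
# Assumes the API user is a member of that org; the X-Organisation
# header selects which org the request acts on.
import json
import urllib.request


def build_headers(api_key: str, organisation: str) -> dict:
    """Auth headers for a TheHive4 API call, scoped to one organisation."""
    return {
        "Authorization": f"Bearer {api_key}",
        "X-Organisation": organisation,  # org to act as for this request
        "Content-Type": "application/json",
    }


def build_alert_request(base_url: str, api_key: str, org: str, alert: dict):
    """Build (but don't send) a POST /api/alert request for the given org."""
    return urllib.request.Request(
        f"{base_url}/api/alert",
        data=json.dumps(alert).encode(),
        headers=build_headers(api_key, org),
        method="POST",
    )


if __name__ == "__main__":
    alert = {
        "title": "Test alert",
        "description": "Org-routing demo",
        "type": "external",
        "source": "postman-demo",
        "sourceRef": "demo-0001",  # must be unique per source
    }
    req = build_alert_request("http://thehive.example:9000", "API_KEY", "org2", alert)
    # urllib.request.urlopen(req)  # uncomment to actually send it
```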
Sheltant
@Sheltant
@TEAM I am new to TheHive and Cortex and am finding it difficult to connect to Elasticsearch running over HTTPS with a local certificate; play.ws.ssl.loose.acceptAnyCertificate=true didn't help either
Venkat Ragavan
@venkat330
What is the recommended hardware spec for TheHive & Cortex (without ES) in Docker?
kashyap412
@kashyap412
Hi all, can we send a Watcher alert from Elasticsearch to TheHive with observables included?
If yes, can anyone please share a sample Watcher with observables?
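One pattern is to point the Watcher's webhook action at a small script that reshapes the hits into a TheHive alert with `artifacts` and posts it to `/api/alert`. A sketch of the reshaping step, with the Watcher hit fields (`src_ip`) as an assumption about your data, while the alert/artifact keys follow TheHive's alert API:

```python
# Sketch: turn Elasticsearch Watcher hits into a TheHive alert dict
# carrying observables. The `src_ip` field name is an assumption about
# the watched documents; adapt it to your index mapping.
import json


def watcher_to_alert(watch_id: str, hits: list) -> dict:
    """Build a TheHive alert with one IP artifact per matching hit."""
    artifacts = [
        {"dataType": "ip", "data": h["src_ip"], "message": "from watcher hit"}
        for h in hits
        if "src_ip" in h
    ]
    return {
        "title": f"Watcher {watch_id} fired",
        "description": f"{len(hits)} matching document(s)",
        "type": "external",
        "source": "elasticsearch-watcher",
        "sourceRef": watch_id,   # make this unique per firing in practice
        "severity": 2,
        "artifacts": artifacts,  # observables attached to the alert
    }


if __name__ == "__main__":
    hits = [{"src_ip": "203.0.113.7"}, {"src_ip": "198.51.100.9"}]
    print(json.dumps(watcher_to_alert("failed-logins", hits), indent=2))
```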
Hamza-Smidi
@Hamza-Smidi
Hi, can anyone suggest how to install these dependencies on Red Hat 7?
sudo apt-get install -y --no-install-recommends python-pip python2.7-dev python3-pip python3-dev ssdeep libfuzzy-dev libfuzzy2 libimage-exiftool-perl libmagic1 build-essential git libssl-dev
Sankar
@knsankar
Hi,
I am trying to upgrade my old instance which is running with hive-3.3.0 to hive-3.4.0. Where can I find the pre-compiled binaries for that?
samsowa
@samsowa
@Hamza-Smidi, it seems you are installing cortex-analyzers on Red Hat 7. Here are the dependencies:
subscription-manager repos --enable rhel-server-rhscl-7-rpms
sudo yum install devtoolset-7
sudo yum groupinstall "Development Tools"
sudo yum install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
sudo yum install epel-release yum-utils
sudo yum install python-pip python-devel python3-pip python3-devel libffi-devel ssdeep-devel ssdeep-libs perl-Image-ExifTool perl-File-LibMagic git openssl-devel
pip install ssdeep
pip3 install ssdeep
Antonio ruiz bustamante
@antoniorb_gitlab
I have a question; I hope you can help me. I have a use case for creating a responder: if a case is generated in TheHive and it contains an observable of the IP type, and that observable has the "Is IOC" parameter enabled in its metadata, then the responder sends the IP address to a blacklist in QRadar. The problem is that I can't get the "Is IOC" parameter. In the case, at the bottom where the observable is, "Flags" appears next to "Type", but I can't retrieve it. Can you help me please? My code has the line "data.flags", but I don't know if it is correct, because that line returns an empty list.
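As far as I can tell, the "Is IOC" toggle is exposed in TheHive's observable JSON as a boolean `ioc` field, not inside `flags` (which would explain why `data.flags` comes back empty). A minimal sketch of the decision logic a responder could use, with the QRadar push left out and the sample payload trimmed to the fields that matter:

```python
# Sketch: decide whether an observable should be blacklisted.
# TheHive exposes the "Is IOC" toggle as a boolean `ioc` field on the
# observable JSON; `flags` is not where it lives.
import json


def should_blacklist(observable: dict) -> bool:
    """True for IP observables whose 'Is IOC' flag is set."""
    return observable.get("dataType") == "ip" and bool(observable.get("ioc"))


if __name__ == "__main__":
    # Trimmed example of what TheHive hands a case_artifact responder
    sample = {"dataType": "ip", "data": "203.0.113.7", "ioc": True, "tlp": 2}
    if should_blacklist(sample):
        # ... push sample["data"] to the QRadar reference set here ...
        print(json.dumps({"message": f"{sample['data']} sent to blacklist"}))
```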
Uma Reddy Yata
@uma.hunk_gitlab
Importing an alert into a TheHive case takes too long (more than 5 minutes, approx.) when there are many observables (e.g. more than 300). Any idea why the case import takes so much time, and is there any solution I can try?
Darío Menten
@dariommr
Hello guys, I am new to Gitter, sorry if I miss something.
I need help: the first run of TheHive4 is not asking me to create credentials
Peter David
@PeterJDavid
out of curiosity, does anyone have a recommended workflow for alerts that don't result in any major investigation (like the analyst reviews the alert, goes "nah, that's not actionable", and then does... ???) We're looking at creating a case to immediately close it as N/A or something similar, but are unsure what a good pattern would be. For example, if we get an AWS Console Login without MFA alert, but the user logged in via SAML, we want to just say "no-risk" and move on with little effort.
La3ZmC4A
@La3ZmC4A

I need to create / manage custom fields for use in our templates.

It was easy in thehive3.

In thehive4, I can see how i can add/remove existing custom fields to a template, but not how to create, change, or manage the custom fields themselves.

The documentation (link below) is not helpful.
http://docs.thehive-project.org/thehive/user-guides/administration/custom-fields/

Any assistance with this would be greatly appreciated.

second question / help request

i am having issues authenticating with the test user created after initializing the hive4 database.

the documentation on thehive4 is lacking (e.g. http://docs.thehive-project.org/thehive/user-guides/administration/users/)

how can i reset this user's password, create a new user (command line), etc.?

some help would be greatly appreciated.

Peter David
@PeterJDavid
@La3ZmC4A you can modify custom fields through the admin user, so that'll be how - log in as that user and then you can do it there in the settings
La3ZmC4A
@La3ZmC4A
@PeterJDavid i know i can modify existing custom fields, but i need to be able to create new non-prior-existing fields
thank you for the response
Peter David
@PeterJDavid
ah - better clarification @La3ZmC4A - The actual admin@thehive.local user, not any admin user... That takes you to a separate org admin area that allows you to create custom fields. Normal users cannot do that as they're under an Org in TheHive
La3ZmC4A
@La3ZmC4A
@PeterJDavid thank you for the response, that worked, thanks again
vvadymv
@vvadymv
Hi all, I'm struggling with the upgrade from 3.4 to 4.0. The whole setup is in Docker but I'm unable to locate the migrate tool. I've pulled thehiveproject/thehive4:4.0.5 and see a shell script at /opt/thehive/bin/migrate
Kārlis
@TKsvilans
Hello guys! I'm trying to set up Synapse with my TheHive instance to demo it out, but I'm getting this error, and after hours I can't fix it. Has anyone come across this problem too?
2021-06-16 12:55:16,882 :: INFO :: workflows.common.common.getConf starts
2021-06-16 12:55:22,577 :: INFO :: workflows.Ews2Case.connectEws starts
2021-06-16 12:55:22,578 :: INFO :: common.common.getConf starts
2021-06-16 12:55:22,578 :: INFO :: objects.EwsConnector.getAccount starts
2021-06-16 12:55:23,078 :: INFO :: objects.EwsConnector.scan starts
2021-06-16 12:55:25,410 :: INFO :: objects.TheHiveConnector.connect starts
2021-06-16 12:55:25,645 :: INFO :: objects.TheHiveConnector.searchCaseByDescription starts
2021-06-16 12:55:25,799 :: INFO :: objects.TheHiveConnector.getTaskIdByName starts
2021-06-16 12:55:25,925 :: INFO :: objects.TheHiveConnector.craftCommTask starts
2021-06-16 12:55:25,925 :: INFO :: objects.TheHiveConnector.createTask starts
2021-06-16 12:55:26,003 :: ERROR :: Task creation failed
2021-06-16 12:55:26,004 :: ERROR :: Failed to create case from email
Traceback (most recent call last):
File "/home/thehive/Synapse-master/workflows/Ews2Case.py", line 73, in connectEws
commTaskId = theHiveConnector.createTask(esCaseId, commTask)
File "/home/thehive/Synapse-master/workflows/objects/TheHiveConnector.py", line 113, in createTask
raise ValueError(json.dumps(response.json(), indent=4, sort_keys=True))
ValueError: {
"message": "User not found",
"type": "NotFoundError"
}
In this instance, a case is created, with the correct assignee, but without the observables..
vvadymv
@vvadymv

Hi all, we are migrating from 3.5 to 4.1 and the migrate script fails almost immediately after starting. Any ideas what to check?

[error] Migration failed
com.google.inject.ConfigurationException: Guice configuration errors:
1) No implementation for akka.actor.typed.ActorRef<org.thp.thehive.services.CaseNumberActor$Request> annotated with @com.google.inject.name.Named(value=case-number-actor) was bound.
while locating akka.actor.typed.ActorRef<org.thp.thehive.services.CaseNumberActor$Request> annotated with @com.google.inject.name.Named(value=case-number-actor)
for the 15th parameter of org.thp.thehive.services.CaseSrv.<init>(CaseSrv.scala:42)
while locating org.thp.thehive.services.CaseSrv
for the 4th parameter of org.thp.thehive.migration.th4.Output.<init>(Output.scala:90)
while locating org.thp.thehive.migration.th4.Output
2) No implementation for akka.actor.typed.ActorRef<org.thp.thehive.services.CaseNumberActor$Request> annotated with @com.google.inject.name.Named(value=case-number-actor) was bound.
while locating akka.actor.typed.ActorRef<org.thp.thehive.services.CaseNumberActor$Request> annotated with @com.google.inject.name.Named(value=case-number-actor)
for the 15th parameter of org.thp.thehive.services.CaseSrv.<init>(CaseSrv.scala:42)
while locating org.thp.thehive.services.CaseSrv
for the 1st parameter of org.thp.thehive.connector.cortex.services.ActionOperationSrv.<init>(ActionOperationSrv.scala:20)
while locating org.thp.thehive.connector.cortex.services.ActionOperationSrv
for the 2nd parameter of org.thp.thehive.connector.cortex.services.ActionSrv.<init>(ActionSrv.scala:34)
while locating org.thp.thehive.connector.cortex.services.ActionSrv
for the 23rd parameter of org.thp.thehive.migration.th4.Output.<init>(Output.scala:90)
while locating org.thp.thehive.migration.th4.Output

3) No implementation for akka.actor.typed.ActorRef<org.thp.thehive.services.CaseNumberActor$Request> annotated with @com.google.inject.name.Named(value=case-number-actor) was bound.
while locating akka.actor.typed.ActorRef<org.thp.thehive.services.CaseNumberActor$Request> annotated with @com.google.inject.name.Named(value=case-number-actor)
for the 15th parameter of org.thp.thehive.services.CaseSrv.<init>(CaseSrv.scala:42)
while locating com.google.inject.Provider<org.thp.thehive.services.CaseSrv>
for the 1st parameter of org.thp.thehive.services.TaskSrv.<init>(TaskSrv.scala:28)
while locating org.thp.thehive.services.TaskSrv
for the 1st parameter of org.thp.thehive.connector.cortex.services.EntityHelper.<init>(EntityHelper.scala:22)
while locating org.thp.thehive.connector.cortex.services.EntityHelper
for the 3rd parameter of org.thp.thehive.connector.cortex.services.ActionSrv.<init>(ActionSrv.scala:34)
while locating org.thp.thehive.connector.cortex.services.ActionSrv
for the 23rd parameter of org.thp.thehive.migration.th4.Output.<init>(Output.scala:90)
while locating org.thp.thehive.migration.th4.Output
3 errors
at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1120)
at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1078)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1131)
at org.thp.thehive.migration.th4.Output$.apply(Output.scala:83)
at org.thp.thehive.migration.Migrate$.$anonfun$new$1(Migrate.scala:198)
at org.thp.thehive.migration.Migrate$.$anonfun$new$1$adapted(Migrate.scala:182)
at scala.Option.foreach(Option.scala:407)
at org.thp.thehive.migration.Migrate$.delayedEndpoint$org$thp$thehive$migration$Migrate$1(Migrate.scala:182)
at org.thp.thehive.migration.Migrate$delayedInit$body.apply(Migrate.scala:16)
at scala.Function0.apply$mcV$sp(Function0.scala:39)

tl-Bruno-Braga
@tl-Bruno-Braga
Hey guys. Duplicated alerts seem to be dropped. Is there a way to have a counter for each alert? I am thinking about implementing it through artifacts: every time I get a new hit, I just update it. Any thoughts? It would be helpful to keep track of these.
La3ZmC4A
@La3ZmC4A
i'm having the same issues with 3.5 -> 4.1 migrate as @vvadymv
Peter David
@PeterJDavid
@tl-Bruno-Braga the duplicate alerts seem to be matched on the "sourceRef" and "source" fields. If these are actually unique alerts from the source system, and you're using the "sourceRef" field right, they should not auto-reject. Otherwise, are you accidentally importing the same alert multiple times?
You could also abuse this by catching the "duplicate alert" response and just updating a custom field on the alert, but I'm not sure how valuable that would be: the alerts should be coming in at different times and such, so you'd lose trending by implementing it as just a counter.
@tl-Bruno-Braga we just import them as individual alerts and have a webhook parser that collates and merges them into specific cases if those cases are still open
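The "catch the duplicate and bump a counter" idea above could be sketched like this. The `hitCount` custom field, the placeholder URL/key, and the assumption that a duplicate source+sourceRef surfaces as an HTTP 400 are all unverified; the custom-field patch shape follows TheHive's `{"integer": n}` convention:

```python
# Sketch: try to create an alert; when TheHive rejects it as a duplicate
# (same source + sourceRef), bump a custom-field counter on the existing
# alert instead. `hitCount` and the 400-means-duplicate check are assumptions.
import json
import urllib.error
import urllib.request

API = "http://thehive.example:9000/api"  # placeholder
KEY = "API_KEY"                          # placeholder


def bump_counter_patch(current: int) -> dict:
    """PATCH body that increments the duplicate-hit counter custom field."""
    return {"customFields": {"hitCount": {"integer": current + 1}}}


def _call(method: str, path: str, body: dict):
    req = urllib.request.Request(
        f"{API}{path}",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {KEY}",
                 "Content-Type": "application/json"},
        method=method,
    )
    return urllib.request.urlopen(req)


def create_or_count(alert: dict, existing_id: str, seen_so_far: int):
    """Create the alert, or count the hit if TheHive reports a duplicate."""
    try:
        _call("POST", "/alert", alert)
    except urllib.error.HTTPError as err:
        if err.code == 400:  # assume: rejected as duplicate sourceRef
            _call("PATCH", f"/alert/{existing_id}",
                  bump_counter_patch(seen_so_far))
        else:
            raise
```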
vvadymv
@vvadymv
@La3ZmC4A welcome to the club ) Did you try 3.4 to 4.0 migration?
Tj
@2koobshaaxboyz_twitter
Hi All, I do not know if this is the right chatroom for inquiries. But I am trying to get access to TheHive and Cortex demo VM. Unfortunately, all links I've tried are not working, including the one StrangeBee sent me.
tl-Bruno-Braga
@tl-Bruno-Braga
@PeterJDavid I implemented an external whitelist, so the sourceRef is actually a behaviour hash. It will match several alerts, but I wanted to have it there anyway because as the number of alerts with the same behaviour increases I can whitelist; otherwise it's just a one-time occurrence and we can let it slide and alert again
I will use a side analyzer to pull the number of hits and add the context later
But I would definitely consider keeping a counter of duplicates.
also, thanks for response :)
La3ZmC4A
@La3ZmC4A
@vvadymv i haven't because i currently have hive3.5, but i guess i could try the multi-step process of downgrading to upgrade
Rodrigo
@rdrg-ar
Hello everyone! I'm having trouble refreshing or updating an analyzer template. I changed long.html inside the analyzer directory in /opt/cortex/Cortex-Analyzers/thehive-templates/, and also changed it as the Cortex admin within the web UI. I restarted TheHive and Cortex, but the changes don't show up in the observable's detail view; it is still the same report. Also, the long.html file and the template that can be changed from the UI seem unrelated: the two are different. What am I missing here? Any help will be appreciated. Thanks!
samsowa
@samsowa

Hello guys, I am trying to migrate TheHive3.4 to TheHive4.1 using this command: /opt/thehive/bin/migrate --output /etc/thehive/application.conf --main-organisation TESTME --es-uri http://192.168.0.34:9200 --es-index the_hive
I am getting the error below. I tried searching for the error online, but with no success.
3 errors
at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1120)
at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1078)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1131)
at org.thp.thehive.migration.th4.Output$.apply(Output.scala:83)
at org.thp.thehive.migration.Migrate$.$anonfun$new$1(Migrate.scala:198)
at org.thp.thehive.migration.Migrate$.$anonfun$new$1$adapted(Migrate.scala:182)
at scala.Option.foreach(Option.scala:407)
at org.thp.thehive.migration.Migrate$.delayedEndpoint$org$thp$thehive$migration$Migrate$1(Migrate.scala:182)
at org.thp.thehive.migration.Migrate$delayedInit$body.apply(Migrate.scala:16)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1$adapted(App.scala:80)
at scala.collection.immutable.List.foreach(List.scala:431)
at scala.App.main(App.scala:80)
at scala.App.main$(App.scala:78)
at org.thp.thehive.migration.Migrate$.main(Migrate.scala:16)
at org.thp.thehive.migration.Migrate.main(Migrate.scala)
2021-06-20 23:19:11,439 [INFO] from org.thp.thehive.ClusterListener in TheHiveMigration-akka.actor.default-dispatcher-11 [|] Member is Removed: akka://TheHiveMigration@127.0.0.1:38227 after Exiting
2021-06-20 23:19:11,766 [INFO] from org.thp.thehive.migration.Migrate in main [|] Stage: initialisation

Here is the entire application.log
2021-06-20 23:15:56,978 [INFO] from org.thp.scalligraph.ScalligraphModule in main [|] Loading scalligraph module
2021-06-20 23:16:09,013 [INFO] from akka.event.slf4j.Slf4jLogger in application-akka.actor.default-dispatcher-6 [|] Slf4jLogger started
2021-06-20 23:16:12,630 [INFO] from akka.remote.artery.tcp.ArteryTcpTransport in application-akka.actor.default-dispatcher-6 [|] Remoting started with transport [Artery tcp]; listening on address [akka://application@127.0.0.1:35055] with UID [-4347204014927584478]
2021-06-20 23:16:12,764 [INFO] from akka.cluster.Cluster in application-akka.actor.default-dispatcher-6 [|] Cluster Node [akka://application@127.0.0.1:35055] - Starting up, Akka version [2.6.10] ...
2021-06-20 23:16:13,297 [INFO] from akka.cluster.Cluster in application-akka.actor.default-dispatcher-6 [|] Cluster Node [akka://application@127.0.0.1:35055] - Registered cluster JMX MBean [akka:type=Cluster]
2021-06-20 23:16:13,297 [INFO] from akka.cluster.Cluster in application-akka.actor.default-dispatcher-6 [|] Cluster Node [akka://application@127.0.0.1:35055] - Started up successfully
2021-06-20 23:16:13,722 [INFO] from akka.cluster.Cluster in application-akka.actor.default-dispatcher-10 [|] Cluster Node [akka://application@127.0.0.1:35055] - No seed-nodes configured, manual cluster join required, see https://doc.akka.io/docs/akka/current/typed/cluster.html#joining
2021-06-20 23:16:13,863 [INFO] from akka.cluster.sbr.SplitBrainResolver in application-akka.actor.default-dispatcher-6 [|] SBR started. Config: strategy [KeepMajority], stable-after [20 seconds], down-all-when-unstable [15 seconds], selfUniqueAddress [akka://application@127.0.0.1:35055#-4347204014927584478], selfDc [default].
2021-06-20 23:16:22,109 [INFO] from org.reflections.Reflections in main [|] Reflections took 1746 ms to scan 1 urls, producing 165 keys and 2480 values
2021-06-20 23:16:22,853 [INFO] from org.thp.thehive.ClusterSetup in main [|] Initialising cluster
2021-06-20 23:16:23,012 [INFO] from akka.cluster.Cluster in application-akka.actor.default-dispatcher-6 [|] Cluster Node [akka://application@127.0.0.1:35055] - Node [akka://application@127.0.0.1:35055] is JOINING itself (with roles [dc-default], version [0.0.0]) and forming new cluster
2021-0

samsowa
@samsowa
Hello guys, I am also trying to migrate TheHive3.4 to TheHive4.0 in a lab environment. I am getting the error below even though the thehive user has read/write access to the HDFS directory
[error] Case/Observable creation failure: org.apache.hadoop.security.AccessControlException: Permission denied: user=thehive, access=WRITE, inode="/":hadoop:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:496)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:336)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:241)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1909)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1893)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1852)
at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2635)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2577)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:807)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:494)
vvadymv
@vvadymv
@La3ZmC4A we were able to upgrade from 3.5 to 4.1 with migrate from 4.1.5 (it failed with 4.1.6)
CryptoNation
@CoinClues_twitter
Hi all, I'm getting this error

root@cortex1:/home/ubuntu# service thehive status
● thehive.service - Scalable, Open Source and Free Security Incident Response Solutions
Loaded: loaded (/usr/lib/systemd/system/thehive.service; disabled; vendor preset: enabled)
Active: failed (Result: exit-code) since Wed 2021-06-23 11:15:57 UTC; 6min ago
Docs: https://thehive-project.org
Process: 26118 ExecStart=/opt/thehive/bin/thehive -Dconfig.file=/etc/thehive/application.conf -Dlogger.file=/etc/thehiv
Main PID: 26118 (code=exited, status=255)

Jun 23 11:15:57 cortex1 systemd[1]: Started Scalable, Open Source and Free Security Incident Response Solutions.
Jun 23 11:15:57 cortex1 systemd[1]: thehive.service: Main process exited, code=exited, status=255/n/a
Jun 23 11:15:57 cortex1 systemd[1]: thehive.service: Failed with result 'exit-code'.

any pointers where I could have screwed up with the setup