
A tool for performing OSINT in dark web investigations

Hello everyone,

I am Nitin Yadav (KD), back again with another write-up.



Introduction to OSINT

OSINT (open-source intelligence) is the use of publicly available information and tools to gain intelligence about a target or target system. It draws on online sources such as public websites, blogs, forums, and social media, and can cover a target's political, economic, social, and military affiliations, along with almost any other imaginable information.


OSINT is valuable for both the intelligence community and the general public. The intelligence community uses OSINT to collect information about foreign governments and their activities.


Importance of OSINT in dark web investigations


As the world becomes increasingly connected, cybercrime and espionage have become increasingly common. These crimes can take many different forms, from theft of intellectual property to sabotage of critical infrastructure.

One of the most important tools law enforcement has for investigating crimes and prosecuting those responsible is "open-source intelligence", or OSINT: the collection, analysis, interpretation, and dissemination of information found in public sources.

One of the most important sources of OSINT information is the “dark web”. The dark web is a collection of websites and networks that are not indexed by search engines or publicly available.


When it comes to conducting OSINT on the dark web, there are a variety of methods that can be used. Some of the more popular methods include:


1. Searching for known dark web addresses and content

2. Conducting brute-force attacks against dark web servers

3. Scanning public dark web search engines for mentions of specific keywords or subjects

4. Scouring dark web forums and communities for information

5. Mining dark web logs for information

6. Analyzing dark web traffic for suspicious activity

7. Conducting vulnerability scans against dark web applications

8. Collecting intelligence on dark web users and groups

9. Tracing dark
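Several of the methods above boil down to one primitive: fetching an onion-service page through Tor's SOCKS proxy. A minimal sketch with requests and PySocks (both TorBot dependencies), assuming a Tor daemon is listening on localhost:9050; the onion address in the usage note is hypothetical:

```python
import requests  # with PySocks installed, requests can speak SOCKS

# socks5h:// makes the proxy (Tor) resolve the .onion hostname itself,
# which an ordinary local DNS resolver cannot do.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_onion(url, timeout=30):
    """Fetch a page through the Tor SOCKS proxy and return its HTML text."""
    resp = requests.get(url, proxies=TOR_PROXIES, timeout=timeout)
    resp.raise_for_status()
    return resp.text

# Usage (requires a running Tor daemon; onion address is made up):
#   html = fetch_onion("http://exampleonionaddress.onion/")
```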


But wait: here is a bot that can help minimize your workload.


The Tool: Tor Bot


The tool is written in Python. The main objective of the project is to collect open data from the deep web/dark web and, with the help of data-mining algorithms, gather as much information as possible and produce an interactive tree graph. The interactive tree-graph module displays the relations among the collected intelligence data.
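The crawl-plus-tree idea can be sketched as a breadth-first walk that records which page linked to which. This is an illustrative sketch of the concept, not TorBot's actual data model; the get_links callable stands in for a page fetch over Tor, and the toy link structure is made up:

```python
from collections import deque

def build_link_tree(start_url, get_links, max_depth=2):
    """Breadth-first crawl that records which page linked to which.

    `get_links` is a caller-supplied function returning the outgoing
    links of a page (in a real crawler it would fetch over Tor).
    """
    tree = {start_url: []}
    queue = deque([(start_url, 0)])
    seen = {start_url}
    while queue:
        url, depth = queue.popleft()
        if depth >= max_depth:
            continue  # stop expanding past the depth limit
        for link in get_links(url):
            if link not in seen:  # record each page under its first parent
                seen.add(link)
                tree[url].append(link)
                tree[link] = []
                queue.append((link, depth + 1))
    return tree

# Toy link structure standing in for crawled pages.
links = {"a": ["b", "c"], "b": ["c", "d"], "c": [], "d": []}
tree = build_link_tree("a", lambda u: links.get(u, []))
# tree -> {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
```

A parent-to-children mapping like this is exactly the shape a tree-graph visualization module can consume.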


Features of the bot:


  1. Onion Crawler (.onion). (Completed)

  2. Returns Page title and address with a short description of the site. (Partially Completed)

  3. Save links to the database. (PR to be reviewed)

  4. Get emails from the site. (Completed)

  5. Save crawl info to JSON file. (Completed)

  6. Crawl custom domains. (Completed)

  7. Check if the link is live. (Completed)

  8. Built-in Updater. (Completed)

  9. TorBot GUI (In progress)

  10. Social Media integration. (Not Started) ...(will be updated)
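Two of the features above, returning the page title and harvesting emails from a site, can be sketched with the standard library alone (TorBot itself parses pages with beautifulsoup4). The HTML sample here is made up:

```python
import re
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside <title> elements."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Simple pattern for email-like strings in the raw page text.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_title_and_emails(html):
    parser = TitleParser()
    parser.feed(html)
    return parser.title, sorted(set(EMAIL_RE.findall(html)))

page = ("<html><head><title>Hidden Service</title></head>"
        "<body>Contact: admin@example.onion</body></html>")
title, emails = extract_title_and_emails(page)
# title -> "Hidden Service"; emails -> ["admin@example.onion"]
```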

OS Dependencies:

  • Tor

  • Python 3.7

  • Golang 1.16

Python Dependencies:

  • beautifulsoup4

  • pyinstaller

  • PySocks

  • termcolor

  • requests

  • requests_mock

  • yattag

  • numpy


Golang Dependencies:

  • https://github.com/KingAkeem/gotor (This service needs to be run in tandem with TorBot)


Setup:


Before you run TorBot, make sure the following things are done properly:

  • Run tor service

  • Make sure your torrc is configured with SocksPort 9050, so Tor's SOCKS proxy listens on localhost:9050

  • Install Poetry

  • Disable Poetry virtualenvs (optional)

  • Install TorBot Python requirements


On Linux platforms, you can build an executable for TorBot using the install.sh script. First give the script execute permissions with chmod +x install.sh. Then run ./install.sh to create the torBot binary, and run ./torBot to execute the program.


Using Docker

  • Ensure that you have a tor container running on port 9050.

  • Build the image with docker build, run from the root directory.

  • Run the container, making sure to link the tor container as tor.



TO-DO:

  • Visualization Module

  • Implement BFS Search for webcrawler

  • Use Golang service for concurrent webcrawling

  • Improve stability (Handle errors gracefully, expand test coverage and etc.)

  • Create a user-friendly GUI

  • Randomize Tor Connection (Random Header and Identity)

  • Keyword/Phrase search

  • Social Media Integration

  • Increase anonymity

  • Increase efficiency


Repo: https://github.com/DedSecInside/TorBot



I hope you enjoyed this one and I see you next time :)


Take care and stay safe!


