Ransomware leak site monitoring


RansomWatch

RansomWatch is a ransomware leak site monitoring tool. It scrapes the entries on various ransomware leak sites, stores the data in a SQLite database, and sends notifications via Slack or Discord when a new victim appears or when a victim is removed.

Configuration

In config_vol/, copy config.sample.yaml to config.yaml and add the following:

  • Leak site URLs. I decided not to make this list public in order to prevent the sites from gaining even more notoriety, so if you have them, add them in. If not, this tool isn't for you.
  • Notification destinations. RansomWatch currently supports notifying via the following (a rough sketch of the finished config.yaml appears after this list):
    • Slack: Follow these instructions to add a new app to your Slack workspace and add the webhook URL to the config.
    • Discord: Follow these instructions to add a new app to your Discord server and add the webhook URL to the config.
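
For reference, here is a rough sketch of what a filled-in config.yaml might look like. This is an assumption for illustration only: the authoritative schema is config.sample.yaml, and the key names below (sites, notifications, slack, discord, webhook) are inferred from the error logs and PRs quoted later on this page.

# hypothetical sketch; follow config.sample.yaml for the real schema
sites:
  conti: http://<conti-onion-address>.onion/   # leak site URLs intentionally not published here

notifications:
  slack:
    my-workspace:
      webhook: https://hooks.slack.com/services/XXX/YYY/ZZZ
  discord:
    my-server:
      webhook: https://discord.com/api/webhooks/XXX/YYY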

Additionally, there are a few environment variables you may need to set:

  • RW_DB_PATH: Path to the SQLite database file
  • RW_CONFIG_PATH: Path to the config.yaml file

These are both set in the provided docker-compose.yml.

Usage

This is intended to be run in Docker via a cron job, on whatever interval you choose.

First, build the container: docker-compose build app

Then, add it to your crontab. Example crontab entry (running every 8 hours):

0 */8 * * * cd /path/to/ransomwatch && docker-compose up --abort-on-container-exit

If you'd prefer, you can use the image published on Docker Hub (captaingeech/ransomwatch) instead, with a docker-compose.yml that looks something like this:

version: "3"

services:
  app:
    image: captaingeech/ransomwatch:latest
    depends_on:
      - proxy
    volumes:
      - ./db_vol:/db
      - ./config_vol:/config
    environment:
      PYTHONUNBUFFERED: 1
      RW_DB_PATH: /db/ransomwatch.db
      RW_CONFIG_PATH: /config/config.yaml

  proxy:
    image: captaingeech/tor-proxy:latest

This can also be run via the command line, but that requires you to have your own Tor proxy (with the control service) running. Example execution:

$ RW_DB_PATH=./db_vol/ransomwatch.db RW_CONFIG_PATH=./config_vol/config.yaml python3 src/ransomwatch.py
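
For a rough idea of what running without the bundled proxy container involves, here is a minimal sketch of fetching a page through a local Tor SOCKS proxy with requests (assuming Tor is listening on 127.0.0.1:9050 and the PySocks extra is installed via pip install requests[socks]; the project's actual proxy handling lives in src/net/proxy.py and also uses Tor's control service):

import requests

# socks5h:// (rather than socks5://) resolves hostnames through the
# proxy itself, which is required for .onion addresses
session = requests.Session()
session.proxies.update({
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
})

# placeholder URL; a real leak site onion address would go here
r = session.get("http://example.onion/", timeout=60)
print(r.status_code)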

Example Slack Messages

Slack notification for new victim

Slack notification for removed victim

Slack notification for site down

Slack notification for an error

The messages sent to Discord are very similar in style and identical in content.

Leak Site Implementations

The following leak sites are (planned to be) supported:

  • Conti
  • MAZE
  • Egregor
  • Sodinokibi/REvil
  • DoppelPaymer (captcha, probably won't be supported for a while)
  • NetWalker
  • Pysa
  • Avaddon
  • DarkSide
  • CL0P
  • Nefilim
  • Mount Locker
  • Suncrypt
  • Everest
  • Ragnarok
  • Ragnar_Locker
  • BABUK LOCKER
  • Pay2Key
  • Cuba
  • RansomEXX
  • Ranzy Locker
  • Astro Team
  • LV

If there are other leak sites you want implemented, feel free to open a PR or DM me on Twitter, @captainGeech42

Comments
  • Pysa timestamp format change

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/pysa.py", line 38, in scrape_victims
        published_dt = datetime.strptime(
      File "/usr/local/lib/python3.9/_strptime.py", line 568, in _strptime_datetime
        tt, fraction, gmtoff_fraction = _strptime(data_string, format)
      File "/usr/local/lib/python3.9/_strptime.py", line 349, in _strptime
        raise ValueError("time data %r does not match format %r" %
    ValueError: time data '22/03/21' does not match format '%m/%d/%y'
    
    opened by captainGeech42 4
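    For context: a fixed strptime format breaks as soon as the site flips its date ordering. The fix that eventually landed (see the v1.2 release notes and the run logs further down, which show dateparser in use) switched to a date-parsing library. A minimal sketch of that approach:

    import dateparser

    # dateparser can be told which ordering to prefer and returns None
    # instead of raising when it cannot parse the input
    raw = "22/03/21"
    dt = dateparser.parse(raw, settings={"DATE_ORDER": "DMY"})
    if dt is None:
        print(f"couldn't parse timestamp: {raw}")
    else:
        print(dt.date())  # 2021-03-22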
  • Something broken with REvil

    app_1    | 2021/04/20 18:36:25 [ERROR] Got an error while scraping REvil, notifying
    app_1    | 2021/04/20 18:36:25 [ERROR] Error sending Discord notification (400): {"embeds": ["0"]}
    app_1    | 2021/04/20 18:36:25 [ERROR] Failed to send error notification to Discord guild "test-discord"
    app_1    | 2021/04/20 18:36:25 [ERROR] Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | http.client.RemoteDisconnected: Remote end closed connection without response
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
    app_1    |     resp = conn.urlopen(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 755, in urlopen
    app_1    |     retries = retries.increment(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py", line 532, in increment
    app_1    |     raise six.reraise(type(error), error, _stacktrace)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py", line 734, in reraise
    app_1    |     raise value.with_traceback(tb)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/app/ransomwatch.py", line 52, in main
    app_1    |     s.scrape_victims()
    app_1    |   File "/app/sites/revil.py", line 62, in scrape_victims
    app_1    |     r = p.get(f"{self.url}?page={i}", headers=self.headers)
    app_1    |   File "/app/net/proxy.py", line 101, in get
    app_1    |     return self.session.get(*args, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 555, in get
    app_1    |     return self.request('GET', url, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    app_1    |     resp = self.send(prep, **send_kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
    app_1    |     r = adapter.send(request, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 498, in send
    app_1    |     raise ConnectionError(err, request=request)
    app_1    | requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    | 2021/04/20 18:36:25 [INFO] Finished all sites, exiting
    

    not sure what's going on. similar error w/ slack

    bug 
    opened by captainGeech42 3
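    For context on the 400 response: Discord rejects a webhook payload when the embeds array contains anything other than embed objects, and its error appears to echo the offending index, hence {"embeds": ["0"]}. A minimal sketch of a well-formed webhook post with requests (the webhook URL is a placeholder):

    import requests

    WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"  # placeholder

    payload = {
        # each entry in "embeds" must be an embed object, not a string
        "embeds": [{
            "title": "Scraping error",
            "description": "Got an error while scraping REvil",
            "color": 0xFF0000,
        }]
    }

    r = requests.post(WEBHOOK_URL, json=payload, timeout=30)
    r.raise_for_status()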
  • Conti - Scraping Error

    Describe the bug

    Error Message Below:

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/conti.py", line 56, in scrape_victims
        last_li = page_list.find_all("li")[-1]
    AttributeError: 'NoneType' object has no attribute 'find_all'

    To Reproduce: This error has happened several times over the last 24 hours while ransomwatch has been running on a cron job.

    Expected behavior: Parse the contents of the Conti site without errors, or build in additional error handling to handle this case.

    Environment

    • OS: Ubuntu 20.04
    • How you are running it: Docker via cron job (README best-practices setup)

    opened by GRIT-5ynax 2
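    Both this and the cl0p issue below stem from the same pattern: calling find_all() on the result of a soup.find() that returned None because the page layout changed. A minimal defensive-parsing sketch (the selector is illustrative, not the actual Conti markup):

    from bs4 import BeautifulSoup

    def scrape_page_list(html: str) -> list:
        soup = BeautifulSoup(html, "html.parser")
        # find() returns None when the element is missing, and
        # None.find_all(...) raises AttributeError -- guard against it
        page_list = soup.find("ul", class_="page-list")  # illustrative selector
        if page_list is None:
            raise RuntimeError("page list not found; site layout likely changed")
        return page_list.find_all("li")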
  • Dockerhub image out of date

    Running the Dockerhub image results in

    app_1 | Traceback (most recent call last):
    app_1 |   File "/app/ransomwatch.py", line 98, in <module>
    app_1 |     NotificationManager.send_error_notification(f"Non-scraping failure", tb, fatal=True)
    app_1 |   File "/app/notifications/manager.py", line 30, in send_error_notification
    app_1 |     for workspace, params in Config["slack"].items():
    app_1 | KeyError: 'slack'

    Works if the image is built locally.

    bug 
    opened by nhova 2
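    The KeyError suggests the published image predates the config handling in the current repo. A guard like the one below avoids the crash when a config section is absent (sketch only; the real code lives in notifications/manager.py):

    def send_error_notification(config: dict, message: str) -> None:
        # dict.get() with a default avoids KeyError when the "slack"
        # section is missing from the loaded config entirely
        for workspace, params in config.get("slack", {}).items():
            print(f"would notify {workspace}: {message}")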
  • New sites

    • [x] Ranzy
    • [x] Astro
    • [x] Pay2Key
    • [x] Cuba
    • [x] RansomEXX
    • [x] Mount Locker
    • [x] Ragnarok
    • [ ] Ragnar Locker
    • [x] Suncrypt
    • [x] Everest
    • [x] Nefilim
    • [x] CL0P
    • [x] Pysa
    opened by captainGeech42 2
  • New Scraper: BLACKMATTER // ARVIN // EL COMETA // LORENZ // XING // LOCKBIT

    This pull request adds support for BLACKMATTER, ARVIN, EL COMETA, LORENZ, XING, LOCKBIT.

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc.
    • [x] The data going into the DB is properly parsed and is accurate
    enhancement 
    opened by x-originating-ip 1
  • cl0p scraper broken

    Describe the bug: Cl0p scraper out of date.

    Logs

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/cl0p.py", line 21, in scrape_victims
        victim_list = soup.find("div", class_="collapse-section").find_all("li")
    AttributeError: 'NoneType' object has no attribute 'find_all'
    

    should probably just update this to the v3 site as well

    bug 
    opened by captainGeech42 1
  • Enhance pysa datetimes processing (#50)

    Describe the changes

    Added some logic to pysa.py to process the datetimes better. Exception handling has also been added to avoid crashing the script.

    Related issue(s)

    #50

    How was it tested?

    Before: scraping failed at some point if pysa was defined in the yaml config file (see related issue).

    Now:

    • [x] scraping works
    • [x] dates look plausible (since we don't know the true values, we can only judge that they're reasonable)
    • [x] the script no longer crashes, thanks to the try/except handling
    opened by biligonzales 1
  • Handle missing notifications element in the yaml config file (#52)

    Describe the changes

    Added minor changes to manager.py so that it doesn't complain if we don't want to configure notifications. Basically, the presence of the notifications element in the YAML config is now tested.

    Related issue(s)

    #52

    How was it tested?

    • [x] Docker started with an empty notifications element
    • [x] Docker started without any notifications element
    opened by biligonzales 1
  • Unable to run without configured notifications

    The notifications section of the config.yaml file currently needs to be present and configured to avoid errors at runtime. It would be great to be able to leave the notifications section empty (or even omit it from the yaml config entirely).

    opened by biligonzales 1
  • Conti: scraper fixed (#73)

    Describe the changes

    Fixed the Conti scraper to use the newsList JavaScript item, because the HTML elements were no longer available.

    Related issue(s)

    This fixes issue #73

    How was it tested?

    1. Add Conti url to config.yaml
    2. Run docker-compose build app
    3. Run docker-compose up --abort-on-container-exit
    4. Conti results are pushed again in the database

    Checklist for a new scraper (delete if N/A)

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc. (there was one logging.debug there, I left it as it was)
    • [x] The data going into the DB is properly parsed and is accurate
    opened by biligonzales 0
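    For context, a minimal sketch of the approach this PR describes: pulling the victim list out of an inline JavaScript variable instead of HTML elements. The variable name newsList comes from the PR itself; the regex and the assumption that the array parses as JSON are illustrative:

    import json
    import re

    from bs4 import BeautifulSoup

    def extract_news_list(html: str) -> list:
        soup = BeautifulSoup(html, "html.parser")
        for script in soup.find_all("script"):
            # look for an inline assignment like: newsList = [ ... ];
            m = re.search(r"newsList\s*=\s*(\[.*?\])\s*;", script.text, re.DOTALL)
            if m:
                return json.loads(m.group(1))  # only works if the array is valid JSON
        raise RuntimeError("newsList not found; site layout likely changed")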
  • Lockbit scraper fixed (now uses playwright) #74

    Describe the changes

    Lockbit 2.0 now uses a DDoS protection mechanism, so the regular HTTP GET method no longer works.

    As a workaround, I have implemented page retrieval with Microsoft's Playwright library, which behaves as if a real browser made the request.

    Summary of the changes:

    1. lockbit.py: replaced the use of requests by playwright
    2. requirements.txt: added playwright
    3. Dockerfile: added playwright chromium support as well as required libraries.

    I have also upgraded the base image at the top of the Dockerfile from python3.9-buster to python3.10-bullseye.

    Related issue(s)

    It fixes Issue #74

    Note that the scraping engine for lockbit has been left untouched, as it still works perfectly; only the web page retrieval method has been altered.

    How was it tested?

    • [x] docker-compose build app
    • [x] docker-compose up --abort-on-container-exit
    • [x] Checked that Lockbit entries have been inserted into the database
    opened by biligonzales 3
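    A minimal sketch of the retrieval approach this PR describes, using Playwright's synchronous API (the wait condition is illustrative, and routing through Tor would additionally need a proxy argument to launch()):

    from playwright.sync_api import sync_playwright

    def fetch_rendered_html(url: str) -> str:
        # a real Chromium instance executes the site's JavaScript,
        # which gets past protection pages that block plain HTTP GETs
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page()
            page.goto(url, wait_until="networkidle")
            html = page.content()
            browser.close()
        return html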
  • new victims monitoring is broken, alert only when sites are down

    Describe the bug: The app doesn't alert when new victims are added to the ransom sites (we noticed that new victims are being added on some of the sites). We only get alerts when the sites are down.

    Expected behavior: The app alerts when new victims are added to the ransom sites being monitored.

    Logs

    Starting ransomwatch_proxy_1 ... done
    Starting ransomwatch_app_1 ... done
    Attaching to ransomwatch_proxy_1, ransomwatch_app_1
    proxy_1 | Feb 07 14:50:31.819 [notice] Tor 0.4.5.7 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1i, Zlib 1.2.11, Liblzma 5.2.5, Libzstd 1.4.5 and Unknown N/A as libc.
    proxy_1 | Feb 07 14:50:31.822 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://www.torproject.org/download/download#warning
    proxy_1 | Feb 07 14:50:31.822 [notice] Read configuration file "/etc/tor/torrc".
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Socks listener on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Socks listener connection (ready) on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Control listener on 0.0.0.0:9051
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Control listener connection (ready) on 0.0.0.0:9051
    app_1   | 2022/02/07 14:50:33 [INFO] Initializing
    app_1   | 2022/02/07 14:50:33 [INFO] Found 30 sites
    app_1   | 2022/02/07 14:50:33 [INFO] Starting process for Avaddon
    app_1   | 2022/02/07 14:50:33 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:50:33 [INFO] Starting process for Conti
    app_1   | 2022/02/07 14:50:38 [INFO] Scraping victims
    app_1   | 2022/02/07 14:51:48 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:51:48 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:51:48 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:51:48 [INFO] Finished Conti
    app_1   | 2022/02/07 14:51:48 [INFO] Starting process for DarkSide
    app_1   | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:51:48 [INFO] Starting process for REvil
    app_1   | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:51:48 [INFO] Starting process for Babuk
    app_1   | 2022/02/07 14:51:50 [INFO] Scraping victims
    app_1   | 2022/02/07 14:51:51 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:51:51 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:51:51 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:51:51 [INFO] Finished Babuk
    app_1   | 2022/02/07 14:51:51 [INFO] Starting process for Ranzy
    app_1   | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:51:51 [INFO] Starting process for Astro
    app_1   | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:51:51 [INFO] Starting process for Pay2Key
    app_1   | 2022/02/07 14:51:53 [INFO] Scraping victims
    app_1   | 2022/02/07 14:51:54 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:51:54 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:51:54 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:51:54 [INFO] Finished Pay2Key
    app_1   | 2022/02/07 14:51:54 [INFO] Starting process for Cuba
    app_1   | 2022/02/07 14:51:57 [INFO] This is the first scrape for Cuba, no victim notifications will be sent
    app_1   | 2022/02/07 14:51:57 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:08 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:08 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:08 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:08 [INFO] Finished Cuba
    app_1   | 2022/02/07 14:52:08 [INFO] Starting process for RansomEXX
    app_1   | 2022/02/07 14:52:10 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:13 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:13 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:13 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:13 [INFO] Finished RansomEXX
    app_1   | 2022/02/07 14:52:13 [INFO] Starting process for Mount
    app_1   | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:13 [INFO] Starting process for Ragnarok
    app_1   | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:13 [INFO] Starting process for Ragnar
    app_1   | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:13 [INFO] Starting process for Suncrypt
    app_1   | 2022/02/07 14:52:15 [INFO] This is the first scrape for Suncrypt, no victim notifications will be sent
    app_1   | 2022/02/07 14:52:15 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:17 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:17 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:17 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:17 [INFO] Finished Suncrypt
    app_1   | 2022/02/07 14:52:17 [INFO] Starting process for Everest
    app_1   | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:17 [INFO] Starting process for Nefilim
    app_1   | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:17 [INFO] Starting process for Cl0p
    app_1   | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:17 [INFO] Starting process for Pysa
    app_1   | 2022/02/07 14:52:19 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:23 [WARNING] couldn't parse timestamp: 00/00/00
    app_1   | /usr/local/lib/python3.9/site-packages/dateparser/date_parser.py:35: PytzUsageWarning: The localize method is no longer necessary, as this time zone supports the fold attribute (PEP 495). For more details on migrating to a PEP 495-compliant implementation, see https://pytz-deprecation-shim.readthedocs.io/en/latest/migration.html
    app_1   |   date_obj = stz.localize(date_obj)
    app_1   | 2022/02/07 14:52:24 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:24 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:24 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:24 [INFO] Finished Pysa
    app_1   | 2022/02/07 14:52:24 [INFO] Starting process for Hive
    app_1   | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:24 [INFO] Starting process for Lockbit
    app_1   | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:24 [INFO] Starting process for Xing
    app_1   | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:24 [INFO] Starting process for Lorenz
    app_1   | 2022/02/07 14:52:26 [INFO] This is the first scrape for Lorenz, no victim notifications will be sent
    app_1   | 2022/02/07 14:52:26 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:27 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:27 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:27 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:27 [INFO] Finished Lorenz
    app_1   | 2022/02/07 14:52:27 [INFO] Starting process for ElCometa
    app_1   | 2022/02/07 14:52:27 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:27 [INFO] Starting process for Arvin
    app_1   | 2022/02/07 14:52:30 [INFO] This is the first scrape for Arvin, no victim notifications will be sent
    app_1   | 2022/02/07 14:52:30 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:33 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:33 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:33 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:33 [INFO] Finished Arvin
    app_1   | 2022/02/07 14:52:33 [INFO] Starting process for Blackmatter
    app_1   | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:33 [INFO] Starting process for Avoslocker
    app_1   | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:33 [INFO] Starting process for LV
    app_1   | 2022/02/07 14:52:35 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:37 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:37 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:37 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:37 [INFO] Finished LV
    app_1   | 2022/02/07 14:52:37 [INFO] Starting process for Marketo
    app_1   | 2022/02/07 14:52:37 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:37 [INFO] Starting process for LockData
    app_1   | 2022/02/07 14:52:40 [INFO] Scraping victims
    app_1   | 2022/02/07 14:52:42 [INFO] There are 0 new victims
    app_1   | 2022/02/07 14:52:42 [INFO] Identifying removed victims
    app_1   | 2022/02/07 14:52:42 [INFO] There are 0 removed victims
    app_1   | 2022/02/07 14:52:42 [INFO] Finished LockData
    app_1   | 2022/02/07 14:52:42 [INFO] Starting process for Rook
    app_1   | 2022/02/07 14:52:42 [WARNING] No URL found in config for this actor, skipping
    app_1   | 2022/02/07 14:52:42 [INFO] Finished all sites, exiting

    Environment

    • OS: Ubuntu 20.04.3
    • How you are running it: Docker with cronjob
    opened by Deventual 1
  • Victim removal detection doesn't work properly when onion changes

    Victim removal detection currently keys off the full URL, which includes the onion domain. One side effect of this is that whenever the onion address for a site changes, all of the victims are considered removed and new on the next scrape, which is problematic.

    Change this to just use the URI + site ID (a rough sketch follows below).

    bug 
    opened by captainGeech42 0
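    A minimal sketch of the normalization this issue proposes, keying victims on site ID plus URL path rather than the full onion URL (the function name is illustrative):

    from urllib.parse import urlparse

    def victim_key(site_id: int, victim_url: str) -> str:
        # drop the scheme and onion hostname so a rotated onion address
        # doesn't make every existing victim look removed-and-new
        path = urlparse(victim_url).path
        return f"{site_id}:{path}"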
  • LOCKBIT 2.0 Support

    Site Info (no URL): LOCKBIT 2.0 was released some time ago. It should be confirmed whether the scraper works with the new site, or the module should be rewritten.

    Is the site currently online? Yes

    opened by wersas1 5
Releases (v1.2)
  • v1.2 (Dec 4, 2021)

    This release fixes a few different bugs on the following scrapers:

    • Ragnar
    • Lorenz
    • Pysa
    • Arvin

    What's Changed

    • fixed #79 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/80
    • fixed #76 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/81
    • fixed #77, changed dateparsing to use lib by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/82
    • changed arvin date parsing to use lib (fixes #75) by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/83

    Full Changelog: https://github.com/captainGeech42/ransomwatch/compare/v1.1...v1.2

  • v1.1 (Dec 2, 2021)

    Ransomwatch v1.1

    This release adds support for many new sites, and has a critical security update. For details on the security update, see here.

    Supported Sites

    This release supports the following shame sites:

    • Conti
    • Sodinokibi/REvil
    • Pysa
    • Avaddon
    • DarkSide
    • CL0P
    • Nefilim
    • Mount Locker
    • Suncrypt
    • Everest
    • Ragnarok
    • Ragnar_Locker
    • BABUK LOCKER
    • Pay2Key
    • Cuba
    • RansomEXX
    • Ranzy Locker
    • Astro Team
    • BlackMatter
    • Arvin
    • El_Cometa
    • Lorenz
    • Xing
    • Lockbit
    • AvosLocker
    • LV
    • Marketo
    • Lockdata
  • v1.0 (Apr 18, 2021)

    v1.0 Ransomwatch Release

    This initial version of Ransomwatch supports the following sites:

    • Conti
    • REvil/Sodinokibi
    • Avaddon
    • DarkSide

    This release supports notifying via:

    • Slack Webhooks

    More sites/notification capabilities will be added over time. However, this release has been tested in a production capacity and should be suitable to start collections.

    If you find any bugs or run across any problems, please open an issue to help improve Ransomwatch.

Owner
Zander Work
@osusec / @OSU-SOC