
Overview

Python Testing Crawler 🐍 🩺 🕷️


A crawler for automated functional testing of a web application

Crawling a server-side-rendered web application is a low-cost way to get low-quality test coverage of your JavaScript-light web application.

If you have only partial test coverage of your routes, but still want to protect against silly mistakes, then this is for you.

Features:

  • Selectively spider pages and resources, or just request them
  • Submit forms, and control what values to send
  • Ignore links by source using CSS selectors
  • Fail fast or collect many errors
  • Configurable using straightforward rules

Works with the test clients for Flask (including Flask-WebTest), Django, and WebTest.

Why should I use this?

Here's an example: Flaskr, the Flask tutorial application, has 166 lines of test code to achieve 100% test coverage.

Using Python Testing Crawler in a similar way to the Usage example below, we can hit 73% coverage with very little effort. Disclaimer: of course, it's not the same quality or utility of testing! But it is better than no tests, a complement to hand-written unit or functional tests, and a useful stopgap.

Installation

$ pip install python-testing-crawler

Usage

Create a crawler using your framework's existing test client, tell it where to start and what rules to obey, then set it off:

from python_testing_crawler import Crawler
from python_testing_crawler import Rule, Request, Ignore, Allow

def test_crawl_all():
    client = ...  # your framework's existing test client
    # ... any setup ...
    crawler = Crawler(
        client=client,
        initial_paths=['/'],
        rules=[
            Rule("a", '/.*', "GET", Request()),
        ]
    )
    crawler.crawl()

This will crawl all anchor links to relative addresses beginning with "/". Any exceptions encountered will be collected and presented at the end of the crawl. For more power, see the Rules section below.
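
As a sketch of what richer rules might look like, the example below ignores some links by CSS selector and tolerates a status code on certain paths. The selector, the paths, and the exact form of the Allow argument are illustrative assumptions, not defaults of the library:

rules = [
    Rule("a", '/.*', "GET", Request()),             # request every relative anchor link
    Rule("a.skip-crawl", '.*', "GET", Ignore()),    # ignore links matched by a CSS selector (hypothetical class)
    Rule("a", '/flaky/.*', "GET", Allow((404,))),   # tolerate 404s on these paths (assumed argument form)
]

crawler = Crawler(client=client, initial_paths=['/'], rules=rules)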

If you need to authorise the client's session, e.g. log in, then you should do that before creating the Crawler.
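
For example, with a Flask test client you might log in before crawling; the /auth/login route and the form fields below are assumptions for illustration:

def test_crawl_when_logged_in(client):
    # Authorise the session first; route and credentials are hypothetical.
    client.post('/auth/login', data={'username': 'test', 'password': 'test'})

    crawler = Crawler(
        client=client,
        initial_paths=['/'],
        rules=[Rule("a", '/.*', "GET", Request())],
    )
    crawler.crawl()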

It is also a good idea to create enough data, via fixtures or otherwise, so that the crawler can reach enough endpoints.
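
A minimal sketch using a pytest fixture, assuming a hypothetical /create endpoint that adds a record so that list and detail pages contain links worth following:

import pytest

@pytest.fixture
def seeded_client(client):
    # Seed one record so the crawler has detail pages to discover; route and fields are hypothetical.
    client.post('/create', data={'title': 'First post', 'body': 'Hello'})
    return client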

How do I set up a test client?

It depends on your framework:
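
As one example, a Flask application built with an application factory could provide a client like this (create_app and the config dict are assumptions about your app; Django and WebTest ship their own test clients):

from my_app import create_app  # hypothetical application factory

app = create_app({'TESTING': True})  # assumes the factory accepts a test config
client = app.test_client()           # Flask's built-in test client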

Crawler Options

Param           Description
initial_paths   list of paths/URLs to start from
rules           list of Rules to control the crawler; see below
path_attrs      list of attribute names to extract paths/URLs from; defaults to "href" -- include "src" if you also want to check resources such as <img> and <script>
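
For example, to also collect src attributes and request image sources, something like the following sketch might be used (the img rule is illustrative):

crawler = Crawler(
    client=client,
    initial_paths=['/'],
    path_attrs=["href", "src"],  # collect both href and src attributes
    rules=[
        Rule("a", '/.*', "GET", Request()),    # request anchor links
        Rule("img", '/.*', "GET", Request()),  # request image sources found via "src"
    ],
)
crawler.crawl()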