update contributing

pull/1808/head
alpharush 2 years ago
parent 08f865766c
commit 0dd62b76cb
1 changed file: CONTRIBUTING.md (71 lines changed)

@@ -7,7 +7,7 @@ If you're unsure where to start, we recommend our [`good first issue`](https://g
Bug reports and feature suggestions can be submitted to our issue tracker. For bug reports, attaching the contract that caused the bug will help us in debugging and resolving the issue quickly. If you find a security vulnerability, do not open an issue; email opensource@trailofbits.com instead.
## Questions
Questions can be submitted to the issue tracker, but you may get a faster response if you ask in our [chat room](https://empireslacking.herokuapp.com/) (in the #ethereum channel).
Questions can be submitted to the "Discussions" page, and you may also join our [chat room](https://empireslacking.herokuapp.com/) (in the #ethereum channel).
## Code
Slither uses the pull request contribution model. Please make an account on GitHub, fork this repo, and submit code contributions via pull request. For more documentation, look [here](https://guides.github.com/activities/forking/).
@@ -41,7 +41,7 @@ A code walkthrough is available [here](https://www.youtube.com/watch?v=EUl3UlYSl
## Development Environment
Instructions for installing a development version of Slither can be found in our [wiki](https://github.com/crytic/slither/wiki/Developer-installation).
To run the unit tests, you need to clone this repository and run `pip install ".[dev]"`.
To run the unit tests, you need to clone this repository and run `make test`. Run a specific test with `make test TESTS=$test_name`.
### Linters
@@ -49,37 +49,54 @@ Several linters and security checkers are run on the PRs.
To run them locally in the root dir of the repository:
- `pylint slither tests --rcfile pyproject.toml`
- `black . --config pyproject.toml`
- `make lint`
> Note, this only validates but does not modify the code.
To automatically reformat the code:
- `make reformat`
We use pylint `2.13.4` and black `22.3.0`.
### Detectors tests
### Testing
Slither's test suite is divided into three categories: end-to-end (`tests/e2e`), unit (`tests/unit`), and tools (`tests/tools`).
How do I know what kind of test(s) to write?
- End-to-end: functionality that requires invoking `Slither` and inspecting some output, such as printers and detectors.
- Unit: additions and modifications to objects should be accompanied by a unit test that defines the expected behavior. Aim to write functions in as pure a way as possible so that they are easier to test (a minimal sketch follows this list).
- Tools: tools built on top of Slither (`slither/tools`) but not a part of its core functionality.
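For the unit category, here is a minimal sketch of the intended style, assuming a hypothetical pure helper `format_wei` (the function, its module, and the test path are illustrative and not part of Slither's API):

```python
# tests/unit/core/test_format_wei.py -- hypothetical path and names, for illustration only
import pytest


# Hypothetical pure helper: it depends only on its inputs, so the test needs
# no Slither object and no compiled contract.
def format_wei(value: int) -> str:
    return f"{value} wei"


@pytest.mark.parametrize(
    "value,expected",
    [(0, "0 wei"), (1, "1 wei"), (10**18, "1000000000000000000 wei")],
)
def test_format_wei(value, expected):
    assert format_wei(value) == expected
```

Because the helper is pure, the test runs with plain `pytest` and stays fast and deterministic.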
#### Adding detector tests
For each new detector, at least one regression test must be present.
- Create a test in `tests`
- Update `ALL_TEST` in `tests/test_detectors.py`
- Run `python ./tests/test_detectors.py --generate`. This will generate the json artifacts in `tests/expected_json`. Add the generated files to git.
- If updating an existing detector, identify the respective json artifacts and then delete them, or run `python ./tests/test_detectors.py --overwrite` instead.
- Run `pytest ./tests/test_detectors.py` and check that everything worked.
To see the tests coverage, run `pytest tests/test_detectors.py --cov=slither/detectors --cov-branch --cov-report html`.
To run tests for a specific detector, run `pytest tests/test_detectors.py -k ReentrancyReadBeforeWritten` (the detector's class name is the argument).
To run tests for a specific version, run `pytest tests/test_detectors.py -k 0.7.6`.
The IDs of tests can be inspected using `pytest tests/test_detectors.py --collect-only`.
### Parser tests
- Create a test in `tests/ast-parsing`
- Run `python ./tests/test_ast_parsing.py --compile`. This will compile the artifact in `tests/ast-parsing/compile`. Add the compiled artifact to git.
- Run `python ./tests/test_ast_parsing.py --generate`. This will generate the json artifacts in `tests/ast-parsing/expected_json`. Add the generated files to git.
- Run `pytest ./tests/test_ast_parsing.py` and check that everything worked.
To see the tests coverage, run `pytest tests/test_ast_parsing.py --cov=slither/solc_parsing --cov-branch --cov-report html`
To run tests for a specific test case, run `pytest tests/test_ast_parsing.py -k user_defined_value_type` (the filename is the argument).
To run tests for a specific version, run `pytest tests/test_ast_parsing.py -k 0.8.12`.
To run tests for a specific compiler json format, run `pytest tests/test_ast_parsing.py -k legacy` (can be legacy or compact).
The IDs of tests can be inspected using `pytest tests/test_ast_parsing.py --collect-only`.
1. Create a test in `tests/e2e/detectors`
2. Update `ALL_TEST` in `tests/e2e/detectors/test_detectors.py` (a sketch of an entry is shown after these steps)
3. Run `python tests/e2e/detectors/test_detectors.py --generate`. This will generate the json artifacts in `tests/expected_json`. Add the generated files to git. If updating an existing detector, identify the respective json artifacts and then delete them, or run `python tests/e2e/detectors/test_detectors.py --overwrite` instead.
4. Run `pytest tests/e2e/detectors/test_detectors.py` and check that everything worked.
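The `ALL_TEST` entry from step 2 pairs the detector class with the Solidity file and `solc` version it should run against. A hedged sketch of the shape such an entry tends to take (the `Test` stand-in below is an assumption, not the real helper; mirror the actual class defined in `tests/e2e/detectors/test_detectors.py`):

```python
# Illustrative sketch only -- tests/e2e/detectors/test_detectors.py defines its own
# Test helper; this NamedTuple merely stands in for it to show the shape of an entry.
from typing import NamedTuple, Type

from slither.detectors import all_detectors
from slither.detectors.abstract_detector import AbstractDetector


class Test(NamedTuple):                 # stand-in, not the real class
    detector: Type[AbstractDetector]    # detector class under test
    test_file: str                      # Solidity file that triggers the detector
    solc_ver: str                       # solc version used to compile the file


ALL_TEST = [
    # One regression test per detector/contract/version combination.
    Test(all_detectors.ReentrancyReadBeforeWritten, "reentrancy-write.sol", "0.7.6"),
]
```

The detector class name and solc version shown here come from the commands in this section; the Solidity file name is a placeholder.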
> ##### Helpful commands
> - To see the tests coverage, run `pytest tests/e2e/detectors/test_detectors.py --cov=slither/detectors --cov-branch --cov-report html`.
> - To run tests for a specific detector, run `pytest tests/e2e/detectors/test_detectors.py -k ReentrancyReadBeforeWritten` (the detector's class name is the argument).
> - To run tests for a specific version, run `pytest tests/e2e/detectors/test_detectors.py -k 0.7.6`.
> - The IDs of tests can be inspected using `pytest tests/e2e/detectors/test_detectors.py --collect-only`.
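Step 3 generates json artifacts under `tests/expected_json`. When updating an existing detector, it can help to skim what each artifact asserts before deleting it. A rough sketch, assuming only that the artifacts are JSON and that findings carry a human-readable `description` string somewhere inside (the exact schema is not guaranteed here):

```python
# skim_expected_json.py -- illustrative helper, not part of the test suite.
import json
from pathlib import Path

# Assumed artifact location, taken from step 3 above.
ARTIFACT_DIR = Path("tests/expected_json")


def descriptions(node):
    """Recursively yield every 'description' string found in a JSON value."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key == "description" and isinstance(value, str):
                yield value
            else:
                yield from descriptions(value)
    elif isinstance(node, list):
        for item in node:
            yield from descriptions(item)


for artifact in sorted(ARTIFACT_DIR.glob("*.json")):
    for text in descriptions(json.loads(artifact.read_text())):
        print(f"{artifact.name}: {text.strip()}")
```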
#### Adding parsing tests
1. Create a test in `tests/e2e/solc_parsing/`
2. Run `python tests/e2e/solc_parsing/test_ast_parsing.py --compile`. This will compile the artifact in `tests/e2e/solc_parsing/compile`. Add the compiled artifact to git.
3. Run `python tests/e2e/solc_parsing/test_ast_parsing.py --generate`. This will generate the json artifacts in `tests/e2e/solc_parsing/expected_json`. Add the generated files to git.
4. Run `pytest tests/e2e/solc_parsing/test_ast_parsing.py` and check that everything worked.
> ##### Helpful commands
> - To see the tests coverage, run `pytest tests/e2e/solc_parsing/test_ast_parsing.py --cov=slither/solc_parsing --cov-branch --cov-report html`.
> - To run tests for a specific test case, run `pytest tests/e2e/solc_parsing/test_ast_parsing.py -k user_defined_value_type` (the filename is the argument).
> - To run tests for a specific version, run `pytest tests/e2e/solc_parsing/test_ast_parsing.py -k 0.8.12`.
> - To run tests for a specific compiler json format, run `pytest tests/e2e/solc_parsing/test_ast_parsing.py -k legacy` (can be legacy or compact).
> - The IDs of tests can be inspected using `pytest tests/e2e/solc_parsing/test_ast_parsing.py --collect-only`.
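The `-k` filters above can also be combined, since a reasonably recent pytest accepts `and`/`or`/`not` expressions over test names. A small sketch driving the same selection from Python instead of the shell (paths as in the steps above):

```python
# run_parsing_subset.py -- convenience sketch; `pytest -k "..."` on the command line does the same.
import sys

import pytest

# Run only the user_defined_value_type case with the compact AST export.
# A solc version (e.g. "and 0.8.12") can be appended to narrow it further.
sys.exit(
    pytest.main(
        [
            "tests/e2e/solc_parsing/test_ast_parsing.py",
            "-k", "user_defined_value_type and compact",
            "-q",
        ]
    )
)
```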
### Synchronization with crytic-compile
By default, `slither` follows either the latest version of crytic-compile released on PyPI or `crytic-compile@master` (see the dependencies in [`setup.py`](./setup.py)). If crytic-compile development introduces breaking changes, the process to update `slither` is:
