A GitHub App built with Probot to deliver notifications of toxic comments
The app listens for new or edited issues, pull requests, and comments. It sends the content of those events to a semantic analysis API that rates the content on multiple sentiment axes. If the content is rated above a threshold on any axis, a notification email is sent to humans, who can investigate and decide whether to take action.
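As a rough sketch of that flow (not the app's actual source), a Probot handler could look like the following. `analyzeContent` and `sendAlertEmail` are hypothetical helpers standing in for the analysis call described below and the notification email, and the event list and threshold are illustrative:

```js
// A minimal sketch of the event flow described above (hypothetical helpers).
module.exports = (app) => {
  app.on(
    [
      'issues.opened', 'issues.edited',
      'pull_request.opened', 'pull_request.edited',
      'issue_comment.created', 'issue_comment.edited'
    ],
    async (context) => {
      // The body of the issue, pull request, or comment that triggered the event.
      const { comment, issue, pull_request: pullRequest } = context.payload;
      const content = (comment || issue || pullRequest).body;

      // Rate the content on each sentiment axis (hypothetical helper).
      const scores = await analyzeContent(content);

      // Email humans if any axis is rated above the threshold (hypothetical helper).
      const threshold = 0.8; // see the configuration settings below
      if (Object.values(scores).some((score) => score > threshold)) {
        await sendAlertEmail(context.payload, scores);
      }
    }
  );
};
```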
This Probot app reads its configuration from two files:
- Global settings: the `.github/biohazard-alert.yml` file in the `.github` repository under the user or organization the app is installed in
- Repo-specific settings: the `.github/biohazard-alert.yml` file in the repository itself
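This two-level lookup is roughly what Probot's `context.config` provides: it reads the file from the repository that triggered the event and falls back to the owner's `.github` repository. A minimal sketch, assuming the app uses it and passing the defaults listed below:

```js
// Inside an event handler: resolve settings from .github/biohazard-alert.yml,
// falling back to the owner's .github repository and then to these defaults.
const config = await context.config('biohazard-alert.yml', {
  notifyOnError: true,
  skipPrivateRepos: true,
  threshold: 0.8
});
```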
Configuration settings are:
- `notifyOnError`: `true` means that notifications are generated when errors are encountered (default `true`)
- `skipPrivateRepos`: `true` means that events from private repositories will be ignored (default `true`)
- `threshold`: Analysis ratings higher than this number will generate notifications (default `0.8`)
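For example, a `.github/biohazard-alert.yml` that keeps the defaults for error notifications and private repositories but raises the threshold might look like this (the `0.9` value is illustrative):

```yaml
# .github/biohazard-alert.yml
notifyOnError: true      # notify when errors are encountered (default: true)
skipPrivateRepos: true   # ignore events from private repositories (default: true)
threshold: 0.9           # alert when any rating exceeds this value (default: 0.8)
```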
This app uses Google's Perspective API to analyze the content using the following models:
- `TOXICITY`
- `SEVERE_TOXICITY`
- `IDENTITY_ATTACK`
- `INSULT`
- `PROFANITY`
- `THREAT`
- `SEXUALLY_EXPLICIT`
- `FLIRTATION`
- `UNSUBSTANTIAL`
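A hedged sketch of what such an analysis call could look like, assuming a `PERSPECTIVE_API_KEY` environment variable and Node 18+ for the global `fetch` (the app's actual request options may differ):

```js
// Score a piece of text on every model listed above via the Perspective API.
const MODELS = [
  'TOXICITY', 'SEVERE_TOXICITY', 'IDENTITY_ATTACK', 'INSULT', 'PROFANITY',
  'THREAT', 'SEXUALLY_EXPLICIT', 'FLIRTATION', 'UNSUBSTANTIAL'
];

async function analyzeContent(text) {
  const url = 'https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze' +
    `?key=${process.env.PERSPECTIVE_API_KEY}`;

  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      comment: { text },
      requestedAttributes: Object.fromEntries(MODELS.map((model) => [model, {}]))
    })
  });
  const result = await response.json();

  // Map each model name to its summary score (a value between 0 and 1).
  return Object.fromEntries(
    MODELS.map((model) => [model, result.attributeScores[model].summaryScore.value])
  );
}
```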
```sh
# Install dependencies
npm install

# Build the app
npm run build

# Run the bot locally
npm run dev
```