Prophecies

Ask for help

To report a bug, please open an issue on our GitHub. To maximize the chance of getting help from our team and our community, we advise you to be as precise as possible, detailing:

  • your Operating System (Mac, Windows or Linux)

  • the version of your Operating System

  • the version of Prophecies

  • screenshots of your issue

If for confidentiality reasons you don't want to open an issue on GitHub, please write to prophecies@icij.org.

Prophecies is an open-source project, run by a non-profit organization. Our team will try its best to help you in a timely manner, but the best way to get help will always be to rely on the community (and ask your question on GitHub).



Introduction

Prophecies is a platform to fact-check data collaboratively. It is free, open-source software developed by the International Consortium of Investigative Journalists (ICIJ).

Why did we create Prophecies?

ICIJ is a non-profit leading outstanding investigations involving hundreds of reporters around the world. Each of our projects leverages the power of data and technology to investigate issues of global importance. To be able to simultaneously publish a story in so many media outlets, every single line of data we use must be fact-checked. This is, as we say internally, "our secret sauce". Some of the datasets we produce require a very heavy verification effort, sometimes with up to 4 different rounds of verification.

In order to make sure this workflow is well organized and efficient, we created Prophecies.

How can I use Prophecies?

Prophecies is a collaborative tool that must be available online to its users. Therefore, to use Prophecies you must install it on your server as a web application.

Wondering how you can install Prophecies on your server? Follow the guide!

What skills are needed to install Prophecies?

Prophecies is a Python application built on top of the Django framework. To ease installation, we publish a Docker image of the application on Docker Hub. The image is built for both ARM64 and AMD64, which ensures a wider level of compatibility.

To install Prophecies, you will need a good understanding of how Docker and Docker Compose work and how they can be used. You will also need basic knowledge of how to set up a PostgreSQL database. In this guide, we will use a database directly installed in a Docker container.

Key concepts

This page introduces the key concepts used by Prophecies.

Configure Prophecies

The application can be configured using the following environment variables:

| Environment Variable | Description | Default Value |
| --- | --- | --- |
| SECRET_KEY | Secret key for Django security | 'django-insecure-...' |
| DEBUG | Debug mode toggle | False |
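The default SECRET_KEY is intentionally insecure and must be replaced in production. One possible way to generate a random key with Python's standard library (a sketch, not an official Prophecies command):

```python
import secrets
import string

def generate_secret_key(length: int = 50) -> str:
    """Return a random string suitable for use as Django's SECRET_KEY."""
    # Alphanumerics plus a few symbols; adjust the alphabet to taste.
    alphabet = string.ascii_letters + string.digits + "!@#%^&*(-_=+)"
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print(generate_secret_key())
```

Pass the result to the container through the SECRET_KEY environment variable.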

Installation with Docker

Prerequisites

Before installing Prophecies, ensure you have the following tools installed:

  • Docker: A platform for developing, shipping, and running applications inside isolated environments known as containers. Learn more about Docker.

  • Docker Compose: A tool for defining and running multi-container Docker applications. It uses a YAML file to configure your application's services, making it easier to manage complex setups. Get started with Docker Compose.

These tools are essential for running Prophecies efficiently and are widely used in the development and deployment of modern applications.

Setup steps

1. Prepare the Working Directory

Place this docker-compose.yml file in your chosen directory. This will be the working directory for your Docker Compose setup:

2. Configure NGINX

You need to create an NGINX configuration file as referenced in your Docker Compose file (./nginx.conf). Ensure this file is correctly configured to serve your Prophecies instance. It should be set up to listen on port 9999 and serve static files from the /static/ directory:

3. Starting Services

In your working directory, run the following command:

    docker compose up -d

This command will download the necessary images (if not already downloaded), create the defined volumes, and start the services as background processes.

4. Database Migrations and Static Files

The docker-compose.yml file specifies services for running database migrations (migration) and collecting static files so they can be served by NGINX (collectstatic). These services should automatically perform these tasks when you start Docker Compose.

5. Create Admin User

The database is currently empty, without any users. To provision your first admin user, you must use the command line:

    docker compose exec -it web make createsuperuser

6. Accessing the Application

Once all services are up and running, the application should be accessible via http://localhost:9999.

7. Monitoring and Logs

To monitor the status of your Docker containers and check for any issues, you can use the Docker Compose logs command:

    docker compose logs -f

The -f flag will follow the log output, allowing you to see real-time logs as they are generated.

8. Stopping and Removing Services

To stop the services without removing them, you can run:

    docker compose stop

If you need to stop and remove all the containers, networks, and volumes associated with your Docker Compose setup, use the down command:

    docker compose down --volumes

This will clean up everything you have created and is particularly useful for starting from scratch. Please make sure you know what you're doing before running this command.

Additional Notes

  • This configuration doesn't include SSL setup for NGINX, which is recommended for production deployments.

  • Regularly check for updates to the icij/prophecies image to keep your application up to date.

  • To use a custom domain, don't forget to update the ALLOWED_HOSTS variable.

  • The application can be configured with many environment variables.

The remaining environment variables, continuing the table from the Configure Prophecies section:

| Environment Variable | Description | Default Value |
| --- | --- | --- |
| TEMPLATE_DEBUG | Debug mode for templates | Value of DEBUG |
| ALLOWED_HOSTS | Host/domain names that Django can serve | 'localhost,127.0.0.1' |
| CSRF_TRUSTED_ORIGINS | Trusted origins for CSRF protection | 'http://localhost:8080' |
| STATIC_URL | URL for static files | '/static/' |
| STATIC_ROOT | Directory for collectstatic | Project root + 'run/static' |
| MEDIA_STORAGE | Type of media storage | 'FS' |
| MEDIA_ROOT | Directory for user-uploaded files | '/media/' for S3, Project root + 'run/media' for FS |
| MEDIA_URL | URL for media files | 'https://{AWS_S3_CUSTOM_DOMAIN}/' for S3, '/media/' for FS |
| AWS_ACCESS_KEY_ID | AWS access key ID for S3 | None |
| AWS_SECRET_ACCESS_KEY | AWS secret access key for S3 | None |
| AWS_STORAGE_BUCKET_NAME | AWS storage bucket name for S3 | None |
| AWS_S3_REGION_NAME | AWS S3 region name | None |
| AWS_S3_SIGNATURE_VERSION | AWS S3 signature version | 's3v4' |
| AWS_QUERYSTRING_EXPIRE | AWS query string expiration time | '3600' |
| DATABASE_URL | Database URL for Django-environ to parse | 'sqlite:///{root.path("db.sqlite3")()}' |
| USE_X_FORWARDED_HOST | Use X-Forwarded-Host header | Value of DEBUG |
| SOCIAL_AUTH_URL_NAMESPACE | Namespace for social auth URLs | None |
| SOCIAL_AUTH_LOGIN_URL | Login URL for social auth | '/login/provider/' |
| SOCIAL_AUTH_PROVIDER_HOSTNAME | Hostname for OAuth provider | None |
| SOCIAL_AUTH_PROVIDER_KEY | Key for OAuth provider | None |
| SOCIAL_AUTH_PROVIDER_SECRET | Secret for OAuth provider | None |
| SOCIAL_AUTH_PROVIDER_PROFILE_URL | Profile URL for OAuth provider | None |
| SOCIAL_AUTH_PROVIDER_AUTHORIZATION_URL | Authorization URL for OAuth provider | None |
| SOCIAL_AUTH_PROVIDER_ACCESS_TOKEN_URL | Access token URL for OAuth provider | None |
| SOCIAL_AUTH_PROVIDER_ACCESS_TOKEN_METHOD | Access token method for OAuth provider | 'POST' |
| SOCIAL_AUTH_PROVIDER_USERNAME_FIELD | Username field for OAuth provider | 'username' |
| SOCIAL_AUTH_PROVIDER_GROUPS_FIELD | Groups field for OAuth provider | 'groups' |
| SOCIAL_AUTH_PROVIDER_STAFF_GROUP | Staff group for OAuth provider | 'staff' |
| CONSTANCE_PUBLIC_KEYS | Comma-separated public keys for dynamic settings | 'loginUrl' |
| DJANGO_ADMIN_LOGIN | Toggle for Django Admin native login | Value of DEBUG |
| CACHE_URL | URL for cache backend | 'filecache:///code/prophecies/run/cache/' |
| DJANGO_LOG_LEVEL | Logging level for Django | 'INFO' |
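DATABASE_URL follows the usual scheme://user:password@host/name convention that django-environ parses. As an illustration of how such a URL decomposes (a standard-library sketch, not Prophecies code — split_database_url is a hypothetical helper):

```python
from urllib.parse import urlsplit

def split_database_url(url: str) -> dict:
    """Break an environ-style database URL into its components."""
    parts = urlsplit(url)
    return {
        "engine": parts.scheme,          # e.g. postgres, sqlite
        "user": parts.username,
        "password": parts.password,
        "host": parts.hostname,
        "name": parts.path.lstrip("/"),  # database name
    }

# The URL used by the Docker Compose example in this guide:
print(split_database_url("postgres://postgres:postgres@db/prophecies"))
```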

Project

A group of tasks to verify. For instance, "Pandora Papers" is a project.

Task

A list of records that must be verified under a specific set of rules. For instance, a list of "Paintings locations" to check. A task can have many options, including the type of form to use, the type of choices offered, the number of rounds of checks, etc.

Task Record

The actual records to check. Those records are always composed of an "original value" and a "predicted value". Checkers must verify whether the predicted value is correct. Each record can be identified uniquely with an optional uid.

Task Record Review

The assignment of a record to a checker. This can contain the result of the check ("correct", "incorrect", etc.) as well as the alternative value proposed by the checker.

Checker

The user in charge of checking a task record. The list of checkers is defined when creating the task.

Choice Group

The list of options presented to the checker when reviewing a record, for instance "Correct", "Incorrect" and "I don't know". An option can be marked as "requiring an alternative value". For instance, if you pick "Incorrect" when reviewing an address, you might have to select the correct country.

Alternative Value

A list of alternative values that can be associated with a choice group. For instance, a list of countries and their ISO3 codes.
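The relationships between these concepts can be sketched with plain Python classes (illustrative only, not the actual Django models — all class and field names here are assumptions):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChoiceGroup:
    # e.g. ["Correct", "Incorrect", "I don't know"]
    choices: List[str]
    # Alternative values a choice may require, e.g. country ISO3 codes
    alternative_values: List[str] = field(default_factory=list)

@dataclass
class TaskRecord:
    original_value: str
    predicted_value: str
    uid: Optional[str] = None  # optional unique identifier

@dataclass
class TaskRecordReview:
    record: TaskRecord
    checker: str                     # the user assigned to this record
    choice: Optional[str] = None     # e.g. "correct" or "incorrect"
    alternative_value: Optional[str] = None

@dataclass
class Task:
    name: str
    choice_group: ChoiceGroup
    checkers: List[str]
    records: List[TaskRecord] = field(default_factory=list)

@dataclass
class Project:
    name: str                        # e.g. "Pandora Papers"
    tasks: List[Task] = field(default_factory=list)
```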

The docker-compose.yml file used in this guide:

    version: '3'

    volumes:
      shared_static: {}
      postgresql_data: {}

    services:
      nginx:
        image: nginx:latest
        ports:
          - "9999:9999"
        volumes:
          - ./nginx.conf:/etc/nginx/conf.d/default.conf
          - shared_static:/static/:ro
        depends_on:
          - web

      web:
        image: icij/prophecies:0.5.9
        environment:
          - ALLOWED_HOSTS=0.0.0.0,localhost,web
          - CSRF_TRUSTED_ORIGINS=http://localhost:9999
          - DATABASE_URL=postgres://postgres:postgres@db/prophecies
          - DEBUG=false
          - DJANGO_SETTINGS_MODULE=prophecies.settings.production
          - DJANGO_ADMIN_LOGIN=true
          - PROPHECIES_APP_LOGIN_URL=/admin/login/?next=/
          - PORT=8008
          - WAIT_HOSTS_TIMEOUT=60
          - WAIT_HOSTS=db:5432
        volumes:
          - shared_static:/code/prophecies/run/static
        depends_on:
          - db
          - migration
          - collectstatic
        expose:
          - "8008"

      migration:
        image: icij/prophecies:0.5.9
        command: sh -c '/usr/bin/wait && poetry run python manage.py migrate --noinput'
        environment:
          - WAIT_HOSTS=db:5432
          - WAIT_HOSTS_TIMEOUT=60
          - DATABASE_URL=postgres://postgres:postgres@db/prophecies
        depends_on:
          - db

      collectstatic:
        image: icij/prophecies:0.5.9
        command: poetry run python manage.py collectstatic --noinput
        volumes:
          - shared_static:/code/prophecies/run/static

      db:
        image: postgres:16
        restart: always
        environment:
          POSTGRES_PASSWORD: postgres
          POSTGRES_USER: postgres
          POSTGRES_DB: prophecies
        command: postgres -c shared_preload_libraries=pg_stat_statements -c 'pg_stat_statements.track=all'
        volumes:
          - postgresql_data:/var/lib/postgresql/data
The nginx.conf file used in this guide:

    upstream web {
      ip_hash;
      server web:8008;
    }

    server {
        listen 9999;
        server_name localhost;

        location /static/ {
            autoindex on;
            alias /static/;
        }

        location / {
            proxy_pass http://web/;
        }
    }
    


Testing the backend

Once you have set up Prophecies from the sources, simply launch Pytest with this shortcut:

    make test-back

This command triggers the testing process, where make is a build automation tool used to manage tasks like compiling the code or running tests, and test-back is a target defined within the Makefile.

Upon execution, Pytest will run all the specified tests in your test suite. You can monitor the output in your terminal to check the progress of the tests. If your code passes all the tests, you should see a success message indicating that everything is working as intended. Any failed tests will be reported accordingly, allowing you to make the necessary adjustments.
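make test-back wraps Pytest, so individual tests follow standard Pytest conventions: plain functions whose assert statements Pytest collects and runs. A hypothetical test of that shape (normalize_value is an invented helper, not part of the Prophecies code base):

```python
# A Pytest-style unit test: plain functions whose assert statements
# are collected and run by `pytest` (which `make test-back` wraps).

def normalize_value(value: str) -> str:
    """Hypothetical helper: trim and lowercase a record value."""
    return value.strip().lower()

def test_normalize_value():
    assert normalize_value("  Paris ") == "paris"
    assert normalize_value("LONDON") == "london"
```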

Architecture overview

API

The API is built with Django Rest Framework and implements the key concepts of Prophecies.

Stack

Prophecies is based on the following open-source tools:

• Python 3.9, 3.10

• Node 16, 18

• Django 4.2

• Django Rest Framework 3.14

• Django Rest Framework JSON:API 6.1

• Pytest

• Vue 2.7

• Vue CLI 5

• Vuex ORM 0.36

• Jest 28

• Poetry

• Yarn

Interaction workflow

This sequence diagram illustrates the communication flow within the Prophecies platform infrastructure, focusing on how a request travels from the user interface to the database and back.

Database schema

Here is a simplified version of the database schema.

Publishing a new Docker image

This documentation provides a comprehensive guide on how to release a new Docker image and how to build and test the application locally using the provided Makefile commands.

Docker image tag

The Docker image is tagged with the version extracted from the git tag (e.g., icij/prophecies:1.0.0). Additionally, the image is tagged as latest.

• Image Repository: icij/prophecies

• Platforms: linux/arm64, linux/amd64

Releasing a new Docker Image

1. Update Version

The version can be updated as major, minor, or patch using the Makefile. Run one of the following commands depending on the type of version bump you need:

    make major
    make minor
    make patch

These commands automatically update the version in pyproject.toml for the Python backend and in package.json in the frontend directory, then commit these changes and tag the version in Git.
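The semantics of these bumps follow semantic versioning. A sketch of the bump logic (illustrative; not the Makefile's actual implementation):

```python
def bump(version: str, part: str) -> str:
    """Return a MAJOR.MINOR.PATCH version bumped as 'major', 'minor' or 'patch'."""
    major, minor, patch = (int(n) for n in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part}")
```

For instance, `make patch` on version 0.5.9 would produce 0.5.10.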

2. Push the Changes and Tag to Git

After updating the version and creating a git tag, push the changes and the tag to your git repository:

    git push && git push --tags

Pushing the tag is crucial because the GitHub Actions workflow for Docker image publishing is triggered by a push to tags that follow the semantic versioning format, prefixed with v (e.g., v1.0.0).
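The trigger condition — tags matching semantic versioning prefixed with v — can be expressed as a pattern like this (an illustration, not the actual workflow configuration):

```python
import re

# Tags such as v1.0.0 trigger the Docker publishing workflow;
# this pattern mirrors that "v" + semantic version convention.
SEMVER_TAG = re.compile(r"^v\d+\.\d+\.\d+$")

def is_release_tag(tag: str) -> bool:
    return bool(SEMVER_TAG.match(tag))
```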

Build and test locally

A convenience docker-compose.yml file is located at the root of the repository. To build and test Prophecies locally with Docker, run the following from the app's root directory:

    docker compose up

Installation from the sources

This guide explains how to set up Prophecies from the sources.

Prerequisites

This section describes how to install Prophecies for development. This environment disables many security settings provided by Django and isn't meant to be used for internet-facing instances. For that purpose, please follow the installation guide with Docker.

• Python 3.9

• Node 16.x

• Poetry >= 1.2

• Yarn 1.x

• Git

Setup steps

Checkout the repository with git:

    git clone git@github.com:ICIJ/prophecies.git
    cd prophecies/

After entering the directory, set up a virtualenv with Poetry and install the required packages:

    make install

To set up the database (using SQLite3 by default):

    make migrate

To create a superuser:

    make createsuperuser

For more customization, this app utilizes 12factor-inspired environment variables to configure your Django application. You can create a .env file using the custom settings variables:

    DEBUG=on
    DATABASE_URL=
    CACHE_URL=dummycache://
    STATIC_URL=/static/
    SOCIAL_AUTH_PROVIDER_KEY=
    SOCIAL_AUTH_PROVIDER_SECRET=
    SOCIAL_AUTH_PROVIDER_HOSTNAME=http://localhost:3001
    SOCIAL_AUTH_PROVIDER_USERNAME_FIELD=uid
    SOCIAL_AUTH_PROVIDER_GROUPS_FIELD=groups_by_applications.prophecies
    SOCIAL_AUTH_PROVIDER_STAFF_GROUP=icijstaff

The application can be configured with many environment variables.

Testing the frontend

Once you have set up Prophecies from the sources, simply launch Jest with this shortcut:

    make test-front

This command triggers the testing process, where make is a build automation tool used to manage tasks like compiling the code or running tests, and test-front is a target defined within the Makefile.

Upon execution, Jest will run all the specified tests in your test suite. You can monitor the output in your terminal to check the progress of the tests. If your code passes all the tests, you should see a success message indicating that everything is working as intended. Any failed tests will be reported accordingly, allowing you to make the necessary adjustments.

Mock Service Worker

This application utilizes Mock Service Worker (MSW) to mock API results from the backend. All server requests are defined within handlers that you can find in this directory:

    $ tree prophecies/apps/frontend/tests/unit/mocks/handlers
    │
    ├── action-aggregate.js
    ├── actions.js
    ├── choice-groups.js
    ├── settings.js
    ├── task-record-media.js
    ├── task-record-reviews.js
    ├── task-records.js
    ├── tasks.js
    ├── task-user-choice-statistics.js
    ├── task-user-statistics.js
    ├── tips.js
    ├── user-notification.js
    └── users.js

• Explore the API: use this page to explore the latest specifications of the API.

• Download the API schema: this schema follows the OpenAPI 3 specification. You can load this file in many OpenAPI clients, including Swagger or Redoc.

• JSON:API specification: the JSON:API spec is a standard for structuring JSON responses in APIs to optimize data exchange and efficiency.
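For illustration, a JSON:API document wraps each resource in type / id / attributes keys, roughly as below (a generic sketch built as a Python dict; the resource type and attribute names are hypothetical, not an actual Prophecies response):

```python
# A minimal JSON:API-shaped document, built as a Python dict.
# The resource type and attribute names here are hypothetical.
document = {
    "data": [
        {
            "type": "task-records",
            "id": "1",
            "attributes": {
                "original-value": "Paris",
                "predicted-value": "paris",
            },
        }
    ],
    "meta": {"count": 1},
}

# Every resource object carries "type" and "id" alongside its attributes.
assert all({"type", "id", "attributes"} <= res.keys() for res in document["data"])
```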

Continuous integration

Prophecies' GitHub Actions workflow is designed to automate the process of linting, testing, and publishing Docker images. This workflow triggers on every push to the repository and is structured into several jobs to ensure code quality and reliability before a new Docker image is published.

Jobs Overview

The workflow consists of six main jobs.

lint-backend

• Purpose: Lints the backend code to ensure it adheres to Python coding standards.

• Python Version: Runs with Python 3.9.

• Command: Uses pylint to lint the prophecies directory.

test-backend

• Purpose: Executes backend tests to verify functionality.

• Python Version: Tests are run on both Python 3.9 and 3.10.

• Command: Runs backend tests using the make test-back command.

test-frontend

• Purpose: Conducts frontend tests to ensure UI/UX integrity.

• Node Version: Tests are conducted on Node.js versions 16.x and 18.x.

• Command: Frontend tests are executed using the make test-front command.

build

• Purpose: Builds artifacts to be published with the next release.

• Command: Runs a Python command to extract the OpenAPI schema file and store it.

• Conditions: This job runs only if the push event is a tag push starting with 'v', and it depends on the successful completion of the lint-backend, test-backend, and test-frontend jobs.

package-publish

• Purpose: Restores artifacts from the build job to publish them as a GitHub Release.

• Conditions: This job runs only if the push event is a tag push starting with 'v', and it depends on the successful completion of the build job.

docker-publish

• Purpose: Builds and pushes the Docker image to Docker Hub.

• Conditions: This job runs only if the push event is a tag push starting with 'v', and it depends on the successful completion of the build job.

• Steps: This job includes checking out the code, setting up QEMU and Docker Buildx, logging into Docker Hub, preparing the tag name based on the git tag, and finally building and pushing the Docker image.

Learn more about how to trigger the publication of a new Docker image.

Secrets

To push to Docker Hub, this workflow must use the following secrets:

• DOCKERHUB_USERNAME: The username for Docker Hub authentication.

• DOCKERHUB_PASSWORD: The password or access token for Docker Hub authentication.
