This is a web application for managing and building stories based on tips solicited from the public. The project is meant to be easy to set up for non-programmers, intuitive to use, and highly extendable.
Here are a few use cases:
The project is broken up into several components:
We have a GitBook with a full user guide that covers running Collaborate, importing and refining data, and setting up Google services. You can read the documentation here.
Collaborate has built-in support for one-click installs on both Google Cloud and Heroku. During the setup process for either deployment, make sure to fill in the email, username, and password fields so you can log in.
The Heroku deploy button will create a small, free-tier Collaborate system: a small web server and a database that supports roughly 10k-10M records (depending on record size). Scheduled data re-importing is configured automatically.
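If you have the Heroku CLI installed, you can check on the deployed system from your terminal. This is a sketch; the app name `my-collaborate` is a placeholder for whatever you named your deploy:

```shell
# "my-collaborate" is a placeholder; substitute your Heroku app's name
heroku logs --tail --app my-collaborate

# create an additional admin account on the deployed instance
heroku run python manage.py createsuperuser --app my-collaborate
```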
The Google Cloud Run button launches Collaborate into the Google Cloud environment. This deploy requires you to set up a Google Project, enable Google Cloud billing, and enable the Cloud Run API. Full setup instructions are here.
This deploy does not automatically configure scheduled re-importing, but you can add it via Cloud Scheduler by following these instructions.
Once you've deployed your Cloud Run instance, you can manage it from the Google Developers Console.
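If you prefer the terminal to the console, the gcloud CLI can also inspect the running service. The service name and region below are assumptions; substitute your own:

```shell
# "collaborate" and "us-central1" are placeholders for your service name and region
# (assumes gcloud is installed and authenticated against your project)
gcloud run services list --region us-central1
gcloud run services describe collaborate --region us-central1 --format 'value(status.url)'
```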
Getting the system set up and running locally begins with cloning this repository and installing the Python dependencies. Python 3.6 or 3.7 and Django 2.2 are assumed here.
```shell
# a virtual environment is recommended
mkvirtualenv -p /path/to/python3.7 collaborative
# install python dependencies
pip install -r requirements.txt
```
Assuming everything worked, let's bootstrap and then start the local server:
```shell
# get the database ready
python manage.py migrate
# create a default admin account
python manage.py createsuperuser
# gather up django and collaborate assets
python manage.py collectstatic --noinput
# start the local application
python manage.py runserver
```
You can then access the application at http://localhost:8000 and log in with the credentials you chose in the createsuperuser step (above). Logging in will bring you to a configuration wizard where you will add your first Google Sheet and import its contents.
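As a quick sanity check that the local server came up, you can hit the root URL and look at the response code (this assumes the runserver command from the previous step is still running):

```shell
# expect a 200 (or a 302 redirect to the login page) if the app is serving
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/
```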
A Dockerfile (the same one used by the Google Cloud Run deploy) can be found here:
This creates a basic production environment with nginx and gunicorn. By
default, it uses SQLite3, but you can configure the database by adding a
DATABASE_URL environment variable. You can read more about the format
for this variable here.
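As an illustration, a DATABASE_URL for a Postgres database looks like the sketch below; the credentials, host, and database name are placeholders:

```shell
# placeholder credentials/host; set this in the container's environment
export DATABASE_URL="postgres://collab:s3cret@db.example.com:5432/collaborate"
echo "$DATABASE_URL"
```

Leaving the variable unset keeps the default SQLite3 database.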
We also included a configuration script for plain Nginx deploys here:
This can be copied to your main Nginx sites configuration directory (e.g.,
To get auto-updating data sources, make sure to add a cron job that runs the following command:
There's an example cron file that, when added to your system's cron configuration, will update data every 15 minutes:
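The entry might look like the sketch below; the install path, user, and management command name are all assumptions, so check the example cron file in the repository for the exact command:

```
# every 15 minutes, as user www-data; the path and command name are hypothetical
*/15 * * * * www-data cd /path/to/collaborative && python manage.py <your-import-command> >> /var/log/collaborate-import.log 2>&1
```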
Note that if you use the above example, you probably want to add a logrotate rule for the logfile that cron config writes. You can find the logrotate script here (add it to
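A minimal logrotate rule for that logfile might look like this sketch; the logfile path is a placeholder and should match whatever your cron entry writes:

```
/var/log/collaborate-import.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}
```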