From: Bryan
Date: Sat, 28 Dec 2013 06:40:56 +0000 (-0500)
Subject: converting rST to markdown, closing #179
X-Git-Tag: release-20150131~357
X-Git-Url: https://git.librecmc.org/?a=commitdiff_plain;h=4ace93c76353fae43e1c7d6e49f5dfc9367b9823;p=oweals%2Fkarmaworld.git

converting rST to markdown, closing #179
---

diff --git a/docs/dbimport.md b/docs/dbimport.md
new file mode 100644
index 0000000..e23fc5c
--- /dev/null
+++ b/docs/dbimport.md
@@ -0,0 +1,32 @@
+# Bootstrapping a DB
+
+A fresh database may be bootstrapped by running `manage.py syncdb --migrate`.
+Initial schools must be populated, which can be done by running:
+
+1. `manage.py fetch_usde_csv ./school.csv`
+1. `manage.py import_usde_csv ./school.csv`
+1. `manage.py sanitize_usde_schools`
+
+For testing purposes, it might be desirable to populate a database with
+additional data. In this case, see
+[the next section](#importing-a-preliminary-db).
+
+# Importing a Preliminary DB
+
+A preliminary set of Notes, Courses, and Schools is available as JSON from
+[this repository](https://github.com/FinalsClub/notesjson).
+
+## import to a fresh database
+
+To import this db:
+
+1. download or clone the notesjson repo
+2. move the contents of the repo to the root of karmanotes
+3. run the following management command: `./manage.py import_json all`
+
+## import to a database which has data
+
+Alternatively, you can remove all current Notes, Courses, and Schools from
+the database before importing with the following management command:
+
+    ./manage.py import_json all clean
diff --git a/docs/dbimport.rst b/docs/dbimport.rst
deleted file mode 100644
index 677a543..0000000
--- a/docs/dbimport.rst
+++ /dev/null
@@ -1,19 +0,0 @@
-Importing a Preliminary DB
-==========================
-
-A preliminary set of Notes, Courses, and Schools is available as json from
-the following repository:
-
-https://github.com/FinalsClub/notesjson
-
-To import this db:
-1. download or clone the notesjson repo
-2. move the contents of the repo to the root of karmanotes
-3. run the following management command:
-./manage.py import_json all
-
-
-Alternatively, you can remove all current Notes, Courses, and Schools from
-the database before importing with the following management command
-
-./manage.py import_json all clean
diff --git a/docs/secrets.md b/docs/secrets.md
new file mode 100644
index 0000000..dbe9713
--- /dev/null
+++ b/docs/secrets.md
@@ -0,0 +1,65 @@
+# Storing Secrets
+
+Secrets are authentication information that we have to provide to our app but
+do not want to check into source control. They are stored as files
+in `{project_root}/karmaworld/karmaworld/secrets`.
+
+
+## drive.py
+
+This file points at a json file and a p12 file. These two files are described in
+the subsections below.
+
+The `GOOGLE_USER` variable should be set to the email address of the user whose
+Google Drive is to be accessed. The Google Drive service account (defined by
+the json file and p12 file) will need permission to access that user's Drive.
+See the README for more information on that subject.
+
+### client_secrets.json
+
+`client_secrets.json` contains metadata about the Google Drive service account.
+This file is provided by Google. See here for more information:
+https://developers.google.com/console/help/new/#serviceaccounts
+
+### drive.p12
+
+`drive.p12` (downloaded from Google as `crazypantslonghexvalue-privatekey.p12`)
+contains a private key which replaces a password. This file is very sensitive.
+Ensure it is read-only by the proper user(s) through file system controls.
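+
+The exact contents of `drive.py` are not prescribed here, but as a rough,
+illustrative sketch it might look something like the following. Only
+`GOOGLE_USER` is documented above; the two path variables are hypothetical
+names chosen here to show the file pointing at the json and p12 files:
+
+    # karmaworld/secrets/drive.py -- illustrative sketch, not the canonical file
+    import os
+
+    SECRETS_DIR = os.path.dirname(os.path.abspath(__file__))
+
+    # Email address of the user whose Google Drive is to be accessed.
+    GOOGLE_USER = 'someone@example.com'  # placeholder value
+
+    # Hypothetical variable names for the two credential files described above.
+    CLIENT_SECRETS_PATH = os.path.join(SECRETS_DIR, 'client_secrets.json')
+    DRIVE_P12_PATH = os.path.join(SECRETS_DIR, 'drive.p12')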
+
+## db_settings.py
+
+`db_settings.py` sets up variables in `settings/prod.py` for connecting to the
+production database.
+
+* `PROD_DB_NAME` should be set to the database name
+* `PROD_DB_USERNAME` should be set to the role/user which accesses the database
+* `PROD_DB_PASSWORD` should be the password of the above role/user
+
+## filepicker.py
+
+`filepicker.py` contains the Filepicker API key which identifies the server
+to the Filepicker service.
+
+## static_s3.py
+
+`static_s3.py` sets up variables in `settings/prod.py` for AWS S3 static file
+storage.
+
+* `DEFAULT_FILE_STORAGE` refers to the Django storage backend to use. Generally
+  it should be 'storages.backends.s3boto.S3BotoStorage'
+* `AWS_ACCESS_KEY_ID` is an alphanumeric identifier given by AWS.
+* `AWS_SECRET_ACCESS_KEY` is an ASCII passkey given by AWS.
+* `AWS_STORAGE_BUCKET_NAME` is the name of the S3 bucket; it differs between production and beta.
+* `S3_URL` is the URL to the S3 bucket (`http://BUCKET.s3.amazonaws.com/`)
+* `STATIC_URL` should be the same as `S3_URL`
+
+## twitter.py
+
+`twitter.py` is used by celery note tasks to send Twitter messages with note
+updates.
+
+* `CONSUMER_KEY` is provided by Twitter
+* `CONSUMER_SECRET` is provided by Twitter
+* `ACCESS_TOKEN_KEY` is provided by Twitter
+* `ACCESS_TOKEN_SECRET` is provided by Twitter
diff --git a/docs/secrets.rst b/docs/secrets.rst
deleted file mode 100644
index 16ff47d..0000000
--- a/docs/secrets.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-Storing Secrets
-===============
-
-Secrets are things that we have to specify to our app,
-but do not want to check into source control.
-These are stored as files in `karmaworld/karmaworld/secrets`
-
-
-client_secrets.json
--------------------
-
-`client_secrets.json` is the api drive authentication with google api services for authenticating the server-side of `gdrive.py` requests. It is generated by google and used in `gdrive.py`.
-
-
-db_settings.py
---------------
-
-`db_settings.py` stores data about the production database.
-
-+ PROD_DB_NAME = "" # is the database name in postgres
-+ PROD_DB_USERNAME = "" # is the postgres login username
-+ PROD_DB_PASSWORD = "" # is the postgres login user password
-
-
-drive.py
---------
-
-`drive.py` specifies the google drive user that we are uploading documents too.
-
-+ GOOGLE_USER = "" # a google drive email address that we authenticate with
-
-static_s3.py
-------------
-
-`static_s3.py` sets up variables in `settings/prod.py` for AWS S3 static file storage.
-
-+ DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
-+ AWS_ACCESS_KEY_ID = ''
-+ AWS_SECRET_ACCESS_KEY = ''
-+ AWS_STORAGE_BUCKET_NAME = '' # this will differ between production and beta
-+ S3_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
-+ STATIC_URL = S3_URL
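
For reference, the remaining secrets files described in docs/secrets.md can be
sketched along the same lines. This is only an illustration: the variable names
come from the documentation above (and from the old secrets.rst), while every
value shown is a placeholder to be replaced with real credentials.

    # karmaworld/secrets/db_settings.py -- illustrative sketch
    PROD_DB_NAME = ''      # database name
    PROD_DB_USERNAME = ''  # role/user which accesses the database
    PROD_DB_PASSWORD = ''  # password of the above role/user

    # karmaworld/secrets/static_s3.py -- illustrative sketch
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_ACCESS_KEY_ID = ''        # alphanumeric identifier given by AWS
    AWS_SECRET_ACCESS_KEY = ''    # ASCII passkey given by AWS
    AWS_STORAGE_BUCKET_NAME = ''  # differs between production and beta
    S3_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
    STATIC_URL = S3_URL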