
Set up redis and celery on local and test the tasks


To enhance performance, we use Redis with Celery beat and worker processes to handle queued, scheduled tasks such as the nightly MV refresh, legal document reloads, and Elasticsearch backups.

Download tasks are queued in Redis. The Celery worker picks the job up from the queue, processes the request, executes the query against the database, and uploads the query output to the S3 bucket.

The app sends a message to Slack at the end of the task.

The task code is in the webservices/tasks folder.
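For orientation, each task in that folder is just a function registered with the shared Celery app. A minimal sketch (the task name and body here are illustrative, not one of the real tasks):

    # Illustrative only: the general shape of a task in webservices/tasks/.
    from webservices.tasks import app  # Celery app created in webservices/tasks/__init__.py

    @app.task
    def example_nightly_job():
        # A real task would do its refresh/download work here and then
        # post a summary message to Slack when it finishes.
        pass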

To test the tasks locally, temporarily modify the webservices/tasks/__init__.py file:

  • Disable SSL on local. Change line 81 from
"url": redis_url() + "?ssl=true",

to

"url": redis_url(),
  • Comment out the broker_use_ssl and redis_backend_use_ssl variables (lines 61-66):
    # broker_use_ssl={
    #     'ssl_cert_reqs': ssl.CERT_NONE,
    # },
    # redis_backend_use_ssl={
    #     'ssl_cert_reqs': ssl.CERT_NONE,
    # },

1. Set up redis, celery-beat, celery-worker, api, cms on local

Open Terminal #1---Redis server

  • Start the Redis server. No need to activate a Python virtualenv.
redis-server
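To confirm the server is up, ping it from another shell:
redis-cli ping   # should reply PONG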

Open Terminal #2---Cloud Foundry (cloud.gov)

  • Log in to Cloud Foundry (cloud.gov) and get the AWS S3 bucket credentials and user-provided credentials. Prepare the variables below and export them in the other terminals as needed.
cf login --sso
cf target -s <space>
cf env api
  • Look for the s3 service named 'fec-s3-api'
export AWS_ACCESS_KEY_ID="xxxxx"
export AWS_SECRET_ACCESS_KEY="xxxxx"
export AWS_PUBLIC_BUCKET="xxxxx"
export AWS_DEFAULT_REGION="xxxxx"
  • For testing downloads (FEC_DOWNLOAD_API_KEY is not needed when testing on the 'dev' space):
export FEC_DOWNLOAD_API_KEY="xxxxx" # don't need this if test on 'dev' space
  • For testing cron jobs that send a Slack message:
export SLACK_HOOK="xxxxx"
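The AWS values come from the credentials block of the 'fec-s3-api' service in the cf env api output; SLACK_HOOK and FEC_DOWNLOAD_API_KEY come from the user-provided credentials. One quick way to pull the S3 block out of the long output (the credential key names shown are the ones the cloud.gov S3 broker normally uses, so double-check them against your output):
cf env api | grep -A 15 fec-s3-api   # copy access_key_id, secret_access_key, bucket, region into the exports above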

Open Terminal #3---celery-beat

  • Only for testing cron jobs
  • Go to the openFEC repo and activate the Python virtualenv
pyenv activate <virtual_env>
pip install -r requirements.txt # only if there are package version changes
celery --app webservices.tasks beat --loglevel INFO
  • Make sure this message shows in the terminal: [INFO/MainProcess] beat: Starting...

Open Terminal #4---celery-worker

  • Go to the openFEC repo and activate the Python virtualenv
  • Export the necessary AWS variables from Terminal #2
pyenv activate <virtual_env>
export AWS_ACCESS_KEY_ID="xxxxx"
export AWS_SECRET_ACCESS_KEY="xxxxx"
export AWS_PUBLIC_BUCKET="xxxxx"
export AWS_DEFAULT_REGION="xxxxx"
export FEC_DOWNLOAD_API_KEY="xxxxx"  # don't need this if test on 'dev' space
export SLACK_HOOK="xxxxx"
  • For testing downloads, set SQLA_CONN to the space database connection:
export SQLA_CONN=<space db connection> 
  • For testing cron jobs, set SQLA_CONN to your local database and make sure your cfdm_test database has sample data and all MVs:
export SQLA_CONN=<your local db connection>
  • Start celery-worker
celery --app webservices.tasks worker --loglevel INFO
  • Make sure this message shows in the terminal: Connected to redis://localhost:6379/0. If you don't see it, the Redis connection failed.
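If the worker does not pick up tasks, two quick checks (assuming the default Celery queue name, 'celery'):
celery --app webservices.tasks inspect ping   # running workers should reply with pong
redis-cli llen celery                         # number of tasks waiting in the default queue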

Open Terminal #5---api (openFEC repo)

  • Go to the openFEC repo and activate the Python virtualenv
  • Export the necessary AWS variables gathered in Terminal #2
pyenv activate <virtual_env>
export AWS_ACCESS_KEY_ID="xxxxx"
export AWS_SECRET_ACCESS_KEY="xxxxx"
export AWS_PUBLIC_BUCKET="xxxxx"
export AWS_DEFAULT_REGION="xxxxx"
export SLACK_HOOK="xxxxx"
export SQLA_CONN=<space db connection>
./manage.py runserver
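Once the server is up, a quick smoke test against the local API (assuming the default port 5000; add an api_key parameter if your local config requires one):
curl "http://localhost:5000/v1/candidates/?per_page=1"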

Open Terminal #6---cms

  • Go to the fec-cms repo and activate the Python virtualenv
  • Export the necessary env variables and point to the local API.
export DATABASE_URL=postgresql://:@/cfdm_cms_test
export FEC_API_URL=http://localhost:5000
export FEC_WEB_API_KEY_PRIVATE=xxxxx # don't need this if point to local or 'dev' api
export FEC_WEB_API_KEY_PUBLIC=xxxxx  # don't need this if point to local or 'dev' api
export FEC_CMS_ENVIRONMENT=LOCAL

./manage.py runserver

2. Test tasks

Test download function
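Downloads are queued through the API's download endpoint (or the export buttons on the local CMS site). A hypothetical request to kick one off; the exact path and parameters are defined by the download resource in the openFEC code, so verify the real route there first:
curl -X POST "http://localhost:5000/v1/download/candidates/?api_key=xxxxx"   # hypothetical path -- check the download resource for the real one
Then watch Terminal #4: the worker should log the task, run the query, upload the output to the S3 bucket, and the app should send the Slack message when the task finishes.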

Test cron jobs

  • Modify webservices/tasks/__init__.py to use a different schedule (see the sketch after the command below).

  • Start celery-beat

celery --app webservices.tasks beat --loglevel INFO
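For example, to make a nightly job fire every few minutes while testing, change its crontab entry in the beat schedule. A sketch, assuming the schedule uses celery.schedules.crontab (the entry and task names here are illustrative; match them to the real schedule in webservices/tasks/__init__.py, which may also use the older CELERYBEAT_SCHEDULE setting name):

    # Illustrative beat-schedule entry; adjust names to the real file.
    from celery.schedules import crontab

    app.conf.beat_schedule = {
        'refresh-materialized-views': {
            'task': 'webservices.tasks.refresh.refresh_materialized_views',
            'schedule': crontab(minute='*/2'),  # e.g. every 2 minutes instead of nightly
        },
    }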

Test Slack message

  • Modify webservices/tasks/refresh.py or webservices/tasks/legal_docs.py to change the Slack channel from #bots to #test-bot.
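The channel is passed wherever the task posts its Slack message; the line to change looks roughly like this (hypothetical helper name -- use whatever Slack helper the task actually calls):
post_to_slack('Nightly refresh finished.', '#bots')   # change '#bots' to '#test-bot' while testing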

(Keep in mind that the app will delete the contents of the download bucket if you give it the AWS info, so if you are not testing that aspect of the code you might want to comment that part out.)