This repository contains code for the sanger-tol pipelines website: https://pipelines.tol.sanger.ac.uk. It has been forked and adapted from https://nf-co.re/.

Packages used

Here's how the website is built: an Apache + PHP web server (run locally via Docker Compose), a MySQL database, Composer for the PHP dependencies, and npm for building the front-end assets (JavaScript and SCSS).

Development

Getting the code

To make edits to the website, fork the repository to your own user on GitHub and then clone to your local system.

git clone git@github.com:[USERNAME]/pipelines-website.git
cd pipelines-website/

Running a local server

Ok, you're ready! To run the website locally, just start the Apache + PHP server with:

docker compose up

NB: If you are using a Mac with Apple silicon, you need to run:

docker compose -f m1-docker-compose.yml up

You should then be able to access the website in your browser at http://localhost:8888/.
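For orientation, here is a minimal sketch of the kind of service definition involved. The repository's own docker-compose.yml is the authoritative version and the build details below are assumptions; the key point is that the container serves the repository from /var/www and publishes Apache on host port 8888:

# sketch only - see docker-compose.yml in the repo for the real configuration
services:
  nf-core-web:
    build: .                 # assumption: an Apache + PHP image built for this repo
    ports:
      - "8888:80"            # host port 8888 -> Apache in the container
    volumes:
      - .:/var/www           # repo mounted at /var/www (used by the update scripts below)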

If you prefer, you can also use a tool such as MAMP - if so, set the base directory to /path/to/pipelines-website/public_html in Preferences > Web-Server > Document Root and then hit Start Servers.

Most of the hand-written text is in /markdown, to make it easier to write. The PHP files in /public_html then parse this into HTML dynamically, if supplied with a filename.
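As a rough illustration of that markdown-to-HTML step (a sketch only: the filename is hypothetical, and the choice of Parsedown as the parser, installed via Composer, is an assumption carried over from the upstream nf-core codebase):

<?php
// Sketch: render one file from /markdown as HTML.
require __DIR__ . '/vendor/autoload.php';    // Composer autoloader

$filename = __DIR__ . '/markdown/index.md';  // hypothetical filename
$parser = new Parsedown();                   // assumption: Parsedown markdown parser
echo $parser->text(file_get_contents($filename));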

Note that the .htaccess file is set up to remove the .php file extensions in URLs.
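For reference, a typical extension-stripping rewrite looks roughly like the following (a sketch; the repository's actual .htaccess is authoritative):

RewriteEngine On
# if the requested path plus ".php" exists as a file, serve that file instead
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]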

First-run

Much of the site is powered by a pipelines.json file, which is generated by the update_pipeline_details.php script described below. The web server regenerates it automatically when GitHub events trigger an update, but on a fresh checkout you'll need to run the script manually.

Access tokens

First you'll need a config.ini text file with values for github_username and github_access_token. See GitHub's documentation on creating an OAuth / personal access token (the token only needs the public_repo permission). This file is ignored in .gitignore for security reasons.
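For illustration, the GitHub section of config.ini looks roughly like this (a sketch: the values are placeholders, and the quoting style simply mirrors the database entries below):

; config.ini - never commit this file (it is listed in .gitignore)
github_username = 'your-github-username'
github_access_token = 'ghp_xxxxxxxxxxxxxxxx'   ; placeholder token, public_repo scope is enough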

For the MySQL database you should also add the following values:

host = 'db'
port = '3306'
dbname = 'nfcore';
username = 'nfcore_admin';
password = 'PEBBLY8exhibit_mead1cilium6despise'

Running PHP scripts

It's easiest to run these first manual update scripts on the command line. If you have PHP available then you may be able to do this directly. Alternatively, if you are using Docker as above then you can open a shell inside the running container. The container is typically named nf-core-web (you can check this with the docker ps command), so you can open an interactive shell using the following command:

docker exec -it nf-core-web /bin/bash
cd /var/www/

Update scripts

The following command will create public_html/pipelines.json, which is used by the website.

php update_pipeline_details.php
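Once it has run, a quick way to sanity-check the generated file from the same directory is a PHP one-liner like the following (a sketch; it just prints the file's top-level keys):

php -r "print_r(array_keys(json_decode(file_get_contents('public_html/pipelines.json'), true)));"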

To update the modules database (from within the docker container) run:

php update_module_details.php

Optionally, once you've done that, you can grab the pipeline traffic, issue statistics and Font Awesome icons:

php update_issue_stats.php
php update_stats.php
php update_fontawesome_icons.php

This creates nfcore_stats.json, nfcore_issue_stats.json and public_html/assets/js/fa-icons.json, all also ignored in .gitignore.

Production Server Setup

Deployment

# log in to the production VM

# clone the repo if not already done
git clone https://github.com/sanger-tol/pipelines-website.git

# pull the latest changes from the branch in use (main)
cd pipelines-website
git status
git pull

# start the docker containers
docker compose -f docker-compose-prod.yml up >> logs/docker-compose.log 2>&1 &

Stats cronjob

The web server needs the following cronjobs running to pull updates and scrape statistics:

00 */2 * * * ( git -C /home/ubuntu/pipelines-website pull && docker run --rm -v /home/ubuntu/pipelines-website:/var/www/ -w /var/www composer:2.1 bash -c 'composer install' && docker run --rm --user 1000:1000 -v /home/ubuntu/pipelines-website:/var/www/ -w /var/www node:16-alpine3.12 sh -c 'npm install && npm run build-prod' ) >> /home/ubuntu/pipelines-website/logs/git-pull.log 2>&1
00 05 * * * docker exec nf-core-web /usr/local/bin/php /var/www/update_pipeline_details.php >> /home/ubuntu/pipelines-website/logs/update_pipelines.log 2>&1
15 05 * * * docker exec nf-core-web /usr/local/bin/php /var/www/update_module_details.php >> /home/ubuntu/pipelines-website/logs/update_modules.log 2>&1
30 05 * * * docker exec nf-core-web /usr/local/bin/php /var/www/update_stats.php >> /home/ubuntu/pipelines-website/logs/update_stats.log 2>&1
45 05 * * * docker exec nf-core-web /usr/local/bin/php /var/www/update_issue_stats.php >> /home/ubuntu/pipelines-website/logs/update_issue_stats.log 2>&1

Remember to replace the path with your actual deployment directory if it differs.

The update_issue_stats.php script can use a lot of GitHub API calls, so it should run at least one hour after the update_stats.php script last finished. This is not because the script takes an hour to run, but because the GitHub API rate limit counts the number of calls made within a one-hour window.
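If you want to check how much of the hourly quota is left before running the heavy scripts, the GitHub rate-limit endpoint can be queried directly (a sketch; substitute the username and token from your config.ini):

# check the remaining GitHub API quota for the token used by the update scripts
curl -s -u "your-github-username:ghp_xxxxxxxxxxxxxxxx" https://api.github.com/rate_limit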

Contribution guidelines

If you would like to contribute to the website or add your institution to the official list of contributors, please have a look at CONTRIBUTING.md.

Community

If you have any questions or issues, please let us know.

Credits

Matthieu Muffato (@muffato) manages the content and Guoying Qi (@gq1) manages the website. Many individuals, especially Priyanka Surana (@priyanka-surana), have made various contributions.

Phil Ewels (@ewels) built the original nf-core website. More recently, @mashehu has done a great deal of work with the code. See the repo contributors for more.

Kudos to the excellent npm website, which provided inspiration for the design of the pipeline pages.
