Version 092020 (#3034)
* inject study_type in EBI and improvements to current automatic processing pipeline (#3023)

* inject study_type in EBI and improvements to current automatic processing pipeline

* addressing @ElDeveloper comments

* some general fixes/additions for next release (#3026)

* some general fixes/additions for next release

* adding test for not None job.release_validator_job

* fix #2839

* fix #2868 (#3028)

* fix #2868

* 2nd round

* fix errors

* more changes

* fix errors

* fix ProcessingJobTest

* fix PY_PATCH

* add missing TRN.add

* encapsulated_query -> perform_as_transaction

* fix #3022 (#3030)

* fix #3022

* adding tests

* fix #2320 (#3031)

* fix #2320

* adding prints to debug

* children -> 1

* APIArtifactHandlerTest -> APIArtifactHandlerTests

* configure_biom

* qdb.util.activate_or_update_plugins

* improving code

* almost there

* add values.template

* fix filepaths

* filepaths -> files

* fixing errors

* add prep.artifact insertion

* addressing @ElDeveloper comments

* fix artifact_definition active command

* != -> ==

* Added three tutorial sections to the Qiita documentation (#3032)

* Added three tutorial sections to the Qiita documentation: 'Retrieving Public Data for Own Analysis' and 'Processing public data retrieved with redbiom' to the redbiom tab, and 'Statistical Analysis to Justify Clinical Trial Sample Size Tutorial' to the analyzing samples tab.

* Update redbiom.rst

* Update redbiom.rst

* Update redbiom.rst

* Further updates to redbiom.rst and the Stats tutorial.

* update redbiom.rst

* Finished proof-reading

* Placed all three tutorials/sections together under Introduction to the download and analysis of public Qiita data

* added a new introduction, with links to the three sections

* Added figures to stats tutorial and contexts explanation

* Added figures to stats tutorial and contexts explanation

* Apply suggestions from code review [skip ci]

Co-authored-by: Yoshiki Vázquez Baeza <[email protected]>

Co-authored-by: Antonio Gonzalez <[email protected]>
Co-authored-by: Yoshiki Vázquez Baeza <[email protected]>

* 092020 (#3033)

* 092020

* connect artifact with job

* rm INSERT qiita.artifact_processing_job

* Apply suggestions from code review [skip ci]

Co-authored-by: Yoshiki Vázquez Baeza <[email protected]>

Co-authored-by: Yoshiki Vázquez Baeza <[email protected]>

Co-authored-by: Daniel McDonald <[email protected]>
Co-authored-by: Mirte Kuijpers <[email protected]>
Co-authored-by: Yoshiki Vázquez Baeza <[email protected]>
4 people authored Sep 16, 2020
1 parent d9275b7 commit 8e7abc5
Showing 53 changed files with 1,953 additions and 431 deletions.
2 changes: 1 addition & 1 deletion .travis.yml
@@ -64,7 +64,7 @@ install:
- pip install https://github.com/qiita-spots/qiita_client/archive/master.zip
- pip install https://github.com/qiita-spots/qtp-biom/archive/master.zip
- export QIITA_SERVER_CERT=`pwd`/qiita_core/support_files/server.crt
- configure_biom --env-script "source ~/virtualenv/python2.7/bin/activate; export PATH=$HOME/miniconda3/bin/:$PATH; . activate qtp-biom" --server-cert $QIITA_SERVER_CERT
- configure_biom --env-script "export PATH=$HOME/miniconda3/bin/:$PATH; source activate qtp-biom" --server-cert $QIITA_SERVER_CERT
- source deactivate
- source activate qiita
before_script:
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,16 @@
# Qiita changelog

Version 092020
--------------

* Added a new endpoint to inject artifacts into existing preparations or jobs: `/qiita_db/artifact/`
* Outdated commands with the exact same name as newer commands will be marked as not outdated. This is helpful for cases where the commands haven't changed between versions
* Added an `add_ebi_accessions` option to the `to_dataframe()` method of the information files so it can be used by `redbiom`. This will allow searching via sample or experiment accessions
* Added the `release_validator_job` method to `ProcessingJob` to easily retrieve the `release_validator` job of a `processing_job`
* Re-added `STUDY_TYPE` to the EBI-ENA submission; the field is required but deprecated, so it is now always submitted as `Other`
* Added qiime2.2020.08 to the system, which updated these plugins: qp-qiime2, qtp-biom, qtp-diversity, qtp-visualization
* Shogun processing using Woltka will now produce 2 extra artifacts: a per-genome artifact and a per-gene artifact
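The outdated-command rule above can be sketched as follows; the tuple layout and function name are illustrative assumptions, not Qiita's actual data model:

```python
def mark_outdated(commands):
    """Given (name, version, is_latest) tuples, return the names to flag
    as outdated.

    A command is flagged only when no newer command shares its name,
    mirroring the rule that identically named commands stay active across
    plugin versions.
    """
    latest_names = {name for name, _, is_latest in commands if is_latest}
    return sorted({name for name, _, is_latest in commands
                   if not is_latest and name not in latest_names})


# 'Split libraries' exists in both the old and the new plugin version, so
# it is not flagged; 'Pick OTUs' only exists in the old version.
cmds = [('Split libraries', '1.9', False),
        ('Split libraries', '2020.8', True),
        ('Pick OTUs', '1.9', False)]
print(mark_outdated(cmds))  # → ['Pick OTUs']
```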

Version 072020
--------------

6 changes: 2 additions & 4 deletions README.rst
@@ -17,7 +17,7 @@ compute resources to the global community, alleviating the technical burdens,
such as familiarity with the command line or access to compute power, that are
typically limiting for researchers studying microbial ecology.

Qiita is currently in alpha status. We are very open to community
Qiita is currently in beta status. We are very open to community
contributions and feedback. If you're interested in contributing to Qiita,
see `CONTRIBUTING.md <https://github.com/biocore/qiita/blob/master/CONTRIBUTING.md>`__.
If you'd like to report bugs or request features, you can do that in the
@@ -43,9 +43,7 @@ Current features

* Target gene data: we support deblur against GreenGenes (13_8) and close
reference picking against GreenGenes (13_8) and Silva.
* Metagenoic/Shotgun data: we support Shogun processing. Note that this data
is suitable for download and further down stream analyses but we don't recommend
meta-analysis within Qiita (only single study).
* Metagenomic and Metatranscriptomic data: we support Shogun processing.
* biom files can be added as new preparation templates for downstream
analyses; however, this cannot be made public.

2 changes: 1 addition & 1 deletion qiita_core/__init__.py
@@ -6,4 +6,4 @@
# The full license is in the file LICENSE, distributed with this software.
# -----------------------------------------------------------------------------

__version__ = "052020"
__version__ = "092020"
2 changes: 1 addition & 1 deletion qiita_db/__init__.py
@@ -27,7 +27,7 @@
from . import user
from . import processing_job

__version__ = "052020"
__version__ = "092020"

__all__ = ["analysis", "artifact", "archive", "base", "commands",
"environment_manager", "exceptions", "investigation", "logger",
44 changes: 17 additions & 27 deletions qiita_db/analysis.py
@@ -355,11 +355,9 @@ def description(self, description):
QiitaDBStatusError
Analysis is public
"""
with qdb.sql_connection.TRN:
sql = """UPDATE qiita.{0} SET description = %s
WHERE analysis_id = %s""".format(self._table)
qdb.sql_connection.TRN.add(sql, [description, self._id])
qdb.sql_connection.TRN.execute()
sql = """UPDATE qiita.{0} SET description = %s
WHERE analysis_id = %s""".format(self._table)
qdb.sql_connection.perform_as_transaction(sql, [description, self._id])

@property
def samples(self):
@@ -513,11 +511,9 @@ def pmid(self, pmid):
-----
An analysis should only ever have one PMID attached to it.
"""
with qdb.sql_connection.TRN:
sql = """UPDATE qiita.{0} SET pmid = %s
WHERE analysis_id = %s""".format(self._table)
qdb.sql_connection.TRN.add(sql, [pmid, self._id])
qdb.sql_connection.TRN.execute()
sql = """UPDATE qiita.{0} SET pmid = %s
WHERE analysis_id = %s""".format(self._table)
qdb.sql_connection.perform_as_transaction(sql, [pmid, self._id])

@property
def can_be_publicized(self):
@@ -618,13 +614,11 @@ def set_error(self, error_msg):
error_msg : str
The error message
"""
with qdb.sql_connection.TRN:
le = qdb.logger.LogEntry.create('Runtime', error_msg)
sql = """UPDATE qiita.analysis
SET logging_id = %s
WHERE analysis_id = %s"""
qdb.sql_connection.TRN.add(sql, [le.id, self.id])
qdb.sql_connection.TRN.execute()
le = qdb.logger.LogEntry.create('Runtime', error_msg)
sql = """UPDATE qiita.analysis
SET logging_id = %s
WHERE analysis_id = %s"""
qdb.sql_connection.perform_as_transaction(sql, [le.id, self.id])

def has_access(self, user):
"""Returns whether the given user has access to the analysis
@@ -696,11 +690,9 @@ def share(self, user):
if user.id == self.owner or user.id in self.shared_with:
return

with qdb.sql_connection.TRN:
sql = """INSERT INTO qiita.analysis_users (analysis_id, email)
VALUES (%s, %s)"""
qdb.sql_connection.TRN.add(sql, [self._id, user.id])
qdb.sql_connection.TRN.execute()
sql = """INSERT INTO qiita.analysis_users (analysis_id, email)
VALUES (%s, %s)"""
qdb.sql_connection.perform_as_transaction(sql, [self._id, user.id])

def unshare(self, user):
"""Unshare the analysis with another user
@@ -710,11 +702,9 @@ def unshare(self, user):
user: User object
The user to unshare the analysis with
"""
with qdb.sql_connection.TRN:
sql = """DELETE FROM qiita.analysis_users
WHERE analysis_id = %s AND email = %s"""
qdb.sql_connection.TRN.add(sql, [self._id, user.id])
qdb.sql_connection.TRN.execute()
sql = """DELETE FROM qiita.analysis_users
WHERE analysis_id = %s AND email = %s"""
qdb.sql_connection.perform_as_transaction(sql, [self._id, user.id])

def _lock_samples(self):
"""Only dflt analyses can have samples added/removed
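The hunks above all apply the mechanical rewrite noted in the commit message (`encapsulated_query -> perform_as_transaction`): an explicit `with TRN:` block wrapping `add` plus `execute` collapses into a single helper call. A minimal sketch of that pattern with a toy transaction object (illustrative only, not the real `qiita_db.sql_connection` classes):

```python
class Transaction:
    """Toy stand-in for qiita_db's TRN context manager (illustration only)."""

    def __init__(self):
        self.queue, self.executed = [], []

    def add(self, sql, args=None):
        # Queue a statement for execution within the current transaction.
        self.queue.append((sql, args))

    def execute(self):
        # Flush every queued statement as one unit of work.
        self.executed.extend(self.queue)
        self.queue = []

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False


TRN = Transaction()


def perform_as_transaction(sql, args=None):
    """Run a single statement in its own transaction.

    Collapses the repeated "with TRN: TRN.add(...); TRN.execute()" pattern
    seen in the diff into one call.
    """
    with TRN:
        TRN.add(sql, args)
        TRN.execute()


perform_as_transaction(
    "UPDATE qiita.analysis SET pmid = %s WHERE analysis_id = %s", [123, 1])
print(len(TRN.executed))  # → 1
```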
32 changes: 13 additions & 19 deletions qiita_db/artifact.py
@@ -388,8 +388,7 @@ def _associate_with_analysis(instance, analysis_id):
(analysis_id, artifact_id)
VALUES (%s, %s)"""
sql_args = [analysis_id, instance.id]
qdb.sql_connection.TRN.add(sql, sql_args)
qdb.sql_connection.TRN.execute()
qdb.sql_connection.perform_as_transaction(sql, sql_args)

with qdb.sql_connection.TRN:
if parents:
@@ -673,12 +672,10 @@ def name(self, value):
ValueError
If `value` contains more than 35 chars
"""
with qdb.sql_connection.TRN:
sql = """UPDATE qiita.artifact
SET name = %s
WHERE artifact_id = %s"""
qdb.sql_connection.TRN.add(sql, [value, self.id])
qdb.sql_connection.TRN.execute()
sql = """UPDATE qiita.artifact
SET name = %s
WHERE artifact_id = %s"""
qdb.sql_connection.perform_as_transaction(sql, [value, self.id])

@property
def timestamp(self):
@@ -751,8 +748,7 @@ def _set_visibility(self, value):
sql = """UPDATE qiita.artifact
SET visibility_id = %s
WHERE artifact_id IN %s"""
qdb.sql_connection.TRN.add(sql, [vis_id, tuple(ids)])
qdb.sql_connection.TRN.execute()
qdb.sql_connection.perform_as_transaction(sql, [vis_id, tuple(ids)])

@visibility.setter
def visibility(self, value):
@@ -989,15 +985,13 @@ def is_submitted_to_vamps(self, value):
QiitaDBOperationNotPermittedError
If the artifact cannot be submitted to VAMPS
"""
with qdb.sql_connection.TRN:
if not self.can_be_submitted_to_vamps:
raise qdb.exceptions.QiitaDBOperationNotPermittedError(
"Artifact %s cannot be submitted to VAMPS" % self.id)
sql = """UPDATE qiita.artifact
SET submitted_to_vamps = %s
WHERE artifact_id = %s"""
qdb.sql_connection.TRN.add(sql, [value, self.id])
qdb.sql_connection.TRN.execute()
if not self.can_be_submitted_to_vamps:
raise qdb.exceptions.QiitaDBOperationNotPermittedError(
"Artifact %s cannot be submitted to VAMPS" % self.id)
sql = """UPDATE qiita.artifact
SET submitted_to_vamps = %s
WHERE artifact_id = %s"""
qdb.sql_connection.perform_as_transaction(sql, [value, self.id])

@property
def filepaths(self):
12 changes: 4 additions & 8 deletions qiita_db/download_link.py
@@ -72,10 +72,8 @@ def delete(cls, jti):
jti : object
The jwt token identifier
"""
with qdb.sql_connection.TRN:
sql = """DELETE FROM qiita.{0} WHERE jti=%s""".format(cls._table)
qdb.sql_connection.TRN.add(sql, [jti])
qdb.sql_connection.TRN.execute()
sql = """DELETE FROM qiita.{0} WHERE jti=%s""".format(cls._table)
qdb.sql_connection.perform_as_transaction(sql, [jti])

@classmethod
def exists(cls, jti):
@@ -98,10 +96,8 @@ def delete_expired(cls):
r"""Deletes all expired download links"""
now = datetime.now(timezone.utc)

with qdb.sql_connection.TRN:
sql = """DELETE FROM qiita.{0} WHERE exp<%s""".format(cls._table)
qdb.sql_connection.TRN.add(sql, [now])
qdb.sql_connection.TRN.execute()
sql = """DELETE FROM qiita.{0} WHERE exp<%s""".format(cls._table)
qdb.sql_connection.perform_as_transaction(sql, [now])

@classmethod
def get(cls, jti):
69 changes: 68 additions & 1 deletion qiita_db/handlers/artifact.py
@@ -8,8 +8,9 @@

from tornado.web import HTTPError
from collections import defaultdict
from json import loads
from json import loads, dumps

from qiita_core.qiita_settings import r_client
import qiita_db as qdb
from .oauth2 import OauthBaseHandler, authenticate_oauth

@@ -234,3 +235,69 @@ def post(self):
self.set_status(200, reason="Artifact type already exists")

self.finish()


class APIArtifactHandler(OauthBaseHandler):
@authenticate_oauth
def post(self):
user_email = self.get_argument('user_email')
job_id = self.get_argument('job_id', None)
prep_id = self.get_argument('prep_id', None)
atype = self.get_argument('artifact_type')
aname = self.get_argument('command_artifact_name', 'Name')
files = self.get_argument('files')

if job_id is None and prep_id is None:
raise HTTPError(
400, reason='You need to specify a job_id or a prep_id')
if job_id is not None and prep_id is not None:
raise HTTPError(
400, reason='You need to specify only a job_id or a prep_id')

user = qdb.user.User(user_email)
values = {
'files': files, 'artifact_type': atype, 'name': aname,
# leaving here in case we need to add a way to add an artifact
# directly to an analysis, for more information see
# ProcessingJob._complete_artifact_transformation
'analysis': None}
PJ = qdb.processing_job.ProcessingJob
if job_id is not None:
TN = qdb.sql_connection.TRN
job = PJ(job_id)
with TN:
sql = """SELECT command_output_id
FROM qiita.command_output
WHERE name = %s AND command_id = %s"""
TN.add(sql, [aname, job.command.id])
results = TN.execute_fetchflatten()
if len(results) < 1:
raise HTTPError(400, reason='The command_artifact_name '
'does not exist in the command')
cmd_out_id = results[0]
provenance = {'job': job_id,
'cmd_out_id': cmd_out_id,
# direct_creation is a flag to avoid having to wait
# for the complete job to create the new artifact,
# which is normally run during regular processing.
# Skipping is fine because we are adding an artifact
# to an existing job outside of regular processing
'direct_creation': True,
'name': aname}
values['provenance'] = dumps(provenance)
# inherit the first prep info file from the first input artifact
prep_id = job.input_artifacts[0].prep_templates[0].id
else:
prep_id = int(prep_id)

values['template'] = prep_id
cmd = qdb.software.Command.get_validator(atype)
params = qdb.software.Parameters.load(cmd, values_dict=values)
new_job = PJ.create(user, params, True)
new_job.submit()

r_client.set('prep_template_%d' % prep_id,
dumps({'job_id': new_job.id, 'is_qiita_job': True}))

self.write(new_job.id)
self.finish()
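The handler above accepts exactly one of `job_id` or `prep_id` before building the validator job. A standalone sketch of that validation rule (hypothetical helper, not part of the Qiita codebase; it raises `ValueError` where the handler raises `HTTPError`):

```python
def validate_artifact_request(job_id=None, prep_id=None):
    """Mirror the handler's rule: exactly one of job_id or prep_id.

    Returns a (source, value) pair describing where the new artifact will
    be attached; prep ids are coerced to int as in the handler.
    """
    if job_id is None and prep_id is None:
        raise ValueError('You need to specify a job_id or a prep_id')
    if job_id is not None and prep_id is not None:
        raise ValueError('You need to specify only a job_id or a prep_id')
    if job_id is not None:
        return ('job', job_id)
    return ('prep', int(prep_id))


print(validate_artifact_request(prep_id='42'))  # → ('prep', 42)
```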
