We do not have the option to modify the request and then redirect with
the modification in place. To work around that, this commit creates an
intermediate step that informs the user of their progress, while
allowing us to store the filename for future steps.
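
A minimal sketch of such an intermediate step, assuming a Flask app that
keeps the uploaded filename in the session; the route and template names
here are illustrative, not the actual ones:

    from flask import Blueprint, render_template, request, session

    bp = Blueprint("upload", __name__)

    @bp.route("/upload/progress", methods=["POST"])
    def upload_progress():
        # Hypothetical intermediate step: remember the uploaded file's name
        # so later steps can retrieve it without re-posting the form.
        session["uploaded_filename"] = request.form["filename"]
        return render_template(
            "upload_progress.html", filename=session["uploaded_filename"])
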
Initialise the upload path for R/qtl2 bundles. This commit adds UI
that allows the user to select from existing species, before
proceeding to the next stage.
Add a favicon to reduce noise in the logs due to failed requests.
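
A common way to serve a favicon from Flask so that browsers' automatic
/favicon.ico requests stop showing up as 404s; the static path and
filename are assumptions:

    import os
    from flask import Flask, send_from_directory

    app = Flask(__name__)

    @app.route("/favicon.ico")
    def favicon():
        # Serve a static icon so the browsers' automatic /favicon.ico
        # requests no longer appear as failed requests in the logs.
        return send_from_directory(
            os.path.join(app.root_path, "static"),
            "favicon.ico",
            mimetype="image/vnd.microsoft.icon")
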
Implement the code enabling the upload of the samples/cases to the database.
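
A sketch of the kind of insertion involved, assuming a DB-API cursor; the
table and column names (Strain, Name, SpeciesId) are illustrative guesses
rather than the actual schema:

    def insert_samples(cursor, species_id, sample_names):
        """Insert sample/case names for a species.
        Table and column names here are illustrative."""
        cursor.executemany(
            "INSERT INTO Strain (Name, SpeciesId) VALUES (%s, %s)",
            tuple((name, species_id) for name in sample_names))
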
Notify the user when they try to create a new dataset that has the
same name as an existing dataset and give them the chance to fix it
before continuing.
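
The duplicate check itself can be a simple existence query run before the
new dataset is accepted; a sketch assuming a DB-API cursor, with an
illustrative table/column naming:

    def dataset_name_exists(cursor, dataset_name):
        """Return True if a dataset with this name already exists.
        Table and column names are illustrative."""
        cursor.execute(
            "SELECT COUNT(*) FROM ProbeSetFreeze WHERE Name = %s",
            (dataset_name,))
        (count,) = cursor.fetchone()
        return count > 0
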
Handle any and all unforeseen error conditions gracefully by capturing
the exceptions, logging them for debugging purposes and providing the
user with a generic error page.
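
In a Flask application this is typically a catch-all error handler; a
minimal sketch, with the template name assumed:

    import traceback
    from flask import Flask, render_template

    app = Flask(__name__)

    @app.errorhandler(Exception)
    def handle_unexpected_error(exc):
        # Log the full traceback for debugging, then show a generic page
        # instead of leaking the exception details to the user.
        app.logger.error("Unhandled exception: %s", traceback.format_exc())
        return render_template("unhandled_exception.html"), 500
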
* To avoid confusion, only display the "alert-success" green on
completion of the parsing process. While parsing, if there are no
errors, then display the "No errors found so far" message without
the green colour.
* Display the status of the job as it is running
* Display STDERR output if an error occurs
* Display STDOUT output as the job is running and on successful
  completion of the job (see the sketch below)
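
One way to make a background script's output available to the status page
is to capture it as the process runs and stash it where the web handlers
can read it. A sketch assuming job data is kept in Redis; the key names
are illustrative:

    import subprocess
    import tempfile
    from redis import Redis

    def run_and_capture(job_id, command, redis_conn=None):
        """Run `command`, appending its stdout to Redis as it is produced so
        the status page can poll it; stderr and the final status are stored
        at the end. Key names are illustrative."""
        conn = redis_conn or Redis()
        conn.hset(f"job:{job_id}", "status", "running")
        with tempfile.TemporaryFile(mode="w+") as errfile:
            with subprocess.Popen(command, stdout=subprocess.PIPE,
                                  stderr=errfile, text=True) as proc:
                for line in proc.stdout:
                    # Append each line so the progress page can show
                    # output while the job is still running.
                    conn.append(f"job:{job_id}:stdout", line)
                proc.wait()
            errfile.seek(0)
            stderr_output = errfile.read()
        status = "success" if proc.returncode == 0 else "error"
        conn.hset(f"job:{job_id}",
                  mapping={"status": status, "stderr": stderr_output})
        return proc.returncode
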
- Hook up external data insertion script to webserver code
- Provide rudimentary status indication
- Generalise some job creation details
Ease the selection of a radio button by allowing the user to click on
any of the table cells that are in the same row as the radio button of
concern.
Provide the user with a confirmation stage where they can verify all the
data before inserting it into the database.
Enable the user to create a new dataset should the need arise.
A few extra fixes were done, such as:
- Provide list of average methods to choose from
- Provide input elements for some expected fields
- Add a new confirmation step before doing the actual data update
Rather than using a redirect, which exposed the study id as a GET
parameter, this commit adds an auxiliary step that allows the user to
choose whether to continue with the new study or go back and select an
existing study.
- Implement UI enabling selection from existing datasets
- Start implementation of UI that enables creation of new dataset
- Build code to populate the "Group" and "Tissue" dropdown lists
- Enable redirect with POST data (code 307) so that, if there is an
  input error, the user can go back and fix it (see the sketch below)
- Move hidden fields to a macro to reduce repetition
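
A 307 redirect instructs the browser to repeat the same method and body
against the target URL, which is what lets the POSTed form data survive
the round trip. A small sketch of the pattern, with illustrative route
and template names:

    from flask import Flask, redirect, render_template, request, url_for

    app = Flask(__name__)

    @app.route("/select-dataset", methods=["GET", "POST"])
    def select_dataset():
        return render_template("select_dataset.html")

    @app.route("/confirm-dataset", methods=["POST"])
    def confirm_dataset():
        if not request.form.get("dataset-id"):
            # Code 307 makes the browser re-send the same POST body to the
            # target URL, so the user's input is not lost on the way back.
            return redirect(url_for("select_dataset"), code=307)
        return render_template("confirm_dataset.html",
                               dataset_id=request.form["dataset-id"])
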
Implement the select study UI
The GeneChipId value is required for the data being inserted, so this
commit provides the UI to enable selection of the chip.
As part of updating the database with the new data, there is a need to
select the appropriate dataset that the data belongs to, and this
commit provides the UI to help the user do that.
The number of columns in each content line should be equal to the number
of columns in the header line.
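
The check reduces to comparing field counts per line against the header;
a sketch assuming tab-separated files:

    def invalid_column_counts(filepath, separator="\t"):
        """Yield (line_number, column_count) for every content line whose
        number of columns differs from the header line's."""
        with open(filepath, encoding="utf-8") as infile:
            header_columns = len(next(infile).rstrip("\n").split(separator))
            for line_number, line in enumerate(infile, start=2):
                columns = len(line.rstrip("\n").split(separator))
                if columns != header_columns:
                    yield (line_number, columns)
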
Enable the user to abort the background parsing of the file.
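
One way to support aborting, assuming job state lives in Redis: the web
handler sets a cancellation flag that the worker checks between parsed
chunks. Key and field names are illustrative:

    from redis import Redis

    def request_abort(job_id, redis_conn=None):
        """Called from the web handler when the user asks to abort."""
        conn = redis_conn or Redis()
        conn.hset(f"job:{job_id}", "user_aborted", "1")

    def aborted(job_id, redis_conn=None):
        """Checked periodically by the worker between parsed chunks."""
        conn = redis_conn or Redis()
        return conn.hget(f"job:{job_id}", "user_aborted") == b"1"
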
Enable the progress status page to show all the errors found at any
point during the processing of the file.
The CLI scripts use "standard-error", so update the web version to be
consistent with that.
Implement code to handle errors in the processing of files.
- README.org: document how to run scripts manually
- manifest.scm: remove python-rq as a dependency
- qc_app/jobs.py: rework job launching and processing
- qc_app/parse.py: use reworked job processing
- qc_app/templates/job_progress.html: display progress correctly
- qc_app/templates/parse_results.html: display final results
- scripts/worker.py: new worker script
* Create and push the application context for the worker functions
* Fix the update of meta fields
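
Worker functions that rely on Flask helpers (current_app, database
handles, configuration) need an application context pushed before they
run. A sketch of the pattern; the create_app factory name is an
assumption:

    from qc_app import create_app  # hypothetical application factory

    def run_in_app_context(func, *args, **kwargs):
        """Run a worker function with a Flask application context pushed,
        so code using `current_app` and friends works outside a request."""
        app = create_app()
        with app.app_context():
            return func(*args, **kwargs)
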
Enable the queuing of file parsing jobs, since the files could be
really large and take a long time to parse and present results
(a queuing sketch follows the file list below).
* etc/default_config.py: Add default config for redis server
* manifest.scm: Add redis and rq as dependencies
* qc_app/__init__.py
* qc_app/jobs.py: module to hold utilities for management of the jobs
* qc_app/parse.py: Enqueue the job - extract file-parsing code to
callable function
* qc_app/templates/base.html: Enable addition of extra meta tags
* qc_app/templates/job_progress.html: template to display job progress
* qc_app/templates/no_such_job.html: template to indicate when a job
id is invalid
* quality_control/parsing.py: Add the total size parsed so far
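
A sketch of how a parse job might be enqueued and looked up with rq and
redis; the queue name and function path are illustrative, not necessarily
those used in qc_app:

    from redis import Redis
    from rq import Queue
    from rq.job import Job

    redis_conn = Redis()  # the address would come from the app's config
    queue = Queue("file-parsing", connection=redis_conn)

    def enqueue_parse_job(filepath, filetype):
        """Queue a long-running parse so the request can return at once."""
        # "qc_app.parse.parse_file" stands in for the real parsing callable.
        return queue.enqueue("qc_app.parse.parse_file", filepath, filetype)

    def job_status(job_id):
        """Fetch the job by id so the progress page can report on it."""
        job = Job.fetch(job_id, connection=redis_conn)
        return job.get_status()
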