Explore Workflows
View already parsed workflows below, or add your own.

**workflow.cwl**

Path: `flow_dispatch/2other_species/workflow.cwl`, Branch/Commit ID: `master`

**EMG pipeline's QIIME workflow**

1. Set the environment variables `PYTHONPATH`, `QIIME_ROOT`, and `PATH`.
2. Run the QIIME script `pick_closed_reference_otus.py`:
   `${python} ${qiimeDir}/bin/pick_closed_reference_otus.py -i $1 -o $2 -r ${qiimeDir}/gg_13_8_otus/rep_set/97_otus.fasta -t ${qiimeDir}/gg_13_8_otus/taxonomy/97_otu_taxonomy.txt -p ${qiimeDir}/cr_otus_parameters.txt`
3. Convert the new biom format to the old biom format (JSON):
   `${qiimeDir}/bin/biom convert -i ${resultDir}/cr_otus/otu_table.biom -o ${resultDir}/cr_otus/${infileBase}_otu_table_json.biom --table-type="OTU table" --to-json`
4. Convert the new biom format to a classic OTU table:
   `${qiimeDir}/bin/biom convert -i ${resultDir}/cr_otus/otu_table.biom -o ${resultDir}/cr_otus/${infileBase}_otu_table.txt --to-tsv --header-key taxonomy --table-type "OTU table"`
5. Create the OTU table summary:
   `${qiimeDir}/bin/biom summarize-table -i ${resultDir}/cr_otus/otu_table.biom -o ${resultDir}/cr_otus/${infileBase}_otu_table_summary.txt`
6. Move one of the result files:
   `mv ${resultDir}/cr_otus/otu_table.biom ${resultDir}/cr_otus/${infileBase}_otu_table_hdf5.biom`
7. Create a list of observations:
   `awk '{print $1}' ${resultDir}/cr_otus/${infileBase}_otu_table.txt | sed '/#/d' > ${resultDir}/cr_otus/${infileBase}_otu_observations.txt`
8. Create a phylogenetic tree by pruning GreenGenes and keeping the observed OTUs:
   `${python} ${qiimeDir}/bin/filter_tree.py -i ${qiimeDir}/gg_13_8_otus/trees/97_otus.tree -t ${resultDir}/cr_otus/${infileBase}_otu_observations.txt -o ${resultDir}/cr_otus/${infileBase}_pruned.tree`

Path: `workflows/qiime-workflow.cwl`, Branch/Commit ID: `708fd97`

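As an illustration of how a step like step 5 above (the `biom summarize-table` call) can be expressed in CWL, here is a minimal `CommandLineTool` sketch. It is not taken from `workflows/qiime-workflow.cwl`, and the input and output names are assumptions.

```cwl
cwlVersion: v1.0
class: CommandLineTool
# Illustrative wrapper for the "biom summarize-table" step; not the actual
# tool description used by qiime-workflow.cwl. Port names are assumptions.
baseCommand: [biom, summarize-table]
inputs:
  otu_table:            # assumed name for the input .biom file
    type: File
    inputBinding:
      prefix: -i
  summary_name:         # assumed name for the requested output file name
    type: string
    inputBinding:
      prefix: -o
outputs:
  otu_table_summary:
    type: File
    outputBinding:
      glob: $(inputs.summary_name)
```

The other steps can be wrapped the same way and chained in a `Workflow` that passes the OTU table from step to step.
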
**zip_and_index_vcf.cwl**

This is a very simple workflow of two steps. It will zip an input VCF file and then index it. The zipped file and the index file will be in the workflow output.

Path: `zip_and_index_vcf.cwl`, Branch/Commit ID: `master`

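The workflow file itself is not reproduced here, so the following is only a rough sketch of what a two-step zip-and-index workflow typically looks like in CWL, assuming `bgzip` and `tabix` as the underlying tools; the step names and input/output names are made up, not taken from `zip_and_index_vcf.cwl`.

```cwl
cwlVersion: v1.0
class: Workflow
# Rough sketch only: assumes bgzip + tabix; all names are illustrative.
inputs:
  vcf: File
outputs:
  zipped_vcf:
    type: File
    outputSource: zip/zipped_vcf
  vcf_index:
    type: File
    outputSource: index/vcf_index
steps:
  zip:
    in:
      vcf: vcf
    out: [zipped_vcf]
    run:
      class: CommandLineTool
      baseCommand: [bgzip, -c]
      stdout: $(inputs.vcf.basename).gz
      inputs:
        vcf:
          type: File
          inputBinding:
            position: 1
      outputs:
        zipped_vcf:
          type: stdout
  index:
    in:
      zipped_vcf: zip/zipped_vcf
    out: [vcf_index]
    run:
      class: CommandLineTool
      requirements:
        InitialWorkDirRequirement:
          listing:
            - entry: $(inputs.zipped_vcf)
              writable: true     # tabix writes the .tbi next to its input
      baseCommand: [tabix, -p, vcf]
      arguments: [ "$(inputs.zipped_vcf.basename)" ]
      inputs:
        zipped_vcf:
          type: File
      outputs:
        vcf_index:
          type: File
          outputBinding:
            glob: $(inputs.zipped_vcf.basename).tbi
```

Saved as, say, `zip_and_index_sketch.cwl`, this could be run with `cwltool zip_and_index_sketch.cwl --vcf sample.vcf`; the real workflow's port names may differ.
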
**epos single download**

EPOS-IT Curl Workflow: downloads data based on curl input.

Path: `epos_accept_single_url_curl.cwl`, Branch/Commit ID: `master`

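The description suggests a thin wrapper around `curl`. Purely as an illustration (the actual `epos_accept_single_url_curl.cwl` is not shown here, and the names below are assumptions), such a tool might look like this:

```cwl
cwlVersion: v1.0
class: CommandLineTool
# Illustrative only: assumes the workflow boils down to `curl -L -O <url>`.
baseCommand: curl
arguments: ["-L", "-O"]   # follow redirects, keep the remote file name
inputs:
  url:                    # assumed input name
    type: string
    inputBinding:
      position: 1
outputs:
  downloaded_file:
    type: File
    outputBinding:
      glob: "*"           # pick up whatever curl wrote to the working directory
```

Under CWL v1.1 or later, a tool that downloads data would typically also declare the `NetworkAccess` requirement.
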
**pipeline-pe.cwl**

ATAC-seq pipeline - reads: PE.

Path: `v1.0/ATAC-seq_pipeline/pipeline-pe.cwl`, Branch/Commit ID: `master`

**wf_clipseqcore_pe_1barcode.cwl**

Workflow for handling reads containing one barcode. Returns the BAM file containing read2 only. Notes: runs the following steps:

- demultiplex
- trimfirst_file2string
- trimagain_file2string
- b1_trim_and_map
- view_r2
- index_r2_bam
- make_bigwigs

Path: `cwl/wf_clipseqcore_pe_1barcode.cwl`, Branch/Commit ID: `master`

**zip_and_index_vcf.cwl**

This is a very simple workflow of two steps. It will zip an input VCF file and then index it. The zipped file and the index file will be in the workflow output.

Path: `zip_and_index_vcf.cwl`, Branch/Commit ID: `master`

**count-lines1-wf.cwl**

Path: `tests/count-lines1-wf.cwl`, Branch/Commit ID: `main`

**RNASelector as a CWL workflow**

https://doi.org/10.1007/s12275-011-1213-z

Path: `workflows/rna-selector.cwl`, Branch/Commit ID: `5dc7c5c`

**functional analysis prediction with InterProScan**

Path: `workflows/functional_analysis.cwl`, Branch/Commit ID: `3f85843`