
Add Maker annotation workflow #47

Closed

Conversation

gallardoalba
Contributor

No description provided.

@bgruening
Member

ping @abretaud

@abretaud (Collaborator) left a comment

Super cool, thanks for working on this!

@@ -0,0 +1,7 @@
version: 1.2
workflows:
- name: "ANNOTATION-OF-EUKARYOTIC-GENOMES-WITH-MAKER"
@abretaud (Collaborator):

Any particular reason to have this name in upper case?

Member:

Not really, except that we named all the COVID-19 workflows in uppercase, but there's no technical reason to do it. If there's only a single workflow in the folder we could also just call it main? The only concern is that the name should be reasonably short (if you need to type it manually, or include it as a reference in a paper), and if we add multiple workflows we should know which workflow does what.
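
(For illustration of the "short name" point only: the name is what you end up typing when invoking the workflow by hand, e.g. with planemo run. The file names below are hypothetical, not from this PR.)

# hypothetical: invoking the workflow descriptor manually
planemo run ANNOTATION-OF-EUKARYOTIC-GENOMES-WITH-MAKER.ga annotation-job.yml
# versus a short, lowercase alternative
planemo run main.ga annotation-job.yml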

@mvdbeek
Member

mvdbeek commented Jul 12, 2021

I will have a look at the test error, that looks like a Galaxy bug

@gallardoalba
Contributor Author

> I will have a look at the test error, that looks like a Galaxy bug

Thanks!

@mvdbeek
Member

mvdbeek commented Jul 12, 2021

OK, so the busco task is failing, and the test framework then fails trying to download the outputs.
This is the command that was built

12092 2021-07-11T23:35:48.1828454Z galaxy.jobs.command_factory INFO 2021-07-11 23:35:47,280 [pN:main,p:5726,tN:LocalRunner.work_thread-2] Built script [/tmp/tmp3esa7pwv/job_working_directory/000/11/tool_script.sh] for tool command [if [ -z "$AUGUSTUS_CONFIG_PATH" ] ; then BUSCO_PATH=$(dirname $(which busco)) ; export AUGUSTUS_CONFIG_PATH=$(realpath ${BUSCO_PATH}/../config) ; fi && cp -r "$AUGUSTUS_CONFIG_PATH/" augustus_dir/ && export AUGUSTUS_CONFIG_PATH=`pwd`/augustus_dir/ && busco --in '/tmp/tmp3esa7pwv/files/000/dataset_14.dat' --lineage_dataset 'fungi_odb10' --update-data --mode 'tran' -o busco_galaxy --cpu ${GALAXY_SLOTS:-4} --evalue 0.01 --limit 3]

This runs for about 4 hours and then fails with

75067 2021-07-12T03:30:37.3438266Z galaxy.tool_util.output_checker INFO 2021-07-12 03:30:36,412 [pN:main,p:5726,tN:LocalRunner.work_thread-0] Job error detected, failing job. Reasons are [{'type': 'exit_code', 'desc': 'Fatal error: Exit code 1 ()', 'exit_code': 1, 'code_desc': '', 'error_level': 3}]

Unfortunately we don't have the stderr, but I wouldn't be surprised if it fails because of CPU or memory limits. The job itself ran for almost 4 hours... would reducing the input size accelerate this job? Can we maybe use the tool test data for busco?

@gallardoalba
Contributor Author

gallardoalba commented Jul 12, 2021

> OK, so the busco task is failing, and the test framework then fails trying to download the outputs. [...] Can we maybe use the tool test data for busco?

Yes, I guess it's a memory or CPU issue, because I tested it in Galaxy and it worked fine. I'll reduce the size of the input files.
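
(A minimal sketch of one way to shrink a FASTA test input, assuming the genome is a multi-record FASTA; the file names are hypothetical, not from this PR:)

# keep only the first sequence record to get a much smaller test genome
awk '/^>/{n++} n<=1' genome.fa > genome_small.fa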

@gallardoalba
Contributor Author

> I will have a look at the test error, that looks like a Galaxy bug

Hi @mvdbeek, now it triggers this error:
500 Server Error: Internal Server Error for url: http://localhost:41967/api/histories/2891970512fa2d5a/contents/3e4e8de146a035f5/display
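
(For reference, that endpoint can be queried directly to inspect the 500 response; the host, IDs and API key below are placeholders for the ephemeral test Galaxy, not values to reuse:)

# fetch the dataset the test framework tried to download
curl -s "http://localhost:41967/api/histories/<history_id>/contents/<dataset_id>/display?key=<api_key>"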

@mvdbeek
Member

mvdbeek commented Jul 13, 2021

It's maker that is failing now:

2021-07-13T14:26:06.8356201Z galaxy.jobs.command_factory INFO 2021-07-13 14:26:06,464 [pN:main,p:5733,tN:LocalRunner.work_thread-3] Built script [/tmp/tmpx002mwjt/job_working_directory/000/10/tool_script.sh] for tool command [cp '/tmp/tmpx002mwjt/files/000/dataset_4.dat' input.gff3 && echo "##FASTA" >> input.gff3 && cat '/tmp/tmpx002mwjt/files/000/dataset_2.dat' >> input.gff3 && maker2zff input.gff3 && fathom -categorize 1000 genome.ann genome.dna && fathom -export 1000 -plus uni.ann uni.dna && forge export.ann export.dna && hmm-assembler.pl snap_training . > '/tmp/tmpx002mwjt/files/000/dataset_15.dat']
...
...
2021-07-13T15:51:11.7103800Z galaxy.tool_util.output_checker INFO 2021-07-13 15:51:11,230 [pN:main,p:5733,tN:LocalRunner.work_thread-2] Job error detected, failing job. Reasons are [{'type': 'exit_code', 'desc': 'Fatal error: Exit code 1 ()', 'exit_code': 1, 'code_desc': '', 'error_level': 3}]
2021-07-13T15:51:11.7105594Z 127.0.0.1 - - [13/Jul/2021:15:51:11 +0000] "GET /api/histories/2891970512fa2d5a HTTP/1.1" 200 - "-" "python-requests/2.25.1"

I have updated planemo to produce more meaningful output; let's see what's going on there.

@mvdbeek
Member

mvdbeek commented Jul 13, 2021

Also, maker is on the biocontainer skip list: https://github.com/galaxyproject/tools-iuc/blob/master/.tt_biocontainer_skip#L1; that might be a problem too.
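
(A quick way to double-check, as a sketch; the raw.githubusercontent.com URL is simply the raw view of the file linked above:)

# list any maker entries in tools-iuc's biocontainer skip list
curl -s https://raw.githubusercontent.com/galaxyproject/tools-iuc/master/.tt_biocontainer_skip | grep -i maker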

@gallardoalba
Contributor Author

Now it triggers this error:
File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/planemo/reports/macros.tmpl", line 5, in template
  - **Step {{step_data.order_index + 1}}: {{step_data.workflow_step_label or (step_data.jobs[0].tool_id if step_data.jobs[0] else 'Unlabelled step')|replace("_", "\_")}}**:
jinja2.exceptions.UndefinedError: 'dict object' has no attribute 'order_index'

@mvdbeek
Member

mvdbeek commented Jul 14, 2021

You can see the actual error in the artifact uploaded in the test job:

Results (powered by Planemo)

Summary

State Count
Total 1
Passed 0
Error 1
Failure 0
Skipped 0
Errored Tests
  • ❌ maker-annotation-eukaryote.ga_0

    Workflow invocation details

    • Steps
      • Step 23: toolshed.g2.bx.psu.edu/repos/iuc/busco/busco/4.1.4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              adv {"aug_prediction": {"__current_case__": 0, "augustus_mode": "no"}, "evalue": "0.01", "limit": "3", "long": "false"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              lineage_dataset "fungi_odb10"
              mode "tran"
      • Step 22: toolshed.g2.bx.psu.edu/repos/iuc/jcvi_gff_stats/jcvi_gff_stats/0.8.4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              ref_genome {"__current_case__": 1, "genome": {"values": [{"id": 2, "src": "hda"}]}, "genome_type_select": "history"}
      • Step 21: toolshed.g2.bx.psu.edu/repos/devteam/gffread/gffread/2.2.1.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "gff3"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chr_replace None
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              decode_url "false"
              expose "false"
              filtering None
              full_gff_attribute_preservation "false"
              gffs {"__current_case__": 0, "gff_fmt": "none"}
              maxintron None
              merging {"__current_case__": 0, "merge_sel": "none"}
              reference_genome {"__current_case__": 2, "fa_outputs": ["-w exons.fa", "-x cds.fa", "-y pep.fa"], "genome_fasta": {"values": [{"id": 2, "src": "hda"}]}, "ref_filtering": null, "source": "history"}
              region {"__current_case__": 0, "region_filter": "none"}
      • Step 20: toolshed.g2.bx.psu.edu/repos/iuc/jbrowse/jbrowse/1.16.10+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "fasta"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              action {"__current_case__": 0, "action_select": "create"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              gencode "1"
              jbgen {"aboutDescription": "", "defaultLocation": "", "hideGenomeOptions": "false", "shareLink": "true", "show_menu": "true", "show_nav": "true", "show_overview": "true", "show_tracklist": "true", "trackPadding": "20"}
              plugins {"BlastView": "true", "ComboTrackSelector": "false", "GCContent": "false"}
              reference_genome {"__current_case__": 1, "genome": {"values": [{"id": 2, "src": "hda"}]}, "genome_type_select": "history"}
              standalone "minimal"
              track_groups [{"__index__": 0, "category": "Maker annotation", "data_tracks": [{"__index__": 0, "data_format": {"__current_case__": 2, "annotation": {"values": [{"id": 4, "src": "hda"}, {"id": 33, "src": "hda"}, {"id": 19, "src": "hda"}]}, "data_format_select": "gene_calls", "index": "false", "jb_custom_config": {"option": []}, "jbcolor_scale": {"color_score": {"__current_case__": 0, "color": {"__current_case__": 0, "color_select": "automatic"}, "color_score_select": "none"}}, "jbmenu": {"track_menu": []}, "jbstyle": {"max_height": "600", "style_classname": "feature", "style_description": "note,description", "style_height": "10px", "style_label": "product,name,id"}, "match_part": {"__current_case__": 1, "match_part_select": "false"}, "override_apollo_drag": "False", "override_apollo_plugins": "False", "track_config": {"__current_case__": 3, "html_options": {"topLevelFeatures": ""}, "track_class": "NeatHTMLFeatures/View/Track/NeatFeatures"}, "track_visibility": "default_off"}}]}, {"__index__": 1, "category": "Maker evidences", "data_tracks": [{"__index__": 0, "data_format": {"__current_case__": 2, "annotation": {"values": [{"id": 5, "src": "hda"}, {"id": 20, "src": "hda"}, {"id": 31, "src": "hda"}]}, "data_format_select": "gene_calls", "index": "false", "jb_custom_config": {"option": []}, "jbcolor_scale": {"color_score": {"__current_case__": 0, "color": {"__current_case__": 0, "color_select": "automatic"}, "color_score_select": "none"}}, "jbmenu": {"track_menu": []}, "jbstyle": {"max_height": "600", "style_classname": "feature", "style_description": "note,description", "style_height": "10px", "style_label": "product,name,id"}, "match_part": {"__current_case__": 0, "match_part_select": "true", "name": {"__class__": "RuntimeValue"}}, "override_apollo_drag": "False", "override_apollo_plugins": "False", "track_config": {"__current_case__": 3, "html_options": {"topLevelFeatures": ""}, "track_class": "NeatHTMLFeatures/View/Track/NeatFeatures"}, "track_visibility": "default_off"}}]}]
              uglyTestingHack ""
      • Step 19: toolshed.g2.bx.psu.edu/repos/iuc/maker_map_ids/maker_map_ids/2.31.11:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              justify "6"
              prefix "TEST"
      • Step 18: toolshed.g2.bx.psu.edu/repos/iuc/maker/maker/2.31.11:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __job_resource {"__current_case__": 0, "__job_resource__select": "no"}
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              abinitio_gene_prediction {"aug_prediction": {"__current_case__": 1, "augustus_mode": "history", "augustus_model": {"values": [{"id": 23, "src": "hda"}]}}, "snaphmm": {"values": [{"id": 26, "src": "hda"}]}, "unmask": "false"}
              advanced {"AED_threshold": "1.0", "alt_peptide": "C", "alt_splice": "false", "always_complete": "false", "correct_est_fusion": "false", "keep_preds": "0.0", "map_forward": "false", "max_dna_len": "100000", "min_contig": "1", "min_protein": "0", "other_gff": null, "pred_flank": "200", "pred_stats": "false", "single_exon": {"__current_case__": 0, "single_exon": "0"}, "split_hit": "10000"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              est_evidences {"altest": null, "altest_gff": null, "est": null, "est2genome": "false", "est_gff": null}
              gene_prediction {"model_gff": null, "pred_gff": null, "snoscan_rrna": null, "trna": "false"}
              organism_type "eukaryotic"
              protein_evidences {"protein": null, "protein2genome": "false", "protein_gff": null}
              reannotation {"__current_case__": 1, "altest_pass": "true", "est_pass": "true", "maker_gff": {"values": [{"id": 21, "src": "hda"}]}, "model_pass": "false", "other_pass": "false", "pred_pass": "false", "protein_pass": "true", "reannotate": "yes", "rm_pass": "true"}
              repeat_masking {"repeat_source": {"__current_case__": 3, "source_type": "no"}}
      • Step 17: toolshed.g2.bx.psu.edu/repos/iuc/busco/busco/4.1.4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              adv {"aug_prediction": {"__current_case__": 0, "augustus_mode": "no"}, "evalue": "0.01", "limit": "3", "long": "false"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              lineage_dataset "fungi_odb10"
              mode "tran"
      • Step 16: toolshed.g2.bx.psu.edu/repos/iuc/snap_training/snap_training/2013_11_29+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              gene_num "1000"
      • Step 15: toolshed.g2.bx.psu.edu/repos/iuc/jcvi_gff_stats/jcvi_gff_stats/0.8.4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              ref_genome {"__current_case__": 1, "genome": {"values": [{"id": 2, "src": "hda"}]}, "genome_type_select": "history"}
      • Step 14: toolshed.g2.bx.psu.edu/repos/bgruening/augustus_training/augustus_training/3.3.3:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
      • Step 13: toolshed.g2.bx.psu.edu/repos/devteam/gffread/gffread/2.2.1.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "gff3"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chr_replace None
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              decode_url "false"
              expose "false"
              filtering None
              full_gff_attribute_preservation "false"
              gffs {"__current_case__": 0, "gff_fmt": "none"}
              maxintron None
              merging {"__current_case__": 0, "merge_sel": "none"}
              reference_genome {"__current_case__": 2, "fa_outputs": ["-w exons.fa"], "genome_fasta": {"values": [{"id": 2, "src": "hda"}]}, "ref_filtering": null, "source": "history"}
              region {"__current_case__": 0, "region_filter": "none"}
      • Step 12: toolshed.g2.bx.psu.edu/repos/iuc/maker/maker/2.31.11:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __job_resource {"__current_case__": 0, "__job_resource__select": "no"}
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              abinitio_gene_prediction {"aug_prediction": {"__current_case__": 1, "augustus_mode": "history", "augustus_model": {"values": [{"id": 11, "src": "hda"}]}}, "snaphmm": {"values": [{"id": 15, "src": "hda"}]}, "unmask": "false"}
              advanced {"AED_threshold": "1.0", "alt_peptide": "C", "alt_splice": "false", "always_complete": "false", "correct_est_fusion": "false", "keep_preds": "0.0", "map_forward": "false", "max_dna_len": "100000", "min_contig": "1", "min_protein": "0", "other_gff": null, "pred_flank": "200", "pred_stats": "false", "single_exon": {"__current_case__": 0, "single_exon": "0"}, "split_hit": "10000"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              est_evidences {"altest": null, "altest_gff": null, "est": null, "est2genome": "false", "est_gff": null}
              gene_prediction {"model_gff": null, "pred_gff": null, "snoscan_rrna": null, "trna": "false"}
              organism_type "eukaryotic"
              protein_evidences {"protein": null, "protein2genome": "false", "protein_gff": null}
              reannotation {"__current_case__": 1, "altest_pass": "true", "est_pass": "true", "maker_gff": {"values": [{"id": 6, "src": "hda"}]}, "model_pass": "false", "other_pass": "false", "pred_pass": "false", "protein_pass": "true", "reannotate": "yes", "rm_pass": "true"}
              repeat_masking {"repeat_source": {"__current_case__": 3, "source_type": "no"}}
      • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/busco/busco/4.1.4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • if [ -z "$AUGUSTUS_CONFIG_PATH" ] ; then BUSCO_PATH=$(dirname $(which busco)) ; export AUGUSTUS_CONFIG_PATH=$(realpath ${BUSCO_PATH}/../config) ; fi && cp -r "$AUGUSTUS_CONFIG_PATH/" augustus_dir/ && export AUGUSTUS_CONFIG_PATH=`pwd`/augustus_dir/ &&   busco --in '/tmp/tmpntphx7id/files/000/dataset_14.dat' --lineage_dataset 'fungi_odb10' --update-data --mode 'tran' -o busco_galaxy --cpu ${GALAXY_SLOTS:-4} --evalue 0.01  --limit 3

            Exit Code:

            • 0

            Standard Output:

            • INFO:	***** Start a BUSCO v4.1.4 analysis, current time: 07/13/2021 17:13:48 *****
              INFO:	Configuring BUSCO with /usr/local/share/busco/config.ini
              INFO:	Mode is transcriptome
              WARNING:	You are using a custom e-value cutoff
              INFO:	Input file is /tmp/tmpntphx7id/files/000/dataset_14.dat
              INFO:	Downloading information on latest versions of BUSCO data...
              INFO:	Downloading file 'https://busco-data.ezlab.org/v4/data/lineages/fungi_odb10.2020-09-10.tar.gz'
              INFO:	Decompressing file '/tmp/tmpntphx7id/job_working_directory/000/11/working/busco_downloads/lineages/fungi_odb10.tar.gz'
              INFO:	Running BUSCO using lineage dataset fungi_odb10 (eukaryota, 2020-09-10)
              INFO:	Running 1 job(s) on makeblastdb, starting at 07/13/2021 17:14:05
              INFO:	Creating BLAST database with input file
              INFO:	[makeblastdb]	1 of 1 task(s) completed
              INFO:	Running a BLAST search for BUSCOs against created database
              INFO:	Running 1 job(s) on tblastn, starting at 07/13/2021 17:14:05
              INFO:	[tblastn]	1 of 1 task(s) completed
              INFO:	Translating candidate transcripts
              INFO:	***** Run HMMER on gene sequences *****
              INFO:	Running 158 job(s) on hmmsearch, starting at 07/13/2021 17:14:50
              INFO:	[hmmsearch]	16 of 158 task(s) completed
              INFO:	[hmmsearch]	32 of 158 task(s) completed
              INFO:	[hmmsearch]	48 of 158 task(s) completed
              INFO:	[hmmsearch]	64 of 158 task(s) completed
              INFO:	[hmmsearch]	80 of 158 task(s) completed
              INFO:	[hmmsearch]	95 of 158 task(s) completed
              INFO:	[hmmsearch]	111 of 158 task(s) completed
              INFO:	[hmmsearch]	127 of 158 task(s) completed
              INFO:	[hmmsearch]	143 of 158 task(s) completed
              INFO:	[hmmsearch]	158 of 158 task(s) completed
              INFO:	
              
              	--------------------------------------------------
              	|Results from dataset fungi_odb10                 |
              	--------------------------------------------------
              	|C:4.8%[S:4.5%,D:0.3%],F:0.1%,M:95.1%,n:758       |
              	|36	Complete BUSCOs (C)                       |
              	|34	Complete and single-copy BUSCOs (S)       |
              	|2	Complete and duplicated BUSCOs (D)        |
              	|1	Fragmented BUSCOs (F)                     |
              	|721	Missing BUSCOs (M)                        |
              	|758	Total BUSCO groups searched               |
              	--------------------------------------------------
              INFO:	BUSCO analysis done with WARNING(s). Total running time: 74 seconds
              
              ***** Summary of warnings: *****
              WARNING:busco.BuscoConfig	You are using a custom e-value cutoff
              
              INFO:	Results written in /tmp/tmpntphx7id/job_working_directory/000/11/working/busco_galaxy
              
              

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              adv {"aug_prediction": {"__current_case__": 0, "augustus_mode": "no"}, "evalue": "0.01", "limit": "3", "long": "false"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              lineage_dataset "fungi_odb10"
              mode "tran"
      • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/snap_training/snap_training/2013_11_29+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpntphx7id/files/000/dataset_4.dat' input.gff3 && echo "##FASTA" >> input.gff3 && cat '/tmp/tmpntphx7id/files/000/dataset_2.dat' >> input.gff3 && maker2zff input.gff3 && fathom -categorize 1000 genome.ann genome.dna && fathom -export 1000 -plus uni.ann uni.dna && forge export.ann export.dna && hmm-assembler.pl snap_training . > '/tmp/tmpntphx7id/files/000/dataset_15.dat'

            Exit Code:

            • 0

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              gene_num "1000"
      • Step 9: toolshed.g2.bx.psu.edu/repos/devteam/gffread/gffread/2.2.1.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpntphx7id/files/000/dataset_2.dat' genomeref.fa && gffread '/tmp/tmpntphx7id/files/000/dataset_4.dat' -g genomeref.fa      -w exons.fa

            Exit Code:

            • 0

            Standard Error:

            • No fasta index found for genomeref.fa. Rebuilding, please wait..
              Fasta index rebuilt.
              

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "gff3"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chr_replace None
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              decode_url "false"
              expose "false"
              filtering None
              full_gff_attribute_preservation "false"
              gffs {"__current_case__": 0, "gff_fmt": "none"}
              maxintron None
              merging {"__current_case__": 0, "merge_sel": "none"}
              reference_genome {"__current_case__": 2, "fa_outputs": ["-w exons.fa"], "genome_fasta": {"values": [{"id": 2, "src": "hda"}]}, "ref_filtering": null, "source": "history"}
              region {"__current_case__": 0, "region_filter": "none"}
      • Step 8: toolshed.g2.bx.psu.edu/repos/iuc/jcvi_gff_stats/jcvi_gff_stats/0.8.4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpntphx7id/files/000/dataset_4.dat' 'input.gff'  &&  python -m jcvi.annotation.stats genestats 'input.gff' > '/tmp/tmpntphx7id/files/000/dataset_12.dat'  &&  python -m jcvi.annotation.stats summary 'input.gff' '/tmp/tmpntphx7id/files/000/dataset_2.dat' 2>&1 | tail -n +3 >> '/tmp/tmpntphx7id/files/000/dataset_12.dat'  &&  python -m jcvi.annotation.stats stats 'input.gff' 2>&1 | grep Mean >> '/tmp/tmpntphx7id/files/000/dataset_12.dat'  &&  python -m jcvi.annotation.stats histogram 'input.gff'  &&  pdfunite *.input.pdf '/tmp/tmpntphx7id/files/000/dataset_13.dat'

            Exit Code:

            • 0

            Standard Error:

            • 17:12:59 [gff] Indexing `input.gff`
              17:12:59 [base] Load file `transcript.sizes`
              17:12:59 [base] Imported 187 records from `transcript.sizes`.
              17:12:59 [base] Load file `transcript.sizes`
              17:12:59 [base] Imported 187 records from `transcript.sizes`.
              17:12:59 [stats] A total of 187 transcripts populated.
              17:13:01 [__init__] $HOME=None
              17:13:01 [__init__] matplotlib data path /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data
              17:13:01 [__init__] loaded rc file /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/matplotlibrc
              17:13:01 [__init__] matplotlib version 2.2.2
              17:13:01 [__init__] interactive is False
              17:13:01 [__init__] platform is linux2
              17:13:01 [__init__] loaded modules: [... long list of imported module names elided ...]
              17:13:01 [__init__] CACHEDIR=/tmp/matplotlib-gykF_4
              17:13:01 [font_manager] font search path ['/usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/ttf', '/usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/afm', '/usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/pdfcorefonts']
              17:13:01 [font_manager] [... repetitive "trying fontname" / "createFontDict" entries for the bundled matplotlib fonts elided ...]
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m createFontDict: /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/afm/pbkd8a.afm�[0m
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m createFontDict: /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/afm/pzcmi8a.afm�[0m
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m createFontDict: /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/afm/pagko8a.afm�[0m
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m createFontDict: /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/pdfcorefonts/Times-Roman.afm�[0m
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m createFontDict: /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/afm/pbkdi8a.afm�[0m
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m createFontDict: /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/afm/pagdo8a.afm�[0m
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m createFontDict: /usr/local/lib/python2.7/site-packages/matplotlib/mpl-data/fonts/afm/pagk8a.afm�[0m
              �[0;33m17:13:01 [font_manager]�[0m�[0;35m generated new fontManager�[0m
              �[0;33m17:13:02 [__init__]�[0m�[0;35m backend agg version v2.2�[0m
              17:13:02 [stats] Parsing files in `Exon_Length`..
              Exon_Length/input.txt: Min=2 Max=4044 N=542 Mean=412.461254613 SD=539.110040671 Median=203.0 Sum=223554
              17:13:02 [base] Rscript /tmp/tmp1HnJ93
              Warning message:
              Removed 68 rows containing non-finite values (stat_bin). 
              Saving 7 x 7 in image
              Warning message:
              Removed 68 rows containing non-finite values (stat_bin). 
              17:13:08 [stats] Parsing files in `Intron_Length`..
              Intron_Length/input.txt: Min=27 Max=7568 N=355 Mean=121.385915493 SD=513.318795955 Median=56.0 Sum=43092
              17:13:08 [base] Rscript /tmp/tmpmET9dv
              Warning message:
              Removed 2 rows containing non-finite values (stat_bin). 
              Saving 7 x 7 in image
              Warning message:
              Removed 2 rows containing non-finite values (stat_bin). 
              17:13:14 [stats] Parsing files in `Gene_Length`..
              Gene_Length/input.txt: Min=111 Max=4044 N=187 Mean=1195.47593583 SD=787.281543052 Median=987.0 Sum=223554
              17:13:14 [base] Rscript /tmp/tmp3Uk5yY
              Warning message:
              Removed 1 rows containing non-finite values (stat_bin). 
              Saving 7 x 7 in image
              Warning message:
              Removed 1 rows containing non-finite values (stat_bin). 
              17:13:20 [stats] Parsing files in `Exon_Count`..
              Exon_Count/input.txt: Min=1 Max=13 N=187 Mean=2.89839572193 SD=1.82828273173 Median=2.0 Sum=542
              17:13:20 [base] Rscript /tmp/tmpVXIYvC
              Saving 7 x 7 in image
              

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              ref_genome {"__current_case__": 1, "genome": {"values": [{"id": 2, "src": "hda"}]}, "genome_type_select": "history"}
      • Step 7: toolshed.g2.bx.psu.edu/repos/bgruening/augustus_training/augustus_training/3.3.3:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is error

            Command Line:

            • cp -r `which augustus | xargs dirname`/../config/ augustus_dir/ &&  export AUGUSTUS_CONFIG_PATH=`pwd`/augustus_dir/ &&  maker2zff '/tmp/tmpntphx7id/files/000/dataset_4.dat' &&  zff2gff3.pl genome.ann | perl -plne 's/\t(\S+)$/\t\.\t$1/' > genome.gff3 &&  autoAugTrain.pl --genome=/tmp/tmpntphx7id/files/000/dataset_2.dat --species=local --trainingset=genome.gff3 -v &&  cd 'augustus_dir/species/' && tar cvfz '/tmp/tmpntphx7id/files/000/dataset_11.dat' 'local'

            Exit Code:

            • 1

            Standard Error:

            • Use of uninitialized value in print at /usr/local/bin/maker2zff line 171, <GFF> line 1800.
              Sequence NC_003421.2 Schizosaccharomyces pombe 972h- chromosome III has no annotation but NC_003421.2 has. Assuming that space truncates name.
              tar: invalid option -- 'z'
              BusyBox v1.22.1 (2014-05-23 01:24:27 UTC) multi-call binary.
              
              Usage: tar -[cxthvO] [-X FILE] [-T FILE] [-f TARFILE] [-C DIR] [FILE]...
              
              Create, extract, or list files from a tar file
              
              Operation:
              	c	Create
              	x	Extract
              	t	List
              	f	Name of TARFILE ('-' for stdin/out)
              	C	Change to DIR before operation
              	v	Verbose
              	O	Extract to stdout
              	h	Follow symlinks
              	exclude	File to exclude
              	X	File with names to exclude
              	T	File with names to include
              
              

            Standard Output:

            • 1 Checking fasta headers in file /tmp/tmpntphx7id/files/000/dataset_2.dat...
              1 - WARNING: Detected whitespace in fasta header of file /tmp/tmpntphx7id/files/000/dataset_2.dat. This may later on cause problems! If the pipeline turns out to crash, please clean up the fasta headers, e.g. by using simplifyFastaHeaders.pl. This message will be suppressed from now on!
              Warning: I assumed 1 times that sequence names end at first space.
              2 Will create parameters for a EUKARYOTIC species!
              2 The necessary files for training local have been created.
              2 Now training AUGUSTUS parameters for local.
              1 training.gb contains 148 sequences and 148 genes, each sequence contains 1 gene(s) on average.
              1 test/evaluation set training.gb.test contains 14 sequences and 14 genes.
              1 training set training.gb.train contains 134 sequences and 134 genes.
              1 randomly selecting 133 genes from the training set training.gb.train...
              1 training.gb.train.test contains 133 sequences and 133 genes.
              1 training.gb.onlytrain contains 1 sequences and 1 genes.
              1 Ab initio prediction accuracy of AUGUSTUS without optimizing, without UTRs, is: 0.5254
              1 Optimizing meta parameters of AUGUSTUS
              1 The accuracy after optimizing without CRF-etraining is 0.609066666666667
              
              
              

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
      • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/busco/busco/4.1.4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • if [ -z "$AUGUSTUS_CONFIG_PATH" ] ; then BUSCO_PATH=$(dirname $(which busco)) ; export AUGUSTUS_CONFIG_PATH=$(realpath ${BUSCO_PATH}/../config) ; fi && cp -r "$AUGUSTUS_CONFIG_PATH/" augustus_dir/ && export AUGUSTUS_CONFIG_PATH=`pwd`/augustus_dir/ &&   busco --in '/tmp/tmpntphx7id/files/000/dataset_2.dat' --lineage_dataset 'fungi_odb10' --update-data --mode 'geno' -o busco_galaxy --cpu ${GALAXY_SLOTS:-4} --evalue 0.01  --limit 3

            Exit Code:

            • 0

            Standard Output:

            • INFO:	***** Start a BUSCO v4.1.4 analysis, current time: 07/13/2021 17:04:35 *****
              INFO:	Configuring BUSCO with /usr/local/share/busco/config.ini
              INFO:	Mode is genome
              WARNING:	You are using a custom e-value cutoff
              INFO:	Input file is /tmp/tmpntphx7id/files/000/dataset_2.dat
              INFO:	Downloading information on latest versions of BUSCO data...
              INFO:	Downloading file 'https://busco-data.ezlab.org/v4/data/lineages/fungi_odb10.2020-09-10.tar.gz'
              INFO:	Decompressing file '/tmp/tmpntphx7id/job_working_directory/000/6/working/busco_downloads/lineages/fungi_odb10.tar.gz'
              INFO:	Running BUSCO using lineage dataset fungi_odb10 (eukaryota, 2020-09-10)
              INFO:	Running 1 job(s) on makeblastdb, starting at 07/13/2021 17:04:49
              INFO:	Creating BLAST database with input file
              INFO:	[makeblastdb]	1 of 1 task(s) completed
              INFO:	Running a BLAST search for BUSCOs against created database
              INFO:	Running 1 job(s) on tblastn, starting at 07/13/2021 17:04:49
              INFO:	[tblastn]	1 of 1 task(s) completed
              INFO:	Running Augustus gene predictor on BLAST search results.
              INFO:	Running Augustus prediction using aspergillus_nidulans as species:
              INFO:	Running 144 job(s) on augustus, starting at 07/13/2021 17:04:55
              INFO:	[augustus]	15 of 144 task(s) completed
              INFO:	[augustus]	29 of 144 task(s) completed
              INFO:	[augustus]	44 of 144 task(s) completed
              INFO:	[augustus]	58 of 144 task(s) completed
              INFO:	[augustus]	73 of 144 task(s) completed
              INFO:	[augustus]	87 of 144 task(s) completed
              INFO:	[augustus]	101 of 144 task(s) completed
              INFO:	[augustus]	116 of 144 task(s) completed
              INFO:	[augustus]	130 of 144 task(s) completed
              INFO:	[augustus]	144 of 144 task(s) completed
              INFO:	Extracting predicted proteins...
              INFO:	***** Run HMMER on gene sequences *****
              INFO:	Running 96 job(s) on hmmsearch, starting at 07/13/2021 17:10:51
              INFO:	[hmmsearch]	10 of 96 task(s) completed
              INFO:	[hmmsearch]	20 of 96 task(s) completed
              INFO:	[hmmsearch]	29 of 96 task(s) completed
              INFO:	[hmmsearch]	39 of 96 task(s) completed
              INFO:	[hmmsearch]	48 of 96 task(s) completed
              INFO:	[hmmsearch]	58 of 96 task(s) completed
              INFO:	[hmmsearch]	68 of 96 task(s) completed
              INFO:	[hmmsearch]	77 of 96 task(s) completed
              INFO:	[hmmsearch]	87 of 96 task(s) completed
              INFO:	[hmmsearch]	96 of 96 task(s) completed
              INFO:	Starting second step of analysis. The gene predictor Augustus is retrained using the results from the initial run to yield more accurate results.
              INFO:	Extracting missing and fragmented buscos from the file ancestral_variants...
              INFO:	Running a BLAST search for BUSCOs against created database
              INFO:	Running 1 job(s) on tblastn, starting at 07/13/2021 17:10:56
              INFO:	[tblastn]	1 of 1 task(s) completed
              INFO:	Converting predicted genes to short genbank files
              INFO:	Running 36 job(s) on gff2gbSmallDNA.pl, starting at 07/13/2021 17:11:48
              INFO:	[gff2gbSmallDNA.pl]	4 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	8 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	11 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	15 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	19 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	22 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	26 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	29 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	33 of 36 task(s) completed
              INFO:	[gff2gbSmallDNA.pl]	36 of 36 task(s) completed
              INFO:	All files converted to short genbank files, now training Augustus using Single-Copy Complete BUSCOs
              INFO:	Running 1 job(s) on new_species.pl, starting at 07/13/2021 17:11:50
              INFO:	[new_species.pl]	1 of 1 task(s) completed
              INFO:	Running 1 job(s) on etraining, starting at 07/13/2021 17:11:50
              INFO:	[etraining]	1 of 1 task(s) completed
              INFO:	Re-running Augustus with the new metaparameters, number of target BUSCOs: 722
              INFO:	Running Augustus gene predictor on BLAST search results.
              INFO:	Running Augustus prediction using BUSCO_busco_galaxy as species:
              INFO:	Running 172 job(s) on augustus, starting at 07/13/2021 17:11:50
              INFO:	[augustus]	18 of 172 task(s) completed
              INFO:	[augustus]	35 of 172 task(s) completed
              INFO:	[augustus]	52 of 172 task(s) completed
              INFO:	[augustus]	69 of 172 task(s) completed
              INFO:	[augustus]	86 of 172 task(s) completed
              INFO:	[augustus]	104 of 172 task(s) completed
              INFO:	[augustus]	121 of 172 task(s) completed
              INFO:	[augustus]	138 of 172 task(s) completed
              INFO:	[augustus]	155 of 172 task(s) completed
              INFO:	[augustus]	172 of 172 task(s) completed
              INFO:	Extracting predicted proteins...
              INFO:	***** Run HMMER on gene sequences *****
              INFO:	Running 169 job(s) on hmmsearch, starting at 07/13/2021 17:23:57
              INFO:	[hmmsearch]	17 of 169 task(s) completed
              INFO:	[hmmsearch]	34 of 169 task(s) completed
              INFO:	[hmmsearch]	51 of 169 task(s) completed
              INFO:	[hmmsearch]	68 of 169 task(s) completed
              INFO:	[hmmsearch]	85 of 169 task(s) completed
              INFO:	[hmmsearch]	102 of 169 task(s) completed
              INFO:	[hmmsearch]	119 of 169 task(s) completed
              INFO:	[hmmsearch]	136 of 169 task(s) completed
              INFO:	[hmmsearch]	153 of 169 task(s) completed
              INFO:	[hmmsearch]	169 of 169 task(s) completed
              INFO:	Results:	C:6.1%[S:6.1%,D:0.0%],F:0.9%,M:93.0%,n:758	   
              
              INFO:	
              
              	--------------------------------------------------
              	|Results from dataset fungi_odb10                 |
              	--------------------------------------------------
              	|C:6.1%[S:6.1%,D:0.0%],F:0.9%,M:93.0%,n:758       |
              	|46	Complete BUSCOs (C)                       |
              	|46	Complete and single-copy BUSCOs (S)       |
              	|0	Complete and duplicated BUSCOs (D)        |
              	|7	Fragmented BUSCOs (F)                     |
              	|705	Missing BUSCOs (M)                        |
              	|758	Total BUSCO groups searched               |
              	--------------------------------------------------
              INFO:	BUSCO analysis done with WARNING(s). Total running time: 1170 seconds
              
              ***** Summary of warnings: *****
              WARNING:busco.BuscoConfig	You are using a custom e-value cutoff
              
              INFO:	Results written in /tmp/tmpntphx7id/job_working_directory/000/6/working/busco_galaxy
              
              

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              adv {"aug_prediction": {"__current_case__": 0, "augustus_mode": "no"}, "evalue": "0.01", "limit": "3", "long": "false"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              lineage_dataset "fungi_odb10"
              mode "geno"
      • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/fasta_stats/fasta-stats/1.0.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/cvmfs/main.galaxyproject.org/shed_tools/toolshed.g2.bx.psu.edu/repos/iuc/fasta_stats/9c620a950d3a/fasta_stats/fasta-stats.pl' '/tmp/tmpntphx7id/files/000/dataset_2.dat' > '/tmp/tmpntphx7id/files/000/dataset_7.dat'

            Exit Code:

            • 0

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "fasta"
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
      • Step 4: toolshed.g2.bx.psu.edu/repos/iuc/maker/maker/2.31.11:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • RM_PATH=$(which RepeatMasker) && if [ -z "$RM_PATH" ] ; then echo "Failed to find RepeatMasker in PATH ($PATH)" >&2 ; exit 1 ; fi &&  LIBDIR=$(dirname "$RM_PATH")/../share/RepeatMasker/Libraries &&  export LIBDIR &&  maker -CTL  &&  cp '/tmp/tmpntphx7id/job_working_directory/000/4/configs/tmpnoue6w_k' maker_opts.ctl  &&   MPI_CMD="" && if [ "$MAKER_MPI" == "1" ]; then MPI_CMD="mpiexec -n ${GALAXY_SLOTS:-4}"; fi &&  ${MPI_CMD} maker --ignore_nfs_tmp maker_opts.ctl maker_bopts.ctl maker_exe.ctl < /dev/null  &&  gff3_merge -d *.maker.output/*_master_datastore_index.log -o '/tmp/tmpntphx7id/files/000/dataset_6.dat'  &&  awk '{if ($2 == "maker" || $1 ~ /^#/) {print}}' '/tmp/tmpntphx7id/files/000/dataset_6.dat' | sed -n '/^##FASTA$/q;p' > '/tmp/tmpntphx7id/files/000/dataset_4.dat'  &&  awk '{if ($2 != "maker") {print}}' '/tmp/tmpntphx7id/files/000/dataset_6.dat' | sed -n '/^##FASTA$/q;p' > '/tmp/tmpntphx7id/files/000/dataset_5.dat'
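
            For readers skimming the long one-liner above: after `gff3_merge`, the trailing awk/sed pair splits the merged GFF3 into the MAKER annotation proper (lines whose second column is "maker", plus `#` header lines) and the evidence alignments (everything else), with `sed -n '/^##FASTA$/q;p'` cutting off the embedded FASTA block in both cases. A small sketch with hypothetical file names, not the real Galaxy dataset paths, shows the same split:

              # Illustrative only; merged.gff3, maker_annotation.gff3 and evidence.gff3 are placeholder names.
              awk '{if ($2 == "maker" || $1 ~ /^#/) {print}}' merged.gff3 | sed -n '/^##FASTA$/q;p' > maker_annotation.gff3
              awk '{if ($2 != "maker") {print}}' merged.gff3 | sed -n '/^##FASTA$/q;p' > evidence.gff3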

            Exit Code:

            • 0

            Standard Error:

            • Possible precedence issue with control flow operator at /usr/local/lib/site_perl/5.26.2/Bio/DB/IndexedBase.pm line 805.
              Possible precedence issue with control flow operator at /usr/local/lib/site_perl/5.26.2/Bio/DB/IndexedBase.pm line 805.
              STATUS: Parsing control files...
              STATUS: Processing and indexing input FASTA files...
              STATUS: Setting up database for any GFF3 input...
              A data structure will be created for you at:
              /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore
              
              To access files for individual sequences use the datastore index:
              /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_master_datastore_index.log
              
              STATUS: Now running MAKER...
              examining contents of the fasta file and run log
              
              
              
              --Next Contig--
              
              #---------------------------------------------------------------------
              Now starting the contig!!
              SeqID: NC_003421.2
              Length: 1031309
              #---------------------------------------------------------------------
              
              
              setting up GFF3 output and fasta chunks
              preparing ab-inits
              gathering ab-init output files
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.0
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.0 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.0.blastn
              #-------------------------------#
              deleted:2 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.1
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.1 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.1.blastn
              #-------------------------------#
              deleted:-2 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.2
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.2 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.2.blastn
              #-------------------------------#
              deleted:1 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.3
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.3 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.3.blastn
              #-------------------------------#
              deleted:0 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.4
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.4 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.4.blastn
              #-------------------------------#
              deleted:-2 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.5
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.5 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.5.blastn
              #-------------------------------#
              deleted:-2 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.6
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.6 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.6.blastn
              #-------------------------------#
              deleted:-2 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.7
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.7 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.7.blastn
              #-------------------------------#
              deleted:-2 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.8
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.8 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.8.blastn
              #-------------------------------#
              deleted:-2 hits
              doing blastn of ESTs
              formating database...
              #--------- command -------------#
              Widget::formater:
              /usr/local/bin/makeblastdb -dbtype nucl -in /tmp/maker_gHkZjC/0/blastprep/dataset_1%2Edat.mpi.10.9
              #-------------------------------#
              running  blast search.
              #--------- command -------------#
              Widget::blastn:
              /usr/local/bin/blastn -db /tmp/maker_gHkZjC/dataset_1%2Edat.mpi.10.9 -query /tmp/maker_gHkZjC/0/NC_003421%2E2.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -word_size 28 -reward 1 -penalty -5 -gapopen 5 -gapextend 5 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -dust yes -soft_masking false -show_gis -out /tmp/tmpntphx7id/job_working_directory/000/4/working/dataset_2.maker.output/dataset_2_datastore/0B/63/NC_003421.2//theVoid.NC_003421.2/0/NC_003421%2E2.0.dataset_1%2Edat.blastn.temp_dir/dataset_1%2Edat.mpi.10.9.blastn
              #-------------------------------#
              deleted:-1 hits
              collecting blastn reports
              in cluster::shadow_cluster...
              ...finished clustering.
              polishig ESTs
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g1_i1.for.1209-2601.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.1209-2601.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.1209-2601.TRINITY_DN2776_c0_g1_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g1_i2.for.1417-2601.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.1417-2601.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.1417-2601.TRINITY_DN2776_c0_g1_i2.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g3_i1.for.1209-2201.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.1209-2201.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.1209-2201.TRINITY_DN2776_c0_g3_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g1_i3.for.1451-2601.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.1451-2601.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.1451-2601.TRINITY_DN2776_c0_g1_i3.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g2_i1.for.1962-3582.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.1962-3582.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.1962-3582.TRINITY_DN2776_c0_g2_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN4302_c0_g1_i1.for.3336-4684.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.3336-4684.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.3336-4684.TRINITY_DN4302_c0_g1_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN5954_c0_g1_i1.for.4011-5096.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.4011-5096.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.4011-5096.TRINITY_DN5954_c0_g1_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN3904_c0_g1_i1.for.4826-5803.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.4826-5803.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.4826-5803.TRINITY_DN3904_c0_g1_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g1_i1.for.12079-13471.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.12079-13471.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.12079-13471.TRINITY_DN2776_c0_g1_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g1_i2.for.12287-13471.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.12287-13471.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.12287-13471.TRINITY_DN2776_c0_g1_i2.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g3_i1.for.12079-13071.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.12079-13071.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.12079-13071.TRINITY_DN2776_c0_g3_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g1_i3.for.12321-13471.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.12321-13471.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.12321-13471.TRINITY_DN2776_c0_g1_i3.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN2776_c0_g2_i1.for.12832-14452.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.12832-14452.0.fasta -Q dna -T dna --model est2genome  --minintron 20 --maxintron 10000 --showcigar --percent 20 > /tmp/maker_gHkZjC/0/NC_003421%2E2.12832-14452.TRINITY_DN2776_c0_g2_i1.e.exonerate
              #-------------------------------#
              running  est2genome search.
              #--------- command -------------#
              Widget::exonerate::est2genome:
              /usr/local/bin/exonerate  -q /tmp/maker_gHkZjC/0/TRINITY_DN4302_c0_g1_i1.for.14206-15554.0.fasta -t /tmp/maker_gHkZjC/0/NC_003421%2E2.
              ..
              ters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:45 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:45 now processing 0
              total clusters:45 now processing 0
              total clusters:45 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              in cluster::shadow_cluster...
              ...finished clustering.
              annotating transcripts
              Making transcripts
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
               ...processing 0 of 1
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 2
               ...processing 1 of 2
              clustering transcripts into genes for annotations
              Processing transcripts into genes
              in cluster::shadow_cluster...
               sorting hits in shadow cluster...
               j_size:7   current j:0
               j_size:7   current j:1
               j_size:7   current j:2
               j_size:7   current j:3
               j_size:7   current j:4
               j_size:7   current j:5
               j_size:7   current j:6
              ...finished clustering.
              in cluster::shadow_cluster...
               sorting hits in shadow cluster...
               j_size:6   current j:0
               j_size:6   current j:1
               j_size:6   current j:2
               j_size:6   current j:3
               j_size:6   current j:4
               j_size:6   current j:5
              ...finished clustering.
              adding statistics to annotations
              Calculating annotation quality statistics
              choosing best annotation set
              Choosing best annotations
              processing chunk output
              preparing evidence clusters for annotations
              Preparing evidence for hint based annotation
              in cluster::shadow_cluster...
              ...finished clustering.
              cleaning clusters....
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
               ...processing 0 of 6
               ...processing 1 of 6
               ...processing 2 of 6
               ...processing 3 of 6
               ...processing 4 of 6
               ...processing 5 of 6
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 6
               ...processing 1 of 6
               ...processing 2 of 6
               ...processing 3 of 6
               ...processing 4 of 6
               ...processing 5 of 6
              total clusters:42 now processing 0
               ...processing 0 of 5
               ...processing 1 of 5
               ...processing 2 of 5
               ...processing 3 of 5
               ...processing 4 of 5
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
              total clusters:42 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:42 now processing 0
              in cluster::shadow_cluster...
              ...finished clustering.
              in cluster::shadow_cluster...
              ...finished clustering.
              cleaning clusters....
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:40 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:40 now processing 0
               ...processing 0 of 12
               ...processing 1 of 12
               ...processing 2 of 12
               ...processing 3 of 12
               ...processing 4 of 12
               ...processing 5 of 12
               ...processing 6 of 12
               ...processing 7 of 12
               ...processing 8 of 12
               ...processing 9 of 12
               ...processing 10 of 12
               ...processing 11 of 12
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 20
               ...processing 1 of 20
               ...processing 2 of 20
               ...processing 3 of 20
               ...processing 4 of 20
               ...processing 5 of 20
               ...processing 6 of 20
               ...processing 7 of 20
               ...processing 8 of 20
               ...processing 9 of 20
               ...processing 10 of 20
               ...processing 11 of 20
               ...processing 12 of 20
               ...processing 13 of 20
               ...processing 14 of 20
               ...processing 15 of 20
               ...processing 16 of 20
               ...processing 17 of 20
               ...processing 18 of 20
               ...processing 19 of 20
               ...trimming the rest
               ...processing 0 of 20
               ...processing 1 of 20
               ...processing 2 of 20
               ...processing 3 of 20
               ...processing 4 of 20
               ...processing 5 of 20
               ...processing 6 of 20
               ...processing 7 of 20
               ...processing 8 of 20
               ...processing 9 of 20
               ...processing 10 of 20
               ...processing 11 of 20
               ...processing 12 of 20
               ...processing 13 of 20
               ...processing 14 of 20
               ...processing 15 of 20
               ...processing 16 of 20
               ...processing 17 of 20
               ...processing 18 of 20
               ...processing 19 of 20
               ...trimming the rest
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:40 now processing 0
              total clusters:40 now processing 0
              total clusters:40 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:40 now processing 0
              in cluster::shadow_cluster...
              ...finished clustering.
              in cluster::shadow_cluster...
              ...finished clustering.
              in cluster::shadow_cluster...
              ...finished clustering.
              cleaning clusters....
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 6
               ...processing 1 of 6
               ...processing 2 of 6
               ...processing 3 of 6
               ...processing 4 of 6
               ...processing 5 of 6
              total clusters:84 now processing 0
               ...processing 0 of 5
               ...processing 1 of 5
               ...processing 2 of 5
               ...processing 3 of 5
               ...processing 4 of 5
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:84 now processing 0
               ...processing 0 of 12
               ...processing 1 of 12
               ...processing 2 of 12
               ...processing 3 of 12
               ...processing 4 of 12
               ...processing 5 of 12
               ...processing 6 of 12
               ...processing 7 of 12
               ...processing 8 of 12
               ...processing 9 of 12
               ...processing 10 of 12
               ...processing 11 of 12
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
              total clusters:84 now processing 0
              total clusters:84 now processing 0
              total clusters:84 now processing 0
               ...processing 0 of 2
               ...processing 1 of 2
              total clusters:84 now processing 0
              in cluster::shadow_cluster...
              ...finished clustering.
              annotating transcripts
              Making transcripts
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 1
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
               ...processing 0 of 1
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 3
               ...processing 1 of 3
               ...processing 2 of 3
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 20
               ...processing 1 of 20
               ...processing 2 of 20
               ...processing 3 of 20
               ...processing 4 of 20
               ...processing 5 of 20
               ...processing 6 of 20
               ...processing 7 of 20
               ...processing 8 of 20
               ...processing 9 of 20
               ...processing 10 of 20
               ...processing 11 of 20
               ...processing 12 of 20
               ...processing 13 of 20
               ...processing 14 of 20
               ...processing 15 of 20
               ...processing 16 of 20
               ...processing 17 of 20
               ...processing 18 of 20
               ...processing 19 of 20
               ...processing 0 of 1
               ...processing 0 of 1
               ...processing 0 of 1
              clustering transcripts into genes for annotations
              Processing transcripts into genes
              in cluster::shadow_cluster...
               sorting hits in shadow cluster...
               j_size:12   current j:0
               j_size:12   current j:1
               j_size:12   current j:2
               j_size:12   current j:3
               j_size:12   current j:4
               j_size:12   current j:5
               j_size:12   current j:6
               j_size:12   current j:7
               j_size:12   current j:8
               j_size:12   current j:9
               j_size:12   current j:10
               j_size:12   current j:11
              ...finished clustering.
              in cluster::shadow_cluster...
               sorting hits in shadow cluster...
               j_size:11   current j:0
               j_size:11   current j:1
               j_size:11   current j:2
               j_size:11   current j:3
               j_size:11   current j:4
               j_size:11   current j:5
               j_size:11   current j:6
               j_size:11   current j:7
               j_size:11   current j:8
               j_size:11   current j:9
               j_size:11   current j:10
              ...finished clustering.
               ...processing 0 of 2
               ...processing 1 of 2
               ...processing 0 of 2
               ...processing 1 of 2
              adding statistics to annotations
              Calculating annotation quality statistics
              choosing best annotation set
              Choosing best annotations
              processing chunk output
              processing contig output
              
              
              Maker is now finished!!!
              
              

            Standard Output:

            • Start_time: 1626195931
              End_time:   1626196253
              Elapsed:    322
              

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __job_resource {"__current_case__": 0, "__job_resource__select": "no"}
              __workflow_invocation_uuid__ "40b9cfdde3fc11ebb9eee7a391d4aa3b"
              abinitio_gene_prediction {"aug_prediction": {"__current_case__": 0, "augustus_mode": "no"}, "snaphmm": null, "unmask": "false"}
              advanced {"AED_threshold": "1.0", "alt_peptide": "C", "alt_splice": "false", "always_complete": "false", "correct_est_fusion": "false", "keep_preds": "0.0", "map_forward": "false", "max_dna_len": "100000", "min_contig": "1", "min_protein": "0", "other_gff": null, "pred_flank": "200", "pred_stats": "false", "single_exon": {"__current_case__": 0, "single_exon": "0"}, "split_hit": "10000"}
              chromInfo "/tmp/tmpntphx7id/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              est_evidences {"altest": null, "altest_gff": null, "est": {"values": [{"id": 1, "src": "hda"}]}, "est2genome": "true", "est_gff": null}
              gene_prediction {"model_gff": null, "pred_gff": null, "snoscan_rrna": null, "trna": "false"}
              organism_type "eukaryotic"
              protein_evidences {"protein": {"values": [{"id": 3, "src": "hda"}]}, "protein2genome": "true", "protein_gff": null}
              reannotation {"__current_case__": 0, "reannotate": "no"}
              repeat_masking {"repeat_source": {"__current_case__": 3, "source_type": "no"}}
      • Step 3: Genome sequence:

        • step_state: scheduled
      • Step 2: EST and/or cDNA:

        • step_state: scheduled
      • Step 1: Protein sequences:

        • step_state: scheduled
    • Other invocation details
      • invocation_id

        • 2891970512fa2d5a
      • history_id

        • 2891970512fa2d5a
      • workflow_id

        • 2891970512fa2d5a
      • invocation_state

        • scheduled
      • history_state

        • error
      • error_message

        • Failed to run workflow final history state is [error].

@mvdbeek
Copy link
Member

mvdbeek commented Jul 14, 2021

@gallardoalba
This is in step 7: toolshed.g2.bx.psu.edu/repos/bgruening/augustus_training/augustus_training/3.3.3:

Use of uninitialized value in print at /usr/local/bin/maker2zff line 171, <GFF> line 1800.
Sequence NC_003421.2 Schizosaccharomyces pombe 972h- chromosome III has no annotation but NC_003421.2 has. Assuming that space truncates name.
tar: invalid option -- 'z'
BusyBox v1.22.1 (2014-05-23 01:24:27 UTC) multi-call binary.

Usage: tar -[cxthvO] [-X FILE] [-T FILE] [-f TARFILE] [-C DIR] [FILE]...

Create, extract, or list files from a tar file

Operation:
	c	Create
	x	Extract
	t	List
	f	Name of TARFILE ('-' for stdin/out)
	C	Change to DIR before operation
	v	Verbose
	O	Extract to stdout
	h	Follow symlinks
	exclude	File to exclude
	X	File with names to exclude
	T	File with names to include

I would say augustus needs tar as a dependency, ideally declared in the Conda recipe, unless the tar call comes from within the Galaxy wrapper itself.
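If the tar call does come from the wrapper, one possible workaround (a minimal sketch only; the directory and archive names below are illustrative, not taken from the actual wrapper) is to avoid the -z shorthand that BusyBox tar lacks and pipe through gzip explicitly:

    # Create a gzipped archive without relying on `tar -z` (works with BusyBox tar):
    tar cf - augustus_model_dir | gzip > augustus_model.tar.gz
    # Unpack it the same way:
    gunzip -c augustus_model.tar.gz | tar xf -

Declaring GNU tar as a dependency in the Conda recipe, as suggested above, would of course make the plain `tar -czf`/`tar -xzf` calls work as well.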

@gallardoalba
Copy link
Contributor Author

Thanks!

@mvdbeek
Copy link
Member

mvdbeek commented Jul 15, 2021

I think you also need to update the version of the conda dependency in the tool wrapper, which still uses 3.3.3 -> https://github.com/galaxyproject/tools-iuc/blob/master/tools/augustus/macros.xml#L10
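A quick way to check which augustus builds Bioconda actually provides before bumping that requirement (just a read-only query from the command line, not part of the wrapper):

    # List augustus packages available in the bioconda channel:
    conda search -c bioconda augustus

The version pinned in macros.xml should then match whichever build the workflow is tested against.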

@@ -138,6 +138,7 @@ jobs:
galaxy-branch: ${{ env.GALAXY_BRANCH }}
chunk: ${{ matrix.chunk }}
chunk-count: ${{ needs.setup.outputs.chunk-count }}
planemo-version: 'https://github.com/galaxyproject/planemo/archive/refs/heads/master.zip'
Copy link
Collaborator


master is a moving target. Ideally this should be changed to pin a specific release before the PR is merged.
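A possible way to pick a tag to pin (hypothetical commands for locating a release; the actual version choice is up to the maintainers):

    # List planemo release tags on GitHub:
    git ls-remote --tags https://github.com/galaxyproject/planemo | grep -v '\^{}'
    # The pinned value can then point at a tagged archive instead of master, e.g.
    # planemo-version: 'https://github.com/galaxyproject/planemo/archive/refs/tags/<tag>.zip'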

@gallardoalba
Copy link
Contributor Author

I think you also need to update the version of the conda dependency in the tool wrapper, which still uses 3.3.3 -> https://github.com/galaxyproject/tools-iuc/blob/master/tools/augustus/macros.xml#L10

Thanks, I'm working on it.

@mvdbeek mvdbeek mentioned this pull request Jul 19, 2021
7 tasks
@mvdbeek
Copy link
Member

mvdbeek commented Jul 20, 2021

I think it ran to completion, but it seems that the test framework treats the output as a path rather than a URL; I will take a look at this.

@abretaud
Copy link
Collaborator

As alternatives, I'm working on funannotate and braker, and maybe finder at some point. But I don't think they can really be interchanged: for some organisms, and depending on the evidence data, one tool will be better than the others...

@mvdbeek
Copy link
Member

mvdbeek commented Sep 21, 2021

So what are we going to do now? Since maker is available in bioconda and homebrew-science, and there was some communication with the authors about this, I tend to think we could merge this and deprecate the workflow should there be any requests that concern licensing? What do you think about this @bwlang?

@bwlang
Copy link
Contributor

bwlang commented Sep 21, 2021

I guess we need to decide about 2 things:
1) Is maker really under an academic-only license? I think yes, based on their web page.
2) Is it OK to have an academic-only licensed tool in IWC workflows?

1 is easily dealt with I think: maybe someone could contact the author and check about the seeming inconsistency between the LICENSE.txt and the web page. Last time I said something about a license problem someone got pretty mad at me on twitter - I hope that doesn't happen again...

It's a bummer we didn't think about 2 before. I'm sorry to bring this up at the last minute, after people did work to build something cool and useful ;(

If we think it's OK, it might be a good idea to put up some kind of warning to avoid setting a trap for a commercial Galaxy user who might not think to check all the licenses of the tools used in a workflow during installation. For a second there I was like "oh cool - maker is open source now! Maybe I can use it to finally "finish" my deer genome project!" ... alas, no.
Maybe we could walk through all the tool licenses during workflow import?

Bioconda does allow academic-only tools - they just require that redistribution be allowed. They jump through some hoops for older GATK to make sure the user does the download on installation. I think that's a good approach - it's less likely that someone would put their organization at risk of lawsuits because they did not notice an unusual license.

If we think it's not ok, I hope we can meld the README and CONTRIBUTING.md (#61 ) to make a checklist to avoid disappointing contributors in the future.

@mvdbeek
Copy link
Member

mvdbeek commented Oct 7, 2021

@gallardoalba could you add a note both in the readme and the description of the workflow itself that maker has an academic-only license? I think that's the best we can do for now. Also ping @galaxyproject/iwc, are y'all OK with this? I suppose maker is in the IUC, so in a sense we've already implicitly agreed that academic-only is OK?

That said, we may also want to display the license of the underlying tool in the wrapper (the license tag is currently meant for the wrapper itself). If we do this, we can have something semi-automatic for warning non-academic users.

@bwlang
Copy link
Contributor

bwlang commented Oct 7, 2021

Hmm - I'm not sure there was much thought about this unusual license question when this went into the IUC. I see from the history that it's been there a while, but it was switched from repeatmasker's database to Dfam - probably for a similar reason.
@bgruening or @nsoranzo: I think you both worked on maker's IUC wrapper in addition to @abretaud.
How did you resolve the license question - or did it just sneak by unnoticed?

@abretaud
Copy link
Collaborator

abretaud commented Oct 7, 2021

There's surely a contradiction between the GPL license and the "non-commercial" clause... I'd say as long as it's GPL, the rest makes no sense.
But other people have discussed that (found via Google, I think):
https://opensource.stackexchange.com/questions/7378/is-gpl-for-research-purposes-only-self-contradictory

@bwlang
Copy link
Contributor

bwlang commented Oct 7, 2021

Yeah - that does seem contradictory to me - but I'm no expert.
I guess I prefer to err on the side of author intent. That's more important to me than legality.
If they don't want me to use it, out of respect, I'll avoid it.
I can't predict how maker authors would feel about galaxy wrappers masking their intent. Maybe it's best to ask them?

@nsoranzo
Copy link
Member

nsoranzo commented Oct 7, 2021

@bwlang Thanks for bringing this up!

IANAL, but in my opinion:

  • The license is weird but its intent is clear, and it appears not only on the website but also in the LICENSE file distributed in the maker source package
  • IUC has a policy for OSI-licensed tools only, so I think maker should be removed from the tools-iuc repository and deprecated on the ToolShed unless we receive an exception in writing from the license holder(s)
  • I agree IWC should have a policy on this, xref. Add a contributing document like IUC #60

@nsoranzo
Copy link
Member

nsoranzo commented Oct 7, 2021

Ah, I forgot:

@abretaud
Copy link
Collaborator

abretaud commented Oct 7, 2021

Yeah, I agree the intent is also clearly "not for commercial use".
Maybe we could add a checkbox in the tool form to confirm it's not for commercial use?
I'm sending an email to the author to get his feeling, just in case...

@gallardoalba
Copy link
Contributor Author

gallardoalba commented Oct 7, 2021

@gallardoalba could you add a note both in the readme and the description of the workflow itself that maker has an academic only license ? I think that's the best we can do for now. Also ping @galaxyproject/iwc, are y'all OK with this ? I suppose maker is in the IUC so in a sense we've already implicitly agreed that academic only is OK ?

That said, we may want to also display licenses of the underlying tool in the wrapper (the license tag is now meant for the wrapper itself). If we do this we can have something semi-automatic for warning non-academic users.

I included some information in the README.md, but I'm not sure how to include it in the workflow itself, @mvdbeek

@carsonhh
Copy link

carsonhh commented Nov 5, 2021

As requested. This is Mark Yandell's official position on using MAKER for bioconda and galaxy -->

We give permission for Bioconda and Galaxy Project to distribute MAKER2/3 through their tool/package management system. MAKER2/3 is free for academic use, but commercial Bioconda and Galaxy users of MAKER2/3 still need a license, which can be obtained here: http://weatherby.genetics.utah.edu/cgi-bin/registration/maker_license.cgi. Bioconda and Galaxy project are not responsible for users who have not properly licensed MAKER.

bernt-matthias pushed a commit to galaxyproject/tools-iuc that referenced this pull request Feb 8, 2022
* Maker: add an option to warn users about license (as asked by authors in galaxyproject/iwc#47 (comment))

* lint

* simpler check
@gallardoalba
Copy link
Contributor Author

Has the problem with the license been solved, @abretaud?

@abretaud
Copy link
Collaborator

@gallardoalba I think so: we now have a clear statement from the authors, and a checkbox in the maker tool to force the user to acknowledge it. If everyone's OK with that, I think we can move on. We only need to update the tools in the workflow before merging.


One possible approach to assessing whether changes to the workflow actually improve it is to use the [ParseVal](https://usegalaxy.eu/root?tool_id=toolshed.g2.bx.psu.edu/repos/iuc/aegean_parseval/aegean_parseval/0.16.0) tool to compare the obtained result with a reference annotation (see the example commands below).

If you only want to know is an annotation looks reasonable based on the current test data, you can just count the genes in the output GFF, and/or compare the total length of genes.
Copy link
Collaborator


Suggested change
If you only want to know is an annotation looks reasonable based on the current test data, you can just count the genes in the output GFF, and/or compare the total length of genes.
If you only want to know if an annotation looks reasonable based on the current test data, you can just count the genes in the output GFF, and/or compare the total length of genes.
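For the quick sanity checks mentioned above, a minimal command-line sketch (file names are illustrative; it assumes a MAKER GFF3 output called annotation.gff3, a reference annotation called reference.gff3, and the AEGeAn parseval binary taking the reference and prediction GFF3 files as positional arguments):

    # Number of gene features in the MAKER output (GFF3 column 3 is the feature type):
    awk -F'\t' '$3 == "gene"' annotation.gff3 | wc -l
    # Total length covered by gene features (columns 4/5 are 1-based, inclusive):
    awk -F'\t' '$3 == "gene" {sum += $5 - $4 + 1} END {print sum}' annotation.gff3
    # Fuller comparison against a reference annotation with AEGeAn ParsEval
    # (the command-line counterpart of the aegean_parseval Galaxy tool):
    parseval reference.gff3 annotation.gff3

Comparing these numbers before and after a change to the workflow gives a rough signal of whether the annotation got better or worse.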


@simonbray
Copy link
Member

Maybe it would be a good idea to expose the new checkbox as a workflow parameter as well?

@bwlang
Copy link
Contributor

bwlang commented Apr 11, 2022

I’m not in favor… I think this is a trap just waiting to be sprung on an unsuspecting junior employee, putting their employer (and career) in jeopardy. It sucks that scientists have to be thinking about such things, but here we are. I can see that the checkbox is present now, and Galaxy could show logs in a trial saying that summer intern X checked the box and that Galaxy and Bioconda did not contribute to that behavior. I don’t know if that would hold up. In any case, this still seems like a bad situation to be in for Galaxy folks, for the intern, for the company; for everybody but the university’s lawyers, I guess. Just my opinion… take it for what it’s worth.

@mvdbeek
Copy link
Member

mvdbeek commented Apr 12, 2022

I do agree with @bwlang; I think it was a mistake to add maker to the IUC in the first place. IANAL, but the statement in #47 (comment) does not seem bulletproof. Maybe it would shield us from legal action, but who knows which third party could try to cash in without the original author's permission or intent.

@gallardoalba
Copy link
Contributor Author

So I'll close it. I'll work on the Funannotate workflow if you agree, @abretaud. Thanks all!

@abretaud
Copy link
Collaborator

Ok, no problem to close it. Licenses are really a problem for genome annotation :(
Is there a plan to allow having IWC-like repos using the same model? Just like we can push tools to alternate repos when they are not suitable for the IUC.
@gallardoalba thanks for your effort on this workflow! Sure, we can work on the funannotate workflow; it should be less problematic. I'll send you some stuff on Matrix.

@gallardoalba gallardoalba deleted the add_maker_annotation branch April 13, 2022 08:47