DeepConsensus v1.2 introduces a new model that improves runtime by approximately 12% via changes to the neural network architecture (replacement of compute-intensive normalization layers).
The new model takes the base quality scores from CCS as an additional input, improving yield at empirical Q30 and Q40 relative to CCS.
Updated training to include data from maize (Z. mays B73) in addition to the data from CHM13. Including maize data slightly improves DeepConsensus accuracy on human data and gives a small improvement on maize (Q30 yield improvement rises from 189% to 193%).
Raised the cap on base quality scores to match PBCCS, increasing the dynamic range that DeepConsensus can use to express base confidence.
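To illustrate why the cap matters, here is a minimal sketch (not DeepConsensus code) of how a Phred quality cap limits the confidence a model can express; the cap value of 93 is an assumption for illustration (the highest quality encodable in FASTQ's printable ASCII range), not a value taken from these notes.

```python
import math

# Hypothetical cap for illustration; not a value from the release notes.
QUAL_CAP = 93

def phred_quality(error_prob: float, cap: int = QUAL_CAP) -> int:
    """Convert a predicted error probability to a capped Phred score."""
    if error_prob <= 0:
        return cap  # perfect confidence saturates at the cap
    return min(cap, round(-10 * math.log10(error_prob)))

# With a low cap of 40, error probabilities of 1e-5 and 1e-8 both
# collapse to Q40; a higher cap keeps them distinguishable, which is
# the "dynamic range" the note refers to.
```

With a cap of 40, `phred_quality(1e-5, cap=40)` and `phred_quality(1e-8, cap=40)` both return 40, while a cap of 93 preserves the distinction (50 vs. 80).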
Added a docs page on model calibration; see it to better understand predicted versus actual confidence for reads and bases.
Thanks to Daniel Liu (@Daniel-Liu-c0deb0t) for his work on replacement of normalization layers, which resulted in significant model speed improvements.