OpenMed developed a pipeline that transforms protein concepts into codon-optimized DNA sequences. They compared several transformer architectures for codon-level language modeling and found that CodonRoBERTa-large-v2 outperformed the alternatives on measures of biological relevance. The project includes detailed results and runnable code for each stage.
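To make the codon-optimization idea concrete, here is a minimal sketch of the simplest possible approach: reverse-translating a protein by picking the most frequent codon for each amino acid from a usage table. This is not OpenMed's pipeline (their system is learned, not table-based), and the small `PREFERRED_CODONS` table below is an illustrative assumption covering only a few residues.

```python
# Naive codon optimization: reverse-translate a protein into DNA by
# choosing one preferred codon per amino acid. Illustrative only --
# this tiny table is a hypothetical stand-in for a real usage table.
PREFERRED_CODONS = {
    "M": "ATG",  # methionine (start codon)
    "A": "GCC",  # alanine
    "K": "AAA",  # lysine
    "L": "CTG",  # leucine
    "*": "TAA",  # stop
}

def codon_optimize(protein: str) -> str:
    """Map each amino-acid letter to its preferred codon and concatenate."""
    return "".join(PREFERRED_CODONS[aa] for aa in protein)

print(codon_optimize("MAKL*"))  # ATGGCCAAACTGTAA
```

A learned model like the one described above replaces the fixed lookup with context-dependent codon choices, which is where the codon-level language modeling comes in.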
This article presents a method for long-context language modeling using a standard Transformer architecture that continues to learn at test time. Training relies on next-token prediction, with meta-learning used to obtain a better initialization. The codebase, implemented in JAX, provides tools for running experiments and downloading datasets.
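The core mechanism, learning continuously at test time via next-token prediction, can be sketched in a few lines. The toy below is not the paper's Transformer: it is a bigram logit table updated by online SGD on the next-token cross-entropy loss as tokens arrive, with a zero-initialized table standing in for the meta-learned initialization; the vocabulary size and learning rate are assumptions.

```python
import numpy as np

V = 4                   # vocabulary size (assumed for the toy)
W = np.zeros((V, V))    # bigram logits; stand-in for a meta-learned init

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def ttt_step(W, prev_tok, next_tok, lr=0.5):
    """One online SGD step on next-token cross-entropy at test time."""
    p = softmax(W[prev_tok])
    grad = p.copy()
    grad[next_tok] -= 1.0       # d(cross-entropy)/d(logits)
    W[prev_tok] -= lr * grad
    return W

# Adapt to an incoming token stream, one (prev, next) pair at a time.
stream = [0, 1, 0, 1, 0, 1, 0, 1]
for prev, nxt in zip(stream, stream[1:]):
    W = ttt_step(W, prev, nxt)

# After adapting, the model predicts 1 as the likeliest token after 0.
print(int(softmax(W[0]).argmax()))  # 1
```

The point of the sketch is the update-while-predicting loop: the model's parameters change during inference in response to the context, which is what lets a fixed-size state track an arbitrarily long input.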
AlphaGeometry2 is an enhanced version of the original AlphaGeometry that surpasses an average gold medalist at solving geometry Olympiad problems. Its improvements include higher coverage of International Math Olympiad problems and a solving rate raised to 84%, achieved through advanced language modeling and a novel knowledge-sharing mechanism. The system aims to fully automate solving geometry problems stated in natural language.