LLMs such as GPT-4 already know natural languages. In the future they should also know music theory: perform Roman numeral analysis and analysis of form, convert between ABC notation, kern, and MusicXML, and compose coherent long pieces in a prompted style.
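A rough baseline for the symbolic part of this wish list, sketched with music21 (my own choice of tool here, not something from the links below): parse ABC, re-serialize it as MusicXML/MIDI, and run Roman numeral analysis on a Bach chorale. The file names and the ABC fragment are made up for illustration.

```python
from music21 import converter, corpus, roman

# 1. Format conversion: parse a small ABC fragment (made-up example) and
#    re-serialize it as MusicXML and MIDI. music21 reads **kern files the
#    same way via converter.parse.
abc_fragment = """X:1
T:Example
M:4/4
K:C
C2 E2 G2 c2 | B2 G2 E2 C2 |]
"""
melody = converter.parse(abc_fragment, format='abc')
melody.write('musicxml', fp='example.musicxml')
melody.write('midi', fp='example.mid')

# 2. Roman numeral analysis: estimate the key of a Bach chorale from the
#    music21 corpus, chordify it, and label every chord as a Roman numeral.
chorale = corpus.parse('bwv66.6')
key = chorale.analyze('key')
for ch in chorale.chordify().recurse().getElementsByClass('Chord'):
    rn = roman.romanNumeralFromChord(ch, key)
    print(f"m.{ch.measureNumber}\t{ch.pitchedCommonName}\t{rn.figure}")
```

An LLM that "knows music theory" should be able to do all of this directly from the token stream, without hand-written converters.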
Articles and repos that excite me:
- https://arxiv.org/abs/2205.03983
- https://chemrxiv.org/engage/api-gateway/chemrxiv/assets/orp/resource/item/62c5c622244ce03b8e3c4f21/original/do-large-language-models-know-chemistry.pdf
- https://evanthebouncy.github.io/program-synthesis-minimal/generation-with-llm/
- https://github.com/zharry29/drums-with-llm
- Chris Donahue's arXiv papers: https://arxiv.org/search/cs?searchtype=author&query=Donahue%2C+C
- https://www.semanticscholar.org/paper/LakhNES%3A-Improving-Multi-instrumental-Music-with-Donahue-Mao/24a70db0bbb5f486126477e32a6a44ab917a4b11#citing-papers
- MidiBERT, MusicBERT, Museformer
- https://github.com/tripathiarpan20/midiformers
- https://github.com/slSeanWU/jazz_transformer
- https://github.com/asigalov61/Los-Angeles-Music-Composer
- https://github.com/gudgud96/magenta-in-pytorch, and the author's blog reviews of ISMIR work on DDSP
- PianoTree VAE
- https://epub.jku.at/obvulihs/download/pdf/8503579?originalFilename=true
- https://liuhaumin.github.io/LeadsheetArrangement/