Beams #311
-
The JSON version of MNX is specifically designed not to take ease of writing by hand into consideration. MNX will eventually have beam angles and other graphical parts of the notation represented, but first it needs an object to connect that information to, which is why getting the beam information settled early in the development process is so important. Very few notation formats do not encode beam information (only primarily audio-related ones, such as MIDI, or the simplest of micro-formats like music21's TinyNotation) -- I don't think MNX would stand a chance as a format if beams were not first-class objects.
-
When software supports fine-tuning the layout (adjusting beam slope, note spacing, etc.), all of that must be stored somewhere, and somehow, if you want it to be preserved across sessions. If you rely purely on algorithmic layout, users will see different layouts when opening the file in different applications, or even in the same application if the spacing has been modified from the default behaviour.
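To make that concrete, if MNX did eventually store such adjustments, a beam entry might hypothetically carry something like the following (the "slope" and "yOffset" properties are invented here for illustration and are not part of the current spec):

{
  "beams": [
    {
      "events": ["ev1", "ev2"],
      "slope": -0.5,
      "yOffset": 1.25
    }
  ]
}

Without a place like this to put the numbers, any manual adjustment is lost as soon as the file leaves the application that made it.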
-
Please redirect me if I'm wrong for contributing to this thread about beams, as it is not directly related to the display of them, but rather to how the "first-class" representation works. As I read the spec, to know whether notes are connected via a beam, one would have the following JSON:

{
  "beams": [
    {
      "events": ["ev1", "ev2"]
    }
  ]
}

I find this to be a cumbersome approach as an app developer because I need to:

1. Track ids for each note.
2. For any given note, I don't know whether it is beamed without jumping back out of the current note and traversing back through the current measure's beams. This is further complicated in that, to quote the spec, "The beam's 'events' list is free to reference IDs of events in a subsequent measure.", so one would need to traverse not only the current measure but any prior measures as well.

On a logical level, a beam is a property of a note (as I believe MusicXML properly models), so not knowing at the note level that a note is beamed is at minimum going to lead to confusion and, worse, cause performance issues from having to traverse back through the data to determine whether the current note has a beam or not.

This is also inconsistent with how ties are modeled in MNX, where they are properties of a note, as I would expect. I think ties and beams can be thought of in the same way, abstractly as "note connectors".

I also want to acknowledge that there is natural tension between "representing a note's properties" and "rendering a note", and the MNX spec specifically calls out: "It's semantically rich, meaning: it's biased toward encoding concepts rather than presentation when prudent." I believe the way beams are currently represented incorrectly favors presentation over encoding concepts, which is counter to a goal of the spec.

I know there is the idea of having a beam as a "first-class object", and I think this can still be achieved for the metadata that is needed for a beam, but I would propose the opposite relationship for these circumstances: a note holds the reference to the beam id, rather than a beam holding references to note ids.
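For illustration only (this shape is not in the spec; the "beam" property and the flattened "events" list are invented here, and the surrounding measure/sequence structure is omitted), the inverted relationship might look something like:

{
  "beams": [
    { "id": "beam1" }
  ],
  "events": [
    { "id": "ev1", "beam": "beam1" },
    { "id": "ev2", "beam": "beam1" }
  ]
}

With a shape along these lines, an application reading an event would know immediately whether it participates in a beam, without scanning the beams of the current or earlier measures.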
-
I put ids in a hashmap so that they and the event they refer to are associated, and in my software ids have global scope, so they can stay in the hashmap for the duration of the file (instead of being limited to nearby measures). Id names are like Lisp gensyms. I don't think mnx (or any interchange format) is expected to perfectly replicate the internal data structure of the music.

In my software, beams (or anything that hangs off an event) have a list of their clients (as in the example in your message), and events in turn have a list of the things that connect to them (beams, ties, etc.). This makes a cyclic object graph that is easy to render to and from JSON. The recursion knows to stop when an id is found in the hashmap, so there is no need for a mark bit. I'm guessing that other notation software does something similar.
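As a rough sketch of what I mean (the property names here are invented for this example and come from neither MNX nor my software), the flattened graph might serialize with both sides of the relationship expressed as ids, which is what keeps the cycle out of the JSON:

{
  "events": {
    "ev1": { "connectors": ["beam1"] },
    "ev2": { "connectors": ["beam1"] }
  },
  "beams": {
    "beam1": { "clients": ["ev1", "ev2"] }
  }
}

When reading this back in, each id is looked up in the hashmap first; if it is already present, the existing object is reused rather than recursed into, which is why no mark bit is needed.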
-
@williamclocksin Thanks for your response. I understand that a hashmap can be used and that there is no perfect way to model things, but I still feel that, the way it is currently defined, it goes against the intentions of the spec itself. The fact that I would need to track ids via hashmaps, and know which hashmap(s) to look in to gather metadata about a specific note, leads me to believe that it is not
-
Being new here, I don't want to make a nuisance of myself, but I wonder if there is a different way to think about beams. I like using the move to json as an opportunity to rethink some of the decisions that were made in the xml world. Currently, in the examples, mnx builds beams using inner and hook direction, and yet much about beam graphics (such as angle) is left to the formatting program. Instead, why not leave all the beam building out of mnx entirely? All mnx needs to know is the list of events beamed and their time values (already known). There is an algorithm that provides the correct beam graphics for any sequence of beamed values, and boolean and numeric parameters can specify variants (such as fan beaming). There may be some idiosyncratic beam designs out there that the general algorithm doesn't cater for, but mnx currently won't handle those either using just inner and hook direction.
In general, this is about the tricky grey area of whether mnx specifies graphic appearance or not. I work on the assumption that specifying 'some' appearance is worse than specifying none, but of course others may have a different aesthetic. I can also understand the use case in which mnx is written by hand to specify a reasonably legible musical example, but I wonder whether that use case will really be needed later on.
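To sketch what I have in mind (the "fan" property is made up for illustration and is not a proposed spec name), a beam entry could carry nothing but its events plus optional variant parameters, leaving all geometry to the rendering algorithm:

{
  "beams": [
    {
      "events": ["ev1", "ev2", "ev3"],
      "fan": "accel"
    }
  ]
}

Everything else (angle, hook direction, secondary beam grouping) would be derived by the renderer from the events' time values.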