Rosegarden is a music editor and MIDI/audio sequencer.

Rosegarden global design

Rosegarden is split into three main parts: the base library, the GUI, and the Sequencer.
The base library holds all of the fundamental "music handling" structures, of which the primary ones are Event, Segment, Track, Instrument and Composition. It also contains a selection of utility and helper classes of a kind that is not specific to any particular GUI.
This design came about at a time when Rosegarden had been through several toolkit experiments and did not want to chain itself to any one GUI toolkit. We wanted to be able to take the core of Rosegarden and build a new application around it simply and easily, so the base library is built around the STL, and Qt and KDE classes were not allowed in it. In practice our fate is now closely tied to Qt's, and we have been allowing Qt classes into the base library wherever that represents the most pragmatic and expedient solution to a problem.
The keyword for the basic structures in use is "flexibility". Our Event objects can be extended arbitrarily for the convenience of GUI or performance code without having to change their declaration or modify anything in the base library. And most of our assumptions about the use of the container classes can be violated without disastrous side-effects.
- Event is the basic musical element. It's more or less a generalization of the MIDI event. Each note or rest, each key change or tempo change, is an event: there's no "note" class or "rest" class as such; notes and rests are simply represented by events whose type happens to be "note" or "rest". Each Event has a type code, an absolute time (the moment at which the Event starts, relative only to the start of the Composition) and a duration (usually non-zero only for notes and rests), together with an arbitrary set of named and typed properties that can be assigned and queried dynamically by other parts of the application. So, for example, a note event is likely to have an integer property called "pitch", and probably a "velocity", as well as potentially many others -- but this is not fixed anywhere, and there's no definition of what exactly a note is: client code is simply expected to ignore any unrecognised events or properties, and to cope if properties that should be there are not.
- Segment is a series of consecutive Events found on the same Track, automatically ordered by their absolute time. It's the usual container for Events. A Segment has a starting time that can be changed, and a duration that is based solely on the end time of the last Event it contains. Note that in order to facilitate musical notation editing, we explicitly store silences as series of rest Events; thus a Segment really should contain no gaps between its Events. (This isn't checked anywhere and nothing will break very badly if there are gaps, but notation won't quite work correctly.)
- Track is much the same thing as on a mixing desk, usually assigned to an instrument, a voice, etc. Although a Track is not a container of Events, and not strictly a container of Segments either, it is referred to by a set of Segments that are therefore mutually associated with the same instrument and parameters. In GUI terms, a Track is a horizontal row on the main Rosegarden window, whereas a Segment is a single blue box within that row, of which there may be any number.
- Instrument corresponds broadly to a MIDI or Audio channel, and is the destination for a performed Event. Each Track is mapped to a single Instrument (although many Tracks may have the same Instrument), and the Instrument is indicated in the header at the left of the Track's row in the GUI.
- Composition is the container for the entire piece of music. It consists of a set of Segments, together with a set of Tracks that the Segments may or may not be associated with, a set of Instruments, and some information about time signature and tempo changes. (The latter are not stored in Segments; they are only stored in the top-level Composition. You can't have differing time signatures or tempos in different Segments.) Any code that wants to know about the locations of bar lines, or request real-time calculations based on tempo changes, talks to the Composition.
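To make the shape of these structures concrete, here is a minimal sketch of the two central ideas -- an event carrying dynamically attached, typed properties, and a container that keeps events ordered by absolute time. All names here are hypothetical illustrations, not the real Rosegarden API:

```cpp
#include <map>
#include <set>
#include <string>
#include <variant>

// Hypothetical sketch: a type code, timing, and an open-ended bag of
// named, typed properties.
class SketchEvent {
public:
    using PropertyValue = std::variant<long, std::string>;

    SketchEvent(std::string type, long absoluteTime, long duration = 0)
        : m_type(std::move(type)), m_absoluteTime(absoluteTime),
          m_duration(duration) {}

    const std::string &getType() const { return m_type; }
    long getAbsoluteTime() const { return m_absoluteTime; }
    long getDuration() const { return m_duration; }

    // Properties are attached and queried dynamically; the class itself
    // never needs to know what a "note" is.
    void set(const std::string &name, PropertyValue value) {
        m_properties[name] = std::move(value);
    }
    bool has(const std::string &name) const {
        return m_properties.count(name) != 0;
    }
    template <typename T> T get(const std::string &name) const {
        return std::get<T>(m_properties.at(name)); // throws if absent or mistyped
    }

private:
    std::string m_type;
    long m_absoluteTime;
    long m_duration;
    std::map<std::string, PropertyValue> m_properties;
};

// A Segment-like container: events stay sorted by absolute time on insert,
// and the end time is derived from the last event, as described above.
class SketchSegment {
public:
    void insert(const SketchEvent &e) { m_events.insert(e); }
    long getEndTime() const {
        if (m_events.empty()) return 0;
        const SketchEvent &last = *m_events.rbegin();
        return last.getAbsoluteTime() + last.getDuration();
    }
private:
    struct ByTime {
        bool operator()(const SketchEvent &a, const SketchEvent &b) const {
            return a.getAbsoluteTime() < b.getAbsoluteTime();
        }
    };
    std::multiset<SketchEvent, ByTime> m_events; // multiset: chords share a time
};
```

Client code following the conventions above checks has() before get() and simply ignores events whose type or properties it doesn't recognise.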
See also http://rosegardenmusic.com/wiki/dev:units.txt for an explanation of the units we use for time and pitch values. See http://rosegardenmusic.com/wiki/dev:creating_events.txt for an explanation of how to create new Events and add properties to them.
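As a rough illustration of the tempo side of this (hypothetical code, assuming 960 MIDI ticks per quarter note and tempo in quarter notes per minute; the units document above gives the real conventions), elapsed real time can be computed by walking the ordered list of tempo changes:

```cpp
#include <map>

constexpr long TicksPerQuarter = 960; // assumed resolution for this sketch

// Hypothetical sketch: real time elapsed at a given musical time, from a
// map of tick position -> tempo (quarter notes per minute) taking effect
// there. Each tempo region contributes (ticks / resolution) * (60 / qpm).
double elapsedSeconds(const std::map<long, double> &tempoChanges, long tick)
{
    double seconds = 0.0;
    long prevTick = 0;
    double qpm = 120.0; // default tempo before the first change
    for (const auto &change : tempoChanges) {
        if (change.first >= tick) break;
        seconds += double(change.first - prevTick) / TicksPerQuarter * 60.0 / qpm;
        prevTick = change.first;
        qpm = change.second;
    }
    return seconds + double(tick - prevTick) / TicksPerQuarter * 60.0 / qpm;
}
```

This is why any code needing real-time positions talks to the Composition: only the top level holds the tempo map.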
The base directory also contains various music-related helper classes:
- The NotationTypes.[c|h] files contain classes that help with creating and manipulating events. It's very important to realise that these classes are not the events themselves: although there is a Note class in these files, and a TimeSignature class, and Clef and Key classes, instances of these are rarely stored anywhere. Instead they're created on-the-fly to do calculations related to note durations, time signatures and the like, and they contain getAsEvent() methods that may be used when an event for storage is required. But the class of a stored event is always simply Event.
The NotationTypes classes also define important constants for the names of common properties in Events. For example, the Note class contains Note::EventType, which is the type of a note Event, and Note::EventRestType, the type of a rest Event; and Key contains Key::EventType, the type of a key change Event, KeyPropertyName, the name of the property that defines the key change, and a set of the valid strings for key changes.
- BaseProperties.[c|h] contains a set of "standard"-ish Event property names that are not basic enough to go in NotationTypes.
- SegmentNotationHelper and SegmentPerformanceHelper do tasks that may be useful to notation-type code and performer code respectively. For example, SegmentNotationHelper is used to manage rests when inserting and deleting notes in a score editor, and to create beamed groups and suchlike; SegmentPerformanceHelper generally does calculations involving real performance time of notes (taking into account tied notes, tuplets and tempo changes). These two lightweight helper classes are also usually constructed on-the-fly for use on the events in a given Segment and then discarded after use.
- Quantizer is used to quantize event timings and set quantized timing properties on those events. Note that quantization is non-destructive, as it takes advantage of the ability to set new Event properties to simply assign the quantized values as separate properties from the original absolute time and duration.
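The helper and quantizer patterns above can be sketched together. Everything below is hypothetical illustration, not the real API: a throwaway Note-style object that knows duration arithmetic and emits a plain event via getAsEvent(), a throwaway performance helper that follows forward ties, and a non-destructive quantize step that stores its result as an extra property:

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical stand-in for a stored Event.
struct PlainEvent {
    std::string type;
    long absoluteTime;
    long duration;
    std::map<std::string, long> properties; // dynamic, as with the real Event
};

// NotationTypes-style throwaway helper: duration arithmetic plus getAsEvent().
class SketchNote {
public:
    // noteType: 0 = whole, 1 = half, 2 = quarter, ...; dots lengthen the note
    SketchNote(int noteType, int dots = 0) : m_noteType(noteType), m_dots(dots) {}
    long getDuration() const {
        long base = (960 * 4) >> m_noteType; // 960 ticks per quarter note
        long d = base;
        for (int i = 1; i <= m_dots; ++i) d += base >> i; // each dot adds half again
        return d;
    }
    PlainEvent getAsEvent(long absoluteTime, long pitch) const {
        return {"note", absoluteTime, getDuration(), {{"pitch", pitch}}};
    }
private:
    int m_noteType, m_dots;
};

// In the spirit of SegmentPerformanceHelper: constructed over a run of
// events for one calculation, then discarded. Follows forward ties only;
// the real helper also handles tuplets and tempo.
class SketchPerformanceHelper {
public:
    explicit SketchPerformanceHelper(const std::vector<PlainEvent> &events)
        : m_events(events) {}
    long getSoundingDuration(size_t i) const {
        long d = m_events[i].duration;
        while (m_events[i].properties.count("tiedForward") && i + 1 < m_events.size())
            d += m_events[++i].duration;
        return d;
    }
private:
    const std::vector<PlainEvent> &m_events;
};

// Non-destructive quantization in the spirit of Quantizer: the rounded time
// goes into an extra property; the original absoluteTime is untouched.
void quantize(PlainEvent &e, long unit)
{
    e.properties["QuantizedAbsoluteTime"] = ((e.absoluteTime + unit / 2) / unit) * unit;
}
```

The last function shows why quantization can always be undone: both timings coexist on the event, and display code simply prefers the quantized property when present.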
The GUI directory builds into a Qt application that follows a document/view model. The document (class RosegardenDocument, which wraps a Composition (along with several other related classes)) can have several views (class RosegardenMainViewWidget), although at the moment only a single one is used.
This view is the TrackEditor, which shows all the Composition's Segments organized in Tracks. Each Segment can be edited in several ways, as notation, on a piano roll matrix, or via the raw event list.
All editor views are derived from EditViewBase. EditViewBase is the class that deals with the actual editing of the events. It uses several components:
- Layout classes, horizontal and vertical: these are the classes which determine the x and y coordinates of the graphic items representing the events (notes or piano-roll rectangles). They are derived from the LayoutEngine base class in the base library.
- Tools, which implement each editing function in the GUI (such as insert, erase, cut and paste). These are the tools which appear on the EditView's toolbar.
- Toolbox, which is a simple string => tool map.
- Commands, which are the fundamental implementations of editing operations (both menu functions and tool operations). Originally a KDE subclass, these are our own implementation now, likely borrowed from Sonic Visualiser.
- a QGraphicsScene and QGraphicsView; these no longer come from a base class shared between the views, as far as I can tell.
- LinedStaff, a staff with lines. Like the canvas view, this isn't part of the EditView definition, but both views use one. (Probably different implementations now, and no longer shared. Author not sure.)
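The layout idea can be sketched in a few lines (hypothetical names and numbers, far simpler than the real NotationHLayout and friends): a horizontal layout maps an event's absolute time to an x coordinate, and a matrix-style vertical layout maps pitch to a row's y coordinate:

```cpp
// Hypothetical sketch of the layout-engine idea: pure time/pitch ->
// coordinate mappings, kept separate from the widgets that draw them.
struct SketchHLayout {
    double pixelsPerTick; // zoom factor
    double getXForTime(long time) const { return time * pixelsPerTick; }
};

struct SketchMatrixVLayout {
    int rowHeight; // height of one piano-roll row in pixels
    int topPitch;  // pitch drawn at y = 0; higher pitches sit higher up
    int getYForPitch(int pitch) const { return (topPitch - pitch) * rowHeight; }
};
```

Notation layout is much harder than this (spacing depends on note shapes, accidentals, page layout), which is why it gets dedicated classes per view.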
There are currently two editor views:
- NotationView, with accompanying classes NotationHLayout, NotationVLayout, NotationStaff, and all the classes in the notationtool and notationcommands files. These are also closely associated with the NotePixmapFactory and NoteFont classes, which are used to generate notes from component pixmap files.
- MatrixView, with accompanying classes MatrixHLayout, MatrixVLayout, and other classes in the matrixview files.
The editing process works as follows:
[NOTE: in the following, we're talking both about events in the UI sense (mouse button clicks, mouse moves, keystrokes, etc.) and about Events, our basic music element. To avoid the ambiguity, "events" refers to UI events and "Events" to Event objects.]
- The canvas view gets the user events (see NotationCanvasView::contentsMousePressEvent(QMouseEvent*) for an example). It locates where the event occurred in terms of musical elements: which note or staff line the user clicked on, which pitch and time this corresponds to, that kind of stuff. (In the Notation and Matrix views, the LinedStaff calculates mappings between coordinates and staff lines: the former is especially complicated because of its support for page layout.)
- The canvas view transmits this kind of info as a signal, which is connected to a slot in the parent EditView.
- The EditView delegates action to the current tool.
- The tool performs the actual job (inserting or deleting a note, etc...).
Since this action is usually complex (merely inserting a note requires dealing with the surrounding Events, rests or notes), it does it through a SegmentHelper (for instance, base/SegmentNotationHelper) which "wraps" the complexity into simple calls and performs all the hidden tasks.
The EditView also maintains (obviously) its visual appearance with the layout classes, applying them when appropriate.
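The chain of responsibility above can be sketched as follows, using a plain std::function callback as a stand-in for the Qt signal/slot connection (all names hypothetical):

```cpp
#include <functional>
#include <vector>

// What the canvas view reports after translating a raw mouse event into
// musical terms.
struct ClickInfo { long time; int pitch; };

// A tool performs the actual job; here it just records an "insertion".
struct SketchTool {
    std::vector<ClickInfo> inserted;
    void handle(const ClickInfo &info) { inserted.push_back(info); }
};

// The edit view delegates to whichever tool is current.
struct SketchEditView {
    SketchTool *currentTool = nullptr;
    void onCanvasClick(const ClickInfo &info) {
        if (currentTool) currentTool->handle(info);
    }
};

// The canvas view turns a raw press into musical coordinates and emits it.
// In Qt this would be a signal; here it is just a callback.
struct SketchCanvasView {
    std::function<void(const ClickInfo &)> clicked;
    void mousePress(long time, int pitch) {
        if (clicked) clicked({time, pitch});
    }
};
```

In the real code each "handle" is far richer: the tool goes through a SegmentHelper and issues Commands, so the operation is undoable.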
The Sequencer interfaces directly with ALSA and provides MIDI "play" and "record" ports which can be connected to other MIDI clients (MIDI IN and OUT hardware ports or ALSA synth devices) using any ALSA MIDI connection manager. The Sequencer also supports playing and recording of audio sample files using JACK.
The GUI and Sequencer were originally implemented as separate processes communicating through the KDE DCOP framework, but they have now been restructured into separate threads of a single process. The original design still explains some of the structure of these classes, however. Generally, the DCOP functions that the GUI used to call in the sequencer are now simple public functions of RosegardenSequencer, described in the RosegardenSequencerIface parent class (which is retained purely for descriptive purposes); calls that the sequencer used to make back to the GUI have mostly been replaced by polling from the GUI to the sequencer.
The main operations invoked from the GUI involve starting and stopping the Sequencer, playing and recording, fast forwarding and rewinding. Once a play or record cycle is enabled it's the Sequencer that does most of the hard work. Events are read from (or written to, when recording) a set of mmapped files shared between the threads.
The Sequencer makes use of two libraries, libRosegardenSequencer and libRosegardenSound:
- libRosegardenSequencer holds everything pertinent to sequencing for Rosegarden including the Sequencer class itself.
- libRosegardenSound holds the MidiFile class (writing and reading MIDI files) and the MappedEvent and MappedEventList classes (the communication class for transferring events back and forth between sequencer and GUI). This library is needed by the GUI as well as the Sequencer.
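The design constraint on such a communication class can be illustrated with a sketch (hypothetical layout, not the real MappedEvent): a small, fixed-size, pointer-free record that is safe to copy through memory shared between the threads:

```cpp
#include <cstdint>
#include <type_traits>

// Hypothetical sketch of a transfer record: fixed size, no pointers, no
// dynamic allocation, so it can be written into and read from a shared
// (e.g. mmapped) buffer by plain copying.
struct SketchMappedEvent {
    std::uint8_t type;     // note on/off, controller, ...
    std::uint8_t channel;
    std::uint8_t data1;    // pitch or controller number
    std::uint8_t data2;    // velocity or controller value
    std::int64_t timeSec;  // performance time, whole seconds
    std::int32_t timeNsec; // performance time, nanoseconds part
};

static_assert(std::is_trivially_copyable<SketchMappedEvent>::value,
              "must be memcpy-able to cross the thread boundary safely");
```

This is why the rich, property-laden Event stays on the GUI side, while only flattened records of this kind travel to and from the Sequencer.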
The main Sequencer state machine is a good starting point and clearly visible at the bottom of rosegarden/sequencer/main.cpp.