Diffstat (limited to 'doc/artsbuilder/future.docbook')
-rw-r--r--  doc/artsbuilder/future.docbook  414
1 file changed, 414 insertions, 0 deletions
diff --git a/doc/artsbuilder/future.docbook b/doc/artsbuilder/future.docbook
new file mode 100644
index 00000000..529cc103
--- /dev/null
+++ b/doc/artsbuilder/future.docbook
@@ -0,0 +1,414 @@
+<!-- <?xml version="1.0" ?>
+<!DOCTYPE chapter PUBLIC "-//KDE//DTD DocBook XML V4.1.2-Based Variant
+V1.1//EN" "dtd/kdex.dtd">
+To validate or process this file as a standalone document, uncomment
+this prolog. Be sure to comment it out again when you are done -->
+
+<chapter id="future-work">
+<title>Future Work</title>
+
+<para>
+This section describes some of the &arts; work that is in progress.
+Development progresses quickly, so this information may be out of date.
+You should check the TODO file and the <link
+linkend="mailing-lists">mailing list</link> archives to see what new
+functionality is planned. Feel free to get involved in new design and
+implementation.
+</para>
+
+<para>
+This is a draft document which tries to give you an overview of how new
+technologies will be integrated into &arts;. It covers the
+following:
+</para>
+
+<itemizedlist>
+<listitem><para>How interfaces work.</para></listitem>
+<listitem><para>Codecs: decoding of mp3 or wav streams into a form in
+which they can be used as data.</para></listitem>
+<listitem><para>Video.</para></listitem>
+<listitem><para>Threading.</para></listitem>
+<listitem><para>Synchronization.</para></listitem>
+<listitem><para>Dynamic expansion/masquerading.</para></listitem>
+<listitem><para>Dynamic composition.</para></listitem>
+<listitem><para>&GUI;</para></listitem>
+<listitem><para>&MIDI;</para></listitem>
+</itemizedlist>
+
+<para>
+This is work in progress. However, it should be the basis if you want
+to see new technology in &arts;. It should give you a general idea of
+how these problems will be addressed. Feel free to correct anything you
+see here.
+</para>
+
+<para>
+Things that will be using &arts; technology (so please coordinate your
+efforts):
+</para>
+
+<itemizedlist>
+<listitem>
+<para>
+<application>KPhone</application> (voice over <acronym>IP</acronym>)
+</para>
+</listitem>
+
+<listitem>
+<para>
+&noatun; (video / audio player)
+</para>
+</listitem>
+
+<listitem>
+<para>
+&artscontrol; (sound server control program, for scopes)
+</para>
+</listitem>
+
+<listitem>
+<para>
+<application>Brahms</application> (music sequencer)
+</para>
+</listitem>
+
+<listitem>
+<para><application>Kaiman</application> (&kde;2 media player - kmedia2 compliant)
+</para>
+</listitem>
+
+<listitem>
+<para>
+<application>mpglib</application>/<application>kmpg</application>
+(<acronym>MPEG</acronym> audio and video playing technology)
+</para>
+</listitem>
+
+<listitem>
+<para>
+<application>SDL</application> (Simple DirectMedia Layer, for games;
+not yet started, but maybe nice)
+</para>
+</listitem>
+
+<listitem>
+<para>
+<application>electric ears</application> (author contacted me - status
+unknown)
+</para>
+</listitem>
+</itemizedlist>
+
+<sect1 id="interfaces-how">
+<title>How Interfaces Work</title>
+
+<!-- I think this is now obsolete and documented elsewhere ? -->
+
+<para>
+&MCOP; interfaces are the base of the &arts; concept. They are the
+network-transparent equivalent of C++ classes. Whenever possible, you
+should orient your design towards interfaces. Interfaces consist of four
+parts:
+</para>
+
+<itemizedlist>
+<listitem><para>Synchronous streams</para></listitem>
+<listitem><para>Asynchronous streams</para></listitem>
+<listitem><para>Methods</para></listitem>
+<listitem><para>Attributes</para></listitem>
+</itemizedlist>
+
+<para>
+These can be mixed in any way you like. New technologies should be
+defined in terms of interfaces. Read the sections about asynchronous
+streams and synchronous streams, as well as the KMedia2 interfaces,
+which are a good example of how such things work.
+</para>
+
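+<para>
+As a purely hypothetical illustration (none of these names exist in
+&arts;), an interface mixing all four parts could look like this:
+</para>
+
+<programlisting>
+interface ExampleMixer {
+    in audio stream inleft, inright;     /* synchronous streams */
+    out audio stream outleft, outright;
+    in async byte stream commands;       /* an asynchronous stream */
+    void reset();                        /* a method */
+    attribute float volume;              /* an attribute */
+};
+</programlisting>
+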
+<para>
+Interfaces are specified in <literal role="extension">.idl</literal>
+code and run through the <command>mcopidl</command> compiler. You
+derive the
+<classname><replaceable>Interfacename</replaceable>_impl</classname>
+class to implement them, and use
+<function>REGISTER_IMPLEMENTATION(Interfacename_impl)</function> to
+insert your object implementations into the &MCOP; object system.
+</para>
+
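+<para>
+As a minimal sketch (the <literal>Hello</literal> names are made up),
+given an <literal role="extension">.idl</literal> file containing
+<literal>interface Hello { void hello(); };</literal>, the
+implementation side could look like this:
+</para>
+
+<programlisting>
+/* hello_impl.cc - assumes mcopidl generated hello.h, containing the
+   Hello_skel base class, from hello.idl */
+#include "hello.h"
+#include &lt;stdio.h&gt;
+
+class Hello_impl : virtual public Hello_skel {
+public:
+    void hello() { printf("Hello!\n"); }
+};
+
+REGISTER_IMPLEMENTATION(Hello_impl);
+</programlisting>
+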
+</sect1>
+
+<sect1 id="codecs">
+<title>Codecs - Data Decoding</title>
+
+<para>
+The kmedia2 interfaces allow you to ignore the fact that wav files,
+mp3s and the like consist of data streams. Instead, you only implement
+methods to play them.
+</para>
+
+<para>
+Thus, you can write a wave loading routine in such a way that you can
+play wave files (as a PlayObject), but nobody else can reuse your code.
+</para>
+
+<para>
+Asynchronous streams would be the alternative. You define an interface
+which allows you to pass data blocks in and get data blocks out. In
+&MCOP;, it looks like this:
+</para>
+
+<programlisting>
+interface Codec {
+ in async byte stream indata;
+ out async byte stream outdata;
+};
+</programlisting>
+
+
+<para>
+Of course codecs could also provide attributes to emit additional data,
+such as format information.
+</para>
+
+<programlisting>
+interface ByteAudioCodec {
+ in async byte stream indata;
+ out async byte stream outdata;
+ readonly attribute long samplingRate, bits, channels;
+};
+</programlisting>
+
+<para>
+This <interfacename>ByteAudioCodec</interfacename>, for instance, could
+be connected to a <interfacename>ByteStreamToAudio</interfacename>
+object to produce real, float audio.
+</para>
+
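+<para>
+A rough C++ sketch of that connection, assuming a
+<interfacename>ByteAudioCodec</interfacename> implementation existed;
+it would be wired up like any other flow system module:
+</para>
+
+<programlisting>
+Dispatcher dispatcher;
+ByteAudioCodec codec;          /* hypothetical - see the .idl above */
+ByteStreamToAudio convert;
+Synth_PLAY play;
+
+connect(codec, "outdata", convert, "indata");
+connect(convert, "left", play, "invalue_left");
+connect(convert, "right", play, "invalue_right");
+
+codec.start(); convert.start(); play.start();
+dispatcher.run();
+</programlisting>
+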
+<para>
+Of course, other Codec types could directly emit video data, such as:
+</para>
+
+<programlisting>
+interface VideoCodec {
+ in async byte stream indata;
+ out video stream outdata; /* note: video streams do not exist yet */
+};
+</programlisting>
+
+<para>
+Most likely, a codec concept should be employed rather than the
+<quote>you know how to play and I don't</quote> approach that, for
+instance, <interfacename>WavPlayObject</interfacename> currently uses.
+However, somebody needs to sit down and do some experiments before an
+<acronym>API</acronym> can be finalized.
+</para>
+
+</sect1>
+
+<sect1 id="video">
+<title>Video</title>
+
+<para>
+My idea is to provide video as asynchronous streams of some native
+&MCOP; data type which contains images. This data type has yet to be
+created. That way, plugins which deal with video images could be
+connected the same way audio plugins can be.
+</para>
+
+<para>
+There are a few things that are important not to leave out, namely:
+</para>
+
+<itemizedlist>
+<listitem>
+<para>
+There are <acronym>RGB</acronym> and <acronym>YUV</acronym> colorspaces.
+</para>
+</listitem>
+<listitem>
+<para>
+The format should somehow be tagged onto the stream.
+</para>
+</listitem>
+<listitem>
+<para>
+Synchronization is important.
+</para>
+</listitem>
+</itemizedlist>
+
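+<para>
+A purely hypothetical sketch of such a native data type, covering the
+points above (none of these names exist yet):
+</para>
+
+<programlisting>
+enum ColorSpace { csRGB, csYUV };
+
+struct VideoFrame {
+    ColorSpace format;        /* colorspace tagged onto the stream */
+    long width, height;
+    sequence&lt;byte&gt; data;
+};
+
+interface VideoSource {
+    out async VideoFrame stream frames;
+};
+</programlisting>
+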
+<para>
+My idea is to make it possible to reimplement the
+<classname>VideoFrame</classname> class so that it can store its
+contents in a shared memory segment. That way, even video streaming
+between different processes would be possible without too much pain.
+</para>
+
+<para>
+However, the standard situation for video is that things are in the same
+process, from the decoding to the rendering.
+</para>
+
+<para>
+I have done a prototype video streaming implementation, which you can
+download <ulink
+url="http://space.twc.de/~stefan/kde/download/video-quickdraw.tar.gz">here
+</ulink>. This would need to be integrated into &MCOP; after some
+experiments.
+</para>
+
+<para>
+A rendering component should be provided that supports XMITSHM (with
+<acronym>RGB</acronym> and <acronym>YUV</acronym>); Martin Vogt told me
+he is working on such a thing.
+</para>
+
+</sect1>
+
+<sect1 id="threading">
+<title>Threading</title>
+
+<para>
+Currently, &MCOP; is all single-threaded. Maybe for video we will no
+longer be able to get around threading. There are a few things that
+should be treated carefully:
+</para>
+
+
+<itemizedlist>
+<listitem><para>
+SmartWrappers - they are not threadsafe due to unsafe reference
+counting and similar issues.
+</para>
+</listitem>
+<listitem>
+<para>
+Dispatcher / I/O - also not threadsafe.
+</para>
+</listitem>
+</itemizedlist>
+
+<para>
+However, what I could imagine is making selected modules threadsafe,
+for both synchronous and asynchronous streaming. That way, with a
+thread-aware flow system, you could schedule the signal flow over two
+or more processors. This would also help audio a lot on multiprocessor
+machines.
+</para>
+
+<para>
+How it would work:
+</para>
+
+
+<itemizedlist>
+<listitem>
+<para>The Flow System decides which modules should calculate what - that
+is:
+</para>
+ <itemizedlist>
+ <listitem><para>video frames (with process_indata method)</para></listitem>
+ <listitem><para>synchronous audio streams
+ (calculateBlock)</para></listitem>
+ <listitem><para>other asynchronous streams, mainly byte
+ streams</para></listitem>
+ </itemizedlist>
+</listitem>
+<listitem>
+<para>
+Modules can calculate these things in their own threads. For audio, it
+makes sense to reuse threads (&eg; render on four threads for four
+processors, no matter if 100 modules are running). For video and byte
+decompression, it may be more comfortable to have a blocking
+implementation in its own thread, which is synchronized against the
+rest of &MCOP; by the flow system (see the sketch after this list).
+</para>
+</listitem>
+
+<listitem>
+<para>
+Modules may not use &MCOP; functionality (such as remote invocations)
+during threaded operation.
+</para>
+</listitem>
+</itemizedlist>
+
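+<para>
+A rough, purely illustrative sketch of the last two points: a decoder
+doing blocking work in its own thread (the flow system hook is
+imaginary):
+</para>
+
+<programlisting>
+#include &lt;pthread.h&gt;
+
+class ThreadedDecoder {
+    pthread_t thread;
+    pthread_mutex_t lock;
+    /* ... input/output packet queues ... */
+
+    static void *run(void *arg) {
+        /* blocking decode loop: take input packets, decode them, then
+           lock, queue the results for the flow system, unlock; no
+           &MCOP; calls (such as remote invocations) from in here */
+        return 0;
+    }
+public:
+    ThreadedDecoder() {
+        pthread_mutex_init(&amp;lock, 0);
+        pthread_create(&amp;thread, 0, run, this);
+    }
+};
+</programlisting>
+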
+</sect1>
+
+<sect1 id="synchronization">
+<title>Synchronization</title>
+
+<para>
+Video and &MIDI; (and audio) may require synchronization. Basically, that
+is timestamping. The idea I have is to attach timestamps to asynchronous
+streams by adding one timestamp to each packet. If you send two video
+frames, simply make it two packets (they are large anyway), so that you
+can have two different time stamps.
+</para>
+
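+<para>
+For instance (hypothetical names; no such types exist yet), the
+per-packet timestamp could be a plain &MCOP; structure:
+</para>
+
+<programlisting>
+struct TimeStamp {
+    long sec;
+    long usec;
+};
+
+struct VideoPacket {
+    TimeStamp time;           /* one timestamp per packet */
+    sequence&lt;byte&gt; frame;
+};
+</programlisting>
+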
+<para>
+Audio should implicitly have time stamps, as it is synchronous.
+</para>
+
+</sect1>
+
+<sect1 id="dynamic-composition">
+<title>Dynamic Composition</title>
+
+<para>
+It should be possible to say: an effect FX is composed of these
+simpler modules. FX should look like a normal &MCOP; module (see
+masquerading), but in fact consist of other modules.
+</para>
+
+<para>
+This is required for &arts-builder;.
+</para>
+
+</sect1>
+
+<sect1 id="gui">
+<title>&GUI;</title>
+
+<para>
+All &GUI; components will be &MCOP; modules. They should have attributes
+like size, label, color, and so on. A <acronym>RAD</acronym> builder
+(&arts-builder;) should be able to compose them visually.
+</para>
+
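+<para>
+A hypothetical sketch of what such a component could look like in
+&MCOP; (the attribute set is made up):
+</para>
+
+<programlisting>
+interface Widget {
+    attribute long x, y, width, height;
+    attribute string label;
+    attribute long color;
+};
+</programlisting>
+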
+<para>
+It should be possible to save the &GUI; by saving its attributes.
+</para>
+
+</sect1>
+
+<sect1 id="midi-stuff">
+<title>&MIDI;</title>
+
+<para>
+The &MIDI; stuff will be implemented as asynchronous streams. There are
+two options: one is to use normal &MCOP; structures to define the types,
+and the other is to introduce yet more custom types.
+</para>
+
+<para>
+I think normal structures may be enough; that is, something like:
+</para>
+
+<programlisting>
+struct MidiEvent {
+ byte b1,b2,b3;
+ sequence&lt;byte&gt; sysex;
+};
+</programlisting>
+
+<para>
+Asynchronous streams should support custom stream types.
+</para>
+
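+<para>
+A custom stream of these events could then be declared the same way as
+the byte streams above, for instance (hypothetical):
+</para>
+
+<programlisting>
+interface MidiPort {
+    in async MidiEvent stream events;
+};
+</programlisting>
+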
+</sect1>
+
+</chapter>
+
+