<chapter id="future-work">
<title>Future Work</title>

<para>
This chapter describes some of the &arts; work that is in progress.
Development progresses quickly, so this information may be out of date.
You should check the <filename>TODO</filename> file and the <link
linkend="mailing-lists">mailing list</link> archives to see what new
functionality is planned. Feel free to get involved in new design and
implementation.
</para>

<para>
This is a draft document which tries to give you an overview of how new
technologies will be integrated into &arts;. It covers the
following:
</para>

<itemizedlist>
<listitem><para>How interfaces work.</para></listitem>
<listitem><para>Codecs - decoding of mp3 or wav streams in a form that
they can be used as data.</para></listitem>
<listitem><para>Video.</para></listitem>
<listitem><para>Threading.</para></listitem>
<listitem><para>Synchronization.</para></listitem>
<listitem><para>Dynamic expansion/masquerading.</para></listitem>
<listitem><para>Dynamic composition.</para></listitem>
<listitem><para>&GUI;</para></listitem>
<listitem><para>&MIDI;</para></listitem>
</itemizedlist>

<para>
This is work in progress, but it should be the basis if you want to
see new technology in &arts;. It should give you a general idea of how
these problems will be addressed. Feel free to correct anything you
see here.
</para>

<para>
Things that will use &arts; technology (so please coordinate your
efforts):
</para>

<itemizedlist>
<listitem>
<para>
<application>KPhone</application>   (voice over <acronym>IP</acronym>)
</para>
</listitem>

<listitem>
<para>
&noatun;   (video / audio player)
</para>
</listitem>

<listitem>
<para>
&artscontrol; (sound server control program, for scopes)
</para>
</listitem>

<listitem>
<para>
<application>Brahms</application>   (music sequencer)
</para>
</listitem>

<listitem>
<para><application>Kaiman</application>   (&kde;2 media player - kmedia2 compliant)
</para>
</listitem>

<listitem>
<para>
<application>mpglib</application>/<application>kmpg</application>
(<acronym>MPEG</acronym> audio and video playing technology)
</para>
</listitem>

<listitem>
<para>
<application>SDL</application>      (direct media layer for games;
not yet started, but maybe nice)
</para>
</listitem>

<listitem>
<para>
<application>Electric Ears</application> (the author contacted me;
status unknown)
</para>
</listitem>
</itemizedlist>

<sect1 id="interfaces-how">
<title>How Interfaces Work</title>

<!-- I think this is now obsolete and documented elsewhere ? -->

<para>
&MCOP; interfaces are the base of the &arts; concept. They are the
network-transparent equivalent of C++ classes. Whenever possible, you
should orient your design towards interfaces. Interfaces consist of four
parts:
</para>

<itemizedlist>
<listitem><para>Synchronous streams</para></listitem>
<listitem><para>Asynchronous streams</para></listitem>
<listitem><para>Methods</para></listitem>
<listitem><para>Attributes</para></listitem>
</itemizedlist>

<para>
These can be mixed in any way you like. New technologies should be
defined in terms of interfaces. Read the sections about asynchronous
streams and synchronous streams, as well as the KMedia2 interfaces,
which are a good example of how such things work.
</para>

<para>
Interfaces are specified in <literal role="extension">.idl</literal>
files and run through the <command>mcopidl</command> compiler. You
derive the
<classname><replaceable>Interfacename</replaceable>_impl</classname>
class to implement them, and use
<function>REGISTER_IMPLEMENTATION(Interfacename_impl)</function> to
insert your object implementations into the &MCOP; object system.
</para>

</sect1>

<sect1 id="codecs">
<title>Codecs - Data Decoding</title>

<para>
The KMedia2 interfaces allow you to ignore the fact that wav files,
mp3s and so on consist of data streams; instead, you only implement
methods to play them.
</para>

<para>
Thus, you can write a wave loading routine in such a way that you can
play wave files (as a PlayObject), but nobody else can reuse your code.
</para>

<para>
Asynchronous streams would be the alternative. You define an interface
which allows you to pass data blocks in and get data blocks out. In
&MCOP;, this looks like:
</para>

<programlisting>
interface Codec {
  in async byte stream indata;
  out async byte stream outdata;
};
</programlisting>


<para>
Of course, codecs could also provide attributes to emit additional data,
such as format information.
</para>

<programlisting>
interface ByteAudioCodec {
  in async byte stream indata;
  out async byte stream outdata;
  readonly attribute long samplingRate, bits, channels;
};
</programlisting>

<para>
This <interfacename>ByteAudioCodec</interfacename>, for instance, could
be connected to a <interfacename>ByteStreamToAudio</interfacename>
object to produce real float audio.
</para>
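
<para>
As a rough C++ sketch (the codec implementation and the
<function>connect()</function> helpers of the flow system are
assumptions here), such a connection could look like this:
</para>

<programlisting>
// hypothetical sketch - object and port names are assumptions
ByteAudioCodec codec;          // decodes bytes to bytes, as above
ByteStreamToAudio convert;     // turns raw bytes into float audio

connect(codec, "outdata", convert, "indata");
// convert would then offer float audio streams to the rest of the
// flow graph
</programlisting>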

<para>
Of course, other codec types could directly emit video data, such
as:
</para>

<programlisting>
interface VideoCodec {
  in async byte stream indata;
  out video stream outdata;      /* note: video streams do not exist yet */
};
</programlisting>

<para>
Most likely, a codec concept should be employed rather than the
<quote>you know how to play and I don't</quote> approach that, for
instance, <interfacename>WavPlayObject</interfacename> currently uses.
However, somebody needs to sit down and do some experiments before an
<acronym>API</acronym> can be finalized.
</para>

</sect1>

<sect1 id="video">
<title>Video</title>

<para>
My idea is to provide video as asynchronous streams of some native
&MCOP; data type which contains images. This data type is yet to be
created. That way, plugins which deal with video images could be
connected the same way audio plugins can be.
</para>

<para>
There are a few things that are important not to leave out, namely:
</para>

<itemizedlist>
<listitem>
<para>
There are <acronym>RGB</acronym> and <acronym>YUV</acronym> colorspaces.
</para>
</listitem>
<listitem>
<para>
The format should somehow be attached to the stream.
</para>
</listitem>
<listitem>
<para>
Synchronization is important.
</para>
</listitem>
</itemizedlist>
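
<para>
A hypothetical sketch of such a native data type, covering the
colorspace and format tagging points above (synchronization is treated
in its own section below):
</para>

<programlisting>
/* sketch only - neither video streams nor this type exist yet */
struct VideoFrame {
  long width, height;
  long colorspace;             /* tags the format, e.g. RGB or YUV */
  sequence&lt;byte&gt; data;   /* the image data itself */
};
</programlisting>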

<para>
My idea is to make it possible to reimplement the
<classname>VideoFrame</classname> class so that it can store its data
in a shared memory segment. That way, even video streaming between
different processes would be possible without too much pain.
</para>

<para>
However, the standard situation for video is that things are in the same
process, from the decoding to the rendering.
</para>

<para>
I have done a prototype video streaming implementation, which you can
download <ulink
url="http://space.twc.de/~stefan/kde/download/video-quickdraw.tar.gz">here
</ulink>. This would need to be integrated into &MCOP; after some
experiments.
</para>

<para>
A rendering component should be provided that supports X11 shared
memory (<acronym>MIT-SHM</acronym>), with <acronym>RGB</acronym> and
<acronym>YUV</acronym>; Martin Vogt told me he is working on such a
thing.
</para>

</sect1>

<sect1 id="threading">
<title>Threading</title>

<para>
Currently, &MCOP; is entirely single-threaded. Maybe for video we will
no longer be able to get around threading. There are a few things that
should be treated carefully:
</para>


<itemizedlist>
<listitem><para>
SmartWrappers - they are not threadsafe due to unsafe reference
counting and similar issues.
</para>
</listitem>
<listitem>
<para>
Dispatcher / I/O - also not threadsafe.
</para>
</listitem>
</itemizedlist>

<para>
However, what I could imagine is to make selected modules threadsafe,
for both synchronous and asynchronous streaming. That way, with a
thread-aware flow system, you could schedule the signal flow over two or
more processors. This would also help audio a lot on multiprocessor
machines.
</para>

<para>
How it would work:
</para>


<itemizedlist>
<listitem>
<para>The Flow System decides which modules should calculate what - that
is:
</para>
    <itemizedlist>
	<listitem><para>video frames (with the process_indata method)</para></listitem>
	<listitem><para>synchronous audio streams
	(calculateBlock)</para></listitem>
	<listitem><para>other asynchronous streams, mainly byte
	streams</para></listitem>
	</itemizedlist>
</listitem>
<listitem>
<para>
Modules can calculate these things in their own threads. For audio, it
makes sense to reuse threads (&eg; render on four threads for four
processors, no matter whether 100 modules are running). For video and
byte decompression, it may be more comfortable to have a blocking
implementation in its own thread, synchronized against the rest of
&MCOP; by the flow system (see the sketch after this list).
</para>
</listitem>

<listitem>
<para>
Modules may not use &MCOP; functionality (such as remote invocations)
during threaded operation.
</para>
</listitem>
</itemizedlist>
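
<para>
The following is a rough C++ sketch (all names are hypothetical) of the
pattern mentioned above: a blocking decoder runs in its own thread and
hands finished blocks to the flow system through a queue, so the flow
system thread never blocks and the worker never calls into &MCOP;:
</para>

<programlisting>
#include &lt;atomic&gt;
#include &lt;mutex&gt;
#include &lt;queue&gt;
#include &lt;thread&gt;
#include &lt;vector&gt;

class ThreadedDecoder {
public:
  ThreadedDecoder() : worker(&amp;ThreadedDecoder::decodeLoop, this) {}
  ~ThreadedDecoder() { stop = true; worker.join(); }

  // called from the flow system thread to fetch a finished block
  bool fetchBlock(std::vector&lt;float&gt; &amp;block) {
    std::lock_guard&lt;std::mutex&gt; guard(lock);
    if (ready.empty()) return false;
    block = ready.front();
    ready.pop();
    return true;
  }

private:
  // runs in its own thread; may block, but must not touch MCOP
  void decodeLoop() {
    while (!stop) {
      std::vector&lt;float&gt; block = decodeNextBlock(); // may block
      std::lock_guard&lt;std::mutex&gt; guard(lock);
      ready.push(block);  // a real implementation would bound this
    }
  }

  // stands in for the real decoder, e.g. something wrapping mpglib
  std::vector&lt;float&gt; decodeNextBlock() {
    return std::vector&lt;float&gt;(1024, 0.0f); // dummy silence
  }

  std::queue&lt;std::vector&lt;float&gt; &gt; ready;
  std::mutex lock;
  std::atomic&lt;bool&gt; stop{false};
  std::thread worker;
};
</programlisting>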

</sect1>

<sect1 id="synchronization">
<title>Synchronization</title>

<para>
Video and &MIDI; (and audio) may require synchronization. Basically,
that is timestamping. The idea I have is to attach timestamps to
asynchronous streams by adding one timestamp to each packet. If you send
two video frames, simply make them two packets (they are large anyway),
so that you can have two different timestamps.
</para>
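
<para>
A minimal sketch of what such a timestamped packet type could look like
(hypothetical names; none of this exists yet):
</para>

<programlisting>
struct TimeStamp {
  long sec, usec;
};

struct TimestampedFrame {
  TimeStamp time;              /* one timestamp per packet */
  sequence&lt;byte&gt; data;
};
</programlisting>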

<para>
Audio should implicitly have timestamps, as it is synchronous.
</para>

</sect1>

<sect1 id="dynamic-composition">
<title>Dynamic Composition</title>

<para>
It should be possible to say: an effect FX is composed of these
simpler modules. FX should look like a normal &MCOP; module (see
masquerading), but in fact consists of other modules.
</para>
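
<para>
As a hypothetical sketch, such a composed effect could be declared like
any other module, hiding its internal structure (the
<interfacename>StereoEffect</interfacename> base is only an assumption
here):
</para>

<programlisting>
interface FancyReverb : StereoEffect {
  /* looks like one effect from the outside; internally built
     out of delay, filter and mixer modules */
  attribute float roomSize;
};
</programlisting>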

<para>
This is required for &arts-builder;.
</para>

</sect1>

<sect1 id="gui">
<title>&GUI;</title>

<para>
All &GUI; components will be &MCOP; modules. They should have attributes
like size, label, color, and so on. A <acronym>RAD</acronym> builder
(&arts-builder;) should be able to compose them visually.
</para>
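
<para>
A hypothetical sketch of what such a component could look like (neither
the <interfacename>Widget</interfacename> base nor these attributes
exist yet):
</para>

<programlisting>
interface Button : Widget {
  attribute string label;
  attribute long width, height;
  attribute long color;          /* e.g. a packed RGB value */
};
</programlisting>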

<para>
The &GUI; should be savable by saving the attributes.
</para>

</sect1>

<sect1 id="midi-stuff">
<title>&MIDI;</title>

<para>
The &MIDI; stuff will be implemented as asynchronous streams. There are
two options: one is to use normal &MCOP; structures to define the types,
and the other is to introduce yet more custom types.
</para>

<para>
I think normal structures may be enough; that is, something like:
</para>

<programlisting>
struct MidiEvent {
  byte b1,b2,b3;
  sequence&lt;byte&gt; sysex;
};
</programlisting>

<para>
Asynchronous streams should support custom stream types.
</para>
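
<para>
Assuming asynchronous streams learn to carry such custom types, a
&MIDI; port could then be declared like this (hypothetical sketch):
</para>

<programlisting>
interface MidiPort {
  in async MidiEvent stream indata;
  out async MidiEvent stream outdata;
};
</programlisting>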

</sect1>

</chapter>