path: root/doc/emotion_examples.dox
author    Gustavo Sverzut Barbieri <barbieri@gmail.com>  2013-01-10 03:43:32 +0000
committer Gustavo Sverzut Barbieri <barbieri@gmail.com>  2013-01-10 03:43:32 +0000
commit    dfb84c1657bfb14a5236b881193b81f4c0b8a69b (patch)
tree      b51b210fc88a21eec8e5907b8bbfe12ebc669f90 /doc/emotion_examples.dox
parent    532284dbbe4259a9f2291f44d3eff376849e8031 (diff)
efl: merge emotion.
This one was quite a huge work, but hopefully it's correct.

NOTES:
* removed the vlc generic module; it should go into a separate package.
* gstreamer is enabled by default (see --disable-gstreamer)
* xine is disabled by default (see --enable-xine)
* generic is always built statically if supported
* gstreamer and xine can't be configured as static (this just lacks command-line options; the build system supports it)
* v4l2 is enabled by default on Linux if eeze is built (see --disable-v4l2)
* emotion_test moved to src/tests/emotion and depends on EFL_ENABLE_TESTS (--with-tests), but is still installed if enabled.

TODO (need your help!):
* fix warnings with the gstreamer and xine engines
* call engine shutdown functions if building as static
* remove direct usage of PACKAGE_*_DIR and use eina_prefix
* add an eina_prefix checkme file, as evas and others do
* add support for $EFL_RUN_IN_TREE
* create a separate package for emotion_generic_modules
* check the docs hierarchy (doxygen is segfaulting here)

SVN revision: 82501
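The engine defaults described in the notes could be exercised with configure invocations along these lines (an illustrative sketch; the flag spellings are taken from the notes above, and the exact set of available options depends on the EFL tree being built):

```shell
# Defaults per the notes: gstreamer on, xine off,
# v4l2 on (Linux with eeze built):
./configure

# Swap engines: turn gstreamer off and xine on:
./configure --disable-gstreamer --enable-xine

# Drop v4l2 support and enable the test suite,
# which builds (and installs) emotion_test:
./configure --disable-v4l2 --with-tests
```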
Diffstat (limited to 'doc/emotion_examples.dox')
-rw-r--r--  doc/emotion_examples.dox | 104
1 file changed, 104 insertions(+), 0 deletions(-)
diff --git a/doc/emotion_examples.dox b/doc/emotion_examples.dox
new file mode 100644
index 0000000000..4015d9e158
--- /dev/null
+++ b/doc/emotion_examples.dox
@@ -0,0 +1,104 @@
/**
 * @page emotion_examples Emotion Examples
 *
 * Here is a page with some Emotion examples explained:
 *
 * @li @ref emotion_basic_example_c
 * @li @ref emotion_signals_example.c "Emotion signals"
 * @li @ref emotion_test_main.c "emotion_test - full API usage"
 *
 */

/**
 * @page emotion_basic_example_c Emotion - Basic library usage
 *
 * This example shows how to set up a simple Emotion object, make it start
 * playing, and register a callback that tells us when playback starts. See
 * @ref emotion_basic_example.c "the full code here".
 *
 * @dontinclude emotion_basic_example.c
 *
 * We start this example by including the header files necessary to work with
 * Emotion and to display some debug messages:
 * @until stdio.h
 *
 * Then we declare a callback to be called when the object starts its
 * playback:
 *
 * @until }
 *
 * Some basic setup of our canvas, window, and background is necessary before
 * displaying our object on it. This setup also includes reading the file to
 * be opened from the program's argument list. Since this is not directly
 * related to Emotion itself, we show the code for it without further
 * explanation:
 *
 * @until evas_object_show(bg);
 *
 * Finally, we start the Emotion part. First we have to create the object on
 * this canvas and initialize it:
 *
 * @until emotion_object_init
 *
 * Notice that we didn't specify which module will be used, so Emotion will
 * use the first module found. There's no guarantee of the order in which
 * modules are found, so if you need a specific one, name it explicitly in
 * the second argument of emotion_object_init().
 *
 * Now the callback can be registered on this object. It's a normal Evas
 * smart object callback, so we add it with evas_object_smart_callback_add():
 *
 * @until NULL
 *
 * The object itself is ready for use, but we still need to load a file into
 * it. This is done by the following function:
 *
 * @until file_set
 *
 * This object can play audio or video files. For the latter, the image must
 * be displayed on our canvas, which is why we added the object to it. So,
 * like any other Evas object on the canvas, we have to specify its position
 * and size, and explicitly set its visibility. These are the position and
 * dimensions at which the video will be displayed:
 *
 * @until evas_object_show
 *
 * With the basic steps done, we can now start playing our file. For this we
 * just call the basic playback control function, then enter the main loop
 * and watch the audio/video play:
 *
 * @until main_loop_begin
 *
 * The rest of the code doesn't contain anything special:
 *
 * @until }
 *
 * This code just frees the canvas, shuts down the library, and has an entry
 * point for exiting on error.
 */


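The walkthrough above can be condensed into a short sketch. This is not the example file itself, just an illustration of the same steps using the standard Emotion and Ecore-Evas calls, with error handling trimmed for brevity:

```c
/* Condensed sketch of the basic-example steps: create a canvas,
 * create and initialize an Emotion object, register the
 * "playback_started" smart callback, load a file, show the object,
 * and start playback. Requires the EFL development packages. */
#include <Ecore.h>
#include <Ecore_Evas.h>
#include <Emotion.h>
#include <stdio.h>

static void
_playback_started_cb(void *data, Evas_Object *obj, void *event_info)
{
   printf("Emotion object started playback.\n");
}

int
main(int argc, char *argv[])
{
   Ecore_Evas *ee;
   Evas *canvas;
   Evas_Object *em;

   if (argc < 2)
     {
        fprintf(stderr, "Usage: %s <filename>\n", argv[0]);
        return -1;
     }

   ecore_evas_init();
   ee = ecore_evas_new(NULL, 0, 0, 320, 240, NULL);
   canvas = ecore_evas_get(ee);
   ecore_evas_show(ee);

   /* NULL lets Emotion pick the first module found; pass a module
    * name (e.g. "gstreamer") to request a specific engine. */
   em = emotion_object_add(canvas);
   emotion_object_init(em, NULL);

   /* A normal Evas smart object callback. */
   evas_object_smart_callback_add(em, "playback_started",
                                  _playback_started_cb, NULL);

   /* Load the file, place the object on the canvas, and play. */
   emotion_object_file_set(em, argv[1]);
   evas_object_move(em, 0, 0);
   evas_object_resize(em, 320, 240);
   evas_object_show(em);
   emotion_object_play_set(em, EINA_TRUE);

   ecore_main_loop_begin();

   ecore_evas_free(ee);
   ecore_evas_shutdown();
   return 0;
}
```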
/**
 * @example emotion_basic_example.c
 * This example shows how to create and play an Emotion object. See @ref
 * emotion_basic_example_c "the explanation here".
 */

/**
 * @example emotion_signals_example.c
 *
 * This example shows that some of the information available from the Emotion
 * object, such as the media file's play length and aspect ratio, may not be
 * available right after setting the file on the object.
 *
 * One callback is declared for each of the signals of interest, and some of
 * the information about the file is displayed. Also notice that the order in
 * which these signals are emitted can change depending on the module being
 * used. The full source code of this example follows:
 */

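Hooking such signals uses the same smart-callback mechanism as the basic example. A minimal sketch (the helper name `_setup_signal_callbacks` and the particular set of signals registered here are illustrative choices, assuming `em` is an Emotion object created as in the basic example):

```c
/* Sketch: register smart callbacks for several Emotion signals.
 * After "open_done" fires, metadata such as the play length is
 * expected to be available. */
#include <Emotion.h>
#include <stdio.h>

static void
_signal_cb(void *data, Evas_Object *obj, void *event_info)
{
   const char *signal_name = data;

   printf("signal: %s, play length: %0.3f seconds\n",
          signal_name, emotion_object_play_length_get(obj));
}

static void
_setup_signal_callbacks(Evas_Object *em)
{
   /* One registration per signal; the signal name is passed as the
    * callback data so a single handler can report all of them. */
   evas_object_smart_callback_add(em, "open_done",
                                  _signal_cb, "open_done");
   evas_object_smart_callback_add(em, "length_change",
                                  _signal_cb, "length_change");
   evas_object_smart_callback_add(em, "playback_started",
                                  _signal_cb, "playback_started");
   evas_object_smart_callback_add(em, "playback_finished",
                                  _signal_cb, "playback_finished");
}
```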
/**
 * @example emotion_test_main.c
 * This example covers the entire Emotion API. Use it as a reference.
 */