forked from enlightenment/efl

emotion - better introduction and link to emotion_test.

SVN revision: 61008

parent 1d4852f11e
commit ffcf50987f
@@ -69,7 +69,7 @@ RECURSIVE = NO
 EXCLUDE =
 EXCLUDE_SYMLINKS = NO
 EXCLUDE_PATTERNS =
-EXAMPLE_PATH = @top_srcdir@/src/examples
+EXAMPLE_PATH = @top_srcdir@/src/examples @top_srcdir@/src/bin
 EXAMPLE_PATTERNS =
 EXAMPLE_RECURSIVE = NO
 INPUT_FILTER =
@@ -12,10 +12,10 @@
  * @image html e.png
  *
  * Emotion is a library that allows playing audio and video files, using one of
- * its backends (gstreamer and xine).
+ * its backends (gstreamer or xine).
  *
  * It is integrated into Ecore through its mainloop, and is transparent to the
- * user of the library how the decoding of audio and video is happening. Once
+ * user of the library how the decoding of audio and video is being done. Once
  * the objects are created, the user can set callbacks to the specific events
  * and set options to this object, all in the main loop (no threads are needed).
  *
@@ -38,8 +38,8 @@
  * @section work How does Emotion work?
  *
  * The Emotion library uses Evas smart objects to allow you to manipulate the
- * created object as any other Evas object, and to connect to its signals and
- * process them when needed. It's also possible to swallow Emotion objects
+ * created object as any other Evas object, and to connect to its signals,
+ * handling them when needed. It's also possible to swallow Emotion objects
  * inside Edje themes, and expect it to behave as a normal image or rectangle
  * when regarding to its dimensions.
  *
@@ -5,6 +5,7 @@
  *
  * @li @ref emotion_basic_example_c
  * @li @ref emotion_signals_example.c "Emotion signals"
+ * @li @ref emotion_test_main.c "emotion_test - full API usage"
  *
  */
 
@@ -96,3 +97,8 @@
  * signals are emitted can change depending on the module being used. Following
  * is the full source code of this example:
  */
+
+/**
+ * @example emotion_test_main.c
+ * This example covers the entire emotion API. Use it as a reference.
+ */
@@ -156,8 +156,29 @@ extern "C" {
  *
  * @{
  *
- * @li Add the description of modules here.
- * @li Basic emotion example
+ * Emotion provides an Evas smart object that allows one to play, control and
+ * display a video or audio file. The API is synchronous, but not everything
+ * happens immediately. There are also some signals to report changed states.
+ *
+ * Basically, once the object is created and initialized, a file will be set to
+ * it, and then it can be resized, moved, and controlled by other Evas object
+ * functions.
+ *
+ * However, the decoding of the music and video occurs not in the Ecore main
+ * loop, but usually in another thread (this depends on the module being used).
+ * The synchronization between this other thread and the main loop is not
+ * visible to the end user of the library. The user can just register callbacks
+ * for the available signals to receive information about the changed states,
+ * and can call other functions from the API to request more changes on the
+ * currently loaded file.
+ *
+ * There will be a delay between an API function being called and the request
+ * being really executed, since the request is made in the main thread but must
+ * be sent to the decoding thread. For this reason, always call functions like
+ * emotion_object_size_get() or emotion_object_length_get() after some signal
+ * has been emitted, like "playback_started" or "open_done". @ref
+ * emotion_signals_example.c "This example demonstrates this behavior".
+ *
  * @section signals Available signals
  * The Evas_Object returned by emotion_object_add() has a number of signals that
  * can be listened to using evas' smart callbacks mechanism. All signals have
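The asynchronous pattern described in the added paragraphs can be sketched as follows. This fragment is illustrative and not part of the commit: it assumes an EFL build with Emotion available, and the file name "movie.avi" and the "gstreamer" module choice are placeholders.

```c
#include <stdio.h>
#include <Ecore.h>
#include <Ecore_Evas.h>
#include <Emotion.h>

/* Called once the decoding thread reports the file as opened;
 * only now is it safe to query media properties. */
static void
_open_done_cb(void *data, Evas_Object *obj, void *event_info)
{
   int w, h;

   emotion_object_size_get(obj, &w, &h);
   printf("video size: %dx%d\n", w, h);
}

int
main(void)
{
   ecore_evas_init();
   Ecore_Evas *ee = ecore_evas_new(NULL, 0, 0, 320, 240, NULL);
   Evas_Object *em = emotion_object_add(ecore_evas_get(ee));

   emotion_object_init(em, "gstreamer"); /* or "xine" */
   evas_object_smart_callback_add(em, "open_done", _open_done_cb, NULL);
   emotion_object_file_set(em, "movie.avi");
   emotion_object_play_set(em, EINA_TRUE);

   ecore_main_loop_begin();
   return 0;
}
```

Note that emotion_object_size_get() is called only from inside the callback, never right after emotion_object_file_set(), for exactly the reason the new documentation gives.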
@@ -168,6 +189,18 @@ extern "C" {
  * @li "open_done" - Emitted when the media file is opened
  * @li "position_update" - Emitted when emotion_object_position_set is called
  * @li "decode_stop" - Emitted after the last frame is decoded
+ *
+ * @section Examples
+ *
+ * The following examples demonstrate Emotion usage. There's also the
+ * emotion_test binary that is distributed with this library and covers the
+ * entire API, but since it is too long and repetitive to be explained, its code
+ * is just displayed as another example.
+ *
+ * @li @ref emotion_basic_example_c
+ * @li @ref emotion_signals_example.c "Emotion signals"
+ * @li @ref emotion_test_main.c "emotion_test - full API usage"
+ *
  */
 
 /**
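Listening to the signals listed in the hunk above uses the standard Evas smart callback mechanism. A minimal sketch, not part of the commit: the helper name `_track_signals` is hypothetical, and `em` is assumed to be an Emotion object created elsewhere with emotion_object_add().

```c
#include <stdio.h>
#include <Evas.h>

/* Generic smart callback; the registered signal name is passed as data. */
static void
_signal_cb(void *data, Evas_Object *obj, void *event_info)
{
   printf("emotion signal: %s\n", (const char *)data);
}

/* Attach the same logger to each documented signal on an existing
 * Emotion object. */
static void
_track_signals(Evas_Object *em)
{
   evas_object_smart_callback_add(em, "open_done", _signal_cb, "open_done");
   evas_object_smart_callback_add(em, "position_update", _signal_cb,
                                  "position_update");
   evas_object_smart_callback_add(em, "decode_stop", _signal_cb, "decode_stop");
}
```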
@@ -488,6 +521,8 @@ EAPI void emotion_object_size_get (const Evas_Object *obj,
  * @param smooth Whether to use smooth scale or not.
  *
  * @see emotion_object_smooth_scale_get()
+ *
+ * @ingroup Emotion_Video
  */
 EAPI void emotion_object_smooth_scale_set (Evas_Object *obj, Eina_Bool smooth);
 
@@ -499,6 +534,8 @@ EAPI void emotion_object_smooth_scale_set (Evas_Object *obj, Eina_B
  * @return Whether the smooth scale is used or not.
  *
  * @see emotion_object_smooth_scale_set()
+ *
+ * @ingroup Emotion_Video
  */
 EAPI Eina_Bool emotion_object_smooth_scale_get (const Evas_Object *obj);
 EAPI void emotion_object_event_simple_send (Evas_Object *obj, Emotion_Event ev);
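The smooth-scale getter/setter pair documented above is typically used as a toggle. A hedged usage sketch, not from the commit (the helper name `_enable_smooth` is hypothetical and `em` is an existing Emotion object):

```c
#include <Emotion.h>

/* Turn on smooth scaling if it is not already enabled. */
static void
_enable_smooth(Evas_Object *em)
{
   if (!emotion_object_smooth_scale_get(em))
     emotion_object_smooth_scale_set(em, EINA_TRUE);
}
```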
@@ -558,7 +558,7 @@ emotion_object_ratio_get(const Evas_Object *obj)
    return sd->ratio;
 }
 
-/**
+/*
  * Send a control event to the DVD.
  */
 EAPI void