From 8d96311b143ebc5e1a3ad75d8b2f85774726d377 Mon Sep 17 00:00:00 2001 From: nils Date: Mon, 4 Mar 2024 15:35:25 +0100 Subject: [PATCH] fix list formatting --- docs/Build/Upgrading.md | 1 + docs/Build/build/GPAC-Build-Guide-for-Linux.md | 3 +++ docs/Filters/Rearchitecture.md | 4 ++++ docs/Howtos/avmix_tuto.md | 13 +++++++++++++ docs/Howtos/dash/DASH-intro.md | 4 ++++ ...ion,-segmentation,-splitting-and-interleaving.md | 1 + docs/Howtos/dash/HAS-advanced.md | 2 ++ .../Howtos/dash/HEVC-Tile-based-adaptation-guide.md | 4 ++++ .../HEVC-Tile-multi-resolution-adaptation-guide.md | 1 + docs/Howtos/dash/LL-DASH.md | 2 ++ docs/Howtos/dash/LL-HLS.md | 1 + docs/Howtos/dash/cmaf.md | 2 ++ docs/Howtos/dash/dash_transcoding.md | 1 + docs/Howtos/dash/hls.md | 1 + docs/Howtos/dynamic_rc.md | 2 ++ docs/Howtos/encoding.md | 1 + docs/Howtos/encryption/encryption-filters.md | 1 + docs/Howtos/filters-oneliners.md | 4 ++++ docs/Howtos/gpac-mp4box.md | 12 ++++++++++++ docs/Howtos/inspecting.md | 1 + docs/Howtos/jsf/evg.md | 1 + docs/Howtos/jsf/jsdash.md | 1 + docs/Howtos/jsf/jsfilter.md | 6 ++++++ docs/Howtos/jsf/jssession.md | 4 ++++ docs/Howtos/jsf/webgl.md | 1 + docs/Howtos/jsf/webgl_three.md | 3 +++ docs/Howtos/mp4box-filters.md | 3 +++ docs/Howtos/mp4box-inplace.md | 1 + docs/Howtos/network-capture.md | 1 + docs/Howtos/nodejs.md | 8 ++++++++ docs/Howtos/playlist.md | 5 +++++ docs/Howtos/python.md | 8 ++++++++ docs/Howtos/realtime.md | 2 ++ docs/Howtos/scenecoding/SceneCodingIntro.md | 1 + docs/Player/Playback.md | 1 + docs/Player/Player-Features.md | 3 +++ docs/Player/olay-composition.md | 1 + docs/xmlformats/BoxPatch.md | 3 +++ docs/xmlformats/Common-Encryption.md | 4 ++++ docs/xmlformats/NHML-Format.md | 2 ++ docs/xmlformats/XML-Binary.md | 2 ++ 41 files changed, 122 insertions(+) diff --git a/docs/Build/Upgrading.md b/docs/Build/Upgrading.md index 187d0618..9916708c 100644 --- a/docs/Build/Upgrading.md +++ b/docs/Build/Upgrading.md @@ -20,6 +20,7 @@ If you build GPAC directly in the source tree (i.e., running `./configure && mak # Out of source tree building To avoid the issue of cleaning dependencies, it is safer to have one dedicated build directory for each branch you test: + - `mkdir bin/master && cd bin/master && ../../configure && make -j` - `mkdir bin/somebranch && cd bin/somebranch && git checkout somebranch && ../../configure && make -j` diff --git a/docs/Build/build/GPAC-Build-Guide-for-Linux.md b/docs/Build/build/GPAC-Build-Guide-for-Linux.md index 9187c1e0..bc277d85 100644 --- a/docs/Build/build/GPAC-Build-Guide-for-Linux.md +++ b/docs/Build/build/GPAC-Build-Guide-for-Linux.md @@ -1,6 +1,7 @@ _Preliminary notes: the following instructions are based on Ubuntu and Debian. They should be easily applicable to other distributions; the only changes should be the names of the packages to be installed and the package manager used._ GPAC is a modular piece of software which depends on third-party libraries. During the build process it will try to detect and leverage the third-party libraries installed on your system. Here are the instructions to: + * build GPAC easily (recommended for most users) from what's available on your system (a sketch follows this list), * build a minimal 'MP4Box' and 'gpac' (containing only GPAC core features like muxing and streaming), * build a complete GPAC by rebuilding all the dependencies manually.
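A minimal sketch of the first (recommended) option, assuming an Ubuntu/Debian system, zlib as the only extra development dependency, and the usual `gpac_public` checkout directory:

```
# install a basic toolchain (package names may differ on your distribution)
sudo apt install build-essential pkg-config git zlib1g-dev
# fetch the sources and build from what's available on the system
git clone https://github.com/gpac/gpac.git gpac_public
cd gpac_public
./configure
make -j
```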
@@ -47,6 +48,7 @@ _If you are upgrading from a previous version (especially going from below 1.0.0 ## Use You can either: + - `sudo make install` to install the binaries, - or use the `MP4Box` or `gpac` binary in `gpac_public/bin/gcc/` directly, - or move/copy it somewhere manually. @@ -104,6 +106,7 @@ make 4. Use You can either: + - `sudo make install` to install the binaries, - or use the `MP4Box` or `gpac` binary in `gpac_public/bin/gcc/` directly, - or move/copy it somewhere manually. diff --git a/docs/Filters/Rearchitecture.md b/docs/Filters/Rearchitecture.md index ecb4b106..b8f8ebb5 100644 --- a/docs/Filters/Rearchitecture.md +++ b/docs/Filters/Rearchitecture.md @@ -36,6 +36,7 @@ The following lists the core principles of the re-architecture. Read the [genera # Filter Design Principles A filter object obeys the following principles: + - may accept (consume) any number of data streams (named `PID` in this architecture) - may produce any number of PIDs - can have its input PIDs reconfigured at run time or even removed @@ -65,6 +66,7 @@ The filter session main features are: - handle filter capability negotiation, usually inserting a filter chain to match the desired format The filter session operates in a semi-blocking mode: + - it prevents filters in blocking mode (output PID buffers full) from operating - it will not prevent a running filter from dispatching a packet; this greatly simplifies writing demultiplexers @@ -82,6 +84,7 @@ These properties may also be overloaded by the user, e.g. to assign a ServiceID # Media Streams internal representation In order to be able to exchange media stream data between filters, a unified data format had to be set, as follows: + - a frame is defined as a single-time block of data (Access Unit in MPEG terminology), but can be transferred in multiple packets - frames or fragments of frames are always transferred in processing order (e.g. decoding order for MPEG video) - multiplexed media data is identified as `file` data, where a frame is a complete file. @@ -155,6 +158,7 @@ gpac -i source.mp4 reframer:xround=closest:splitrange:xs=2:xe=4 -o dest.mp4 All other functionalities of MP4Box are not available through a filter session. Some might make it one day (BIFS encoding for example), but most of them are not good candidates for filter-based processing and will only be available through MP4Box (track add/remove to existing file, image item add/remove to existing file, file hinting, ...). __Note__ For operations using a filter session in MP4Box, it is possible to view some information about the filter session: + - -fstat: this will print the statistics per filter and per PID of the session - -fgraph: this will print the connections between the filters in the session diff --git a/docs/Howtos/avmix_tuto.md b/docs/Howtos/avmix_tuto.md index 144f6a7d..a20ffbe9 100644 --- a/docs/Howtos/avmix_tuto.md +++ b/docs/Howtos/avmix_tuto.md @@ -105,6 +105,7 @@ _Note_ A sequence not attached to a scene will be neither visible nor played, even if active. Now let's add: + - a logo - a bottom rectangle with a gradient - some text @@ -131,6 +132,7 @@ In the following examples, we always use [relative coordinates system](avmix#coo ## Animating a scene Scenes can be animated through timer objects providing value interpolation instructions. A timer provides: + - a start time, stop time and a loop count - a duration for the interpolation period - a set of animation values and their targets @@ -167,6 +169,7 @@ It can be tedious to apply the same transformation (matrix, active, ...) 
on a su The simplest way to do this is to group scenes together, and transform the group. The following animates: + - the video from 90% to 100%, sticking it to the top-left corner and animating the rounded rectangle effect - the overlay group position from visible to hidden past the bottom-right corner @@ -270,6 +273,7 @@ This works with video scenes too: You will at some point need to chain some videos. AVMix handles this through `sequence` objects describing how sources are to be chained. Sequences are designed to: + - take care of media prefetching to reduce loading times - perform transitions between sources, activating / prefetching based on the desired transition duration @@ -297,6 +301,7 @@ AVMix handles this by allowing scenes to use more than one sequence as input, an _Note: Currently, defined scenes only support 0, 1 or 2 input sequences_ This is done at scene declaration through: + - a `mix` object, describing a transition - a `mix_ratio` property, describing the transition ratio @@ -347,6 +352,7 @@ Specifying an identifier on the sequence avoids that. ## Live mode Live mode works like offline mode, with the following additions: + - detection and display of signal loss or no input sequences - `sequence` and `timer` start and stop times can be expressed as UTC dates (absolute) or current UTC offset @@ -368,6 +374,7 @@ You should now see "no input" message when playing. Without closing the player, ] ``` And the video sequence will start! For start and stop time values, you can use: + - "now": will resolve to current UTC time - integer: will resolve to current UTC time plus the number of seconds specified by the integer - date: will use the date as the start/stop time @@ -448,6 +455,7 @@ This is problematic if you use AVMix to generate a live feed supposed to be up 2 To prevent this, the filter allows launching the sources as dedicated child processes. When the child process exits unexpectedly, or when source data is no longer received, the filter can then kill and relaunch the child process. There are three supported methods for this: + - running a gpac instance over a pipe - running a gpac instance over TCP - running any other process capable of communicating with gpac @@ -455,6 +463,7 @@ The declaration is done at the `sourceURL` level through the port option. For each of these modes, the `keep_alive` option is used to decide if the child process shall be restarted: + - if no more data is received after `rtimeout`. - if the stream is at end of stream but the child process exited with an error code greater than 2. @@ -598,6 +607,7 @@ return 0; ``` Your module can also control the playlist through several functions: + - remove_element(id_or_elem): removes a scene, group or sequence from the playlist - parse_element(JSON_obj): parses a root JSON object and adds it to the playlist - parse_scene(JSON_obj, parent_group): parses a scene from its JSON object and adds it to parent_group, or to the root if parent_group is null @@ -729,12 +739,14 @@ In this mode, the texturing parameters used by the offscreen group can be modifi AVMix can use a global alpha mask (covering the entire output frame) for draw operations, through the [mask](avmix#scene-mask) scene module. 
This differs from using an offscreen group as an alpha operand input to [shape](avmix#scene-shape) as discussed above, as follows: + - the mask is global and not subject to any transformation - the mask is always cleared at the beginning of a frame - the mask is only one alpha channel - the mask operations can be accumulated between draws The following example shows using a mask in regular mode: + - enable and clear mask - draw a circle with alpha 0.4 - use mask and draw video, which will be blended only where the circle was drawn using alpha=0.4 @@ -768,6 +780,7 @@ The following example shows using a mask in regular mode: The mask can also be updated while drawing using a record mode. In this mode, the mask acts as a binary filter: any pixel drawn to the mask will no longer get drawn. The following draws: + - an ellipse with the first video at half opacity, appearing blended on the background - the entire second video at full opacity, which will only appear where the mask was not set diff --git a/docs/Howtos/dash/DASH-intro.md b/docs/Howtos/dash/DASH-intro.md index f5e5a162..54535e2d 100644 --- a/docs/Howtos/dash/DASH-intro.md +++ b/docs/Howtos/dash/DASH-intro.md @@ -5,11 +5,13 @@ GPAC has extended support for MPEG-DASH and HLS content generation and playback. Basic concepts and terminology of MPEG-DASH are explained [here](DASH-basics), and the same terms are usually used in GPAC for both DASH and HLS. For more information on content generation: + - read the MP4Box [DASH options](mp4box-dash-opts) - read the [dasher](dasher) filter help - check the dash and HLS scripts in the GPAC [test suite](https://github.com/gpac/testsuite/tree/filters/scripts) For more information on content playback: + - read the [dashin](dashin) filter help, used whenever a DASH or HLS session is read. - check the dash and HLS scripts in the GPAC [test suite](https://github.com/gpac/testsuite/tree/filters/scripts) @@ -19,6 +21,7 @@ If you generate your content with a third-party application such as ffmpeg, mak When using GPAC, this is usually ensured by using the `fintra` option. GPAC can be used to generate both static and live DASH/HLS content. For live cases, GPAC can expose the created files: + - directly through disk - through its own HTTP server - by pushing them to a remote HTTP server @@ -28,6 +31,7 @@ We recommend reading the [HTTP server](httpout) filter help, and looking at the ## Content Playback GPAC comes with a variety of adaptation algorithms: + - BBA0, BOLA, basic throughput (called `conventional` in the literature) - Custom throughput-based (`gbuf`) and buffer-based (`grate`) algorithms diff --git a/docs/Howtos/dash/Fragmentation,-segmentation,-splitting-and-interleaving.md b/docs/Howtos/dash/Fragmentation,-segmentation,-splitting-and-interleaving.md index 3c4eeb4f..bf37bf6f 100644 --- a/docs/Howtos/dash/Fragmentation,-segmentation,-splitting-and-interleaving.md +++ b/docs/Howtos/dash/Fragmentation,-segmentation,-splitting-and-interleaving.md @@ -20,6 +20,7 @@ Segmentation (`-dash`) is the process of creating segments, parts of an original Last, MP4Box can split (-split) a file and create individual playable files from an original one. It does not use segmentation in the above sense; it removes fragmentation and can use interleaving. Some examples of MP4Box usage: + - Rewrites a file with an interleaving window of 1 sec. 
`MP4Box -inter 1000 file.mp4` diff --git a/docs/Howtos/dash/HAS-advanced.md b/docs/Howtos/dash/HAS-advanced.md index 0bf8af8e..50f27720 100644 --- a/docs/Howtos/dash/HAS-advanced.md +++ b/docs/Howtos/dash/HAS-advanced.md @@ -15,6 +15,7 @@ Record the session in fragmented MP4 gpac -i $HAS_URL -o grab/record.mp4:frag ``` Note that we specify the [frag](mp4mx#store) option for the generated MP4 so that: + - we don't have a long multiplexing process at the end - if anything goes wrong (crash / battery dead / ...), we still have a file containing all media until the last written fragment. @@ -80,6 +81,7 @@ gpac -i $HAS_URL dashin:forward=file -o route://225.1.1.0:6000 The [DASH reader](dashin) can be configured through [-forward](dashin#forward) to insert segment boundaries in the media pipeline - see [here](dashin#segment-bound-modes) for more details. Two variants of this mode exist: + - `segb`: this enables `split_as`, DASH cue insertion (segment start signal) and fragment bounds signalling - `mani`: same as `segb` and also forwards manifests (MPD, M3U8) as packet properties. diff --git a/docs/Howtos/dash/HEVC-Tile-based-adaptation-guide.md b/docs/Howtos/dash/HEVC-Tile-based-adaptation-guide.md index 29dc28c8..29c26702 100644 --- a/docs/Howtos/dash/HEVC-Tile-based-adaptation-guide.md +++ b/docs/Howtos/dash/HEVC-Tile-based-adaptation-guide.md @@ -75,10 +75,12 @@ You can now play back your MPD using GPAC, and have fun with the different adapta ## Live setup If you want to produce a live feed of tiled video, you can either: + - produce short segments, package them and dash them using `-dash-live`, `-dash-ctx` and `-subdur`, see the discussion [here](https://github.com/gpac/gpac/issues/1648) - produce a live session with a [tilesplit](tilesplit) filter. GPAC does not have a direct wrapper for Kvazaar, but you can either: + - use an FFmpeg build with Kvazaar enabled (`--enable-libkvazaar` in ffmpeg configure) - check GPAC support using `gpac -h ffenc:libkvazaar` - use an external grab+Kvazaar encoding and pipe its output into GPAC. @@ -134,6 +136,7 @@ gpac The resulting filter graph is quite fun (use `-graph` to check it) and shows: + - only one (or 0 depending on your webcam formats) pixel converter filter is used in the chain to feed both Kvazaar instances - all tile PIDs (and only them) connecting to the dasher filter - 21 output PIDs of the dasher: one for the MPD, 2 x (1+3x3) media PIDs. @@ -179,6 +182,7 @@ In 2D playback, the tile adaptation logic (for ROI for example) is controlled b The compositor can use gaze information to automatically decrease the quality of the tiles not below the gaze. The gaze information can be: + - emulated via mouse using the [--sgaze](compositor#sgaze) option. 
- signaled through filter updates on the [gazer_enabled](compositor#gazer_enabled), [gaze_x](compositor#gaze_x) and [gaze_y](compositor#gaze_y) options diff --git a/docs/Howtos/dash/HEVC-Tile-multi-resolution-adaptation-guide.md b/docs/Howtos/dash/HEVC-Tile-multi-resolution-adaptation-guide.md index a58ae98e..5ad08dea 100644 --- a/docs/Howtos/dash/HEVC-Tile-multi-resolution-adaptation-guide.md +++ b/docs/Howtos/dash/HEVC-Tile-multi-resolution-adaptation-guide.md @@ -68,6 +68,7 @@ Check the [HEVC Tile-based adaptation guide](HEVC-Tile-multi-resolution-adaptati # Content Playback The logic of content playback is as follows: + - the MPD indicates SRD information and a GPAC extension for mergeable bitstreams - when the compositor is used, the [hevcmerge](hevcmerge) filter is automatically created to reassemble the streams - otherwise (using vout), each PID is declared as an alternative to the other diff --git a/docs/Howtos/dash/LL-DASH.md b/docs/Howtos/dash/LL-DASH.md index 02570e71..e2e5b0c0 100644 --- a/docs/Howtos/dash/LL-DASH.md +++ b/docs/Howtos/dash/LL-DASH.md @@ -19,6 +19,7 @@ And when using gpac, you can enable real-time reporting of filters activities us The `gpac` application can be used for dashing whenever `MP4Box` is used, but the opposite is not true. In particular, MP4Box cannot: + - use complex custom filter chains while dashing, such as transcoding in several qualities - produce two DASH sessions at the same time @@ -169,6 +170,7 @@ gpac -i source1 -i source2 reframer:rt=on -o http://ORIG_SERVER_IP_PORT/live.mpd We will now use a live source (webcam), encode it in two qualities, DASH the result and push it to a remote server. Please check the [encoding howto](encoding) first. Compared to what we have seen previously, we only need to modify the input part of the graph: + - take as a live source the default audio and video grabbed by the [libavdevice](ffavin) filter - rescale the video to 1080p and 720p - encode the rescaled videos at 6 and 3 Mbps diff --git a/docs/Howtos/dash/LL-HLS.md b/docs/Howtos/dash/LL-HLS.md index 3870927d..519662f8 100644 --- a/docs/Howtos/dash/LL-HLS.md +++ b/docs/Howtos/dash/LL-HLS.md @@ -9,6 +9,7 @@ In this howto, we will study various setups for HLS live streaming in low latenc The same setup for configuring segments and CMAF chunks is used as for the [DASH low latency](LL-DASH#dash-low-latency-setup) setup. When producing your HLS media segments in low latency, you need to indicate to the client how to access LL-HLS `parts` (CMAF chunks) while they are produced. LL-HLS offers two possibilities to describe these parts in the manifest: + - file mode: advertise the chunks as dedicated files, i.e. each chunk will create its own file. This requires double storage for segments close to the live edge, increases disk IOs and might not be very practical if you set up a PUSH origin (twice the bandwidth is required) - byte range mode: advertise the chunks as byte ranges of a media file. If that media file is the full segment being produced (usually the case), this does not induce bandwidth increase or extra disk IOs. diff --git a/docs/Howtos/dash/cmaf.md b/docs/Howtos/dash/cmaf.md index ee428b6e..a6adb670 100644 --- a/docs/Howtos/dash/cmaf.md +++ b/docs/Howtos/dash/cmaf.md @@ -5,6 +5,7 @@ GPAC can be used to generate DASH or HLS following the CMAF specification. CMAF defines two structural brands `cmfc` and `cmf2` for ISOBMFF-segmented content. 
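As a quick sketch of requesting these brands at dashing time (assuming a local `source.mp4`; the [dasher](dasher) filter exposes a `cmaf` option accepting the two brand names):

```
# generate DASH segments following the cmf2 structural brand
gpac -i source.mp4 -o dash/live.mpd:cmaf=cmf2
```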
The `cmfc` brand constrains: + - some default values in ISOBMFF boxes - a single media per file - a single track fragment per movie fragment (`moof`) @@ -16,6 +17,7 @@ The `cmf2` brand further restricts the `cmfc` brand for video tracks: + - no edit list shall be used - negative composition offset (`trun` version 1) shall be used - sample default values shall be repeated in each track fragment diff --git a/docs/Howtos/dash/dash_transcoding.md b/docs/Howtos/dash/dash_transcoding.md index c9e807b4..3cfa61a9 100644 --- a/docs/Howtos/dash/dash_transcoding.md +++ b/docs/Howtos/dash/dash_transcoding.md @@ -5,6 +5,7 @@ In this howto, we will study various setups for DASH transcoding. Please make sure you are familiar with [DASH terminology](DASH-basics) before reading. It is likely that your source media is not properly encoded for DASH or HLS delivery, most likely because: + - openGOPs are used - key-frame positions do not match between your different qualities - key-frame intervals are not constant diff --git a/docs/Howtos/dash/hls.md b/docs/Howtos/dash/hls.md index 33898f32..f7b0312f 100644 --- a/docs/Howtos/dash/hls.md +++ b/docs/Howtos/dash/hls.md @@ -37,6 +37,7 @@ This will generate `live.m3u8`, `video.m3u8` and `audio.m3u8` # Renditions ## Grouping When several renditions are possible for a set of inputs, the default behavior is as follows: + - if video is present, it is used as the main content - otherwise, audio is used as the main content diff --git a/docs/Howtos/dynamic_rc.md b/docs/Howtos/dynamic_rc.md index b623eeee..30230939 100644 --- a/docs/Howtos/dynamic_rc.md +++ b/docs/Howtos/dynamic_rc.md @@ -12,6 +12,7 @@ In this example we will use RTP as delivery mechanism and monitor loss rate of c ## RTP reader The reader is a regular video playback from RTP (using SDP as input). We will: + - locate the `rtpin` filter in the chain, i.e. the first filter after the `fin` filter used for SDP access - update every 2 seconds the `loss_rate` option of the `rtpin` filter: this will force the loss ratio in RTCP Receiver Reports, but will not drop any packet at the receiver side @@ -77,6 +78,7 @@ gpac.close() ## Encoder and RTP sender The encoder consists of a source (here a single video file playing in a loop), an AVC encoder and an RTP output. We will: + - locate the `rtpout` filter in the chain, i.e. 
the first filter before the `fout` filter used for SDP output - monitor every 2 seconds the statistics of the input PID of `rtpout` to get the real-time measurements reported by RTCP - adjust the encoder max rate based on the percentage of lost packets diff --git a/docs/Howtos/encoding.md b/docs/Howtos/encoding.md index 39e9d854..1c632a63 100644 --- a/docs/Howtos/encoding.md +++ b/docs/Howtos/encoding.md @@ -67,6 +67,7 @@ The above command will encode the video track in `source.mp4` into AVC|H264 at ```gpac -i source.mp4 c=avc::x264-params=no-mbtree:sync-lookahead=0::profile=baseline -o test.avc``` The above command will encode the video track in `source.mp4` into AVC|H264 and pass two options to the ffmpeg encoder: + - `x264-params`, with value `no-mbtree:sync-lookahead=0` - `profile`, with value `baseline` diff --git a/docs/Howtos/encryption/encryption-filters.md b/docs/Howtos/encryption/encryption-filters.md index c5ecb508..146e7dce 100644 --- a/docs/Howtos/encryption/encryption-filters.md +++ b/docs/Howtos/encryption/encryption-filters.md @@ -66,6 +66,7 @@ Another possibility is to define the `CryptInfo` PID property rather than using gpac -i udp://localhost:1234/:#CrypTrack=(audio)drm_audio.xml,(video)drm_video.xml cecrypt -o dest.mpd:profile=live:dmode=dynamic ``` This example assigns: + - a `CryptInfo` property set to `drm_audio.xml` for PIDs of type audio - a `CryptInfo` property set to `drm_video.xml` for PIDs of type video - no `CryptInfo` property for other PIDs diff --git a/docs/Howtos/filters-oneliners.md b/docs/Howtos/filters-oneliners.md index af1d1b60..649830d3 100644 --- a/docs/Howtos/filters-oneliners.md +++ b/docs/Howtos/filters-oneliners.md @@ -1,10 +1,12 @@ # Foreword This page contains one-liners illustrating the many possibilities of the GPAC filter architecture. For more detailed information, it is highly recommended that you read: + - the [general concepts](filters_general) page - the [gpac application](gpac_general) help To get a better understanding of each command illustrated here, it is recommended to: + - run the same command with `-graph` specified to see the associated filter graph - read the help of the different filters in this graph using `gpac -h filter_name` @@ -13,6 +15,7 @@ Whenever an option is specified, e.g. `dest.mp4:foo`, you can get more info and The filter session is by default quiet, except for warnings and error reporting. To get information on the session while running, use the [-r](gpac_general#r) option. To get more runtime information, use the [log system](core_logs). Given the configurable nature of the filter architecture, most examples given in one context can be reused in another context. For example: + - from the dump examples: ``` gpac -i source reframer:saps=1 -o dump/$num$.png ``` @@ -34,6 +37,7 @@ _NOTE The command lines given here are usually using a local file for source or _Reminder_ Most filters are never specified at the prompt; they are dynamically loaded during the graph resolution. GPAC filters can use either: + - global options, e.g. `--foo`, applying to each instance of any filter defining the `foo` option, - local options to a given filter and any filters dynamically loaded, e.g. `:foo`. This is called [argument inheriting](filters_general#arguments-inheriting). Both forms are illustrated below. 
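To illustrate the difference, a sketch using the [mp4mx](mp4mx) `frag` and `cdur` options (any other filter option works the same way):

```
# local option: only this destination (the mp4mx filter loaded for it) is affected
gpac -i source.mp4 -o dest.mp4:frag

# global option: inherited by any filter in the graph defining cdur
gpac -i source.mp4 --cdur=1 -o dest.mp4:frag
```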
diff --git a/docs/Howtos/gpac-mp4box.md b/docs/Howtos/gpac-mp4box.md index 1dd72119..c93238a8 100644 --- a/docs/Howtos/gpac-mp4box.md +++ b/docs/Howtos/gpac-mp4box.md @@ -3,6 +3,7 @@ Following the introduction of the filter architecture and the gpac application, you may have a hard time choosing between MP4Box and gpac. Before going any further, we assume: + - you are familiar with [MP4Box](MP4Box) - you understand the principles of GPAC [filters](filters_general) and are somewhat familiar with using the [gpac](gpac_general) application @@ -13,6 +14,7 @@ We recommend that you quickly read the article on GPAC [re-architecture](Rearchi There are many features of libgpac available in MP4Box only, and most of them will probably never be ported to the general filter architecture. The things you can do with both MP4Box and gpac are: + - adding media tracks or image items to a __new__ ISOBMFF file - extracting media tracks to raw formats - fragmenting and DASHing a set of sources (ISOBMFF or not) @@ -21,6 +23,7 @@ - some XML dump operations (-dnal option of MP4Box) File concatenation can also be done with both MP4Box and gpac, but they do not use the same code base: + - MP4Box only concatenates ISOBMFF files, potentially requiring temporary ISOBMFF import - gpac can concatenate any source (live or not) using the [flist](flist) filter. @@ -37,29 +40,36 @@ The gpac application is only in charge of calling a filter session based on the MP4Box works in a completely different way to allow for ISOBMFF file editing. These are the logical steps in MP4Box processing, in their order of execution: If `-add` / `-cat`, then: + - run a filter session for each import (-add) operation. This may be optimized when creating a new file using [-newfs](mp4box-gen-opts#newfs), in which case a single session is used for all import operations. - store the result in a temporary file (unless `-flat` is set) The input file is now either the source file (read-only or edit operations) or the edited file, potentially with new tracks. If `-split`, then: + - run a filter session on the input file for file splitting using the [reframer](reframer) filter If `-raw`, then: + - run a filter session on the input file for each track to dump (usually involving the [writegen](writegen) filter) If `-add-image`, then: + - run a filter session with the target source adding the track to the input file, converting desired samples to items and removing the added track If `-dash`, then: + - run a filter session on each input file using the [dasher](dasher) filter - exit If `-crypt` or `-decrypt`, then: + - run a filter session for file encryption/decryption (potentially using fragmented mode) - exit If `-frag`, then: + - run a filter session for fragmentation - exit @@ -105,6 +115,7 @@ gpac -i video.264:options -i audio_en.264:options -i audio_fr.264:options -o res The track import syntax and the dashing syntax may be combined with filter declarations, as discussed [here](mp4box-filters). They are however restricted as follows: + - for importing, the destination format is always ISOBMFF - the filter chain described is fairly simple, going from source to destination ([mp4mx](mp4mx) or [dasher](dasher) filters) without any possible branch in-between (see the example below).
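For example, a sketch reusing the `ffsws` rescaler syntax shown in the [MP4Box filters](mp4box-filters) howto, i.e. a single-branch chain from one source straight to the ISOBMFF destination:

```
# import while rescaling; the described chain goes directly from source to mp4mx
MP4Box -add source.mp4:@ffsws:osize=320x240 -new dest.mp4
```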
@@ -144,6 +155,7 @@ On the other hand, gpac can be interrupted using `ctrl+c` and the current sessio # gpac, not MP4Box You should use gpac rather than MP4Box in the following cases: + - sources are either live or simulate live (running forever) - outputs are not local files: HTTP output, RTSP server, ROUTE output, etc... - there are many filters manually specified in the pipeline diff --git a/docs/Howtos/inspecting.md b/docs/Howtos/inspecting.md index 3f252d7a..6088c80d 100644 --- a/docs/Howtos/inspecting.md +++ b/docs/Howtos/inspecting.md @@ -95,6 +95,7 @@ gpac -i source.mp4 inspect:interleave=false:deep:analyze=on The above command will open the given `source.mp4` file and create an XML dump of all PID media-specific info, and of all packet media-specific info if [deep](inspect#deep) is set. The analyze mode will check the payload of the decoder configuration (parameter sets) and the payload of each packet. Supported bitstream formats for analysis are: + - AVC, HEVC, VVC video - AV1 - VP8, VP9 diff --git a/docs/Howtos/jsf/evg.md b/docs/Howtos/jsf/evg.md index 95daceec..481c1d09 100644 --- a/docs/Howtos/jsf/evg.md +++ b/docs/Howtos/jsf/evg.md @@ -254,6 +254,7 @@ canvas.fill(brush); # Using textures GPAC EVG can use textures to fill paths. There are several ways of creating a texture: + - create a texture from your script data ``` diff --git a/docs/Howtos/jsf/jsdash.md b/docs/Howtos/jsf/jsdash.md index f9b2a238..b0a3c6f2 100644 --- a/docs/Howtos/jsf/jsdash.md +++ b/docs/Howtos/jsf/jsdash.md @@ -16,6 +16,7 @@ gpac -i source.mpd --algo=mydash.js ... ``` The JS script is loaded with a global object called `dashin` with 4 callback functions: + - period_reset: indicates start or end of a period (optional) - new_group: indicates setup of a new adaptation group (optional), i.e. DASH AdaptationSet or HLS Variant Stream. For HEVC tiling, each tile will be declared as a group, as well as the base tile track - rate_adaptation: performs rate adaptation for a group (mandatory). Return value is the new quality index, or -1 to keep as current, -2 to discard (debug, segments won't be fetched/decoded) diff --git a/docs/Howtos/jsf/jsfilter.md b/docs/Howtos/jsf/jsfilter.md index ca12aa2b..04e2df5e 100644 --- a/docs/Howtos/jsf/jsfilter.md +++ b/docs/Howtos/jsf/jsfilter.md @@ -22,6 +22,7 @@ We will assume in the rest of this article that the script file is called `scrip The JS filter life cycle can be described as follows: + * creation of the JS context * loading of the JS file (load or setup phase) * filter initialization, executed in the callback function `filter.initialize` @@ -31,6 +32,7 @@ JS filter life cycle can be described as follows: * final destruction of the JS context While a filter is active, it can get the following notifications: + * if the filter accepts inputs, (re)configuration of input PIDs in the callback function `filter.configure_pid` * if the filter accepts inputs, removal of input PIDs through the callback function `filter.remove_pid` * events sent by the pipeline through the callback function `filter.process_event` @@ -40,6 +42,7 @@ Callback functions defined for `filter` object do not use exception error handli It is possible to use several JS filters in a given chain, but each JS filter will create its own JavaScript context, and JS objects cannot be shared between JS filters. 
If you need to pass JS data across filters, you will have to serialize your data to JSON and either: + - send it as PID information on a PID of your choice - send it as JSON-only packets through a dedicated JS PID - send it as an associated property on existing packets @@ -182,6 +185,7 @@ The above code allows monitoring PID configuration and performs simple PID prope __Discussion__ Filter properties are mapped to their native type, e.g. unsigned int, boolean, string, float and double, or to objects for vectors, arrays, and fractions. There are however a few exceptions here: + - the `StreamType` property is converted to a string (see `gpac -h props` and [Properties](filters_properties)) - the `PixelFormat` property is converted to a string (see `gpac -h props` and [Properties](filters_properties)) - the `AudioFormat` property is converted to a string (see `gpac -h props` and [Properties](filters_properties)) @@ -190,6 +194,7 @@ # Packet Query Once you have an input PID in place in your filter, you can start fetching packets from this PID in the `filter.process` callback. The packet access API follows the same principles as non-JS filters: + - packets are always delivered in processing order - only the first packet of an input PID packet queue can be fetched, and must be explicitly removed - packets can be reference counted for later reuse @@ -355,6 +360,7 @@ pid.opid.set_props(pid, "MIMEType", null); # Creating new packets GPAC uses several types of packets: + - packets holding data allocated by the framework. Examples: diff --git a/docs/Howtos/jsf/jssession.md b/docs/Howtos/jsf/jssession.md index cf7b0add..ce959589 100644 --- a/docs/Howtos/jsf/jssession.md +++ b/docs/Howtos/jsf/jssession.md @@ -22,6 +22,7 @@ The filter session API can only be loaded once per session. The implies that usi __Discussion__ Since the session API is available in a JSFilter, you can load a script directly using `gpac script.js`. This will however create a JSFilter inside the session, but this filter will be automatically disabled (not used in the graph resolution, leaving it not connected) if the following conditions are met after initialization: + - filter did not assign any capabilities - filter did not create any output PID - filter did not post any task using filter.post_task @@ -238,6 +239,7 @@ session.fire_event(f_evt); ``` The filter session can also be used to fire non-UI related events on filters. You must be extra careful when using this, as this might trigger unwanted behavior in the chain. Typically: + - upstream events (towards sink) should only be fired on source filters (nb_ipid = 0) - downstream events (towards source) should only be fired on sink filters (nb_opid = 0) @@ -252,6 +254,7 @@ session.fire_event(f_evt, target_filter); GPAC is by default compiled with [Remotery](https://github.com/Celtoys/Remotery) support, and can use the underlying websocket server of remotery to communicate with a web browser. For this, you will need: + - to launch GPAC with remotery activated by specifying [-rmt](core_options#rmt) - to set a handler function listening to messages from the web client using `session.set_rmt_fun` - to send messages to the web client using `session.rmt_send` @@ -284,6 +287,7 @@ session.rmt_enabled = false; # Creating custom filters You can create your own custom filters in a JS session using `new_filter`. 
The returned object will be a [JavaScript Filter](jsfilter) with the following limitations: + - no custom arguments for the filter can be set - the `initialize` function is not called - the filter cannot be cloned diff --git a/docs/Howtos/jsf/webgl.md b/docs/Howtos/jsf/webgl.md index 2e23edc4..725971e0 100644 --- a/docs/Howtos/jsf/webgl.md +++ b/docs/Howtos/jsf/webgl.md @@ -203,6 +203,7 @@ In the above code, note the usage of `tx.nb_textures` : this allows fetching the The core concept for dealing with NamedTexture is that the fragment shader sources must be set AFTER the texture has been set up (upload / texImage2D). Doing it before will result in an unmodified fragment shader and missing uniforms. To summarize, NamedTexture allows you to use existing glsl fragment shader sources with any pixel format for your source, provided that: + - you tag the texture with the name of the sampler2D you want to replace - you upload data to your texture before creating the program using it diff --git a/docs/Howtos/jsf/webgl_three.md b/docs/Howtos/jsf/webgl_three.md index 18b32827..88a25204 100644 --- a/docs/Howtos/jsf/webgl_three.md +++ b/docs/Howtos/jsf/webgl_three.md @@ -9,6 +9,7 @@ _Note: you may try to write a Canvas2D polyfill based on GPAC [EVG](evg)._ We recommend reading the [WebGL HowTo](webgl) before anything else. GPAC does not allow loading JS filters using remote scripts, so you will need to download the latest release of Three.js (this howto was tested with r130). We assume: + - your JS filter script is called `ex3D.js` - your Three.js distribution is unzipped as "three" in the same directory as `ex3D.js` @@ -156,6 +157,7 @@ The complete code for this example is [here](examples/three/ex1.js). In WebGL, textures are passed using `img`, `video` or `canvas` tags, which we don't have in GPAC. Three.js uses the DOM to load these elements with the desired source. We'll need to: + - trick Three.js again, by creating a `document` object intercepting calls to element creation. - use EVG textures to pass the data to WebGL @@ -293,6 +295,7 @@ The complete code for this example is [here](examples/three/ex3.js). We will now load a video as a texture in Three.js. Again, no `video` tag to help us, so we will use the same workaround as previously for images, and use EVG textures. We will try to load the given resource as an EVG texture, and if this fails we try to load the resource as a filter. This implies: + - your filter will now accept video inputs - you will have to relink input PIDs to sources diff --git a/docs/Howtos/mp4box-filters.md b/docs/Howtos/mp4box-filters.md index b5653971..b9511047 100644 --- a/docs/Howtos/mp4box-filters.md +++ b/docs/Howtos/mp4box-filters.md @@ -3,6 +3,7 @@ We discuss here how to use [MP4Box](MP4Box-introduction) together with filters in GPAC. As discussed [here](Rearchitecture), the following features of MP4Box are now using the GPAC filter engine: + - Media importing - Media exporting - DASHing @@ -111,12 +112,14 @@ You may also specify several paths for the filter chain: MP4Box -add source.mp4:@ffsws:osize=160x120@enc:c=avc:fintra=2:b=100k@@ffsws:osize=320x240@enc:c=avc:fintra=2:b=200k -new file.mp4 ``` The above command will take the source and: + - rescale it to 160x120 and encode it at 100 kbps - rescale it to 320x240 and encode it at 200 kbps __Discussion__ You may ask yourself whether using MP4Box or gpac is more efficient for such an operation: + - When you add a single track using MP4Box to a new file, gpac and MP4Box are strictly equivalent. 
- If you add several tracks in one shot in a new file, gpac will be more efficient as a single filter session will be used to import all tracks, whereas MP4Box uses one filter session per `-add` operation (unless [-newfs](mp4box-gen-opts#newfs) is set). - The filter architecture does not support (for the moment) reading and writing in the same file, so if you need to add a track to an existing file, you must use MP4Box for that. diff --git a/docs/Howtos/mp4box-inplace.md b/docs/Howtos/mp4box-inplace.md index d7abf893..dea65ac0 100644 --- a/docs/Howtos/mp4box-inplace.md +++ b/docs/Howtos/mp4box-inplace.md @@ -3,6 +3,7 @@ As of GPAC 2.0, MP4Box supports in-place editing of MP4 files. In-place editing is used whenever the following conditions are true: + - the media data has not been modified during the edit operations - no storage mode is specified - no output file name is specified diff --git a/docs/Howtos/network-capture.md b/docs/Howtos/network-capture.md index e4e369d5..4f37a391 100644 --- a/docs/Howtos/network-capture.md +++ b/docs/Howtos/network-capture.md @@ -5,6 +5,7 @@ We discuss here how to use network captures with GPAC 2.3-DEV or above # Overview GPAC can: + - write packets to a custom file format called GPC - read packets from pcap, pcapng and gpc files diff --git a/docs/Howtos/nodejs.md b/docs/Howtos/nodejs.md index e19551c5..0bb7c284 100644 --- a/docs/Howtos/nodejs.md +++ b/docs/Howtos/nodejs.md @@ -19,6 +19,7 @@ The binding is called gpac_napi.c, and is hosted in GPAC [source tree](https://g You will need to build the module using `node-gyp`, potentially editing `share/nodejs/binding.gyp` as required for your system. The `binding.gyp` provided is for GPAC: + - built in regular shared library mode for libgpac (i.e. the NodeJS module is not compatible with an mp4box-only build) - installed on your system (gpac headers available in a standard include directory, libgpac in a standard lib directory), typically done with `sudo make install` after building gpac @@ -31,6 +32,7 @@ You can then build using: ``` If you don't want to install on your system, you will need to modify the `binding.gyp` file to set the include dir to the root of the gpac source tree: + - "include_dirs": ["<(module_root_dir)/../../include"] If built using configure and make, you will likely have a custom config.h file, and the build tree root must also be indicated together with the `GPAC_HAVE_CONFIG_H` macro. @@ -80,6 +82,7 @@ A test program [gpac.js](https://github.com/gpac/gpac/blob/master/share/nodejs/t The first thing to do is to initialize libgpac. This is done by default while importing the bindings with the following settings: + - no memory tracking - default GPAC profile used @@ -192,6 +195,7 @@ console.log('Entering NodeJS EventLoop'); ## Callbacks in sessions Regardless of the way you run the session, you can request to be called back once or on a regular basis. This is achieved by posting tasks to the GPAC session scheduler. A task object shall provide an `execute` method to be called. This function may return: + - `false` to cancel the task, - `true` to reschedule the task asap - a positive integer giving the time of the next task callback in milliseconds @@ -251,6 +255,7 @@ Note that (as in GPAC JS or Python) properties referring to constant values are You can define your own filter(s) to interact with the media pipeline. As usual in GPAC filters, a custom filter can be a source, a sink or any other filter. It can consume packets from input PIDs and produce packets on output PIDs. 
Custom filters are created through the `new_filter` function of the filter session object. The custom filter can then assign its callback functions: + - `GF_Err process()` method called whenever the filter has some data to process. - `GF_Err configure_pid(pid, is_remove)` method called whenever a new PID must be configured, re-configured or removed in the custom filter - `Bool process_event(evt)` method called whenever an event is passing through the filter or one of its PIDs @@ -454,6 +459,7 @@ You can however enable or disable Remotery profiler using `gpac.rmt_enable(true)` You can override the default algorithm used by the DASH client with your own algorithm. See [the documentation](https://doxygen.gpac.io/classlibgpac_1_1_filter.html#a05de5bc6b3cb9a3573e00d9f4ccfc056) for further details. The principle is as follows: + - the script can get notified when a period starts/ends, to reset your stats and set up live vs on-demand cases - the script can get notified of each created group (AdaptationSet in DASH, Variant Stream in HLS) with its various qualities. For HEVC tiling, each tile will be declared as a group, as well as the base tile track - the script is notified after each segment download on which quality to pick up next @@ -508,6 +514,7 @@ fs.run(); You can override the default behaviour of the httpout filter. See [the documentation](https://doxygen.gpac.io/group__nodehttp__grp.html) for further details. The principle is as follows: + - the script can get notification of each new request being received - the script can decide to let GPAC handle the request as usual (typically used for injecting http headers, throttling and monitoring) - the script can feed the data to GPAC (GET) or receive the data from GPAC (PUT/POST) @@ -583,6 +590,7 @@ let http_req = { GPAC allows usage of wrappers for file operations (open, close, read, write, seek...), and such wrappers can be constructed from NodeJS. A FileIO wrapper is constructed using: + - the URL you want to wrap - a 'factory' object providing the callbacks for GPAC. - an optional boolean indicating if direct memory should be used (default), or if array buffers are copied between GPAC and NodeJS. diff --git a/docs/Howtos/playlist.md b/docs/Howtos/playlist.md index bd1253cc..05a23824 100644 --- a/docs/Howtos/playlist.md +++ b/docs/Howtos/playlist.md @@ -115,6 +115,7 @@ vid3.mp4:#Period=1 This will result in a DASH MPD with three periods, the first (resp. third) period containing media from `vid1.mp4` (resp. `vid3.mp4`) and the second period containing media from `vid2.mp4`, `audio2.mp4` and `audio2_fr.mp4`. Note that in this example: + - audio sources override their language definitions - we use the `props` option for the second source entry to signal the DASH period ID of each source globally, rather than copying it for each source. @@ -195,6 +196,7 @@ The filter does **NOT** operate on the media content payload and cannot perform ## Static playlists In this example, we will replace the content from `main` source in the range [4, 10] with the content from `ad1`. We need to indicate: + - an `out` cue: the point in the `main` timeline when the content replacement must begin - an `in` cue: the point in the `main` timeline when the content replacement must end and the main content must resume. @@ -292,6 +294,7 @@ ad ``` This will resolve the splice start time to be the next SAP found on the video stream, and the splice end time to be 10s after the splice start. You can use for `out` and `in`: + - a time in seconds. 
This time is expressed in the `main` media timeline - a date in XSD dateTime format - `now`, as explained previously @@ -320,10 +323,12 @@ ad2 When concatenating media streams (whether at the end of a previous media or at a splice point), the encoding characteristics of the source usually result in audio and video streams of different durations at the insertion point. When concatenating, the filter will use the highest frame time on all streams, and realign the next timeline starting from that point. For example: + - video @25fps, duration 10.0s - AAC 44100Hz, duration 10.03102s The next source timeline origin will be the last audio time (10.03102s), which will result in: + - the next audio frame starting exactly after the last audio frame - the next video frame starting at 31.02 ms after the last video frame, introducing a gap in video diff --git a/docs/Howtos/python.md b/docs/Howtos/python.md index 9fbe3b1d..514f9514 100644 --- a/docs/Howtos/python.md +++ b/docs/Howtos/python.md @@ -17,6 +17,7 @@ GPAC Python bindings are only available starting from GPAC 2.0. The GPAC Python bindings use [ctypes](https://docs.python.org/3/library/ctypes.html) for interfacing with the libgpac filter session, while providing an object-oriented wrapper hiding all ctypes internals and GPAC C design. You __must__: + - use the bindings which come along with your GPAC installation, otherwise ABI/API might mismatch, resulting in crashes. - use a regular GPAC build, not a static library version (so python bindings are not compatible with an mp4box-only build). - make sure the libgpac shared library is known to your dynamic library loader. @@ -51,6 +52,7 @@ You can also install libgpac bindings using PIP, see [this post](https://github. # Tuning up GPAC The first thing to do is to initialize libgpac. This is done by default while importing the bindings with the following settings: + - no memory tracking - default GPAC profile used @@ -396,6 +398,7 @@ fs.run() The following defines a custom filter doing raw video write access (e.g. pixel modification) and forwarding the result in the middle of the pipeline. We cover two methods here: + - inplace processing, where the input data is modified and sent - read access, where the output data can be anything (in this example, it is a copy of the input with a line drawn on the luma plane) @@ -532,6 +535,7 @@ First you must delegate all GL context management to your python app (must be do If you run the session in multithreaded mode, you may need to override the filter session `on_gl_activate` to properly activate the GL context for the calling thread. A typical packet processing will then be: + - if GPU texture - use `get_gl_texture` for each video plane, typically 3 for YUV, 2 for Y + packed UV (nv12), 1 for RGB/RGBA - set active texture units and uniforms using the textureID returned @@ -610,6 +614,7 @@ You can however enable or disable Remotery profiler using `gpac.rmt_enable(True/ You can override the default algorithm used by the DASH client with your own algorithm. See [the documentation](https://doxygen.gpac.io/group__pydash__grp.html) for further details. The principle is as follows: + - the script can get notification of period start/end to reset statistics, set up live vs on-demand cases, etc. - the script can get notification of each created group (AdaptationSet in DASH, Variant Stream in HLS) with its various qualities. 
For HEVC tiling, each tile will be declared as a group, as well as the base tile track - the script is notified after each segment download on which quality to pick up next @@ -687,6 +692,7 @@ fs.run() GPAC allows using wrappers for file operations (open, close, read, write, seek...), and such wrappers can be constructed from Python. A FileIO wrapper is constructed using: + - the URL you want to wrap - a 'factory' object providing the callbacks for GPAC. @@ -760,6 +766,7 @@ This allows handling, with a single wrapper, cases where a URL resolves in multi You can override the default behaviour of the httpout filter. See [the documentation](https://doxygen.gpac.io/group__pyhttpout__grp.html) for further details. The principle is as follows: + - the script can get notification of each new request being received - the script can decide to let GPAC handle the request as usual (typically used for injecting http headers, throttling and monitoring) - the script can feed the data to GPAC (GET) or receive the data from GPAC (PUT/POST) @@ -849,6 +856,7 @@ class MyHTTPOutRequest(gpac.HTTPOutRequest): # Advanced example The following is an example showing: + - DASH custom logic - Custom sink filter with buffering control - Raw video access for both GPU-based and system-based decoders diff --git a/docs/Howtos/realtime.md b/docs/Howtos/realtime.md index 36f3ee50..84e09dd6 100644 --- a/docs/Howtos/realtime.md +++ b/docs/Howtos/realtime.md @@ -7,6 +7,7 @@ We discuss here how to simulate real-time sources in GPAC. Assume you have one or several sources dispatching data in a non real-time fashion, such as a local file, an HTTP download or a pipe input. You may want to produce data in real-time, for DASH, HLS, MPEG-2 TS or HTTP delivery. GPAC comes with the [reframer](reframer) filter, in charge of forcing a de-multiplexing of input data. This filter supports several features including: + - discarding frames based on their SAP type (e.g. build a stream containing only I-frames of the input stream) - forcing decoding of media data - and real-time regulation @@ -89,6 +90,7 @@ gpac flist:srcs=source.mp4:floop=-1 reframer:rt=sync -o live.mpd:dur=2:cdur=0.1: ## Icecast-like server In this example, we use a local playlist to generate an icecast server. If we don't inject real-time regulation, the server will: + - drop all packets way too fast when no client is connected - send all packets way too fast when clients are connected diff --git a/docs/Howtos/scenecoding/SceneCodingIntro.md b/docs/Howtos/scenecoding/SceneCodingIntro.md index dfed761a..37796265 100644 --- a/docs/Howtos/scenecoding/SceneCodingIntro.md +++ b/docs/Howtos/scenecoding/SceneCodingIntro.md @@ -2,6 +2,7 @@ A scene description is a language describing animations, interactivity, 2D and 3D shapes, and audio and video relationships in a presentation. GPAC supports a variety of scene description languages: + - MPEG-4 BIFS, in its binary form, [text](MPEG-4-BIFS-Textual-Format) form and [XML](MPEG-4-XMT-Format) form - Web3D VRML97 and X3D - SVG 1.2 Tiny profile diff --git a/docs/Player/Playback.md b/docs/Player/Playback.md index 6813f0ce..74a3d720 100644 --- a/docs/Player/Playback.md +++ b/docs/Player/Playback.md @@ -1,5 +1,6 @@ # Introduction GPAC can play back content in two main ways: + - through its interactive renderer using the [Compositor](compositor) filter - through simple audio and video output filters (see the sketch below). 
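A minimal sketch of both ways, with `file.mp4` a local media file (filter names are spelled out here for clarity; they are normally resolved automatically):

```
# playback through the simple audio and video output filters
gpac -i file.mp4 vout aout

# playback through the interactive renderer (assuming the compositor `player` option of your build)
gpac -i file.mp4 compositor:player=base
```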
diff --git a/docs/Player/Player-Features.md b/docs/Player/Player-Features.md index 3468c98c..31a6bf3e 100644 --- a/docs/Player/Player-Features.md +++ b/docs/Player/Player-Features.md @@ -29,6 +29,7 @@ The status of X3D implementation in GPAC can be checked [here](X3D-Implementati GPAC extends the VRML/BIFS node set through its hardcoded proto mechanism. These are protos with predefined URLs and interfaces, allowing BIFS compression without modifying the language syntax. These nodes are identified by a proto URN starting with `urn:inet:gpac:builtin:`. The following hardcoded protos are available: + - PlanarExtrusion: extrude a 2D shape (except text) along a 2D path - PathExtrusion: extrude a 2D shape, including text, along a 2D path - PlaneClipper: set a 3D plane clipper @@ -87,6 +88,7 @@ The media stream composition (renderer) is performed by the [Compositor](composi # Media Decoders Decoders included in default builds: + - PNG, JPEG (libJPEG) and JPEG-2000 - MPEG-4 AAC, MPEG-1/2 audio, Dolby AC-3 - MPEG-1/2/4, H264|AVC, SVC, HEVC, L-HEVC @@ -97,6 +99,7 @@ Decoders included in default builds: # Networking Any possible input from the GPAC filter architecture is supported by the player. This includes: + - File access from local drive, HTTP download, pipes and sockets. - MP4, 3GP, MP3/Shoutcast, JPEG, PNG, OGG/Icecast, AMR/EVRC/SMV, SAF, raw YUV and PCM - AAC files and radio streams (icecast AAC-ADTS) diff --git a/docs/Player/olay-composition.md b/docs/Player/olay-composition.md index f827dc4c..9ff60085 100644 --- a/docs/Player/olay-composition.md +++ b/docs/Player/olay-composition.md @@ -37,6 +37,7 @@ The [compositor](compositor) filter is in charge of 2D+3D rasterization of nat ## Overlaying Assume you want to insert some text and a logo over a video. To do this with GPAC, you can: + - use a BIFS/BT/XMT scene - use a [JavaScript drawing](evg) filter - use the [AVMix](avmix) filter (see [howto](avmix_tuto)) diff --git a/docs/xmlformats/BoxPatch.md b/docs/xmlformats/BoxPatch.md index f005b3e3..24a81fea 100644 --- a/docs/xmlformats/BoxPatch.md +++ b/docs/xmlformats/BoxPatch.md @@ -1,6 +1,7 @@ GPAC allows adding or removing boxes in an ISOBMFF file through patches, in order to customize files. This box patching uses an XML description of where the box should be added or removed, and what the new box content is in case of box addition. The XML syntax used is: + - a root `GPACBOXES` element with no specified attributes - any number of `Box` elements, where the payload is described using XML Binary [BS](XML-Binary) elements. @@ -16,6 +17,7 @@ A `Box` element with no children implies a box removal, and the `path` attribute gives the path to the box to remove. Otherwise this specifies a box insertion, and the path attribute gives the path to the parent box or previous box. The path is formatted as a series of 4CCs separated by a `.`, indicating the target child. When inserting a new box, a final character may be appended to the path: + - no character: the last 4CC shall identify a container box, and this specifies that the new box shall be inserted at the end of this container, e.g. `trak.mdia` means insert the box as the last child of the media box - `+`: specifies that the new box shall be inserted after the indicated box, e.g. `trak.tkhd+` means insert the box after the track header - `-`: specifies that the new box shall be inserted before the indicated box, e.g. 
`trak.tkhd-` means insert the box before the track header @@ -23,6 +25,7 @@ To insert a box at the root level, simply indicate after or before which root box you want to insert the new box, e.g. `moov-` or `moov+`. The trackID may be: + - set in the Box patch - specified using the command line - derived from the PID attached to this box patch diff --git a/docs/xmlformats/Common-Encryption.md b/docs/xmlformats/Common-Encryption.md index 30ae7648..6f61c32c 100644 --- a/docs/xmlformats/Common-Encryption.md +++ b/docs/xmlformats/Common-Encryption.md @@ -35,6 +35,7 @@ Just like any XML file, the file must begin with the usual xml header. The file ## DRMInfo Element Semantics The `DRMInfo` element contains information needed by a Content Protection System to play back the content, such as the SystemID, the URL of license server(s) or rights issuer(s) used, embedded licenses/rights, embedded key(s), and/or other protection system specific metadata. It is possible to specify more than one DRM system by using one DRMInfo element per system ID. The children of this element use the binary XML construction of GPAC to build a binary blob representing: + - the CENC PSSH box payload without box size, type and version, and with no data size field - or a complete pssh box @@ -130,6 +131,7 @@ The key ID and value must be specified as a 32 bytes hex string, using an improp If the KID attribute is not specified, the key will match any KID in the file. Such a key should be placed after all other key declarations. The defined attributes are: + * `KID`: the ID of the key (KID in CENC) to use * `value`: the AES-128 key corresponding to this KID. * `hlsInfo`: the associated info for HLS, must contain at least `URI="..."` and may also contain other params of EXT-X-KEY, except `METHOD` which is set by GPAC. Multiple key options may be specified using `URI="uri1",KEYFORMAT="identity",URI="uri1",KEYFORMAT="myown"` (the code will look for `,URI` as a separator). @@ -160,6 +162,7 @@ The payload of a DRMInfoTemplate describing a PSSH blob is not encrypted, howeve There can be multiple `DRMInfoTemplate` elements, typically one per system ID. When a new key is activated, each `DRMInfoTemplate` will be serialized with the following templating: + - in the first child `BS` element with an attribute `ID128` set to `KEY`, the attribute value is replaced with the encrypted value of the new key (encrypted with `DRMInfoTemplate@key`) - in the first child `BS` element with an attribute `ID128` set to `KID`, the attribute value is replaced with the new key ID (usually not needed if using pssh version 1) @@ -306,6 +309,7 @@ GPAC Player can play protected files which use the GPAC SystemID. This system is The SystemID and _system_ key are _0x6770616363656E6364726D746F6F6C31_. The PSSH version is used as follows: + - version 0 (no KID) indicates that inband keys are used (key rolling) - version 1 identifies regular keys. diff --git a/docs/xmlformats/NHML-Format.md b/docs/xmlformats/NHML-Format.md index 8040f7b5..fcf89bbb 100644 --- a/docs/xmlformats/NHML-Format.md +++ b/docs/xmlformats/NHML-Format.md @@ -150,6 +150,7 @@ The following attributes are used when creating 3GPP DIMS sample descriptions: The decoder config of an `NHNTStream` can be specified using [XML bitstream constructors](XML-Binary). To do this, the BS elements shall be encapsulated in a `DecoderSpecificInfo` element present in the children of the `NHNTStream` element. 
The content of the `DecoderSpecificInfo` element is then inserted: + - in the ESD (MPEG-4 Systems) - or after the base sampleDescription (ISOBMFF generic), in which case the data should likely be formatted as a box (4-byte size, 4-byte type, then payload). @@ -220,6 +221,7 @@ __WARNING Support for `SubSamples` requires GPAC 2.0 or above.__ The `SAI` element is used to associate auxiliary information with the parent sample. The children of this element must use [bitstream constructors](XML-Binary) to describe the data. Auxiliary information will be translated by the ISOBMFF multiplexer as: + - sample group description with `grouping_type` value of `type` if `group` is set - sample auxiliary information with `aux_info_type` value of `type` if `group` is not set diff --git a/docs/xmlformats/XML-Binary.md b/docs/xmlformats/XML-Binary.md index 28159792..cadbbbd8 100644 --- a/docs/xmlformats/XML-Binary.md +++ b/docs/xmlformats/XML-Binary.md @@ -1,4 +1,5 @@ It is possible to describe bit sequences when importing XML data. This applies to: + - [NHML](NHML-Format): some elements in the format may or must have child bitstream constructors - [Encryption](Common-Encryption): a `DRMInfo` element may have child bitstream constructors @@ -69,6 +70,7 @@ This example was used to generate files conforming to ISO/IEC 14496-18 AMD1. It ``` When used in an NHML sample, if a `BS` element describes file data (`dataLength` and/or `mediaOffset` are set) but no file is given, the source file is: + - the `mediaFile` indicated at the sample level, if present - otherwise the `baseMediaFile` indicated at the NHML stream level, if present - otherwise the media file associated with the NHML, e.g. `track.media` for `track.nhml` (see the example below)
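For example, with hypothetical file names following the default naming above, importing such an NHML stream with MP4Box resolves the `BS` file references automatically:

```
# track.media is picked up as the default media file for track.nhml
MP4Box -add track.nhml -new track.mp4
```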