JAINSLEE – Developer and business benefits

JAIN SLEE is the Java open standard for a SLEE (Service Logic Execution Environment). It is a Java programming language API for developing and deploying network services.


Evolution of the Open-Standard Platform (JAIN SLEE)

There is a strong evolution under way in the CSP (Communications Service Provider) space. Operators are now looking to implement open standards for intelligent networks, which reduces their dependency on proprietary platforms and on vendors’ road maps. An open-source platform gives the operator the flexibility to develop their own applications without being dependent on a vendor. An open, standards-based Service Logic Execution Environment (SLEE) that integrates with current and future networks is the key to providing innovative and revenue-generating services. Providing one standards-based, carrier-grade execution environment that integrates SS7, SIP, OSA/Parlay, OSS/BSS and J2EE environments offers significant benefits to the operator.

Business benefits of SIP JAINSLEE based platform

  1. Network Independence: The JAIN SLEE framework is independent of any particular network protocol, API or network topology. This is supported through the resource adaptor architecture.
  2. Portable Services: Application components can be developed and then deployed on JAIN SLEE compliant platforms from different vendors without recompilation or source code modification.
  3. Supports Complex Applications: JAIN SLEE application components can have state, can be composed from other components, can create and destroy other application components, can invoke other application components both synchronously and asynchronously, and can invoke resource adaptors.
  4. Industry Standard: JAIN SLEE is specified via the Java Community Process which allows multiple companies and individuals to collaborate in developing Java technology specifications.
  5. Reduced operating cost: In order to reduce the operating cost of legacy infrastructure, more and more operators are investing in and implementing open-source platforms. These new platforms bring agility and new service-delivery capabilities to the CSP.
  6. Multi-protocol support: The JAIN SLEE based platform can be used to develop and deploy carrier-grade applications that use SS7-based protocols such as INAP and CAP, IP protocols such as SIP and Diameter, and IT/Web protocols such as HTTP Servlet, XML and Service Oriented Architectures (SOA).

Fundamental Concepts:

  • Applications can be written once and run on many different implementations of JAIN SLEE.
  • Applications can access resources and protocols across multiple networks from within the JAIN SLEE environment.
  • Follows the ACID transaction model.
  • Component model for structuring the application logic of communications applications as a collection of reusable, object-oriented components, and for composing these components into higher-level and more sophisticated services.
  • The SLEE specification also defines the management interfaces used to administer the application environment, and defines a set of standard Facilities (such as the Timer Facility, Trace Facility and Alarm Facility).
  • Extension framework to allow new external protocols and systems (such as MSCs, MMSCs, SMSCs, softswitches, CSCFs, HLRs) to be integrated.

Characteristics of SLEE specification

• Event based model, asynchronous, support for composition

• Container manages component state

• Container manages garbage collection of components

• Transaction boundaries for demarcation and semantics of state replication

• Strongly typed event handling signatures

• 3rd party event driven components

• Management of lifecycle of Server, Services, Provisioned state

• Versioned services, upgrade of services, existing activities stay on existing service instances, new activities are directed to instances of upgraded services

• Independent of network technology/protocols/elements through the resource adaptor architecture

Entities:


Service

A service in JAIN SLEE terminology is a managed field replaceable unit.

The system administrator of a JAIN SLEE controls the life cycle (including deployment, undeployment and on-line upgrade) of a service. The program code can include Java classes, Profiles, and Service Building Blocks.

Profile

A JAIN SLEE Profile contains provisioned service or subscriber data.

Service Building Blocks running inside the JAINSLEE may access profiles as part of their application logic.

Service Building Block

The element of re-use defined by JAINSLEE is the Service Building Block (SBB).

An SBB is a software component that sends and receives events and performs computational logic based on the receipt of events and its current state. SBBs are stateful.

The program code for an SBB is comprised of Java classes.

Event

An event represents an occurrence that may require application processing.

An event may originate from a number of different sources, for example, an external resource such as a communications protocol stack, from the SLEE itself, or from application components within the SLEE.

Resources and Resource ADAPTERS

Resources are external entities that interact with other systems outside of the SLEE, such as network elements (HLR, MSC, etc), protocol stacks, directories and databases.

A Resource Adaptor implements the interfacing of a Resource into the JAINSLEE environment.


WebRTC business benefits to OTT and telecom carriers

Historically, RTC has been corporate and complex, requiring expensive audio and video technologies to be licensed or developed in house. Integrating RTC technology with existing content, data and services has been difficult and time consuming, particularly on the web.
Now, with WebRTC, the operator finally gets a chance to shift the focus from OTT (Over The Top service providers like Skype, Google Chat, WebEx etc. that were otherwise eating away at the operator’s revenue) to its very own WebRTC client-server solution, hence making the VoIP calls chargeable while at the same time being available from any client (web or softphone based).

To know more about what webrtc is read : https://telecom.altanai.com/2013/08/02/what-is-webrtc/

To read about how webrtc integrates with the SIP/IMS systems read https://telecom.altanai.com/2013/10/02/webrtc-solution/

OTT (Over The Top) Applications

Where are we now?

WebRTC has now implemented open standards for real-time, plugin-free video, audio and data communication.

Many web services already use RTC, but need downloads, native apps or plugins. These include Skype, Facebook (which uses Skype) and Google Hangouts (which uses the Google Talk plugin).
Downloading, installing and updating plugins (such as Flash or Java) can be complex, error prone and annoying.

Plugins can be difficult to deploy, debug, troubleshoot, test and maintain—and may require licensing and integration with complex, expensive technology. It’s often difficult to persuade people to install plugins in the first place, let alone keep them activated at all times.

WebRTC support across various browsers, pic source: caniuse.com

API support from browser

  • PeerConnection API
  • getUserMedia
  • WebAudio Integration
  • dataChannels
  • TURN support
  • Echo cancellation
  • MediaStream API
  • Multiple Streams
  • Simulcast
  • Screen Sharing
  • mediaConstraints
  • Stream re-broadcasting
  • getStats API
  • ORTC API
  • H.264 video
  • VP8 video
  • Solid interoperability
  • srcObject in media element
  • Promise based getUserMedia
  • Promise based PeerConnection API
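Since support for the items above still varies between browsers and versions, a quick runtime check is often useful. Below is a minimal sketch in plain JavaScript; the helper name is my own, not part of any standard API.

// Hypothetical helper: report which core WebRTC APIs this browser exposes.
function detectWebRTCSupport() {
  const video = document.createElement('video');
  return {
    getUserMedia: !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia),
    getDisplayMedia: !!(navigator.mediaDevices && navigator.mediaDevices.getDisplayMedia),
    peerConnection: typeof RTCPeerConnection !== 'undefined',
    dataChannel: typeof RTCDataChannel !== 'undefined',
    getStats: typeof RTCPeerConnection !== 'undefined' && 'getStats' in RTCPeerConnection.prototype,
    srcObject: 'srcObject' in video
  };
}

console.log(detectWebRTCSupport());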

WebRTC trends

Business users of WebRTC, pic source: Disruptive Analysis

The APIs and standards of WebRTC can democratize and decentralize tools for content creation and communication—for telephony, gaming, video production, music making, news gathering and many other applications.

pic source: iswebrtcreadyyet.com

What is WebRTC?


WebRTC 1.0: Real-time Communication Between Browsers – W3C Candidate Recommendation 13 December 2019 https://www.w3.org/TR/webrtc/

Read more about the layers of WebRTC and their functionalities here: WebRTC layers

Open Source WebRTC SDK and its implementation steps https://github.com/altanai/webrtc

What is WebRTC?

WebRTC (Web Real-Time Communication) is an API definition drafted by the World Wide Web Consortium (W3C) that supports browser-to-browser applications for voice calling, video chat, and P2P file sharing without the need of either internal or external plugins.

  • Enables browser-to-browser media streaming over the secure RTP profile
  • Standardisation at the API level at the W3C and at the protocol level at the IETF
  • Enables web browsers with Real-Time Communications (RTC) capabilities
  • Written in C++ and JavaScript
  • BSD-style license
  • Free, open project available in all major browsers

As of the 2019 update, the W3C defines it as:

a set of ECMAScript APIs in WebIDL to allow media to be sent to and received from another browser or device implementing the appropriate set of real-time protocols. The specification being developed in conjunction with a protocol specification developed by the IETF RTCWEB group and an API specification to get access to local media devices.

The following is the browser-side stack for WebRTC media.

WebRTC Media Stack solution architecture

Open and Free Codecs

A codec handles the media stream’s compression and decompression. For peers to have a successful exchange of media, they need to agree on a common set of codecs for the session. The list of codecs is exchanged as part of the offer and answer (the SDP, as in SIP).

WebRTC uses bare MediaStreamTrack objects for each track being shared from one peer to another. The codecs associated with those tracks are not mandated by the WebRTC specification.

For video, as per RFC 7742 (WebRTC Video Processing and Codec Requirements), the mandatory codecs to be supported by WebRTC clients are VP8 and H.264’s Constrained Baseline profile.

For audio, as per RFC 7874 (WebRTC Audio Codec and Processing Requirements), browsers must support the Opus codec as well as G.711’s PCMA and PCMU formats.
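A browser’s actual codec support can be inspected at runtime through the static capability queries on RTCRtpSender and RTCRtpReceiver; a small sketch (assuming a browser that implements these methods) is below.

// List the codecs this browser can send and receive (static capabilities,
// independent of any particular SDP negotiation).
const videoCaps = RTCRtpSender.getCapabilities('video');
const audioCaps = RTCRtpReceiver.getCapabilities('audio');

console.log(videoCaps.codecs.map(c => c.mimeType));  // e.g. ["video/VP8", "video/H264", ...]
console.log(audioCaps.codecs.map(c => c.mimeType));  // e.g. ["audio/opus", "audio/PCMU", "audio/PCMA", ...]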

Video Resolution handling

Unless the SDP specifically signals otherwise, the web browser receiving a WebRTC video stream must be able to handle video at a minimum of 20 FPS and a minimum resolution of 320 pixels wide by 240 pixels tall.

In the best scenarios (given available bandwidth and media devices), VP8 sets no upper limit on the resolution of the video stream, so the stream can go as far as a maximum resolution of 16384×16384 pixels.
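Applications can steer the captured resolution and frame rate through getUserMedia constraints; the browser treats ideal values as preferences and falls back if the camera cannot satisfy them. A minimal sketch:

// Request HD capture but accept whatever the camera can actually deliver.
navigator.mediaDevices.getUserMedia({
  audio: true,
  video: {
    width:  { min: 320, ideal: 1280 },
    height: { min: 240, ideal: 720 },
    frameRate: { ideal: 30, max: 60 }
  }
})
.then(stream => {
  const settings = stream.getVideoTracks()[0].getSettings();
  console.log('capturing at', settings.width + 'x' + settings.height, '@', settings.frameRate, 'fps');
})
.catch(err => console.error(err.name, err.message));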

Independent of Signalling

WebRTC does not specify any signalling/telecommunication protocol; it is up to the adopter to perform the offer/answer exchange in any way deemed fit for the use case. For example, a web-only application may use plain WebSockets, whereas an app that must interoperate with telecom endpoints would typically use SIP as the signalling protocol, as in the sketch below.
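For illustration only, here is a minimal WebSocket-based signalling sketch; the server URL and the "offer"/"answer"/"candidate" message format are assumptions of this example, not anything mandated by WebRTC.

// Application-defined signalling over a plain WebSocket (hypothetical endpoint).
const signalling = new WebSocket('wss://example.com/signalling');
const pc = new RTCPeerConnection();

// Trickle local ICE candidates to the remote peer as they are gathered.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) signalling.send(JSON.stringify({ type: 'candidate', candidate }));
};

// Caller side: create and send an offer.
async function call() {
  await pc.setLocalDescription(await pc.createOffer());
  signalling.send(JSON.stringify({ type: 'offer', sdp: pc.localDescription }));
}

// Handle the remote answer and remote candidates.
signalling.onmessage = async ({ data }) => {
  const msg = JSON.parse(data);
  if (msg.type === 'answer') await pc.setRemoteDescription(msg.sdp);
  else if (msg.type === 'candidate') await pc.addIceCandidate(msg.candidate);
};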

Read more about WebRTC handshakes :

NAT-traversal technologies such as ICE, STUN, and TURN

I have written in detail about TURN-based WebRTC flow diagrams.

https://telecom.altanai.com/2015/03/11/nat-traversal-using-stun-and-turn/. The post describes the ICE (Interactive Connectivity Establishment) framework, which is mandated by the WebRTC standards. ICE discovers network interfaces and ports in the Offer/Answer model to exchange network information with the participating communication clients. ICE makes use of the Session Traversal Utilities for NAT (STUN) protocol and its extension, Traversal Using Relays around NAT (TURN).

NAT and TURN Relay

Learn about hosting / integrating different TURN servers for WebRTC

TURN server for WebRTC – RFC5766-TURN-Server , Coturn , Xirsys – https://telecom.altanai.com/2015/03/28/turn-server-for-webrtc-rfc5766-turn-server-coturn-xirsys/
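Once a STUN/TURN deployment is available, it is handed to the browser through the iceServers list; a sketch is below, where the TURN URL and credentials are placeholders for your own server (for example a coturn instance).

// RTCPeerConnection configured with a public STUN server and a hypothetical TURN relay.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    {
      urls: 'turn:turn.example.com:3478',   // placeholder TURN server
      username: 'webrtcuser',               // placeholder credentials
      credential: 'secret'
    }
  ]
});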

Why is WebRTC important?

Significantly better video quality: WebRTC video quality is noticeably better than Flash.
Up to 6x faster connection times: Using JavaScript WebSockets, also an HTML5 standard, improves session connection times and accelerates delivery of other OpenTok events.
Reduced audio/video latency: WebRTC offers significant improvements in latency, enabling more natural and effortless conversations.
Freedom from Flash: With WebRTC and JavaScript WebSockets, you no longer need to rely on Flash for browser-based RTC.
Native HTML5 elements: Customize the look and feel and work with video like you would any other element on a web page with the new video tag in HTML5.

The major players behind the conception and advancement of the WebRTC standards and libraries are:

IETF, W3C, the Java community and GSMA. The idea is to develop a lightweight, browser-based call console to make SIP calls from a web page. This was successfully achieved using fundamental technologies such as JavaScript, HTML5, WebSockets, TCP/UDP and an open-source SIP server. It is good to note that no extra extension, plugin or gateway (such as Flash support) is required. It also has cross-platform support, including Mozilla Firefox, Chrome and so on.

Peer to Peer Communication

WebRTC forms a p2p communication channel between all the peers. That means that as the participant count grows, the setup becomes a mesh networking topology, with an incoming and an outgoing stream towards each of the peers.

Two-party peer-to-peer call

Multiparty Call and Mesh Network

Mesh-based WebRTC video conferencing

In the special case of broadcasting to a large number of viewers (without an outgoing media stream), it is recommended to set up a Multipoint Control Unit (MCU), which will relay the incoming stream to a large number of users without putting traffic load on the client from which the stream actually originates.

Important notes:

1. These diagrams do not depict ICE and NAT traversal and have been simplified for better understanding. In real-world scenarios there is almost always a STUN and TURN server involved.

More on TURN Servers is given here : NAT traversal using STUN and TURN

2. WebRTC also mandates the use of a secure origin (https) on the webpage that invokes getUserMedia to capture user media devices such as audio, video and location.

Browser Adoption

As of March 2020, WebRTC is supported in the following clients’ browsers:

  • Desktop PC
    Microsoft Edge 12+
    Google Chrome 28+
    Mozilla Firefox 22+
    Safari 11+
    Opera 18+
    Vivaldi 1.9+
  • Android
    Google Chrome 28+ (enabled by default since 29)
    Mozilla Firefox 24+
    Opera Mobile 12+
  • Chrome OS
  • Firefox OS
  • BlackBerry 10
  • iOS
    MobileSafari/WebKit (iOS 11+)
  • Tizen 3.0

Furthermore, read about the Steps for building and deploying WebRTC solution – https://telecom.altanai.com/2014/12/04/steps-for-building-and-deploying-webrtc-solution/


WebRTC APIs

JavaScript functions to access and process the browser media stack.

getUserMedia

acquires the audio and video media (e.g., by accessing a device’s camera and microphone)

Properties

ondevicechange

Methods

enumerateDevices()
getDisplayMedia()
getSupportedConstraints()
getUserMedia()

navigator.mediaDevices.getUserMedia({ audio: true, video: true })
.then(function(stream) {
  var video = document.querySelector('video');
  // Older browsers may not have srcObject
  if ("srcObject" in video) {
    video.srcObject = stream;
  } else {
    // Avoid using this in new browsers, as it is going away.
    video.src = window.URL.createObjectURL(stream);
  }
  video.onloadedmetadata = function(e) {
    video.play();
  };
})
.catch(function(err) {
  console.log(err.name + ": " + err.message);
});

DOMException errors on getUserMedia

Rejections of the returned promise are made by passing a DOMException error object to the promise’s failure handler. Possible errors are:

AbortError
Although the user and operating system both granted access to the hardware device, a problem occurred which prevented the device from being used.

NotAllowedError
One or more of the requested source devices cannot be used at this time. This will happen if the browsing context is insecure (http instead of https), or if the user has specified that the current browsing instance/session is not permitted access to the device, or has denied all access to user media devices globally.

NotFoundError
No media tracks of the type specified were found that satisfy the given constraints.

NotReadableError
Although the user granted permission to use the matching devices, a hardware error occurred at the operating system, browser, or Web page level which prevented access to the device.

OverconstrainedError
No candidate devices met the requested criteria. The error’s constraint property holds the name of a constraint that could not be met, and its message property contains a human-readable string explaining the problem.

Example constraints:

var constraints = { video: { facingMode: (front? "user" : "environment") } };

SecurityError
User media support is disabled on the Document on which getUserMedia() was called.

TypeError
The list of constraints specified is empty, or has all constraints set to false.
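A sketch of mapping these rejection names to application behaviour is below; attachStream() is an assumed application helper, not a browser API.

navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(stream => attachStream(stream))   // attachStream() is an assumed app helper
  .catch(err => {
    switch (err.name) {
      case 'NotAllowedError':
        console.warn('Permission denied or insecure (non-https) origin');
        break;
      case 'NotFoundError':
        console.warn('No camera/microphone matching the constraints was found');
        break;
      case 'OverconstrainedError':
        console.warn('Constraint could not be satisfied:', err.constraint);
        break;
      case 'NotReadableError':
        console.warn('Device is busy or a hardware error occurred');
        break;
      default:
        console.error(err.name + ': ' + err.message);
    }
  });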

RTCPeerConnection

enables audio and video communication between peers. It performs signal processing, codec handling, peer-to-peer communication, security, and bandwidth management.

Properties

canTrickleIceCandidates
connectionState
currentLocalDescription
currentRemoteDescription
getDefaultIceServers()
iceConnectionState
iceGatheringState
localDescription
onaddstream
onconnectionstatechange
ondatachannel
onicecandidate
oniceconnectionstatechange
onicegatheringstatechange
onidentityresult
onnegotiationneeded
onremovestream
onsignalingstatechange
ontrack
peerIdentity
pendingLocalDescription
pendingRemoteDescription
remoteDescription
sctp
signalingState

Methods

addIceCandidate()
addStream()
addTrack()
close()
createAnswer()
createDataChannel()
createOffer()
generateCertificate()
getConfiguration()
getIdentityAssertion()
getReceivers()
getSenders()
getStats()
getStreamById()
getTransceivers()
removeStream()
removeTrack()
restartIce()
setConfiguration()
setIdentityProvider()
setLocalDescription()
setRemoteDescription()
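As a rough illustration of how the properties and methods above fit together, here is a caller-side sketch; sendToPeer() stands in for whatever signalling channel the application uses and is not part of the WebRTC API.

async function startCall() {
  const pc = new RTCPeerConnection();

  // Capture local media and attach each track to the connection.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach(track => pc.addTrack(track, stream));

  // Render remote media when it arrives.
  pc.ontrack = (event) => {
    document.querySelector('video#remote').srcObject = event.streams[0];
  };

  // Create and apply the local offer, then ship it over signalling.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);                      // signalingState -> have-local-offer
  sendToPeer({ type: 'offer', sdp: pc.localDescription });  // sendToPeer() is an assumed helper

  return pc;   // setRemoteDescription(answer) is called when the answer arrives
}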

Signalling state transitions diagram, source: W3C

RTC Signalling states

stable
There is no offer/answer exchange in progress. This is also the initial state, in which case the local and remote descriptions are empty.

have-local-offer
Local description, of type “offer”, has been successfully applied.

have-remote-offer
Remote description, of type “offer”, has been successfully applied.

have-local-pranswer
Remote description of type “offer” has been successfully applied and a local description of type “pranswer” has been successfully applied.

have-remote-pranswer
Local description of type “offer” has been successfully applied and a remote description of type “pranswer” has been successfully applied.
closed
The RTCPeerConnection has been closed; its [[IsClosed]] slot is true.

RTCSDPType

offer
SDP offer.

pranswer
An RTCSdpType of pranswer indicates that a description MUST be treated as an [SDP] answer, but not a final answer.

answer
treated as an [SDP] final answer, and the offer-answer exchange MUST be considered complete. A description used as an SDP answer may be applied as a response to an SDP offer or as an update to a previously sent SDP pranswer.

rollback
Cancels the current SDP negotiation and rolls the descriptions back to what they were in the previous stable state.
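A callee-side sketch tying the SDP types to the signalling states above; again, sendToPeer() is an assumed application signalling helper.

// Apply a received offer, answer it, and return the answer via signalling.
async function handleRemoteOffer(pc, offer) {
  await pc.setRemoteDescription(offer);    // signalingState -> have-remote-offer
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);    // signalingState -> stable
  sendToPeer({ type: 'answer', sdp: pc.localDescription });
}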

RTCConfiguration

Defines a set of parameters that configure how the peer-to-peer communication is established via RTCPeerConnection.

iceServers of type sequence
array of objects describing servers available to be used by ICE, such as STUN and TURN servers.

iceTransportPolicy of type RTCIceTransportPolicy.
Indicates which ICE candidates the ICE Agent is allowed to use:

  • relay
    ICE Agent uses only media relay candidates such as candidates passing through a TURN server.
  • all
    The ICE Agent can use any type of candidate when this value is specified.

bundlePolicy of type RTCBundlePolicy.
The media-bundling policy to use when gathering ICE candidates. It affects which media tracks are negotiated if the remote endpoint is not bundle-aware, and what ICE candidates are gathered. If the remote endpoint is bundle-aware, all media tracks and data channels are bundled onto the same transport.
Types :

  • balanced
    Gather ICE candidates for each media type in use (audio, video, and data). If the remote endpoint is not bundle-aware, negotiate only one audio and video track on separate transports.
  • max-compat
    Gather ICE candidates for each track. If the remote endpoint is not bundle-aware, negotiate all media tracks on separate transports.
  • max-bundle
    Gather ICE candidates for only one track. If the remote endpoint is not bundle-aware, negotiate only one media track.

rtcpMuxPolicy of type RTCRtcpMuxPolicy.
rtcp-mux policy to use when gathering ICE candidates.

certificates of type sequence
A set of certificates that the RTCPeerConnection uses to authenticate.

iceCandidatePoolSize of type octet, defaulting to 0
Size of the prefetched ICE pool as defined in [JSEP]
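Putting the fields above together, a typical configuration object might look like the following sketch (the STUN URL is a public Google server; the remaining values are illustrative choices).

const config = {
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
  iceTransportPolicy: 'all',      // 'relay' would force all media through TURN
  bundlePolicy: 'max-bundle',     // bundle all media onto one transport
  rtcpMuxPolicy: 'require',
  iceCandidatePoolSize: 2         // pre-gather a small pool of candidates
};

const pc = new RTCPeerConnection(config);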

RTCDataChannel

allows bidirectional communication of arbitrary data between peers. It uses the same API as WebSockets and has very low latency.
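A minimal data channel sketch, assuming pc and remotePc are two already-signalled RTCPeerConnection instances:

// Offerer side: create a reliable, ordered channel before negotiation.
const channel = pc.createDataChannel('chat', { ordered: true });
channel.onopen = () => channel.send('hello');
channel.onmessage = (e) => console.log('received:', e.data);

// Answerer side: the channel arrives via the ondatachannel event.
remotePc.ondatachannel = (event) => {
  event.channel.onmessage = (e) => console.log('peer says:', e.data);
};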

getStats

allows the web application to retrieve a set of statistics about WebRTC sessions. These statistics data are being described in a separate W3C document
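For example, an application can poll getStats() and pick out round-trip time and packet loss; the report types and field names below follow the W3C statistics identifiers.

// Log RTT of the nominated candidate pair and packet loss of inbound video every 5 s.
setInterval(async () => {
  const report = await pc.getStats();
  report.forEach(stat => {
    if (stat.type === 'candidate-pair' && stat.nominated) {
      console.log('current RTT (s):', stat.currentRoundTripTime);
    }
    if (stat.type === 'inbound-rtp' && stat.kind === 'video') {
      console.log('video packets lost:', stat.packetsLost);
    }
  });
}, 5000);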

Peer to Peer DTMF

-tbd
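While this section is still to be written, a minimal sketch of sending DTMF tones over an already-negotiated audio sender (via RTCRtpSender.dtmf) could look like this:

// dtmf is only non-null once an audio track has been negotiated on the sender.
const audioSender = pc.getSenders().find(s => s.track && s.track.kind === 'audio');
if (audioSender && audioSender.dtmf) {
  audioSender.dtmf.insertDTMF('1234#', 100, 70);   // tones, tone duration (ms), inter-tone gap (ms)
}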

Call Setup between WebRTC Endpoints

Updates in the W3C Candidate Recommendation of 13 Dec 2019

Over the years since its adoption, many of the associated technologies were deprecated from WebRTC-based platforms and environments, some of which are:

OAuth as a credential method for ICE servers
Negotiated RTCRtcpMuxPolicy (previously marked at risk)
voiceActivityDetection
RTCCertificate.getSupportedAlgorithms()
RTCRtpEncodingParameters: ptime, maxFrameRate, codecPayloadType, dtx, degradationPreference
RTCRtpDecodingParameters: encodings
RTCDatachannel.priority

Some of the newly added features include:

restartIce() method added to RTCPeerConnection
Introduced the concept of “perfect negotiation”, with an example to solve signalling races.
Implicit rollback in setRemoteDescription to solve races.
Implicit offer/answer creation in setLocalDescription to solve races.
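A short sketch of how the implicit setLocalDescription() and restartIce() additions are typically used (sendToPeer() is, as before, an assumed signalling helper):

pc.onnegotiationneeded = async () => {
  await pc.setLocalDescription();               // implicit createOffer()/createAnswer()
  sendToPeer({ description: pc.localDescription });
};

pc.oniceconnectionstatechange = () => {
  if (pc.iceConnectionState === 'failed') {
    pc.restartIce();                            // triggers a new negotiationneeded event
  }
};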

References:

WebRTC 1.0: Real-time Communication Between Browsers – W3C Candidate Recommendation 13 December 2019, https://www.w3.org/TR/webrtc/

WebRTC Stack Architecture and layers

WebRTC stands for Web Real-Time Communications and introduces a real-time media framework in the browser core, alongside associated JavaScript APIs for controlling the media framework and HTML5 tags for displaying media.

If you are new to WebRTC, read What is WebRTC? above. From a technical point of view, WebRTC hides all the complexity of real-time media behind a very simple JavaScript API.

Codec Confusion:

Video Codecs

Currently VP8 is the codec of choice since it is royalty-free. In mobility today, the codec of choice is H.264. H.264 is not royalty-free, but it is native in most mobile handsets due to its high performance.

Audio Codecs

Opus is a lossy audio compression format developed by the Internet Engineering Task Force (IETF) targeting a broad range of interactive real-time applications over the Internet, from speech to music. As an open format standardized through RFC 6716, a reference implementation is provided under the 3-clause BSD license. All known software patents which cover Opus are licensed under royalty-free terms.

G.711 is an ITU (International Telecommunications Union) standard for audio compression. It is primarily used in telephony. The standard was released in 1972. It is the required standard in many voice-based systems and technologies, for example in the H.320 and H.323 specifications.

Speex is a patent-free audio compression format designed for speech, and also a free software speech codec that is used in VoIP applications and podcasts. Some consider Speex obsolete, with Opus as its official successor, but since significant content is out there using Speex, it will not disappear anytime soon.

G.722 is an ITU standard 7 kHz Wideband audio codec operating at 48, 56 and 64 kbit/s. It was approved by ITU-T in 1988. G722 provides improved speech quality due to a wider speech bandwidth of up to 50-7000 Hz compared to G.711 of 300–3400 Hz.

AMR-WB (Adaptive Multi-Rate Wideband) is a patented wideband speech coding standard that provides improved speech quality due to a wider speech bandwidth of 50–7000 Hz. Its data rates range from 6.6 to 23.85 kbit/s, and the codec is generally available on mobile phones.

Architecture :

WebRTC offers web application developers the ability to write rich, realtime multimedia applications (think video chat) on the web, without requiring plugins, downloads or installs. Its purpose is to help build a strong RTC platform that works across multiple web browsers and multiple platforms.


Web API – An API to be used by third-party developers for developing web-based video chat-like applications.

WebRTC Native C++ API – An API layer that enables browser makers to easily implement the Web API proposal

Transport / Session – The session components are built by re-using components from libjingle, without using or requiring the XMPP/jingle protocol.

RTP Stack – A network stack for RTP, the Real-time Transport Protocol.

STUN/ICE – A component allowing calls to use the STUN and ICE mechanisms to establish connections across various types of networks.

Session Management – An abstracted session layer, allowing for call setup and management layer. This leaves the protocol implementation decision to the application developer.

VoiceEngine – VoiceEngine is a framework for the audio media chain, from sound card to the network.

iSAC / iLBC / Opus

iSAC: A wideband and super wideband audio codec for VoIP and streaming audio. iSAC uses 16 kHz or 32 kHz sampling frequency with an adaptive and variable bit rate of 12 to 52 kbps.

iLBC: A narrowband speech codec for VoIP and streaming audio. Uses 8 kHz sampling frequency with a bitrate of 15.2 kbps for 20ms frames and 13.33 kbps for 30ms frames. Defined by IETF RFCs 3951 and 3952.

Opus: Supports constant and variable bitrate encoding from 6 kbit/s to 510 kbit/s, frame sizes from 2.5 ms to 60 ms, and various sampling rates from 8 kHz (with 4 kHz bandwidth) to 48 kHz (with 20 kHz bandwidth, where the entire hearing range of the human auditory system can be reproduced). Defined by IETF RFC 6716.

NetEQ for Voice– A dynamic jitter buffer and error concealment algorithm used for concealing the negative effects of network jitter and packet loss. Keeps latency as low as possible while maintaining the highest voice quality.

Acoustic Echo Canceler (AEC) – The Acoustic Echo Canceler is a software-based signal processing component that removes, in real-time, the acoustic echo resulting from the voice being played out coming into the active microphone.

Noise Reduction (NR) -The Noise Reduction component is a software-based signal processing component that removes certain types of background noise usually associated with VoIP. (Hiss, fan noise, etc…)

Video Engine – VideoEngine is a framework for the video media chain, from the camera to the network and from the network to the screen.

VP8  – Video codec from the WebM Project. Well suited for RTC as it is designed for low latency.

Video Jitter Buffer – Dynamic Jitter Buffer for video. Helps conceal the effects of jitter and packet loss on overall video quality.
Image enhancements – For example, removes video noise from the image captured by the webcam.

W3C contribution



  • Media Stream Functions

API for connecting processing functions to media devices and network connections, including media manipulation functions.

  • Audio Stream Functions

An extension of the Media Stream Functions to process audio streams (e.g. automatic gain control, mute functions and echo cancellation).

  • Video Stream Functions

An extension of the Media Stream Functions to process video streams (e.g. bandwidth limiting, image manipulation or “video mute“).

  • Functional Component 

 API to query presence of WebRTC components in an implementation, instantiate them and connect them to media streams.

  • P2P Connection Functions

API functions to support establishing signalling protocol-agnostic peer-to-peer connections between Web browsers

  • API specification Availability

WebRTC 1.0: Real-time Communication Between Browsers –  Draft 3 June 2013 available

  • Implementation Library: WebRTC Native APIs

Media Capture and Streams – Draft 16 May 2013

  • Supported by Chrome , Firefox, Opera in desktop of all OS ( Linux, Windows , Mac )
  • Supported by Chrome , Firefox  in Mobile browsers ( android )

IETF contribution


Communication model

Security model

Firewall and NAT traversal

Media functions

Functionality such as media codecs, security algorithms, etc.,

Media formats

Transport of non media data between clients

Input to W3C for APIs development

Interworking with legacy VoIP equipment

WG RFC   Date

  • draft-ietf-rtcweb-audio-02      2013-08-02
  • draft-ietf-rtcweb-data-channel-05      2013-07-15
  • draft-ietf-rtcweb-data-protocol-00      2013-07-15
  • draft-ietf-rtcweb-jsep-03      2013-02-27
  • draft-ietf-rtcweb-overview-07      2013-08-14
  • draft-ietf-rtcweb-rtp-usage-07     2013-07-15
  • draft-ietf-rtcweb-security-05      2013-07-15
  • draft-ietf-rtcweb-security-arch-07      2013-07-15
  • draft-ietf-rtcweb-transports-00      2013-08-19
  • draft-ietf-rtcweb-use-cases-and-reqs-11      2013-06-27
  • Plus over 20 discussion RFC drafts

What will be the outcome of WebRTC Adoption?

In simple words, it’s a phenomenal change in decentralizing communication platforms from proprietary vendors who heavily depended on patented and royalty bound technologies and protocols.  It will revolutionize internet telephony.  Also it will emerge to be platform-independent ( ie any browser, any desktop operating system any mobile Operating system ).

WebRTC allows anybody to introduce real-time communication to their web page as simple as introducing a table.

Read More about webRTC business benefits


Update 2020 – This article was written in early 2013, while WebRTC was still being standardised and not yet widely adopted (WebRTC’s inception was in 2012).

There are many more articles, written after that, explaining and emphasising the details and applications of WebRTC. A list of these is below:

For SIP IMS and WebRTC

Read about STUN and TURN which form a crtical part of any webrtc based communication system

Security of WebRTC based CaaS and CPaaS

WebRTC APIs


JAIN SLEE

•Jain SLEE :- JAIN is a Sun Java standards initiative and part of the Java Community Process.
JAIN specifies a comprehensive range of APIs that target converged IP and PSTN networks, including APIs for

– High-level application development (such as service provider APIs and the Service Logic Execution Environment (SLEE))

– call control

– signalling at the protocol level (such as SIP, MGCP and SS7)

•For telephony, data and wireless communications networks, the Java APIs defined through JAIN provide:

– service portability

– network independence

– open development

•A Service Logic Execution Environment (SLEE) is a high-throughput, low-latency, event-processing application environment.
•JAIN SLEE is designed specifically to allow implementations of the standard to meet the stringent requirements of communications applications (such as network-signaling applications).

Goals of JAIN SLEE are:

– Portable services and network independence.

– Hosting on an extensible platform.

– Services and SLEE platforms available from many vendors.

Key Features are  :

•Industry standard :- JSLEE is the industry-agreed standard for an application server that meets the specific needs of telecommunications networks.
•Network independence:-The JSLEE programming model enables network independence for the application developer. The model is independent of any particular network protocol, API or network topology.
•Converged services:- JSLEE provides the means to create genuinely converged services, which can run across multiple network technologies.
•Network migrations :-As JSLEE provides a generic, horizontal platform across many protocols, independent of the network technology, it provides the ideal enabler technology for smooth transition between networks.
•Global market—global services:-JSLEE-compliant applications, hosted on a JSLEE application server, are network agnostic. A single platform can be used across disparate networks.
•Robust and reliable:- As with the enterprise application server space, deploying applications on a standard application server that has been tested and deployed in many other networks reduces logic errors and produces more reliable applications.
•Standard object-oriented component architecture

Scope of JAINSLEE applications

•The principal features of the JSLEE programming model are :

– programs written in Java

– asynchronous programming paradigm

– well-defined event-delivery semantics

– component-based, object-oriented approach

– transactional model

– “profiles” of information, which represent provisioned data

– usage interfaces that support gathering service statistics

– support for standard Java APIs (such as JNDI and JDBC), and optionally, support for integration with J2EE

– standard facilities for traces, alarms and timers, for use by the applications that are hosted on the SLEE

Resource adaptors

-The JSLEE provides integration capabilities using a plug-in architecture known as the resource adaptor architecture. Resource adaptors (RAs) provide interconnection with the “outside” world, for example, interfaces to communication protocol stacks, directory services or external systems.

•SLEE management

-The JSLEE specification also defines the management capabilities of the SLEE. It adopts the Java standard in this area, Java Management Extensions (JMX).

————————————————————————————————————————