Part 7 of the JADC2 Blog Series
**Please note: With the April 2024 release of Connext Professional 7.3 LTS, the functionality formerly offered as Connext Secure is available as an optional component of Connext Professional, named Security Extensions.**
One of the most challenging issues facing JADC2 systems is interoperability — the ability to integrate and deploy diverse command and control (C2) systems, which feature a vast array of sensor and effector platforms from multiple suppliers. How can one get these systems to rapidly and reliably communicate to support global operational domains, as well as evolve to respond to changing threat landscapes?
Global armed forces today must modernize and digitally transform all systems, including legacy applications, simulation, visualization, training, C2, sensor, and effector platforms. The foundation of this transformation is interoperability, which enables rapid and highly competitive force projection using the pan-domain intelligence of global allies across land, sea, air, space, and cyber operational domains. These dynamic systems-of-systems must now be able to interoperate with the capability to adapt and learn, while also supporting future advancements in technology.
Unfortunately, these systems do not all interoperate today. Therefore, a strategy is needed to enable this crucial interoperability, while incrementally upgrading key systems that can drive the highest JADC2 value in terms of achieving interoperability on a massive scale. System interoperability enables military branches to treat data as a weapon system and make critical decisions at the speed of war.
Open Interoperability Standards
Open standards will be a critical force in the global digital transformation of defense systems. Unfortunately, no single standard can be applied to every system to ease and accelerate the modernization effort, and mandating one interoperability standard or converting existing legacy systems to use it is simply not feasible. A rational, forward-looking strategy must therefore embrace multiple open standards as the foundation of a JADC2 future. What is required is a selected set of open interoperability standards that underpin the communication of intelligence and actions across disparate defense platforms. To be clear, every one of these interoperability standards must be open.
Levels of Interoperability
This section describes three levels of interoperability, each with a different degree of maturity and usability: technical (message-centric), syntactic (data-centric), and semantic (data-centric plus domain standards).
Technical Interoperability (Message-Centric)
Most legacy systems have a message-centric architecture deployed in a single operational environment. These are disparate, stove-piped solutions that share proprietary, opaque data. There is no guarantee that these systems understand each other unambiguously, only that they can receive each other’s data. To make sense of messages, each supplier needs extensive knowledge of every remote supplier’s platform, transport, and message format: with n suppliers in a system, each supplier must understand the other n-1 message formats. This type of interoperability does not scale. If one supplier changes its format or a new supplier is added, all remote systems must be updated to reflect the change. This tight coupling produces brittle systems that are easy to break and hard to change, and therefore difficult to develop, manage, and upgrade.
In addition, this approach makes it difficult to send only relevant data over the network. One option is to send only the data that is requested. While this can reduce traffic on the network, it places a burden on the sender: if many remote entities request different subsets of data based on different filtering criteria, the sender must manage all of those requests, potentially driving up its CPU utilization. The other option is to have the receivers filter the data. Every message must then be received by every remote entity, which has to extract and analyze the data, use what is relevant, and discard the rest. This wastes network bandwidth and makes the receiving applications more complex.
Figure 1. Technical Interoperability (Message-Centric)
Syntactic Interoperability (Data-Centric)
This is the next level of interoperability and requires that technical interoperability already be established. The term data-centric has generally been viewed as describing solutions where all data is stored in a central location, such as the cloud, from which it can be retrieved at any time. This is data centricity for data at rest. Often overlooked are data-centric solutions for data in motion. These are typically real-time, edge-to-cloud systems in which entities communicate in a peer-to-peer architecture that allows data in motion to be requested much as data at rest is requested, and they must keep functioning even when disconnected from other systems. Data Distribution Service (DDS™) is an open, data-centric interoperability standard for data in motion that shares real-time data in a common, structured data format.
In data-centric interoperability frameworks, the data is the interface. Data models change rarely, in contrast to interface-based technologies, whose interfaces are revised frequently. Entities need very little knowledge of remote systems, and this loose coupling eases interoperability between systems, including legacy systems. Simulations can be dropped into live exercises rapidly, because live and simulated data share the same format.
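To make this concrete, here is a minimal sketch of what "the data is the interface" can look like with the DDS API. It is illustrative only: the Track type, its field names, and the topic name are assumptions for this post, standing in for whatever shared data model a program would actually define in IDL and generate code from (for example, with rtiddsgen).

```cpp
// Illustrative shared data model, defined once in IDL and used by every supplier:
//
//   struct Track {
//     @key string id;        // unique track identifier
//     double latitude_deg;   // degrees
//     double longitude_deg;  // degrees
//     double altitude_m;     // meters
//   };

#include <dds/dds.hpp>
#include "Track.hpp"  // hypothetical type support generated from the IDL above

int main() {
    // Join the shared data space; peers are found through automatic discovery.
    dds::domain::DomainParticipant participant(0);

    // The Topic binds a well-known name to the shared data type.
    // Publishers and subscribers agree on the data model, not on each other.
    dds::topic::Topic<Track> topic(participant, "AirTrack");

    dds::pub::Publisher publisher(participant);
    dds::pub::DataWriter<Track> writer(publisher, topic);

    Track sample;
    sample.id("TRK-042");
    sample.latitude_deg(45.42);
    sample.longitude_deg(-75.69);
    sample.altitude_m(10500.0);
    writer.write(sample);  // delivered to every matched subscriber, live or simulated

    return 0;
}
```

A live sensor and a simulated track source publish exactly the same Topic, which is what allows simulations to drop into live exercises without translation.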
In addition to determining what data is to be exchanged between entities, it is equally important to specify how the data is to be delivered. A “data-aware” interoperability framework can apply Quality of Service (QoS) to ensure each data sample is delivered as requested. Filtering of live data is done at the framework layer, relieving the application of complexity. This ensures that only “relevant data” is delivered, reducing the burden on network bandwidth.
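As a rough sketch of framework-level filtering and QoS (reusing the illustrative Track type from the previous example), a subscriber declares what it considers relevant and how it wants that data delivered; the middleware enforces both:

```cpp
#include <dds/dds.hpp>
#include "Track.hpp"  // illustrative generated type from the earlier sketch

void subscribe_relevant_tracks(dds::domain::DomainParticipant& participant,
                               dds::topic::Topic<Track>& topic) {
    // Relevance is declared once; filtering happens in the framework,
    // not in application code on the receiving side.
    dds::topic::ContentFilteredTopic<Track> high_tracks(
        topic, "HighAltitudeTracks",
        dds::topic::Filter("altitude_m > 10000"));

    // "How" the data is delivered is expressed as QoS on the reader.
    dds::sub::qos::DataReaderQos qos;
    qos << dds::core::policy::Reliability::Reliable();

    dds::sub::Subscriber subscriber(participant);
    dds::sub::DataReader<Track> reader(subscriber, high_tracks, qos);
    // ... attach a listener or WaitSet and process samples as they arrive ...
}
```

DDS implementations can evaluate such filters on the writer side, so samples that no reader considers relevant need never leave the sender, which is where the bandwidth savings come from.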
A layered databus architecture enables a system-of-systems approach that can scale indefinitely. Each system operates independently and exchanges a large amount of data internally. Gateways ensure that only "relevant data" is shared outside the system, and they can modify that data by splitting, aggregating, or processing it in a user-defined fashion.
Since the framework is platform-agnostic, systems are easier to future-proof: upgrades, new technologies and suppliers, and evolution of the structured data are all accommodated. Dynamic discovery lets entities join or leave at any time, so the system can share, learn, and adapt. This agility is critical when moving from capability-based to threat-based systems.
With syntactic interoperability, all entities understand the data on the system. However, there is still room for ambiguity: a position could be published in meters, for example, but interpreted as kilometers. The next level of interoperability addresses this issue.
Figure 2. Syntactic Interoperability (Data-Centric)
Semantic Interoperability (Data-Centric + Domain Standards)
Semantic interoperability builds on syntactic interoperability. Domain-specific standards provide well-defined, application-specific data models with structure and meaning, removing the ambiguity of syntactic interoperability.
Domain standards open the market to multi-vendor solutions that are innovative, best-in-class, and non-proprietary. This approach lowers program cost and accelerates time to market. Furthermore, it promotes consistency, quality, and safety of products. Well-defined data models combined with data-centric interoperability standards allow for unambiguous exchange of data within and external to systems.
Figure 3. Semantic Interoperability (Data-Centric + Domain Standards)
DDS as a Core Standard for Integration and Interoperability
DDS is a data-centric, open software interoperability standard managed by the Object Management Group® (OMG®). As the leading DDS solution, RTI Connext provides a proven and productive environment for developing DDS-compliant systems from edge to cloud. It is deployed across many industry verticals, including aerospace, defense, medical, energy, and autonomous systems, and this breadth of deployment underpins its recognition as a Technology Readiness Level 9 (TRL 9) technology.
Figure 4. DDS as a Software Interoperability Standard
RTI Connext is used today in many large systems, including demanding A&D applications. As the most comprehensive DDS solution, Connext provides many benefits for the development, maintenance, and upgrade of complex distributed systems. It offers an extensive set of QoS policies that precisely control how data is shared, including reliability policies proven to work in Denied, Disrupted, Intermittent, and Limited (DDIL) environments.
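The sketch below shows the kind of writer QoS a designer might request for DDIL conditions. The specific policies and values are illustrative, not a recommendation for any particular system.

```cpp
#include <dds/dds.hpp>

// Illustrative writer QoS for intermittent links: retry until acknowledged,
// and keep a bounded history so readers that reconnect after an outage catch up.
dds::pub::qos::DataWriterQos ddil_writer_qos() {
    dds::pub::qos::DataWriterQos qos;
    qos << dds::core::policy::Reliability::Reliable()       // resend until acknowledged
        << dds::core::policy::Durability::TransientLocal()  // replay to late-joining readers
        << dds::core::policy::History::KeepLast(50);        // bound the replay buffer
    return qos;
}
```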
In addition, Connext gateways are critical when designing a large system-of-systems using a layered databus architecture. Isolating systems behind gateways scopes discovery so that only entities within a system discover each other, keeping local data local. Gateway rules allow relevant topics to be shared between systems or domains, reducing bandwidth requirements, and this data can be split, aggregated, and transposed as necessary so that different domain standards can communicate. Gateways also allow non-DDS systems, including legacy systems, to participate in a DDS environment.
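In practice this is typically done with RTI Routing Service rather than hand-written code, but the underlying pattern is easy to sketch with the plain DDS API: a bridge joins two domains, subscribes to local data, forwards only what is relevant, and can transform or aggregate it along the way. The domain numbers, topic names, and filter below are illustrative assumptions.

```cpp
#include <chrono>
#include <thread>

#include <dds/dds.hpp>
#include "Track.hpp"  // illustrative generated type from the earlier sketch

int main() {
    // Two databuses: local data stays on domain 0; only shared data crosses to domain 1.
    dds::domain::DomainParticipant local_dp(0);
    dds::domain::DomainParticipant external_dp(1);

    dds::topic::Topic<Track> local_topic(local_dp, "AirTrack");
    dds::topic::Topic<Track> shared_topic(external_dp, "SharedAirTrack");

    // Forward only what external systems need.
    dds::topic::ContentFilteredTopic<Track> relevant(
        local_topic, "RelevantTracks", dds::topic::Filter("altitude_m > 5000"));

    dds::sub::DataReader<Track> reader(dds::sub::Subscriber(local_dp), relevant);
    dds::pub::DataWriter<Track> writer(dds::pub::Publisher(external_dp), shared_topic);

    while (true) {  // a production gateway would use a WaitSet instead of polling
        for (const auto& sample : reader.take()) {
            if (sample.info().valid()) {
                // Split, aggregate, or transpose here before republishing if required.
                writer.write(sample.data());
            }
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }
}
```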
These interoperability lessons also apply to security. RTI Connext Secure, based on the OMG DDS-Security™ standard, secures data at the source for communication within and between systems. Rather than securing the transport, where all data is protected identically, DDS-Security allows each data type to be secured according to its own requirements, and it can send secured data over unsecured transports. Connext Secure bridges secure domains and enforces security policies for authentication, integrity, confidentiality, and access control.
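As a rough illustration of per-topic protection, the fragment below sketches the kind of rules a DDS-Security governance document expresses (the governance document is an XML file signed by a permissions CA; the topic names and protection choices here are illustrative and the fragment is not a complete document):

```xml
<!-- Illustrative fragment of a DDS-Security governance document -->
<topic_access_rules>
  <!-- Sensitive track data: access-controlled and encrypted -->
  <topic_rule>
    <topic_expression>AirTrack*</topic_expression>
    <enable_read_access_control>true</enable_read_access_control>
    <enable_write_access_control>true</enable_write_access_control>
    <metadata_protection_kind>SIGN</metadata_protection_kind>
    <data_protection_kind>ENCRYPT</data_protection_kind>
  </topic_rule>
  <!-- Everything else: integrity protection only -->
  <topic_rule>
    <topic_expression>*</topic_expression>
    <enable_read_access_control>false</enable_read_access_control>
    <enable_write_access_control>false</enable_write_access_control>
    <metadata_protection_kind>NONE</metadata_protection_kind>
    <data_protection_kind>SIGN</data_protection_kind>
  </topic_rule>
</topic_access_rules>
```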
Connext also offers an extensive suite of Tools that provide a data-centric view for debugging and optimizing DDS systems. Tools that give insight into a distributed system help with root-cause analysis, finding the right solution, and shortening time to market for new offerings or services.
Conclusion
JADC2 sets out to define a strategy for interoperability that allows the systems of today to participate while supporting the systems of tomorrow. Defining the set of open interoperability standards to be used will be crucial, and it aligns directly with the Modular Open Systems Approach (MOSA) requirements that are foundational to future military procurements.
To support these requirements, Connext provides a data-centric software framework that is used today in over 1,000 A&D applications, from legacy systems to systems-of-systems, including over 70 command and control (C2) platforms. Connext’s data-aware framework manages the complexities of distributed systems to help significantly reduce development, integration, and maintenance costs. Its loose coupling eases upgrades and enables interchangeability, while allowing multi-vendor legacy systems and modern systems-of-systems to interoperate and connect to other interoperability standards.
RTI Connext is the software interoperability framework for JADC2. In short, Connext enables rapid deployments of advanced capabilities with global defense partners and delivers real-time data to JADC2 command centers at the speed of relevance.
For a deep dive on Architecting for Interoperability with MOSA, please click here.
Stay tuned to this JADC2 blog series as we continue to explore the requirements, challenges and successes of building and deploying JADC2 systems. Subscribe to the RTI Blog at the top of this post to ensure you receive the latest posts.
About the author:
Tim McGuire is a Senior Field Applications Engineer at RTI. He holds a Bachelor of Computer Science from Carleton University in Ottawa, Canada. Before joining RTI, Tim worked in software development at Zeligsoft, where he designed, developed, and managed projects for modeling and code-generation tools for the Software Communications Architecture (SCA) domain, as well as tools and middleware for Data Distribution Service (DDS). Prior to transitioning to development, Tim was a Field Applications Engineer responsible for product support, training, consulting, and collaborating with sales and marketing to identify and secure new business opportunities. Tim brings over 25 years of experience in telecommunications, software design and development, consulting, and product support.