NASA engineers must consider the requirements of a particular system or subsystem in light of their impact on the requirements of related systems, and trace those intersections in order to verify that all requirements have been met.
For example, the requirement “Humans must be able to tolerate the vibrations of the rocket” must link to a Hazard Record stating that excessive vibration can leave astronauts unconscious or critically injured. Together, the requirement and the hazard form the basis of a complex mathematical model that verifies the precise calibration of acceptable vibration and thereby mitigates the hazard.
To determine the impact a slight change in acceptable vibration has on humans, and even on downstream requirements like schedule and cost, NASA engineers took weeks to pull the relevant data manually from each dataset, analyze and qualify it, and then create a report. Before any changes could be made, a review board examined the report. If anyone on the board asked a new question, or if the scope of the exercise changed in any way, the engineers had to go back to the drawing board and repeat the whole process. Every question became an entirely new project requiring its own complex assessment. The review cycle was long and costly, and NASA engineers spent much valuable time wrestling with the data. The connectedness and complexity of the data made change costlier and riskier.
Dismantling the silos between disparate systems was not a realistic option given the expense, time, and government regulations required. The NASA Human Exploration and Operations Mission Directorate (HEOMD), Exploration Systems Division (ESD) decided to implement Stardog’s Enterprise Knowledge Platform to provide the systems engineering and integration community with a unified view across the various engineering disciplines.
Stardog’s Knowledge Graph Platform allows NASA to manage, query, and analyze data at scale. The Knowledge Graph creates an abstract layer of knowledge over disparate data systems, organizing and representing information as entities (nodes) and the relationships between those entities (edges). Stardog further uses machine learning to enable predictive analytics, extracting patterns from the data in order to make intelligent predictions. This power to infer, spot patterns, and surface hidden relationships can be the engine behind risk analysis, fraud detection, recommendation systems, and much more.
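The node-and-edge idea can be sketched in a few lines. This is a minimal illustration, not NASA data or Stardog’s API: the entity names and relationship labels below are hypothetical, and the traversal simply walks every entity reachable from a starting requirement.

```python
# Minimal sketch of a knowledge graph: entities (nodes) and typed
# relationships (edges), plus a traversal that finds everything
# connected downstream of a requirement. All names are illustrative.
from collections import defaultdict, deque

edges = defaultdict(list)  # node -> [(relationship, node), ...]

def add_edge(subject, relation, obj):
    edges[subject].append((relation, obj))

# Hypothetical slice of the requirement-to-hazard-to-model chain.
add_edge("REQ-VIBRATION-01", "mitigatedBy", "HAZARD-042")
add_edge("HAZARD-042", "verifiedBy", "MATH-MODEL-7")
add_edge("MATH-MODEL-7", "impacts", "SCHEDULE-BASELINE")

def reachable(start):
    """Breadth-first traversal: every entity connected downstream of start."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for _, target in edges[node]:
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

print(sorted(reachable("REQ-VIBRATION-01")))
# ['HAZARD-042', 'MATH-MODEL-7', 'SCHEDULE-BASELINE']
```

A single traversal like this is what replaces the manual hunt through separate systems: one question about a requirement surfaces the hazard, the model, and the downstream schedule impact in one pass.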
Stardog helps find the intersection between a requirement (cluster on the left) and the math model (cluster on the right), along with all the relationships they share.
At NASA, Stardog currently supports a Knowledge Graph that knows about the Space Launch System (SLS), Orion, and Exploration Ground Systems (EGS). It supports systems engineering tasks such as requirements closure, verification, and validation, as well as gap analysis and reporting on various data products.
Dependency analysis in Pelorus, showing the relationships among model records. Each relationship highlights who needs to attend a review to assess downstream impact.
In addition, NASA uses Pelorus, a Stardog-based faceted data navigation tool. Pelorus uses faceted navigation to provide quick browsing across 36 data facets, or types. With Pelorus, a NASA engineer can find a requirement verification, its associated hazard, its intersection with a math model, and the associated monitoring condition in seconds, without needing to write a single query, navigate silos manually, run exports, or integrate data by hand.
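The mechanics of faceted navigation are simple to sketch: each record carries typed facet values, and selecting values narrows the result set with no query language involved. The records and facet names below are hypothetical examples, not Pelorus internals:

```python
# Sketch of faceted navigation: each record carries facet values, and
# selecting facets narrows the result set without writing a query.
# Records and facet names are hypothetical illustrations.

records = [
    {"type": "Requirement", "program": "SLS",   "status": "Open",   "id": "REQ-001"},
    {"type": "Hazard",      "program": "Orion", "status": "Closed", "id": "HAZ-017"},
    {"type": "Requirement", "program": "Orion", "status": "Open",   "id": "REQ-214"},
]

def browse(records, **selected_facets):
    """Return records matching every selected facet value."""
    return [r for r in records
            if all(r.get(facet) == value
                   for facet, value in selected_facets.items())]

hits = browse(records, type="Requirement", status="Open")
print([r["id"] for r in hits])  # ['REQ-001', 'REQ-214']
```

Each click on a facet value in a tool like this is just another keyword argument, which is why an engineer can drill from 36 facets down to a handful of records in seconds.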
Stardog’s Knowledge Graph goes beyond highlighting associations between datasets and within the data; it brings robust features to power NASA’s Deep Space Transportation System:
- Data virtualization speeds time to insight: Stardog’s virtual graph and rich data services provide access to data without changing existing NASA processes or its ability to select best-of-breed tools.
- Reusable data modeling creates insight: Stardog uses declarative graph modeling and rules to knit together data relationships and overcome limits in the source systems. The modeling includes Stardog Rules for describing and optimizing specific relationships in the graph. Declarative graph modeling also enables traceability of individual data sources so that data pedigrees can be compared and verified.
- Data quality and assurance without writing code: Integrity Constraint Validation is built into the platform and flags potential errors and gaps in the Knowledge Graph, sometimes highlighting problems within a specific data source.
- AI and ML create contextual knowledge and data lineage: Stardog combines the power of graphs with a knowledge toolkit that fuses machine learning, reasoning, and rules for contextualized insight into all the data. SPARQL and PATH queries support a broad range of graph queries and analysis for complex reports.
- NASA-grade operations workflow: Stardog is fully integrated into the NASA operations workflow, with backup, restore, and single sign-on capability, in conjunction with Pelorus.
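The declarative-rule idea above can be illustrated with a toy inference step: a rule derives new relationships from existing ones, so traceability edges never have to be stored by hand. This is a conceptual sketch in plain Python, not Stardog Rules syntax, and the relationship names are hypothetical:

```python
# Sketch of rule-based inference over a triple store: if a requirement
# is mitigated by a hazard, and that hazard is verified by a model,
# derive a direct "tracedTo" edge from requirement to model.
# Names are illustrative, not Stardog Rules syntax.

triples = {
    ("REQ-VIBRATION-01", "mitigatedBy", "HAZARD-042"),
    ("HAZARD-042", "verifiedBy", "MATH-MODEL-7"),
}

def apply_rule(triples):
    """Rule: REQ mitigatedBy HAZ, HAZ verifiedBy MODEL => REQ tracedTo MODEL."""
    derived = set()
    for s1, p1, o1 in triples:
        if p1 != "mitigatedBy":
            continue
        for s2, p2, o2 in triples:
            if s2 == o1 and p2 == "verifiedBy":
                derived.add((s1, "tracedTo", o2))
    return derived

print(apply_rule(triples))
# {('REQ-VIBRATION-01', 'tracedTo', 'MATH-MODEL-7')}
```

Because the rule is declarative, the derived edge stays correct as the underlying data changes, which is what makes the traceability reusable rather than a one-off report.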
Since implementing Stardog, NASA engineers have saved countless hours assembling the answers they need from interconnected data. What took weeks to compile now takes seconds, or minutes at most. NASA engineers can ask new questions, explore data, and examine the connectedness of data and the impact of changes throughout the mission, from the earliest flight modeling stages to launch and beyond. With Stardog providing complete traceability of data, issues within data sources come to light that would have been impossible to spot manually.
According to NASA, the next tangible frontier for human exploration is an achievable goal. At Stardog, we are proud to support the most ambitious goals to extend human presence into deep space.