Reliable, Repeatable, and Robust
GNO-SYS is founded on the principle of “operationalizing” geospatial technology. Our objective is to simplify geospatial infrastructure for space companies and others, so that engineers, scientists, and technicians can collaborate to create products and solutions in a reliable, repeatable, and robust system.
Need for Operationalization of Geospatial Technology
The pace of designing, building, and launching a remote sensing satellite is fast and accelerating. And while getting a satellite into space is an incredible achievement, the hardware by itself does not create a product. An often-overlooked or underestimated component of a space-based sensor is the robust, scalable downstream processing system that enables the use and exploitation of the signals the sensor records. If you are on the receiving end of a satellite raw-data downlink, you’re probably thinking about:
- How do I manage and store this massive stream of raw data and metadata?
- How do I process these data sets in time for my customers?
- Do I understand the workflow, and do I have provenance and traceability for how each product was created? Can I repeat the process a year from now?
- Do I have quality metrics? How do I know if the product is good?
- How do I run analytics (such as change detection or feature extraction) on all of this data for my customers?
- How do I minimize the costs of all of this?
To address these considerations, companies need to focus on “operationalization” of the geospatial technology stack. This is frequently a challenge for companies that launch spacecraft and operate satellite constellations.
The same considerations apply to scaling airborne or terrestrial sensors, such as mobile LiDAR mappers.
What are the Challenges?
Most companies have multiple stakeholders within a single organization who are interested in creating data products. In other words, many users and departments need to get their hands on the data, and these teams may not always agree on the types of products to be built or on the relative importance of each. The science team, for example, is eager to get access to the data to develop new algorithms, models, and capabilities. They are asking questions like:
- How can I tinker with and test new algorithms in Python with the data?
- How can I scale my machine learning algorithms in the cloud?
- How can I turn my code into an automated, production-ready process?
The development team may want to access the data to create new APIs and applications. They are asking questions like:
- How can we search and find datasets?
- How do I set up an API to run analytics on the data?
- How can I scale these analytics to keep up with the customer needs?
Without a common architecture, it’s difficult to create a framework for the company to work together as a cohesive team. But with a common, well-designed architecture, all stakeholders and users get the access and support they need to build their products and solutions.
And that’s exactly what we specialize in!
How we do it
GNO-SYS develops customized, high-end software architectures for the optimal operationalization of geospatial data for aerospace and other companies. We do this with the following mindset:
- Use open-source standards, libraries, and tools. This also means there are no expensive licensing costs as the business scales.
- Leverage the latest scalable data-processing systems offered by cloud providers. This ensures the system can scale and allows for high levels of automation.
- Don’t overcomplicate it, and don’t optimize too early. Each product is unique, and it’s rare that two satellite companies are doing the same thing; focus on the most meaningful parts.
- Rules and processes are as important as the technology. People need to work hand in hand with the tools.
- Create unique IP that is 100% owned by our customers. We are not a product company; our goal is to help businesses accelerate and maximize their value.
Each company we work with is unique, yet in our experience, there is a common set of core infrastructure components in all systems.
The combination of these three components creates the framework for all things downstream:
- Data Store – A single repository to manage your geospatial data, including spatial/temporal indexing, searching, and catalogs. Data is stored on simple file systems in cloud-optimized formats with detailed metadata and provenance.
- Analysis Hub – An auto-scaling, cloud-based portal for developers, scientists, and GIS experts. The portal allows users to develop and access libraries and tools to examine, create, and experiment with datasets.
- Automated Workflows – Workflows created from the same tools and libraries to automate processing. These can range from processing raw payloads to machine learning and event-triggered tasks.
These three components form the foundation of a scalable, repeatable, and robust geospatial data management and processing system. With this foundation in place, we can begin to automate and customize workflows to suit science teams, development teams, and management.
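To make the Data Store and Analysis Hub ideas concrete, here is a minimal sketch of how a scientist might search a spatially and temporally indexed catalog and read just a small window of a cloud-optimized GeoTIFF. The catalog URL, collection name, and asset key are hypothetical placeholders, and the open-source pystac-client and rasterio libraries stand in for whatever cataloging and raster tooling a given system actually uses.

```python
# Minimal sketch: search a STAC-style catalog, then read a small window
# from a cloud-optimized GeoTIFF. Endpoint, collection, bbox, and asset
# key below are hypothetical placeholders.
from pystac_client import Client
import rasterio
from rasterio.windows import Window

# Spatial/temporal search against the catalog (the Data Store).
catalog = Client.open("https://stac.example.com")      # hypothetical endpoint
search = catalog.search(
    collections=["sar-l1"],                             # hypothetical collection
    bbox=[-123.3, 49.1, -122.9, 49.4],                  # area of interest (lon/lat)
    datetime="2024-01-01/2024-06-30",
)

# Cloud-optimized formats let us read only the window we need,
# without downloading whole scenes (Analysis Hub experimentation).
for item in search.items():
    cog_href = item.assets["data"].href                 # asset key depends on the catalog
    with rasterio.open(cog_href) as src:
        chip = src.read(1, window=Window(0, 0, 512, 512))
        print(item.id, chip.shape, item.properties.get("datetime"))
```

Because the files are cloud-optimized, only the requested windows are transferred over the network, which is what keeps this kind of interactive experimentation fast and inexpensive at scale.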
Putting it together
Our goal is an end-to-end system for collaboration and the creation of data products. For a space company, the scenario might look like this:
- Raw data is downlinked, then automatically cataloged, quality checked, and processed into standardized data with metrics by the operations team.
- In the meantime, the R&D team trains new machine learning models in Python in the online cloud environment, with direct access to the new data and on-demand, auto-scaling resources.
- The resultant model is published into a workflow and is automatically run against all the data in a customer’s region.
- The developers can deploy new algorithms in a serverless environment and expose these analytics as APIs for customers (a minimal sketch of this pattern follows below).
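To illustrate that last step, the sketch below shows one way a serverless analytics API could look, assuming an AWS Lambda-style handler behind an HTTP endpoint. The scene URI, band order, and the NDVI analytic are hypothetical stand-ins for a customer's own data and algorithms.

```python
# Minimal sketch of a serverless analytics endpoint (Lambda-style handler
# behind an HTTP API). The default scene URI, band order, and the NDVI
# analytic are hypothetical examples, not a specific customer's pipeline.
import json

import numpy as np
import rasterio


def handler(event, context):
    # The API gateway passes the scene of interest as a query parameter.
    params = event.get("queryStringParameters") or {}
    scene_uri = params.get("scene", "s3://example-bucket/scenes/scene-001.tif")

    with rasterio.open(scene_uri) as src:
        red = src.read(3).astype("float32")   # band order is sensor-specific
        nir = src.read(4).astype("float32")

    # Simple analytic: mean NDVI over the scene.
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    result = {"scene": scene_uri, "mean_ndvi": float(np.nanmean(ndvi))}

    return {"statusCode": 200, "body": json.dumps(result)}
```

Because the handler is stateless, the cloud provider can run as many copies as needed, so the analytics scale automatically with customer demand.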
That’s what we call “Operationalization”.