We grow businesses in the Signal Garden

Signal Garden is a technology developer with deep expertise in signal processing.

From the sensors in your mobile phone to the billions of connected sensors in the world, we fuse sensor data and build software for you that runs efficiently on small, low-power, low-compute devices.

As a virtual business accelerator, we build technology and tools to share with our Seeds.


Signal Garden offers expertise in Computer Vision, Augmented Reality, Artificial Intelligence, Custom Algorithms, Embedded Systems, and hardware solutions designed to work for you.


Electrolyte Engine™️ is a powerful cross-platform library of production-ready tools designed to provide a smooth workflow from prototyping to publishing.


Seeding opportunities are infinitely flexible and scalable. We even share technology and developers. Join a small ecosystem of complementary businesses.

Signal Garden manages data for the Department of Homeland Security Science and Technology Directorate (DHS S&T) and National Aeronautics and Space Administration/Jet Propulsion Laboratory (NASA/JPL) project "Giving First Responders a Hand in Saving Lives."

Watch the video about AUDREY

The Assistant for Understanding Data through Reasoning, Extraction and sYnthesis, or AUDREY for short, is a software application that performs data fusion and provides tailored situational awareness to first responders.

Read the article "The Future of Artificial Intelligence in Firefighting" in Fire Engineering Magazine (October 25, 2018)

Google Augmented Reality Technology Case Study

The world’s first ARCore app published on the Play Store is Atom Visualizer from Signal Garden, developed in partnership with the National Science Foundation and the University of New Mexico. Google also recently handed the Tango Developer Community on Google+ to Signal Garden.

ARCore is built upon Google Tango, the first consumer hardware platform for mobile Augmented Reality at Android scale, targeting 100 million devices. ARCore uses computer vision and sensor fusion to understand the world around the device without other external signals.
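To give a flavor of what sensor fusion means in practice, here is a minimal, illustrative sketch of a complementary filter, a standard textbook technique that blends a gyroscope's integrated angle (responsive but drifting) with an accelerometer's tilt estimate (noisy but drift-free). This is not ARCore's actual algorithm, which fuses camera features with inertial data in a far more sophisticated pipeline; the function name and simulated readings below are purely hypothetical.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend gyro integration with an accelerometer tilt estimate.

    alpha close to 1 trusts the responsive gyro in the short term,
    while the small (1 - alpha) share of the accelerometer reading
    slowly corrects the gyro's drift over time.
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle


# Hypothetical readings: a device rotating at a steady 10 degrees/second,
# sampled every 10 ms, with the accelerometer tracking the true tilt.
angle = 0.0
for step in range(100):
    t = step * 0.01
    angle = complementary_filter(angle, gyro_rate=10.0,
                                 accel_angle=10.0 * t, dt=0.01)

# After one second the fused estimate settles near the true 10-degree tilt.
```

Even this toy version shows the core idea: no single sensor is trustworthy on its own, but cheap sensors combined carefully yield a stable estimate at very low compute cost.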

Collaborative Research with University of Copenhagen Faculty of Humanities / Technology Enhanced Vision in Blind and Visually Impaired Individuals

Sensor fusion and depth sensing continue to be a burgeoning industry, making this technology accessible enough that blind and visually impaired people can readily buy it and use it as part of their everyday lives. The technology and techniques in SensoryFusion could also be applied to other domains, including autonomous robotic navigation, to solve real-time contextual pathfinding and obstacle avoidance on a low-power device. This report shows some of the potential of the technology.

Read our report summary.