
Made-in-B.C. tech headed for International Space Station next year

Metaspectral’s computer vision software can be used to quickly identify everything from forest fires to oil spills
Vancouver-based Metaspectral's technology is set to be deployed on the International Space Station for six months, beginning early next year | Getty Images / Moment / Roberto Machado Noa

Brilliant, multi-galaxy images from the US$10-billion James Webb Space Telescope have been enchanting the globe over the past week. Early next year, though, computer vision technology developed in B.C. will tap similar advances to help examine the Earth from its perch on the International Space Station (ISS).

The ISS National Laboratory has given the nod to MLVX Technologies Inc. (Metaspectral) to deploy a payload aboard the space station to help decipher what’s known as hyperspectral data. Hyperspectral imaging entails capturing light across hundreds of narrow bands spanning a much broader spectrum than the red, green and blue captured by conventional cameras.
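To put that difference in concrete terms, here is a minimal Python sketch contrasting a conventional RGB frame with a hyperspectral data cube. The 300-band count follows Doumet’s description later in this article; the resolution and data types are illustrative assumptions, not Metaspectral’s actual formats.

```python
import numpy as np

# A conventional camera frame: three broad bands (red, green, blue).
rgb_frame = np.zeros((1024, 1024, 3), dtype=np.uint8)

# A hyperspectral "cube" of the same scene: 300 narrow bands per pixel
# (band count per Doumet; resolution and bit depth are assumptions).
hyperspectral_cube = np.zeros((1024, 1024, 300), dtype=np.uint16)

# Each ground pixel now carries a full spectrum instead of three numbers,
# which is what lets software tell apart materials that look identical in RGB.
pixel_spectrum = hyperspectral_cube[512, 512, :]  # shape: (300,)

print(f"RGB frame: {rgb_frame.nbytes / 1e6:.0f} MB")
print(f"Hyperspectral cube: {hyperspectral_cube.nbytes / 1e6:.0f} MB")
```

The roughly two-hundred-fold jump in data volume per scene is exactly why compressing and streaming this imagery from orbit is the hard part of the problem.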

It will be up to orbiting satellites to capture hyperspectral imagery and data targeting Earth – the same type of imagery the James Webb telescope has been capturing of the universe. Meanwhile, Vancouver-based Metaspectral’s technology will work to compress, stream and analyze the incoming data from low Earth orbit (the region of space relatively close to the planet’s surface).

The goal is to make it dramatically faster to share useful hyperspectral data with users on the ground. Such data could be used to rapidly identify forest fires, methane leaks, oil spills and other events, according to the company.

Metaspectral CEO Francis Doumet said in a statement the project should be able to produce “actionable insights” in 15 minutes or less.

"Our technology makes it possible to bypass bandwidth constraints with our advances in data compression and machine learning," he said.

The tech will be delivered via an upcoming SpaceX mission and will operate on the ISS for six months.

Metaspectral is working on this project – known as Onboard Programmable Technology for Image Classification and Analysis (OPTICA) – with Miami’s HySpeed Computing LLC. The Canadian company is supplying the hardware and software, while the American company is providing the data-processing infrastructure. 

Computer vision software helps computers see and perceive the world as humans do.

While many computer vision applications rely on conventional pictures to correctly identify images of dogs or trees, Metaspectral taps high-end, hyperspectral sensors to collect mountains of data that its software can then analyze pixel by pixel in real time.

“Conventional computer vision … basically [sees] what humans can see. But there’s a lot that goes beyond the human eye that conventional cameras cannot capture. And that's where our technology ventures into,” Doumet told BIV last month, adding that those high-end hyperspectral sensors collect 300 frequencies of light compared with the three frequencies of light captured by typical cameras.

“That allows us to find defects, characteristics, and identify materials that conventional cameras, and therefore conventional computer vision, simply cannot find.”

For example, data interpreted by Metaspectral’s software is so precise that a major plastics recycler is working with the B.C. company to visually identify the different types of plastic entering its plants based on their material composition.
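As a rough illustration of what per-pixel material identification might look like, here is a hedged Python sketch; the classifier, class labels and data are all placeholders rather than Metaspectral’s actual models.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hedged sketch: classify every pixel's spectrum independently.
# A random forest is a stand-in; Metaspectral's models are not public.
h, w, bands = 128, 128, 300
cube = np.random.rand(h, w, bands)  # placeholder hyperspectral scene

# Suppose we have labelled example spectra for a few material classes,
# e.g. 0 = PET plastic, 1 = HDPE plastic, 2 = background (hypothetical).
train_spectra = np.random.rand(300, bands)
train_labels = np.random.randint(0, 3, size=300)

clf = RandomForestClassifier(n_estimators=50).fit(train_spectra, train_labels)

# Flatten to (pixels, bands), predict, then restore the image shape:
# the result is a per-pixel map of inferred materials.
material_map = clf.predict(cube.reshape(-1, bands)).reshape(h, w)
```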

And last month, Metaspectral revealed the Canadian Space Agency (CSA) was providing it with $150,000 in funding for an initiative aimed at measuring carbon dioxide (CO2) levels on Earth by examining hyperspectral images and data captured by orbiting satellites.

The data collected for the CSA-backed venture comes from across the electromagnetic spectrum, allowing Metaspectral’s software to identify and measure CO2. This means measuring how frequencies of light invisible to the human eye reflect back to the sensors, then comparing those readings with previous measurements.
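A heavily simplified Python sketch of that “compare reflected light against a baseline” idea follows. Real CO2 retrievals model atmospheric absorption far more rigorously (CO2 absorbs strongly near 1.6 and 2.0 micrometres); the band windows, spectra and numbers below are all illustrative assumptions.

```python
import numpy as np

# 300 bands spanning the visible through shortwave infrared (assumed range).
wavelengths = np.linspace(400, 2500, 300)  # nm
co2_band = (wavelengths > 1990) & (wavelengths < 2060)   # a CO2 absorption window
continuum = (wavelengths > 1900) & (wavelengths < 1990)  # nearby reference window

def absorption_depth(spectrum):
    """Deeper absorption in the CO2 window relative to the continuum suggests
    more CO2 in the light path between the sun, the ground and the sensor."""
    return 1.0 - spectrum[co2_band].mean() / spectrum[continuum].mean()

baseline = np.random.rand(300) * 0.5 + 0.5  # placeholder earlier measurement
current = baseline.copy()
current[co2_band] *= 0.95                   # simulate slightly stronger absorption

change = absorption_depth(current) - absorption_depth(baseline)
print(f"change in CO2 absorption depth vs. baseline: {change:+.3f}")
```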

The company said it’s able to measure this to within a three per cent margin of error.

“One of the key problems today with the whole carbon credit system is that farmers in general just have a hard time kind of defining exactly what their impact is,” Doumet said in June.

“And so if we were able to take out the guessing game, at least when it comes to accurately measuring the carbon sequestration potential of … a farmer's field, then that empowers them to kind of plug into the carbon credit system more efficiently.”

[email protected]

twitter.com/reporton