The search for Flight 370

[Image: Flight 370 – courtesy of Wayan Vota (https://www.flickr.com/photos/dcmetroblogger/)]

As the search for missing Malaysia Airlines Flight 370 approaches its fifth week, the reliance on geospatial technology, and on the skills needed to analyse large volumes of data, is becoming increasingly clear. In this post we will look at some of the geospatial technology and techniques that have been used in the search for Flight 370.

Background

Flight 370 disappeared on the 8th of March 2014, having left Kuala Lumpur en route for Beijing. Communications were lost somewhere over the Gulf of Thailand and there was simply no trace of the aircraft. Speculation quickly grew as to its fate, with hijack and rogue pilots being mooted as possible explanations. A catastrophic break-up of the aircraft through an explosion was not ruled out, but looked unlikely as this would generally have been noticed. Furthermore, there was no sign of debris in the area of Flight 370's last known position.

Data feeds and extrapolation

After a few days, data started turning up that suggested the plane had stayed aloft for several hours after all communication was lost. Equipment on board transmits information such as status updates and diagnostics, which allows engineering teams to monitor the health and performance of components while they are in use.

The engines had sent bursts of data every hour, and these had been picked up by a satellite operated by Inmarsat. By monitoring the Doppler effect in the received data, Inmarsat was able to chart two possible paths: one to the north and the other to the south. This had never been done before, and the innovative use of this data by Inmarsat allowed the rescue effort to be concentrated in two distinct corridors.
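To make the Doppler idea concrete, here is a minimal sketch (in Python) of the relationship involved: the frequency offset of each hourly handshake depends on the component of the aircraft's velocity along the line of sight to the satellite. The carrier frequency, ground speed and track angle below are illustrative assumptions, not actual Inmarsat or MH370 values.

```python
import math

# Illustrative constants only, not actual Inmarsat/MH370 values.
C = 299_792_458.0      # speed of light, m/s
F_CARRIER = 1.6e9      # assumed L-band carrier frequency, Hz

def doppler_shift_hz(radial_velocity_ms: float) -> float:
    """Frequency offset for a given line-of-sight velocity.
    Positive radial velocity means the aircraft is closing on the satellite."""
    return F_CARRIER * radial_velocity_ms / C

# Only the velocity component along the line of sight to the satellite matters,
# so a track at 60 degrees to that line contributes the cosine of the angle.
ground_speed_ms = 250.0                      # assumed ground speed
angle_to_satellite = math.radians(60.0)      # assumed angle between track and line of sight
radial_velocity = ground_speed_ms * math.cos(angle_to_satellite)

print(f"radial velocity: {radial_velocity:.1f} m/s")
print(f"Doppler offset:  {doppler_shift_hz(radial_velocity):.1f} Hz")
```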

After a bit of tweaking and refining, the Inmarsat scientists were able to discount the northern corridor and the search re-focused on the southern corridor, a vast expanse of ocean west of Australia with no suitable landing site. How they achieved this was really quite clever. They used "truthing data" from other aircraft to monitor the Doppler effect and so refine their estimates for Flight 370. They then calculated the speed and altitude of the aircraft and were able to work out roughly where it would have run out of fuel and ditched into the ocean. This greatly reduced the search area.
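For a sense of how a fuel-exhaustion point can be estimated from speed and endurance, the sketch below projects an assumed distance flown along a bearing from a last known position using the standard great-circle forward formula. The start point, speed, endurance and bearing are all invented for illustration; they are not figures from the actual investigation.

```python
import math

EARTH_RADIUS_KM = 6371.0

def destination(lat_deg, lon_deg, bearing_deg, distance_km):
    """Great-circle forward problem: the point reached after travelling
    distance_km on an initial bearing from (lat_deg, lon_deg)."""
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    d = distance_km / EARTH_RADIUS_KM
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Invented figures: ~833 km/h for 5 hours on a roughly southerly heading,
# starting near the last known position over the Gulf of Thailand.
speed_kmh, hours, bearing_deg = 833.0, 5.0, 190.0
last_known_lat, last_known_lon = 6.9, 103.6

end_lat, end_lon = destination(last_known_lat, last_known_lon,
                               bearing_deg, speed_kmh * hours)
print(f"estimated fuel-exhaustion point: {end_lat:.1f}, {end_lon:.1f}")
```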

Satellite analysis

The search area had been focused to a small section of ocean (ok, so small in this case means the size of Western Europe, but given the size of the southern Indian Ocean this can be considered small). It was now feasible to start analysing aerial imagery to try and identify debris (given that there was nowhere for the plane to land, on the 24th March Malaysian officials announced that it was beyond reasonable doubt that the plane was lost after ditching in the southern Indian Ocean). Trawling around to find out which satellites were used was harder than I thought it would be. Below is a summary of what I found:

  • GAOFEN-1 – a high-resolution optical satellite run by the CNSA (China National Space Administration), launched in April 2013. Gaofen-1 is equipped with a 2 metre resolution CCD (Charge-Coupled Device), an 8 metre resolution multi-spectral scanner and a 16 metre resolution wide-field multi-spectral imager. It is difficult to tell which sensor produced the image below, but from the resolution it looks like it was the 8 metre multi-spectral scanner (see the rough pixel-coverage calculation after this list).
[Image: Chinese satellite image of possible debris – Pic from The Guardian/Reuters]

  • A French satellite operated by Airbus Defence and Space spotted 122 objects in a cluster, with objects up to 23 m in length (image released by MOSTI). Airbus Defence and Space operates a host of satellites through its Astrium division, including EnviSAT, CryoSAT, Copernicus, ELISA and Helios 2.
[Image: Airbus Defence image of the debris cluster]

  • Australian AP-3C Orion – Orion aircraft were deployed to likely search areas and scanned them. It is likely that the crews were using a combination of electronic surveillance systems and simply their own eyes. This might seem old school, but it is an effective method of verification, as trained operators can discount or confirm sightings from remote sensing. The aircraft has a long range and can fly low, making it ideal for searching.
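As a rough check on what those resolutions mean for debris spotting, the snippet below works out how many pixels an object of a given size would span at each quoted resolution. The 23 m object length is taken from the figures above; the exercise is purely back-of-the-envelope.

```python
def pixels_spanned(object_length_m: float, ground_resolution_m: float) -> float:
    """Approximate number of pixels an object covers along one axis."""
    return object_length_m / ground_resolution_m

debris_length_m = 23.0   # largest object reported in the Airbus cluster

for sensor, resolution_m in [("Gaofen-1 CCD (2 m)", 2.0),
                             ("Gaofen-1 multi-spectral (8 m)", 8.0),
                             ("Gaofen-1 wide-field (16 m)", 16.0)]:
    pixels = pixels_spanned(debris_length_m, resolution_m)
    print(f"{sensor:>32}: ~{pixels:.1f} pixels")
```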

Ocean Currents

Why has it taken so long to refine the search area? Well, there are lots of satellites, but only a few of them would have had suitable sensors on board. Data is collected and beamed back to a receiving centre, and the raw data will most probably have to be processed before it can be used for anything. This takes time. The search area may well have been narrowed to a chunk of the southern Indian Ocean, but this still represents a huge area, not dissimilar to the size of Western Europe. Processing and analysing data for such a large area is not easy and will rely on a degree of automation followed by human verification.

The southern Indian Ocean is a wild place with frequent storms. We can see from the above that optical sensors have been used, and these are unable to penetrate cloud cover. Scientists would have to wait for the satellite to pass over the same area again to try and get a better, cloud-free image. The repeat cycle may be anything from 1 day to 10 days or more.

Then you add in the ocean currents. Any object floating in the ocean will not be static and could drift by tens of kilometres a day. Given that the plane is likely to have crashed 15 days previously, debris could be hundreds of kilometres from the crash site. That is, if it has not already broken up and sunk. But we can at least model the ocean currents and estimate the potential dispersal of the debris. The NY Times has some excellent visualisations of both the currents and the wave heights in the southern Indian Ocean during March. These have been produced by the National Oceanic and Atmospheric Administration and the National Centers for Environmental Prediction using remote sensing data, in-situ data (buoys) and models. While never 100% accurate, they provide an indication and convey the uncertainty involved in determining a search area.
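A very simplified illustration of drift modelling is sketched below: a particle is stepped forward through a current field by Euler integration. The current_at() function is an invented placeholder; a real model would interpolate gridded NOAA/NCEP current and wind data rather than the made-up field used here.

```python
import math

def current_at(lat, lon, day):
    """Placeholder current field returning (east, north) velocity in m/s.
    A real drift model would interpolate gridded ocean-current data here."""
    speed_ms = 0.5
    direction = math.radians(90 + 20 * math.sin(lat) + 10 * math.cos(day))
    return speed_ms * math.cos(direction), speed_ms * math.sin(direction)

def drift(lat, lon, days, steps_per_day=24):
    """Euler-integrate a floating object's position over a number of days."""
    dt = 86400.0 / steps_per_day
    for step in range(days * steps_per_day):
        east, north = current_at(lat, lon, step / steps_per_day)
        # Convert metres moved into degrees of latitude and longitude.
        lat += (north * dt) / 111_000.0
        lon += (east * dt) / (111_000.0 * math.cos(math.radians(lat)))
    return lat, lon

# An invented starting point in the southern Indian Ocean, drifting for 15 days.
print(drift(-30.0, 90.0, days=15))
```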

Locating flight recorders

Once a search area has been identified, the searchers are able to deploy listening devices to locate the "pings" emitted by Flight 370's black box. This is achieved by towing a listening device (the TPL-25) back and forth across a wide area. Pings should be received periodically, and the position and strength of these detections should triangulate the position of the black box. But the sea floor is not flat in this area: it is around 4,500 m deep, with mountains rising up to 2,500 m. We actually know very little about remote ocean sea beds. We have limited data collected by ships, and most representations come from spaceborne remote sensing data. These are not very accurate and may "miss" large structures (1-2 km high) such as seamounts. There is a nice overview of ocean mapping on the BBC website.
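As a crude illustration of how repeated detections can narrow a search box, the snippet below takes a handful of invented ping positions and relative signal strengths and computes a weighted centroid. Real acoustic localisation is far more sophisticated; this is only meant to convey the idea that each additional detection tightens the estimate.

```python
# Invented detection positions (lat, lon) and relative signal strengths.
detections = [
    (-25.10, 101.20, 0.4),
    (-25.05, 101.35, 0.9),
    (-25.00, 101.30, 0.7),
    (-24.95, 101.40, 0.6),
]

total_weight = sum(w for _, _, w in detections)
est_lat = sum(lat * w for lat, _, w in detections) / total_weight
est_lon = sum(lon * w for _, lon, w in detections) / total_weight

print(f"crude position estimate: {est_lat:.2f}, {est_lon:.2f}")
```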

The difficulties of retrieving debris from deep, remote oceans were highlighted by the search for Air France Flight 447. In that case, both black box transmitters failed.

A Chinese ship detected a ping on the 5th of April, and a day later an Australian ship detected one too, but the pings were quite far apart. The Australian ship's detection seemed more consistent and stronger, and this was backed up by further detections in the same area on the 8th. It is a slow process, but each detection should help reduce the uncertainty. The question is, will the batteries powering the pingers last much longer? They are already at the limit of what is expected, so time is running out.

Remote Sensing Critical

It is clear that remote sensing technology has been critical at every stage of the search for Flight 370, and it will continue to be so until the plane is found. It has been used effectively to narrow search areas and discount blind alleys. It is also interesting to note how associated data has been used in ways it was never intended in order to locate the plane, and praise should be given to the Inmarsat scientists who came up with a novel solution when data and information were scarce.

Articles:

  • The search for Malaysia Airlines Flight 370 – a great article in the New York Times that focuses on the remote sensing data being used now that search teams have identified a "likely" crash site in the southern Indian Ocean.
  • Wikipedia – a growing resource for information about Flight 370
  • Wikipedia – Air France Flight 447
  • NY Times – nice collection of visualisations of ocean conditions in the search area

FOSS4G 2013 – 5 reasons you should attend

FOSS4G is the annual conference for anyone interested in Free and Open Source Software for Geospatial. FOSS4G 2013 will be held in Nottingham between the 17th and 21st of September. So what makes FOSS4G so important, and why should you attend?

  1. Network – FOSS4G is the biggest gathering of developers and users of open geospatial software.  There will be over 700 people at the conference. This includes the lead developers on some of the larger open source projects such as OpenLayers and QGIS.
  2. Learn – You’ll learn a lot in a very short period of time. Whatever your level of knowledge of open source geo, from beginner to expert coder/developer, you will learn something new at FOSS4G. There are workshops for all levels that you can sign up to.
  3. Inspiration – You will be inspired by some of the major names in GIS and data analysis. The list of keynote speakers includes Paul Ramsey (co-founder of PostGIS), Kate Chapman (Acting Director of the Humanitarian OpenStreetMap Team) and Ben Hennig (Worldmapper Project). For a full list of keynote speakers, please refer to the FOSS4G keynote page.
  4. Double the fun – Visit AGI GeoCommunity’13 at the same time. Yes, that’s right FOSS4G and AGI GeoCommunity are happening in the same venue on the same week. This was no accident. GeoCommunity is a great event and the FOSS4G organisers wanted to bring the two audiences together. GeoCommunity’13 runs from the 16th to the 18th September.
  5. Can you afford to miss it? – What does this mean? Well, the conference package is quite reasonable given the number and diversity of talks on offer: £165 for a day pass or £435 for the whole event (3 days and the FOSS4G Gala Night). FOSS4G was last in Europe back in 2010 and it might not return until 2017 as it moves between continents. So, if you are based in Europe, attending FOSS4G may not be this easy again for a number of years.

So, there are 5 pretty good reasons to attend, and I am sure there are many others. To find out everything that will be going on at FOSS4G, please look at the conference website and follow the event on Twitter through the #FOSS4G hashtag.

FOSS4G 2013 takes place between the 17th – 21st September 2013 and will be held at the East Midlands Conference Centre, which is situated on The University of Nottingham campus.