  • (gentle music)

  • - [Keith] Okay, hello everybody. I'm Keith Krause.

  • I'm also with the Airborne Observation Platform team.

  • I'm going to take what Tristan did,

  • and go a little more advanced.

  • I'm going to keep this presentation a little shorter than

  • the number of slides I have. So, fortunately,

  • Tristan covered several of my slides already,

  • and I'm going to try to introduce a little more

  • LiDAR theory, because that context will

  • hopefully make sense of why our

  • waveform data looks the way it does.

  • I'm also going to talk a little bit more about,

  • and take questions on, target detection

  • and that sort of thing.

  • Some of that's going to come up a little more

  • in the waveform.

  • Tristan already kind of showed you Discrete LiDAR.

  • You're essentially finding returns or objects.

  • You get geolocations. That could be X, Y, Z

  • on a map. Intensity, other attributes.

  • That's great. But the hope is, with full waveform LiDAR,

  • you're actually measuring that entire signal

  • as a function of time.

  • The hope is that you can do more with that data.

  • We'll talk a little more about that.

  • One of the challenges in the past is that

  • full waveform LiDAR data just hasn't been

  • available to people.

  • There's a handful of groups that are working

  • with it right now, but you don't typically

  • see lots of papers or presentations on the subject.

  • We're hoping to change that.

  • At the current moment, we have waveform LiDAR products

  • from some of our 2013 and 2014 flights.

  • Those are available by request.

  • Unfortunately, we weren't able to collect any

  • waveform data last year, due to some instrument

  • hardware issues.

  • But we've been collecting it this year, in 2016,

  • and we're currently processing that data,

  • so it should be becoming available, hopefully,

  • within the next couple of weeks.

  • As I mentioned, we hope more people get involved

  • with waveform LiDAR.

  • So, these are just more graphical representations

  • of what you've already seen from Tristan.

  • But I think the big thing in terms of waveform LiDAR

  • and what I'm going to talk about, is to keep in mind,

  • once again, you have this outgoing laser pulse,

  • some time goes by, and then you're able to record

  • some reflection of light as a function of time.

  • We're going to keep coming back to this change in time.

  • This is another form of one of the figures Tristan showed,

  • but from Texas A&M and Dr. Sorin Popescu.

  • We're going to zoom a little more into this plot here

  • in a second.

  • Once again, it's a 2D beam that's interacting

  • with objects as a function of time.

  • Keep in mind with LiDAR, time is distance.

  • Distance is time.

  • So, you have discrete return and full waveform.

  • With discrete return, there's usually onboard processing.

  • That's realtime. It'll look at that signal and

  • try to do target detection, and then it does the ranging.

  • In Sorin's figure, he kind of talks about this idea that

  • depending on what sort of algorithm or hardware is used,

  • there could be a period of time where it detects an object,

  • and then might have to reset itself.

  • So, it can actually miss things.

  • The nice thing about the full waveform is,

  • you'll capture this entire signal as a function of time.

  • So, hopefully with post-processing,

  • you can go in and get more detail,

  • but as you'll see in a minute,

  • there are some complications too.

  • The hope is, looking at these waveforms,

  • just like with discrete data, you can start

  • to maybe imagine, based on the way the tree structure is,

  • that you might have overstory and some understory,

  • and maybe the ground.

  • You can start thinking about stratification of

  • either vegetation or other objects.

  • I'm not going to spend too much time here,

  • but just the general process of LiDAR is,

  • you fire your laser. You record your signal.

  • You do some sort of target detection.

  • Basically, once you've identified a target,

  • you can then look at the change in time

  • between that outgoing pulse and the received pulse.

  • You do some calculations that convert

  • that time-of-flight into a range.

  • Then from the range, you have your GPS/IMU, and you can

  • figure out what direction the scan mirror is pointed at,

  • and then that gets you your coordinates.

  • So, just like discrete return points can have geolocation,

  • full waveforms can too.

  • You'll see with the product

  • that we include geolocation information.
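
As a very rough sketch of that geolocation step (an illustration only, with assumed variable names and a flat-scan simplification; the actual Optech/NEON processing chain applies full roll/pitch/yaw attitude and lever-arm corrections):

```python
import numpy as np

def simple_geolocate(sensor_xyz, scan_angle_deg, heading_deg, slant_range):
    """Toy direct georeferencing: project a slant range from the
    sensor position using an across-track scan-mirror angle and the
    aircraft heading. Real systems use the full GPS/IMU attitude."""
    theta = np.radians(scan_angle_deg)   # mirror angle off nadir
    psi = np.radians(heading_deg)        # flight direction, 0 = +y
    horiz = slant_range * np.sin(theta)  # across-track horizontal offset
    drop = slant_range * np.cos(theta)   # vertical component of the range
    x0, y0, z0 = sensor_xyz
    # Rotate the across-track offset into map coordinates
    return np.array([x0 + horiz * np.cos(psi),
                     y0 - horiz * np.sin(psi),
                     z0 - drop])
```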

  • In general, ranging follows kind of the basic

  • speed of light calculations from, I don't know,

  • a couple hundred years ago.

  • But essentially, in this case, you know the speed of light.

  • You have the change in time between

  • that outgoing and return pulse.

  • Remember the light has to travel there

  • and then also come back.

  • So the one-way distance corresponds to half of that time.

  • And then, of course, you have the index of refraction

  • of air, because that laser light actually

  • slows down a little bit traveling in air,

  • compared to what it would do in the vacuum of space.

  • So, that's just that absolute range.
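
In code, that absolute range calculation is only a couple of lines (a minimal sketch of the standard time-of-flight formula, not the manufacturer's implementation):

```python
C = 299_792_458.0   # speed of light in a vacuum, m/s
N_AIR = 1.0003      # approximate index of refraction of air

def range_from_time(delta_t_s):
    """Convert a round-trip time-of-flight (seconds) to a one-way
    range (meters). Dividing by 2 accounts for the out-and-back
    travel; n_air accounts for light moving slightly slower in air."""
    return C * delta_t_s / (2.0 * N_AIR)
```

A round trip in the neighborhood of 6.5 microseconds works out to a range near the roughly 1000 m flight altitude discussed below.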

  • You might also hear the term range resolution.

  • Some people use different terms for it, but as Tristan mentioned,

  • you know, when objects get too close to each other,

  • you can't resolve them anymore,

  • and I'll show a figure of that.

  • But, essentially, that's going to be driven by

  • the outgoing pulse shape. These laser pulses don't

  • jump up to a peak signal instantaneously.

  • It takes time for the laser to ramp up, fire,

  • and then ramp back down.

  • So, that shape will actually cause blurring,

  • and that's why you can't detect objects

  • that are too close together.

  • So, there are several different algorithms

  • for how you would do ranging.

  • Different manufacturers will use their different

  • proprietary algorithms.

  • I'm just going to show the really simple ones.

  • You can imagine that if you have your outgoing laser pulse,

  • then some time goes by. It reflects off,

  • in this case, probably the ground,

  • since you just get a single peak.

  • We're going to find the peaks, and then, in this case,

  • we're going to say, well, let's go and figure out

  • where the 50% energy is on the left side.

  • This would be called leading edge detection.

  • That's done in this case, mostly because,

  • if you look at the shape of this outgoing pulse,

  • it actually is kind of pushed more onto the right side.

  • So, it's not perfectly Gaussian.

  • It's a combination of things: you have a sharper rise

  • than you do a fall, plus some other effects.

  • This is the ground, so it's pretty simple,

  • but if you're interacting with the canopy,

  • you can imagine that that left edge

  • is going to be the top of the canopy,

  • so that might be where you actually want to range to.
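
A minimal version of that leading-edge detection might look like the following (a sketch that assumes a clean, noise-free waveform on a uniform time grid; the actual manufacturer algorithms are proprietary and far more robust):

```python
import numpy as np

def leading_edge_time(waveform, dt_ns=1.0, fraction=0.5):
    """Return the time (ns) where the signal first crosses `fraction`
    of its peak amplitude on the left (leading) side, using linear
    interpolation between samples."""
    peak_idx = int(np.argmax(waveform))
    threshold = fraction * waveform[peak_idx]
    # Walk left from the peak to the last sample still above threshold
    i = peak_idx
    while i > 0 and waveform[i - 1] >= threshold:
        i -= 1
    if i == 0:
        return 0.0  # waveform starts above the threshold
    lo, hi = waveform[i - 1], waveform[i]
    return (i - 1 + (threshold - lo) / (hi - lo)) * dt_ns
```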

  • I guess, one other thing to note.

  • The time between the outgoing pulse and the return pulse

  • ends up being about 6500 nanoseconds.

  • When you do all the conversion, that comes out to,

  • in this case, about 983 meters.

  • You can imagine, if we're trying to fly

  • at about 1000 meters above the ground,

  • you have some terrain variation,

  • and there you're getting 983.

  • So, this may address your question a little bit,

  • but you can see, just with discrete and waveform,

  • you might get multiple peaks.

  • So, in this case, you could identify three objects,

  • and each of them has a leading edge.

  • So, you could identify in the discrete return,

  • three targets.

  • If you were just looking at the relative time difference

  • between these,

  • maybe you could say, this is the ground,

  • and this is the canopy top, and in this case,

  • the canopy would be 14 meters tall.

  • So, you can start to see, that might be one way that

  • you might analyze waveform data.

  • Rather than building a canopy height model on a raster grid,

  • you might be able to identify canopies and ground

  • within a single laser pulse, and now,

  • start looking at distance measurements that way.
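
Numerically, that relative measurement is the same time-is-distance conversion (the 93.4 ns here is simply the round-trip time that corresponds to 14 m, assumed for illustration):

```python
C, N_AIR = 299_792_458.0, 1.0003
delta_t_ns = 93.4   # round-trip time between canopy-top and ground peaks
height_m = C * (delta_t_ns * 1e-9) / (2 * N_AIR)
print(round(height_m, 1))   # -> 14.0, the canopy height from the example
```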

  • So, a little more on range resolution and target separation.

  • This, hopefully illustrates what Tristan talked about.

  • In this case, I've just done a simulation,

  • and we're using a 10 nanosecond outgoing pulse,

  • which is typical of the Optech system,

  • I think at 70 kilohertz.

  • At 100 kilohertz it might be a little wider,

  • so it would actually blur more.

  • But you can see in this case,

  • if you have a 10 nanosecond wide Gaussian,

  • and you take two ideal targets,

  • and put them 40 nanoseconds away from each other,

  • clearly you can see two peaks, and that's easy.

  • If you move them closer, you can see that the signal

  • starts to blend in the middle,

  • but you can still identify them.

  • Even here, no problem.

  • But you can see here, if you actually separate them

  • by exactly one full width at half maximum (FWHM),

  • to you and me, it still looks kind of like a double peak,

  • but actually a lot of algorithms might have a hard time

  • trying to determine where exactly those two peaks are,

  • and it might still say that there's one peak.

  • And as you get below that,

  • to less than the full width at half max,

  • you still had two targets in the original,

  • but you can see the signal sums into a single shape.

  • So, at this point, you've effectively lost your ability

  • to say there's definitely two objects there.

  • It could just be one object that was brighter.

  • And as you go even further, same kind of thing.

  • And you'll see, if we put some actual Gaussians on this.

  • At least in this case,

  • if you had a really sensitive algorithm,

  • you might say that, I only have one object,

  • but it's not a perfect Gaussian,

  • so maybe there's something else there.

  • But at this point, at half the full width half max,

  • you'd probably have no way of knowing

  • that there's two objects.

  • So, that's kind of the idea of range resolution.

  • You can imagine different branches in a tree,

  • if they're too close together, their signal is just

  • going to sum up and it's going to look like one big branch.
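
That simulation is easy to reproduce (a sketch with assumed unit amplitudes and an arbitrary time grid, not the exact slide figure):

```python
import numpy as np

def gaussian(t, center_ns, fwhm_ns=10.0):
    """Gaussian pulse with the given full width at half maximum."""
    sigma = fwhm_ns / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-0.5 * ((t - center_ns) / sigma) ** 2)

t = np.arange(0.0, 100.0, 0.1)        # 0.1 ns time grid
for sep_ns in (40.0, 20.0, 10.0, 5.0):
    combined = gaussian(t, 40.0) + gaussian(t, 40.0 + sep_ns)
    mid = combined[1:-1]
    # Count interior local maxima: can we still see two targets?
    n_peaks = int(np.sum((mid > combined[:-2]) & (mid > combined[2:])))
    print(f"separation {sep_ns:4.1f} ns -> {n_peaks} peak(s)")
```

For two equal Gaussians, the double peak disappears once the separation drops below about 0.85 of the FWHM, which is why the one-FWHM case still shows a faint dip but anything below it merges into a single shape.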

  • Not going to talk too much about this,

  • other than I do have a figure to kind of explain this.

  • But, one of the challenges with all these systems,

  • is being able to write the data fast enough to keep up.

  • Kind of as a comparison, the hyperspectral data:

  • You have a 640x480 array.

  • You're running it at 100 lines per second.

  • That's effectively equivalent to the data rate

  • the LiDAR runs at, at 100 kilohertz,

  • if we had 310 time bins that we were trying to save out.
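
That equivalence is just arithmetic on samples per second, using the numbers quoted here:

```python
hyperspectral = 640 * 480 * 100   # 640x480 array at 100 lines per second
lidar = 100_000 * 310             # 100 kHz PRF x 310 time bins per pulse
print(hyperspectral, lidar)       # 30,720,000 vs 31,000,000 samples/s
```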

  • Now, the difference is, the spectrometer has a fancy

  • computer, and I think it writes

  • to four hard drives simultaneously.

  • Whereas, the LiDAR, I think has a single hard drive.

  • So, there's kind of games you have to play,

  • making sure you're saving out that data fast enough,

  • or else the laser's going to keep firing,

  • and it'll just miss everything.

  • As an example, you might love to save the entire data space

  • from when you fired that outgoing laser all the way

  • through the air and down to the ground and back,

  • but unfortunately, that would be over 6000 bins of data,

  • and just with 100 kilohertz, which is our nominal PRF,

  • and if we had 8-bit data, let's say

  • (most of the newer systems are actually

  • running higher than that, like 16 bits),

  • you'd actually need to write out

  • at about 5 gigabits per second.

  • Now, the other day I just copied some data

  • from a hard drive and it was running at like

  • 30 megabits per second.

  • So, you can imagine, it's off by orders of magnitude.

  • You just can't save everything.
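
The back-of-the-envelope numbers behind that, using the figures quoted above:

```python
full_record = 6000 * 8 * 100_000 / 1e9   # bins x bits x PRF, in Gbit/s
print(full_record)                        # 4.8 Gbit/s -- "about 5"
disk = 30 / 1000                          # the ~30 Mbit/s copy, in Gbit/s
print(full_record / disk)                 # ~160x, i.e. orders of magnitude
```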