In the May 1994 issue of an IEEE journal (Trans. on ...), there was a paper claiming "analysis of unstructured environments, using sonar as a sensing device." However, it turns out to be mere trickery. The abstract is drowning in absurd claims.
As a result, we may notice that the case is only an utterly trivial case study of what one might do with a Polaroid sensor. That might pass as a trivial example, or as an advertorial demonstrating the data collected by that sensor, but the outrageous thing is that the paper is the last in a series which "earned" a doctoral degree at an Ivy League university. (The second author is his Ph.D. advisor.)
In other words, when I was a student, it was sheer luck that I never met him as a professor in some lecture, as that doctoral degree licensed him to obtain the title of professor, e.g. as with the case of a worthless plagiarist. Let's stay alert against such catastrophic cases.
Let me explain (a bit) the points I notice concerning that simplistic paper.
Their trickery is that their simplistic "example" is, in fact, the ONLY case where their simplistic algorithm would not fail: their "example" uses isolated objects, which cannot interfere with one another when each of them reflects/scatters the waves. If the objects were any closer together, that neatness would explode into its inherent complexity.
If the authors had found a significant question, or a versatile method, we could understand that. But "to map the environment" is the most unimaginative claim, especially if it is not achieved, either. They were unable to accomplish even that.
Their case may sound like a trivial case of CSG (constructive solid geometry) based computer vision. By 1994, even CSG-vision was more advanced. The paper is a trivial template matcher, which tries to match pre-computed templates (precomputed and stored sonar data) to "real world" cases, in a lab. (Where does one find a "coal mine" where objects stand in isolation like that? Even tracking the constancies, e.g. subtracting the effects of a certain type of wall, even if that were feasible to measure/calculate, was not accounted for by the paper, either.) In a computer-graphics analogy, the template matcher would try to identify a circle at various (3-D) view angles and at various distances. For that, a few thousand pre-computed example cases may suffice. But the task can only explode (in storage and time needed) with multiple (overlapping, or nearby) objects.
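To make the objection concrete, here is a minimal sketch (my own illustration, not the paper's actual algorithm) of such a single-object template matcher: precompute idealized echo profiles for one object at many ranges, then pick the nearest stored template. The function names and the Gaussian pulse model are assumptions for illustration only. It works on an isolated object, and the last lines show why nearby objects break it.

```python
import numpy as np

def echo_profile(distance, n_bins=64, max_range=10.0, width=0.3):
    """Idealized 1-D echo-intensity profile: a single Gaussian pulse
    centered at the object's range (a toy model, not real sonar data)."""
    r = np.linspace(0.0, max_range, n_bins)
    return np.exp(-((r - distance) ** 2) / (2 * width ** 2))

# Precompute and store templates for ONE object at many candidate ranges,
# mimicking the "precomputed and stored sonar data" approach.
candidate_ranges = np.linspace(0.5, 9.5, 200)
templates = np.stack([echo_profile(d) for d in candidate_ranges])

def match(measured):
    """Return the candidate range whose stored template is closest (L2)."""
    errors = np.linalg.norm(templates - measured, axis=1)
    return candidate_ranges[int(np.argmin(errors))]

# One isolated object: the naive matcher succeeds.
est = match(echo_profile(3.2))
assert abs(est - 3.2) < 0.1

# Two nearby objects: the echoes superpose into a shape that no
# single-object template explains; covering such cases would require
# templates for every object pair, pose, and spacing -- combinatorial growth.
overlapped = echo_profile(3.0) + echo_profile(3.4)
```

The point of the sketch is the last line: the stored-template library only stays small because every test scene contains exactly one isolated object.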
Or, it may resemble a room-scale version of a CT scan (with SONAR instead of X-rays, especially with multiple transducers/sensors). But CT thresholding does not have to deal with interference: the X-rays are not assumed to bounce from tissue to tissue, since the CT sensor reports the absorbed energy level at each voxel, whereas the SONAR sensor must wait until the reflected waves finally arrive back at it.
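The arithmetic behind that contrast is worth a few lines. A sketch (my own, under the standard time-of-flight formula t = 2d/c): two reflectors only 5 cm apart return echoes separated by a fraction of a millisecond, so if the transducer's pulse is longer than that gap, the two echoes merge, which an absorption-per-ray sensor like CT never has to worry about.

```python
# Round-trip time of flight for a sonar pulse: t = 2 * distance / c.
c = 343.0  # approximate speed of sound in air, m/s

def arrival_time(distance_m):
    """Time until the echo from a reflector at distance_m returns."""
    return 2.0 * distance_m / c

t1 = arrival_time(3.00)   # reflector at 3.00 m
t2 = arrival_time(3.05)   # second reflector only 5 cm farther
gap_ms = (t2 - t1) * 1e3  # about 0.29 ms

# If the emitted pulse lasts longer than ~0.29 ms, the two echoes
# overlap in the received signal and cannot be separated by thresholding.
assert gap_ms < 0.5
```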
The significance of the presence of data arises only with a practical application. E.g., knowing the middle name of your neighbor need not help (or hurt).
If "heterogeneity" was supposed to mean accounting for smooth objects along with rough ones, then a paper which keeps the objects totally isolated is emphatically not heterogeneous.
Furthermore, I would point out that if those objects were ever near each other, even if they were several smooth objects, the waves would reflect as if from a rough (single or aggregate) object, unless all of the objects were posed at an equal angle.
In general, in a scene where smooth may sit next to rough, the result is [most probably] rough. That is: rough + smooth = rough, when near each other.
Placing each object/surface on a separate side, in a lab, is no way to "verify" the [real-world] applicability of a proposed method.
In summary, we may state that their absurd theory/claims were nodded through with only a non-test. Had there been a real test, with serious application cases probing heterogeneity, that paper would probably never have been published at all, let alone earning a Ph.D.