Mark Hill's Closures

 

Mark,

 

It's time to stop thinking about our work in terms of closure. Most of what we rely on is moving away from that and toward positional certainty (error-theory-based statistics): NGS, standards, and so on.

 

If you have calculated the error ellipse on your closing point, I think you'll probably find yourself inside it (assuming you keep all your equipment calibrated, in good working order, and free of blunders). Not knowing the survey specifics, but assuming you took post-processed GPS coordinates as your start and backsight, then traversed to another GPS coordinate, you could propagate the error in your angles and distances and come up with an ellipse that matches your result. Your closure inverse could fall anywhere in that ellipse and you'd still be surveying to the limits of your equipment and techniques. If you closed flat, that's a matter of chance; it doesn't mean your points along the way are perfect. Your end value could just as well be any other value in that ellipse. That's the uncertainty and the probability.
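As a rough sketch of what that propagation looks like (the legs, the sigmas, and the treatment of each leg's azimuth error as independent are all assumptions for illustration; a real adjustment carries the full covariance between legs):

```python
import math

# One leg per tuple: (azimuth_deg, distance_ft) -- assumed example traverse
legs = [(45.0, 500.0), (120.0, 650.0), (200.0, 480.0)]

sigma_d = 0.01                     # assumed distance std dev per leg, ft
sigma_a = math.radians(5 / 3600)   # assumed 5" angular std dev, in radians

var_n = var_e = 0.0
for az_deg, d in legs:
    az = math.radians(az_deg)
    # Distance error acts along the leg; azimuth error displaces the far
    # end perpendicular to the leg by roughly d * sigma_a.
    var_n += (sigma_d * math.cos(az)) ** 2 + (d * sigma_a * math.sin(az)) ** 2
    var_e += (sigma_d * math.sin(az)) ** 2 + (d * sigma_a * math.cos(az)) ** 2

print(f"1-sigma at closing point: N ±{math.sqrt(var_n):.3f} ft, "
      f"E ±{math.sqrt(var_e):.3f} ft")
# (scale up by roughly 2.45 for a 95% confidence region in 2-D)
```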

 

Because GPS can measure accurately over larger distances than our conventional equipment, we have a tendency to consider the coordinates accurate at all times. That quickly becomes a problem if we set up an instrument between inter-visible points and take an EDM reading. My own EDM has a standard deviation of 0.006'. Considering that with day-to-day dual-occupation, static-network, post-processed GPS work we can rarely get a 95% confidence ellipse better than 0.03' on GPS'd points, we'll typically get an "error" when checking our "good" GPS control.
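A back-of-the-envelope check using those same numbers shows why (the 1-D, along-the-line treatment is a simplification):

```python
import math

# Checking a line between two GPS points with a 0.006' EDM
sigma_gps_95 = 0.03                # 95% positional uncertainty per point, ft
sigma_gps = sigma_gps_95 / 1.96    # approximate 1-sigma along the line
sigma_edm = 0.006                  # EDM distance std dev, ft

# The GPS-derived inverse picks up error from BOTH endpoints;
# the EDM measurement adds its own on top.
sigma_check = math.sqrt(2 * sigma_gps ** 2 + sigma_edm ** 2)
print(f"expected 1-sigma disagreement: ±{sigma_check:.3f} ft")
print(f"95% of checks fall within:     ±{1.96 * sigma_check:.3f} ft")
```

So even a flawless EDM check against good GPS control should be expected to disagree by a few hundredths; that's not a fault in either instrument.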

 

Earlier in my career I followed a methodology similar to what you describe: GPS the main control points and fill in the gaps with conventional traverse. When elevation was a critical factor, levels were run. In the end, I was left with three sets of disparate data and the difficult task of manually making it all fit together with sufficient confidence for my project's needs.

 

One day I finally took advice I'd been given for years and looked into least squares adjustment software. Now nearly every project in my shop is run through an LSA program. It has made me a better surveyor many times over, not only by giving me a better understanding of measurement error theory and practice, but by showing me, consistently and visually, how it can be applied to everything we do. I swear, just a couple of weekends ago, when I was processing some baseline work on a deformation monitoring project for a retaining wall, I was literally giddy at the way I could see things that unfortunately most people never think of.

 

It is interesting to find, as I show people the software and help them learn (whether my own employees or associates), that the younger generation like me has the most difficult time understanding and truly appreciating what it does. An obviously experienced surveyor such as yourself would find it a true delight, and you'd never want to apply a compass adjustment again!

 

The key to your project is having sufficient redundant information to make proper assessments of the results and achieve valid accuracy statements.
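To put a number on "sufficient redundant information," the usual bookkeeping is observations minus unknowns; here's the arithmetic for a hypothetical 2-D network (all the counts are invented):

```python
# Degrees of freedom = observations - unknowns (assumed 2-D network)
n_points = 5
unknowns = 2 * n_points - 3        # minus 3: one fixed point (2) plus
                                   # one fixed azimuth (1) to set the datum

n_angles, n_distances = 12, 10
observations = n_angles + n_distances

print(f"degrees of freedom: {observations - unknowns}")
# Zero means a bare solution with no checks; more is better.
```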

 

I have settled on StarNet by Starplus. While I've become a diehard user, I don't think any one software is the best solution for every situation. I know that Carlson, for example, has SurvNet built into their survey module, and there are surely many programs that others here can suggest as appropriate for your setup. I saw an ad last week for Move3, but I haven't reviewed it yet.

 

The key, in my opinion, is to be able to quickly bring all the data together for a simultaneous adjustment. Preferably you're able to bring in the raw GPS vectors, raw angles and distances, etc. If you can't bring the GPS in as raw vectors, then you need to enter those coordinates with true and accurate positional uncertainty. I don't know how you're generating the GPS coordinates, but the program used should be able to provide this information for each point generated (adjustments of GPS data are by nature LSA). If you're starting off with something weak on GPS, like a one-shot RTK position, I'd be careful about hanging your traverse data on just that.
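To show what a simultaneous weighted adjustment boils down to, here's a minimal least-squares sketch on a toy 1-D level network (the benchmark names, observations, and sigmas are all invented; a program like StarNet does the multidimensional version of this from your raw data):

```python
import numpy as np

# Fixed benchmark A = 100.00 ft; unknown heights at B and C (assumed data).
obs = [  # (from, to, dh_ft, sigma_ft)
    ("A", "B", +2.345, 0.005),
    ("B", "C", -0.512, 0.005),
    ("A", "C", +1.842, 0.007),   # redundant line closes the loop
]
fixed = {"A": 100.00}
unknowns = ["B", "C"]
idx = {name: i for i, name in enumerate(unknowns)}

A = np.zeros((len(obs), len(unknowns)))
L = np.zeros(len(obs))
W = np.zeros(len(obs))
for r, (f, t, dh, s) in enumerate(obs):
    if f in idx: A[r, idx[f]] = -1.0
    if t in idx: A[r, idx[t]] = +1.0
    # Move any fixed heights over to the observation side.
    L[r] = dh + fixed.get(f, 0.0) - fixed.get(t, 0.0)
    W[r] = 1.0 / s ** 2            # weight = inverse variance

N = A.T @ np.diag(W) @ A           # normal equations
x = np.linalg.solve(N, A.T @ np.diag(W) @ L)
v = A @ x - L                      # residuals show each line's misfit
for name in unknowns:
    print(f"{name}: {x[idx[name]]:.4f} ft")
print("residuals (ft):", np.round(v, 4))
```

The weight of each observation is the inverse of its variance, so a tight line pulls the solution harder than a sloppy one, and the residuals show how much each observation had to give.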

 

It may be out of place, but I have to suggest that ALL traversing for control be a minimum of two full sets of angles to tribrach-mounted glass (all equipment checked weekly/monthly). Sufficient redundant information should be collected at every opportunity to help you feel comfortable. Things like angles to a natural target from multiple positions are great for strengthening a traverse and increasing accuracy.

 

Enough for now ... I could go on and on, but I don't want to get away from your specific issues. Post more detail on your process and I'm sure folks will have some input.


Rich

 


Replies

  • Hey Rich. You've hit the nail right on the head when it comes to the answer to my problem. As much as I hate to admit it, I am a bit behind the times when it comes to technology, not so much for lack of will but more so in the bank account department. I also must admit that I tend to adhere to the older methods I was taught and know so well. You absolutely must have a closure in your traverse, or at the least, in all cases, some sort of check. I hear a lot of guys out there are relying on sideshots and open-ended traverses. I guess I'm more comfortable with mathematical facts than probability, but I'm gonna get left behind if I strictly adhere to that attitude. The reason I started the discussion was to see if anyone might discuss how they were handling such a task. I've got everything figured out and it's not looking that bad now, even though it wasn't absolutely devastating in the first place; it just could have been better, and especially more efficient, if I had more knowledge and experience with GPS. I would never have bought GPS equipment to begin with if it had not been made a requirement by my county to provide at least 2 state plane coordinates on a plat submitted for recording. Since I did make that investment, I've come to realize many other advantages, especially when surveying large-acreage tracts. I've read a little about least squares adjustment and I know of the advantages, but I don't have it available on any of my outdated software. It seems I need to invest in some modern equipment, software, and probably even education. I'll check into StarNet and maybe some of the others; that is some excellent advice. I've got my eye on a new RTK setup, but the subscription to the network is over $400 a month. I'm hoping that will start to come down soon, and there are a few that are not so expensive, but I hear they have some problems. You mentioned, from earlier in your career, "fill in the gaps, separate level runs, three sets of data"; that's almost the way I'm still operating. Thanks a lot and I'll try to take some of that advice.
    • Mark,

      I think most of us are continuously behind the times on technology … who can afford it? I haven't gotten my terrestrial scanner yet, and already I'm being sold on the idea that I should have two mounted on the back of my Suburban for mobile scanning!

      I am all for the "older methods" because they are at the heart of everything we do. A closure is a check, no doubt, but that's all it is … it doesn't really tell us how accurate we were, in my opinion. I think the older methods give one all the education required to take advantage of the new technology and software … in fact, I find that experience key to truly understanding them. I myself came into surveying in the button-pushing and coordinate age, and my knowledge and appreciation for the true guts of the work has only increased since I got away from coordinate surveying and started working with LSA and the raw data.

      I have some concerns with the widely available RTK networks, a discussion for another post. Even when I work with RTK (rarely), I bring that information into an LSA program as vectors with appropriate weighting, or (almost never) as coordinates with realistic positional certainties. For the type of work you describe, two static receivers (even single frequency), placed on some of the control to obtain vectors from CORS and on pairs of points to form closures and a network, are all that is needed. If you work with an RTN, think about having to calibrate to local published control before every survey to be able to create valid datum statements. (I tried asking the local networks for datum statements, even proof of their adjustments of the base stations … never got an answer.) Process the data yourself from CORS and your datums are self-evident.

      I'm going to send you some information to read. If you can find the time to skim it, I'm available for questions and answers. Then, if you're interested, I'll send you some samples and examples. From the description of your work, I can easily give you case examples of exactly what you are doing, to show how we are using LSA.

      Depending on the amount of data and the format it is in, I might even give it a run through StarNet for you … but let’s wait until you have looked at the info I’ve sent you so I’m not speaking Greek to you right off the bat.

      Take care,

      Rich
    • My idea on this is that I cannot simply go straight to the data reduction work (LSA) without first getting to a common baseline of comparison. We cannot simply compare mango to apple. From my point of view, I'll do a local-to-grid transformation first, then undertake the LSA. Forcing an LSA between two different systems is a blunder.
    • Arnel,
      I could not agree more with the idea of not "forcing between two different systems". I would never suggest that as a course of action.

      I submit that a comprehensive LSA program like StarNet (I am not a salesman, but I can only speak to it or SurvNet) is the best tool we have for comparing apples and mangos to find any that are rotten. The key is to work with the raw data, which StarNet can reduce to a common reference frame for comparison and, eventually, adjustment. With observations properly weighted by equipment specifications, conditions, and methodologies, I can run blunder detection routines, observe residuals, and see where my network is weak and could benefit from additional redundancy.
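      As a sketch of what such a blunder screen looks like (the residuals and sigmas are invented, and this is a crude version: a program like StarNet uses each residual's own propagated sigma rather than the raw observation sigma):

```python
import numpy as np

# Crude blunder screen: standardized residual = residual / its sigma.
# |w| greater than about 3 flags an observation worth re-examining.
residuals = np.array([0.004, -0.006, 0.031, 0.002, -0.005])  # ft (assumed)
sigmas    = np.array([0.007,  0.007, 0.007, 0.010,  0.007])  # ft (assumed)

for i, w in enumerate(residuals / sigmas):
    flag = "  <-- possible blunder" if abs(w) > 3.0 else ""
    print(f"obs {i + 1}: standardized residual {w:+.1f}{flag}")
```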

      If I were to manually reduce my conventional data to grid, I'd either be using an average reduction factor or calculating the more correct factors on a line-by-line basis (tedious). On top of that, I imagine I'd be working with resultant bearings or azimuths instead of considering the more important raw data. Once I include more than one possible solution for any part of my survey (critical for proper LSA confidence), I quickly leave the realm of realistic hand computations. For example, from any setup in our traverse we routinely take every opportunity to include an observation to points that can be seen from other setups. This increased strength in my network will quickly point out if I have a bad piece of equipment, from a tribrach to a rod to even the instrument.
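      For anyone who hasn't done that line-by-line reduction by hand, it looks something like this for a single line (the radius, height, and scale factor are assumed values):

```python
import math

# Line-by-line ground-to-grid reduction for one hypothetical line
R = 20906000.0          # approximate mean earth radius, ft
ground_dist = 2640.00   # measured horizontal distance, ft (assumed)
ellip_height = 850.0    # mean ellipsoid height of the line, ft (assumed)
grid_scale = 0.9999411  # projection scale factor at the line (assumed)

elev_factor = R / (R + ellip_height)
combined = elev_factor * grid_scale
grid_dist = ground_dist * combined
print(f"combined factor: {combined:.7f}")
print(f"grid distance:   {grid_dist:.2f} ft")
```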

      The adjustments should first be run minimally constrained, to check the integrity of the raw data itself wherever possible, before fixing to more highly weighted "control data." In the case described, the traverses alone appear to have no conventional closure, but the GPS vector between the start and end, together with the angles and distances on the ground, could be reduced in an LSA program to a common reference; if properly weighted, any blunders or issues would stand out.

      Rich