Geodesy and Geomatics Engineering Technical Reports


Design and implementation of an inshore hydrographic surveying system
Computers are now widely used in hydrographic surveys to take advantage of their navigation functions and data processing capabilities. The design parameters that play an important role in creating an automatic data acquisition and processing system are the accuracy with which the sea bottom is represented, reliability, compatibility with existing hardware, man/machine interaction, modularity, and cost. To optimize these design parameters, an automatic data acquisition and processing system was built around the Apple IIe personal computer. The algorithms required to meet the design objectives have been implemented in larger systems; here they are down-scaled to a system that serves the needs of near-shore hydrography. Different systems of lines used to sample the depth are compared: straight parallel lines, lines of position of electronic positioning systems, circles, and radial lines (“star” mode). To filter the position data, simple filtering techniques are compared with least-squares and Kalman filters. Depth filtering techniques are also examined. The results indicate that the Apple IIe, with commercially available peripherals and standard software, offers great promise of replacing larger and more expensive hydrographic data acquisition and navigation controllers. Straight parallel lines give efficient coverage of the survey area only if they can be modified on-line. The “star” mode is the best shoal examination pattern with respect to track-keeping ability. The memory and computational speed constraints of personal computers require the use of simple linear filters for the position data. Visual comparison of digital depths with analog depths is an efficient depth filtering technique.
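As an illustration of the kind of simple linear position filter the abstract compares, here is a minimal sketch of a constant-velocity Kalman filter for noisy 2-D position fixes. This is not the thesis's Apple IIe implementation; the state model, noise values, and interface are illustrative assumptions.

```python
# Minimal sketch: constant-velocity Kalman filter smoothing 2-D fixes.
# Process/measurement noise values (q, r) are assumed, not from the thesis.
import numpy as np

def kalman_position_filter(fixes, dt=1.0, q=0.01, r=25.0):
    """Smooth a sequence of (x, y) position fixes (metres)."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                   # state: [x, y, vx, vy]
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0
    Q = q * np.eye(4)                        # process noise (assumed)
    R = r * np.eye(2)                        # fix variance (assumed)
    x = np.array([fixes[0][0], fixes[0][1], 0.0, 0.0])
    P = np.eye(4) * 100.0
    smoothed = []
    for z in fixes:
        x = F @ x                            # predict
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.asarray(z) - H @ x)  # update with new fix
        P = (np.eye(4) - K @ H) @ P
        smoothed.append((x[0], x[1]))
    return smoothed
```

Even this small filter shows why such an approach suits a memory-constrained machine: the state is four numbers and each update is a handful of small matrix operations.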
Design of a conceptual land information management model for the rural cadastre in Brazil
The rural cadastral reform established by Law No. 10,267/2001 is the most recent benchmark in the history of land administration in Brazil. It is important not only because rural properties must now be geo-referenced, but also because for the first time Brazilian law has called for a common multipurpose cadastral system, the National Cadastre for Rural Properties (CNIR). CNIR will integrate legal (tenure information), fiscal (value information), agrarian (land use and management policies), and environmental (protected areas) databases. As in many countries in the process of cadastral reform, Brazil faces serious political, legal, and technical challenges in developing a national rural cadastral system. However, through the harmonization of land information, it is hoped that land conflicts can be reduced, land can be more fairly redistributed and taxed, interests in traditional lands and protected areas can be preserved and, most importantly, that what is registered at the registry offices matches what is represented on the ground. This research supports CNIR implementation by providing the design of a conceptual model based on the user requirements of all collaborating agencies. The primary purpose of the model is to provide a framework for the integration of the current cadastral systems, held under several land administration agencies, in order to obtain more accurate and concise land information to support land regularization and secure tenure in rural areas. More specifically, the model is designed to provide a well-defined, structured design for CNIR implementation based on user requirements and project management methodologies. The research includes problem definition; analysis of requirements, constraints, and opportunities; and design of a model using soft systems methodologies. The results are definitions of required CNIR functions, data flow, minimum content, and implementation strategies. Working together with CNIR managers, the research has provided input for its development. The research is based on the assumption that land information, well managed and legally formalized, can help to provide better security of tenure; as a consequence, the proposed model may bring improvements to land reform programs and to public services in Brazil.
Design of a semi-automated LiDAR point classification framework
Data from airborne light detection and ranging (LiDAR) systems are becoming more commonplace and are being used beyond traditional remote sensing and GIS applications, for example in archaeological surveys. However, non-expert LiDAR users face challenges when working with LiDAR data or derived products. Anecdotal evidence suggests that many users may not have much knowledge of how a LiDAR product was derived or of the qualities of the original LiDAR point cloud. In addition, suitable processing software may not be accessible due to cost, or may require extensive training and familiarity with the tools for users to achieve their desired results. This thesis addresses some of the challenges non-expert LiDAR users may face by developing a semi-automated point classification framework that does not require expert user input to classify individual points within the point cloud. The Canadian Airborne LiDAR Acquisition Guideline, released by Natural Resources Canada in 2014, was used as a guide in the development process. The framework consists of a multi-stage classification process that can be applied using LiDAR point clouds exclusively or using LiDAR data integrated with other types of data. Code developed as part of this thesis to implement the framework is hosted in a repository on Bitbucket. The first stage is a ground point identification process that requires little or no operator input to classify ground points within a LiDAR point cloud; it achieved greater than 95% accuracy in sample tests, as compared to available classified ground data. Subsequent stages add or refine the classification of points within the point cloud. If only LiDAR data are used, points are classified as building/structure, low vegetation, medium vegetation, high vegetation, unpaved ground, road or paved surface, or points above a paved surface; points that do not meet the criteria for any of the classes are left unclassified. Additional data can be introduced at any stage to improve processing time, add classes (for example, water), or refine results. Recommendations for future research include making greater use of 3D data structures, making greater use of point-level information, and improving the methods used to refine classification results.
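To make the first stage concrete, the sketch below illustrates one common low-input ground-filtering idea: take the lowest return in each grid cell as a ground seed and accept nearby points within a height tolerance. It is a simplified stand-in, not the thesis's framework or its Bitbucket code; the cell size and tolerance are assumed values.

```python
# Minimal sketch: grid-based lowest-return ground classification.
# cell (m) and tol (m) are illustrative parameters, not the thesis's.
from collections import defaultdict
import numpy as np

def classify_ground(points, cell=5.0, tol=0.3):
    """points: (N, 3) array of x, y, z in metres. Returns boolean ground mask."""
    cells = defaultdict(list)
    for idx, (x, y, _z) in enumerate(points):
        cells[(int(np.floor(x / cell)), int(np.floor(y / cell)))].append(idx)
    ground = np.zeros(len(points), dtype=bool)
    for idxs in cells.values():
        idxs = np.array(idxs)
        zmin = points[idxs, 2].min()          # lowest return = ground seed
        ground[idxs] = points[idxs, 2] <= zmin + tol
    return ground
```

A production filter would also interpolate a surface through the seeds and iterate; this fragment only shows why little or no operator input is needed for the first pass.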
Design of semi-automatic algorithm for shoreline extraction using Synthetic Aperture Radar (SAR) images
Coastal zones are among the most rapidly changing environments in the world. The coast makes up a large portion of Chilean territory, giving Chile one of the highest ratios of coastline length to land area in the world. The capability for all-weather, day/night image acquisition, short revisit periods, and global coverage makes Synthetic Aperture Radar (SAR) a powerful remote sensing tool for mapping and map updating. Over coastal areas, the nautical chart is one of the most useful sources of information for navigational, military, planning, and coastal management purposes. One of the most important features in nautical charts is the coastline, which constitutes the physical boundary of oceans, seas, straits, canals, etc. Digitizing a feature such as the coastline is a very tedious and time-consuming operation. The development of a semi-automatic algorithm to detect the shoreline is a required task that has to be implemented in the nautical chart production process of the Chilean Hydrographic and Oceanographic Service. Doing so would save a large amount of time in coastline digitizing, which currently is done manually. Previous work has achieved good results in shoreline extraction, but is less appropriate when applied to areas with high local environmental noise caused by a rough sea surface. After identifying the general steps governing the detection of shorelines in SAR images, this thesis develops a new technique to enhance land-water boundaries, called the Multitemporal Segmentation Method. An iterative windowing procedure is also developed to remove the noise over the sea surface and achieve land-water separation. After detecting the coastline, the bias in the delineation is acceptable, reducing the offsets towards the sea that can result from the application of common filters. Depending on the application and the scale of the final product, analysis by the operator remains very important in this semi-automatic method, and the extracted coastline requires a final examination. Also, the extracted coastline is referred to the in-situ water line; consequently, it must subsequently be referred to the desired tidal datum.
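As a rough illustration of the multitemporal idea, the sketch below averages co-registered SAR scenes to suppress speckle and sea-state noise, then applies a global Otsu threshold to separate darker water from brighter land. This is a simplified stand-in for the thesis's Multitemporal Segmentation Method and iterative windowing; the scene stack and the thresholding choice are assumptions.

```python
# Minimal sketch: multitemporal averaging + Otsu threshold for a
# land-water mask. Not the thesis's algorithm; an illustrative analogue.
import numpy as np

def otsu_threshold(img, bins=256):
    """Return the grey level maximizing between-class variance."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                        # class probability (water side)
    mu = np.cumsum(p * centers)
    with np.errstate(divide="ignore", invalid="ignore"):
        var_b = (mu[-1] * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return centers[np.nanargmax(var_b)]

def land_water_mask(scenes):
    """scenes: list of co-registered SAR backscatter arrays."""
    mean_img = np.mean(np.stack(scenes), axis=0)   # speckle suppression
    return mean_img > otsu_threshold(mean_img)     # True = land (brighter)
```

Averaging several acquisitions is what lets a single global threshold work at all over a rough sea surface; a single scene would demand the heavier local filtering the abstract cautions against.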
Determination of a geoid model for Ghana using the Stokes-Helmert method
One of the greatest achievements of humankind with regard to positioning is the Global Navigation Satellite System (GNSS). Use of GNSS for surveying has made it possible to obtain accuracies of the order of 1 ppm or less in relative positioning mode, depending on the software used for processing the data. However, the elevation obtained from a GNSS measurement is relative to an ellipsoid, for example WGS84, and this renders GNSS heights of very little practical value to those requiring orthometric heights. Converting geodetic heights from GNSS measurements to the more useful orthometric heights requires a geoid model. As a result, the aim of geodesists in developed countries is to compute a geoid model to centimetre accuracy. For developing countries, including Ghana, the available data will not even allow a geoid model of decimetre accuracy. In spite of the sparse terrestrial gravity data of variable distribution density and quality, this thesis set out to model the geoid as accurately as achievable. Computing an accurate geoid model is very important to Ghana given the widespread use of the Global Positioning System (GPS) in the fields of surveying and mapping, navigation, and geographic information systems (GIS). The gravimetric geoid model for Ghana developed in this thesis was computed using the Stokes-Helmert approach developed at the University of New Brunswick (UNB) [Ellmann and Vaníček, 2007]. This method uses a two-space approach to solving the associated boundary-value problems, involving the real space and Helmert's space. The UNB approach combines observed terrestrial gravity data with long-wavelength gravity information from an Earth Gravity Model (EGM). All the terrestrial gravity data used in this computation were obtained from the Geological Survey Department of Ghana, due to difficulties in obtaining data from BGI and GETECH. Since some parts of Ghana lack terrestrial gravity data coverage, the EGM was used to pad those areas. For the computation of topographic effects on the geoid, the Shuttle Radar Topography Mission (SRTM) digital terrain model, generated by NASA and the National Geospatial-Intelligence Agency (NGA), was used. Since the terrain in Ghana is relatively flat, the topographic effect, often a major problem in geoid computation, is unlikely to be significant. This first gravimetric geoid model for Ghana was computed on a 1′ × 1′ grid over the computation area bounded by latitudes 4°N and 12°N, and longitudes 4°W and 2°E. GPS/trigonometric levelling heights were used to validate the results of the computation. Keywords: gravimetric geoid, Stokes's formula, Earth Gravity Model, topographic effect, digital terrain model, boundary value problem, GPS/trigonometric levelling.
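For reference, the core of the Stokes-Helmert scheme can be stated compactly in its standard textbook form (the thesis's formulation includes further corrections):

\[
N^{\mathrm{h}} = \frac{R}{4\pi\gamma_0}\iint_{\sigma} \Delta g^{\mathrm{h}}\, S(\psi)\, \mathrm{d}\sigma ,
\qquad
N = N^{\mathrm{h}} + \delta N_{\mathrm{ind}} ,
\]

where R is the mean Earth radius, γ₀ is normal gravity on the reference ellipsoid, Δg^h is the gravity anomaly in Helmert's space, S(ψ) is the Stokes kernel as a function of spherical distance ψ, N^h is the co-geoid height, and δN_ind is the primary indirect topographic effect that restores the geoid from the co-geoid. In the UNB approach the integration is split between the near zone (terrestrial data) and the far zone (the EGM), which is exactly the data combination the abstract describes.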
Determination of earth rotation parameters and adjustment of a global geodetic positioning system
This thesis focuses on the determination of Earth rotation parameters (ERPs) and the adjustment of a global geodetic network using the Global Positioning System (GPS) technique. Based on the GPS Differential POsitioning Program (DIPOP) software package of the Department of Geodesy and Geomatics Engineering at the University of New Brunswick, advanced software named DIPOP.ERP has been implemented. DIPOP.ERP, accompanied by the preprocessors PREDD.ERP and PREGE.ERP, has been extensively used to process data collected during the GPS'92 campaign organized by the International GPS Geodynamics Service (IGS). Data from a seven-day period (25 to 31 July 1992) of this campaign have been processed. Four strategies were designed for software testing and parameter estimation on daily and seven-day bases. In the first three, different sets of fixed stations were chosen. In the last strategy, precise orbits were used. In addition to the coordinates of the 28 IGS core stations equipped with dual-frequency Rogue receivers, the initial orbital parameters of 18 GPS satellites, the tropospheric scale factor for each station, and the ambiguities, daily Earth rotation parameters were estimated, based on the double-difference algorithm for the carrier beat phase measurements. The results of this experiment demonstrate that the Earth rotation parameters can be recovered with an accuracy of a few tenths of a milli-arcsecond. A comparison of the estimated polar motion values with the IERS values shows agreement at the few milli-arcsecond level. The daily repeatabilities for the coordinates of most stations ranged from a few centimetres to about ten centimetres.
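For reference, a minimal form of the double-difference carrier beat phase observable on which the estimation is based (standard notation; the DIPOP.ERP formulation carries additional terms):

\[
\nabla\Delta\Phi_{AB}^{jk} = \frac{1}{\lambda}\,\nabla\Delta\rho_{AB}^{jk} + \nabla\Delta N_{AB}^{jk} + \varepsilon ,
\]

where ∇Δ denotes differencing between receivers A, B and satellites j, k; Φ is the carrier beat phase in cycles; ρ is the geometric range, a function of station coordinates, satellite orbits, and the Earth rotation parameters being estimated; λ is the carrier wavelength; N is the integer ambiguity; and ε collects residual tropospheric, ionospheric, and noise errors. The double difference cancels receiver and satellite clock errors, which is what makes it the natural observable for this kind of global adjustment.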
Determination of geoidal height difference using ring integration method
With the advent of artificial satellites, it is possible to determine the relative positions of points to an accuracy of a few parts per million (ppm). The coordinate differences can, in turn, be transformed into differences in latitude, longitude, and height on an ellipsoid, provided that the positions relative to the geocentric Cartesian coordinate system are known. For some applications, such as mapping and monitoring vertical crustal movements, orthometric heights or height differences are needed. In order to convert ellipsoidal heights to orthometric heights, geoidal heights are required. Software has been developed to determine geoidal height differences using terrestrial gravity anomalies. The approach used here is ring integration, which employs compartments formed by the intersection of rings and lines radiating out from the point of interest. In this approach, the integration is regarded as the summation of the predicted gravity anomalies at the midpoints of the compartments. The difference between the summations at the two endpoints of a line is then multiplied by a constant (0.0003 m/mGal) to obtain the inner zone contribution to the geoidal height difference. The remote zone contribution is obtained using a high-order geopotential model, RAPP180. The contributions from these two zones are then summed to obtain the full geoidal height difference. The results generated by this software are compared to results obtained from independent methods, such as the GPS/levelling method and the UNB Dec.'86 solution. A mean relative accuracy (MRA) of 1.7 ppm was obtained between the ring integration method and the GPS/levelling method using a cap of radius ψ₀ = 0.6°. The comparisons showed that an improvement in the geoidal height difference was possible when the inner zone contribution was added to the remote zone contribution: from an MRA of 4.0 ppm to an MRA of 1.7 ppm. The mean relative accuracy between the ring integration method and the UNB Dec.'86 approach is 0.92 ppm using a cap radius of 0.4°.
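A minimal sketch of the inner-zone computation described above, using the 0.0003 m/mGal constant quoted in the abstract. The compartment anomaly prediction is left as an input, since that step depends on the gravity data interpolation used, and the sign convention (B minus A) is an assumption.

```python
# Minimal sketch of the ring-integration inner-zone contribution:
# sum predicted anomalies at compartment midpoints around each endpoint,
# difference the sums, and scale by 0.0003 m/mGal (from the abstract).
import numpy as np

def inner_zone_dN(anomalies_at_A, anomalies_at_B, c=0.0003):
    """anomalies_at_X: gravity anomalies (mGal) predicted at the midpoints
    of the ring/radial compartments around endpoint X. Returns metres."""
    sum_A = np.sum(anomalies_at_A)      # summation over all compartments
    sum_B = np.sum(anomalies_at_B)
    return c * (sum_B - sum_A)          # sign convention assumed: B minus A
```

The full geoidal height difference then adds the remote-zone contribution evaluated from the geopotential model.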
Developing a nested finite-element hydrodynamic model to predict phase and amplitude modification of the tide within narrow fjords
A long-term monitoring project to measure the inter-annual change in pro-glacial deltaic sediments has been initiated in Oliver Sound, one of a cluster of fjords that lie off Eclipse Sound at the northern tip of Baffin Island, Canada. In order to confidently identify decimetre-level changes in seabed morphology from multibeam surveys, adequate tidal control is required. Surveying in such remote locations presents conditions, logistics, and time constraints that prohibit the installation of tide gauges throughout the survey area, and existing predicted tide stations are separated from the survey area by complex fjords and islands. To overcome these hurdles, a high-resolution hydrodynamic model simulation has been constructed to predict the tides throughout the survey region, accounting for the changes in tidal phase and amplitude within the complex fjords. The simulation results are compared to existing lower-resolution tidal models, nearby predicted tides, and Globally Corrected GPS data from survey vessels working and transiting throughout the area.
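For context, the quantities the model must predict appear in the standard harmonic form of the tide at any point (constituent sets and corrections are omitted here for brevity):

\[
h(t) = Z_0 + \sum_{i} f_i H_i \cos\bigl(\omega_i t + (V_0 + u)_i - g_i\bigr) ,
\]

where Z₀ is the mean water level and, for each constituent i, H_i and g_i are the local amplitude and phase lag, ω_i is the angular frequency, and f_i and (V₀ + u)_i are the nodal factor and astronomical argument. The hydrodynamic model in effect supplies the spatial variation of H_i and g_i along the narrow fjords, which the distant predicted tide stations cannot provide.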
Development and assessment of loosely-coupled ins using smartphone sensors
Smartphone accelerometers and gyroscopes are quite common in today's society, but little work has been done on assessing how accurately and reliably they can be used in inertial navigation systems (INS). The goal of this research is to develop a loosely-coupled INS filter that only uses sensors found inside a Moto-X Android smartphone. Micro-electro-mechanical systems (MEMS) accelerometers and gyroscopes provide the raw motion sensor data, whereas the high-sensitivity GNSS receiver in the smartphone is used to provide position and velocity updates to the filter. Magnetometers, also included in the MEMS package, are a potential source of heading aiding that not only assists INS alignment but also helps constrain heading drift. A successful filter implementation could potentially open the doors of inertial navigation to the everyday smartphone user. This would allow developers of smartphone applications to focus on the creative side of their applications while using the loosely-coupled INS in the background. The loosely-coupled INS filter was developed in C++ and was run offline, although the operations are exactly those that would be applied in real time. The INS filter was verified using raw inertial measurement unit (IMU) measurements from a high-end Northrop Grumman IMU-LN200 motion sensor and single-point GNSS position/velocity updates from a high-accuracy NovAtel Flexpak6 receiver. Two datasets with distinct environments were used: the first a relatively open-sky dataset in NW Calgary, and the second an urban canyon dataset in downtown Calgary. Once the INS was verified to work within expectations, two more datasets were collected, this time with the Moto-X Android smartphone and the NovAtel SPAN system (IMU-LN200 + Flexpak6 running INS-capable firmware). The datasets were again in open-sky and urban canyon environments. Due to the high noise of the Moto-X sensors, the high-frequency noise of the raw data was removed via wavelet decomposition. This was very important, as the faint sensor signal is buried under a lot of noise. Empirically derived estimates for sensor turn-on bias and scale factor errors were then found. The easiest way to assess the validity of the filter is to compare the attitude with the truth trajectory, where the truth trajectory is that of the NovAtel SPAN solution. The reason for this is that position and velocity are directly dependent on the quality of the input filter updates; it is possible to have good results in position and velocity but still have a filter that diverges in attitude. When run with the IMU-LN200 and NovAtel Flexpak6 data, the loosely-coupled INS filter had RMS differences in pitch and roll under 0.4° in the open-sky dataset and under 0.8° in the urban canyon dataset. RMS differences in heading were below 1° in the open-sky dataset and slightly above 1° in the urban canyon dataset. When run with the Moto-X Android smartphone sensors, the INS filter had RMS differences in pitch and roll below 4.5° in the open-sky dataset and below 16° in the urban canyon dataset. The RMS differences in heading were around 13° for the open-sky dataset and large enough to make the system useless for the urban canyon dataset. The results show the Moto-X Android smartphone sensors can be used for civilian-enthusiast-level navigation in open-sky environments. MEMS sensors are, however, expected to improve over time, thus improving the usability of a loosely-coupled INS filter using smartphone sensors.
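A minimal sketch, assuming the PyWavelets library, of the high-frequency noise removal step described above: decompose the raw sensor stream, soft-threshold the detail coefficients, and reconstruct. The wavelet family, decomposition level, and universal threshold are illustrative choices, not the thesis's settings.

```python
# Minimal sketch: wavelet denoising of a raw IMU channel.
# db4, level 4, and the universal threshold are assumed parameters.
import numpy as np
import pywt

def denoise_imu(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise std estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft")
                  for c in coeffs[1:]]               # shrink detail coeffs
    return pywt.waverec(coeffs, wavelet)[: len(signal)]
```

Soft thresholding of only the detail coefficients preserves the low-frequency motion signal while suppressing the broadband sensor noise that would otherwise dominate the Moto-X data.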
Development and testing of in-context confidence regions for geodetic survey networks
The objectives of the contract were to develop and numerically test in-context absolute and relative confidence regions for geodetic networks. In-context confidence regions are those that relate to many points simultaneously, rather than the conventional notion of the confidence region about only one point without regard to any others. An out-of-context test is conducted on some piece of data without regard for the remaining data in the set. An in-context test is conducted on a quantity in the context of its being a member of a larger set. Adjustment software packages, such as GHOST and GeoLab, that use the so-called tau test of residuals are based on in-context testing. However, we are aware of no software that is capable of performing in-context testing on confidence regions for the estimated coordinate parameters. Another issue needing clarification is the matter of local versus global testing. Global testing is understood to be a single test involving the entire group of variates under examination. A global test statistic is typically a quadratic form which transforms the variates into a scalar quantity containing all the information about the group. On the other hand, local testing is the process of testing individual variates in the group, either in-context or out-of-context. Since these tests can be conducted in either parameter or observation space, they should use a consistent approach in both spaces whenever possible. The development of confidence regions corresponding to one solution is different from the statistical testing of the compatibility (or congruency) of one solution against another. In this report we focus on the development of confidence regions for the analysis of a single network solution, rather than the development of statistical tests for applications such as deformation analyses that require the comparison of two solutions. The key issue of in-context testing is the formation of a mathematical link between the various statistical tests that may be conducted not only on the estimated parameters but also on the estimated residuals. The consequence of such a mathematical link is compatibility of statistical tests throughout observation and parameter space. Three approaches to the computation of in-context confidence regions were examined during this contract: the Bonferroni, Baarda, and projection approaches. The Bonferroni approach equates the simultaneous probability of the individual in-context confidence regions to a selected global probability level. However, it neglects any correlations between the tested quantities, which can have serious consequences for parameter confidence regions. The Baarda (or Delft) approach uses the relation between Type I and Type II errors for both global and local testing, but arbitrarily assumes the probability and non-centrality parameters for both local and global Type II errors are the same. Finally, the projection approach simply uses the global confidence region or test and projects it onto the individual subspaces for local confidence regions or tests. It uses the global expansion factor for all individual in-context confidence regions and tests, which results in unreasonably large confidence regions that can grow without bound. Strictly speaking, this is not an in-context approach as defined above; it is effectively a global test on the individual quantities. That is, the failure of one individual local test also implies the failure of the global test.
To summarize, the projection method tests hypotheses that are different from what we want, and its in-context expansion factors are unreasonably large and grow without bound for large networks. Baarda's approach gives relatively large in-context expansion factors which also grow without bound (although much more slowly than in the projection approach). The Bonferroni approach yields the smallest and most reasonable expansion factors for in-context confidence regions and tests. The expansion factors are also bounded to reasonable values for even the largest of networks. However, this approach neglects the effects of correlations, which can be very large between coordinate parameters in geodetic networks. Primarily because of the smaller expansion factors, we recommend using the Bonferroni approach for in-context confidence regions and tests, in spite of its neglect of correlations. It is recommended to further investigate the effects of large correlations and possible ways of accounting for them. The recommended approach for the in-context statistical analysis of the adjustment of a geodetic network is to first choose a global significance level α to be used as the basis for all global and local in-context tests and confidence regions. The specific significance levels to use for the various tests and confidence regions are:
• Global test on residuals (variance factor test): use the global significance level α.
• Local tests of individual residuals (outlier tests): use the in-context significance level α₀ = α/n, where n is the degrees of freedom of the adjustment.
• Global confidence region: for a global confidence region for all points in the network, use the global significance level α.
• Local absolute (point) confidence regions: for absolute in-context confidence regions at individual points in the network, use the in-context significance level α₀ = α/n, where n is the number of points being simultaneously assessed.
• Local relative confidence regions: for relative in-context confidence regions between pairs of points in the network, use the in-context significance level α₀ = α/m, where m is the number of linearly independent pairs of points to be simultaneously assessed.
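As a worked illustration of the recommended Bonferroni scaling, the sketch below (assuming SciPy) computes the expansion factor for in-context 2-D point confidence regions: each region is built at the in-context level α₀ = α/n, so its semi-axes scale by the square root of the corresponding chi-square quantile relative to the standard confidence ellipse.

```python
# Minimal sketch: Bonferroni in-context expansion factor for 2-D point
# confidence ellipses. alpha, n_points, and dim are example inputs.
from scipy.stats import chi2

def bonferroni_expansion(alpha=0.05, n_points=100, dim=2):
    a0 = alpha / n_points                      # in-context significance level
    return chi2.ppf(1.0 - a0, df=dim) ** 0.5   # semi-axis scale factor

# For a 100-point network at a global 95% confidence level:
# bonferroni_expansion() -> about 3.9, versus about 2.45 out of context,
# and the factor stays bounded near these values even for large networks.
```

This illustrates the report's conclusion numerically: the in-context factor grows only with the logarithm of the number of points, unlike the projection and Baarda factors.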
Development of a geospatial reference framework
This thesis describes the development of a geospatial reference framework for categorizing, organizing, validating, browsing, and representing survey camp topographic data. These topographic data are collected annually by Geodesy and Geomatics Engineering (GGE) students at the University of New Brunswick (UNB) as part of the requirements for a UNB course. ESRI ArcGIS 10 was used to build the information products associated with the geospatial framework. The information products were employed for analyzing, organizing, and managing the past and future topographic map collections. In order to make the geospatial reference framework easily accessible, a Web-GIS application was developed using ArcGIS Server on the server side and the ArcGIS JavaScript API on the client side. This thesis also presents the establishment of a geodatabase model designed and built to satisfy the requirements and characteristics defined by the spatial reference framework. The project contributed to the design and production of geospatial information products including a geographical repository of the UNB campus, geospatial data validation tools, repository maintenance methods, and the Web-GIS service. These geospatial information products constitute a geospatial reference framework that allows for the organization, storage, and representation of past survey camp data collections and provides the specifications and standards for future collections.
Development of a semi-automated system for structural deformation monitoring using a reflectorless total station
The failure of a large structure could have severe consequences. For this reason, early detection of possible structural damage is critical, which stimulates the need for a reliable methodology for routine structural deformation monitoring. Large, above-ground oil storage tanks are examples of structures that must be routinely surveyed to monitor their stability and overall integrity. Presented here is the research and development of a methodology and software system to perform semi-automated deformation monitoring of such tanks. The new system, “SCAN”, greatly improves upon a current, drastically outdated monitoring scheme through the use of a robotic total station with reflectorless laser technology. SCAN has been interfaced with an existing deformation monitoring software system, ALERT, developed by the Canadian Centre for Geodetic Engineering at the University of New Brunswick. The full functionality and reliability of the system were tested by simulating an oil tank with a large water tank of comparable dimensions. The results from this field test indicate that the ALERT SCAN system greatly increases surveying efficiency, reducing the time required to collect data for an entire tank from two weeks (with three persons) to half a day (with one person). The system also proved to be a reliable method for semi-automated data collection and processing. Based on this system, research has continued into a more sophisticated, adaptable version of SCAN that would have the potential to perform automated deformation monitoring of almost any structure.
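As a sketch of the kind of epoch comparison such a monitoring system performs (not the actual SCAN/ALERT processing), the fragment below differences radial distances of matched shell points between two survey epochs; the tank axis location is an assumed input.

```python
# Minimal sketch: per-point radial deformation of a cylindrical tank shell
# between two survey epochs. Assumes points are matched between epochs and
# the tank axis position (axis_xy) is known.
import numpy as np

def radial_deformation(epoch1, epoch2, axis_xy):
    """epochX: (N, 3) arrays of matched x, y, z points on the tank shell."""
    r1 = np.hypot(epoch1[:, 0] - axis_xy[0], epoch1[:, 1] - axis_xy[1])
    r2 = np.hypot(epoch2[:, 0] - axis_xy[0], epoch2[:, 1] - axis_xy[1])
    return r2 - r1                            # positive = outward movement
```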

